Scarcely a day passes when I don't receive a report from some analyst or research organisation informing me of how a new product has saved a bunch of companies a massive sum of money, or how a product has been identified as a leader or innovator in their chosen market niche. But can we trust these reports?
Predicting the future is near impossible -- but that doesn't stop us all from having a red hot go. Human beings have been predicting the future since the beginning of history and the results range from the hilarious to the downright uncanny.
One thing all future predictions have in common: they're rooted in our current understanding of how the world works. It's difficult to escape that mindset. We have no idea how technology will evolve, so our ideas are connected to the technology of today.
Global IT spending is expected to dip this year, dropping 0.3% year-on-year, as the UK market deals with the economic uncertainty caused by Brexit, according to analyst firm Gartner. But IT spending is set to pick up a bit in 2017, growing 2.9% to $4.56 trillion. Gartner also looked at whether the upcoming US presidential election will have an impact on IT spending.
Public cloud services spending in Australia is set to reach $5.55 billion (US$4.18 billion) by the end of the year, while the global market is expected to hit $276.73 billion (US$208.6 billion). But surprisingly, a large number of organisations still have no plans to use cloud services, according to analyst firm Gartner. Read on to find out more.
No new technology since the dawn of the internet has captured the imagination like blockchain. Designed to run unregulated electronic currency, the blockchain is promoted by many as having far broader potential in government, identity, voting, corporate administration and healthcare, to name just some of the proposed use cases. But these grand designs misunderstand what blockchain actually does. Blockchain is certainly important and valuable, as an inspiration for brand new internet protocols and infrastructure. But it’s a lot like the Wright Brothers' Flyer, the first powered aeroplane. It’s wondrous but impractical. Read on to find out more.
Digital transformation is a term that gets tossed around a lot these days. It's often associated with claims that organisations must undergo a journey to become truly digital enterprises if they want to remain competitive. Sounds vague as hell. So what does it really mean to be a digital enterprise? Let's find out.
Ransomware-as-a-service isn't new, and it speaks volumes about just how sophisticated the cybercriminal operations behind it have become -- they run like businesses. But a ransomware called Cerber takes this idea to a new level, operating more like a franchise. We spoke to a Check Point security expert about the Cerber ransomware.
Industry experts are always banging on about how IT professionals should learn to understand their business as a whole, and not just the technology side of things. Easier said than done. IT leaders should change the way they communicate with their teams in order to help members develop business acumen.
Bring-your-own-device (BYOD) has contributed to increased occurrences of shadow IT, that is, the use of IT systems and services by employees that have not been approved by their organisations. Some workers are even creating their own apps and using them to work more efficiently. But shadow IT has the potential to become a security risk and one way to mitigate this is by making employees accountable for the mobile apps they bring into the organisation.
Knowledge is power. So if data equates to knowledge, having a lot of it will naturally allow us to make better decisions that lead to wealth and success, right? Certainly many organisations are using big data to bolster revenue and bring about overall improvements. But big data isn't a silver bullet for business woes, and you can still make bad decisions even with all the right information at your fingertips. We take a look at how big data can lead to bad decisions.
As organisations wake up to the fact that technology plays an important role in how they can remain competitive, they will be motivated to make fundamental changes in how business is conducted. This often comes in the form of a business or digital transformation project to change the way their companies operate to make the most of technology. But these changes could alienate employees if not managed correctly.
One of the cardinal sins of software development is bloating an app with too many features. It's tempting to add in a whole bunch of customer-facing features to address the needs of everyone. But we humans are easily overwhelmed by too many choices, which is why it's important to keep apps lean. If you've built an app that is bursting at the seams with features, IT analyst firm Gartner has a few tips on how to put it on a diet.
The phrase "cognitive computing" is often bandied about when discussing artificial intelligence, data mining and deep machine learning. But what does it actually mean? During Nvidia's GTC technology conference, IBM Watson's chief technology officer Rob High gave a perfect distillation of this complex topic.
Microsoft's latest exercise in demonstrating the power of artificial intelligence (AI) went hilariously wrong when it unleashed Tay, an AI meant to emulate a teenage girl, onto the Twittersphere. She was supposed to engage with people on Twitter like a real person, learning from interactions with users. Perhaps Tay did this too well, because she went on to post a stream of inappropriate tweets before Microsoft pulled the plug on her. So what went wrong and what can we learn from all of this? Let's find out.
In the next few days, humanity's ego is likely to take another hit when the world champion of the ancient Chinese game Go is beaten by a computer. Currently Lee Sedol – the Roger Federer of Go – has lost two matches to Google's AlphaGo program in their best-of-five series. If AlphaGo wins just one more of the remaining three matches, humanity will again be vanquished.
The last few years have seen numerous studies pointing to a bleak future with technology-induced unemployment on the rise. For example, a pivotal 2013 study by researchers at the University of Oxford found that of 702 unique job types in the United States economy, around 47% were at high risk of computerisation. This was backed up by similar findings in Australia suggesting 44% of occupations – representing more than five million jobs – were at risk over the coming 10 to 15 years. Is the situation really so dire? Are we heading towards mass unemployment as computers and robots do all the work?