Organizations and individuals around the world spend billions of dollars each year providing and using information technology. New capabilities, standards, and products appear every day. Many involve incompatible or even competing specifications and interfaces. Remember VHS versus Betamax? How about Blu-ray versus HD DVD? Which technology is the right choice for you or your organization? Working with ACUTA, researchers at Murray State University (MSU) collected and analyzed scientific, academic, industry, and popular data in order to predict
technology trends. The target timeframe for the predictions was two to five years.
If this were an easy task, readily approachable with a standard scientific method, making IT investments would be much simpler.
There have been many famous technology forecast blunders, such as Tom Watson, chairman of IBM, stating in 1943, “I think there is a market for maybe five computers,” and Bill Gates of Microsoft saying in 1981, “640K ought to be enough for everyone.” The predictions that follow are personal opinions about the most important technology trends for the next two to five years (your results may vary).
A basic but essential analytical assumption was that the more often a technology term appeared in literature, on the Web, or in conversation, the more likely it was to be a viable and important technology trend. As an example, the concept of cloud computing appeared in one or two articles in technology publications more than five years ago. One or two years later, it might have appeared in 10 articles per year. Currently, the phrase cloud computing probably appears in 100 articles per week.
This kind of progression is a clear indicator of emerging importance. The trick is to identify the emerging trends early and know which are important.
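The frequency-based assumption above can be sketched in code. The snippet below is a minimal illustration, not the researchers' actual method: it counts how often a term appears in a small, invented set of dated articles and flags the term as "emerging" when its mentions grow sharply over time. The article data, function names, and growth threshold are all hypothetical.

```python
from collections import Counter

# Hypothetical corpus: (year, article text) pairs standing in for a
# real literature database of technology publications.
articles = [
    (2007, "virtualization and cloud computing pilot"),
    (2008, "storage area network trends"),
    (2009, "cloud computing adoption; cloud computing costs"),
    (2010, "cloud computing everywhere, cloud computing, cloud computing"),
]

def mentions_per_year(articles, term):
    """Count how often `term` appears in each year's articles."""
    counts = Counter({year: 0 for year, _ in articles})
    for year, text in articles:
        counts[year] += text.lower().count(term.lower())
    return dict(counts)

def is_emerging(counts, growth_factor=2.0):
    """Flag a term whose mentions grow by at least `growth_factor`
    between the earliest and latest observed year (an assumed,
    illustrative threshold)."""
    years = sorted(counts)
    first, last = counts[years[0]], counts[years[-1]]
    return first > 0 and last / first >= growth_factor

trend = mentions_per_year(articles, "cloud computing")
print(trend)        # mentions per year for the term
print(is_emerging(trend))
```

A real analysis would draw on far larger corpora and smooth out noise, but the core signal is the same: a steep, sustained rise in mentions year over year.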