Gartner: IT’s Top 10 Technology Trends

October 18, 2011 | By David
Grazed from PC Magazine.  Author: Michael J. Miller.

In a session at Gartner Symposium yesterday, Gartner's David Cearley listed the top ten strategic technology trends that the research and advisory firm thinks will have the greatest impact on IT departments over the next year.

The list consists of media tablets, mobile-centric applications, contextual and social user experience, the Internet of Things, app stores and marketplaces, next-generation analytics, big data, in-memory computing, extreme low-energy servers, and cloud computing.


Cearley explained that the inexorable push of technology, as exemplified by Moore's Law of ever-increasing semiconductor density and Metcalfe's Law of networks growing more valuable as they get larger, was leading to the "digitization of everything."

He then detailed the specific trends:

Gartner Top 10 Trends

"Say hello to the post-PC era," he said. He expects "Bring Your Own Technology" will remain the norm, as people use their own phones and media tablets, though this leads to security and management challenges. He also expects no single platform, form factor, or technology to dominate, meaning Windows’ share of client devices will shrink steadily even though Windows may grow in absolute numbers.

He anticipates that iOS, Android, and Windows will be the successful operating systems for media tablets, and said companies will need both business-to-employee and business-to-consumer programs.

Mobile-centric applications and interfaces are also a coming trend, Cearley said. The interface will not just have touch and gesture support, but will more often have search, voice, and video as inputs. All these things are possible now, but there’s often a lag time between when things are possible and when they actually happen; what is possible is sometimes not socially acceptable.

Overall, simple and focused mobile applications will be crucially important not only for consumer applications but for enterprise applications as well. As a result, application development will change too, as multiplatform support is inevitable. HTML5 helps but is no panacea.

App stores and marketplaces naturally follow, with requirements both for the consumer experience, like easy app discovery and search, and for enterprise needs, like license management and verification. Cearley discussed using an app store as a way of enforcing governance rules for organizations or particular groups. Next year, he said, will lead the way for enterprise app store adoption, which will likely become more mainstream in 2013 and 2014.

The Internet of Things is leading to the Internet of everything. Already, 50 percent of Internet connections are "things" rather than people, Cearley said, and this trend is growing at a rapid pace because the cost of connected electronic devices is dropping very quickly. He asked: what would you look for if the entire world were instrumented? He answered with the concept of parking meters that broadcast whether a spot is occupied.

By 2015, he said, companies will need unified oversight of all of their Internet-connected technologies, but CIOs will need to orchestrate, not own, this data. For now, companies should get into a "what if" mindset, thinking about what the company can do with all the available data.

Cearley said contextual and social user experiences will become more important, particularly using features such as identity, time, location, social networks, and sensors like GPS devices and near-field communications (NFC). A lot of this will be driven by mobile devices, particularly in 2012 and 2013. From 2014 to 2018, he said, this will lead to "pervasive context" with users tracked by "context brokers" that analyze everything they do to enable new interactions.

Cearley dropped cloud computing to the bottom of the list this year. It's still important, but since everyone is already looking at cloud computing, it's important that they look at other things as well.

On cloud computing, he particularly highlighted the emergence of marketplaces and brokerages and emphasized hybrid security, management, and governance. Among new cloud concepts, he noted "DevOps," which brings development and operations together to enable continuous delivery of new features, and "logical" multi-tenancy, with dedicated execution environments on top of standard software. Cloud-centric design will become a necessity, but as more people move to the cloud, we'll see the cloud failing to live up to its hype.

Next-generation analytics is driving new ways for organizations to make decisions. This trend in turn brings business intelligence to more people within an organization.

As part of this, Cearley talked about using pattern recognition to optimize, simulate, and predict, as organizations move from historical analysis to predictive analytics.

Big data is very much tied in with analytics and the "Internet of Things," but what's different now is that we are reaching some breaking points. The concept of a single data warehouse is not working, and we'll need multiple warehouses treated as one "virtual data warehouse." In this area, he particularly highlighted the Apache Hadoop open-source project, often paired with more robust, proprietary distributed file systems.

Cearley also talked about in-memory computing, using Flash and RAM for performance. Flash memory is getting cheaper, and the move to cloud-based services is making Flash more acceptable on the client. On the server, it's much more important, as Flash becomes a new tier of storage.

More interesting, he said, is how memory is changing applications. This includes a number of different technologies, like in-memory data management, low-latency messaging, and in-memory application platforms—all of which are separate things. Most of these technologies have been technically feasible for a long time, but now we’re getting enough memory in systems to make it practical. We also see a lot of innovation from the big cloud providers who had to use in-memory techniques because of their scale. All this may result in applications that are designed differently, using in-memory real-time analytics instead of data warehouses.

Finally, Cearley hit on extreme low-energy servers: using Atom- or ARM-based chips to create solutions built from slow but tiny servers. Select applications, notably "big data" workloads such as Hadoop, can yield big savings, but most applications and situations would be a bad fit for such servers.