Beyond Now: Tech Predictions for 2017
The cameras on a pilot model of an Uber self-driving car are displayed at the Uber Advanced Technologies Center, Sept. 13, in Pittsburgh, Penn. (Angelo Merendino/AFP/Getty Images)
IEEE Computer Society has announced its predictions for the top tech trends of 2017 and looks ahead to the next five years to identify the technology developments that will impact our lives by 2022, writes Dejan Milojicic. – @siliconeer #siliconeer #SiliconValley #California #DejanMilojicic #IEEE #IEEEComputerSociety #Technology #TechPredictions #TiE #TiESV
The 2017 Consumer Electronics Show has just concluded in Las Vegas, and it promises a whole new gamut of tech. Here is what the IEEE Computer Society identifies as the technology trends that will reach adoption in 2017:
Industrial IoT: With many millions of IoT sensors deployed in dozens of industrial-strength, real-world applications, this is one of the largest and most impactful arenas for big data analytics in 2017.
Self-driving Cars: In Silicon Valley, one can easily spot up to three self-driving cars on the same street. While broad adoption in general use is less likely in the near term, wider adoption will probably come first in constrained environments such as airports and factories.
Artificial Intelligence, Machine Learning, Cognitive Computing: These overlapping areas are a fundamental requirement for big data analytics and for other areas of control and management. Machine learning, and deep learning in particular, are quickly transitioning from the research lab to commodity products. On the software side, advanced engines and libraries from industry leaders such as Facebook and Google are making it to open source. On the hardware side, we see continually improving performance and scalability from existing technologies (CPUs and GPUs), as well as emerging accelerators. Consequently, writing domain-specific applications that can learn, adapt, and process complex and noisy inputs in near real time is easier than ever, and a wide range of new applications is emerging.
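As a rough illustration of how accessible such tooling has become, here is a minimal sketch that fits a small neural-network classifier to noisy synthetic data with scikit-learn; the library choice, the data, and all names are illustrative assumptions, not anything prescribed by the report.

```python
# Minimal sketch: learn from noisy synthetic data with an open-source ML library.
# The library (scikit-learn) and the made-up data are illustrative only.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 20))                               # 1000 samples, 20 noisy features
y = (X[:, 0] + X[:, 1] + rng.normal(scale=0.5, size=1000) > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = MLPClassifier(hidden_layer_sizes=(32,), max_iter=500, random_state=0)
model.fit(X_train, y_train)                                   # learn from labeled examples
print("held-out accuracy:", model.score(X_test, y_test))      # evaluate on unseen data
```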
5G: While 5G is unlikely to see immediate adoption in the next year, its roadmaps and standards are being developed, influencing the applications that will eventually evolve. Some early deployment use cases are also being pursued.
Accelerators: While, over the long term, the end of Moore’s law is being addressed by novel technologies such as those covered by rebooting computing (see the first item in the five-year trends below), heterogeneous computing built on accelerators allows performance boundaries to be stretched with today’s technologies.
Disaggregated Memory – Fabric-attached Nonvolatile Memory (NVM): While NVM has achieved mixed success in productization over the past year, the number of companies working in this arena, be it on materials, architecture, or software, makes it a likely candidate for imminent adoption. Fast, nonvolatile storage bridges the gap between RAM and SSDs, with a performance-cost ratio lying somewhere in between. It will initially be configured either as “a disk,” accessed by the OS like any other permanent storage device, or as “RAM” in DIMM slots, accessed by the OS as memory. But once the hardware and OS support are fully worked out, this technology will open the door to new applications that aren’t currently available.
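For a sense of the “accessed as memory” mode, here is a hedged sketch that uses an ordinary memory-mapped file as a stand-in for a byte-addressable persistent region; real persistent-memory programming relies on OS support such as DAX-mounted filesystems and dedicated libraries, and the file path below is purely hypothetical.

```python
# Sketch: byte-addressable, load/store-style access to a persistent region.
# An ordinary file-backed mmap stands in for fabric-attached NVM; path is hypothetical.
import mmap
import os

path = "/tmp/nvm_demo.bin"           # stand-in for a persistent-memory region
size = 4096

fd = os.open(path, os.O_RDWR | os.O_CREAT, 0o600)
os.ftruncate(fd, size)
buf = mmap.mmap(fd, size)            # map the region so it reads/writes like memory

buf[0:11] = b"hello world"           # plain byte-level store, no read()/write() calls
buf.flush()                          # ask the OS to persist the changes
print(bytes(buf[0:11]))

buf.close()
os.close(fd)
```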
Sensors Everywhere and Edge Compute: From smart transportation and smart homes, to retail innovations, surveillance, sports and entertainment, and industrial IoT, we are starting to see intelligence being aggressively deployed at the edge. With intelligence comes the need to compute at the edge, and a variety of edge compute offerings are opening up new disruptive opportunities.
Blockchain (beyond Bitcoin): While known as the technology behind Bitcoin, blockchain has far more disruptive uses, potentially changing the way we implement processes like voting, financial transactions, title and ownership, anti-counterfeiting, and digital rights management, securing these processes without the need for (and bottleneck of) a central authority.
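The core idea, stripped of consensus, signatures, and networking, is a chain of records in which each entry commits to a cryptographic hash of its predecessor, so tampering with any earlier record is detectable. The following is a minimal, non-authoritative sketch; the record contents are made up.

```python
# Minimal hash-chain sketch: each block stores the previous block's hash,
# so altering any earlier block invalidates everything after it.
import hashlib
import json

def make_block(data, prev_hash):
    block = {"data": data, "prev_hash": prev_hash}
    block["hash"] = hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()
    return block

def verify(chain):
    for prev, curr in zip(chain, chain[1:]):
        body = {"data": curr["data"], "prev_hash": curr["prev_hash"]}
        expected = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
        if curr["prev_hash"] != prev["hash"] or curr["hash"] != expected:
            return False
    return True

chain = [make_block("genesis", "0" * 64)]
chain.append(make_block("vote: candidate A", chain[-1]["hash"]))
chain.append(make_block("title transfer: lot 42", chain[-1]["hash"]))
print(verify(chain))                  # True: the chain is intact
chain[1]["data"] = "vote: candidate B"
print(verify(chain))                  # False: the tampering is detected
```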
Hyper-converged Systems: Also known as “software-defined everything,” hyper-converged systems are bundles of hardware and software that combine compute, storage, and networking with an orchestration system that lets IT administrators manage them using cloud tools and DevOps practices. While they have been on the roadmap for major IT players for the last three to five years, we see major adoption trends that may cause their growth to explode in 2017.
Technology trends that will reach adoption by 2022:
Rebooting Computing (includes quantum computing): The end of Moore’s law has resulted in the end of the ITRS (International Technology Roadmap for Semiconductors) and its transformation into the IRDS (International Roadmap for Devices and Systems), which focuses on new technologies such as quantum, neuromorphic, and adiabatic computing, among many others.
Human Brain Interface: Many types of interfaces are being developed, but the one that could be most impactful is a human brain interface that can drive and control machines directly. It will be enabled by the rebooting-computing technologies above but will also require separate innovation to connect the human brain to hardware.
Capabilities – Hardware Protection: Protecting data at rest and in flight requires more sophisticated security technologies based on more robust hardware protection, such as capabilities. Capabilities were popular in the 1960s but were abandoned in favor of paging, which was sufficient when physical memory was small. Rapid advances in memory, interconnects, and processors, as well as the requirements of big data applications, open up new opportunities for capabilities.
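Conceptually, a capability is an unforgeable reference that bundles an address with bounds and permissions, checked on every access. The toy sketch below models that idea in software purely for illustration; real capability designs enforce these checks in hardware, and every name here is an assumption rather than anything from the report.

```python
# Toy software model of a capability: a reference carrying bounds and permissions,
# checked on every load and store. Conceptual only; real designs do this in hardware.
class Capability:
    def __init__(self, memory, base, length, perms):
        self._memory = memory          # backing store (a bytearray here)
        self._base = base              # start of the accessible region
        self._length = length          # size of the accessible region
        self._perms = perms            # subset of {"r", "w"}

    def _check(self, offset, perm):
        if perm not in self._perms:
            raise PermissionError(f"capability lacks '{perm}' permission")
        if not 0 <= offset < self._length:
            raise IndexError("access outside capability bounds")

    def load(self, offset):
        self._check(offset, "r")
        return self._memory[self._base + offset]

    def store(self, offset, value):
        self._check(offset, "w")
        self._memory[self._base + offset] = value

mem = bytearray(64)
cap = Capability(mem, base=16, length=8, perms={"r", "w"})
cap.store(0, 0xAB)
print(hex(cap.load(0)))               # 0xab
try:
    cap.load(9)                       # out of bounds: rejected, unlike a raw pointer
except IndexError as err:
    print("blocked:", err)
```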
The Year of Exascale: The scientific community is starting to converge on 2022 as the year when it can expect the first wave of Exascale systems to be deployed. An Exascale machine would almost double the performance of all of 2016’s top 500 supercomputers put together, enabling breakthroughs in scientific fields such as weather, genomics, life sciences, energy, and manufacturing.
NVM Reaches Maturity: There are indicators that the long-predicted adoption of NVM is coming and by 2022, we’ll be at least in the second or third generation of true nonvolatile memory devices that will change the entire memory-storage hierarchy, and associated software stack, across the IT industry.
Silicon Photonics Becomes a Reality: While bridging technologies (such as VCSEL-based photonics) may be sufficient to address the needs for the next five years, we see 2022 as the pivot point where highly integrated silicon photonics components will be necessary to meet the combined cost, energy, and performance requirements of Exascale systems.
Smart NICs: Networking equipment, such as the kind seeing explosive growth in data centers, is becoming more commoditized and open. Ever more sophisticated chips in network interface cards (NICs) allow more offloading of traditional networking tasks from the CPU to the NIC, including encryption, compression, packet processing, and more. We’ve seen this trend before with graphics cards: commodity specialized hardware mated with good library support enabled an explosion of applications and libraries in domains far from graphics, earning the nickname “GPGPU.” Similarly, GPNICs may allow newly accelerated software to take advantage of the unique hardware properties of NICs, both within classical network applications, such as key-value stores, and in new domains, such as text processing.
Power-conservative Multicores: The number of processor cores integrated on a chip will grow into the hundreds and thousands for Top500 and Green500 HPC machines. With more processors on a chip, memory architectures and data transfer will become key hardware technologies. In software, parallelizing compilers that let users employ the many cores efficiently and easily will reduce rapidly increasing software development costs. Automatic power reduction, in which the architecture and compiler cooperate to apply clock or power gating, or frequency and voltage lowering, to idle processor cores, will become crucial.
Contributors to these predictions include: Paolo Faraboschi, Hewlett Packard Enterprise, fellow; Eitan Frachtenberg, data scientist; Hironori Kasahara, IEEE Computer Society president-elect; Phil Laplante, professor, Penn State University; Dejan Milojicic, Hewlett Packard Enterprise, distinguished technologist, and IEEE Computer Society past president; and John Walz, IEEE Computer Society past president.