At any given time, a technology or two captures the zeitgeist. A few years ago it was social media and mobile that everybody was talking about. These days it’s machine learning and blockchain. Everywhere you look, consulting firms are issuing reports, conferences are being held and new “experts” are being anointed.
In a sense, there’s nothing wrong with that. Social media and mobile computing really did change the world and, clearly, the impact of artificial intelligence and distributed database architectures will be substantial. Every enterprise needs to understand these technologies and how they will impact its business.
Still, we need to remember that we always get disrupted by what we can’t see. The truth is that the next big thing always starts out looking like nothing at all. That’s why it’s so disruptive. If we saw it coming, it wouldn’t be. So here are three technologies you may not have heard of, but should start paying attention to. The fate of your business may depend on it.
1. New Computing Architectures
In the April 19th, 1965 issue of Electronics, Intel co-founder Gordon Moore published an article observing that the number of transistors on a silicon chip was doubling roughly every year (a pace he later revised to every two years). Over the past half century, that consistent doubling of computing power, now known as Moore’s Law, has driven the digital revolution.
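To see why that doubling mattered so much, it helps to watch it compound. Here is a toy calculation, using the roughly 2,300 transistors of the Intel 4004 (1971) as a starting point; the figures are illustrative projections of the doubling rule, not actual chip counts:

```python
# Rough illustration of Moore's Law: transistor counts doubling every two years,
# starting from the Intel 4004's roughly 2,300 transistors in 1971.

def transistors(year, base_year=1971, base_count=2300, doubling_period=2):
    """Projected transistor count under a strict two-year doubling rule."""
    doublings = (year - base_year) / doubling_period
    return base_count * 2 ** doublings

for year in (1971, 1991, 2011, 2021):
    print(f"{year}: ~{transistors(year):,.0f} transistors")
```

Twenty-five doublings turn a few thousand transistors into tens of billions, which is why even a small slowdown in the cadence is such a big deal.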
Today, however, that process has slowed, and it will soon come to a complete halt. There are only so many transistors you can cram onto a silicon wafer before subatomic effects come into play and make it impossible for the technology to function. Experts disagree on exactly when this will happen, but it’s pretty clear that it will be sometime within the next five years.
There are, of course, a number of ways to improve chip performance other than increasing the number of transistors, such as FPGAs, ASICs and 3D stacking. Yet those are merely stopgaps and are unlikely to take us more than a decade or so into the future. To continue to advance technology over the next 50 years, we need fundamentally new architectures like quantum computing and neuromorphic chips.
The good news is that these architectures are very advanced in their development and we should start seeing a commercial impact within 5-10 years. The bad news is that, being fundamentally new architectures, nobody really knows how to use them yet. We are, in a sense, back to the early days of computing, with tons of potential but little idea how to actualize it.
2. Genetic Engineering
While computer scientists have been developing programming languages over the past 50 years, biologists have been trying to understand a far more pervasive kind of code, the genetic code. For the most part, things have gone slowly. Although there has been significant scientific progress, the impact of that advancement has been relatively paltry.
That began to change in 2003 with the completion of the Human Genome Project. For the first time, we began to truly understand how DNA interacts with our biology, which led to other efforts, such as the Cancer Genome Atlas, as well as tangible advancements in agriculture. Genomics ceased to be mere scientific inquiry and became a source of new applications.
Now a new technology called CRISPR is allowing scientists to edit genes at will. In fact, because the technology is simple enough for even amateur biologists to use, we can expect genetic engineering to become much more widespread across industries. Early applications include liquid fuels from sunshine and genomic vaccines.
“CRISPR is accelerating everything we do with genomics,” Megan Hochstrasser of the Innovative Genomics Initiative at Cal Berkeley told me, “from cancer research to engineering disease resistant crops and many other applications that haven’t yet come to the fore. Probably the most exciting aspect is that CRISPR is so cheap and easy to use, it will have a democratizing effect, where more can be done with less. We’re really just getting started.”
3. Materials Science
Traditionally, improving the materials used to build a product has been a process of trial and error. You changed the ingredients, or the process by which you made them, and saw what happened. For example, at some point a medieval blacksmith figured out that annealing iron made for better swords.
Today, coming up with better materials is a multi-billion dollar business. Consider the challenges that Boeing faced when designing its new Dreamliner. How do you significantly increase the performance of an airplane, a decades-old technology? Yet by discovering new composite materials, the company was able to reduce weight by 40,000 pounds and fuel use by 20%.
With this in mind, the Materials Genome Initiative is building databases of material properties such as strength and density, along with computer models that predict which processes will achieve the qualities a manufacturer is looking for. As a government program, it is also able to make the data widely available to anyone who wants to use it, not just billion dollar companies like Boeing.
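The basic idea of such a database is easy to sketch: store each material’s measured properties, then filter for candidates that meet a design constraint. The snippet below is a deliberately simplified illustration (the entries and the query function are my own toy example, not the Materials Genome Initiative’s actual data or API; composite properties in particular vary widely by layup):

```python
# Toy sketch of querying a materials-property database: find materials that
# are at least as strong as a given floor but lighter than a density ceiling.
# Values are rough, illustrative figures only.

materials = {
    "aluminum 6061":          {"strength_mpa": 310, "density_g_cm3": 2.70},
    "titanium Ti-6Al-4V":     {"strength_mpa": 950, "density_g_cm3": 4.43},
    "carbon-fiber composite": {"strength_mpa": 600, "density_g_cm3": 1.60},
}

def candidates(min_strength, max_density):
    """Return names of materials meeting a strength floor and density ceiling."""
    return [name for name, props in materials.items()
            if props["strength_mpa"] >= min_strength
            and props["density_g_cm3"] <= max_density]

print(candidates(min_strength=500, max_density=2.0))
```

Even a query this crude hints at why composites won out for the Dreamliner: once strength and weight constraints are both in play, the lightweight candidates dominate.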
“Our goal is to speed up the development of new materials by making clear the relationship between materials, how they are processed and what properties are likely to result,” Jim Warren, Director of the Materials Genome program told me. “My hope is that the Materials Genome will accelerate innovation in just about every industry America competes in.”
It’s Better To Prepare Than Adapt
For the past few decades, great emphasis has been put on agility and adaptation. When a new technology, like social media, mobile computing or artificial intelligence begins to disrupt the marketplace, firms rush to figure out what it means and adapt their strategies accordingly. If they could do that a bit faster than the competition, they would win.
Today, however, we’re entering a new era of innovation that will look much more like the 1950s and 60s than the 90s and aughts. The central challenge will no longer be to dream up new applications based on improved versions of old technologies, but to understand fundamentally new paradigms.
That’s why over the next few decades, it will be more important to prepare than adapt. How will you work with new computing architectures? How will fast, cheap genetic engineering affect your industry? What should you be doing to explore new materials that can significantly increase performance and lower costs? These are just some of the questions we will grapple with.
Not all who wander are lost. The challenge is to wander with purpose.
Source: Digital Tonto