Every now and then, I spend some time thinking about the more philosophical aspects of IT, trying to see the bigger picture. I find the attempt to break from the complexity and detail of my daily routine extremely useful for my constant personal development. I am one of those firm believers in the idea that surviving and then thriving in modern IT requires a lot of looking beyond the horizon.
My dear friends from Codecamp invited me to do a keynote at their conference in Timisoara, and we agreed it was a good idea to build the talk around my thoughts about the future of IT. Attempting to distill these thoughts into a 45-minute session was a little bit harder than I initially envisioned 🙂 (the slide deck is here). This article is a follow-up to the keynote, providing a written version of these thoughts. Just to make things clear, this is not an attempt to perform an exhaustive factual analysis of short, medium, and long-term developments in our industry. It is merely a short inventory of stuff I'm thinking about which you, my reader, will hopefully find interesting.
Turns out my entire vision about the future of IT can be synthesized around a question, a challenge, and an expectation. The question is "How fast will computing evolve?". The challenge is "avoid misusing the power of computing". The expectation is to "help humanity advance, and, ultimately, survive". In case you find the whole text simply too long, let me state the conclusion right here:
My simplified take on the future of IT? In a nutshell, it's defined by one question, one challenge, and one expectation. How fast will computing evolve? Probably at the same vivid pace in the case of classical computing. Hopefully, fast enough to become viable, in the case of quantum computing. How can we avoid misusing the power of computing, especially in the fortunate case when quantum computing becomes viable? By educating young generations from the ground up to be aware of the pitfalls and to keep an eye on what's happening, no matter what their individual field of expertise is. Will IT help humanity advance, and ultimately, survive? A strong yes if quantum computing becomes viable, a question mark if it turns out we're stuck with classical computing only.
Now, if you're still interested, let's get into one more level of detail.
Back in my college years (some 20 years ago), the center of the computing universe was the CPU. We were of course looking at graphics cards as well, but only to get those extra frames in the games (we're still doing that, by the way 🙂). I was in my very early days of Machine Learning, and one of the first things I learned was that the class of algorithms commonly referred to as neural nets was not of particularly great interest, as they had supposedly almost reached their potential. Almost nobody back then imagined that, fast-forward 20 years, the humble GPU core would be on the cutting edge of spectacular advancements in deep learning. And this is already somewhat old news. Today we're talking about stuff like Soft DPUs (mostly FPGAs, Field Programmable Gate Arrays) and Hard DPUs (like TPUs, Tensor Processing Units). And we're also seeing more and more ASICs (Application Specific Integrated Circuits). While the all-purpose CPU is evolving as well (mostly in the direction of multi-core), we're seeing more and more of our software moving into high-performance, dedicated hardware. This is the modern reincarnation of Moore's law, which (at least in this form) seems quite alive and well.
And then there is data. According to a study performed a few years ago, by 2020 we're going to pass the 44 zettabyte mark worldwide (44 trillion GB), up from a "mere" 4.4 zettabytes in 2013. It doesn't even matter how accurate the study will prove to be. By now (2018) it's quite clear that at least the magnitude of the increase is aligned with reality. There is no end in sight here. It sure looks like in the era of cloud computing, the race to build bigger, faster, and cheaper storage is just gearing up.
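To put that magnitude into perspective, here is a back-of-the-envelope sketch in Python; the only inputs are the 2013 and 2020 figures cited above, everything else is plain arithmetic:

```python
# Rough growth-rate check for the cited 4.4 ZB (2013) -> 44 ZB (2020) estimate.
start_zb, end_zb = 4.4, 44.0   # zettabytes, the two figures from the study
years = 2020 - 2013

growth_factor = end_zb / start_zb                 # roughly 10x overall
annual_rate = growth_factor ** (1 / years) - 1    # implied compound annual growth

print(f"Overall growth: {growth_factor:.1f}x over {years} years")
print(f"Implied annual growth rate: {annual_rate:.1%}")   # roughly 39% per year
```

In other words, the study implies a tenfold increase in seven years, which works out to close to 40% more data every single year.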
The inevitable price we're paying for this super-fast pace is complexity. I like to show my audiences an image with the standard advanced analytics pattern in Microsoft Azure (slide 9). It's a bit scary because it's complex, has lots of moving parts, and almost every single pictogram in it can be the topic of days-long workshops. And, most importantly, there's nothing wrong with it. This is reality. In fact, if you replace the name Azure with, say, Amazon, and the service names with their counterparts, you get pretty much the same result. A realistic, sane, and utterly complex architecture. Move down one level from architecture to programming, and you see the same level of complexity. For the sake of simplicity (no pun intended), let's narrow the discussion to the realm of data science only. The number of choices is quite daunting. A rich choice of libraries is available with overlapping capabilities and different update and maintenance cycles, some backed by industry giants and some simple open source projects with a few contributors. Combining all this into a sound DevOps process is a task even the most seasoned managers fear. And when you add into the equation the fact that the data science part must be seamlessly integrated with other, classical areas (like user experience, backend APIs, and the like), the complexity of the task fully reveals itself.
This is not a characteristic of data science or data science-related projects only. There are many voices across the industry complaining about how complex (and sometimes apparently inefficient) software development in general still is. The truth is, the more we delve into a world of cloud computing, artificial intelligence, and IoT, the more complexity we'll get. There is complexity in platforms, in features, and in tools. Simply put, there is complexity in choices. One of the most important challenges a person faces in modern IT is learning to cope with complexity.
Speaking about the human factor, the reality is we're not capable of producing enough of this resource to feed the ever-increasing needs of modern IT. In data science alone, most studies agree that by the mid-2020s there will be a shortage of skilled personnel exceeding 200,000 professionals worldwide (some studies quote this number for the US alone). For some reason, the education arm of our society does not have the capability of coping with the speed at which things happen in the industry. One can only imagine that if this speed further increases, the pressure on education will increase as well. The question then remains: what to do about the increasing gap? One other interesting side of this is how this problem applies to people already working in the field. They face the same problem, with almost the same intensity. How do I keep up with the pace? How do I make sure my knowledge does not quickly become obsolete? How do I cope with all the choices? In the world of instantly available compute resources, DevOps, and release cycles measured in weeks, virtually everybody faces these questions, which can be summed up in a simple way: How do I adapt?
These spectacular advancements in computing power and software are finally enabling us to make some progress in the way we're interacting with computers. For decades, we've been stuck in the initial paradigm of looking at rectangular screens. Even the mighty smartphones of the last decade are powerful prisoners of this paradigm. We're just reaching decent levels of quality with Virtual Reality (VR), and we're scratching the surface of Mixed Reality (MR). The devices are still bulky and the limitations are significant. Yet, once you spend a few tens of minutes in Microsoft's HoloLens, for example, you realize both the potential and the disruption to come. If the speed at which computing evolves continues to increase, it's safe to assume that in a decade or two, the ways we interact with computers will improve significantly.
Obviously, these are only a few aspects that are tied to the core question: How fast will computing evolve? You can both challenge the way I see them and think about many more. This does not change the core of the discussion though. The pace at which computing in general will evolve will determine the evolution of all the aspects I mentioned: complexity, limitations of the human factor, and interaction patterns. That's the reason I ended up with this question as one of the three pillars of my vision about the future of IT.
Humanity has a long history of being unable to cope with the big advancements of science and technology. Don't get me wrong, I'm a true believer in and an avid supporter of the critical need for such advancements. At the same time, I am trying to be as objective and realistic as possible. In my opinion, there is an intrinsic connection between technological advancement and conflict throughout our entire history.
Before moving on, let me make an important note. I consider misusing the power of computing a challenge, for obvious reasons. Properly using the power of computing falls under the category of expectations rather than challenges. It's because of this that I am focusing more on the misuse (which, by its very nature, is a negativist approach). It's not because I'm a negativist who sees nothing but trouble around. On the contrary, I am a highly optimistic person in general. With respect to IT, I firmly believe it is one of the foundational domains that can carry us humans into the next millennia of our existence, helping us to survive and thrive. Having said that, I believe we need to be realistic and, most importantly, watch and learn from our history.
I consider the technological advancements of the past ten years to have the same order of magnitude as Einstein's famous E=mc². They might not be as groundbreaking from a purely scientific point of view, but they surely have a comparable aggregated impact on humanity, if not a bigger one. They've already changed our social behavior, significantly extended the range of threats we're exposed to, and opened countless new doors to those who seek to do harm. And they brought immense power to our fingertips.
Whether it's our privacy, our safety, or our money, they are all under potential threat from the misuse of technology. The number one threat I see is lack of awareness. Most technology consumers today are almost completely ignorant of these threats. The countless public stories about leaked accounts and passwords are generating some awareness, but it's way too little. I believe we still need to do a lot in this area. And it must start with educating our little ones about the pitfalls of using technology. If you ask me, this kind of education has become at least as important as teaching speaking, writing, or math.
When it comes to misusing the power of computing, the core responsibility is shared between technology providers, governments, and public opinion. The ones that create technology and the ones that have the power to regulate its use are in the driving seat, while public opinion is critical to keeping the checks and balances in place. The harsh reality is that for both technology providers and governments it is quite difficult to live up to the expectation. That's why the balance is so fragile and delicate. And that's why we see, almost every day, living examples of technology misuse.
Being a commercial market player, a technology provider will always be inclined to use any means to keep and improve its market position. And if it's so successful that it either creates a market of its own to dominate or disrupts an existing market to the point where it dominates it, the inclination is that much more powerful. Add to this the inevitable loopholes that allow others to misuse its technology, and you get a comprehensive picture of the high-level threats. Unfortunately, there are not many global technology providers that publicly assume a mission that includes fighting those inclinations. Being a public example in this area is one of the reasons I respect and admire Microsoft's CEO, Satya Nadella.
Governments, on the other hand, have a natural tendency towards gaining more grip and more control. The power of IT fits like a glove for those within governmental structures who seek ever more control, sometimes even at the price of severely invading the privacy of their own citizens and of other governments' citizens. Fortunately, both commercial market players and governments are subject to the scrutiny of public opinion, which puts, to a certain extent, the brakes on these tendencies. There is though another, somewhat gray area: technology providers that work for governments or under their jurisdiction. This category of providers is mostly shielded from the scrutiny of public opinion. Avoiding severe misuse of technology in this case eventually boils down to the individual responsibility of people placed in key positions within those organizations. Consequently, it's a very high-risk bet, especially because this gray area exists in every single state on the globe, be it a democracy, a non-democracy, or even a rogue state.
To make things very real and tangible, I like to give three categories of examples:
For sure, the list can go on and on and on. The more we make our daily lives dependent on technology, the longer the list gets. To live up to the daunting challenge of avoiding the misuse of the power of computing, we need to focus on two areas: how we build technology and how we watch over its use. In both cases, education must play a fundamental role. The new generations of IT specialists must be educated from the ground up with a mindset focused on safety and security as well. The new generations of citizens must be educated to understand the pitfalls and threats of technology misuse and to keep a critical and watchful eye on its use. It's the only way we can live up to this fundamental challenge in the long term.
This is the point where I'll get a bit philosophical, which is kind of unavoidable when talking about the future 🙂. I believe humanity needs to achieve a few core goals to get a chance for long-term survival:
Most of these goals eventually boil down to these fundamental research categories:
It's hard to argue that IT doesn't play a fundamental role in any of this. I'd say none of this would even be possible without advances in computing power and software. Think about CERN. Think about cancer research. Think about climate modeling. Think about any kind of activity involving science and research. One way or the other, it's powered by IT.
While thinking about "How fast will computing evolve?", I've tried to paint an accurate picture of the spectacular advancements that are developing these days. Time to take one more step back and attempt to see an even bigger picture. All the advancements I talked about can be gathered under a big umbrella named "classical computing". Let's oversimplify the discussion and define classical computing as any kind of computing that eventually ends up working with two distinct states, 0 and 1.
Despite its awesome power and constant evolution, classical computing is severely limited in its capability of carrying us towards achieving the core goals I mentioned a few moments ago. There's an example used by John Azariah and Julie Love from Microsoft in their talk at Build 2018 which I absolutely love. They show the "well-known" caffeine molecule (well-known by folks in IT, that is 🙂), a small molecule whose chemical properties can be calculated by today's supercomputers. A slightly bigger molecule, like the FeMoco molecule, becomes practically unsolvable using the same computing power. The FeMoco molecule plays a central role in a fundamental natural process called nitrogen fixation (the conversion of atmospheric N2 into NH3, ammonia). It is also called "nature's fertilizer" because it is the cofactor of nitrogenase, the enzyme that catalyzes the process. Understanding the inner workings of the FeMoco molecule translates into the capability to spectacularly improve the process through which we currently produce fertilizers, a process which is one of the major sources of pollution. Unfortunately, studying the molecule with classical computing resources would literally take billions of years.
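To get an intuition for why classical computing hits a wall here, consider that exactly representing the quantum state of n two-level systems (qubits, or spin orbitals in a molecule) requires 2^n complex amplitudes. The Python sketch below turns that into rough memory figures; the orbital counts are illustrative placeholders I picked for the example, not the exact numbers from the Build 2018 talk:

```python
# Back-of-the-envelope: the exact quantum state of n two-level systems
# (qubits, or spin orbitals in a molecule) is a vector of 2**n complex numbers.
BYTES_PER_AMPLITUDE = 16  # one double-precision complex number

def state_vector_bytes(n: int) -> int:
    """Memory needed just to store the full state vector for n two-level systems."""
    return (2 ** n) * BYTES_PER_AMPLITUDE

# Illustrative orbital counts only, not the exact figures from the Build 2018 talk.
for label, n in [("caffeine-scale model", 40), ("FeMoco-scale model", 110)]:
    print(f"{label} ({n} orbitals): ~{state_vector_bytes(n):.3e} bytes")
```

Merely storing the exact state of a FeMoco-scale system would already dwarf all the storage on the planet (compare it with the 44 zettabytes mentioned earlier), which gives a feel for why brute-force classical simulation is out of the question.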
Enter quantum computing, which promises spectacular improvements in the computing power needed to study such natural processes. An oversimplified definition of quantum computing would say it is the kind of computing that eventually ends up working with an infinity of states between 0 and 1, embodied in a qubit. It is a simplistic definition because it doesn't speak about other quantum phenomena, like superposition and entanglement, which are also involved in quantum computing. More (much more 🙂) on this will follow in future posts. For now, let's just say that quantum computing is aligned with the way in which nature works in a way that classical computing never was and never will be. Because of this, it brings the promise of being a core catalyst in the process of achieving humankind's goals for long-term survival.
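For readers who want a slightly more precise picture of that "infinity of states between 0 and 1", the standard notation for a single qubit looks like this (this is textbook quantum computing notation, not something specific to the keynote):

```latex
% A qubit is a superposition of the two classical states |0> and |1>:
\[
  \lvert \psi \rangle = \alpha \,\lvert 0 \rangle + \beta \,\lvert 1 \rangle,
  \qquad \alpha, \beta \in \mathbb{C},
  \qquad \lvert \alpha \rvert^{2} + \lvert \beta \rvert^{2} = 1 .
\]
% Measuring the qubit yields 0 with probability |alpha|^2 and 1 with
% probability |beta|^2. Before measurement, (alpha, beta) can be any pair
% satisfying the constraint above: the continuum of states the text refers to.
```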
You've probably already heard how quantum computing has the capability of bringing the kind of computing power that will dwarf the combined capacity of all of today's datacenters on the planet. While this is certainly true for certain areas of applicability, it's also quite certain that it will not replace classical computing (at least not for the foreseeable future). I regard quantum computing as an enormously powerful companion to classical computing, one that will enable us to foray into areas that are currently out of reach. For other practical examples, take a look at this list of problems that could be solved by a quantum computer, published by Microsoft.
One can ask why I mentioned quantum computing here, instead of in the other sections (the question or the challenge). It is, after all, valid to ask how fast we will get to quantum computers that work at scale, or how to avoid misusing their potentially mind-bending power. Well, it's because I regard working quantum computing as a necessary condition that the IT of the future must meet to be a true enabler for humankind's long-term goals. Whether it's both necessary and sufficient remains to be seen.
So, this is my simplified take on the future of IT. In a nutshell, it's defined by one question, one challenge, and one expectation. How fast will computing evolve? Probably at the same vivid pace in the case of classical computing. Hopefully, fast enough to become viable, in the case of quantum computing. How can we avoid misusing the power of computing, especially in the fortunate case when quantum computing becomes viable? By educating young generations from the ground up to be aware of the pitfalls and to keep an eye on what's happening, no matter what their individual field of expertise is. Will IT help humanity advance, and ultimately, survive? A strong yes if quantum computing becomes viable, a question mark if it turns out we're stuck with classical computing only.