A few thoughts on the future of IT

The Keynote for Codecamp Timisoara 2018

Posted by Ciprian on October 06, 2018

Every now and then, I spend some time thinking about the more philosophical aspects of IT, trying to see the bigger picture. I find the attempt to break from the complexity and detail of my daily routine extremely useful for my personal development. I am a firm believer in the idea that surviving and then thriving in modern IT requires a lot of looking beyond the horizon.

My dear friends from Codecamp invited me to do a keynote at their conference in Timisoara, and we agreed it was a good idea to build the talk around my thoughts about the future of IT. Attempting to distill these thoughts into a 45-minute session was a little bit harder than I initially envisioned 😊 (the slide deck is here). This article is a follow-up to the keynote, providing a written version of these thoughts. Just to make things clear, this is not an attempt at an exhaustive factual analysis of short-, medium-, and long-term developments in our industry. It is merely a short inventory of things I'm thinking about which you, my reader, will hopefully find interesting.

Turns out my entire vision about the future of IT can be synthesized around a question, a challenge, and an expectation. The question is “How fast will computing evolve?”. The challenge is “avoid misusing the power of computing”. The expectation is to “help humanity advance, and, ultimately, survive”. In case you find the whole text simply too long, let me state the conclusion right here:

My simplified take on the future of IT? In a nutshell, it’s defined by one question, one challenge, and one expectation. How fast will computing evolve? Probably at the same vivid pace in the case of classical computing. Hopefully, fast enough to become viable, in the case of quantum computing. How can we avoid misusing the power of computing, especially in the fortunate case when quantum computing becomes viable? By educating young generations from the ground up to be aware of the pitfalls and to keep an eye on what’s happening, no matter what their individual field of expertise is. Will IT help humanity advance, and ultimately, survive? A strong yes if quantum computing becomes viable, a question mark if it turns out we’re stuck with classical computing only.

Now, if you’re still interested, let’s get into one more level of detail.

How fast will computing evolve?

Back in my college years (some 20 years ago), the center of the computing universe was the CPU. We were of course looking at graphics cards as well, but only to get those extra frames in the games (we’re still doing that, by the way 😊). I was in my very early days of Machine Learning, and one of the first things I learned was that the class of algorithms commonly referred to as neural nets was not of particularly great interest, as it had supposedly reached most of its potential. Almost nobody back then imagined that, fast forward 20 years, the humble GPU core would be on the cutting edge of spectacular advancements in deep learning. And this is already somewhat old news. Today we’re talking about things like Soft DPUs (mostly FPGAs – Field Programmable Gate Arrays) and Hard DPUs (like TPUs – Tensor Processing Units). And we’re also seeing more and more ASICs (Application Specific Integrated Circuits). While the all-purpose CPU is evolving as well (mostly in the direction of multi-core), we’re seeing more and more of our software moving onto high-performance, dedicated hardware. This is the modern reincarnation of Moore’s law, which (at least in this form) seems quite alive and well.

And then there is data. According to a study performed a few years ago, by 2020 we’re going to pass the worldwide mark of 44 zettabytes (44 trillion GB), up from a “mere” 4.4 zettabytes in 2013. It doesn’t even matter how accurate the study will prove to be. By now (2018) it’s quite clear that at least the magnitude of the increase is aligned with reality. There is no end in sight here. It sure looks like in the era of cloud computing, the race to build bigger, faster, and cheaper storage is just gearing up.
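
As a quick back-of-the-envelope sanity check, here is the annual growth rate implied by those two data points (a tenfold increase over seven years):

```python
# Implied compound annual growth rate (CAGR) of worldwide data,
# from 4.4 ZB in 2013 to a projected 44 ZB in 2020.
ZB_2013, ZB_2020 = 4.4, 44.0
years = 2020 - 2013

cagr = (ZB_2020 / ZB_2013) ** (1 / years) - 1
print(f"Implied growth: {cagr:.1%} per year")  # roughly 39% per year
```

In other words, the study assumes data volume growing by roughly 39% every single year, which gives a feel for just how aggressive that projection is.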

The inevitable price we’re paying for this super-fast pace is complexity. I like to show my audiences an image with the standard advanced analytics pattern in Microsoft Azure (slide 9). It’s a bit scary because it’s complex, has lots of moving parts, and almost every single pictogram in it can be the topic of days-long workshops. And, most importantly, there’s nothing wrong with it. This is reality. In fact, if you replace the name Azure with, say, Amazon, and the service names with their counterparts, you get pretty much the same result. A realistic, sane, and utterly complex architecture. Move down one level from architecture to programming, and you see the same level of complexity. For the sake of simplicity (no pun intended), let’s narrow the discussion to the realm of data science only. The number of choices is quite daunting. A rich selection of libraries is available with overlapping capabilities and different update and maintenance cycles, some backed by industry giants and some simple open-source projects with a few contributors. Combining all this into a sound DevOps process is a task even the most seasoned managers fear. And when you add into the equation the fact that the data science part must be seamlessly integrated with other, classical areas (like user experience, backend APIs, and the like), the complexity of the task fully reveals itself.

This is not a characteristic of data science or data science-related projects only. There are many voices across the industry complaining about how complex (and sometimes apparently inefficient) software development in general still is. The truth is, the more we delve into a world of cloud computing, artificial intelligence, and IoT, the more complexity we’ll get. There is complexity in platforms, in features, and in tools. Simply put, there is complexity in choices. One of the most important challenges a person faces in modern IT is learning to cope with complexity.

Speaking about the human factor, the reality is we’re not capable of producing enough of this resource to feed the ever-increasing needs of modern IT. In data science alone, most studies agree that by the mid-2020s there will be a shortage of skilled personnel exceeding 200,000 professionals worldwide (some studies claim this number for the US alone). For some reason, the education arm of our society does not have the capability of coping with the speed at which things happen in the industry. One can only imagine that if this speed further increases, the pressure on education will increase as well. The question then remains – what to do about the increasing gap? One other interesting side of this is how this problem applies to people already working in the field. They face the same problem, with almost the same intensity. How do I keep up with the pace? How do I make sure my knowledge does not quickly become obsolete? How do I cope with all the choices? In a world of instantly available compute resources, DevOps, and release cycles measured in weeks, virtually everybody faces these questions, which can be summed up in a simple way – How do I adapt?

These spectacular advancements in computing power and software are finally enabling us to make some progress in the way we interact with computers. For decades, we’ve been stuck in the initial paradigm of looking at rectangular screens. Even the mighty smartphones of the last decade are powerful prisoners of this paradigm. We’re just reaching decent levels of quality with Virtual Reality (VR), and we’re scratching the surface of Mixed Reality (MR). The devices are still bulky and the limitations are significant. Yet, once you spend a few tens of minutes with Microsoft’s HoloLens, for example, you realize both the potential and the disruption to come. If the speed at which computing evolves continues to increase, it’s safe to assume that in a decade or two, the ways we interact with computers will improve significantly.

Obviously, these are only a few aspects that are tied to the core question – How fast will computing evolve? You can both challenge the way I see them and think about many more. This does not change the core of the discussion though. The pace at which computing in general will evolve will determine the evolution of all the aspects I mentioned: complexity, limitations of the human factor, and interaction patterns. That’s the reason I ended up with this question as one of the three pillars of my vision about the future of IT.

Avoid misusing the power of computing

Humanity has a long history of being unable to cope with the big advancements of science and technology. Don’t get me wrong, I’m a true believer in and an ardent supporter of the critical need for such advancements. At the same time, I am trying to be as objective and realistic as possible. In my opinion, there is an intrinsic connection between technological advancement and conflict throughout our entire history.

Before moving on, let me make an important note. I consider misusing the power of computing a challenge, for obvious reasons. Properly using the power of computing falls under the category of expectations rather than challenges. That is why I am focusing more on the misuse (which, by its very nature, is a negative approach). It’s not because I’m a pessimist who sees nothing but trouble around. On the contrary, I am a highly optimistic person in general. With respect to IT, I firmly believe it is one of the foundational domains that can carry us humans into the next millennia of our existence, helping us to survive and thrive. Having said that, I believe we need to be realistic and, most importantly, watch and learn from our history.

I consider the technological advancements of the past ten years to have the same order of magnitude as Einstein’s famous E=mc². They might not be as groundbreaking from a purely scientific point of view, but they surely have a comparable aggregated impact on humanity, if not a bigger one. They’ve already changed our social behavior, significantly extended the range of threats we’re exposed to, and opened countless new doors to those who seek to do harm. And they have brought immense power to our fingertips.

Whether it’s our privacy, our safety, or our money, they are all under potential threat from the misuse of technology. The number one threat I see is lack of awareness. Most technology consumers today are almost completely ignorant of these threats. The countless public stories about leaked accounts and passwords are generating some awareness, but it’s way too little. I believe we still need to do a lot in this area. And it must start with educating our little ones about the pitfalls of using technology. If you ask me, this kind of education has become at least as important as teaching speaking, writing, or math.

When it comes to misusing the power of computing, the core responsibility is shared between technology providers, governments, and public opinion. The ones that create technology and the ones that have the power to regulate its use are in the driver’s seat, while public opinion is critical to keeping the checks and balances in place. The harsh reality is that for both technology providers and governments it is quite difficult to live up to the expectation. That’s why the balance is so fragile and delicate. And that’s why we see live examples of technology misuse almost every day.

Being a commercial market player, a technology provider will always be inclined to use any means to keep and improve its market position. And if it’s so successful that it either creates a market of its own to dominate or disrupts an existing market to the point that it dominates it, the inclination is that much more powerful. Add to this the inevitable loopholes that allow others to misuse its technology, and you get a comprehensive picture of the high-level threats. Unfortunately, there are not many global technology providers that publicly embrace a mission that includes fighting those inclinations. Being a public example in this area is one of the reasons I respect and admire Microsoft’s CEO, Satya Nadella.

Governments, on the other hand, have a natural tendency toward gaining more grip and more control. The power of IT fits like a glove for those within governmental structures who seek ever more control, sometimes even at the price of severely invading the privacy of their own citizens and of other governments’ citizens. Fortunately, both commercial market players and governments are subject to the scrutiny of public opinion, which puts, to a certain extent, the brakes on these tendencies. There is, though, another area: the somewhat gray area of technology providers that work for governments or under their jurisdiction. This category of providers is mostly shielded from the scrutiny of public opinion. Avoiding severe misuse of technology in this case eventually boils down to the individual responsibility of the people placed in key positions within those organizations. Consequently, it’s a very high-risk bet, especially because this gray area exists in every single state on the globe, be it a democracy, a non-democracy, or even a rogue state.

To make things very real and tangible, I like to give three categories of examples:

  • Machine learning adversarial attacks in image recognition, which build on the core difference between how the human eye perceives an image and how a machine learning model does. The immediate “application” of this type of attack is hacking road signs to fool self-driving cars, but there are clearly many more. OpenAI provides some interesting examples.
  • Social attacks on high-profile public platforms, ranging from identity theft up to alleged meddling with the core process of democracy – elections. The Cambridge Analytica case and the recent US elections are some of the most prominent examples here.
  • Attacks on platforms involved in money-related transactions. Bitcoin theft is the poster child of this phenomenon.
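
To make the first category a bit more concrete, here is a deliberately tiny sketch of the idea behind gradient-based adversarial attacks (in the spirit of the fast gradient sign method). The “model” is a hypothetical linear scorer, not a real image-recognition network – the point is only to show how a barely visible per-pixel nudge can drastically change a model’s output:

```python
import numpy as np

# A toy linear "classifier": a positive score means, say, "stop sign".
rng = np.random.default_rng(0)
w = rng.normal(size=64)   # the model's weights
x = rng.normal(size=64)   # a "clean" input (think: a flattened image)

# For a linear model, the gradient of the score w.r.t. the input is
# simply w, so the worst small perturbation (per-pixel change capped
# at epsilon) is to move each pixel against the sign of the gradient.
epsilon = 0.3
x_adv = x - epsilon * np.sign(w)

print(f"clean score: {w @ x:+.2f}, adversarial score: {w @ x_adv:+.2f}")
print(f"max per-pixel change: {np.max(np.abs(x_adv - x)):.2f}")
```

Even though no single pixel moves by more than 0.3, the score drops sharply – the same asymmetry between human and machine perception that makes hacked road signs possible.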

For sure, the list can go on and on. The more we make our daily lives dependent on technology, the longer the list gets. To live up to the daunting challenge of avoiding the misuse of the power of computing, we need to focus on two areas: how we build technology and how we watch over its use. In both cases, education must play a fundamental role. The new generations of IT specialists must be educated from the ground up with a mindset that also focuses on safety and security. The new generations of citizens must be educated to understand the pitfalls and threats of technology misuse and to keep a critical and watchful eye on its use. It’s the only way we can live up to this fundamental challenge in the long term.

Help humanity advance, and, ultimately, survive

This is the point where I’ll get a bit philosophical, kind of unavoidable when talking about the future 😊. I believe humanity needs to achieve a few core goals to get a chance for long term survival:

  • Harness efficient sources of energy (nuclear fusion sure seems a good candidate)
  • Avoid starvation caused by planetary overpopulation, irreversible habitat degradation, and resource depletion
  • Avoid suicide by uncontained global warfare
  • Advance rapidly in medicine to be prepared to fight potential global pandemics
  • Become capable of efficient deep-space travel
  • Develop working countermeasures against threats from outside Earth (no, I’m not thinking here about the little green men, I’m rather thinking about the mile-wide metallic asteroid)

Most of these goals eventually boil down to these fundamental research categories:

  • Understand physical phenomena at different levels (molecular, atomic, sub-atomic)
  • Develop new substances and new materials

It’s hard to argue that IT doesn’t play a fundamental role in any of this. I’d say none of it would even be possible without advances in computing power and software. Think about CERN. Think about cancer research. Think about climate modeling. Think about any kind of activity involving science and research. One way or the other, it’s powered by IT.

While thinking about “How fast will computing evolve?”, I’ve tried to paint an accurate picture of the spectacular advancements unfolding these days. Time to take one more step back and attempt to see a bigger picture. All the advancements I talked about can be gathered under a big umbrella named “classical computing”. Let’s oversimplify the discussion and define classical computing as any kind of computing that eventually ends up working with two distinct states, 0 and 1.

Despite its awesome power and constant evolution, classical computing is severely limited in its capability of carrying us toward the core goals I mentioned a few moments ago. There’s an example used by John Azariah and Julie Love from Microsoft in their talk at Build 2018 which I absolutely love. They show the “well-known” caffeine molecule (well-known by folks in IT, that is 😊), a small molecule whose chemical properties can be calculated by today’s supercomputers. A slightly bigger molecule like FeMoco becomes practically unsolvable using the same computing power. The FeMoco molecule plays a central role in a fundamental natural process called nitrogen fixation (the conversion of atmospheric N2 into NH3 – ammonia). It is also called “nature’s fertilizer” because it is the cofactor of nitrogenase, the enzyme that catalyzes the process. Understanding the inner workings of the FeMoco molecule translates into the capability to spectacularly improve the process through which we currently produce fertilizers, a process which is one of the major sources of pollution. Unfortunately, studying the molecule with classical computing resources would literally take billions of years.
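
The reason classical computing hits a wall here is simple arithmetic: exactly representing the quantum state of n two-level systems (spin orbitals, or qubits) takes 2ⁿ complex amplitudes, so memory requirements double with every system added. The orbital counts below are illustrative round numbers, not figures from the Build 2018 talk:

```python
# Memory needed to hold a full quantum state vector of n two-level
# systems in classical memory: 2**n complex amplitudes.
BYTES_PER_AMPLITUDE = 16  # one double-precision complex number

def state_vector_bytes(n: int) -> int:
    return (2 ** n) * BYTES_PER_AMPLITUDE

for n in (30, 50, 100):
    gib = state_vector_bytes(n) / 2 ** 30
    print(f"{n:3d} two-level systems -> {gib:.3e} GiB")
```

Thirty systems already need 16 GiB; a hundred would need more bytes than there are atoms we could plausibly turn into memory. That exponential wall, not a lack of engineering effort, is what puts molecules like FeMoco out of classical reach.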

Enter quantum computing, which promises spectacular improvements in the computing power needed to study such natural processes. An oversimplified definition of quantum computing would say it is the kind of computing that eventually ends up working with an infinity of states between 0 and 1, embodied in a qubit. It is a simplistic definition because it doesn’t speak about other quantum phenomena, like superposition and entanglement, which are involved in quantum computing as well. More (much more 😊) on this will follow in future posts. For now, let’s just say that quantum computing is aligned with the way nature works in a way that classical computing never was and never will be. Because of this, it brings the promise of being a core catalyst in the process of achieving humankind’s goals for long-term survival.
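
A minimal sketch of that “infinity of states” idea, using nothing but a state vector: a qubit is a normalized pair of complex amplitudes, and measurement yields 0 or 1 with probabilities given by the squared magnitudes. The Hadamard gate below is the standard way to put |0⟩ into an equal superposition:

```python
import numpy as np

# A qubit state is a normalized pair of complex amplitudes (alpha, beta):
# measuring gives 0 with probability |alpha|^2 and 1 with |beta|^2.
zero = np.array([1, 0], dtype=complex)  # the |0> basis state

# The Hadamard gate maps |0> to an equal superposition of |0> and |1>.
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
state = H @ zero

probs = np.abs(state) ** 2
print(f"P(0) = {probs[0]:.2f}, P(1) = {probs[1]:.2f}")  # 0.50 each
```

Continuously varying those two amplitudes is exactly the “infinity of states between 0 and 1” from the definition above; superposition and entanglement then emerge once you combine several such state vectors.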

You’ve probably already heard how quantum computing has the capability of bringing the kind of computing power that will dwarf the combined capacity of all of today’s datacenters on the planet. While this is certainly true for certain areas of applicability, it’s also quite certain that it will not replace classical computing (at least not for the foreseeable future). I regard quantum computing as an enormously powerful companion to classical computing, one that will enable us to foray into areas that are currently out of reach. For other practical examples, take a look at this list of problems that could be solved by a quantum computer, published by Microsoft.

One can ask why I mentioned quantum computing here, instead of in one of the other sections (the question or the challenge). It is, after all, valid to ask how fast we will get to quantum computers that work at scale, or how to avoid misusing their potentially mind-bending power. Well, it’s because I regard working quantum computing as a necessary condition that the IT of the future must meet to be a true enabler for humankind’s long-term goals. Whether it’s both necessary and sufficient remains to be seen.

So, this is my simplified take on the future of IT. In a nutshell, it’s defined by one question, one challenge, and one expectation. How fast will computing evolve? Probably at the same vivid pace in the case of classical computing. Hopefully, fast enough to become viable, in the case of quantum computing. How can we avoid misusing the power of computing, especially in the fortunate case when quantum computing becomes viable? By educating young generations from the ground up to be aware of the pitfalls and to keep an eye on what’s happening, no matter what their individual field of expertise is. Will IT help humanity advance, and ultimately, survive? A strong yes if quantum computing becomes viable, a question mark if it turns out we’re stuck with classical computing only.