Accelerated Abstraction: The Inevitable Rise of Low-Code / No-Code Technology, and Beyond
Because who needs to understand how the Internet really works, anyway?
Since the personal computer was invented, abstraction away from the inner workings has been a part of the process. At first, you had to source and build your own hardware. Then, Steve Jobs and Steve Wozniak began creating kits that you could construct yourself - the Apple I. Even then, a purchaser of these machines “bought just the circuit board for the Apple I, but he had to supply the keyboard, monitor, transformer, and even the case in which to put the computer.” The Apple II, when it was introduced, included those other physical components, but the user was still responsible for programming the machine.
Early Abstractions
Then, operating systems began to be introduced to help abstract this portion of the computer away, eventually leading to famous iterations such as Bill Gates and Paul Allen’s introduction of MS-DOS and Steve Jobs commercializing the “graphical user interface,” or GUI, developed at the Xerox Palo Alto Research Center. Prior to the introduction of the GUI, interactions with the computer were all text-based and run from the command line (which still happens today, though it is generally reserved for highly technical “power” users).
As WIRED Magazine recounts the history, “a young man named Steve Jobs, looking for new ideas to work into future iterations of the Apple computer, traded US $1 million in stock options to Xerox for a detailed tour of their facilities and current projects. One of the things Xerox showed Jobs was the Alto, which sported a GUI and a three-button mouse. When Jobs saw this prototype, he had an epiphany and set out to bring the GUI to the public.”
It didn’t work right away, but by 1984, the Apple Macintosh became a revolution in the computer industry, and “It wasn't long before Microsoft, headed up by one Bill Gates, entered the GUI game.” We are largely living on the lineage of those decisions today - whether through our desktops and laptops, or through tablets and mobile phones. The underpinnings of these computing devices, however, have continued to move further and further away from the user - which, as we will see, creates both problems and opportunities.
Abstraction Through Centralization - or, Everything Old is New Again
Prior to the introduction of the personal computer, most computing was conducted on large, expensive, centralized machines - often called mainframes. While the personal computer famously put much of this power onto people’s desks (hence the term “desktop” computer), there were certainly computational tasks where desktops simply weren’t up to snuff.
This required larger, more powerful computers called “servers” - either for serving content (such as web pages or videos) up to other machines, or for being networked together to harness their collective processing or storage power. These servers, however, still carried physical requirements - they had to be purchased (and were not cheap) and installed in special racks, with particular power, cooling, and management needs.
Initially, these significant requirements for implementing servers pushed their use back into the categories of organizations that previously had mainframes (think universities and large corporations). Eventually, business models cropped up for providers to centralize the physical infrastructure (power, network connectivity, etc.) and offer “colocation” space, where you could simply rent a rack, or two, or ten for your servers in a shared data center. While this model continues today, the evolution of abstraction was marching on - and “virtual machines” were right around the corner.
Abstraction Goes Virtual, then as-a-Service
As the computing power of servers continued to increase, it became possible for these computers to run “virtual machines” - multiple distinct operating systems on a shared piece of hardware - allowing one physical server to fulfill the roles of several servers at once by dividing its hardware resources (processing, memory, networking, etc.) among its virtual machines.
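To make that sharing concrete, here is a minimal sketch - assuming a Linux host running QEMU/KVM with the libvirt Python bindings installed - that asks the hypervisor for the physical machine’s processors and for the virtual machines carved out of them:

```python
# A minimal sketch, assuming a Linux host running QEMU/KVM with the
# libvirt Python bindings (the "libvirt-python" package) installed.
import libvirt

# Connect to the local hypervisor that divides the physical server up
conn = libvirt.open("qemu:///system")

# The physical host whose hardware all of the guests share
model, _, cpus, *_ = conn.getInfo()
print(f"Physical host: {model}, {cpus} CPUs")

# Each "domain" is one virtual machine drawing on that shared hardware
for dom in conn.listAllDomains():
    state, max_mem_kib, _, vcpus, _ = dom.info()
    print(f"  VM {dom.name()}: {vcpus} vCPUs, {max_mem_kib // 1024} MB RAM")

conn.close()
```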
As with the abstraction steps and business models that came before, large organizations started by deploying their own instances of these virtual machines on existing servers, and other companies built business models around sharing the centralized resources across multiple customers. This concept is at the core of what is today referred to generally as “the Cloud,” though as we will soon see, it gets even…cloudier…from here, at least in terms of abstraction.
Arriving first in the form of “Software-as-a-Service,” or SaaS, but quickly growing to include Platform-as-a-Service and Infrastructure-as-a-Service, cloud providers pushed rapidly to abstract away the requirement for customers to deploy (and maintain) their own virtual machines, instead doing that for customers and dynamically managing the underlying compute and storage resources. Today, much of the modern Internet is built on this model, and Cloud Service Providers (or CSPs) are the backbone for most consumer-facing technologies. Even tech heavyweights such as Apple have decided it’s in their best business interest to spread their cloud deployment across the three major CSPs (Google, Amazon, and Microsoft) instead of building their own solution.
This move to shared computing began to introduce challenges for security and governance efforts in corporate America, where IT departments were no longer in direct control of both the software and hardware on which their business ran. You may have implemented a secure deployment of your own virtual machine, but risk could still be unknowingly introduced via vulnerabilities in the virtualization layer itself, or in software running in another virtual machine on the same physical host. These challenges intensified with the introduction of “as-a-Service” offerings, but no longer having to update, patch, and otherwise maintain the virtual machines has driven significant efficiencies in technology spend.
As it has always been in computing, however, the levels of abstraction continue, as do the challenges and opportunities.
Low-Code and No-Code
Low-Code and No-Code solutions (known colloquially in the industry as “LC/NC”) are lightweight software development platforms that use visual interfaces (not dissimilar to the earlier transition from the command line to the graphical user interface). Prior to the introduction of these platforms, companies and users still required developers - or development skills - to create software on top of the hardware platforms that had already been abstracted away into the cloud. Now, however, incredibly powerful software can be created without requiring much, if any, coding knowledge.
Make no mistake - I believe this evolution holds tremendous potential to deliver outsized value to businesses that can harness it, and will rapidly democratize the ability to create software that can automate routine tasks, accelerate repeatable work, rapidly prototype new ideas, and even potentially deliver a level of quality that we’ve grown used to requiring years of effort and millions of dollars.
LC/NC solutions push the value of the technology into the business logic, rather than the source code, and can leverage existing services and integrations to rapidly deliver capabilities that would previously have been out of reach for most. This “logic layer” represents the latest evolution of computing’s abstraction - where most users (even those building the technology) are no longer required to interact with the underlying technology such as servers, databases, etc. While many will rejoice at this evolution, there are others - particularly those in security, compliance, or governance roles - who will face unprecedented difficulties in maintaining a defensible posture moving forward.
Today, even if your business runs completely in the cloud, LC/NC solutions bring new challenges. The notion of “shadow IT” isn’t new to most enterprises - cloud solutions that can be purchased outside the normal procurement channels via credit card are common. Expand this to LC/NC solutions, where data flows to and from all sorts of different integrations and runs on all sorts of underlying cloud infrastructure, and an unsecured S3 bucket quickly becomes the least of the security team’s worries.
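For contrast, consider the kind of baseline check a security team can still script against traditional cloud infrastructure. Here is a minimal sketch using boto3 (assuming AWS credentials are already configured) that flags S3 buckets with no public access block - the sort of single-pane inventory that a sprawl of LC/NC integrations rarely offers:

```python
# A minimal sketch using boto3; assumes AWS credentials are configured.
# It flags buckets that have no public-access-block settings at all.
import boto3
from botocore.exceptions import ClientError

s3 = boto3.client("s3")

for bucket in s3.list_buckets()["Buckets"]:
    name = bucket["Name"]
    try:
        config = s3.get_public_access_block(Bucket=name)
        blocked = all(config["PublicAccessBlockConfiguration"].values())
        print(f"{name}: {'fully blocked' if blocked else 'partially open'}")
    except ClientError as err:
        if err.response["Error"]["Code"] == "NoSuchPublicAccessBlockConfiguration":
            print(f"{name}: no public access block configured - review this one")
        else:
            raise
```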
Expanding the Human Opportunity
From an opportunity perspective, however, these risks (which are largely present whether your technology stack is entirely on-prem and in-house or entirely in the cloud and outsourced) can be managed in such a way as to take advantage of the new-found potential of these solutions.
Partially due to my nature as an early adopter, and partially due to my nature as an optimist, I prefer to see the rise of LC/NC solutions as a way for businesses to capture the largest untapped resource of them all: employees’ tacit knowledge of how their work actually gets done. And, in fact, the more technology that’s involved in getting that work done, the more opportunity there is for value creation.
Imagine that instead of training the new hire to perform a job or set of tasks, that energy goes into building a single LC/NC solution. With a moderate amount of investment, a firm can create a small ecosystem of LC/NC solutions whose collective impact grows beyond that of any individual solution, because they can be chained together in ways that make the work flow better and faster, with fewer errors and greater sustainability. While there will be costs to build and run these LC/NC applications, they don’t need retirement contributions, healthcare, or regular raises. The wide-scale deployment of these platforms should allow many knowledge workers to rapidly increase their position in the value chain (and, along the way, the value they deliver for their company and capture for themselves in terms of salary).
But what happens to my DevOps teams?
It is entirely possible that the rise of LC/NC tools will cause a splintering of the developer community. In the past decade or so, DevOps and now DevSecOps have gained steam - the idea that development and operations go hand-in-hand, with security eventually added to the mix. It’s easy to quip that “the team that built it, runs it,” but the reality is that the role developers fill for many businesses will no longer be needed. As a result, I can see roughly three paths forward for today’s developer community:
Specialize and continue as a traditional developer. There will always be a need for custom development solutions, but they are often found in niches that require knowledge of a particular technology, programming language, or operating environment. For many developers, particularly those who are already somewhat or mostly specialized, this will be the preferred path.
Move up the abstraction stack and support LC/NC deployment. The truth is that the “Low Code” name is still a bit optimistic. These platforms rely heavily on scripting and API technology that can be beyond the grasp of many business unit users (see the short sketch after this list). The opportunity is there for developers with soft skills to serve as enablers for the deployment of these next-generation LC/NC application platforms.
Move down the abstraction stack and become an engineer. Just because the abstraction pattern continues doesn’t mean that the need for servers, storage, compute, virtual machines, and other infrastructure components goes away. In fact, it increases, but so do the levels of technical aptitude required to keep it all running. Developers who want to pivot more towards the networking, operations, and engineering side of things have an opportunity to move down the stack and stay relevant.
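As an illustration of the glue work referenced in the second path, here is a hypothetical sketch of the sort of scripting an LC/NC rollout still tends to need - the endpoint URLs and field names are placeholders, not any particular platform’s API:

```python
# A hypothetical example of the "glue" scripting LC/NC platforms often
# still require: pull records from one REST API and post a summary to a
# webhook. The URLs and field names below are illustrative placeholders.
import requests

ORDERS_API = "https://example.com/api/orders"   # hypothetical source system
WEBHOOK_URL = "https://example.com/hooks/team"  # hypothetical destination

# Fetch all orders, then keep only the overdue ones
orders = requests.get(ORDERS_API, timeout=10).json()
overdue = [o for o in orders if o.get("status") == "overdue"]

# Notify the team channel with a one-line summary
requests.post(
    WEBHOOK_URL,
    json={"text": f"{len(overdue)} overdue orders need attention"},
    timeout=10,
)
```

Wiring even a simple flow like this together is exactly where developers with strong communication skills can add value inside business units.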
Where Abstraction Takes Us
The pattern of continued abstraction has held nearly as true as Moore’s Law, though it’s a bit harder to measure. The next waves of abstraction will likely focus on ways of interacting with computers that are already here, but still a touch clunky (think voice interfaces like Amazon’s Echo or Apple’s Siri). These would abstract away the screens, keyboards, mice, and many other traditional interaction points of a computer, though they won’t work for every computing need.
Screens may get smaller and more portable, be it through glasses or other “augmented reality” (AR) or virtual reality (VR) headsets. These evolutions, though, will happen incrementally. Apple is already putting a tremendous amount of effort into AR solutions based on the iPhone (including building LiDAR scanners into newer models), and will no doubt be iterating from there.
Abstraction in computing, however, continues to move computers closer to the human - rather than the other way around. Each step in the abstraction has resulted in computers that are easier to use, more approachable, and more widely used. Expect this trend to continue to the point where the computer is nearly abstracted away, and the focus can be on solving whatever uniquely human problem is being attended to. It just might not happen any time soon.