Cloud computing was once a fringe concept. Although its predecessors, such as thin clients and time-sharing systems, have been around for decades, cloud computing really took off with the advent of Amazon Web Services (AWS) and Microsoft Azure in the late 2000s. It is now so mainstream that a core Apple service backed by AWS and Azure assets – iCloud – has “cloud” in its name.
On the business side, growing adoption of cloud solutions following the Infrastructure-, Platform-, and Software-as-a-Service models has rapidly eaten into the share of IT spend going toward traditional hardware and software:
The 2018 Spiceworks State of IT survey found that cloud services would account for 21 percent of the typical IT budget in the coming year, rivaling the 26 percent for software and the 31 percent for hardware. Technologist Robert Cringely estimated that by the end of the 2010s, all forms of cloud computing combined could consume a majority of the roughly $1 trillion in annual IT-related expenditures in the U.S. IT research firm Gartner made similar projections, forecasting that the public cloud services market would grow 18 percent year over year to almost $250 billion in 2017. What does this mean for IT professionals?
How does the cloud computing tipping point compare with earlier shifts in IT?
Corporate IT can be thought of as having evolved through seven key periods, much as life itself has evolved over time. These eras are:
1. BATCH COMPUTING
Batch computing relied on punch cards – a technology dating back to the late 19th century – and, by the mid-20th century, programming languages like FORTRAN. This type of computing was not interactive, but it had the advantage of processing large amounts of data stored on tape or cards.
2. TIMESHARING
Timesharing, developed and introduced in the 1950s and 1960s, was a proto-cloud: it allowed multiple users to simultaneously access a central CPU with enough resources to support all of them at once.
3. HOME/PERSONAL COMPUTERS
Early PCs were driven by command-line interfaces, which could execute many tasks efficiently given the right commands but weren’t very user-friendly. Their evolution eventually led to a fourth major era, characterized by…
4. …GUIS
A GUI (graphical user interface) is a layer that overlays the operating system and provides a simple, intuitive way to navigate: instead of typing commands, you click or tap icons to open applications or perform actions. GUIs are standard in all major operating systems (OSes), such as Microsoft Windows and Apple macOS.
5. INTERNET PROTOCOL (IP) NETWORKS
The 1990s saw the rise of the World Wide Web, which did for the internet what GUIs had done for the PC: it made the technology dramatically easier to use. The internet and IP networking drove a shift away from native on-machine apps toward websites.
6. MOBILE
Since the introduction of the iPhone in 2007, PC-grade operating systems have been slimmed down to run on IP-enabled mobile hardware such as smartphones and tablets. These devices are now the main way many people access the internet, and their computing power is often comparable to a laptop’s.
“Cloud is deeply interwoven with today’s OSes and applications.”
7. CLOUD
Cloud is now so deeply integrated with modern OSes, apps, and websites that many of them are essentially interfaces to cloud resources behind the scenes. Apps from Office 365 to Instagram have relied on AWS or Azure at some point.
Unlike early virtual reality headsets or 1980s artificial intelligence, cloud was a “next big thing” that actually worked out. It will be a key skill for IT professionals in virtually any role.
New Horizons Computer Learning Centers offer many cloud courses, including one in Azure. Find a location near you to view the course listings and learn more.
