The History and Development of Cloud Computing
Editor’s note: This is a guest blog post.
Cloud computing and other internet-based services continue to develop rapidly, though it hasn’t exactly been a direct path to get to where we are now. The current state of the industry may seem obvious and inevitable when we look back, but just a short time ago it would have been hard to guess that this is where things were actually going.
The history of cloud computing has gone through a number of major changes that have made it more accessible and affordable. Like many other things, though, it’s important to understand where it’s been to make any sort of guess at where it’s going.
The current state of cloud computing rests on a strong internet backbone, but that isn’t how it started or where it ends. The private cloud is now an important part of many business IT infrastructures, making elements like virtualization and service-oriented architecture even more important. If we look at the development of the cloud over the years, it is easier to see why the cloud is such an integral component of modern IT solutions.
The Principles of Cloud Computing
The images of serious computing in the 50s and 60s – those pictures of row upon row of magnetic tape machines – actually foreshadow the structure of cloud computing. Even then, companies were using many machines together to provide more power than a single unit could and, on top of that, allowing more than one user to access the same assets.
Back in the 50s, those giant mainframes were installed in schools, government organizations, and large corporations because those were the only places that could possibly house them. Even then, purchasing multiple mainframes was inconceivable for most organizations, so it became normal practice to use “dumb terminals” that allowed multiple people to access the same shared resources. This is the same principle behind modern-day virtualization, which put us on the path toward cloud computing.
Early History – Virtual Machines
The real implementation of virtual machines came in the 70s when IBM released an operating system called VM. It allowed multiple distinct virtual machines to reside in the same physical processing environment, leading to the type of interaction we now call virtualization. In basic terms, each individual user appeared to have a machine with its own memory, processor, and other hardware components, while many of the underlying resources were actually shared with others.
This type of “group computing” showed companies that they could start adding network solutions without actually increasing their hardware infrastructure. It was all about provisioning the resources they already had, shifting traffic as necessary, and balancing the load on the network and bandwidth to provide better services to their customers.
The Middle Ages – The Internet Potential
Telecommunications solutions were an integral part of cloud development, and this became possible with the commercialization of the internet. The network on which it is based, though, goes back to the 60s, when J.C.R. Licklider’s ideas helped drive the development of ARPANET (Advanced Research Projects Agency Network). This would eventually become the forerunner of the modern internet.
The notion of connecting people all over the world to access programs and data from different locations became a real possibility. By the 70s, people were really delving into the potential suggested by those first experiments in the 60s. In 1971, for example, the first email was sent, and the U.S. Department of Defense continued developing ARPANET into the internet.
In 1979, CompuServe Information Services and The Source both went online, showing that it was possible for commercial providers to host online services. Still, it wasn’t until 1993 that the Mosaic browser made the internet far more graphical – something that the average user could manage. It was soon after that when Netscape launched, and then, in 1995, both Amazon and eBay appeared.
Industrial Revolution – Affordable Computing
Part of the reason for the gap between 1979 and 1993 was that computers were still not affordable or compact enough for people to have in their homes or for companies to outfit their entire staff. The 80s saw the biggest boom in computers, with IBM putting out a range of affordable personal computers and Microsoft pushing its operating system out on a large scale.
Then, in the 90s, there was finally sufficient bandwidth to bring the internet to the masses, which meant that all those companies that had outfitted their staff with computers now had a valid way to connect them all. Without this kind of high-speed bandwidth and software interoperability, this type of connected computing would never have worked.
Modern History – Service-Oriented Architecture
The rise of commercial networking wasn’t easy, and once the first bubble burst in 2000, companies had to start rethinking their business models. The lesson learned was that no matter how much money investors threw at you, you still needed a solid business plan to survive in the long run.
In the search for new ways to monetize the internet, many companies started to realize that they could use a service model to deliver usable solutions and resources. Salesforce.com really started this trend in 1999 by pioneering the concept of delivering enterprise-class applications over a simple website.
Next, in 2002, Amazon got on board the trend with Amazon Web Services, which gave users the ability to access storage, computation, and other services over the internet. In 2006, Amazon went further with the Elastic Compute Cloud (EC2), which basically let developers rent compute capacity on Amazon’s infrastructure to store and run their own apps. It was an entire infrastructure delivered as a service.
By 2009, most of the industry influencers were on board, with companies like Microsoft and Google delivering apps to the average consumer as well as businesses in the form of simple, accessible services.
Owning Your Cloud
The ubiquity of cloud computing has led to an environment in which companies don’t have to go to third parties to take advantage of this resource. The technology has developed to the point that organizations can effectively deploy their own private or hybrid clouds, rather than rely on public clouds, which can potentially increase performance and reduce certain costs.
More importantly, a private cloud deployment gives the IT team more visibility into the back end of their system, which is particularly useful for companies that are extremely security conscious and require direct oversight on all their assets.
Private cloud deployments are becoming more prevalent because they offer a lot of the same cost and convenience benefits, and they support various platforms while allowing the organization to maintain more control. The road to get to this point has been a long one, and while it may be difficult to predict exactly what the future holds, there are currently a lot of benefits to provisioning resources over a safe, secure, private network.
How do you use the cloud in your business? How do you think the technology will evolve over the next few years?