The concept of cloud computing as a platform for distributed computing traces its roots back to 1993, when Apple spin-off General Magic and AT&T used the term in the context of their Telescript and Personal Link technologies.[1]
In an April 1994 feature by Wired, titled "Bill and Andy's Excellent Adventure II", Andy Hertzfeld elaborated on Telescript, General Magic's distributed programming language. He described the expansive potential of the cloud:
The beauty of Telescript ... is that now, instead of just having a device to program, we now have the entire Cloud out there, where a single program can go and travel to many different sources of information and create a sort of a virtual service. No one had conceived that before. The example Jim White [the designer of Telescript, X.400 and ASN.1] uses now is a date-arranging service where a software agent goes to the flower store and orders flowers and then goes to the ticket shop and gets the tickets for the show, and everything is communicated to both parties.[2]
In 1963, the Defense Advanced Research Projects Agency (DARPA) funded Project MAC, the first computer time-sharing system.[3] During the 1960s, the initial concepts of time-sharing became popularized via Remote Job Entry (RJE);[4] this terminology was mostly associated with large vendors such as IBM and DEC. Full time-sharing solutions were available by the early 1970s on such platforms as Multics (on GE hardware), Cambridge CTSS, and the earliest UNIX ports (on DEC hardware). Yet the "data center" model, in which users submitted jobs to operators to run on IBM mainframes, remained overwhelmingly predominant.
The invention of the World Wide Web in the late 1980s led to internet expansion and the growth of on-premises data centers.[5] In the 1990s, telecommunications companies, which had previously offered primarily dedicated point-to-point data circuits, began offering virtual private network (VPN) services with comparable quality of service at a lower cost. By switching traffic as they saw fit to balance server use, they could use overall network bandwidth more effectively.[6] They began to use the cloud symbol to denote the demarcation point between what the provider was responsible for and what users were responsible for. Cloud computing extended this boundary to cover all servers as well as the network infrastructure.[7]
As computers became more widespread, scientists and technologists explored ways to make large-scale computing power available to more users through time-sharing.[6] They experimented with algorithms to optimize the infrastructure, platform, and applications, to prioritize tasks to be executed by CPUs, and to increase efficiency for end users.[8] At the same time, Application Service Providers (ASPs) became popular and later evolved into Software as a Service (SaaS).[9] In 1999, Medidata launched Rave, the first electronic data capture software for clinical data.[10]
The use of the cloud metaphor for virtualized services dates at least to General Magic in 1994, where it was used to describe the universe of "places" that mobile agents in the Telescript environment could go.
"The beauty of Telescript," says Andy, "is that now, instead of just having a device to program, we now have the entire Cloud out there, where a single program can go and travel to many different sources of information and create a sort of a virtual service."[11]
The use of the cloud metaphor is credited to General Magic communications employee David Hoffman, based on long-standing use in networking and telecom. In addition to use by General Magic itself, it was also used in promoting AT&T's associated Personal Link Services.[12]
In 2002, Amazon established its subsidiary Amazon Web Services, which allows developers to build applications independently.[13][14]
In 2006, Amazon introduced Simple Storage Service (S3) in March and Elastic Compute Cloud (EC2) in August. These services were among the first to use server virtualization to provide infrastructure as a service (IaaS) on a pay-as-you-go basis. In the same year, Google launched Google Docs, a SaaS application for editing and saving documents online.
In 2007, Netflix launched its online video streaming service, the first streaming site delivered as SaaS.[15] Also that year, IBM and Google partnered with several universities (the University of Washington, Carnegie Mellon University, MIT, Stanford, the University of Maryland, and UC Berkeley) to create a research server farm.[16] This project would later become the Cluster Exploratory program when the National Science Foundation funded it in early 2008.[17]
In April 2008, Google released the beta version of Google App Engine, a PaaS that provides a fully managed infrastructure and platform for users to create web applications.[18][19] In mid-2008, Gartner noted the potential for cloud computing to reshape the relationship between IT service consumers, users, and providers.[20]
In early 2009, NASA's Nebula became the first open-source software for deploying private and hybrid clouds.[21] Later that year, the French government announced the Andromède Project to establish a national cloud computing service, committing €285 million to the initiative.[22][23] The initiative ultimately failed, leading to the shutdown of Cloudwatt on 1 February 2020.[24][25]
In February 2010, Microsoft launched Microsoft Azure, which had been announced in October 2008.[26] Five months later, Rackspace Hosting and NASA initiated an open-source cloud-software project, OpenStack, which aimed to help organizations offer cloud-computing services on standard hardware. The early codebase came from NASA's Nebula platform and Rackspace's Cloud Files platform.[27][28]
In March 2011, IBM introduced the IBM SmartCloud framework, designed to support the Smarter Planet initiative.[29] Later that year, the US government established the Federal Risk and Authorization Management Program (FedRAMP), the first government-wide cloud services accreditation program with standardized risk assessment methodologies for cloud products and services. On October 12 of that year, Apple launched iCloud, which allows users to store personal information across multiple devices and share it with other users.[30]
On June 7, 2012, Oracle announced the Oracle Cloud.[31] In May 2012, Google Compute Engine was released in preview; it was subsequently rolled out into general availability in December 2013.[32] Also in 2013, Docker launched as a PaaS for hosting containers in the cloud for software development.[33]
In December 2019, Amazon launched AWS Outposts, a service that extends AWS infrastructure, services, APIs, and tools to customer data centers, co-location spaces, or on-premises facilities.[34]
Since the global COVID-19 pandemic of 2020, cloud technology has surged in popularity, owing to the level of data security it offers and the flexibility of working options it provides for employees, notably remote workers. For example, Zoom grew over 160% in 2020 alone.[35] Security and privacy remain major concerns due to security breaches and are among the main focuses of research. CloudChain, a cloud-oriented blockchain system, is designed to add layers of security.[36]
As of 2021, global spending on cloud computing services had reached $706 billion, and the International Data Corporation (IDC) predicts it will reach $1.3 trillion by 2025.[37]