There’s a lot of talk about the Cloud in the media lately. It’s a term incubated in the network diagrams of IT departments, where it was used to depict the Internet visually. Now the Cloud, under an umbrella of definitions, has entered the vernacular of the mainstream consumer. Apple, a company whose brand promises to shield the skeptical customer from the confusion of modern tech, recently named a forthcoming service iCloud. If Apple embraces a concept, there’s a good chance it’s important and here to stay (at least for a while). Still, it remains a murky concept for some, so what does the term “the Cloud” truly mean? And how can it save us from rainy days?

In the early days of business computing, terminals were used to access applications and data resident in mainframe systems. The mainframe was a large computer capable of serving many users while centralizing operations in a single box. Users accessed the system through inexpensive, low-powered terminals. This architecture kept costs down and maintenance simple. Then PCs got cheaper and more powerful, capable of running business applications locally. We went from the green-screen terminals of the past to Windows and, to a lesser extent, Apple machines that had the juice to power our computing experience in the towers under our desks. Of course, the client-server architecture still existed in the form of mail and file servers, but applications were typically something we installed locally.

The PC model expedited the adoption of computers globally. It meant we no longer had to rely on a byzantine combination of keystrokes and incomprehensible menu systems to accomplish meaningful work. The machines were powerful enough to deliver a rich, intuitive interface, and we simply pointed and clicked our way to productivity. But the model also brought a new set of problems. Users had to understand a file system, a task manager (Ctrl-Alt-Del and End Task), and how to install and update the operating system and applications they used. The PC was great for adoption because it insulated users from much of the technology driving their experience, but it didn’t insulate them enough.

Cloud computing helps solve some of the remaining problems. Applications can live in the Cloud, so users don’t have to constantly remember to save their files, update their software, or maintain their applications. Web technology continues to improve, so applications built to reside in the Cloud bear a striking similarity to the desktop apps of the PC era. Trusting that your data is where you need it, safe and sound in a web service accessible across myriad devices (laptops, tablets, smartphones), is the core of the Cloud concept. The more we move the consumer’s focus away from how to make software work, the freer they are to be productive and creative by using it. It’s been said that the best designs aren’t noticed at all, and the Cloud pushes us all closer to blue skies.
