Over the decades, computing models have constantly alternated between two extremes. At one extreme, the device in front of you is merely a terminal that connects to a computer somewhere else. When the pendulum swings in the other direction, you have a computer that can do everything itself. Obviously there are options between these two extremes, but we still tend to gravitate toward one or the other. When the company then known as Apple Computer first started, for example, it specialized in personal computers - devices that could do everything for you without having to hook up to some big mainframe somewhere.
Times change.
Louis Gray, who recently became an employee of Google, wrote a blog post back in September about living in the browser. A browser is the 21st-century version of a dumb terminal - you use the browser to access a computer somewhere else to do stuff. For example, I'm using a browser right now to type these words, which (since I use Blogger) are being stored on some Google server somewhere.
Today's trendy term for the dumb terminal and mainframe model is "the cloud." Companies like Amazon and Google do things in the cloud. Of course, there's always a danger in using the cloud, which I noted in a comment on Gray's post. This is (part of) what I said.
Call me a Luddite.
And Gray probably does call me a Luddite. But I digress.
My employer provides automated fingerprint identification systems, which have an entry step (scan the inked fingerprint card, capture the live-scanned prints, enter the crime scene latent print) followed by a processing step (compare the captured prints against the database of prints). Entry takes place on workstations with thick clients; the captured prints are then submitted to a central site, and all user access after submission (quality control, fingerprint match verification, database maintenance) takes place via a web browser-like model.
If you lose the network, you're mostly hosed. You can't submit prints, you can't perform quality control, you can't verify match comparisons, etc. However, you can still capture prints, and queue them for later submission when the network is restored. So while a police agency is somewhat crippled if the network goes down, they can still perform a limited amount of processing.
My question - if the Google cloud is inaccessible, does work stop?
I see two possible solutions: (1) have a backup mechanism (such as a thick client version of Google Docs) that allows you to continue work in a limited capacity, or (2) make sure that your network is highly available and able to resist fire, flood, earthquake, hurricane, riot, and the Scoble Effect.
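That capture-and-queue fallback is just the old store-and-forward pattern. Here's a minimal sketch in Python of what I mean - the names and the endpoint are hypothetical, not anything from an actual AFIS product:

    import queue
    import socket

    # Hypothetical store-and-forward client; "central.example.com" is a
    # placeholder for the central site, not a real endpoint.
    pending = queue.Queue()

    def network_available(host="central.example.com", port=443):
        """Return True if the central site is reachable."""
        try:
            socket.create_connection((host, port), timeout=2).close()
            return True
        except OSError:
            return False

    def capture_print(record):
        """Capture always works locally; submission may have to wait."""
        pending.put(record)

    def flush_queue(submit):
        """Once the network is back, drain the queue to the central site."""
        while network_available() and not pending.empty():
            submit(pending.get())

The point is that the thick client keeps working when the network doesn't, which is exactly what a browser-only model can't give you.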
Now I'm sure that Google and other services put enough redundancy into their systems to minimize the occurrence of outages. But there is no "cloud" company - none - that can guarantee 100% availability to its users. It is literally impossible to do so.
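A little back-of-the-envelope arithmetic shows why. Even "four nines" of availability - which is very good - still allows the better part of an hour of downtime every year:

    # Downtime per year implied by common availability targets
    # (back-of-the-envelope; ignores leap years).
    for uptime in (0.99, 0.999, 0.9999, 0.99999):
        hours_down = (1 - uptime) * 365 * 24
        print(f"{uptime:.3%} uptime allows {hours_down:.2f} hours down per year")

99% uptime works out to about 88 hours of downtime a year; even 99.999% still allows about five minutes.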
This is an obvious drawback of cloud-based computing (or whatever term you use), but for many people, the benefits of cloud computing outweigh the costs. Gray can go anywhere in the world and access his documents, on any computer, at any time. He can collaborate with other Google employees all over the world. Let's face it, most of the services that we talk about - Google, Facebook, Twitter, and the like - use centralized servers to provide massive collaboration benefits.
Of course, most people who use Google, Facebook, and Twitter know that the application depends upon a network connection.
But what if you don't know this? What if you think you're using a stand-alone application, but in reality the application is completely dependent upon network connectivity?
Enter Siri, the new application on the iPhone 4S. Except that it actually isn't on the iPhone 4S, as many people found out when there were network problems. ComputerWorld explained:
Siri may be the most prominent feature on the iPhone 4S, but in truth it's only partly "on" your phone, as a network-related outage this past Thursday demonstrated. iPhone 4S users were cut off from their virtual assistant, their queries answered only by messages that Siri could not make a network connection....
While it's not surprising that a network problem would prevent Siri from making Internet-based searches, many users don't realize that the feature's voice-recognition depends entirely on network connectivity; an outage, therefore, means that Siri can't even perform tasks like making a calendar appointment, setting a timer or creating a reminder.
ComputerWorld explained that Siri takes advantage of huge server farms that provide much more processing power than a mobile phone ever could. This works fine - as long as the network connection is there. But many people didn't realize that a network connection was necessary.
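To see why the outage was so total, consider a rough sketch of that architecture - hypothetical, mind you; this is my reading of the ComputerWorld piece, not Apple's actual design. If transcription happens on the server, every voice command has a remote step in front of it, even commands whose effects are purely local:

    # Hypothetical voice-assistant flow: speech recognition runs remotely,
    # so even purely local actions depend on the network.
    def handle_utterance(audio, network_up):
        if not network_up:
            # We never even learn that the user said "set a timer",
            # because the audio is transcribed on a server farm.
            raise RuntimeError("could not make a network connection")
        text = transcribe_on_server(audio)  # remote step (placeholder)
        return run_local_action(text)       # local step (placeholder)

    def transcribe_on_server(audio):
        ...  # upload the audio, get text back from the server farm

    def run_local_action(text):
        ...  # set the timer, create the reminder, and so on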
Sometime within the next few years, there will be a bit of a backlash against cloud computing. Leading techies will champion the benefits of having all of your computing capability locally, so that you're not adversely affected by poor network connections or sunspots or whatever. Someone will come up with a cute term to differentiate this from the "haphazard" world of cloud-based computing - maybe they'll call it rock computing or something.
And the pendulum will swing again.