Cutting Through the Hype on Cloud Computing

Thu, Sep 6, 2012 - 9:41am

How many times have you heard something like this? "The Cloud is going to change the way we ______." Fill in the blank with any phrase from "conduct business," to "organize IT," to "live our lives." Rosy projections are the order of the day. Exuberance - some would say irrational exuberance - abounds.

Apple's doing it. Google's doing it. Thus, it must be the next big thing, right?

Well, that all depends on your perspective. Namely, what is specifically meant when the word "cloud" is thrown into the mix? That's because for nearly every utterance of the phrase, there is a distinct definition. And for each definition, there's potentially a different interest for investors.

What the Heck Is "The Cloud," Anyway?

Despite the flood of publicity on the subject, if you mention "the Cloud" or "Cloud Computing" to most people, you'll get little more than a blank look.

More computer-literate folk will know that it has to do with IT operations happening out in cyberspace, and they may be familiar with the phrase "software as a service" (SaaS). But the truth is that, even among the cognoscenti, you'll find no hard and fast definition of the Cloud.

That's primarily because cloud computing is a loose amalgamation of related concepts that have existed for some time, but to which no one had thought to apply a single name. It's a definition in search of a term, if you will.

Who coined that term is up for grabs. But the most common attribution is to Eric Schmidt of Google, who, although by no means the first to say it, at least popularized the term when he described his company's approach to SaaS as "cloud computing" at a search-engine conference in August of 2006. Amazon then included the phrase in its new Elastic Compute Cloud (EC2) service when it was launched a few weeks later. From there forward, the term was on its way into the mainstream.

The Cloud is more than just SaaS. However, Schmidt's appellation, in context, captured one of its core concepts: the notion that software - which had previously been thought of as a product that you bought and used yourself - might be better described as a service that you didn't (and had no desire to) own.

"Real-time, Internet outsourcing of operations you can't or don't want to do for yourself." That's a serviceable definition we can work from. More specifically, when most industry insiders refer to cloud computing or services, they are referring to one of a small handful of possible models:

  • Service architectures that are scalable and flexible in a way that traditional, hosted computing is not. The most prominent example in this space is Amazon, which rents out parts of its own core infrastructure on a usage-fee basis to other developers, who use the block storage, computing, database, and other services to build complex applications ranging from large websites to scientific-analysis grids.
  • Data storage and consolidation services outside of your local network. This definition can include cloud-based backup services, like Mozy and Carbonite, or specific storage platforms like Google Play Music and Amazon MP3, which each allow you to store your music collection on their servers for easy access from multiple devices.
  • Software delivered over the Internet, often via the Web. Companies like Salesforce and NetSuite have built themselves on the back of Internet connections, promising to rid companies of the complexities of managing large, complex software systems like CRM ("customer relationship management"), ERP ("enterprise resource planning"), and accounting software.

These conflicting and overlapping definitions make it hard to compare growth forecasts for the market - it's often apples to oranges. Still, we must try if we are to understand the implications for us as investors. Leading IT market-research firm IDC took a stab at a consolidated view of the whole ecosystem in June of 2010.

In its view, global revenues from public cloud-computing services are expected to grow at five times the rate of traditional IT. They are expected to hit $55.5 billion in 2014, up from $16 billion in 2009, a compound annual growth rate of 28.2%. That compares with traditional IT product growth, expected to be on the order of 5% over the same period. Although cloud computing will still only represent 12% of total IT product spending by 2014, it will account for more than a quarter of the net new IT growth.
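IDC's headline figures are easy to sanity-check. A minimal sketch, using the revenue figures quoted above and the standard compound-growth identity:

```python
# Sanity check on IDC's forecast: $16 billion (2009) growing to
# $55.5 billion (2014), a five-year span.
start, end, years = 16.0, 55.5, 5  # public-cloud revenue, $ billions

# Compound annual growth rate: (end / start) ** (1 / years) - 1
cagr = (end / start) ** (1 / years) - 1
print(f"implied CAGR: {cagr:.1%}")  # ~28.2%, matching IDC's figure
```

The same identity applied to traditional IT's roughly 5% growth shows why the Cloud, despite its small base, contributes an outsized share of net new spending.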

In the Beginning

At first, the Cloud was all about data storage, which has several components. First, you have to keep the data safe. Doing so really should entail offsite backup, to safeguard from theft, fire, floods, and the like. But redundancy is another important attribute of high-quality storage. By keeping multiple copies readily available, you can ensure that the data are there when you need them, in spite of failures of individual drives.

Then you need access, which is why renting space in the Cloud from a third party can make a lot of sense. Thanks to the Internet, the access problem is already solved - anyone with a connection can reach the data. And since storage companies already manage multiple petabytes (one quadrillion bytes, or 1,000 terabytes) of data, their data centers are designed from the ground up to be redundant. They can provide complex offsite backup at a fraction of the price, since all the procedures are already in place.

Furthermore, the data can be retrieved by any computer anywhere, with multiple concurrent users an option, if that's desired. Files can either be downloaded to be worked on in house or manipulated right on the server, in many cases. They can be encrypted, either at rest or in motion. And the host can probably manage more robust security than the user, helping to insulate data from intruders.

Just how big has storage in the Cloud become? Amazon, which doesn't release a raw number of bytes stored, instead says the total number of "objects" stored by its S3 ("Simple Storage Service") cloud repository was 905 billion as of April 2012. If the average object size is 100KB, that's around 90 petabytes; if it's 1MB, that's 900 petabytes, or nearly an exabyte. And S3 is just one of many storage options out there.
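The back-of-envelope arithmetic behind those petabyte figures is simple; note that the average object sizes are assumptions, since Amazon doesn't publish one:

```python
# Translate Amazon's reported 905 billion S3 objects into total
# storage under two assumed (hypothetical) average object sizes.
objects = 905e9
PB = 1e15  # one petabyte = 10^15 bytes

for label, avg_bytes in [("100 KB", 100e3), ("1 MB", 1e6)]:
    total_pb = objects * avg_bytes / PB
    print(f"avg object {label}: roughly {total_pb:,.0f} PB")
```

Either way the result is staggering: tens to hundreds of petabytes in a single service, before counting Amazon's competitors.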

From an investment perspective, however, raw, undifferentiated storage in the Cloud ultimately is a race to the bottom. Vendors will one-up (or is it one-down?) each other by cutting the price or raising storage caps until the price gets nearly to cost. Some may even make storage a loss leader for premium services, as has already started - Microsoft, Google, and even Dropbox all offer a free amount of storage in the hope that you'll use their software or buy excess storage. Investing in a provider whose sole claim to fame is its cloud-storage platform makes little sense.

This is why each major provider in the space is also pursuing services on top of that storage: to make its offering more attractive to prospective users, and to keep those who do sign up from switching to another provider without losing features they like (what providers call "stickiness" or "lock-in," depending on how hard they want to make it to leave).

Unlimited Power to the People

As important as storage considerations have been - and continue to be - it wasn't long before use of the Cloud moved well beyond them, stopping next at raw computing power.

Where that differs from storage, of course, is that it does not benefit from redundancy, offsite backup, or the other hallmarks of good storage. Solving a math problem yields the same answer whether it's done here or there. No, where scale matters for computing is when it moves out of the realm of mundane, everyday computing and becomes supercomputing.

Heretofore, computing has been defined by the limitations of one's PC, at the personal level, and by the total machine power available in a business network. Supercomputers, on the other hand, were a niche technology tapped into only by government or the largest private companies. No longer.

Amazon believes there is a lot of pent-up demand in this area and is backing that belief with a major commitment. The company hit the ground running last November, opening up its AWS ("Amazon Web Services") infrastructure to the developers of these massively parallel computing tasks. The result is a nontraditional supercomputer of sorts: a virtual cluster running on top of Amazon's huge hosting infrastructure that lets any programmer tap a supercomputing service whose hardware - were it a standalone computer cluster dedicated to the task - would rank 42nd on the list of the world's fastest computers. But instead of being the purview of a single corporate, government, or academic owner, it is open to all on a pay-per-use basis, and you can even do your development work on it for free.

Of course, Amazon can provide this service because it has thousands upon thousands of computing cores sitting idle for long periods of time. Its system is built to handle peak load, and any time it is not at peak, there is lots of spare capacity that can be lent out to others, for a price. But the economics of building the tools to loan it out without affecting other services only began to make sense when the company had reached enormous scale. In order to be cost-effective, you have to sell a lot of computing power.

And again, what happens when some other company makes its FLOPS available even more cheaply than Amazon does? What's to stop the whole market from racing to the bottom again? The same thing that works in data storage: stickiness. And when it comes to computing power, that comes in two forms:

  • Ease of programming
  • Ease of access to complementary services

Luckily for Amazon, it had already built many of these tools for its own small army of software developers. All it had to do was tweak them and open them up to the world. Likewise, IBM and Microsoft have both been developing tools for not just their own developers but for millions of others, for years. Those companies see the market opening up and have been rapidly following Amazon down the rabbit hole, with services like Microsoft's Azure platform for computing and storage in the Cloud.

Freedom to Innovate

"Platform as a service" (PaaS) is another phrase that's moving into the mainstream. PaaS simply means that an outside entity offers a business (or individual) a complete platform, from raw computing and storage up to database tools, programming constructs, and even at times whole virtual machines and complex services like message queuing, to enable the customer to build and run custom applications without investing in any hosting hardware themselves.

In the traditional model of building business applications, each one required hardware, an operating system, a database, middleware, Web servers, and other software. That usually meant a team of network, database, and system-management experts had to be put together, in addition to the application programmers, to keep everything up and running. Inevitably, a business requirement would necessitate a change to the application, which would then kick off a lengthy development, test, and redeployment cycle... déjà vu all over again.

PaaS offers a better way in many cases - a faster, more cost-effective model for application development and delivery. Providers of the service supply everything needed to create and run applications over the Internet. Users can simply log on and take what they need without worrying about what's going on behind the curtain.

That frees corporate IT departments to focus on innovation instead of complex infrastructure. It allows organizations to take money that was formerly used just to keep things running and divert it into the development of applications that provide real business value. Now anyone with an Internet connection can build powerful applications and easily deploy them to users wherever they're located.

Everywhere and Everywhen

Finally, the Cloud greatly facilitates going mobile.

Computing is in the process of liberating itself from not only mainframes but also from desktops and even laptops. For more and more people - especially the young - their smartphones and tablets are their primary computers. These folks want instant access to their stuff, wherever they are, whenever they want it, and delivered to whatever device is in their hands at the moment. That stresses fixed-location computing. But in the Cloud, no problem.

Apple has a service, called iCloud, that provides the user with an invisible online repository for email, contacts, calendars, documents, and app data. All connected iOS devices and computers sync to and pull information from this central repository automatically and on a regular basis. And it backs up all data and restores it without the need to plug into a Mac or PC.

Right now services such as iCloud, which cater to the personal user, are the norm. But moving business apps into the Cloud is also gaining momentum, as corporate interest in going mobile accelerates. According to the latest Strategy Analytics Wireless Enterprise Strategies forecast, the global corporate mobile SaaS market stood at $1.2 billion in 2011 and is projected to grow to $3.7 billion by 2016, a five-year CAGR of 25.3%.

What's the Downside?

We like to have direct control over the things we're responsible for. That's just human nature. When we give that control over to strangers at some distant location, it's going to make us nervous. If something goes wrong, we're helpless to step in and fix it. And we will have to cede some of the credit when things go right.

That kind of unease will never go away. But if you and/or your company are thinking about getting involved in cloud computing, be aware that the step should not be taken without some consideration of the negatives.

Overall, most of the concerns about cloud computing come down to the issues of reliability, complexity, security, and cost. The Cloud is being sold by vendors and in the media as some magical fairyland of carefree computing. It's not. While it can be highly useful, it is still a complex resource that requires knowledge, understanding, and hard work to manage so as not to create even more problems than you solve.

The one thing that you absolutely have to trust is that you will have access to your data at all times. Moreover, you must have confidence that your data will never be lost or altered in any way. Reliability is paramount.

Leading cloud service providers guarantee the integrity of whatever they store and process for you, storing multiple copies of each object across multiple locations.

First, there is physical security. A cloud provider should offer, at a minimum: highly controlled access to the physical site; separation of the network; isolation of the server hardware; and isolation of storage. A cloud facility also has the added advantage that customers have no need to access the servers and networking gear, and in that sense it can be even more secure than a traditional facility.

Second, there is data security. Cloud data facilities are a more tempting target for intruders than smaller facilities. In response, companies have adopted isolation techniques, such as "Multiprotocol Label Switching" (MPLS) and encryption, to maintain the isolation and security of every packet entering, residing on, and/or leaving their networks. The scale of these operations also allows for significantly more investment in security policing and countermeasures than almost any large company could afford. They can buy the very best of the best.

In truth, though, nothing can be made perfectly safe. In the Information Age, that's a given.

The next issue is complexity, by which we mean: does the Cloud actually simplify IT matters? It would seem to, based on the success stories the industry pumps out. But the answer is not quite that clear-cut. For one thing, managing a cloud-based system is never a matter of trimming the IT department down to one guy. It requires plenty of expertise and differing skill sets. It's foolishly wishful thinking to believe that these systems manage themselves.

A BusinessWeek article adds that "not all apps are right for the cloud. Those relying on clustered servers, for example, aren't good fits for cloud environments where they share resources with other customers, ... because they require identical configuration of each server and large dedicated bandwidth among servers, which can't always be guaranteed by a cloud vendor."

Finally, there's cost. Does moving to the Cloud always save money? There is some disagreement on this, but the consensus seems to be that the word "usually" is more accurate than "always," with some cloud applications more conducive to savings than others. The relative development and maintenance costs have to be carefully studied. And such things as current licensing and support fees must be taken into account, since customers could pay significantly more to their commercial software vendors by deploying their software in the Cloud than they would internally.

But reduced expenses should not be the only consideration, even though the Cloud could offer big savings there. Adoption of cloud computing can also pay off by expediting the time to market for innovative ideas. And that can be of equal or greater importance in sharpening a business's competitive edge.

The Future

The Cloud is neither a panacea that will immediately change everything for the better nor a one-size-fits-all solution; it's a tool of potentially great value whose ultimate importance is yet to be determined. It won't replace everything else. But one certainty is that its growth will continue at breakneck speed as the natural human resistance to change breaks down.

Just as people were once adamant that they'd never put their credit card information online and eventually relented, so it is with entrusting all of one's secrets to an off-site custodian. At first, it's totally scary. Then it becomes commonplace. Then everyone claims they were for it all along.

Right now, as noted earlier, there are few ways to invest directly in the Cloud that make sense. But we're watching developments closely, and as cloud computing expands, opportunities to profit from the technology will surely arise.

Alex Daley, editor of Casey Extraordinary Technology and Casey Research's chief investment strategist, will reveal some of his favorite tech plays - and more - at the Navigating the Politicized Economy Summit in Carlsbad, California, September 7-9. He'll be joined by the rest of the Casey Research editors, as well as a host of renowned financial luminaries and government insiders, including resource investing legend Rick Rule, former US Comptroller General David Walker, and - of course - our own Doug Casey.

Not attending? No problem. You can still hear every word of their priceless advice - as well as all of the recorded presentations (over 20 hours' worth) - with the Summit Audio Collection. Add this invaluable set to your investment library before the Summit ends and enjoy early-bird pricing.

Bits & Bytes

Imaging at a Trillion Frames per Second (TED)

Have you ever wondered how the fastest thing in the universe - light - looks in slow motion? Well, thanks to researchers from MIT's Media Lab, you need not wonder any longer. The group, led by Ramesh Raskar, has created a camera that allows us to see the world at a trillion frames per second. This new type of photography, called femto-photography, is an imaging technique so fast that you can create slow-motion videos of light. The technology may someday be used to build cameras that can look "around" corners or see inside the body without X-rays. The link above takes you to Raskar's enthralling recent talk at TED Global 2012.

Apple vs. Samsung: What the Verdict Means for Tech (ABC News)

You've probably heard by now that a verdict was reached on Friday in the epic Apple vs. Samsung trial. The jury handed Apple more than $1 billion in damages and found that Samsung willfully infringed on its patents. Samsung then saw its market capitalization fall by $12 billion during trading hours on Monday, reflecting a decline of 7.5% according to S&P Capital IQ and the Wall Street Journal. At this point, it's not clear whether the hype surrounding the verdict is overblown. Piper Jaffray analyst Gene Munster noted that the verdict does not appear to affect newer devices like the Galaxy S III. He expects only minor (not meaningful) interruption in Samsung device sales in the US. But what is certain is that this is not the last we've heard on this case; and Apple's competitors outside of Samsung are going to have to take a good, long look at their products, to make certain they don't meet a similar fate. Tech reporter Joanna Stern provides some commentary on what this could mean for the mobile-tech world.

Spintronics: Essential for the Next Generation of Data-Storage Devices (Brookhaven National Laboratory News)

Each of the various technologies used for data storage today involves some sort of compromise. DRAMs ("dynamic random-access memory") are very high density (i.e., large memory size) and relatively cheap, but they need to be constantly refreshed and therefore use significantly more power than static memories. SRAMs ("static random-access memory") do not need to be refreshed, since they hold their memory sitting idle (they do require a minimal amount of power or the memory will disappear). SRAMs are also the fastest semiconductor memory, but they are not as good as DRAMs for large blocks of memory because their silicon area footprint is much larger. Meanwhile, the various forms of PROM ("programmable read-only memory"), like Flash, that hold their memory without power (i.e., are forms of nonvolatile memory) are expensive and relatively slow. Rotating memories, like hard disks, are some of the lowest-cost methods to deliver mass storage, but they require moving parts and therefore lack the inherent reliability of a semiconductor memory. But what if we didn't have to compromise? What if we could have the density of DRAM, the speed of SRAM, and the nonvolatility of Flash all in one inexpensive and inherently reliable universal memory technology? Well, it might be possible, thanks to a new field called "spintronics" - a technology that relies on electron spin rather than charge to acquire, store, and transmit data. A group of scientists from the Brookhaven National Laboratory has measured a key effect of electron spin essential to engineering the next generation of high-performing mobile devices.

About the Author

Editor: Casey's Gold & Resource Report
dhornig [at] caseyresearch [dot] com