Strategies for Cloud Computing: Michael Jackson's Sony Website
Native Americans had no trouble believing that creatures from the spiritual world roamed at will among those of the physical. At night, these visitors became shape shifters, transforming themselves from the coyote, the bear, or the raven into a spirit form, then changing back again at daybreak.
Cloud computing is nothing if not similarly amorphous. The cloud’s hard-edged, warehouse-sized data centers accessible on the Internet, filled with seven-foot-tall racks of pizza box servers, seem concrete enough. But when an individual end user accesses a server in the cloud, the server has the ability to take on or shed processing cycles from CPUs and use more memory or less, as needed. The user’s cloud machine expands according to her needs and shrinks when peak processing is over. It may be on one side of the data center one moment and on the opposite the next. The end user hasn’t slowed down what she’s doing; the shift in servers occurs without her realizing it. In the cloud, the computer becomes a shape shifter. It’s not limited by the box it arrived in; instead, it’s elastic.
When you need a computing resource to serve you, but you don’t know how much of it you’re going to need, this special characteristic of the cloud — elasticity — will serve you well. To see this elasticity in action, take the example of Greg Taylor, senior system engineer at Sony Music Entertainment, who is responsible for the computing infrastructure that supports the Web sites of thousands of recording artists and hundreds of individual artists’ online stores. In 2009, Taylor felt that he had adequate monitoring systems and surplus capacity built into his infrastructure. At the MichaelJackson.com store, for example, he could handle the shopping transactions and record comments from 200 shoppers at a time on the store’s site.
Upon the star’s unexpected death on June 25, 2009, the site was suddenly overwhelmed with people who wanted to buy his music or simply wished to congregate with other grieving fans and leave a comment. Sony Music saw an influx of more than a million people trying to access the Michael Jackson music store over the next 24 hours. Many wanted to post comments but could not. The servers stayed up, but not everyone who wanted to find album details could be served that information, and indeed, many would-be purchasers could not buy because traffic overwhelmed what was already “a very database intensive” site.
Other surges were felt around the Internet. The Twitter broadcasting site was overwhelmed by users’ tweets and slowed to a standstill. TicketMaster in London slowed to a crawl. Yahoo! was staggered by 16.4 million site visitors in those 24 hours, compared to a previous peak of 15.1 million on Election Day.
“Our site became the water cooler for everyone wanting to remember Michael Jackson,” Taylor recalled in an interview four months later.
Sony Music’s top management told Taylor that it was not acceptable to have traffic trying to reach a company music site and have would-be customers left hanging, with no response from an overwhelmed site. With 200 individual artists’ e-commerce sites engaged in capturing both transactions and user feedback, Taylor had a large problem that couldn’t be solved in the conventional way: buy a lot more servers, more network bandwidth, and more storage, and throw them at the problem. If he had followed this route, most of that expensive equipment would have sat unused in Sony’s own corporate data center. What’s a senior system engineer to do?
Taylor has since re-architected the Michael Jackson store, AC/DC’s online store, and other popular artists’ sites so that traffic can be split into two streams when necessary: those who are buying music (conducting transactions) and those who are just seeking information. The transactions remain on the core store site hosted on Sony’s dedicated servers, but visitors who are seeking read-only content, such as background on an artist and his albums, can be shunted off to the multitenant servers in the cloud. Many cloud customers in addition to Sony Music share those servers, keeping the costs for the music company low.
The cloud service that Taylor chose was Amazon Web Services’ Elastic Compute Cloud (EC2). In the future, Sony will build each artist’s store in tandem, with an e-commerce site and a related but separate information-serving site in EC2. When the e-commerce site starts to get overloaded, the latter can expand to meet nearly any foreseeable traffic count, thanks to the elasticity of the cloud.
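The split Taylor describes — transactions held on Sony’s dedicated servers, read-only traffic shunted to EC2 when the core site gets busy — can be sketched as a simple routing rule. This is a minimal illustration of the idea, not Sony’s actual implementation; the paths, the load threshold, and the tier names are all assumptions made for the example.

```python
# Hypothetical two-tier router for an artist's store.
# Transactional requests always stay on the dedicated "core" servers;
# read-only requests spill over to the "cloud" tier once the core is busy.
# Paths and the 0.8 threshold are illustrative, not Sony's real values.

TRANSACTIONAL_PATHS = ("/cart", "/checkout", "/account")

def route(path: str, core_load: float) -> str:
    """Decide which tier serves a request.

    path      -- the requested URL path
    core_load -- current utilization of the dedicated servers (0.0-1.0)
    """
    if path.startswith(TRANSACTIONAL_PATHS):
        return "core"    # purchases must hit the database-backed store
    if core_load > 0.8:
        return "cloud"   # shunt read-only visitors to the EC2 replicas
    return "core"        # plenty of headroom: serve locally
```

The key design point is that only stateless, read-only traffic is eligible for the cloud tier, which is what lets many customers safely share those multitenant servers.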
As traffic at any artist’s Web site builds up to a point where the site can’t handle more, new visitors get shunted over to the read-only cloud site, where they can at least find information and identify something that they want to buy. Under the Amazon agreement, cloud servers will scale up to handle as many as 3.5 to 5 million visitors per day, if the occasion ever arises that they need to. In a big traffic spike, a visitor might not be able to purchase an album immediately but will never go away miffed at not being served at all.
The new architecture reflects a changing world where online activities and social networking have taken on added importance. Sony management wants Taylor to be ready for the changes in customer behavior. In the past, there would have been less opportunity for the news of a pop star’s death to spread so fast or to result in such a spontaneous outpouring of grief and comment at a well-known music site. If the need arises again, Taylor is in a position to fire up 10 more servers in the cloud as soon as traffic starts to build.
Such elasticity is one of the things that distinguish cloud computing from large corporate data centers. Many data centers include a specially engineered elastic capacity reserved for a select few users, such as major customers who are trying to make purchases on a site that is already busy with browsing visitors. In some cases, more servers are engaged to handle the traffic. But it’s also possible for the information seekers to experience delays or even get booted off the site until the buyers have completed their transactions. In the cloud, however, there’s no need to turn away desired traffic. Additional “virtual machines” can be fired up quickly to handle all comers.
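The elasticity described above — firing up additional virtual machines as traffic builds and releasing them when the spike ends — can be modeled as resizing a pool to match demand. The numbers below (capacity per virtual machine, visitor counts) are invented for illustration and do not reflect Amazon’s or Sony’s actual figures.

```python
class ElasticPool:
    """Toy model of cloud elasticity: grow the virtual-machine pool as
    visitor traffic climbs past per-VM capacity, shrink it afterward.
    All capacity figures are illustrative assumptions."""

    def __init__(self, capacity_per_vm: int, min_vms: int = 1):
        self.capacity_per_vm = capacity_per_vm
        self.min_vms = min_vms
        self.vms = min_vms

    def rebalance(self, visitors: int) -> int:
        """Resize the pool to cover current visitors; return the VM count."""
        needed = -(-visitors // self.capacity_per_vm)  # ceiling division
        self.vms = max(self.min_vms, needed)
        return self.vms
```

In a real deployment this decision would be made by the provider’s scaling service rather than hand-rolled code, but the principle is the same: capacity tracks demand in both directions, so the customer pays for peak capacity only while the peak lasts.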