Ashlee, covering Sun's answer to Amazon EC2, with Caroline:
The concept of utility computing has technically been around since the mainframe days. You fire up a large machine and then rent out time to different users.
The utility model, however, gave way to client-server computing during the major data center build-out that has been underway for decades. Companies became accustomed to running their own infrastructure, typically placing one application, or at most a handful, on each physical server.
Thanks to massive increases in compute power, storage and bandwidth, and growing attention to virtualization software, a revitalized form of utility computing is taking hold. Some companies, like Salesforce.com, rent out a specific software utility – customer relationship management software. Others, like Amazon.com and more recently IBM – the time-sharing king – look to offer up their data centers for a wide variety of tasks, including storage and application hosting.
Sun has spent the last couple of years trying to get a variety of utility computing services going, but it has mostly focused on familiar big-business customers, renting out processors and storage by the hour or the gigabyte. Hey, Pixar, do you need to run some huge rendering job? Send the work over to our super-charged cluster instead of buying your own hardware. Valero, you've got some oil exploration data to crunch? Boy, do we have the hardware for you.
Unfortunately for Sun, regulatory concerns slowed the delivery of these services. The US wanted to make sure data remained secure and isolated for each customer, and that overseas terrorists couldn't tap into a supercomputer to work out, say, weapons designs or the most efficient way to spread a chemical weapon through a crowd.