
Microsoft's Azure a danger for storage vendors?

Yup! Dark days, Chris. All storage vendors are going to die! Now seriously, let's sit back and think about it. What we may have forgotten is:

  • Data center build-out is far from over. Communities will choose their own data centers, for many reasons: local, cloud-friendly facilities that are quick to set up, genuinely agile, and a lot cheaper to run. So are IBM, Cisco, Intel, and Microsoft all betting too much on one farm? We'll find out soon enough.
  • We keep building data centers, but we still don't know how to get customers into them. That still has to happen, and a lot of effective evangelism needs to take place first.
  • Data centers running for 50 years? Keep in mind that a data center also has a very expensive refresh cycle. How do you fund that while charging the bare minimum in order to compete with everyone else? That refresh cycle will happen 10 times in those 50 years, if not more often.

So let's do the math: if a typical data center costs around $300 Mn to build, and each of those five-yearly rebuilds/refreshes consumes 10% of that budget, then over 50 years you spend another $300 Mn per data center on refreshes alone. With 100 data centers across the world, that's $30 Bn on refreshes!
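The refresh arithmetic above can be sketched in a few lines. The figures (a $300 Mn build cost, 10% per refresh, a refresh every 5 years, a 50-year lifespan, 100 data centers) are the post's own assumptions, not industry data:

```python
# Back-of-envelope refresh-cost estimate using the post's assumed figures.
BUILD_COST = 300_000_000      # $300 Mn to build one data center (assumed)
REFRESH_FRACTION = 0.10       # each refresh costs 10% of build cost (assumed)
LIFESPAN_YEARS = 50
REFRESH_INTERVAL_YEARS = 5
FLEET_SIZE = 100              # data centers worldwide (assumed)

refreshes = LIFESPAN_YEARS // REFRESH_INTERVAL_YEARS       # 10 refreshes
per_dc = refreshes * REFRESH_FRACTION * BUILD_COST         # $300 Mn per data center
fleet_total = FLEET_SIZE * per_dc                          # $30 Bn across the fleet

print(f"Refreshes per data center: {refreshes}")
print(f"Refresh spend per data center: ${per_dc / 1e6:.0f} Mn")
print(f"Fleet-wide refresh spend: ${fleet_total / 1e9:.0f} Bn")
```

Note the whole $30 Bn figure hinges on the 10%-per-refresh assumption; halve that and the fleet bill halves too.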

Cloud computing is going to end up being a very expensive game that way. So no, I don't think Microsoft is planning to kill any storage vendors.


The thing is that cloud computing users buy less storage. Think about it. If Microsoft gets 10,000 customers for its remote data centre services then Microsoft buys servers and storage for the 10,000, while the 10,000 don't buy servers and storage for their apps that now live in the Redmond cloud. Let's suppose that Amazon, Google and Microsoft get 250,000 customers each. That's 750,000 businesses not buying servers and storage for their remotely run apps. Add in Dell, HP and IBM - they're just bound to pile in - and we could have six major cloud computing suppliers with half a million customers each by 2012, meaning 3 million fewer customers directly buying servers and storage for their apps, because they've been transferred to the Cloud.
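The customer-count arithmetic in that paragraph is simple multiplication; a minimal sketch, using the article's own hypothetical numbers (250,000 customers each for three suppliers now, half a million each for six suppliers by 2012):

```python
# Counting businesses that stop buying servers and storage directly,
# per the article's hypothetical scenario.
suppliers_now = 3                     # Amazon, Google, Microsoft
customers_each_now = 250_000
lost_buyers_now = suppliers_now * customers_each_now          # 750,000 businesses

suppliers_2012 = 6                    # plus Dell, HP, IBM piling in
customers_each_2012 = 500_000
lost_buyers_2012 = suppliers_2012 * customers_each_2012       # 3,000,000 businesses

print(f"Businesses off the market now: {lost_buyers_now:,}")
print(f"Businesses off the market by 2012: {lost_buyers_2012:,}")
```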

Cloud computing means fewer and larger storage buyers. The server industry has already savagely consolidated. The storage industry has yet to do so. Judged from a server industry point of view the storage industry is ridiculously over-supplied. Let's list the tier one and two storage array suppliers - Dell, EMC, HP, HDS, IBM, and NetApp, then 3PAR, Compellent, Pillar and Sun. We can add in tier 3 suppliers (NEC, OnStor, Infortrend etc) and the virtual tape library (Overland, Quantum etc), de-duplication (Data Domain) and reference data (Copan, Nexsan) people and say there are getting on for twenty storage array suppliers.

Fewer buyers of storage, each buying in larger volumes, will encourage storage industry consolidation. Count the number of backup software suppliers - there are far too many. Every customer who moves over to the cloud will stop buying data centre storage hardware and software. They won't purchase arrays to store the data they no longer hold; business continuity and disaster recovery software for backup data centres they no longer have; backup software; backup reporting software; security software for files they no longer need to protect; replication software for data they no longer need to replicate ... you get the picture.
