Gartner: Microsoft's Cloud to be more robust and green


Microsoft is planning to invest heavily in its online infrastructure to meet the expected demand. Providers, including arch-rival Google as well as Adobe and Amazon, are increasingly able to replace and augment the functionality of local PCs with web-based services running on large distributed server farms.

For Microsoft, this capability represents an attack on its traditional PC operating system and office application business. To counter it, the company will set up 20 new data centres over the next 20 years at a cost of a billion dollars each. Debra Chrapaty, Microsoft's VP of global foundation services, told the US media: "We're going to reinvent the infrastructure of our industry." It took Chrapaty's team two years to set up Microsoft's first cloud data centre, which opened in Washington State last year. Because of its high energy requirements, the server farm was deliberately located close to a new hydroelectric power station.

It took the team just nine months to build the next data centre in San Antonio, Texas. One reason the build went so quickly is the modular design of the new blocks of servers: up to 2,500 servers are installed in a freight container at the factory and fitted with mains power and cooling infrastructure. There is no need to unpack computers from cardboard boxes on site. The mini data centres are driven into giant hangar-like buildings, each container is connected up as a single unit, and it becomes operational as soon as the software has been installed. Activating a new container takes two days. The 70,000 square metre data centre in San Antonio has room for over 220 containers, a theoretical capacity of half a million servers.

The architecture saves energy, time and staff workload. According to Microsoft's own statements, these data centres will need half the staff of previous centres, and the company expects to cut its energy budget by a third. Hardware manufacturers such as Sun and Rackable Systems have proposed similar concepts before, but David Capuccio, an analyst with Gartner, sees Microsoft's version as more robust and more likely to offer worthwhile power savings.
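The "half a million servers" figure follows directly from the numbers quoted above; the short Python sketch below is purely illustrative, using only the per-container server count and the container count reported in the article.

# Illustrative capacity estimate based on the figures quoted above
servers_per_container = 2500   # up to 2,500 servers per freight container
container_slots = 220          # room for over 220 containers in San Antonio
theoretical_capacity = servers_per_container * container_slots
print(theoretical_capacity)    # 550000, i.e. roughly half a million servers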



Source
