For a long time we have struggled with technologies. We have had a hard time knitting it all together, and then we get to hear from these so-called pioneers that the model has just changed, and that all the investments made in our data centers were ultimately wasted initiatives. Yet these same pioneers could very well save us from having our data centers shut down and losing our businesses!
So why do we have to bury our current legacy data centers? Or, to put it more mildly, why do we have to change the way we run our data centers? I will not delve too deep into this discussion here; I'd rather you take a look at this presentation I gave in Africa last year.
The case here is very simple; no long stories, really. If you need long stories, go Google for them, read Wikipedia, and search the web for how it can all be done. I know one thing for sure: we had better start doing it now.
Why?
Data centers are still being built despite increasing oil and energy issues. Look at slide 6 and imagine the amount of energy we spend versus the amount of energy that is eventually used for computing. Add it all up and we are heading for a massive shutdown in no time. (Search for "Environment" and "Data Center" on this blog and you'll find details on the build-up and the environmental concerns around it.)
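To make that "add it all up" concrete, here is a back-of-the-envelope sketch of my own (all the figures below are purely hypothetical) using Power Usage Effectiveness (PUE), the ratio of the total power a facility draws to the power that actually reaches the IT equipment:

```python
# Back-of-the-envelope PUE check; every number here is a made-up example.
def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Power Usage Effectiveness: total power drawn vs. power doing real computing."""
    return total_facility_kw / it_equipment_kw

facility_kw = 10_000   # hypothetical: everything the site pulls from the grid
it_kw = 4_000          # hypothetical: what the servers themselves consume
print(f"PUE = {pue(facility_kw, it_kw):.2f}")  # 2.50 -> 60% of the energy never computes
```

The point is simply that a large share of every kilowatt never touches a workload; it goes to cooling and overhead.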
So what does "Pragmatic Data Hotels" mean anyways?
Global warming, whether or not you agree that we humans are responsible for its acceleration, is here today. Monitoring it is one thing; doing something significant with our natural resources, and I mean our other natural resources, is another. Vulnerabilities and changing weather patterns are making it increasingly challenging to place all your bets and dollars on a handful of data centers across the world.
And with Cloud Computing, applications scaling on the web, devices such as the iPhone and Android, and increasing bandwidth, it is becoming easier to use a single device and enhance the consumer's productivity. But we must stop supporting legacy technologies in the data centers that push all that information out to the world.
The world of the 60s was similar (note: similar, not identical), in that it did things much the same way. But today we have a hybrid model in which we want to benefit from distributed computing. Tasks that can be better outsourced to the consumers themselves, such as defining and configuring their own UI and integrating their own apps using generic integrators and APIs, plus the need for a customized, personal appliance/device at their disposal, mean that we will have to fundamentally change the way software and demand reshape our "old-fashioned ways of doing things" within our data centers. Somehow the consumer just changed that "our": it has become a collective "our" now, no longer the data center owner's backyard.
Re-usability and Sequestration
We need to be able to do two things: one is to generate energy that is a lot cleaner and produces less CO2, and the other is to conserve it far more economically so that it doesn't escape. While mobility will definitely be the trend we will be looking at, we will also see newer and better ways to use energy and reduce power consumption. Just look at Intel: according to this report, they are using an air economizer, where you simply pump out the warmer air instead of cooling it within the data center. This is an excellent approach that can be applied when those mini high-density data buses and boats run across the world.
You will have massive, compact memory from someone like MetaRAM, plus Intel/AMD multi-core processors.
Can you imagine? Your Mobile DataBus could have petabytes of RAM and terahertz processors (they may well exist, you know)! A single rack could service a huge amount of capacity.
Mobile Databus (with built-in WiMax extenders & Petabytes RAM and TeraHertz processing power!)
We have got to find ways to host our datasets on the numerous forms of existing means so as to support mobility while still using better, smarter technologies. That way we will not only be able to balance the load that the data centers create, we will also be carving out a model for a pragmatic data hotel, where datasets will be secured and encrypted and will hop from hub to hub: from a mobile bus to a Google Databoat. Google is already working in that direction.
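As a minimal sketch of what "secured and encrypted, hopping from hub to hub" could look like (my own illustration, not from any vendor; hub names, the manifest format and the key handling are all assumptions), each data guest could be sealed before a hop and verified on arrival:

```python
# Minimal sketch of sealing a "data guest" before it hops between hubs.
# Uses the cryptography package's Fernet for symmetric, authenticated encryption.
import hashlib
import json
from cryptography.fernet import Fernet

def seal_dataset(payload: bytes, key: bytes) -> dict:
    """Encrypt a dataset and record a checksum so the next hub can verify it."""
    return {
        "ciphertext": Fernet(key).encrypt(payload),
        "sha256": hashlib.sha256(payload).hexdigest(),  # integrity of the plaintext
    }

def receive_at_hub(sealed: dict, key: bytes) -> bytes:
    """Decrypt on arrival and confirm the dataset survived the hop intact."""
    payload = Fernet(key).decrypt(sealed["ciphertext"])
    assert hashlib.sha256(payload).hexdigest() == sealed["sha256"], "corrupted in transit"
    return payload

if __name__ == "__main__":
    key = Fernet.generate_key()                 # in practice, managed per tenant
    guest = json.dumps({"tenant": "acme", "rows": 42}).encode()
    sealed = seal_dataset(guest, key)           # hop: mobile bus -> databoat
    print(receive_at_hub(sealed, key))
```

Nothing fancy: the dataset travels encrypted, and every hub it lands on can prove it arrived intact before passing it along.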
Google Databoat: Floating data center/docks at GoogRigs & Harbors
...and eventually the datasets will be spread out across all existing infrastructure; as above, your typical electricity pole (shown below) might retain some part of your data. Obviously the policy makers must ensure that stringent compliance and security measures are put in place, and trust me, the breaches will be far more contained than with our typical data thieves and mischief makers. (I won't get into this, but the more I talk to my Cloud Computing/Managed Hosting providers, the more I hear that it was an "inside job". As I said, we are not getting into security and compliance just yet; we'll write another article on that.)
Data Subset on a utility electric pole
So the future data centers should rather be called "Data Hotels", where the "guests", compartmentalized, multi-tenant, GDM-aware datasets, will move from data center to data center. This is what increasing bandwidth and globalization are bound to provide us with. So, in fact, the real platform here is the network. An effective GDM (global delivery model) will ensure that our datasets move from continent to continent seamlessly. Isn't this the dream of any CIO/CEO? Why spend huge amounts of money talking to numerous parties and independent sourcing advisors when you can decide, based on an intuitive, visualized dashboard, exactly which dataset or data center you would like to move to some other location?
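To illustrate the kind of decision that dashboard would drive (this is my own toy sketch; the location names, weights and scores are invented for the example), picking the next "data hotel" for a guest could come down to scoring candidates on energy price, political/environmental risk and network latency:

```python
# Toy placement decision for a "data guest"; every number here is hypothetical.
from dataclasses import dataclass

@dataclass
class DataHotel:
    name: str
    energy_cost: float   # $ per kWh at that location
    risk: float          # 0 (stable) .. 1 (risky), political/environmental
    latency_ms: float    # network distance to the dataset's consumers

def score(hotel: DataHotel) -> float:
    """Lower is better: a weighted mix of cost, risk and latency."""
    return 0.5 * hotel.energy_cost + 0.3 * hotel.risk + 0.2 * (hotel.latency_ms / 100)

candidates = [
    DataHotel("mobile-databus-eu", energy_cost=0.20, risk=0.1, latency_ms=40),
    DataHotel("googrig-harbor",    energy_cost=0.12, risk=0.3, latency_ms=90),
    DataHotel("legacy-dc-us",      energy_cost=0.30, risk=0.1, latency_ms=120),
]

best = min(candidates, key=score)
print(f"Relocate data guest to: {best.name}")
```

The dashboard would simply surface these scores visually so a CxO can make the call in one glance rather than through months of sourcing negotiations.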
I am working on my next model of the Visualization Dashboard, a 3-D model where CxOs have direct visibility into the dynamic aspects of all the domains, business or IT. I showed my first one at a virtualization conference in Brussels, and the next one is coming at CloudCamp soon (also in Brussels). This visibility will help us adopt a pragmatic approach to sourcing our business activities: once a more viable (politically, environmentally, financially, etc.) location is identified, the data guests would be ready to relocate to the other data hotel. Relying on a few high-end data centers means too much is at stake, and you still end up waiting while your competitive edge begins to dull.
And that is why we will need a pragmatic data hotel approach. We have placed too many bets on a single mule; we are better off spreading our risks across mobile vehicles of all shapes in order to feed the wide-open mouths of those who are now joining the revolution.