
Bill Gates on Cloud Computing

M: Everybody now is talking about software as a service (SaaS), cloud computing, and those sorts of things. How does the move towards those kinds of models impact desktop computing, which is clearly Microsoft's legacy, the thing Microsoft is best known for?

B: There's always been this question of "Where is computing being done, right next to you or far away?" And the more bandwidth and lower latency you have, the more flexibility you have about how you split that computing task. Time sharing had terminals where almost nothing was happening locally. Whether it was a character-based display, a 3270, or the X protocol, everything but presentation was happening centrally. Then the PC swung it, before the Internet showed up, to where you're doing everything on that local device and only the file store and, in some cases, the database store are done remotely, while most of the business logic as well as presentation, editing, and interaction is done on that device. The beauty of that is you can work offline, you get great responsiveness, and you don't have to worry about latency. Those of us who grew up with time sharing understand that going back to time sharing, even with great capacity, is not that great.

Now you have more of a balance. HTML is back to the terminal model. When you browse a Web site, although HTML is way more complicated than most presentation protocols, it is a presentation protocol. Now you mix that in when you put in active controls or local script. All that AJAX stuff lets you do some code execution. So it's ironic that the good Web sites are the ones that aren't just using HTML; they are using local execution.
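
To make that concrete, here is a minimal sketch of the mix of HTML presentation and local code execution that the answer refers to, in the AJAX style. The /api/stock-quote endpoint and its response shape are assumptions for illustration, not a real service.

```typescript
// A small script that runs in the browser: HTML stays the presentation layer,
// while this code fetches data asynchronously and updates the page in place,
// with no full page reload. The endpoint below is hypothetical.
async function refreshQuote(symbol: string): Promise<void> {
  const response = await fetch(`/api/stock-quote?symbol=${encodeURIComponent(symbol)}`);
  const quote: { price: number } = await response.json();

  // Only the data crossed the network; rendering happens locally.
  const element = document.getElementById("quote");
  if (element) {
    element.textContent = `${symbol}: ${quote.price.toFixed(2)}`;
  }
}

// Refresh every few seconds instead of reloading the whole document.
setInterval(() => refreshQuote("MSFT"), 5000);
```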

Now we are in a world where you can get the best of both worlds: when you call a subroutine, that subroutine can exist on another computer across the Internet. We now have tools for developers so they can call a service right across the Internet and think they are calling a local subroutine.
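
As a rough illustration of what such a tool produces, the sketch below wraps a remote service so the caller sees an ordinary function call. The service URL and response shape are made up for the example.

```typescript
// A remote call dressed up as a local subroutine: the caller invokes
// spellCheck(text) and never sees the HTTP plumbing underneath.
// The service URL and response format here are hypothetical.
interface SpellCheckResult {
  misspelled: string[];
}

async function spellCheck(text: string): Promise<SpellCheckResult> {
  const response = await fetch("https://example.com/services/spellcheck", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ text }),
  });
  if (!response.ok) {
    throw new Error(`Service call failed: ${response.status}`);
  }
  return response.json();
}

// To the caller this reads like any other subroutine call.
spellCheck("teh quick brown fox").then(result => console.log(result.misspelled));
```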

Everything in computer science is about writing less code. What is the technique for writing less code? It's called subroutines. Everything that has ever been done, whether object-oriented programming or software as a service, is about taking this idea of subroutines and being able to use them broadly. When you want to draw a map, you say, "That's hard, a lot of data; I just want to call a subroutine." Well, now you can call Virtual Earth or Google Earth and get back the presentation in this great form. You don't have to think about the data or the format. So we are taking subroutines to this next level and making that simple. Debugging the stuff, performance, making it work offline: there is still work being done on all of this.
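
In the same spirit, a map-drawing subroutine might look like the sketch below. The tile endpoint and its parameters are hypothetical placeholders, not the actual Virtual Earth or Google Earth APIs; the point is only that the caller never touches the map data or its format.

```typescript
// drawMap() hides the remote mapping service entirely: the hard work
// (map data, rendering, formats) happens on the far side of this URL.
// The endpoint below is a made-up placeholder.
function drawMap(lat: number, lon: number, zoom: number): void {
  const url = `https://maps.example.com/render?lat=${lat}&lon=${lon}&zoom=${zoom}`;
  const img = document.createElement("img");
  img.src = url;
  document.body.appendChild(img);
}

// One call; no map data handling on the client.
drawMap(47.64, -122.13, 12);
```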

In the extreme case, we can take somebody's data center and run it for them in the cloud. All the issues about administration, capacity, who owns the data, what happens when things go wrong, when people are getting error messages, that's cloud computing, and there is a lot of deep invention and work there. I would say we are investing more in letting businesses use cloud computing than anyone else is, and we have some brilliant projects that Ray Ozzie will be talking about more over the next year.



Source

Comments

  1. By Dan D. Gutierrez
    CEO of HostedDatabase.com

    Mr. Gates always "gets it" eventually when it comes to a new technology. The issue is how Microsoft can transition from an on-premise software model to an on-demand hybrid model in a reasonably short time period.

    My firm launched the web's first Database-as-a-Service offering in 1999 which we touted as a "Microsoft Access for the web". Fast forward nearly 10 years and we are just now seeing excellent adoption rates. The future is indeed promising for SaaS.
