Utility Computing – Is It Real?

March 6th, 2007

Utility computing is not a new concept, but the technologies that make it viable are finally maturing. Properly deployed, utility computing can increase server utilization rates, reduce the need to build in overcapacity, and lower operating costs. This podcast identifies key success factors for organizations hoping to capture the benefits of utility computing.

Utility computing is also a dramatic departure from the way IT departments have traditionally worked. Like providers of electricity, gas, water and other utilities, organizations can use the utility computing model to consolidate capacity and allocate resources automatically, based on the real-time requirements of users.

As a result, the utility computing model can help organizations achieve extremely high server utilization rates and significantly reduce the cost of adding and managing data center capacity the traditional way.

Join BearingPoint technologist Frederic Veron to explore why, for these reasons and more, BearingPoint believes that the time is right to implement utility computing.

Posted in: BearingPoint, Connected Social Media, Corporate, Technology