The Dynamic Duo of SaaS and Green IT
The convergence of software as a service (SaaS) and Green IT is creating a symbiotic relationship in which the market adoption of each technology increases the value proposition and accelerates the advancement of the other. The most cited benefits of SaaS include the elimination of up-front capital expenditures, a lower total cost of ownership and the outsourcing of non-core competencies such as IT maintenance and administration to a centralized data center operator. This last benefit provides enormous value to the Green IT movement, which seeks to reduce energy consumption and carbon emissions.
The SaaS model delivers a twofold benefit to carbon dioxide emission reduction. The first comes from the economies of scale gained through centralized processing and a shared-services computing model. Instead of hundreds or thousands of customers individually operating thousands of servers and the power-hungry facilities to sustain them, the SaaS multi-tenant model centralizes and shares data center operations, using far less equipment and a small fraction of the facility costs. Considering that the supporting facility costs for single-tenant data centers generally outweigh the emissions of the servers and related computer equipment they house, the scale of the efficiencies gained with SaaS becomes clear. The bottom line is that one centralized data center is dramatically more energy efficient than hundreds or thousands of single-tenant data centers. A recent McKinsey data center research report found that for many companies server utilization is often 5% to 10%, and can be as little as 3%. SaaS providers must drive much higher server utilization in order to achieve the economies that support their business models.
The second SaaS-related benefit to carbon dioxide emission reduction derives from SaaS providers taking a leading role in committing to decrease energy consumption and increase cost savings. With high data center utilization and equipment density, SaaS providers more routinely recognize and take advantage of energy-efficient equipment, server refresh cycles and energy-saving best practices.
For cloud computing and SaaS companies, efficient data center operation is a core competency. SaaS providers are in the full-time data center business and, as a matter of practice, exercise thoughtful decision making around power consumption, cooling efficiency and equipment density in order to achieve both material paybacks and big gains for the environment.
With increased green technology momentum, the SaaS value proposition, which has historically touted savings in up-front hardware costs, reductions in capital expenditures for software licensing and the avoidance of recurring costs for IT personnel and software maintenance, will now be bolstered by undeniable reductions in carbon dioxide emissions.
READER COMMENTS
By Kevin Keough
I'm looking for some specifics in how to save energy in my data center. Unfortunately, my googling finds many articles indicating it is a good idea, but few with how-to specifics. Any suggestions?
Response: Sure. Below is a sampling of suggestions that maximize energy efficiency and minimize environmental impact. Greening your IT environment or data center has low barriers to entry, and anybody can make a difference: process reengineering and rethinking the way you deploy computer equipment can have as big an impact as procuring newer, more energy-efficient equipment. Here are some things to consider in order to lower costs, use less power and achieve a cleaner environment.
First, I recommend you talk to your peers, understand how they are approaching this opportunity and learn from their experience. The green initiatives that follow draw on an InfoWorld research project identifying the most common green savings projects.

Second, baseline your power consumption. Know exactly where you consume power and track where consumption is trending. Then calculate the financial cost of each material power-consuming device. Be aware that IT equipment generally does not consume most of the data center's energy; an APC study of typical data center power allocation attributes the majority to supporting infrastructure.
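To make that baseline concrete, here is a minimal Python sketch of the annualized cost calculation. The device wattages and the $0.12/kWh utility rate are illustrative assumptions, not figures from this article.

```python
# Sketch: annualize the energy cost of each material power-consuming device.
# The wattages and the $0.12/kWh rate below are illustrative assumptions.

HOURS_PER_YEAR = 24 * 365
RATE_PER_KWH = 0.12  # assumed blended utility rate, USD

def annual_cost(watts, rate=RATE_PER_KWH):
    """Annual energy cost in USD for a device drawing `watts` continuously."""
    return watts / 1000 * HOURS_PER_YEAR * rate

# Hypothetical inventory from a power audit, in watts of continuous draw.
inventory = {"1U web server": 350, "SAN shelf": 1200, "CRAC unit": 5000}

for device, watts in sorted(inventory.items(), key=lambda kv: -kv[1]):
    print(f"{device:13s} {annual_cost(watts):>9,.0f} USD/yr")
```

Even this crude model makes the trend analysis discussed above possible: rerun it after each audit and compare the totals.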

Decommission, consolidate or just turn off your mystery servers. Sun Microsystems research suggests that between 8% and 10% of all running servers have no identifiable function, and over its lifetime the power to run a server typically costs more than the server's purchase price. Performing data center reconnaissance to hunt down equipment without purpose provides a big, quick win in your mission to reduce energy. If you discover servers and cannot identify their purpose after a reasonable investigation, turn them off and wait for the help desk call. When the call does not arrive after a sustained period, you can remove the server from the power grid.
Consolidate and virtualize your servers for greater efficiency. In my experience, most IT staff do not know how underutilized their servers and equipment really are. Server virtualization partitions a single physical server into multiple smaller virtual servers without a corresponding increase in power consumption. Leveraging unused processing capacity without drawing additional power delivers near-linear savings to the bottom line. While server virtualization provides big energy savings, it should be complemented with IT management tools to ensure performance management, real-time visibility and leveling analysis across your server farm, virtual or physical. Virtualization aids network provisioning and management in many ways (e.g. you can configure a new virtual machine (VM) in minutes instead of the hours required for a physical server), but it also adds new complexity, as the number of virtual assets will surpass the prior number of physical assets. Managing virtual assets requires new tools, new skills and some new thinking to understand the relationships between physical and virtual assets. My experience has been that the learning curve is short and the tools are inexpensive; however, as with any data center change, the processes must be thought through for omissions and implications, scheduled in advance, performed according to change management and woven into other data center SOPs (including information security, fail-over redundancy, performance, scalability, system fault tolerance, business continuity, etc.).
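As a rough way to size a consolidation effort before buying tools, here is a Python sketch. The 65% target host utilization is my assumption (leaving headroom for spikes); the 5% and 10% inputs echo the McKinsey utilization figures cited earlier.

```python
# Sketch: back-of-envelope consolidation ratio for virtualization planning.
# The 65% target utilization is an assumed planning ceiling, not a standard.

def consolidation_ratio(util_pct, target_pct=65):
    """How many servers averaging `util_pct` percent CPU can collapse onto
    one host driven to `target_pct` percent, keeping the rest as headroom.
    Uses integer percentages to avoid floating-point rounding surprises."""
    return target_pct // util_pct

for u in (5, 10):
    print(f"{u}% utilized -> {consolidation_ratio(u)}:1 consolidation")
```

This ignores memory, I/O and licensing constraints, which in practice often bind before CPU does, so treat it as an upper bound.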
Consider replacing servers more than three years old with newer, energy-efficient models. Also review CPU performance-stepping technology, which dynamically adjusts processor energy consumption based on processor load.
Minimize storage equipment by using SANs, NAS and other networked storage hardware that consolidate storage space and save power.
Review your storage performance. Energy savings opportunities are too often limited to servers. While storage systems are fewer in number, their capacity is typically growing at approximately 50% per year (source: IDC) and they are among the most intense energy consumers in the data center. Storage devices are often ideal candidates for improvement: they consume on average 13 times more power than servers, and storage assets often have low utilization (frequently less than 25%). SAN, NAS and other storage virtualization can drive utilization rates from 25% to 50% or higher. I have also realized both energy and hardware versatility benefits from interoperable devices such as hot-swappable disk drives, power supplies and fans, and other modular components such as blades and network connectors.
Also review storage data deduplication strategies. Data deduplication reduces the redundant data processed and maintained in storage assets. The benefits of removing duplicate data include improved system management, increased energy savings, better data integrity and often an improved user experience for functions such as data retrieval, queries and reporting. Data deduplication strategies have advanced with technology that computes a hash for each data block; if a hash matches one already stored at another location, the storage device does not store the redundant block again. Deduplication techniques and tools provide an added boost with virtualization: each VM deploys another copy of the operating system, so deduplication tools can lower storage space requirements and maximize access to data residing in the filer's cache.
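For readers curious how hash-based deduplication works under the hood, here is a simplified Python sketch. Real storage arrays do this in firmware with fixed or variable block sizes; the 4 KB block size here is an assumption for illustration.

```python
# Sketch: block-level deduplication via SHA-256 content hashes.
# A real storage array does this in firmware; this only illustrates the idea.

import hashlib

BLOCK_SIZE = 4096  # assumed block size

def dedup_store(data: bytes):
    """Return (store, refs): unique blocks keyed by hash, plus the ordered
    list of hashes needed to reconstruct the original stream."""
    store = {}   # hash -> block contents, written once
    refs = []    # ordered hashes reconstructing the stream
    for i in range(0, len(data), BLOCK_SIZE):
        block = data[i:i + BLOCK_SIZE]
        h = hashlib.sha256(block).hexdigest()
        if h not in store:      # duplicate hash: skip the redundant write
            store[h] = block
        refs.append(h)
    return store, refs

data = b"A" * BLOCK_SIZE * 3 + b"B" * BLOCK_SIZE  # 4 blocks, only 2 unique
store, refs = dedup_store(data)
print(len(refs), "logical blocks,", len(store), "physical blocks stored")
```

The same mechanism explains the VM benefit noted above: identical operating system blocks across many VM images hash to the same value and are stored once.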
Review thin storage provisioning. This storage management technique limits the allocation of physical storage based on thresholds and/or need, then automatically adds incremental capacity only when it is actually required. This capability is often a SAN, NAS or iSCSI hardware function that physically allocates disk blocks to a given application only when the blocks are actually written. It removes the installation-time allocation guessing game and the all-too-common over-allocation of storage space just in case it is needed.
Implement IT charge-backs. The pay-for-use process will help improve equipment accountability and utilization.
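A charge-back can start as simply as splitting the metered bill in proportion to each group's consumption. The department readings and bill amount in this Python sketch are hypothetical.

```python
# Sketch: proportional IT charge-back from metered kWh per department.
# The department readings and the total bill below are hypothetical.

def charge_back(meter_kwh, total_bill):
    """Split `total_bill` across departments in proportion to metered kWh."""
    total_kwh = sum(meter_kwh.values())
    return {dept: total_bill * kwh / total_kwh
            for dept, kwh in meter_kwh.items()}

bills = charge_back(
    {"engineering": 6000, "finance": 1500, "marketing": 2500}, 1200.0)
print(bills)
```

Once departments see their own line item, pressure to decommission idle equipment tends to come from the business side, not just IT.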
Do not forget supplemental cooling if you are using high-density equipment. I've seen this mistake made multiple times over. Many data centers are well past the cooling capacity provided by raised floors, which is typically about 4 kW to 7 kW per cabinet. While I continue to hear claims of higher cooling capacity, I personally have not found raised-floor cooling exhaust above 7 kilowatts per cabinet. Cabinets with blade servers often spike above 12 kW, and I've seen some exceed 30 kW. These types of loads should get supplemental, localized cooling.
Hot and cold aisles. I always took this practice as a given. However, much to my dismay, I continue to engage in debates with data center operators who question the savings. Our data centers use hot and cold aisles, and I have clearly witnessed the energy benefits. You may want to seek other opinions on this decades-old practice to draw your own conclusion. Also, if you choose to implement this technique in a colo environment, make sure all the tenants follow the same configuration. I've been in a few colos where the exhaust sides of equipment don't always face each other, so one cabinet's heat exhaust becomes another cabinet's intake.
Data center equipment segmentation. Servers and most network appliances operate at higher ambient air temperatures than storage devices and some other equipment. Similarly, 1U servers often overheat much more quickly than 2U servers, although hardware manufacturers challenge this claim. I do not know for sure, but I suspect that 1U servers have insufficient airflow and inefficient fans because they are so small. Because storage equipment requires about 12% to 18% cooler air, isolating this equipment in a separate location permits the ambient air temperature for the rest of the data center to rise to 78 to 80 degrees F, resulting in substantial power savings.
Avoid cabinet glass doors. They look cool; however, they turn cabinets into ovens. You can still see the blinking lights without the glass.
Proper cable management inside the cabinet and under the raised flooring will result in improved airflow and more efficient cooling. Be sure to keep cables neat, labeled and bundled – and do not stuff the cables into insufficient space.
Do not over-refrigerate the data center. Data center ambient air temperature can be as high as 78 degrees F. You may choose to reduce that number if you operate SANs, which require more cooling than the rest of the equipment. Our data centers maintain 70 F +/- 1 and humidity at 51% +/- 3. I strongly recommend not cooling much below 70 F, as it adds cost without benefit. For every degree of increase in ambient temperature, there are at least a few corresponding percentage points of cooling system energy savings.
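To see how those per-degree savings compound, here is a small Python sketch. The 2%-per-degree-F figure is an assumption consistent with the "few percentage points" rule of thumb; substitute your own measured number.

```python
# Sketch: compounding cooling-energy savings from raising the set point.
# The 2%-per-degree-F figure is an assumed illustration of the rule of thumb.

def cooling_savings(degrees_raised, pct_per_degree=0.02):
    """Fraction of cooling energy saved after raising the set point,
    compounding the per-degree savings rather than simply adding them."""
    return 1 - (1 - pct_per_degree) ** degrees_raised

# Raising the set point from 70 F to 78 F:
print(f"{cooling_savings(8):.1%}")  # -> 14.9%
```

Roughly 15% of the cooling bill from a set-point change alone, with no capital spending, which is why this item pays back immediately.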
Consider 208V/30A power to reduce energy costs. Cabinet equipment that runs on higher voltage (208V or 220V instead of 110V) loses less power through normal wiring and transformation, permitting the equipment to operate more efficiently. Also remember to use blank plates in your cabinets. This simple technique prevents hot spots and keeps hot air from blowing into cold aisles.
Stay away from perforated tiles on raised floors in a hot aisle. Fortunately, this is not so common anymore, but I recently saw it in an Austin data center, so I will add it to my list.
Turn the lights off. Consider motion-detector light activation so convenience is not lost and savings are gained.
Ask your hardware suppliers for recommendations. Hardware and equipment vendors have expert staff that can provide energy savings advice and design assistance.
Update your procurement policies and practices to seek out and favor energy efficient vendors and solutions.
Consider replacing multiple smaller UPSs with fewer, central UPSs. A UPS's inverter circuitry, which turns DC power from batteries into AC power for IT equipment, consumes a base level of power irrespective of the load, so consolidation reduces that overhead. For bigger UPSs, consider replacing battery UPSs with flywheels. This is a significant capital investment, so for most readers it is more likely something to look for when evaluating colo facilities. Battery UPSs are high maintenance and an environmental mess. The best battery replacement I've seen is the flywheel, which replaces chemical batteries entirely by spinning a wheel that provides temporary power during lapses until generator power kicks in. Unfortunately, these devices are still quite expensive; however, they consume less floor space than batteries, so they may make more sense for data centers in high-dollar real estate locations.
Search for financial incentives. There are more than 200 state and local utility energy conservation programs which grant incentives and rebates for energy efficiency improvements. For example, Pacific Gas & Electric in California provides hardware, software and consulting reimbursement, up to $4 million, for server and storage consolidation projects.
Make sure IT approves the electric bill. It remains far too common for electricity and other energy bills to be approved only by the corporate controller or facilities people, without IT participation. IT management should be accountable for data center power utilization and incentivized for operational and financial savings. For this to be most meaningful, the data center's electrical draw should be metered separately from the rest of the organization (via sensors or a separate power meter).
Watch for future power supply changes. It appears the EPA will deliver new efficiency ratings for power supplies this year or next, and the new rating scale could be a big deal. Between 1 kWh and 1.5 kWh can be saved for every 10 kWh saved at the plug (source: EPA). The EPA is working with the Climate Savers Computing Initiative to develop power supplies that are about 90% efficient, which could reduce greenhouse gas emissions by 54 million tons per year and achieve energy cost savings of $5.5 billion.
Create a program to recycle obsolete or discarded equipment and recycle technology-related consumables (e.g. paper and printer cartridges).
For IT operators considering new data center construction, be sure to review options such as catalytic converters on generators, alternative energy supplies such as photovoltaics, electrical heat pumps and water cooling technologies.
Each of the above suggestions can produce measurable, material and sustained savings. Even a single change can achieve big savings. For example, a single dual-socket, quad-core server with sufficient memory can replace 30 older, underutilized single-processor machines. The power savings are in the range of 12 to 15 kilowatts, which could easily translate to cost savings of $16,000 or more annually.
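As a sanity check on those numbers, here is the arithmetic in Python. The 450 W draw per old server, 500 W for the replacement and the $0.14/kWh rate are my assumptions, not figures from the article.

```python
# Sketch: sanity-checking the 30-to-1 consolidation savings claim.
# Per-server wattages and the $0.14/kWh rate are illustrative assumptions.

HOURS_PER_YEAR = 24 * 365

old_watts = 30 * 450          # thirty older single-processor servers
new_watts = 500               # one dual-socket, quad-core replacement
saved_kw = (old_watts - new_watts) / 1000
annual_usd = saved_kw * HOURS_PER_YEAR * 0.14

print(f"saved {saved_kw:.1f} kW, roughly ${annual_usd:,.0f} per year")
```

Under these assumptions the savings land at 13 kW and about $15,900 per year, consistent with the 12 to 15 kW and $16,000 figures above, before counting the reduced cooling load.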

By Lee Hanna
I'm personally motivated to green our IT shop however I suspect that the initiative may lose momentum outside the IT organization, particularly as competing projects surface. Any suggestions on how to keep focus and progress for a green project?
Response: Sure. I suggest you begin with visible, vocal and active executive sponsorship before organizing the project team. Once you have secured your executive sponsorship, apply project management methods and discipline as you would for any other IT project. A few sample suggestions include:
- Gain early and broad involvement. Get representatives from every stakeholder group on board and demonstrate an explicit and vigorous management commitment.
- Solicit broad participation and encourage staff to identify and suggest practices that save energy.
- Create a project plan backed with measurable objectives for energy savings, power efficiency and carbon reduction.
- Schedule periodic (monthly or quarterly) power consumption benchmark audits to verify monthly reporting and trend analysis.
- For team meetings, evaluate videoconferencing as a substitute for travel and implement telecommuting where practical.
- Participate in an industry group or consortium in order to share knowledge and collaborate with other like-minded professionals.

By Anwwad
I am not knowledgeable on the benefits of virtualization beyond green computing. Can you explain if the main virtualization benefits are green savings or other?
Response: Virtualization is an answer to many IT challenges and opportunities. The technology provides an effective response to poor equipment utilization, rising energy consumption, rising hardware costs, information security demands and IT staff productivity pressures. Virtualization and server consolidation give businesses of all sizes a way to optimize their technology assets. When a company increases its server utilization it reduces the number of physical servers needed, thereby reducing the space footprint, power consumption and IT management burden. Further, the technology provides increased equipment flexibility, and applications can be deployed more easily and efficiently, leading to increased staff productivity.
Send comments to blog[at]Vantivemedia.com.