Data Center Costs Are Changing

October 2007

Understanding trends in data center costs is key to managing data center spending. For example, IT executives know that hardware costs are declining. But by how much? And what about other costs, such as software, personnel, and facilities? Is the long-term trend up or down?

The mix of costs is also important. Which costs are increasing as a share of the total, and which are declining? Having the answers to these questions helps the data center manager know which costs are becoming more important and where to devote the most attention.

This Research Byte is a summary of our full report, The Shifting Mix of Data Center Costs.

Data Center Unit Costs Are Declining
Figure 1 shows data center costs for Unix and Windows systems. To minimize the effect that multiprocessor servers would have on the unit cost calculations, we normalized costs on the basis of processors rather than servers.

At first glance, it may appear that Unix is a much costlier platform than Windows: The average cost to operate a Unix processor in 2006 was $29,300, while the average cost for Windows was only $11,900. However, much of the difference in cost is due to the larger percentage of Unix systems running high-end applications, which require larger, more expensive processors, more disk space, and additional license fees for system, database, and resource management software.

The purpose of this analysis is not to compare the two platforms in terms of their absolute costs, but rather to compare the cost trend over the past few years and to understand how the cost elements have been shifting. Over the past five years, data center costs normalized on a per-processor basis have been declining, as shown in Figure 1. Average total cost for Unix fell from $33,600 per processor in 2002 to $29,300 in 2006, a drop of 12.8% over five years. During the same period, average total cost per Windows processor fell from $13,900 to $11,900, a drop of 14.3%.
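The per-processor normalization and the five-year declines quoted above reduce to simple arithmetic. The sketch below uses the rounded figures cited in this Byte; note that the rounded Windows inputs yield roughly 14.4%, so the report's 14.3% presumably reflects the unrounded underlying data.

```python
# Per-processor normalization: total annual data center cost divided by
# processor count gives the unit cost used throughout this analysis.
def cost_per_processor(total_cost, processors):
    return total_cost / processors

# Five-year percentage decline between the 2002 and 2006 unit costs.
def pct_decline(cost_2002, cost_2006):
    return (cost_2002 - cost_2006) / cost_2002 * 100

unix_drop = pct_decline(33_600, 29_300)     # ~12.8%, matching the text
windows_drop = pct_decline(13_900, 11_900)  # ~14.4% from rounded inputs;
                                            # the report cites 14.3%
```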

Figure 1. Data Center Costs Are Changing

The data for this analysis is based on data-center benchmark studies of Unix and Windows server farms conducted over the past five years by our research partner, Metrics Based Assessments (MBA). In each of the five years, MBA obtained 80-100 observations for each platform, of which approximately 40% were repeat observations from previous years, providing a basis for tracking changes in data center spending over the five-year period. The costs in this analysis include:

  • Data center hardware, excluding network hardware
  • Operating systems, utilities, database, and other infrastructure software (applications software cost is not included)
  • Data center personnel
  • Other data center costs, such as data center supplies, facilities, outsourcing, disaster recovery, off-site storage, and training

The full version of this report examines the major categories of data center cost for Unix and Windows servers on a per-unit basis for the years 2002 through 2006. It also analyzes how these costs have been changing and makes recommendations based on these trends.

This Research Byte is a brief overview of our report on this subject, The Shifting Mix of Data Center Costs. The full report is available at no charge for Computer Economics clients, or it may be purchased by non-clients directly from our website.

A comprehensive set of data center benchmarks is available in Mark Levin’s book, Best Practices and Benchmarks in the Data Center.