Data centers are complex and require a variety of checks and balances to make them affordable and efficient. Data center design is complicated because any change to the infrastructure can affect those checks and balances. What may seem like a trivial design change or hardware addition can have a ripple effect that influences hardware, software, network capacity, and even power and cooling. Having the right tools can simplify data center design and catch mistakes before they become costly.
As with most things, effective data center design is about balancing cost of ownership with return on investment. Anyone who operates a data center knows that the primary costs are in facilities operation and management, server technology, hardware, software, network bandwidth, and staffing. You need to perform capacity planning to make sure you don’t overdesign or underdesign the infrastructure.
For example, designing a data center with 100 kW of power capacity is impractical if you use only 10 percent of it, unless you plan to grow quickly. APC notes that oversizing a data center carries added costs. Power and cooling infrastructure for a typical 100 kW data center runs about $8.70 per watt, or $870,000, and if at least 40 percent of that capacity sits unused, $348,000 of the investment is wasted.
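The oversizing arithmetic above is easy to sanity-check yourself. A minimal sketch, using the APC figures cited in this article (the function name and structure are illustrative, not from any vendor tool):

```python
# Rough cost-of-oversizing estimate based on the APC figures cited above.
def oversizing_waste(capacity_kw, cost_per_watt, unused_fraction):
    """Return (total build-out cost, cost tied up in unused capacity)."""
    total_cost = capacity_kw * 1000 * cost_per_watt  # kW -> watts, then $/watt
    wasted = total_cost * unused_fraction
    return total_cost, wasted

total, wasted = oversizing_waste(capacity_kw=100, cost_per_watt=8.70,
                                 unused_fraction=0.40)
print(f"Build-out: ${total:,.0f}; tied up in unused capacity: ${wasted:,.0f}")
# Build-out: $870,000; tied up in unused capacity: $348,000
```

Plugging your own capacity and utilization projections into a calculation like this is the quickest way to see whether a design is over- or undersized before you commit to it.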
Data Center Modeling Software
When assessing data center design tools, remember that you need to consider all the factors that keep the data center running. Let's consider the hardware costs and infrastructure amortization in a typical data center. Power is typically more than half the cost (57 percent), followed by power distribution and cooling (18 percent), networking equipment (13 percent), and servers (4 percent), with other factors making up the remainder. All these elements are interdependent, and miscalculating one will affect performance and costs for the entire data center.
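To see how those shares translate into dollars, here is a minimal sketch using the percentages cited above (the $1 million budget figure is an arbitrary example, not from the article):

```python
# Illustrative split of a data center budget using the cost shares cited above.
COST_SHARES = {
    "power": 0.57,
    "power distribution and cooling": 0.18,
    "networking equipment": 0.13,
    "servers": 0.04,
    "other": 0.08,  # remainder
}

def cost_breakdown(total_budget):
    """Allocate a total budget across categories by their share."""
    return {category: total_budget * share
            for category, share in COST_SHARES.items()}

for category, cost in cost_breakdown(1_000_000).items():
    print(f"{category}: ${cost:,.0f}")
```

Because the categories are interdependent, a miscalculated share in one line item (say, underestimating power) skews every other number in the model, which is exactly why the modeling tools discussed below recompute the whole picture at once.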
The best place to start is by modeling the data center. There are a number of virtual data center modeling packages available. Future Facilities 6SigmaDC provides a software modeling tool to design and test an entire virtual facility down to the last piece of hardware. The software takes into account available floor space, power and cooling, cable routing, and a host of other factors to predict performance and gauge costs. Similar software is available from APC, CoolSim Software, Optimum Path, ManageEngine, and other vendors. There are also more focused tools, such as computational fluid dynamics (CFD) analysis, that identify potential hot spots as part of data center design. Any of these tools can help you create a virtual data center and run the analytics needed to uncover potential pitfalls.
You also can use different types of modeling software to determine how virtualization and cloud computing will affect data center performance. Companies like CloudPhysics are using data science to simulate IT operations with a specific focus on virtualization. With VMware, all computing becomes virtual and flows through the hypervisor, and you can use the normalized data layer to analyze enterprise resource consumption. The result is more efficient use of existing hardware and systems.
The Cloud Changes the Design Rules
With migration to cloud computing, the rules of data center design have changed. When you incorporate the cloud into data center design, you have to consider not only traditional cooling, power, and computing but also how to support cloud systems. You will likely require new servers in more efficient rack systems and more commodity hardware that can be deployed to support hosted operations.
The physical architecture also is different, which means you need to adapt the infrastructure to accommodate those changes. Virtualization means more services running on fewer servers, which changes the dynamics for power and cooling. Cloud computing will lead to more converged systems and multi-tenant platforms with different infrastructure demands. You will have to consider these changes when using your design software to model the data center.
Backup and security are different in a cloud-driven data center. Cloud computing has made data centers more resilient, with more reliable backup systems. However, there is more automation with cloud computing, which means more intelligence needs to be built into the data center.
Data Center Infrastructure Management (DCIM) becomes an important architectural tool for cloud-driven data centers. DCIM gives you the data you need to manage data center assets, capacity, and other changes. It also supports power monitoring and energy management. DCIM tools can help with:
- capacity planning
- asset lifecycle management
- uptime and availability
- power management and cooling
- data center consolidation and tech refresh
- virtualization and cloud computing
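As a minimal illustration of the kind of capacity check a DCIM tool automates, consider this hypothetical sketch. The rack names, power readings, and 80 percent threshold are invented for the example; real DCIM suites pull such readings continuously from monitored power distribution units:

```python
# Hypothetical DCIM-style capacity check: flag racks drawing above a
# utilization threshold of their rated power capacity.
def over_capacity_racks(readings_kw, rated_kw, threshold=0.80):
    """Return racks whose measured draw exceeds threshold * rated capacity."""
    return {
        rack: draw
        for rack, draw in readings_kw.items()
        if draw > threshold * rated_kw[rack]
    }

readings = {"rack-A1": 4.2, "rack-A2": 7.9, "rack-B1": 3.1}  # measured kW (example)
ratings  = {"rack-A1": 6.0, "rack-A2": 8.0, "rack-B1": 6.0}  # rated kW (example)

print(over_capacity_racks(readings, ratings))
# {'rack-A2': 7.9}  (7.9 kW exceeds 80% of the 8.0 kW rating)
```

Commercial DCIM products layer alerting, trending, and asset data on top of checks like this, but the underlying logic of comparing measured draw against rated capacity is the same.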
DCIM tools are available from a number of vendors, including CA Technologies, Emerson Network Power, Nlyte, Panduit, Raritan, and Schneider Electric.
Any value-added reseller involved in data center infrastructure support, whether server installations, routers, switches, or cabling, needs some basic tools to help with data center design. These are just a few of the software tools that today's network architects are using to create next-generation data centers. If you are diligent about monitoring the infrastructure and capacity planning, you can equip your customers with the latest in data center technology, and you can also create a sales roadmap showing them where they will need to upgrade and expand in the future.