State and local IT organizations often struggle to meet growing employee and constituent demand for data access using outdated data centers, many of which are managed in-house. With little fiscal and personnel flexibility, these organizations often wrestle with issues such as server sprawl, requirements to support antiquated systems and aging infrastructure components. Inefficient systems not only require substantial maintenance; they also dampen the productivity of both employees and external users.
By modernizing data center technologies and applying cloud services where feasible, government agencies can take advantage of the increased capabilities offered by IT trends, improve the efficiency of the infrastructure, better protect it against failure and potentially reduce expenses.
Enhancements in key areas — especially power, thermal management, network infrastructure, data storage and virtualization technologies — contribute to improved efficiency, performance and cost savings.
By better controlling power usage, government IT organizations can adapt to user needs and protect their IT infrastructure.
Data center infrastructure management (DCIM) systems provide a unified, real-time picture of data center operations, enabling IT teams to match resources with demand. DCIM systems are often integrated with management tools, allowing administrators to monitor dynamic IT environments built on virtual machines, storage-area networks and cloud-based applications. At the infrastructure level, DCIM offers visibility into power utilization.
Uninterruptible power supply (UPS) equipment ensures delivery of continuous electrical power. UPS units condition utility-sourced power to provide the system with the necessary voltage to operate optimally. Should power fail, they protect systems until either primary power is restored or a longer-term backup source, such as a generator, comes online. A UPS unit typically provides 5 to 30 minutes of power, depending on the backup power infrastructure.
Government agencies can save money and protect equipment by updating data centers with modern, efficient thermal management design.
Modern cooling units are more dependable and intelligent than their predecessors. They are also deployed in redundant configurations, so if one breaks, operations fail over to the working unit. The General Services Administration now recommends running data centers at higher temperatures, from 72 to 80 degrees Fahrenheit, which can reduce energy costs by up to 5 percent for each degree uptick.
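A quick back-of-envelope calculation shows how that per-degree figure adds up across the full recommended range. This sketch assumes the savings compound with each degree, which is a modeling assumption for illustration, not part of the GSA guidance:

```python
# Rough estimate of cooling-energy savings from raising the thermostat setpoint.
# Assumption (hypothetical model): up to 5% savings per degree, compounding.

def cooling_savings(degrees_raised: int, savings_per_degree: float = 0.05) -> float:
    """Fraction of cooling energy cost saved after raising the setpoint."""
    remaining_cost = (1 - savings_per_degree) ** degrees_raised
    return 1 - remaining_cost

# Raising the setpoint from 72 to 80 degrees Fahrenheit is an 8-degree uptick.
print(f"{cooling_savings(80 - 72):.0%}")  # roughly a third of cooling costs
```

Even at the conservative end, a few degrees of headroom translates into double-digit percentage savings on cooling.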
Hot-aisle and cold-aisle containment strategies effectively vent heat away from servers, avoiding fluctuations in room temperature and maximizing cooling efficiency. Once limited to large data centers, containment technology has become affordable for small data centers as well, thanks to innovations from vendors such as APC and Emerson.
By taking steps to optimize their networks, government IT agencies can reduce expenses, simplify network management and support growing user demand.
Migration to 10 Gigabit Ethernet has been gradual, but inevitable. This higher-bandwidth networking technology promises scalability, lower costs and simplified management. Government agencies can prepare for migration by upgrading switches and routers in 2015.
Software-defined networking promises to simplify management by potentially automating resource provisioning and network configuration. It further extends productivity through self-service capabilities that allow departments or employees to handle their own provisioning.
Cloud computing options are now being considered by even risk-averse IT executives. Data security needs may dictate whether they choose private, public or hybrid cloud models. A colocation model, in which multiple organizations share a single facility's infrastructure, has become a popular next step for government agencies.
By virtualizing some existing storage systems and leveraging software-defined storage, administrators can dynamically manage their storage arrays.
Solid-state drives (SSDs) help IT teams deploy faster storage media so users can quickly access the data they require for analytic support. SSDs, which use flash memory for low-latency access, are integral to supplying data to sophisticated analytic processes. Although more expensive, SSDs are faster and require less power than hard-disk drives.
Backup and recovery for data centers require a carefully designed, automated, multilevel approach. IT teams first back up locally, allowing rapid local recovery from less serious incidents. To prepare for catastrophe, organizations often turn to cloud-based options, such as hosting redundant systems through a colocation, contracting for disaster recovery as a service or relying on existing disaster recovery processes residing within the cloud infrastructure.
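The tiers described above can be sketched in a few lines of code. This is a minimal illustration, not a production tool: the local archive is tier one for fast restores, while the offsite step is a stub where a real deployment would invoke a cloud provider SDK, an rsync job to a colocation site or a DRaaS agent:

```python
# Minimal sketch of a two-tier backup flow (paths and helpers are hypothetical).
import datetime
import pathlib
import shutil
import tempfile

def local_backup(source_dir: str, backup_root: str) -> pathlib.Path:
    """Tier 1: create a timestamped local archive for rapid local recovery."""
    stamp = datetime.datetime.now().strftime("%Y%m%d-%H%M%S")
    base = pathlib.Path(backup_root) / f"backup-{stamp}"
    return pathlib.Path(shutil.make_archive(str(base), "gztar", source_dir))

def replicate_offsite(archive: pathlib.Path) -> None:
    """Tier 2 (stub): ship the archive to cloud or colocation storage."""
    print(f"would replicate {archive.name} offsite")

if __name__ == "__main__":
    # Demo with throwaway directories so the sketch runs as-is.
    with tempfile.TemporaryDirectory() as src, tempfile.TemporaryDirectory() as dst:
        (pathlib.Path(src) / "records.txt").write_text("constituent data")
        archive = local_backup(src, dst)
        replicate_offsite(archive)
```

The key design point the sketch mirrors is automation: both tiers run without operator intervention, so recovery readiness does not depend on someone remembering to copy files.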
By adopting virtualization, IT departments can pack more computing and data into a smaller footprint, increasing efficiency and lowering lifecycle cost.
Virtualized data centers require less energy and space. Large fleets of underutilized physical servers are replaced with fewer machines, each running multiple virtual machines. Similarly, storage needs are met by virtual disks accessed by VMs.
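The consolidation math is straightforward. This sketch uses illustrative numbers (the fleet size, utilization figures and target are assumptions, not from the article) to show how a large fleet collapses onto a handful of virtualization hosts:

```python
# Illustrative server-consolidation estimate (all numbers are hypothetical).
import math

def hosts_needed(physical_servers: int, avg_utilization: float,
                 target_utilization: float = 0.70) -> int:
    """Hosts required to absorb the same aggregate workload as VMs."""
    total_load = physical_servers * avg_utilization
    return math.ceil(total_load / target_utilization)

# e.g., 100 servers averaging 10% CPU collapse onto about 15 virtualization hosts.
print(hosts_needed(100, 0.10))  # prints 15
```

The target utilization is deliberately below 100 percent to leave headroom for load spikes and for failover when a host is taken down for maintenance.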
Modular, hyperconverged infrastructure offerings combine storage, servers and networking in a software-defined stack on a single physical computing platform.
By investing in modern technology to enhance data centers in these key areas, government agencies not only benefit from new IT trends, they also gain a more efficient and resilient infrastructure.
Read the CDW white paper “Keeping Pace with Data Center Evolution” to learn more about boosting data center efficiency.