Brian Mithen, Network and Systems Administrator for the Topeka and Shawnee County Public Library, says implementing hyperconverged infrastructure has future-proofed the data center

Apr 10 2017

New Technology Helps State and Local Teams Advance the Data Center

Future-proofing the data center and improving today’s performance remain top priorities, helping IT drive greater efficiency.

The older data center infrastructure within the Topeka and Shawnee County (Kan.) Public Library bogged down on certain functions nearly every day. Rather than upgrade individual components, the IT team realized it needed to do something more progressive. Ultimately, it opted for hyperconvergence, Digital Services Director David King says. The new turnkey system tightly integrates storage, compute, virtualization, network and other technology within a single appliance.

“I wanted to go with an emerging best practice that would get us through the next several years without too much trouble,” King says. “I didn’t want to come back to our library board in short order and hear them say, ‘We just spent a huge chunk of money on this last year, what happened?’ ”

Hyperconvergence technology is so cutting-edge that the library’s IT team had to work hard to find other public agencies using it in real-world scenarios. When King briefed his relatively tech-savvy board members about the solution, he says he armed himself with numerous white papers, technical reports and a Wikipedia printout “so I could explain it fully and in layman’s terms.”

The latest data center technologies, like hyperconvergence, virtualization, modular design and cloud computing, provide the simplicity, vertical density and energy efficiency that public sector organizations need to meet growing mandates to deliver more data center performance with fewer resources, says David Cappuccio, vice president, distinguished analyst and chief of research for data centers at Gartner.

“A lot of organizations are taking the time to get a higher-level view of where the industry as a whole is going, which will drive what their infrastructure looks like in the future. That dictates the decision that they make today,” he says.

Enlisting Tools to Ease Hyperconvergence

By choosing to implement the Nutanix 3060-G5 hyperconvergence solution, the Topeka and Shawnee County Public Library team gained not only the future agility and longevity it needed, but also features and capabilities that would benefit the library immediately, says Brian Mithen, a network and systems administrator.

The Topeka and Shawnee County Public Library. Photo by Dan Videtich.

For starters, the new solution is much simpler to manage, offers more processing and storage capacity in a smaller footprint, and is faster in every way, from file access to routine application maintenance. The solution also offers predictive analytics that allow the team to pre-emptively and accurately plan and budget for upcoming technology projects, including a virtual desktop initiative that will speed, simplify and improve the management of the library’s 180 public computers.
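The article doesn’t describe how the platform’s forecasting works under the hood. As a rough illustration only, the sketch below fits a simple linear trend to hypothetical monthly storage-usage figures and estimates when capacity would run out, the kind of projection a team might use to plan and budget for an expansion. The numbers, function name and approach are assumptions for illustration, not Nutanix’s actual analytics.

```python
# Illustrative only: a simple linear-trend capacity forecast, NOT Nutanix's
# actual predictive analytics. All figures below are hypothetical.
import numpy as np

def months_until_full(usage_tb, capacity_tb):
    """Fit a straight line to monthly usage and estimate how many months
    remain until the cluster reaches its usable capacity (None if flat)."""
    months = np.arange(len(usage_tb))
    slope, intercept = np.polyfit(months, usage_tb, 1)  # linear trend
    if slope <= 0:
        return None
    remaining = capacity_tb - usage_tb[-1]
    return remaining / slope

# Hypothetical monthly storage consumption (TB) and usable capacity (TB)
usage = [14.0, 14.8, 15.9, 16.5, 17.6, 18.4]
print(f"~{months_until_full(usage, 24.0):.1f} months until capacity")
```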

“As soon as we implemented it and started migrating our environment, we all got big smiles on our faces,” says Joey Embers, another network and systems administrator. “It’s a huge improvement.”

Taking the Data Center From Crisis to Capability

In Connecticut, the state’s IT team turned a potential crisis into an opportunity to emerge onto the leading edge of data center efficiency. Sometime in late 2013, the state’s existing, leased data center facility in East Hartford reached the limit of its cooling capacity. Beyond that, it was situated in a flood zone near a dike that had recently failed an inspection. “It was urgent to find a new site and upgrade the data center, all at the same time,” says Eric Lindquist, the state’s chief technology officer.


After looking at a number of possibilities, state officials discovered that Pfizer Pharmaceuticals would soon vacate an existing data center on its well-guarded campus in Groton. Through an innovative partnership with the company, the state entered into a lease for the data center and repurposed it. The IT team considered long-term needs and decided to set up the new data center using a more modular design.

“We put in portable pods — rectangular, enclosed areas — within the data center itself, which allows us to put a hot aisle in the center of that enclosed area and vent it from the ceiling, then cool from underneath,” Lindquist explains. “That allows us to keep the floor plan totally open, so as you recirculate the air, there’s no blockage of cool air getting through.”

The new design not only promised significantly lower energy costs, but it also allowed the IT team to deploy high-density computing and attain a much greater compute-per-square-foot ratio.

While the old data center occupied 9,600 square feet of space, the new data center takes up just 4,400 square feet. By putting the emphasis on improving speeds and relying on virtualization, the IT team has downsized its inventory from 240 physical servers to a mere 40.
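The article’s own figures give a sense of the scale of that consolidation. The quick arithmetic below simply restates them; the derived ratios come from the numbers above and nothing more.

```python
# Back-of-the-envelope math using the figures cited above.
old_sqft, new_sqft = 9_600, 4_400      # data center floor space (sq ft)
old_servers, new_servers = 240, 40     # physical servers before/after

print(f"Floor space reduced by {(1 - new_sqft / old_sqft):.0%}")             # ~54%
print(f"Physical servers reduced by {(1 - new_servers / old_servers):.0%}")  # ~83%
print(f"Roughly {old_servers / new_servers:.0f}:1 reduction in physical "
      f"server count, achieved through virtualization")
```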

At the same time, the IT team went across state lines to develop a better disaster recovery setup. Officials worked out a colocation agreement to lease 2,000 square feet of space within Massachusetts’ data center in Springfield, then connected back to the Groton facility via a fiber network that replicates information across the two data centers in close to real time.

“With the added capacity, manageability and business continuity capabilities that we have now, we are bringing on more and more agencies as customers, because they now have access to an easily accessible, highly redundant, higher capability data center than they ever could have afforded on their own,” Lindquist says. “We expect that to grow significantly, especially as we add new services, like off-premises Software as a Service.”

Texas Is Staying Ahead of the Cloud Curve

When it comes to data centers, the state of Texas has been ahead of its time since 2006, and its efforts to stay that way are ongoing. Future-proofing started with a consolidation effort to put specific state agencies’ data under one centrally managed roof in Austin (and in a second redundant site in San Antonio).

Over the past four years, the state’s Department of Information Resources has evolved its offerings into an integrated and hybrid cloud arrangement composed of the main consolidated data center in Austin, along with leading public cloud providers. As a result, DIR now provides services to a flurry of new customers, including local agencies that can pick and choose from a range of support levels, cost structures and cloud types, says Brad Helbig, CTO for Texas Data Center Services.

“It gives them more choice and more ability to control their costs,” he says.

“Depending on their use case, agencies now have options that enable services I like to call ‘fit to purpose.’ If they need to spin something up quickly, like a sandbox initiative or a training course, the public cloud might be the better fit, while our private cloud would be better for hosting larger, highly transactional applications.”

DIR provides that level of flexibility using a number of new technologies, including an online marketplace that allows customers to easily shop hybrid cloud models and services, select and compare different options, calculate costs and submit their order. Finding ways to automate previously manual IT operations, such as monitoring and alerting, data discovery and data analysis, has proven to be another key enabler.
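The marketplace itself isn’t described in technical detail. As a hedged sketch of the kind of side-by-side cost calculation such a storefront might present, the snippet below compares hypothetical monthly rates for hosting the same workload on a private community cloud versus a public cloud. Every rate, field name and class here is an assumption for illustration, not DIR’s actual pricing or tooling.

```python
# Hypothetical rate card for illustration only; not DIR's actual pricing.
from dataclasses import dataclass

@dataclass
class CloudOption:
    name: str
    vcpu_month: float     # $ per vCPU per month (assumed)
    gb_ram_month: float   # $ per GB RAM per month (assumed)
    gb_disk_month: float  # $ per GB storage per month (assumed)

    def monthly_cost(self, vcpus: int, ram_gb: int, disk_gb: int) -> float:
        return (vcpus * self.vcpu_month
                + ram_gb * self.gb_ram_month
                + disk_gb * self.gb_disk_month)

options = [
    CloudOption("Private community cloud", 18.0, 4.0, 0.10),
    CloudOption("Public cloud", 22.0, 3.0, 0.08),
]

# Compare the same workload (16 vCPUs, 64 GB RAM, 500 GB disk) across options.
for opt in options:
    print(f"{opt.name}: ${opt.monthly_cost(16, 64, 500):,.2f}/month")
```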


Although the Texas experience clearly demonstrates a compelling case for embracing a hybrid cloud model, Helbig’s team carefully weighs any new service against its ability to consistently deliver the same assurances of security, data protection, incident response, change management and chargeback that the state provides through its private community cloud in Austin.

Helbig’s team has accomplished that by building a private, secure network that connects all cloud providers and makes it possible to automatically deploy tools to the virtual data centers located in the public cloud, he explains.

“It may feel like it slows you down, and it does require development and funding, but in the end, it’s been worthwhile because we can maintain control just as we have been in our private cloud. But we don’t need to constrain ourselves from using the public options that are out there,” says Helbig.

