State and local government IT managers running data centers have their hands full. Rather than delivering IT services centered around one or two main use cases, they must support a sprawling network of public service providers, from libraries and police departments to motor vehicle registries and fish and game agencies.
“We’re jacks of all trades,” says Andy Lefgren, IT operations manager for Ogden, Utah. “We have almost different organizations within our cities. We manage personal information, criminal data, fingerprints, patient data — and at the same time, we have public services, infrastructure and communication lines. That’s where a city is unique.”
Budget constraints have prevented some public-sector agencies from making data center investments in recent years, forcing them to make do with legacy systems. However, pressing needs and evolving technologies are spurring a number of state and local government IT departments to re-evaluate their data center architectures.
Experts from storage giant EMC identify three factors driving changes in state and local government data centers: IT modernization, Big Data and the storage demands of in-car and wearable camera deployments within public-safety agencies.
“There’s been a lack of investment, especially during the period of economic downturn,” says Gary Buonacorsi, vertical alliances manager for EMC. “A lot of the infrastructure was left to age, or even go out-of-support. A lot of the applications are really old, and some are still mainframe-centric. Our customers have been asking us to help them create a path forward.”
A failure to modernize can cause significant problems. Before implementing a storage area network (SAN) solution from EMC, the city of Commerce, Calif., relied on nightly manual data backups. Then a server hard drive crashed. “All our departments pretty much lost a day’s worth of data,” says Jesse Guerrero, IT supervisor for the city.
The first step in modernizing infrastructure, Buonacorsi says, is deciding what to do with legacy applications. Often, agencies will initially look toward virtualization. Organizations seeking to modernize their data centers can often benefit from converged infrastructure — or a “data center in a box,” says Buonacorsi. In addition to speeding the delivery and deployment of solutions, this approach allows agencies to outsource maintenance, which can prevent performance problems and security risks.
Big Data is becoming more important in providing government services, requiring new investments in the data center, explains Jennifer Axt, general manager for state and local government at EMC. “Government is trying to be more transparent with constituents, reporting out what they’re doing,” she says.
In addition to collecting and communicating data for reporting purposes, governments are utilizing data to gain insights on important trends. “In law enforcement, we have a customer that uses Big Data analytics to do analysis around crime patterns,” says Buonacorsi. “They’re trying to use those insights to pre-assign resources where they know crime is moving. They’re actually able to be preemptive.”
Other government uses of Big Data include waste and fraud detection, as well as Internet of Things technologies that power smart cities programs (for example, using connected streetlights to gather data on traffic patterns and then leveraging those insights to adjust the timing). “More and more, Big Data is going to expand over the next decade,” Buonacorsi says.
The increasing use of video by law enforcement is also straining IT resources for states, counties and municipalities. This includes footage from wearable cameras, which many police departments have rushed to deploy in response to public pressure stemming from a number of high-profile police shootings, as well as footage from in-car cameras that departments are upgrading from standard definition to HD. “A body camera is really small and low-cost, but each one of those cameras is collecting all of this data,” says Axt. “These departments don’t have the storage capacity.”
“It’s putting a new stress on public safety data centers,” says Ken Mills, chief technology officer for surveillance and security at EMC. The storage need is compounded by the fact that these video files are considered evidence, and therefore must be retained for certain periods of time, depending on state laws. Some departments are opting to store the footage using on-premises solutions such as scalable EMC Isilon storage devices, while others utilize cloud-based services. Because video data accumulates quickly, the ability to expand capacity on short notice is essential.
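The scale of the problem becomes clear with a back-of-envelope calculation. The sketch below estimates retention-driven storage needs for a hypothetical department; the officer count, bitrate, shift figures and retention period are illustrative assumptions, not figures from EMC or any agency.

```python
# Rough sizing of body-camera storage under an evidence-retention policy.
# Every input figure here is an illustrative assumption.

def camera_storage_tb(officers, footage_hours_per_shift, shifts_per_year,
                      bitrate_mbps, retention_days):
    """Estimate terabytes needed to retain all footage for the retention window."""
    seconds = footage_hours_per_shift * 3600
    gb_per_shift = bitrate_mbps * seconds / 8 / 1000   # megabits -> gigabytes
    shifts_in_window = shifts_per_year * retention_days / 365
    return officers * gb_per_shift * shifts_in_window / 1000  # GB -> TB

# Hypothetical mid-size department: 200 officers, 4 hours of recorded
# footage per shift, ~220 shifts a year, 5 Mbps HD video, 180-day retention.
print(round(camera_storage_tb(200, 4, 220, 5, 180), 1))
```

Under these assumptions the department needs on the order of 200 TB of live capacity, before any redundancy or cases held past the normal retention window, which is why scale-out storage or cloud elasticity matters here.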
The move toward virtual machines has completely changed data center architecture for many organizations in recent years. But the data center isn’t done evolving. Here are three technologies that some cities and states are beginning to implement and scenarios where these solutions make the most sense.
Hyperconvergence: These solutions integrate compute, storage, networking and virtualization resources into a single hardware box supported by one vendor. Eric Burgener, research director for IDC’s Storage Practice, says the economics of hyperconvergence appeal to many government users. “You can typically buy the same amount of capacity for a lot less money,” he says. “It’s much easier to scale up. It’s more of a pay-as-you-go model.”
Still, Burgener says, some users have concerns around the maturity of such solutions, and may only trust them with secondary use cases such as data backup. “Since the hyperconverged systems are newer, there are a lot fewer people that will put mission-critical applications on them,” he says.
All-Flash Arrays: While flash storage still costs more than hard disk storage on a per-gigabyte basis, all-flash arrays may actually be more economical for certain use cases that require a high level of performance, Burgener says. “Let’s say I want to build a storage system that can deliver 200,000 input/output operations per second (IOPS),” he says. “With hard disk, I might have to buy 100 devices to hit 200,000 IOPS. But if I want to hit 200,000 IOPS with flash drives, I might only need to buy 10 drives. Because I buy so many fewer of them, the cost to do the same amount of work is actually lower.”
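Burgener’s comparison comes down to simple arithmetic: divide the target IOPS by each device’s throughput, then multiply by unit price. The per-device IOPS and prices below are assumptions chosen to mirror his 100-versus-10 example, not real market figures.

```python
import math

# Illustrative cost comparison for hitting an IOPS target with HDDs vs. flash.
# Per-device IOPS and prices are assumed figures for the sketch.

def devices_needed(target_iops, iops_per_device):
    """Whole number of devices required to reach the performance target."""
    return math.ceil(target_iops / iops_per_device)

def fleet_cost(target_iops, iops_per_device, price_per_device):
    """Cost of enough devices to deliver the target IOPS."""
    return devices_needed(target_iops, iops_per_device) * price_per_device

TARGET = 200_000  # IOPS the array must deliver

hdd_count = devices_needed(TARGET, 2_000)     # assume ~2,000 IOPS per HDD
ssd_count = devices_needed(TARGET, 20_000)    # assume ~20,000 IOPS per SSD
hdd_cost = fleet_cost(TARGET, 2_000, 400)     # assume $400 per HDD
ssd_cost = fleet_cost(TARGET, 20_000, 2_000)  # assume $2,000 per SSD

print(hdd_count, ssd_count, hdd_cost, ssd_cost)
```

Even with flash priced at five times the cost per device, the smaller fleet wins on total cost for this performance-bound workload; for capacity-bound workloads like cold video archives, the per-gigabyte math flips back in favor of disk.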
The city of Ogden, Utah, uses NetApp Flash Pool to enable faster and more reliable access from virtual desktops and to eliminate slow response times when accessing information from Microsoft SQL Server databases. However, the city continues to rely on more traditional storage solutions where performance is less important.
“If a video is going to sit for months without being used, you don’t need flash to run that,” says Lefgren. “You can put that on a lower I/O disk. It makes us efficient, and it saves the city money.”
Organizations everywhere are exploring cloud computing as a way to quickly scale resources in an economical manner, and cities and states are no different. In some cases, this means moving resources out of the data center and into a public cloud environment. In other cases, organizations are consolidating resources into a centralized private cloud.
Texas is doing both. “We’re moving toward a hybrid cloud model in our program,” says Brad Helbig, chief technical architect for data center services at the Texas Department of Information Resources. “Today, we have a private community cloud, and we’re beginning to offer some services through public cloud offerings. We recognize that public cloud offerings are able to achieve better economies in certain cases, driving the cost point down.”
Users can spin up resources in the state’s private cloud just as quickly as with a public cloud, and at times the private cloud can even compete with public cloud providers on cost. But Helbig says that the public cloud provides additional elasticity for agency users, some of whom need additional resources on a seasonal basis, or require temporary compute power for development and testing. “Because of these public cloud providers, it’s become more of an on-demand model,” he says. “When you need it, you get it.”
Storing and providing access to data is only half the battle. Organizations, of course, must also take measures to secure their information — a concern that is especially acute for state and local governments handling sensitive and regulated data, including personally identifiable information, health records, credit card data and more.
Many organizations are implementing next-generation security tools that go beyond recognizing signature-based attacks and can better identify new threats based on context. For example, next-generation firewalls offer advanced features such as application awareness and control, identity awareness and integration with intrusion prevention systems. Next-generation intrusion prevention systems, in turn, utilize application awareness and context to identify threats in sequences of network packets. Other solutions include secure web gateways and security information and event management systems.