
Jul 18 2025
Artificial Intelligence

Data Management Makes or Breaks AI Success for SLGs

Consolidated, clean and well-governed data is crucial to AI implementations at scale.

State and local governments have been using artificial intelligence for years. For example, Arlington County, Va., public safety answering points use AI to route nonemergency calls. North Carolina uses AI to optimize procurement. New Jersey uses it for cyberthreat detection.

But as jurisdictions face shrinking revenues and dwindling federal aid, opportunities exist to deploy AI at scale or cooperatively, provided governments are prepared.

“Many agencies start their AI journeys with a specific use case, something simple like a chatbot,” says John Whippen, regional vice president for U.S. public sector at Snowflake. “As they show the value of those individual use cases, they’ll attempt to make it more prevalent across an entire agency or department.”

Especially in populous jurisdictions, readying data for large-scale AI initiatives can be challenging. Nevertheless, that initial work of data consolidation, governance and management is central to cross-agency AI deployments, according to Whippen and other industry experts.


Data Consolidation Is More Important (and Easier) Than Ever

“The first thing to do is to consolidate,” Whippen says.

Most state agencies operate on a hybrid cloud model. Many of them work with multiple hyperscalers and likely will for the foreseeable future. This creates potential data fragmentation. However, where the data is stored is not necessarily as important as the ability to centralize how it is accessed, managed and manipulated.

“Today, you can extract all of that data much more easily, from a user interface perspective, and manipulate it the way you want, then put it back into the system of record, and you don't need a data scientist for that,” says Mike Hurt, vice president of state and local government and education for ServiceNow. “It's not your grandmother's way of tagging anymore.”

ServiceNow’s Workflow Data Fabric and Snowflake’s platform make it easier to bridge data through a logical interface, in many cases obviating the need to physically consolidate it.

For example, Snowflake’s SnowConvert automates the migration of Oracle PL/SQL to standard SQL, reducing the effort required to move legacy applications off the Oracle platform, Whippen says. Similarly, ServiceNow’s Workflow Data Fabric can connect to disparate data lakes and reach into different applications. The data can then be manipulated in the ServiceNow platform and put back into the platform of record.
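
To make that concrete, here is a hedged sketch, not SnowConvert’s actual output, of the kind of rewrite such migrations involve: a row-by-row Oracle PL/SQL loop collapsed into one set-based statement and run through the snowflake-connector-python package. The connection parameters and table names are hypothetical.

    import snowflake.connector

    # Original Oracle PL/SQL, shown for comparison -- a row-by-row loop:
    #   FOR rec IN (SELECT id, amount FROM payments WHERE status = 'OPEN') LOOP
    #       UPDATE invoices SET balance = balance - rec.amount WHERE id = rec.id;
    #   END LOOP;
    #
    # Set-based Snowflake SQL equivalent -- one statement instead of a loop:
    CONVERTED_SQL = """
    UPDATE invoices
    SET balance = invoices.balance - p.amount
    FROM payments p
    WHERE p.id = invoices.id AND p.status = 'OPEN'
    """

    # All connection parameters below are hypothetical placeholders.
    conn = snowflake.connector.connect(
        account="my_account",
        user="etl_service",
        password="...",  # use a secrets manager in practice
        warehouse="MIGRATION_WH",
        database="FINANCE",
        schema="PUBLIC",
    )
    try:
        conn.cursor().execute(CONVERTED_SQL)
    finally:
        conn.close()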

States such as North Dakota and Missouri have taken greater advantage of this approach, partly because their smaller size makes them nimbler and partly out of budget necessity, Hurt says.

“Some smaller states are looking at it more holistically because they have to. They just haven’t had as much money or resources available in the past.”


Simplifying Data Policy and Governance

Through consolidation, data policies are easier to enforce.

“A strong data policy is the backbone of AI readiness,” Whippen says. “If your end goal is to identify use cases where AI can drive efficiency down to the constituents, you have to make sure that the data you have on those constituents is accurate and secure.”

Governing the data across IT infrastructure and enforcing those policies is the challenge, especially at the state level.

“Often, there’s data associated with so many different systems of record,” Hurt says. “It makes it super hard to get uniform policy across all those individual apps because you've got individual people developing them, and then putting AI models on top of that.”

EXPLORE: Put people first in the age of AI.

The solution, according to both Hurt and Whippen, is a unified data management platform.

“We can look at our role-based access control from a data policy perspective,” Whippen says. “So, you can create those policies, implement them and drop them across every agency and department.”
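
As a generic illustration of that idea (not Snowflake’s or ServiceNow’s API), the Python sketch below routes every read through a single role-based policy check, so one policy table can govern every agency and department. The roles and table names are invented.

    # Generic sketch of centralized role-based access control -- not a vendor API.
    # Every read passes through one chokepoint, so one policy table governs
    # every agency and department. Roles and table names are invented.
    ROLE_GRANTS = {
        "dmv_caseworker": {"dmv.licenses"},
        "health_analyst": {"health.claims"},
        "auditor": {"dmv.licenses", "health.claims"},
    }

    def read_table(role: str, table: str) -> str:
        if table not in ROLE_GRANTS.get(role, set()):
            raise PermissionError(f"role {role!r} may not read {table!r}")
        return f"rows from {table}"  # placeholder for the real query

    print(read_table("auditor", "health.claims"))  # allowed
    # read_table("dmv_caseworker", "health.claims") would raise PermissionError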

While data policy can be a moving target, establishing and simplifying policy governance sets public sector agencies up for success.

“If you have very sound data policies that take into consideration privacy and security and access, you wouldn’t even perhaps need an AI policy because your existing data policy would govern it,” Alan Shark, associate professor at George Mason University’s Schar School of Policy and Government, told StateTech earlier this year.

AI governance is also an important piece of the puzzle.

“You need to have governance for the actual large language models that will be touching that data,” Hurt says. This is key to preventing shadow AI and improving the consistency of outputs.
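
One way to picture that kind of model governance, purely as a generic sketch, is a thin gateway that forwards requests only to an approved model list and logs every call for audit. The model name and inference stub below are hypothetical.

    import logging

    # Hedged sketch of one governance pattern: a thin gateway that forwards
    # requests only to approved models and logs every call for audit.
    logging.basicConfig(level=logging.INFO)
    log = logging.getLogger("ai_gateway")

    APPROVED_MODELS = {"agency-approved-llm-v1"}  # hypothetical allowlist

    def call_model(model: str, prompt: str) -> str:
        return "model output"  # stand-in for the real inference call

    def generate(model: str, prompt: str) -> str:
        if model not in APPROVED_MODELS:
            # Blocking unapproved models at one chokepoint curbs shadow AI.
            raise ValueError(f"model {model!r} is not approved for agency data")
        log.info("model=%s prompt_chars=%d", model, len(prompt))
        return call_model(model, prompt)

    print(generate("agency-approved-llm-v1", "Summarize this permit application."))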


Classify, Clean and Monitor Data

Centralizing data management also simplifies the other core aspects of readying data for AI. These include:

  • Data classification. “Standardized metadata policy and governance for your access controls is key,” Whippen says. “You want to ensure people can access and utilize the right data that they need to provide the business function they’re responsible for.”
  • Data cleaning. Eliminating duplicate records and ensuring records are up to date and accurate will help achieve greater consistency and performance in AI models leveraging that data (see the sketch after this list).
  • Data sharing. Being able to share and ultimately activate that data across teams and departments is key to cooperative, cross-departmental AI initiatives.
  • Data monitoring. Improving data quality and regularly enriching it keeps AI outputs relevant and up to date over time, protects data access and flags anomalies.
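
As a minimal example of the cleaning step, the pandas sketch below normalizes an email field and keeps only the most recently updated row per constituent; the column names are hypothetical.

    import pandas as pd

    # Minimal cleaning pass: normalize an email field, then keep only the most
    # recently updated row per constituent. Column names are hypothetical.
    records = pd.DataFrame({
        "constituent_id": [101, 101, 102],
        "email": ["a@example.gov ", "A@EXAMPLE.GOV", "b@example.gov"],
        "updated_at": pd.to_datetime(["2024-01-05", "2025-03-01", "2025-02-10"]),
    })

    cleaned = (
        records
        .assign(email=lambda df: df["email"].str.strip().str.lower())
        .sort_values("updated_at")
        .drop_duplicates("constituent_id", keep="last")
        .reset_index(drop=True)
    )
    print(cleaned)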

Tagging and cleaning data will require some manual processes.

“You want to make sure that you actually have hands-on experience, and that you're looking at that data to do a pulse check against what more often than not are automated processes,” Whippen says. “It can seem like a heavy lift initially, especially if you're not prepared for it, but there are a lot of native platforms out there that make it easier.”

But data quality and security will ultimately dictate the success of large-scale AI initiatives.

“It all begins with solid data management, including breaking down silos and achieving consistency across departments,” Whippen says. “Because more often than not, agencies rely on data from more than one department, and creating something that's concise and consistent will deliver the underlying data reliability needed to grow an AI use case into something bigger and more meaningful.”
