Many state and local agencies that operate in the cloud have different technology needs than they did when they made their initial investments. As organizations re-evaluate their cloud environments, they should consider the technologies they’re using now as well as the tools they plan to adopt and implement in the future.
One such solution proliferating in tech portfolios is artificial intelligence. A McKinsey Global Survey found that the percentage of organizations using an AI tool for at least one business function jumped to 72% in 2024, up from 55% in 2023. Generative AI use nearly doubled in the same time frame: 33% of organizations used it in 2023, compared with 65% in 2024.
To accommodate this dynamic technology, government organizations must plan for its use and growth. One of the most important considerations in that planning is the data users feed these tools and how that data is governed.
The Risk of Sharing Sensitive Data with Public AI Platforms
As employees embrace generative AI platforms, IT professionals must find ways to ensure that sensitive data isn’t being shared publicly. Users need ways to explore large language models (LLMs) without disclosing any of their data.
“First, we do a data governance check. What kind of data are you going to be using? What are the controls around that data? Then we can design a solution that allows you to keep your data in-house and not expose any data,” says Roger Haney, chief architect for software-defined infrastructure at CDW.
Data governance is key for organizations looking to prepare their infrastructure and users for AI and LLMs.
“We have a workshop called Mastering Operational AI Transformation, or MOAT,” Haney says. “You’re drawing a circle around the data that we don’t want to get out. We want it to be internally useful, but we don’t want it to get out.”
WATCH NOW: State CIOs talk about their top AI use cases.
To ensure data security, partners such as CDW can help organizations set up or build cloud solutions that don’t rely on public LLMs. This gives them the benefits of generative AI without the risk.
“We can set up your cloud in a way where we’re able to use a prompt to make a copy of an LLM,” Haney explains. “We build private enclaves containing a chat resource to an LLM that people can use without a public LLM learning the data they’re putting in.”
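For illustration, here is a minimal sketch of that pattern in Python, assuming the agency already serves an open-weight model behind an OpenAI-compatible endpoint (such as vLLM or Ollama) inside its own enclave; the endpoint URL and model name are placeholders, not a specific CDW design:

```python
# Sketch: keep prompts inside a private enclave by pointing the client at a
# locally hosted, OpenAI-compatible endpoint instead of a public LLM service.
from openai import OpenAI

# Hypothetical in-enclave endpoint; no prompt traffic leaves the agency's network.
client = OpenAI(base_url="http://llm.internal.agency.gov/v1", api_key="unused")

response = client.chat.completions.create(
    model="llama-3-8b-instruct",  # placeholder open-weight model served in-house
    messages=[
        {"role": "system", "content": "You answer questions for agency staff."},
        {"role": "user", "content": "Summarize the new permitting workflow."},
    ],
)
print(response.choices[0].message.content)
```

Because the model runs on infrastructure the agency controls, nothing typed into the chat is used to train or inform a public service.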
Government faces many of the same challenges as private industry when it comes to mastering AI.
“I don’t see their solutions as being much different than standard industry solutions. They’re learning about it, learning about new approaches to the problem and then trying to figure out how AI can help solve them,” Haney says.
When to Host AI Databases in the Cloud
Agencies’ plans for generative AI will determine how they should prepare their infrastructure for the future of this technology. Haney says most users want to communicate with their data for retrieval or analysis purposes.
“Chatting with your data doesn’t require a new data store. You don’t have to build a huge data lake or warehouse,” he says. “If you have student data, then we add another model that can create the query in SQL, do the query and pull the data back. Then you can ask it questions, using that data as part of your prompt, and you can talk with your data.”
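A rough sketch of that flow, assuming the same hypothetical in-enclave endpoint and a small SQLite database standing in for the existing data store; in a real deployment the generated SQL would be validated before it runs:

```python
# Sketch: "chat with your data" without a new data store -- the model drafts a
# SQL query, the app runs it, and the rows go back into the next prompt.
import sqlite3
from openai import OpenAI

client = OpenAI(base_url="http://llm.internal.agency.gov/v1", api_key="unused")
MODEL = "llama-3-8b-instruct"  # placeholder in-house model

question = "How many permit applications were filed last month?"

# 1. Ask the model to translate the question into SQL for a known schema.
sql = client.chat.completions.create(
    model=MODEL,
    messages=[{"role": "user", "content":
        "Schema: permits(id, filed_date, status). "
        f"Write one SQLite query answering: {question} Return only SQL."}],
).choices[0].message.content.strip()

# 2. Run the query against the existing database (validate/whitelist in practice).
rows = sqlite3.connect("agency.db").execute(sql).fetchall()

# 3. Hand the rows back to the model as context for a plain-language answer.
answer = client.chat.completions.create(
    model=MODEL,
    messages=[{"role": "user", "content":
        f"Question: {question}\nQuery results: {rows}\nAnswer in one sentence."}],
).choices[0].message.content
print(answer)
```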
Partners such as CDW can give agencies this functionality quickly and inexpensively by creating a retrieval-augmented generation (RAG) database. Asked a simple question, it can return the two or three most relevant answers. Often, these solutions don’t require the cloud.
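Conceptually, that retrieval step can be as small as an embedding model plus a similarity search. The passages, embedding model and top-k value below are illustrative only:

```python
# Sketch: a minimal retrieval lookup that returns the two or three passages
# most relevant to a question, the core of a RAG database.
import numpy as np
from sentence_transformers import SentenceTransformer  # assumed available locally

passages = [
    "Permit fees are due within 30 days of approval.",
    "Snow removal requests are handled by the public works department.",
    "Property tax records are updated after each assessment cycle.",
]

encoder = SentenceTransformer("all-MiniLM-L6-v2")  # small open embedding model
doc_vecs = encoder.encode(passages, normalize_embeddings=True)

query_vec = encoder.encode(["When are permit fees due?"], normalize_embeddings=True)[0]
scores = doc_vecs @ query_vec           # cosine similarity (vectors are normalized)
top_k = np.argsort(scores)[::-1][:3]    # indexes of the top two or three answers

for i in top_k:
    print(f"{scores[i]:.2f}  {passages[i]}")
```

The top-ranked passages are then passed to the chat model as context, so answers stay grounded in the agency’s own documents.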
“If you’re going to do 20 queries per second, for example, you probably could do that on-premises,” Haney says. “If you’re going to do 200 queries or, if you’re a company the size of CDW and you’re building an HR bot, 500 queries per second, you want to do that with resources that are scalable. That’s where the cloud comes in.”
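A back-of-the-envelope way to see why throughput drives that choice is Little’s Law (concurrent requests roughly equal queries per second times average latency). The latency and per-GPU figures below are assumptions for illustration, not benchmarks:

```python
# Sketch: rough sizing math behind the on-premises vs. cloud decision.
# All figures are illustrative assumptions.
AVG_LATENCY_S = 2.0        # assumed seconds to generate one answer
CONCURRENCY_PER_GPU = 8    # assumed simultaneous requests one GPU can serve

for qps in (20, 200, 500):
    in_flight = qps * AVG_LATENCY_S                  # Little's Law: L = lambda * W
    gpus = -(-in_flight // CONCURRENCY_PER_GPU)      # ceiling division
    print(f"{qps:>3} queries/sec -> ~{in_flight:.0f} concurrent requests, ~{gpus:.0f} GPUs")
```

At 20 queries per second the footprint stays small enough for a modest on-premises deployment; at several hundred, elastic cloud capacity becomes the more practical option.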
65%
The percentage of organizations using generative artificial intelligence in 2024
Source: mckinsey.com, “The state of AI in early 2024: Gen AI adoption spikes and starts to generate value,” May 30, 2024
How Cloud Makes AI Happen
Because state and local governments will largely use AI to simplify routine processes, many could host the supporting databases on-premises. Organizations with heavier or more variable workloads, however, may consider cloud-based resources.
“With a fine-tuned model, you need heavy GPU resources because now you’re embedding that information into the model itself,” Haney explains. “We do most of that work in the cloud, where we’re able to rent a GPU or a TPU, and it’s a lot less expensive than buying a huge DGX or other piece of equipment to do that work.”
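A condensed sketch of that kind of job, assuming a parameter-efficient (LoRA) fine-tune of an open-weight model with Hugging Face tooling; the model name, data file and hyperparameters are placeholders:

```python
# Sketch: the kind of job that benefits from rented cloud GPU/TPU capacity --
# fine-tuning an open-weight model so information is embedded in its weights.
from datasets import load_dataset
from peft import LoraConfig, get_peft_model
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer, TrainingArguments)

base = "meta-llama/Meta-Llama-3-8B"              # placeholder open-weight model
tokenizer = AutoTokenizer.from_pretrained(base)
tokenizer.pad_token = tokenizer.eos_token
model = AutoModelForCausalLM.from_pretrained(base, device_map="auto")

# LoRA adapters keep GPU memory needs far below a full fine-tune.
model = get_peft_model(model, LoraConfig(r=16, lora_alpha=32, task_type="CAUSAL_LM"))

data = load_dataset("json", data_files="agency_corpus.jsonl")["train"]  # hypothetical file
data = data.map(lambda ex: tokenizer(ex["text"], truncation=True, max_length=512))

Trainer(
    model=model,
    args=TrainingArguments(output_dir="ft-out", per_device_train_batch_size=2,
                           num_train_epochs=1, fp16=True),
    train_dataset=data,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
).train()
```

Training runs like this are bursty: renting GPUs for a few hours in the cloud is usually cheaper than buying dedicated hardware that sits idle afterward.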
DIVE DEEPER: Some states are using RAG to create LLMs for cybersecurity.
So, when it comes to determining how you’ll prepare your cloud infrastructure for AI, think first about how you want to use AI, how you want to use your data and what that will require of your organization. Working with an experienced partner can help you answer these questions and prepare your agency’s digital infrastructure for whatever comes next.
“I see a lot of the government agencies starting to really think about their customers and what they’re trying to produce,” Haney says. “Case in point, we were working with visual recognition and comparing drone footage over time to see if people have added a pool, or built a deck or a gazebo, and does a person need to go visit the site so they can appropriately update the tax records? So, they will fly the drone around, but then the power company goes in and borrows their footage and sees where they need to trim trees. So, they’re working together to use these video feeds to stop going to every house and only focus on the ones they need to focus on. It’s actually saving the taxpayer money.”