Fiduciary Responsibility Is Key in the Age of AI
It’s hard to separate AI from financial strategy.
Whether it’s a city council or a state agency, the fiduciary duty is clear: Be a good steward of taxpayer dollars. That means avoiding flashy one-off projects that fail to deliver.
Instead, state and local leaders must build a roadmap that aligns AI initiatives with measurable returns. Many jurisdictions have already begun doing this, especially at the state level.
Leaders must also think sequentially. This means deploying AI in ways that generate value at each step while minimizing risk. This requires acknowledging limits, asking how the public might react and setting realistic guardrails before rushing into automation.
This does not mean doing nothing and taking zero risks. Calculated risks will need to be taken, and that’s where organizational change management, cross-functional guiding coalitions and AI sandboxes will come into play.
Trust Is Central to AI and Government’s Use of It
If there are two things people don’t trust right now, they’re government and AI. Public trust in government is nearing a historic low. Meanwhile, the Edelman Trust Barometer shows that about one-third of the public is excited about AI, one-third is deeply skeptical, and the rest are somewhere in between. If government leaders implement AI poorly, half of residents could reject it outright.
Building trust isn’t just about policies. It’s also about listening to concerns, explaining why and how AI is being used and involving diverse perspectives in decision-making.
Enter cross-functional guiding coalitions, also known as centers of excellence. Too often, AI efforts are siloed within IT or security teams, leading to what we call “cognitive blindness” about impacts in other areas. This concept is demonstrated in tests where people are asked to focus on something happening in a video, but in doing so, fail to see a gorilla walk through the frame. In the case of AI, the gorilla could be a breach of data privacy.
To succeed, an AI coalition must have representation from operations, legal, public communications and beyond to ensure decisions are made transparently and with full context.
Chief AI Officers Must Take the Lead on Change Management
The role of the chief AI officer has become more common in state and local government. There’s risk in treating the CAIO post as purely technical. Yes, AI demands data architecture, cybersecurity and vendor evaluation skills. But the biggest challenge isn’t the algorithms. It’s the people.
The ideal CAIO is part strategist, part communicator and part change manager with the skills to navigate the social, cultural and workforce disruptions AI brings. This means preparing employees for job transitions, guiding ethical adoption and ensuring AI supports human decision-making.
Organizational change management is a must, and the CAIO needs to spearhead this, helping agencies to rethink workflows, not just automate existing ones. After all, you don't want to automate your mistakes; you want to tap AI’s transformational potential in a safe and productive way.