Tech Trends: States Right-Size Cloud to Keep Data Close to Home — and AI Ready
State and local government IT leaders are grappling with how best to capitalize on stockpiles of data. That focus on data as an asset is reshaping how agencies design hybrid environments, and it explains why data repatriation has become a top management priority.
Eryn Brodsky, server and storage practice lead at CDW, frames the trend in the language of value and urgency.
“Data is the most valuable commodity that a customer has within their environment,” she says, arguing that it drives good business outcomes and operational efficiency — and that it is foundational to effective adoption of artificial intelligence.
As agencies push beyond experimentation and into operational AI, Brodsky says the infrastructure conversation expands fast. Looking ahead, she describes a tight linkage among AI, data management and data protection — the latter two becoming critical enablers as organizations decide which AI projects graduate from proof-of-concept.
WATCH: Eryn Brodsky examines data protection in hybrid environments.
Data Management and Data Protection Gain New Potency
For years, agencies often treated data management and data protection as parallel tracks. Brodsky says that’s changing: “Those two things used to be very separate conversations and we’re seeing those two different topics blend more and more together,” because protection becomes inseparable from the “total management story.”
In hybrid environments, that blending becomes a governance issue as much as a technology one. Brodsky describes hybrid infrastructure as spanning on-premises environments alongside cloud and hybrid cloud and multicloud options — and she focuses on how agencies track and protect data as it moves between them.
“As your data moves, is it in the correct location?” she asks, tying placement to governance, regulatory requirements and access controls.
That twofold question, "where is the data, and who can touch it?", makes "close to home" a practical objective for governments, not just a rhetorical one. Keeping certain datasets and workloads in state-run environments can simplify compliance and improve confidence that AI systems access only the data they are supposed to use, Brodsky says.
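Brodsky's placement question can be treated as a checkable rule rather than an aspiration. The sketch below is illustrative only: the dataset names, classifications and policy table are hypothetical, not drawn from any agency, and a real governance tool would pull its inventory and policy from authoritative systems of record.

```python
from dataclasses import dataclass

@dataclass
class Dataset:
    name: str
    location: str          # where the data currently lives, e.g. "on_prem", "public_cloud"
    classification: str    # sensitivity label, e.g. "public", "restricted"

# Hypothetical residency policy: restricted data must stay in state-run environments.
POLICY = {
    "public": {"on_prem", "public_cloud"},
    "restricted": {"on_prem"},
}

def misplaced(datasets):
    """Return datasets whose current location violates the residency policy."""
    return [d for d in datasets if d.location not in POLICY[d.classification]]

# Hypothetical inventory for illustration.
inventory = [
    Dataset("census_extract", "public_cloud", "public"),
    Dataset("medicaid_claims", "public_cloud", "restricted"),
]

for d in misplaced(inventory):
    print(f"{d.name}: move out of {d.location}")
```

Running a check like this as data moves, rather than once at migration time, is what turns "is it in the correct location?" into an ongoing governance control instead of a point-in-time audit.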
LEARN MORE: On-demand storage may change the game for governments.
Cloud Repatriation Becomes Part of Hybrid Planning
Brodsky calls cloud repatriation “a very key topic” as organizations run AI pilots in public cloud and then decide which ones to operationalize. She says industry researchers agree that as organizations determine which proofs of concept move into implementation, they “bring them back on-prem.”
Her argument is straightforward: Many agencies use public cloud to move fast during experimentation, then re-evaluate placement when costs and security considerations become clearer at production scale. She points directly to the economics and risk profile of running AI at scale in public cloud, citing “the cost of cloud to run it there” and “a lot of security concerns” as drivers of repatriation decisions.
From a government perspective, Prakash Pattni, managing director of digital transformation for IBM Cloud for Regulated Industries, also stresses that the “blanket push” to move everything into public cloud is fading. He describes a “hybrid by design” approach that places each workload where it delivers the most value across cost, performance, scalability and control.
Pattni says the government case for repatriation often centers on sovereignty, cost predictability and security requirements — and he adds a distinctly current accelerant: AI’s need for predictable, high-capacity compute and low-latency access to large datasets.
READ MORE: Data management is critical to successful AI projects.
Workloads Increasingly Require Deliberate Data Placement
Hybrid strategies only work when workload placement becomes a managed discipline. BreAnne Buehl, head of healthcare and life sciences for VMware Cloud Foundation at Broadcom, describes the repatriation movement as a major wave, citing findings that 74% of public-sector leaders are considering moving workloads from public cloud back to private cloud or on-premises, and about 40% have already started.
She emphasizes that repatriation targets specific workload categories — especially those tied to security/compliance concerns, data intensity, high integration needs and customer-facing applications.
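The categories Buehl cites suggest a simple triage: workloads matching several of them are stronger repatriation candidates. The sketch below is a hypothetical illustration of that reasoning; the criteria names, weights and threshold are assumptions for demonstration, not a published methodology.

```python
# Criteria drawn from the workload categories discussed above; the
# threshold and boolean model are illustrative assumptions.
CRITERIA = [
    "compliance_sensitive",  # tied to security/compliance concerns
    "data_intensive",        # large or fast-growing datasets
    "high_integration",      # many dependencies on other systems
    "customer_facing",       # public-serving applications
]

def repatriation_candidate(workload, min_hits=2):
    """Flag a workload when it matches enough of the cited categories."""
    hits = sum(1 for c in CRITERIA if workload.get(c, False))
    return hits >= min_hits

# Hypothetical workloads for illustration.
erp = {"compliance_sensitive": True, "data_intensive": True, "high_integration": True}
static_site = {"customer_facing": True}

print(repatriation_candidate(erp))          # matches several criteria
print(repatriation_candidate(static_site))  # matches too few
```

A real placement review would weigh cost, latency and sovereignty alongside these flags, but even a coarse checklist like this makes placement a repeatable decision rather than an ad hoc one.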
And Herb Thompson, a VMware field CTO at Broadcom and a former Wisconsin deputy CIO, argues that the governance model itself is evolving. He recalls how public cloud made it easy for agencies to stand up systems without being “a computer, storage, network, server or security expert,” because “everything was presented in a single portal.”
That ease, he suggests, becomes an expectation — even when agencies decide some workloads belong back in state-controlled environments. In other words, rightsizing succeeds when agencies combine cloud-like operational simplicity with deliberate data placement and governance.
Keeping Data On-Premises as a Strategic Move
Brodsky says hybrid infrastructure is ultimately about outcomes: Agencies protect time and mission performance by ensuring data is available, prepared and trusted. When it isn’t, she says, organizations “lose time” and “lose their competitive edge.”
To get ahead of the next wave, especially as AI pilots mature, she urges leaders to treat resilience and protection as foundational building blocks rather than tasks to bolt on later. She specifically warns agencies not to treat backup, disaster recovery and cyber resiliency “as an afterthought” but to address them at the beginning of migrations and new builds.
Pattni adds that hybrid infrastructure supports a cloud-smart posture by preserving agility while enabling controlled repatriation. He describes hybrid as “a vital strategic enabler,” and says it lets organizations deploy new applications, adjust resources and “reallocate resources dynamically” without getting locked into a single platform.
Hybrid infrastructure in state and local government increasingly functions as a data strategy: Agencies use public cloud where it fits, but they also repatriate selected workloads and datasets so high-value data stays governed, protected and close to home — especially when AI moves from pilots to production.
