

Nov 06 2025

Why Cloud Repatriation Is Rising in State and Local Government IT

Agencies are adopting a hybrid-by-design strategy so they can place each workload where it fits best and move it in phases without disruption. AI intensifies this trend.

When Herb Thompson served as Wisconsin deputy CIO, he faced a deceptively simple question from a board of agency secretaries: If there’s a breach, where is our data? And is it protected?

At the time, multiple agencies were pursuing their own cloud paths. “When everybody is using various clouds, you can’t control it,” Thompson says. That realization sparked stricter governance on what could shift to public cloud and why, a mindset he now sees spreading as agencies revisit placement decisions and selectively bring workloads home.

Today, as a field CTO for VMware by Broadcom, Thompson hears that same refrain from CIOs nationwide: Modernization isn’t reversing, but cloud repatriation is becoming a disciplined tool for better security, performance and cost control.


What Is Cloud Repatriation for State and Local Governments?

Prakash Pattni, managing director for digital transformation at IBM, defines cloud repatriation as moving applications and data out of public cloud and back to on-premises or private environments — often as part of a “hybrid by design” strategy rather than a wholesale retreat.

“The cloud remains essential,” he says, “but the blanket push to migrate everything there is fading.” Organizations are splitting workloads: stable, mission-critical systems in private data centers or colocation; elastic demand in public cloud; and sensitive data sets kept onsite while using cloud services for analytics.

Broadcom’s team echoes that practical definition. “Right now, it’s less ‘everyone jump on the public cloud bandwagon’ and more ‘be smart about where we place our workloads,’ because it doesn’t make sense for everything,” says BreAnne Buehl, who leads public sector for Broadcom’s VMware Cloud Foundation division.


State and Local Governments Repatriate Workloads from Public Cloud

For state and local leaders, the appeal centers on sovereignty, compliance, cost predictability and steady performance for data-heavy or latency-sensitive systems.

“As governments face rising demand for advanced compute, navigate concerns about cost predictability, sovereignty and operational control, and manage security requirements, they may be moving toward cloud repatriation,” Pattni says. He points to predictable run-rate economics on private infrastructure and lower-latency access to sensitive data sets as key motivators.

Buehl adds that public sector sentiment has shifted after hard-won lessons. In a recent Broadcom-commissioned survey of roughly 1,800 respondents, the public sector contingent (about 20% of the sample) preferred a mix of public and private cloud, with a “very clear preference” to build new workloads in private cloud and repatriate some from public. Top drivers were security risk, cost management and integration with legacy systems.

“Two out of three IT leaders were either very or extremely concerned with maintaining compliance for data stored in public cloud,” she says. Many also perceived material waste in public cloud spending.

Thompson sees the same themes in the field, framed through governance. “I cannot think of a public sector customer right now that doesn’t have a governance team over workload placement,” he says — one that looks at the financial, technical and business case before a workload moves.

READ MORE: Data dashboards can improve citizen services.

When Does Repatriation Make Sense for Government Agencies?

Repatriation isn’t an ideology; it’s a workload-by-workload decision. Pattni names three common triggers:

  • Security and compliance. Controls can be easier to implement consistently on private infrastructure for regulated data.
  • Vendor lock-in. Agencies regain leverage and choice by reducing single-provider dependency.
  • Skills and configuration gaps. Lift-and-shift apps that weren’t refactored often underperform and cost more in public cloud.

Thompson offers an example: One state vacated its data center and moved 1,500 applications to public cloud, only to find that about 20% “just didn’t perform,” especially legacy databases not designed for cloud. Others realized that auditing every application against federal frameworks across multiple clouds was onerous.

“When I pull some of those workloads back, I can do a single federal audit and comply with it much quicker,” he says. Cost dynamics can also bite; long-term commitments and slower-than-planned migrations left “money on the table,” prompting agencies to rightsize their cloud expectations.

Beyond contracts, day-to-day economics matter. Agencies that moved systems and then had to bring data back for integration confronted egress fees and rising storage rates.

“It’s going to be cheaper to run this back inside the data center,” Thompson says, especially with federal funding headwinds affecting healthcare and education programs.

Hybrid-by-Design Turns Repatriation Into a Feature, Not a Failure

The connective tissue for all of this is hybrid infrastructure.

“Hybrid environments allow organizations to efficiently move workloads back on-site or to private clouds when it becomes more cost-effective, secure or compliant,” Pattni says. Crucially, hybrid environments let agencies transition in phases without disrupting operations while preserving agility to use public cloud services where they add value.

Meanwhile, agencies expect the cloud experience on-premises. “Our customers using VMware Cloud get a single pane of glass, and application performance and cost analysis is far superior to anything they had in the cloud,” Thompson says. “They’re getting a better experience on-premises than they actually did in the cloud,” along with cost transparency.

That combination — governed workload placement plus cloudlike ergonomics — helps leaders reframe repatriation as simply putting each workload where it fits best, now and over time.

DIVE DEEPER: State and local governments can improve monitoring to boost AI.

Where Do AI and Data Repatriation Intersect for Governments?

Generative and agentic artificial intelligence are accelerating data repatriation decisions for state and local governments.

“We see a huge push toward modern applications, and people overall have shown that they trust private cloud environments a lot more” for model training, tuning and inference, Buehl says. Agencies want privacy, control and predictable performance for AI workloads running alongside sensitive data sets.

Thompson has seen several universities and health systems shift from public cloud pilots to shared, private AI platforms.

“They brought those private AI workloads on-prem and are building multitenancy — a single instance in a single data center,” Thompson says. He adds that studies show 40% to 50% savings versus public cloud while keeping student and health data on campus.

Pattni sees the same pattern at the strategy level: AI training and serving demand predictable, high-capacity compute with low-latency access to large, regulated data sets, a need often better met by private cloud or a hybrid fabric. Agencies will keep using public cloud for experimentation or specialized services, he says, but they anchor sensitive data closer to home when using AI.

A Practical Playbook for Agencies

Cloud repatriation and data repatriation aren’t retreats from modernization; they’re the maturation of it. In a hybrid-by-design world, agencies place each workload where it best serves the mission. For an increasing number of systems — especially data-rich, regulated or AI-driven workloads — that may mean coming home.
Here are some steps for making it happen:

  • Inventory and govern. Before moving data, stand up (or strengthen) a cross-functional workload placement board that weighs cost, performance, risk and mission value. Wisconsin’s lesson: Start with data stewardship.
  • Identify candidates. Prioritize data-intensive, tightly integrated and latency-critical systems that underperform or cost more in public cloud, especially legacy databases.
  • Model true total cost of ownership. Include egress, premium services and compliance overhead. If a workload runs “hot and steady,” private may pencil out better.
  • Keep the user experience. Deliver cloudlike provisioning, observability and chargeback on-premises so teams don’t lose speed.
  • Design for hybrid AI. Run sensitive model training and steady inference near the data; burst to public cloud for tests and evaluation.
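
The "model true total cost of ownership" step above can be sketched as a back-of-the-envelope comparison. Everything in this sketch is a hypothetical illustration: the function names, rates and dollar figures are assumptions, not actual vendor pricing or an agency's real numbers.

```python
# Hypothetical TCO comparison for a steady-state ("hot and steady") workload.
# All rates and figures are illustrative assumptions, not vendor pricing.

def monthly_public_cloud_cost(compute, storage_gb, egress_gb,
                              storage_rate=0.023, egress_rate=0.09):
    """Public cloud run rate: pay-as-you-go compute plus storage and
    the egress fees that often surprise agencies pulling data back."""
    return compute + storage_gb * storage_rate + egress_gb * egress_rate

def monthly_private_cost(hardware_capex, amortization_months,
                         power_cooling, staff_share):
    """Private/on-prem run rate: amortized hardware plus fixed
    operating costs (power, cooling, a share of staff time)."""
    return hardware_capex / amortization_months + power_cooling + staff_share

# A data-heavy workload with constant compute and frequent integration pulls.
public = monthly_public_cloud_cost(compute=4_000, storage_gb=50_000,
                                   egress_gb=20_000)
private = monthly_private_cost(hardware_capex=150_000, amortization_months=60,
                               power_cooling=800, staff_share=1_200)

print(f"public cloud:  ${public:,.2f}/month")   # $6,950.00/month
print(f"private cloud: ${private:,.2f}/month")  # $4,500.00/month
print("repatriation candidate" if private < public else "stays in public cloud")
```

With these assumed numbers, steady egress and storage charges tip the balance toward on-premises, which mirrors the article's point: bursty or elastic workloads favor public cloud, while hot-and-steady, data-heavy systems can pencil out cheaper at home once egress and compliance overhead are counted.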

As Thompson’s Wisconsin experience reminds government officials, the first questions after any incident still apply: Where is your data, and is it protected?
