Governments are moving beyond data residency toward mandatory control over encryption keys

Storing data inside your borders isn’t enough anymore. Governments are realizing that data “residency” doesn’t equal data “sovereignty.” Simply put, servers that sit inside your country but are managed by foreign hyperscalers don’t give you real control over your data. That’s a problem. Increasingly, public sector leaders around the world are shifting their focus toward controlling encryption keys themselves. That shift is rooted in a pretty obvious truth: if someone else holds the keys, they control the access.

When cloud providers retain access to decryption mechanisms, whether for maintenance, compliance, or responding to legal requests, it undermines the entire concept of data privacy. If your data can be decrypted, even in theory, by a third party subject to foreign laws like the U.S. CLOUD Act, then your data isn’t sovereign. Sanchit Vir Gogia, Chief Analyst at Greyhound Research, nailed it: “When a cloud provider has any ability to decrypt customer data… the data is no longer truly sovereign.”

If a government wants to ensure its critical information stays out of foreign hands, owning the encryption process, end to end, is non-negotiable. And that means owning the keys. This isn’t just about security; it’s about control, compliance, and geopolitical leverage in a digital world. Executives and CIOs should pay attention because this shift impacts how you choose cloud partners, how you structure data governance, and what your long-term infrastructure strategy needs to look like.

Forward-thinking governments are already heading in this direction. If you’re building or selling to this sector, your future in the market depends on understanding that control over encryption keys isn’t a “nice to have.” It’s foundational.

Swiss public sector leaders are advocating for end-to-end encryption

Switzerland is raising the bar. Privatim, the association of Swiss cantonal and federal data protection commissioners, recently made a clear and direct call: no more storing sensitive government data on international cloud platforms unless there’s genuine end-to-end encryption, fully controlled and implemented by the government body itself.

They specifically flagged Microsoft 365 as not meeting that threshold. Not because it’s insecure in the narrow sense, but because Microsoft, like most hyperscale providers, can still access user data when required. Even with data-at-rest and in-transit encryption, if the provider holds the key, the provider retains a path to the plaintext. And that makes it unsuitable for very sensitive public workloads.

This is a pragmatic pivot. Swiss officials are not rejecting SaaS altogether; they’re saying: “Use it wisely. Own the keys. Encrypt it yourself. Don’t trust blindly.” That’s not paranoia, it’s governance. Especially in a climate where agencies are not just protecting citizen data but navigating real geopolitical stakes in how, where, and by whom digital infrastructure is controlled.

If you’re a technology vendor, ignore this trend at your own risk. More public institutions are going to follow Switzerland’s lead. If you can’t support full encryption with customer key control, chances are, you’re going to lose out on high-stakes public sector contracts. The demand is clear: zero access to plaintext by the provider, ever. For governments, that’s becoming the baseline. For enterprises eyeing these markets, it should be your blueprint.

Geographic data residency fails to address the full scope of data control risks in modern cloud environments

A lot of organizations still talk about data residency like it’s the gold standard. But let’s be honest, it isn’t. Placing data inside a particular country doesn’t mean that data is protected from foreign influence or access. That’s the reality most regulators are waking up to. Storing it locally is not the same thing as locking it down.

Accountability gaps run deeper than location. Today’s SaaS ecosystems involve complex service chains that include subcontractors, third-party tools, and real-time updates that public agencies often can’t track, let alone control. When vendors reserve the right to change contract terms unilaterally, or when backend services are managed by teams in multiple jurisdictions, the customer doesn’t just lose oversight, they lose sovereignty.

Ashish Banerjee, Senior Principal Analyst at Gartner, cut to the point: “Data stored in one jurisdiction can still be accessed by foreign governments under extraterritorial laws like the US CLOUD Act.” That means even if the data never physically leaves a country like Germany or the UAE, it could still be accessed if a U.S.-based provider is running the service.

For public sector and enterprise leaders, this redefines what must be considered “secure.” If legal and operational control aren’t inside your domain, then local hosting is just surface-level protection. Decision-makers should look beyond the promise of regional data centers and demand transparent technical controls, especially around cross-border legal exposure and provider access privileges. The compliance narrative has changed. Encryption and trust boundaries now matter more than physical geography.

European regulatory trends indicate rising prioritization of technical sovereignty through enhanced encryption controls

Switzerland is not standing alone. Germany, France, Denmark, and the European Commission are making similar moves, quietly in some cases, publicly in others. What’s consistent is the direction: European public sector organizations are leaning toward technical sovereignty. That means placing control mechanisms in the hands of the customer, not the cloud provider.

This shift is accelerating. Prabhjyot Kaur, Senior Analyst at Everest Group, said it clearly: “While the Swiss position is more stringent than most, it is not an isolated outlier. It accelerates a broader regulatory pivot toward technical sovereignty controls.” What’s key here is the word “technical.” European regulators are shifting away from relying on legal clauses and certifications. They want enforceable technical barriers, client-side encryption, third-party key custody, and full visibility into who can access what, and how.

This matters for cloud strategy moving forward. If you’re in the business of digital infrastructure, compliance tools, or SaaS, these markets are now demanding more than checkboxes and paper guarantees. They want systems that can stand up to legal tests and geopolitical shifts. If the system can’t prevent unauthorized decryption, even from the provider itself, it doesn’t pass.

C-suite leaders aiming to future-proof their public sector offerings in Europe need to act now. Building sovereignty into the architecture, rather than selling it with contracts, is going to determine which vendors get selected in the next round of public tenders. The roadmap is changing, and the message is simple: giving customers direct control isn’t just an upgrade, it’s an entry requirement.

End-to-end encryption, while securing data, introduces operational trade-offs for government agencies

End-to-end encryption puts the control where it belongs, with the customer. If implemented correctly, it prevents anyone, cloud providers included, from accessing the plaintext content of data. That level of protection is powerful, but it comes at a significant cost.

Functionality takes a hit. When the provider doesn’t have access to the data, standard cloud features like full-text search, indexing, or automated threat detection become limited or impossible. AI-driven tools, which rely on backend data analysis and real-time processing by the provider, don’t work the same way, or at all. Prabhjyot Kaur pointed this out when noting that tools like Microsoft’s AI copilots require provider-side processing, which is incompatible with strict end-to-end encryption models.
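The search limitation is easy to see in a toy sketch: once data is encrypted client-side with a key only the agency holds, the ciphertext the provider stores is opaque to substring search, so provider-side indexing has nothing to work with. The sketch below uses a deliberately simple HMAC-based stream cipher for illustration only; a real deployment would use an authenticated cipher such as AES-GCM, and all names here are hypothetical.

```python
import hashlib
import hmac
import os

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    # Toy HMAC-counter keystream, purely for illustration --
    # production systems should use an AEAD cipher like AES-GCM.
    out = b""
    counter = 0
    while len(out) < length:
        block = nonce + counter.to_bytes(4, "big")
        out += hmac.new(key, block, hashlib.sha256).digest()
        counter += 1
    return out[:length]

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    nonce = os.urandom(16)
    ct = bytes(a ^ b for a, b in zip(plaintext, keystream(key, nonce, len(plaintext))))
    return nonce + ct  # what the provider stores

def decrypt(key: bytes, blob: bytes) -> bytes:
    nonce, ct = blob[:16], blob[16:]
    return bytes(a ^ b for a, b in zip(ct, keystream(key, nonce, len(ct))))

customer_key = os.urandom(32)           # never leaves the agency
record = b"citizen tax file 2024"
stored = encrypt(customer_key, record)  # ciphertext held provider-side

# The provider cannot run full-text search over what it stores,
# but the key holder can still recover the plaintext:
assert decrypt(customer_key, stored) == record
```

This is exactly the trade-off described above: the same property that blocks the provider from reading the data also blocks every provider-side feature that depends on reading it.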

On top of that, infrastructure complexity increases. Agencies that manage their own encryption keys need to invest in internal key management systems. That means more governance overhead, more technical expertise, and more budget. Ashish Banerjee from Gartner emphasized this, saying encryption and decryption at scale can affect performance: it adds latency, requires extra hardware resources, and increases the total cost of ownership.

For public sector executives and CIOs, this creates a trade-off between ironclad security and operational agility. The technology to encrypt everything at source exists. But doing that uniformly across all use cases may not be practical, or even necessary. The key is to identify which data sets truly require that level of control and which do not. Security is critical, but so is system performance and long-term maintainability. Balance will be mandatory.

A tiered approach to encryption is emerging as a practical solution for balancing functionality and security

Not all workloads are equal, and enforcing full end-to-end encryption across every piece of data presents diminishing returns. Governments are starting to approach this challenge with more precision. Instead of applying maximum encryption to everything, they’re allocating encryption levels based on data sensitivity.

Content like state secrets, legal investigations, or national security-related material is being isolated into sovereign environments, custom tenants with strict client-side encryption and rigid access control. Meanwhile, lower-sensitivity data, everything from administrative communications to citizen service records, can still use standardized SaaS platforms with controlled encryption and audit layers.
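The two tiers described above can be sketched as a simple routing policy: records carrying high-sensitivity markings go to a sovereign, client-side-encrypted tenant, while everything else stays on the standard platform. The classification labels below are hypothetical placeholders; real schemes follow national protective-marking standards and are far more granular.

```python
from dataclasses import dataclass
from enum import Enum

class Tier(Enum):
    SOVEREIGN = "sovereign"  # isolated tenant, client-side encryption
    STANDARD = "standard"    # mainstream SaaS with audited encryption

@dataclass
class Record:
    name: str
    classification: str

# Hypothetical marking labels for illustration only.
SOVEREIGN_LABELS = {"secret", "top-secret", "investigation"}

def route(record: Record) -> Tier:
    """Decide which environment a record belongs in,
    based solely on its sensitivity marking."""
    if record.classification in SOVEREIGN_LABELS:
        return Tier.SOVEREIGN
    return Tier.STANDARD

assert route(Record("case-file", "investigation")) is Tier.SOVEREIGN
assert route(Record("meeting-minutes", "internal")) is Tier.STANDARD
```

The design point is that the expensive controls are applied per classification, not per platform, which is what keeps the bulk of day-to-day workloads cheap and usable.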

Sanchit Vir Gogia, Chief Analyst at Greyhound Research, explained this emerging trend clearly: highly confidential content should be wrapped in true end-to-end encryption and segregated into specialized infrastructures, while broader government operations continue on mainstream platforms under enhanced oversight.

This tiered approach gives government CIOs more flexibility. They can invest heavily in securing only the data that truly demands it, without overburdening less critical systems. It also reduces cost, simplifies user experience for general workloads, and lets teams continue using familiar platforms for day-to-day operations.

For vendors, this means offering optionality, not just a one-size-fits-all security model. Public sector clients will expect cloud platforms that support strong segmentation, variable encryption models, and seamless integration with sovereign tools. Long term, markets will reward providers able to deliver that kind of architectural flexibility.

Hyperscale cloud providers must strengthen technical sovereignty features to retain government clientele

Governments are tightening their standards, and hyperscale cloud vendors are under pressure to adapt. Traditional guarantees like regional data hosting and contractual safeguards aren’t enough anymore. The new baseline includes customer-controlled encryption, jurisdiction-based access restrictions, and enforceable technical barriers that eliminate provider-side decryption rights.

These aren’t edge-case requests. They are becoming critical requirements in markets where compliance and sovereignty intersect. Prabhjyot Kaur from Everest Group pointed out that Microsoft has already started rolling out stricter models with jurisdictional access controls and enhanced customer-controlled encryption. That shift isn’t decorative, it’s mandatory to stay relevant in high-compliance public sectors.

Sanchit Vir Gogia at Greyhound Research made it clear: features like confidential computing, client-side encryption, and external key management are no longer value-adds. They’re must-haves if a cloud vendor wants to remain competitive in public-sector procurement.
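External key management is the crux of that list, and the idea can be sketched under heavily simplified assumptions: the provider only ever stores wrapped data keys and must call a customer-side service to unwrap them, so every access is auditable and revocable by the customer. The class and its toy XOR key wrap below are illustrative inventions, not any vendor's API; production systems use standards like AES key wrap and hardware security modules.

```python
import os

class CustomerKMS:
    """Hypothetical sketch of customer-held external key management:
    the agency holds the master key, the cloud provider holds only
    wrapped data keys, and every unwrap request is logged and can
    be refused once the agency revokes access."""

    def __init__(self):
        self._master = os.urandom(32)  # never leaves the agency
        self._enabled = True
        self.audit_log = []

    def wrap(self, data_key: bytes) -> bytes:
        # Toy XOR wrap for illustration; real KMSs use AES key wrap.
        return bytes(a ^ b for a, b in zip(data_key, self._master))

    def unwrap(self, wrapped: bytes) -> bytes:
        self.audit_log.append("unwrap requested")
        if not self._enabled:
            raise PermissionError("agency has revoked provider access")
        return bytes(a ^ b for a, b in zip(wrapped, self._master))

    def revoke(self):
        self._enabled = False

kms = CustomerKMS()
data_key = os.urandom(32)
wrapped = kms.wrap(data_key)      # only this is stored provider-side
assert kms.unwrap(wrapped) == data_key

kms.revoke()                      # the kill switch stays with the agency
try:
    kms.unwrap(wrapped)
except PermissionError:
    pass                          # provider access is now technically blocked
```

This is the "enforceable technical barrier" regulators are asking for: revocation is a property of the architecture, not a clause in a contract.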

Leaders at cloud companies should be recalibrating their go-to-market strategies. The assumption that regulatory compliance can be addressed through detailed terms and backend audit trails is no longer valid. Public agencies expect providers to prove, technically, not just contractually, that their infrastructure cannot be involuntarily accessed or legally compelled into exposing sensitive workloads.

Providers who don’t meet this bar will be disqualified early in the bid process. Those that do stand to win long-term institutional contracts. The market is not just shifting, it’s dividing. And public sector decision-makers are leading the charge on standards.

The cloud computing market is poised to bifurcate as sovereign solutions gain traction over traditional global platforms

We’re seeing a structural shift in the cloud market. As regulatory demands escalate, a clear pattern is emerging: one tier of cloud services will continue to serve global commercial clients with standard models; another will cater to governments and high-compliance sectors through sovereign solutions with tight encryption control and local jurisdictional boundaries.

This isn’t speculative. Ashish Banerjee at Gartner forecast exactly that: a two-tier model, where non-U.S. and regional vendors gain ground by offering platforms tailored to meet stringent government encryption standards. These vendors are seen as better equipped to provide transparency, regulatory alignment, and independence from laws like the U.S. CLOUD Act.

For public sector CIOs, this means expanding beyond familiar vendor ecosystems, and evaluating emerging providers who can deliver sovereignty-oriented products natively. For large tech firms, it’s a wake-up call to build adaptable infrastructure that can operate under varied national frameworks, without compromising security expectations on either side.

This change is already in motion. It will reshape partnerships, procurement processes, and product design standards. Vendors that can’t offer high-trust, sovereignty-ready solutions at scale will lose traction in regulated markets. Business leaders need to position now, because the demand isn’t slowing, and the next round of decisions will define who controls critical infrastructure across borders.

Concluding thoughts

Across the board, expectations have changed. Governments no longer accept surface-level assurances or broad compliance language. They want proof, technical, structural, and enforceable, that their most sensitive data can’t be accessed without their consent. That shift doesn’t just affect cloud providers. It reshapes how enterprise vendors, security consultants, infrastructure architects, and SaaS platforms operate in regulated markets.

For decision-makers, this is a moment to reassess your architecture, partnerships, and risk assumptions. If you’re selling to the public sector, or building systems that touch public data, you’ll need to offer actual control, not just promises. Solutions that don’t include client-side encryption, external key management, and verifiable access boundaries will continue to fall off the shortlist.

Markets will reward companies that lead with sovereignty as a core function, not an afterthought. That means being able to operate across national frameworks, offer flexible deployment models, and deliver full transparency. For business leaders, the opportunity is clear: design with control in mind, and your infrastructure remains relevant. Ignore the shift, and you’ll find yourself locked out of critical markets that are only becoming more demanding.

Alexander Procter

December 5, 2025