Nearly half of sensitive cloud data is still unencrypted, but global uncertainties and AI are driving a new push for data security.

Greetings,

 

Is end-to-end encryption experiencing a surge in demand? In short, yes. Even as countries like the UK undermine strong encryption, a growing number of companies are adding strong encryption requirements to meet their data security and data sovereignty needs. Global affairs and distrust of U.S. tech companies are driving these stronger requirements.

 

I've talked before about how AI is an urgent privacy problem. Application-layer encryption gives customers control of their own data via hold-your-own-key or end-to-end encryption patterns. Yet nearly half of sensitive cloud data remains unencrypted (per the Thales 2025 Cloud Security Study).
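To make the pattern concrete, here's a toy sketch of application-layer encryption (not IronCore's SDK and not production cryptography; the helper names are hypothetical): the application encrypts a sensitive field with a key the tenant holds before anything is stored, so the service and its cloud provider only ever see ciphertext.

```python
import hashlib, hmac, os

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    # Derive a pseudorandom keystream from key + nonce (toy construction;
    # real systems use an authenticated cipher like AES-GCM).
    out = b""
    counter = 0
    while len(out) < length:
        out += hmac.new(key, nonce + counter.to_bytes(4, "big"),
                        hashlib.sha256).digest()
        counter += 1
    return out[:length]

def encrypt_field(key: bytes, plaintext: bytes) -> bytes:
    # Encrypt client-side, before the record ever leaves the tenant.
    nonce = os.urandom(16)
    ks = keystream(key, nonce, len(plaintext))
    return nonce + bytes(a ^ b for a, b in zip(plaintext, ks))

def decrypt_field(key: bytes, blob: bytes) -> bytes:
    nonce, ct = blob[:16], blob[16:]
    ks = keystream(key, nonce, len(ct))
    return bytes(a ^ b for a, b in zip(ct, ks))

# The tenant holds the key; the application stores only ciphertext.
tenant_key = os.urandom(32)
stored = encrypt_field(tenant_key, b"alice@example.com")
assert decrypt_field(tenant_key, stored) == b"alice@example.com"
```

The point of the pattern is where the key lives: because only the tenant holds `tenant_key`, neither the application vendor nor its cloud provider can read the field, which is what hold-your-own-key and end-to-end designs guarantee.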

 

IronCore has solutions that help app developers add this level of security while keeping data searchable and usable. If you're not already doing this, please talk to us and see if we can help.

 

Stay safe and watch out for those bots: they're coming for your data (among other things).  See ya next time.


Patrick Walsh
CEO, IronCore 

 

 

Upcoming events:

  • IOPD Webinar: Hidden Risks of Integrating AI: Managing Data Proliferation and Leakage
    • March 20, 11:00am - 12:00pm ET (virtual)
    • Abstract: A discussion of the hidden risks in apps leveraging modern AI systems, especially those using large language models (LLMs), retrieval-augmented generation (RAG) workflows, and agentic workflows. We will demonstrate how sensitive data, such as personally identifiable information (PII), can be extracted through real-world attacks such as a vector inversion attack, and discuss how to prevent such attacks through encryption and other PETs, plus the wise application of policy.
  • OWASP SnowFROC: Hacking AI-Enabled Apps: Exploit Demos, Data Compromises, and Hardening Patterns
    • April 17 (Denver University Cable Center)
    • Abstract: Adding LLMs to your product is deceptively easy: drop in a chat UI, add RAG, connect tools, and call it done. But when untrusted content becomes part of the prompt, models can be steered into revealing secrets, leaking tenant data, or taking actions you never intended, leaving your app vulnerable to non-obvious attacks.

      We’ll demonstrate exploits mapped to the OWASP Top 10s that start with user-generated content and end with real security impact. We then discuss practical mitigations including architectural patterns, protection strategies, and a decision framework for which AI use cases are safe, safe-enough, or too unsafe to ship.

 


ISSA Webinar: AI Data Blind Spots 

What Security Professionals Need to Know 

 

In this webinar, you’ll learn how modern AI systems work and how to secure them, including an introduction to the role of vector embeddings and how to protect them with encryption-in-use. Companies building AI systems on private data need to know how to keep that data safe without inhibiting the usefulness of new AI products.

 

> Watch the recording

 


Seald's U.S. Shutdown: Migration Options

Comparing Seald's Offering to IronCore's Zero-Trust, Scalable End-to-End Encryption Approach

 

Seald's U.S. shutdown came with little notice, and its European customers are left with 'alpha' status and no support. Here's a comparison with a better option for those wanting end-to-end encryption or cryptographic access controls, especially if using groups to manage access.

 

> Read the blog

 


[VIDEO] AI Is Eating Your Data

Here's The Privacy Tech Fighting Back

 
Your vendors can read your data. That means hackers and governments can too. This video explains the three biggest risks to data privacy today — security failures, government access, and the AI data pipeline — and introduces encrypted search, a privacy-enhancing technology that lets applications search and operate on data they can't decrypt. We cover how vector embeddings can be reversed, why 'encrypted at rest' isn't enough, and what businesses should require from their technology providers.

 

> Watch the video (5m 42s)

 


AI Coding Agents: Our Privacy Line in the Sand

How IronCore balances AI productivity with data protection

 
A breakdown of where IronCore draws the line on the use of generative AI and coding agents to ensure that private data stays private while still taking advantage of the productivity these tools can bring.

 

> Read the full blog

 

LinkedIn
X
GitHub
Mastodon
YouTube

IronCore Labs, 1750 30th Street #500, Boulder, CO 80301, United States, 3032615067

Unsubscribe | Manage preferences