Snowflake makes enterprise data AI-ready with native Postgres in its AI Data Cloud

Snowflake announced advancements that make data AI-ready by design, enabling enterprises to rely on data that is continuously available, usable, and governed as AI transitions from experimentation into real-world production systems. With new enhancements to Snowflake Postgres, the database now runs natively in the AI Data Cloud, allowing enterprises to consolidate transactional, analytical, and AI use cases on a single, secure platform.

To help ensure AI systems are trusted at enterprise scale, Snowflake is further embedding enhanced interoperability, governance, and resilience features into its platform, enabling more customers to bring Snowflake directly to their data, wherever it lives.

“As businesses move from AI experimentation to production, the real challenge is ensuring AI systems can consistently access data that is connected, governed, and discoverable across the enterprise,” said Christian Kleinerman, EVP of Product, Snowflake.

“That means eliminating data silos, fragile pipelines, and closed systems that slow down AI deployment and increase risk. By bringing unified operational and analytical data, as well as open interoperability together in one platform, we’re empowering customers to develop enterprise-ready AI systems that work with real business data, securely and at scale,” Kleinerman continued.

“At Sigma, our customers expect live, interactive analytics on the most current business data,” said Jake Hannan, Head of Data, Sigma Computing. “With Snowflake Postgres, we can work directly on fresh transactional data inside Snowflake without relying on complex pipelines or external systems. That gives our teams and customers a simpler, more reliable foundation to build governed analytics and AI-powered experiences that respond in real time.”

Connecting enterprise data and AI to power mission-critical apps and AI agents

Most organizations still keep their transactional and analytical databases siloed on separate systems, a legacy approach that forces teams to rely on complex pipelines to connect these systems. This fragmentation adds steep costs, slows development, introduces risk, and delays insights.

Snowflake Postgres eliminates these pipelines by bringing transactional, analytical, and AI capabilities together on a single, enterprise-ready platform. Because Snowflake Postgres is fully compatible with open source Postgres, companies can move their existing apps onto Snowflake without code changes.

With Snowflake Postgres, teams can power critical apps and AI agents, analyze business performance and trends using the most up-to-date data from their operations, and build AI-driven features like recommendations or forecasting — all without costly, complex data pipelines or the infrastructure overhead of managing multiple vendors.

Snowflake Postgres is powered by pg_lake, a set of PostgreSQL extensions that let Postgres work within an organization's open, interoperable lakehouse grounded in Apache Iceberg. With pg_lake, enterprises can directly query, manage, and write to Apache Iceberg tables using standard SQL.

This capability is delivered within a familiar Postgres environment, so enterprises can eliminate costly data movement between transactional and analytical systems. Enterprises such as BlueCloud and Sigma Computing are using Snowflake Postgres to simplify their data architectures and run enterprise-ready AI and apps on connected data.
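The kind of workflow described above can be sketched in ordinary SQL. The schema, table names, and the clause for designating an Iceberg-backed table below are illustrative assumptions, not documented pg_lake syntax:

```sql
-- Hypothetical example: working with an Iceberg-backed table from a
-- Postgres session. Table names and the USING clause are illustrative
-- assumptions, not confirmed pg_lake syntax.

-- Create a table stored in the open Apache Iceberg format
CREATE TABLE analytics.orders_iceberg (
    order_id   bigint,
    customer   text,
    amount     numeric(12,2),
    created_at timestamptz
) USING iceberg;

-- Write fresh transactional data into the Iceberg table
INSERT INTO analytics.orders_iceberg
SELECT order_id, customer, amount, created_at
FROM   public.orders
WHERE  created_at >= now() - interval '1 day';

-- Query it with standard SQL, with no export pipeline in between
SELECT customer, sum(amount) AS daily_total
FROM   analytics.orders_iceberg
GROUP  BY customer
ORDER  BY daily_total DESC;
```

The point of the sketch is the shape of the workflow: transactional data, Iceberg-format storage, and analytical queries all addressed from one Postgres interface rather than stitched together by external pipelines.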

“For BlueCloud, Snowflake Postgres represents a major opportunity to help our customers eliminate data pipelines, without compromising performance,” said Rob Sandberg, SVP and Head of Advisory Consulting, BlueCloud. “Its enterprise-grade Postgres foundation brings real credibility, particularly for the financial services organizations we support. With Snowflake Postgres, we can deliver low-latency transactional workloads alongside analytics and AI on a single platform, reducing overhead and helping our customers be more agile in meeting their business goals.”

Making data governed and open for trusted AI

As AI moves into production, enterprises need data that remains open, governed, and resilient as it flows across engines, formats, and environments. To address this need, Snowflake is expanding how customers access, share, and govern their data, so AI systems can scale with real-world demands:

Freedom to work across engines without impacting governance controls: To reduce silos and avoid vendor lock-in, Snowflake enforces the same governance policies even when Snowflake data is queried from other engines. Snowflake Horizon Catalog, which provides context and governance for AI across all data, enables customers like science and technology company Merck and Motorq, a leading connected-vehicle intelligence company, to use external engines to securely access data in Apache Iceberg tables (now generally available), as well as create, update, or manage data stored in Iceberg tables (public preview soon).

Seamless data collaboration across open formats: As organizations increasingly rely on open table formats, Snowflake is simplifying how those formats are shared without duplicating data or managing fragile pipelines. Open Format Data Sharing extends Snowflake’s zero-ETL sharing model to include formats such as Apache Iceberg and Delta Lake, enabling secure data sharing across teams, clouds, and regions.

Customers can now share data in open formats while maintaining control over access and costs. A new integration with Microsoft OneLake (now generally available) provides mutual customers with secure, bidirectional read access to Iceberg data managed by Snowflake or Microsoft Fabric. This means customers can seamlessly access all their data across both platforms without complexity or data duplication.

Built-in resilience to protect business-critical data: To help enterprises address regulatory requirements and withstand disruptions, Snowflake is strengthening how data is protected by default. Snowflake Backups (now generally available) protects business-critical data so organizations can recover more quickly from ransomware or other disruptions, while ensuring data isn't altered or deleted once created. These protections give enterprises greater confidence that essential data is preserved, even in the face of unexpected events or security incidents.
