Snowflake plugs PostgreSQL into its AI Data Cloud



Snowflake launches PostgreSQL database-as-a-service within its AI Data Cloud, allowing teams to power apps and AI agents while analyzing business performance without costly data pipelines.

Snowflake is launching a PostgreSQL database-as-a-service within its AI Data Cloud, placing transactional workloads alongside analytics and AI under a single set of governance rules. PostgreSQL now runs natively in the platform, allowing teams to power apps and AI agents, analyze business performance and trends using their operational data, and build recommendation or forecasting systems without the cost and complexity of building data pipelines and managing multiple vendors, the cloud data platform provider said.

"Say you want to build an app on data that is in Snowflake, but if that app doesn't have a relational OLTP [online transaction processing] database to store the data, they have to go break out of the boundary," Snowflake EVP of product Christian Kleinerman told The Register. "With the PostgreSQL service, our goal is to provide this secure boundary where, if customers build apps or build agents within that boundary, their data has not left the compliance and regulatory perimeter for Snowflake."

In addition, the service's full compatibility with open source PostgreSQL allows organizations to move existing apps onto Snowflake without code changes, according to Snowflake. The service relies on pg_lake, a set of open source PostgreSQL extensions that allow developers and data engineers to read and write directly to Apache Iceberg tables from PostgreSQL, thereby cutting out the need to extract and move data. Iceberg is an open table format that proponents say lets users bring their preferred analytics engines to their data without moving it. It is widely used and supported across the cloud and data platform ecosystem, including by Snowflake, Google, AWS, and others.
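To make the pg_lake idea concrete, here is a minimal sketch of the pattern described above, written in Python with psycopg2. The connection string, the orders table, and the assumption that pg_lake has already mapped that table onto an Apache Iceberg dataset are all illustrative rather than Snowflake's or Crunchy Data's actual configuration; the point is only that the application speaks ordinary PostgreSQL while the storage underneath is the open table format.

    # Sketch: an app talks plain PostgreSQL; pg_lake (assumed to be configured)
    # maps the table onto an Apache Iceberg dataset underneath. Connection
    # details and the "orders" table are invented for illustration.
    import psycopg2

    conn = psycopg2.connect(
        "host=pg.example.internal dbname=appdb user=app password=secret"
    )

    with conn, conn.cursor() as cur:
        # Write path: an ordinary INSERT; under the stated assumption the row
        # lands in the Iceberg table rather than in a separate operational copy.
        cur.execute(
            "INSERT INTO orders (customer_id, amount, created_at) VALUES (%s, %s, now())",
            (42, 19.99),
        )

        # Read path: the same table is queryable with standard SQL, so an
        # analytics engine pointed at the Iceberg files sees the same data
        # without an extract-and-load hop.
        cur.execute("SELECT customer_id, sum(amount) FROM orders GROUP BY customer_id")
        for customer_id, total in cur.fetchall():
            print(customer_id, total)

    conn.close()

Because the write and the read hit the same Iceberg-backed table, there is no extract step for an analytics engine to wait on, which is the data-movement saving Snowflake is claiming.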

Snowflake claims this eliminates costly data movement between transactional and analytical systems. The vendor already had a transactional capability within its platform, Unistore, announced in 2022 but not generally available until late 2024. One analyst commented last year that Unistore had attracted little interest. Kleinerman said Unistore offered low-latency reads and writes, but customers were also asking for a PostgreSQL-compatible service. After weighing a number of options, Snowflake bought Crunchy Data, a PostgreSQL service provider.

IDC research director Devin Pratt said the move helps extend Snowflake beyond analytics into a managed OLTP offering. Online transaction processing handles live application reads and writes, while online analytical processing (OLAP) is used for reporting and analysis. Running both in one environment, he said, supports agentic AI and real-time streaming because agents need continuous access to analytical insight and live transactional data, reducing the delay between data creation and analysis.
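A rough illustration of what that looks like in practice, again in Python with psycopg2 and with invented table and column names: an agent fetches a freshly written order (the OLTP side) together with a rolling 90-day average (the OLAP side) in a single statement against one system, rather than waiting for a batch pipeline to copy transactional rows into a separate warehouse.

    # Sketch of the "live row plus historical aggregate" pattern. Table and
    # column names are hypothetical; only the general shape of combining an
    # OLTP lookup with an OLAP rollup in one place comes from the article.
    import psycopg2

    QUERY = """
    SELECT o.order_id,
           o.amount,
           stats.avg_amount,
           o.amount > 2 * stats.avg_amount AS looks_unusual
    FROM orders AS o
    CROSS JOIN (
        SELECT avg(amount) AS avg_amount
        FROM orders
        WHERE created_at > now() - interval '90 days'
    ) AS stats
    WHERE o.order_id = %s
    """

    def check_order(conn, order_id):
        """Return a just-written order alongside a rolling analytical baseline."""
        with conn.cursor() as cur:
            cur.execute(QUERY, (order_id,))
            return cur.fetchone()

    if __name__ == "__main__":
        conn = psycopg2.connect(
            "host=pg.example.internal dbname=appdb user=agent password=secret"
        )
        print(check_order(conn, 1001))
        conn.close()

The specifics are made up, but the shape is the point: the agent sees the live transactional row and the analytical context in the same query, with no pipeline delay between data creation and analysis.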

Snowflake is not alone in adopting this strategy, however. "It reflects a broader trend of vendors pairing operational databases with analytics to support real-time and agentic AI workflows," Pratt said. Snowflake rival Databricks, after acquiring serverless PostgreSQL provider Neon, announced its own service, Lakebase. With OLTP and OLAP in the same platform, Pratt said, teams can often reduce ETL and data duplication and apply more consistent governance and observability across transactional and analytical workloads.

"The value is a more unified operational and analytical stack with consistent management and security," he said.


Key Benefits of Snowflake's PostgreSQL Integration:

  • Unified Governance: All workloads operate under a single compliance and regulatory perimeter
  • No Code Changes: Full PostgreSQL compatibility allows existing applications to migrate seamlessly
  • Eliminated Data Movement: pg_lake extensions enable direct reading and writing to Apache Iceberg tables
  • Cost Reduction: Removes the need for separate transactional and analytical systems
  • Real-time AI Support: Enables agentic AI workflows with continuous access to both operational and analytical data

The PostgreSQL service represents a significant evolution in Snowflake's platform strategy, moving beyond its analytics roots to become a more comprehensive data platform. By integrating transactional capabilities directly into its AI Data Cloud, Snowflake is positioning itself to compete more directly with traditional database vendors while leveraging its strengths in data governance and analytics.

This move also reflects the growing convergence between operational and analytical workloads in modern data architectures. As organizations increasingly rely on AI and real-time analytics, the ability to process transactions and analyze data within the same environment becomes a critical competitive advantage. Snowflake's approach of using open standards like PostgreSQL and Apache Iceberg ensures compatibility with existing tools and ecosystems while providing the unified governance that enterprises require.

For organizations already invested in Snowflake, the new capability could simplify their data architecture by removing the need for separate transactional databases and the pipelines that typically connect them to analytical systems. The pg_lake integration with Apache Iceberg is particularly noteworthy, as it lets the same data serve both operational and analytical queries without maintaining a separate copy or a pipeline between the two.

The broader trend Pratt identifies, the pairing of operational databases with analytics, suggests this is not just a Snowflake strategy but a wider shift in how data platforms are designed. As AI agents and real-time applications become more prevalent, the ability to process transactions and generate insights in the same environment is likely to become a baseline requirement rather than a competitive differentiator.

For developers and data engineers, this means a future where the boundaries between operational and analytical workloads continue to blur, requiring new skills and approaches to data architecture. The emphasis on open standards and compatibility suggests that organizations will have more flexibility in choosing their tools while still benefiting from unified governance and management.

Snowflake's PostgreSQL integration represents a significant milestone in the evolution of data platforms, one that could reshape how organizations think about their data architecture and the tools they use to build AI-powered applications and analytics.
