Snowflake Introduces Cortex: A Managed Service for AI Building Blocks

Today, the Montana-based data-as-a-service and cloud storage company Snowflake announced Cortex, a fully managed service that brings the power of large language models (LLMs) into its data cloud. Cortex provides a suite of AI building blocks, including open-source LLMs, to analyze data and build applications targeting different business-specific use cases.

Enabling Enterprises to Embrace Generative AI

Enterprises have been looking to leverage generative AI, but the associated constraints, such as the need for AI talent and complex GPU infrastructure management, have made it challenging to bring applications to production. Snowflake Cortex aims to streamline this process by providing users with a set of serverless specialized and general-purpose AI functions.

  • The specialized functions leverage language and machine learning models, allowing users to accelerate specific analytical tasks through natural language inputs, such as extracting answers, summarizing information, translating languages, building forecasts, and detecting anomalies.
  • The general-purpose functions cover a variety of models, including open-source LLMs and Snowflake’s proprietary models, and come with vector embedding and search capabilities to help users contextualize model responses based on their data. This allows for the creation of custom applications targeting different use cases (see the sketch after this list).
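
Snowflake has not published exact call signatures alongside the announcement, but the following is a minimal sketch of what invoking these functions from Python might look like. It assumes the snowflake-connector-python package, placeholder credentials, a hypothetical product_reviews table, and illustrative SQL function names (SNOWFLAKE.CORTEX.SUMMARIZE, SNOWFLAKE.CORTEX.TRANSLATE) modeled on the capabilities described above.

    import snowflake.connector

    # Assumption: placeholder connection details; substitute real credentials.
    conn = snowflake.connector.connect(
        account="my_account",
        user="my_user",
        password="my_password",
        warehouse="my_warehouse",
        database="my_database",
    )

    cur = conn.cursor()
    try:
        # Specialized task function (illustrative name): summarize free-text rows.
        cur.execute(
            "SELECT SNOWFLAKE.CORTEX.SUMMARIZE(review_text) "
            "FROM product_reviews LIMIT 5"
        )
        for (summary,) in cur.fetchall():
            print(summary)

        # Specialized task function (illustrative name): translate German text to English.
        cur.execute(
            "SELECT SNOWFLAKE.CORTEX.TRANSLATE(review_text, 'de', 'en') "
            "FROM product_reviews LIMIT 5"
        )
        for (translation,) in cur.fetchall():
            print(translation)
    finally:
        cur.close()
        conn.close()

Because the functions are exposed as serverless SQL inside Snowflake, the calling application needs no model hosting or GPU provisioning of its own, and the data stays within the platform.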

“This is great for our users because they don’t have to do any provisioning,” said Sridhar Ramaswamy, SVP of AI at Snowflake. “We do the provisioning and deployment. It is just like an API, similar to what OpenAI offers but built right within Snowflake. The data does not go anywhere, and it comes with the kind of guarantees that our customers want and demand.”

Enhancing Snowflake’s Platform with Native LLM Experiences

While Cortex has just been announced for enterprise use, Snowflake is already leveraging the service to enhance its platform’s functionality with native LLM experiences. Three Cortex-powered capabilities have been launched in private preview:

  • Snowflake Copilot: A conversational assistant that allows users to ask questions about their data in plain text, write SQL queries, refine queries, and filter down insights.
  • Universal Search: LLM-powered search that helps users quickly find the most relevant data and apps for their use cases and start getting value from them.
  • Document AI: Helps in extracting information from unstructured documents hosted in the Snowflake data cloud.

Similar capabilities have been built by other players in the data industry, including Databricks, Informatica, and Dremio, further highlighting the importance of LLM-based solutions in managing and querying data through natural language inputs.

Additional Advancements and Investments

Beyond Cortex, Snowflake also announced advancements in support for Iceberg Tables and new capabilities in its Horizon governance solution. These include data quality monitoring, improved data lineage understanding, enhanced data classification, and a trust center for cross-cloud security and compliance monitoring.

Additionally, Snowflake launched a funding program to invest up to $100 million in early-stage startups building Snowflake native apps. This program is backed by Snowflake’s VC arm and multiple venture capital firms, reinforcing the company’s commitment to fostering innovation in the Snowflake ecosystem.
