Snowpark Container Services Launched
July 10, 2023

Snowflake announced new innovations that extend data programmability for data scientists, data engineers, and application developers so they can build faster and more efficiently in the Data Cloud.

With the launch of Snowpark Container Services (private preview), Snowflake is expanding the scope of Snowpark so developers can unlock broader infrastructure options, such as accelerated computing with NVIDIA GPUs and AI software, and run a wider range of workloads within Snowflake’s secure and governed platform without added complexity, including AI and machine learning (ML) models, APIs, internally developed applications, and more. Using Snowpark Container Services, Snowflake customers also gain access to an expansive catalog of third-party software and apps, including large language models (LLMs), notebooks, MLOps tools, and more, within their account. In addition, Snowflake is simplifying and scaling how users develop, operationalize, and consume ML models, unveiling new innovations so more organizations can bring their data and ML models to life. These advancements include a set of new Snowpark ML APIs for more efficient model development (public preview), a Snowpark Model Registry (private preview) for scalable MLOps, Streamlit in Snowflake (public preview soon) to turn models into interactive apps, and advanced streaming capabilities.

“Snowflake’s product advancements are revolutionizing how customers build in the Data Cloud, enabling data scientists, data engineers, and application developers with extended programmability and a wide range of use cases so they can build, test, and deploy anything they can dream up, without tradeoffs,” said Christian Kleinerman, SVP of Product, Snowflake. “Our continued investments in Snowpark, alongside our machine learning and streaming capabilities, accelerate how users put their data to work, unlocking new ways to drive impact across their organizations with increased flexibility.”

Snowpark continues to serve as Snowflake’s secure environment for deploying and processing non-SQL code with various runtimes and libraries — expanding who can build and what gets built in the Data Cloud. It lets builders work with data more effectively in their programming languages and tools of choice, while providing organizations with the automation, governance, and security guarantees missing in legacy data lakes and big data environments.

Snowpark Container Services further expands the scope of workloads that can be brought to customers’ data. It provides users with the flexibility to build in any programming language and deploy on broader infrastructure choices, including the NVIDIA AI platform for optimized acceleration, with the same ease of use, scalability, and unified governance of the Snowflake Data Cloud. In addition, Snowpark Container Services can be used as part of a Snowflake Native App (public preview on AWS), enabling developers to distribute sophisticated apps that run entirely in their end-customer’s Snowflake account. Snowpark Container Services will also enable users to securely run leading third-party generative model providers like Reka directly within their Snowflake account, removing the need to expose proprietary data to accelerate innovation.
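As an illustration of how a containerized workload of this kind is described, a service specification might look like the following sketch. All names (database, schema, image repository, container) are hypothetical, and the exact schema may differ in the private preview:

```yaml
# Hypothetical Snowpark Container Services specification.
# The image path points at an image pushed to a Snowflake image
# repository; all identifiers here are illustrative.
spec:
  containers:
    - name: llm-inference
      image: /my_db/my_schema/my_repo/llm_image:latest
      resources:
        requests:
          nvidia.com/gpu: 1   # request GPU acceleration for the container
```

A specification like this would then be deployed onto a compute pool in the customer’s account, keeping both the container and the data it processes inside Snowflake’s governance boundary.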

Snowflake has partnered with dozens of third-party software and application providers to deliver world-class products that can run within their end-customer’s Snowflake account using Snowpark Container Services. For example, customers can run Hex’s Notebooks for analytics and data science, use popular AI platforms and ML features from Alteryx, Dataiku, and SAS to run more advanced AI and ML processing, and manage these data workflows with Astronomer’s platform powered by Apache Airflow — all entirely within Snowflake. These are just a few examples, with AI21 Labs, Amplitude, CARTO, H2O.ai, Kumo AI, Pinecone, RelationalAI, Weights & Biases, and more also delivering their products and services with Snowpark Container Services.

To streamline and scale machine learning model operations (MLOps), Snowflake is announcing the new Snowpark Model Registry, a unified repository for organizations’ ML models. The registry enables users to centralize the publishing and discovery of models, streamlining collaboration between data scientists and ML engineers and making it easier to deploy models into production.

Snowflake is also advancing its integration of Streamlit in Snowflake, empowering data scientists and other Python developers to increase the impact of their work by building apps that bridge the gap between data and business action. With Streamlit in Snowflake, builders can use familiar Python code to develop their apps, transforming an idea into an enterprise-ready app with just a few lines of code, and then quickly deploy and share these apps securely in the Data Cloud.

In addition, Snowflake is making development within its unified platform easier and more familiar through new capabilities, including native Git integration (private preview) to support seamless CI/CD workflows and a native Command Line Interface (CLI) (private preview) for optimized development and testing within Snowflake. New innovations also make it easier and more cost-effective for data engineers to work with low-latency data, without having to stitch together solutions or build additional data pipelines. Snowflake is eliminating boundaries between batch and streaming pipelines with Snowpipe Streaming (general availability soon) and Dynamic Tables (public preview), delivering a simplified and cost-effective solution for data engineers to ingest streaming data and easily build complex declarative pipelines.
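As a sketch of how a declarative pipeline is expressed with Dynamic Tables, the statement below keeps an aggregate continuously refreshed from a source table; the table and warehouse names are hypothetical:

```sql
-- Hypothetical example: Snowflake keeps daily_totals refreshed
-- within the target lag, replacing a hand-built incremental job.
CREATE OR REPLACE DYNAMIC TABLE daily_totals
  TARGET_LAG = '1 minute'   -- maximum staleness tolerated
  WAREHOUSE  = my_wh        -- compute used for refreshes
AS
SELECT order_date,
       SUM(amount) AS total_amount
FROM raw_orders
GROUP BY order_date;
```

The declarative form lets the engineer state the desired result and freshness, while the platform decides when and how to refresh, which is the batch/streaming unification the announcement describes.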
