Snowflake Unveils Arctic, an Open-Source Large Language Model for Enterprise AI

Snowflake unveils Arctic, a state-of-the-art open-source LLM optimized for enterprise use cases, setting a new standard for transparency and cost-effectiveness in AI technology.

Israel Ojoko

Snowflake, a cloud computing company, has announced the release of Snowflake Arctic, a state-of-the-art large language model (LLM) that the company positions as the most open, enterprise-grade LLM on the market.

Arctic is part of the Snowflake Arctic model family, which also includes advanced text embedding models optimized for retrieval use cases.

The Snowflake Arctic LLM uses a Dense-MoE Hybrid architecture: a 10B-parameter dense transformer combined with a 128x3.66B Mixture-of-Experts (MoE) MLP, for roughly 480B total parameters of which only 17B are active for any given token. The design aims to deliver top-tier intelligence with high efficiency at scale. "Snowflake Arctic achieves the highest score on this metric while being trained at the minimum cost, outperforming models like Databricks which spend 8 times more," according to Snowflake.
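The published parameter figures can be sanity-checked with simple arithmetic. The sketch below uses the sizes Snowflake describes (10B dense backbone, 128 experts of ~3.66B parameters each); the top-2 expert routing per token is an assumption, and all numbers are approximate:

```python
# Rough parameter arithmetic for Arctic's Dense-MoE hybrid design,
# based on the figures Snowflake published (values are approximate).

DENSE_PARAMS = 10e9      # 10B dense transformer backbone
NUM_EXPERTS = 128        # experts in the MoE MLP
EXPERT_PARAMS = 3.66e9   # ~3.66B parameters per expert
TOP_K = 2                # experts activated per token (assumed top-2 routing)

# Total parameters: dense backbone plus every expert.
total = DENSE_PARAMS + NUM_EXPERTS * EXPERT_PARAMS

# Active parameters per token: dense backbone plus only the routed experts.
active = DENSE_PARAMS + TOP_K * EXPERT_PARAMS

print(f"total  ~ {total / 1e9:.0f}B parameters")   # ~478B, reported as 480B
print(f"active ~ {active / 1e9:.0f}B parameters")  # ~17B
```

This is the core economy of an MoE design: total capacity grows with the number of experts, while per-token compute scales only with the handful of experts actually activated.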

Snowflake is releasing Arctic's weights under an Apache 2.0 license, allowing for open and commercial use. The company has also shared details of the research leading to Arctic's development, setting a new standard for transparency and collaboration in enterprise AI technology.

"Snowflake has open-sourced the Arctic model, its data recipes, model implementations, and the research insights that went into its development, democratizing the skills required to build and optimize large-scale MoE models," the company stated.

Why this matters: The launch of Snowflake Arctic represents a significant advancement in open, cost-effective, and capable language models tailored for enterprise use cases. It empowers enterprises to leverage cutting-edge AI technology while maintaining transparency and collaboration.

Arctic is optimized for complex enterprise workloads, outperforming leading open models in various benchmarks such as SQL code generation and instruction following. The model will be available for serverless inference in Snowflake Cortex, Snowflake's fully managed machine learning and AI service, as well as on Amazon Web Services and other platforms.

Snowflake's AI research team built Arctic in under three months, at a fraction of the training cost of comparable models. "Snowflake's AI research team was able to train Arctic in less than three months and at roughly one-eighth of the cost of similar models, setting a new baseline for how fast state-of-the-art open enterprise-grade models can be trained," Snowflake reported.

Snowflake continues to invest in AI innovation, providing enterprises with the data foundation and advanced AI building blocks to create powerful AI and machine learning applications. The release of Snowflake Arctic demonstrates the company's commitment to openness and collaboration, aiming to further the frontiers of what open-source AI can achieve.

Key Takeaways

  • Snowflake released Snowflake Arctic, a state-of-the-art open-source LLM for enterprise use.
  • Arctic uses a unique Mixture-of-Experts (MoE) architecture, delivering high performance at low cost.
  • Snowflake open-sourced Arctic's weights, data recipes, and research insights under Apache 2.0 license.
  • Arctic outperforms leading open models in benchmarks like SQL code generation and instruction following.
  • Snowflake trained Arctic in under 3 months at 1/8th the cost of similar models, setting a new baseline.