
Groq and HUMAIN Launch OpenAI's Open Models on Day Zero

Two AI leaders partner to accelerate access to OpenAI's newest open models.


Groq, a leading inference platform, has announced the immediate availability of OpenAI's gpt-oss-120B and gpt-oss-20B models on its cloud service, GroqCloud. This collaboration provides global access to these cutting-edge models with local support in Saudi Arabia, thanks to Groq's partnership with HUMAIN.

Pricing on GroqCloud is competitive: the gpt-oss-120B model costs $0.15 per million input tokens and $0.75 per million output tokens, while the smaller gpt-oss-20B model is priced at $0.10 per million input tokens and $0.50 per million output tokens.
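The per-million-token rates above translate directly into a per-request cost. The sketch below shows the arithmetic; the model keys and example token counts are illustrative, not part of any Groq API.

```python
# Per-million-token rates quoted for GroqCloud (USD).
PRICING = {
    "gpt-oss-120b": {"input": 0.15, "output": 0.75},
    "gpt-oss-20b": {"input": 0.10, "output": 0.50},
}

def request_cost(model: str, input_tokens: int, output_tokens: int) -> float:
    """Dollar cost of a single request at the listed per-million-token rates."""
    rates = PRICING[model]
    return (input_tokens * rates["input"] + output_tokens * rates["output"]) / 1_000_000

# Example: a 10,000-token prompt with a 2,000-token response on gpt-oss-120B
# costs (10,000 * 0.15 + 2,000 * 0.75) / 1,000,000 = $0.003.
cost = request_cost("gpt-oss-120b", 10_000, 2_000)
```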

Both models run at impressive speeds on GroqCloud: gpt-oss-120B delivers inference at 500+ tokens per second, while the smaller gpt-oss-20B reaches 1,000+ tokens per second.
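Those throughput figures give a rough feel for response latency. A minimal sketch, treating the quoted speeds as a steady decode rate (real streaming adds network and time-to-first-token overhead):

```python
def generation_time_seconds(output_tokens: int, tokens_per_second: float) -> float:
    """Approximate wall-clock time to generate a response at a steady decode rate."""
    return output_tokens / tokens_per_second

# A 1,500-token answer at the quoted floor speeds:
t_120b = generation_time_seconds(1500, 500)    # ~3.0 s on gpt-oss-120B
t_20b = generation_time_seconds(1500, 1000)    # ~1.5 s on gpt-oss-20B
```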

What sets Groq apart is its ability to support a large context length of 128K tokens for these models. This extended context capacity enables long-context reasoning and real-time inference with integrated tools such as code execution and web search, all baked in from launch.
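GroqCloud exposes an OpenAI-compatible chat-completions endpoint, so long-context requests look like standard chat calls. The sketch below builds (without sending) such a request; the model identifier `openai/gpt-oss-120b` and the placeholder API key are assumptions for illustration.

```python
import json
import urllib.request

# GroqCloud's OpenAI-compatible chat-completions endpoint.
GROQ_API_URL = "https://api.groq.com/openai/v1/chat/completions"

def build_request(api_key: str, prompt: str,
                  model: str = "openai/gpt-oss-120b") -> urllib.request.Request:
    """Build (but do not send) a chat-completion request.

    With a 128K-token context window, `prompt` can carry very large
    documents for long-context reasoning.
    """
    payload = {
        "model": model,  # assumed model identifier
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        GROQ_API_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_request("YOUR_API_KEY", "Summarize the attached report: ...")
# Sending it with urllib.request.urlopen(req) would return the JSON completion.
```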

GroqCloud's infrastructure is designed for low latency and global availability. With data centers across North America, Europe, and the Middle East, including enterprise-grade local support in Saudi Arabia via HUMAIN, developers can expect reliable, high-performance AI inference wherever they operate.

Jonathan Ross, CEO of Groq, stated that their platform was built to run models like these quickly and affordably. For a limited time, tool calls used with OpenAI's open models on GroqCloud will not be charged, making the platform an even more attractive option for developers worldwide.

In summary, running OpenAI's open models on GroqCloud offers high-speed inference at competitive pricing, combined with extensive context capacity and integrated tool support. This collaboration between Groq and HUMAIN marks a significant step forward in bringing cutting-edge AI to the Kingdom.

[1] Press Release: Groq and HUMAIN Announce the Immediate Availability of OpenAI’s gpt-oss-120B and gpt-oss-20B Models on GroqCloud. Retrieved from https://www.groq.co/news/groq-and-humain-announce-the-immediate-availability-of-openais-gpt-oss-120b-and-gpt-oss-20b-models-on-groqcloud
[2] Groq Cloud Pricing: OpenAI Models. Retrieved from https://www.groq.co/pricing/openai-models
[3] Groq Cloud Performance: OpenAI Models. Retrieved from https://www.groq.co/performance/openai-models
[4] Groq Cloud Features: OpenAI Models. Retrieved from https://www.groq.co/features/openai-models
[5] HUMAIN and Groq Partner to Bring OpenAI's Models to the Kingdom. Retrieved from https://www.humaingroup.com/news/humaingroup-and-groq-partner-to-bring-openais-models-to-the-kingdom

Developers can now access OpenAI's gpt-oss-120B and gpt-oss-20B models on GroqCloud, thanks to Groq's partnership with HUMAIN. The launch combines high-speed inference, extended context capacity, and integrated tools in a single offering. [1]

The combination of OpenAI's open models and Groq's cloud infrastructure, designed for low latency, competitive pricing, and local support, marks a significant milestone for AI in the region. [5]
