Groq
Groq delivers high-speed processing and low-latency performance.
https://gateway.ai.cloudflare.com/v1/{account_id}/{gateway_id}/groq

When making requests to Groq, replace https://api.groq.com/openai/v1 in the URL you're currently using with https://gateway.ai.cloudflare.com/v1/{account_id}/{gateway_id}/groq.
When making requests to Groq, ensure you have the following:
- Your AI Gateway Account ID.
 - Your AI Gateway gateway name.
 - An active Groq API token.
 - The name of the Groq model you want to use.
 
```sh
curl https://gateway.ai.cloudflare.com/v1/{account_id}/{gateway_id}/groq/chat/completions \
  --header 'Authorization: Bearer {groq_api_key}' \
  --header 'Content-Type: application/json' \
  --data '{
    "messages": [
      {
        "role": "user",
        "content": "What is Cloudflare?"
      }
    ],
    "model": "llama3-8b-8192"
  }'
```

If using the groq-sdk, set your endpoint like this:
```js
import Groq from "groq-sdk";

const apiKey = env.GROQ_API_KEY;
const accountId = "{account_id}";
const gatewayId = "{gateway_id}";
const baseURL = `https://gateway.ai.cloudflare.com/v1/${accountId}/${gatewayId}/groq`;

const groq = new Groq({
  apiKey,
  baseURL,
});

const messages = [{ role: "user", content: "What is Cloudflare?" }];
const model = "llama3-8b-8192";

const chatCompletion = await groq.chat.completions.create({
  messages,
  model,
});
```

You can also use the OpenAI-compatible endpoint (/ai-gateway/usage/chat-completion/) to access Groq models using the OpenAI API schema. To do so, send your requests to:
```
https://gateway.ai.cloudflare.com/v1/{account_id}/{gateway_id}/compat/chat/completions
```

Specify:

```json
{
  "model": "groq/{model}"
}
```