Overview
GroqLLMService provides access to Groq's language models through an OpenAI-compatible interface. It inherits from OpenAILLMService and supports streaming responses, function calling, and context management with ultra-fast inference speeds.
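Because the service is OpenAI-compatible, it is configured like any other Pipecat LLM service: an API key, a model name, and an OpenAI-style message context. The sketch below illustrates this; the import path and the model name are assumptions and should be checked against your installed Pipecat version and the models available in the Groq Console.

```python
import os

from pipecat.services.groq.llm import GroqLLMService  # assumed module path
from pipecat.processors.aggregators.openai_llm_context import OpenAILLMContext

# GroqLLMService inherits from OpenAILLMService, so setup mirrors the OpenAI service:
# pass the Groq API key and a model name, then build a context aggregator for the pipeline.
llm = GroqLLMService(
    api_key=os.getenv("GROQ_API_KEY"),
    model="llama-3.3-70b-versatile",  # example model; pick one from the Groq Console
)

context = OpenAILLMContext(
    messages=[{"role": "system", "content": "You are a helpful assistant."}]
)
context_aggregator = llm.create_context_aggregator(context)

# llm, context_aggregator.user(), and context_aggregator.assistant() are then placed
# into a Pipeline alongside transport, STT, and TTS processors.
```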
- Groq LLM API Reference: Pipecat's API methods for Groq integration
- Example Implementation: Complete example with function calling
- Groq Documentation: Official Groq API documentation and features
- Groq Console: Access models and manage API keys
Installation
To use Groq services, install the required dependency:
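In most Pipecat setups this is the `pipecat-ai` package with a Groq extra; the exact extra name below follows Pipecat's convention for optional service dependencies and should be verified against the current docs:

```bash
pip install "pipecat-ai[groq]"
```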
Prerequisites
Groq Account Setup
Before using Groq LLM services, you need:
- Groq Account: Sign up at Groq Console
- API Key: Generate an API key from your console dashboard
- Model Selection: Choose from available models with ultra-fast inference
Required Environment Variables
GROQ_API_KEY: Your Groq API key for authentication
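A common pattern is to keep the key in a `.env` file and load it at startup; the use of python-dotenv here is an assumption, not a Pipecat requirement:

```python
import os

from dotenv import load_dotenv

# Load GROQ_API_KEY from a local .env file into the process environment.
load_dotenv()

api_key = os.getenv("GROQ_API_KEY")
if not api_key:
    raise RuntimeError("Set GROQ_API_KEY before starting the bot.")
```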