Overview

GroqLLMService provides access to Groq’s language models through an OpenAI-compatible interface. It inherits from OpenAILLMService and supports streaming responses, function calling, and context management, all backed by Groq’s ultra-fast inference.
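
The snippet below is a minimal sketch of constructing the service. The import path and the model id are assumptions that may vary between pipecat-ai releases, so check the module layout and model list for your installed version.

```python
import os

# Import path assumed from recent pipecat-ai releases; older versions may
# expose the class as `from pipecat.services.groq import GroqLLMService`.
from pipecat.services.groq.llm import GroqLLMService

# Construct the service with an API key and a model id (the id below is only
# an example; pick any model listed in the Groq Console).
llm = GroqLLMService(
    api_key=os.getenv("GROQ_API_KEY"),
    model="llama-3.3-70b-versatile",
)
```

Because the class inherits from OpenAILLMService, it can be dropped into a pipeline anywhere an OpenAI-compatible LLM service is expected.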

Installation

To use Groq services, install the required dependency:
pip install "pipecat-ai[groq]"

Prerequisites

Groq Account Setup

Before using Groq LLM services, you need:
  1. Groq Account: Sign up at Groq Console
  2. API Key: Generate an API key from your console dashboard
  3. Model Selection: Choose a model from those listed in the Groq Console

Required Environment Variables

  • GROQ_API_KEY: Your Groq API key for authentication
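
A minimal sketch of supplying the key at runtime, assuming it is kept in your shell environment or a local .env file (the python-dotenv helper shown here is a common convention, not something the service requires):

```python
import os

from dotenv import load_dotenv  # optional helper for local development

load_dotenv()  # reads GROQ_API_KEY from a .env file if one is present

api_key = os.environ["GROQ_API_KEY"]  # raises KeyError if the variable is unset
```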