Overview

CerebrasLLMService provides access to Cerebras’s language models through an OpenAI-compatible interface. It inherits from OpenAILLMService and supports streaming responses, function calling, and context management, all served with Cerebras’s ultra-fast inference.
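
As a minimal sketch of constructing the service (the import path and the model name "llama-3.3-70b" are assumptions and may differ in your Pipecat version), it is configured like any other OpenAI-compatible LLM service:

import os

from pipecat.services.cerebras.llm import CerebrasLLMService  # import path may vary by Pipecat version

# Because the service inherits from OpenAILLMService, it accepts the same
# kind of configuration (API key, model name) as other OpenAI-compatible services.
llm = CerebrasLLMService(
    api_key=os.getenv("CEREBRAS_API_KEY"),
    model="llama-3.3-70b",  # assumed model name; check the Cerebras model list
)

The resulting service can then be placed in a Pipecat pipeline wherever an LLM service is expected.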

Installation

To use Cerebras services, install the required dependency:
pip install "pipecat-ai[cerebras]"

Prerequisites

Cerebras Account Setup

Before using Cerebras LLM services, you need:
  1. Cerebras Account: Sign up at Cerebras Cloud
  2. API Key: Generate an API key from your account dashboard
  3. Model Selection: Choose from available Cerebras models with ultra-fast inference

Required Environment Variables

  • CEREBRAS_API_KEY: Your Cerebras API key for authentication
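
As a sketch of how the key is typically picked up at startup (this assumes python-dotenv is installed and a local .env file holds the key; neither is required by the service itself):

import os

from dotenv import load_dotenv  # assumes python-dotenv is installed

load_dotenv()  # read CEREBRAS_API_KEY from a local .env file, if present

api_key = os.getenv("CEREBRAS_API_KEY")
if not api_key:
    raise RuntimeError("CEREBRAS_API_KEY is not set")

The key retrieved this way is what gets passed as the api_key argument when constructing CerebrasLLMService.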