Overview
FireworksLLMService provides access to Fireworks AI's language models through an OpenAI-compatible interface. It inherits from OpenAILLMService and supports streaming responses, function calling, and context management on Fireworks AI's optimized inference infrastructure.
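As a rough usage sketch, the service is typically constructed with an API key and a Fireworks model name. The import path and model shown below are assumptions and may differ between Pipecat versions:

```python
import os

# Import path is an assumption; some Pipecat releases expose the service
# directly from pipecat.services.fireworks instead.
from pipecat.services.fireworks.llm import FireworksLLMService

# Construct the service with your Fireworks credentials and a model name.
# The model identifier here is illustrative; use any model your account can access.
llm = FireworksLLMService(
    api_key=os.getenv("FIREWORKS_API_KEY"),
    model="accounts/fireworks/models/firefunction-v2",
)
```

Because it inherits from OpenAILLMService, the service can then be placed in a Pipecat pipeline like any other OpenAI-compatible LLM service.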
Fireworks LLM API Reference
Pipecat’s API methods for Fireworks AI integration
Example Implementation
Complete example with function calling
Fireworks Documentation
Official Fireworks AI API documentation and features
Fireworks Platform
Access models and manage API keys
Installation
To use Fireworks AI services, install the required dependency:
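The command below assumes the standard pipecat-ai extras naming for Fireworks support; check the Pipecat package documentation if the extra name differs in your version.

```bash
pip install "pipecat-ai[fireworks]"
```

Prerequisites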
Fireworks AI Account Setup
Before using Fireworks AI LLM services, you need:
- Fireworks Account: Sign up at Fireworks AI
- API Key: Generate an API key from your account dashboard
- Model Selection: Choose from available open-source and proprietary models
Required Environment Variables
FIREWORKS_API_KEY: Your Fireworks AI API key for authentication
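For example, in a typical shell environment the key can be set before running your app (the placeholder value below is illustrative):

```bash
export FIREWORKS_API_KEY="your-fireworks-api-key"
```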