Provider Implementation
Checklist
- Implement the full interface (ChatModel, Embedder, VectorStore, STT, TTS, etc.).
- Register via init() with the parent package's Register().
- Map provider errors to core.Error with the correct ErrorCode.
- Support context cancellation.
- Include token/usage metrics where applicable.
- Compile-time check: var _ Interface = (*Impl)(nil).
- Unit tests with mocked HTTP responses (httptest).
File Structure
```
llm/providers/openai/
├── openai.go       # Implementation + New() + init()
├── stream.go       # Streaming
├── errors.go       # Error mapping
├── openai_test.go  # Tests
└── testdata/       # Recorded HTTP responses
```
Template
```go
var _ llm.ChatModel = (*Model)(nil)

func init() {
	llm.Register("openai", func(cfg llm.ProviderConfig) (llm.ChatModel, error) {
		return New(cfg)
	})
}

func New(cfg llm.ProviderConfig) (*Model, error) {
	if cfg.APIKey == "" {
		return nil, &core.Error{Op: "openai.new", Code: core.ErrAuth, Message: "API key required"}
	}
	return &Model{client: newClient(cfg.APIKey, cfg.BaseURL), model: cfg.Model}, nil
}

func (m *Model) Stream(ctx context.Context, msgs []schema.Message, opts ...llm.GenerateOption) iter.Seq2[schema.StreamChunk, error] {
	return func(yield func(schema.StreamChunk, error) bool) {
		/* stream implementation */
	}
}
```
Error Mapping
```go
switch apiErr.StatusCode {
case 401:
	code = core.ErrAuth
case 429:
	code = core.ErrRateLimit
case 408, 504:
	code = core.ErrTimeout
case 400:
	code = core.ErrInvalidInput
}
```
See docs/providers.md for provider categories and priorities.