ExLLM

A unified Elixir client for Large Language Models with intelligent test caching, comprehensive provider support, and advanced developer tooling.

⚠️ Alpha Quality Software: This library is in early development. APIs may change without notice until version 1.0.0 is released. Use in production at your own risk.

What's New Since v0.7.0

v0.8.1 - Documentation & Code Quality

v0.8.0 - Advanced Streaming & Telemetry

v0.7.1 - Documentation System

Features

🔗 Core API

📊 Developer Experience

🎯 Advanced Features

Supported Providers

ExLLM supports 14 providers with access to 300+ models.

Installation

Add ex_llm to your list of dependencies in mix.exs:

def deps do
  [
    {:ex_llm, "~> 0.8.1"},
    
    # Optional hardware acceleration backends (choose one):
    {:exla, "~> 0.7", optional: true},
    
    # Optional: For Apple Silicon Metal acceleration
    {:emlx, github: "elixir-nx/emlx", branch: "main", optional: true}
  ]
end
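If you pull in the optional EXLA dependency for CPU/GPU acceleration, the usual way to activate it is to make it the default Nx backend in your application config. This is standard Nx configuration rather than anything ExLLM-specific:

```elixir
# config/config.exs
import Config

# Route Nx tensor operations through EXLA (only relevant if you
# added the optional :exla dependency above).
config :nx, default_backend: EXLA.Backend
```

For the EMLX/Metal option, consult that project's README for the equivalent backend setting.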

Quick Start

1. Configuration

Set your API keys as environment variables:

export ANTHROPIC_API_KEY="your-anthropic-key"
export OPENAI_API_KEY="your-openai-key"
export GROQ_API_KEY="your-groq-key"
# ... other provider keys as needed
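Since ExLLM reads these keys from the environment, a quick sanity check from `iex -S mix` confirms the BEAM actually inherited them. This is plain standard-library Elixir, nothing ExLLM-specific:

```elixir
# Report which provider keys are visible to the running VM.
for var <- ["ANTHROPIC_API_KEY", "OPENAI_API_KEY", "GROQ_API_KEY"] do
  case System.get_env(var) do
    nil -> IO.puts("#{var} is NOT set")
    _value -> IO.puts("#{var} is set")
  end
end
```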

2. Basic Usage

# Single completion
{:ok, response} = ExLLM.chat(:anthropic, [
  %{role: "user", content: "Explain quantum computing in simple terms"}
])

IO.puts(response.content)
# Cost automatically tracked: response.cost

# Streaming response
ExLLM.chat_stream(:openai, [
  %{role: "user", content: "Write a short story"}
], fn chunk ->
  IO.write(chunk.delta)
end)

# With session management
{:ok, session} = ExLLM.Session.new(:groq)
{:ok, session, response} = ExLLM.Session.chat(session, "Hello!")
{:ok, session, response} = ExLLM.Session.chat(session, "How are you?")

# Multimodal with vision
{:ok, response} = ExLLM.chat(:gemini, [
  %{role: "user", content: [
    %{type: "text", text: "What's in this image?"},
    %{type: "image", image: %{data: base64_image, media_type: "image/jpeg"}}
  ]}
])
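Because `ExLLM.chat` returns a tagged tuple, production code typically pattern-matches both branches rather than asserting `{:ok, ...}` as the examples above do. A minimal sketch; the shape of the error term is an assumption here, not a documented contract:

```elixir
case ExLLM.chat(:anthropic, [%{role: "user", content: "Hello"}]) do
  {:ok, response} ->
    IO.puts(response.content)

  {:error, reason} ->
    # Error terms vary by provider; log and decide whether to retry.
    IO.puts("LLM call failed: #{inspect(reason)}")
end
```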

3. Testing with Caching

# Run integration tests with automatic caching
mix test.anthropic --include live_api

# Manage test cache
mix ex_llm.cache stats
mix ex_llm.cache clean --older-than 7d
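The cached integration suite is driven by test tags such as `live_api` (inferred from the `--include live_api` flag above), so a test that should hit a real provider is tagged accordingly and excluded from ordinary runs. A minimal ExUnit sketch, assuming only the tag name:

```elixir
defmodule MyApp.AnthropicIntegrationTest do
  use ExUnit.Case, async: false

  # Excluded by default; included via `--include live_api`, at which
  # point ExLLM's test cache records or replays the provider response.
  @tag :live_api
  test "basic chat round-trip" do
    assert {:ok, response} =
             ExLLM.chat(:anthropic, [%{role: "user", content: "ping"}])

    assert is_binary(response.content)
  end
end
```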

Documentation

📚 Quick Start Guide - Get up and running in 5 minutes
📖 User Guide - Comprehensive documentation of all features
🏗️ Architecture Guide - Clean layered architecture and namespace organization
🔧 Logger Guide - Debug logging and troubleshooting
⚡ Provider Capabilities - Feature comparison across providers
🧪 Testing Guide - Comprehensive testing system with semantic tagging and caching

Key Topics Covered in the User Guide

License

This project is licensed under the MIT License - see the LICENSE file for details.

Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

Support