Elixir


Universal LLM API client for Elixir. Access 143+ LLM providers through a single interface with native BEAM concurrency, OTP integration, and an idiomatic Elixir API.

Installation

Package Installation

Add to your mix.exs dependencies:

def deps do
  [
    {:liter_llm, "~> 1.2.0"}
  ]
end

Then run:

mix deps.get

System Requirements

Quick Start

Basic Chat

Send a message to any provider using the provider/model prefix:

{:ok, response} =
  LiterLlm.chat(
    %{
      model: "openai/gpt-4o",
      messages: [%{role: "user", content: "Hello!"}]
    },
    api_key: System.fetch_env!("OPENAI_API_KEY")
  )

IO.puts(hd(response["choices"])["message"]["content"])
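The response map follows the familiar chat-completions shape, so pattern matching is an idiomatic alternative to `hd/1` plus Access for pulling out the reply. A minimal sketch using an illustrative response map (in practice the map comes back from the chat call above):

```elixir
# Illustrative response in the chat-completions shape; in practice this
# map is returned by the chat call shown above.
response = %{
  "choices" => [
    %{"message" => %{"role" => "assistant", "content" => "Hello! How can I help?"}}
  ]
}

# Pattern matching destructures the first choice's content directly,
# and crashes loudly if the shape is unexpected.
%{"choices" => [%{"message" => %{"content" => content}} | _]} = response
IO.puts(content)
```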

Common Use Cases

Streaming Responses

Stream tokens in real time:

# `client` is a previously configured LiterLlm.Client (see the client docs)
{:ok, chunks} =
  LiterLlm.Client.chat_stream(client, %{
    model: "openai/gpt-4o-mini",
    messages: [%{role: "user", content: "Hello"}]
  })

for chunk <- chunks, do: IO.inspect(chunk)
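To render tokens as they arrive rather than inspecting raw chunks, write each one without a newline and keep a running transcript. A sketch using an illustrative list of text deltas — the actual chunk shape is defined by liter_llm, so adapt the extraction step accordingly:

```elixir
# Illustrative text deltas; real chunks come from chat_stream and may be
# richer structures than plain strings.
chunks = ["Hel", "lo", " from", " the", " stream!"]

# Print each delta as it arrives and accumulate the full reply.
full =
  Enum.reduce(chunks, "", fn delta, acc ->
    IO.write(delta)
    acc <> delta
  end)

IO.puts("")
IO.puts("full reply: " <> full)
```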

Next Steps

Features

Supported Providers (143+)

Route to any provider using the provider/model prefix convention:

| Provider | Example Model |
| --- | --- |
| OpenAI | openai/gpt-4o, openai/gpt-4o-mini |
| Anthropic | anthropic/claude-3-5-sonnet-20241022 |
| Groq | groq/llama-3.1-70b-versatile |
| Mistral | mistral/mistral-large-latest |
| Cohere | cohere/command-r-plus |
| Together AI | together/meta-llama/Meta-Llama-3.1-70B-Instruct-Turbo |
| Fireworks | fireworks/accounts/fireworks/models/llama-v3p1-70b-instruct |
| Google Vertex | vertexai/gemini-1.5-pro |
| Amazon Bedrock | bedrock/anthropic.claude-3-5-sonnet-20241022-v2:0 |
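The prefix convention is mechanical: everything before the first `/` names the provider, and the remainder — which may itself contain slashes, as in the Together AI and Fireworks rows — is the provider's model identifier. A hypothetical helper, just to illustrate the decomposition (liter_llm performs this routing internally):

```elixir
defmodule PrefixDemo do
  # Hypothetical helper: splits "provider/model" at the FIRST slash only,
  # since model identifiers themselves may contain slashes.
  def split(spec) do
    [provider, model] = String.split(spec, "/", parts: 2)
    {provider, model}
  end
end

IO.inspect(PrefixDemo.split("openai/gpt-4o"))
IO.inspect(PrefixDemo.split("together/meta-llama/Meta-Llama-3.1-70B-Instruct-Turbo"))
```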

Complete Provider List

Key Capabilities

Performance

Built on a compiled Rust core for speed and safety.

Provider Routing

Route to 143+ providers using the provider/model prefix convention:

openai/gpt-4o
anthropic/claude-3-5-sonnet-20241022
groq/llama-3.1-70b-versatile
mistral/mistral-large-latest

See the provider registry for the full list.

Proxy Server

liter-llm also ships as an OpenAI-compatible proxy server with Docker support:

docker run -p 4000:4000 -e LITER_LLM_MASTER_KEY=sk-your-key ghcr.io/kreuzberg-dev/liter-llm
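Since the proxy speaks the OpenAI chat-completions protocol, any OpenAI-compatible client can talk to it. A sketch with curl, assuming the standard /v1/chat/completions path and the master key from the command above:

```shell
# Assumes the proxy started by the docker command above is listening on :4000.
curl http://localhost:4000/v1/chat/completions \
  -H "Authorization: Bearer sk-your-key" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "openai/gpt-4o-mini",
    "messages": [{"role": "user", "content": "Hello"}]
  }'
```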

See the proxy server documentation for configuration, CLI usage, and MCP integration.

Documentation

Part of kreuzberg.dev.

Contributing

Contributions are welcome! See CONTRIBUTING.md for guidelines.

Join our Discord community for questions and discussion.

License

MIT -- see LICENSE for details.