
Spark


Build powerful, extensible DSLs with exceptional developer experience

Spark is a framework for creating declarative domain-specific languages in Elixir. It transforms simple struct definitions into rich, extensible DSLs that come with autocomplete, documentation generation, and sophisticated tooling built right in.

Quick Example

Here's what a data validator DSL built with Spark looks like in use:

defmodule MyApp.PersonValidator do
  use MyLibrary.Validator

  fields do
    required [:name]
    field :name, :string

    field :email, :string do
      check &String.contains?(&1, "@")
      transform &String.trim/1
    end
  end
end

MyApp.PersonValidator.validate(%{name: "Zach", email: " foo@example.com "})
# => {:ok, %{name: "Zach", email: "foo@example.com"}}

The DSL definition itself is clean and declarative:

@field %Spark.Dsl.Entity{
  name: :field,
  args: [:name, :type],
  target: Field,
  describe: "A field that is accepted by the validator",
  schema: [
    name: [type: :atom, required: true, doc: "The name of the field"],
    type: [type: {:one_of, [:integer, :string]}, required: true, doc: "The type of the field"],
    check: [type: {:fun, 1}, doc: "A function to validate the value"],
    transform: [type: {:fun, 1}, doc: "A function to transform the value"]
  ]
}

@fields %Spark.Dsl.Section{
  name: :fields,
  entities: [@field],
  describe: "Configure the fields that are supported and required"
}

use Spark.Dsl.Extension, sections: [@fields]
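The `target: Field` option above points the entity at a struct module that holds each parsed `field`. A minimal sketch of such a module (the module name and location are assumptions; its keys mirror the entity schema above):

```elixir
defmodule MyLibrary.Validator.Field do
  @moduledoc "Target struct for the `field` entity; Spark builds one per `field` declaration."
  # :name and :type are required by the schema;
  # :check and :transform are optional one-arity functions.
  defstruct [:name, :type, :check, :transform]
end
```

Spark instantiates this struct for every `field` in the DSL and stores it in the compiled DSL state, where transformers, verifiers, and info functions can read it.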

What You Get Out of the Box

  - Autocomplete and inline documentation for your DSL
  - Generated reference documentation for your DSL's sections and entities
  - Transformers and verifiers for compile-time extension and validation
  - Generated introspection functions via Spark.InfoGenerator

Installation

Add spark to your list of dependencies in mix.exs:

def deps do
  [
    {:spark, "~> 2.3"}
  ]
end

Getting Started

The best way to get started is with our comprehensive tutorial that walks you through building a complete DSL from scratch:

📖 Get Started with Spark - Build a data validator DSL step by step

Quick Start Checklist

  1. Define your DSL structure using Spark.Dsl.Section and Spark.Dsl.Entity
  2. Create your extension with use Spark.Dsl.Extension
  3. Build your DSL module that users will import
  4. Add transformers and verifiers for advanced behavior
  5. Generate helper functions with Spark.InfoGenerator

Each step is covered in detail in the tutorial above.
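Steps 4 and 5 of the checklist can be sketched as follows. `Spark.Dsl.Verifier` and `Spark.InfoGenerator` are Spark's own modules; the module names below (and the `MyLibrary.Validator.Dsl` extension they reference) are illustrative assumptions, not code from the tutorial:

```elixir
# Step 4: a verifier runs at compile time and can reject invalid DSL state.
defmodule MyLibrary.Validator.Verifiers.VerifyRequired do
  use Spark.Dsl.Verifier

  @impl true
  def verify(_dsl_state) do
    # Inspect the compiled DSL state here and return :ok, or
    # {:error, %Spark.Error.DslError{}} to fail compilation with a clear message.
    :ok
  end
end

# Step 5: an Info module generates introspection functions
# (e.g. fields/1) from the extension's sections.
defmodule MyLibrary.Validator.Info do
  use Spark.InfoGenerator,
    extension: MyLibrary.Validator.Dsl,
    sections: [:fields]
end
```

Verifiers are listed in the extension's `verifiers` option, and the generated Info functions accept either a module that uses the DSL or a DSL state.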

Documentation

📚 Guides & Tutorials

🔧 API Reference

Production Ready

Spark is battle-tested and powers all DSLs in the Ash Framework, handling complex real-world applications with thousands of DSL definitions. Whether you're building configuration DSLs, workflow orchestrators, or domain-specific languages for your business logic, Spark provides the foundation for production-grade solutions.

Contributing

We welcome contributions! Please see our contributing guidelines and feel free to open issues or submit pull requests.

License

MIT - see LICENSES/MIT.txt for details.