<!--
SPDX-FileCopyrightText: 2022 spark contributors <https://github.com/ash-project/spark>

SPDX-License-Identifier: MIT
-->

<p align="center">
  <img src="logos/logo.svg" alt="Spark Logo" width="150" height="200" />
</p>

<!-- ex_doc_ignore_start -->
# Spark
<!-- ex_doc_ignore_end -->

[![Spark CI](https://github.com/ash-project/spark/actions/workflows/elixir.yml/badge.svg)](https://github.com/ash-project/spark/actions/workflows/elixir.yml)
[![License: MIT](https://img.shields.io/badge/License-MIT-yellow.svg)](https://opensource.org/licenses/MIT)
[![Hex version badge](https://img.shields.io/hexpm/v/spark.svg)](https://hex.pm/packages/spark)
[![Hexdocs badge](https://img.shields.io/badge/docs-hexdocs-purple)](https://hexdocs.pm/spark)
[![REUSE status](https://api.reuse.software/badge/github.com/ash-project/spark)](https://api.reuse.software/info/github.com/ash-project/spark)

**Build powerful, extensible DSLs with exceptional developer experience**

Spark is a framework for creating declarative domain-specific languages (DSLs) in
Elixir. It transforms simple struct definitions into rich, extensible DSLs that
come with autocomplete, documentation generation, and sophisticated tooling
built right in.

## Quick Example

Here's what a data validator DSL built with Spark looks like in use:

```elixir
defmodule MyApp.PersonValidator do
  use MyLibrary.Validator

  fields do
    required [:name]
    field :name, :string

    field :email, :string do
      check &String.contains?(&1, "@")
      transform &String.trim/1
    end
  end
end

MyApp.PersonValidator.validate(%{name: "Zach", email: "zach@example.com"})
# => {:ok, %{name: "Zach", email: "zach@example.com"}}
```

The DSL definition itself is clean and declarative:

```elixir
@field %Spark.Dsl.Entity{
  name: :field,
  args: [:name, :type],
  target: Field,
  describe: "A field that is accepted by the validator",
  schema: [
    name: [type: :atom, required: true, doc: "The name of the field"],
    type: [type: {:one_of, [:integer, :string]}, required: true, doc: "The type of the field"],
    check: [type: {:fun, 1}, doc: "A function to validate the value"],
    transform: [type: {:fun, 1}, doc: "A function to transform the value"]
  ]
}

@fields %Spark.Dsl.Section{
  name: :fields,
  entities: [@field],
  describe: "Configure the fields that are supported and required"
}

use Spark.Dsl.Extension, sections: [@fields]
```
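
To connect the two snippets, the `MyLibrary.Validator` module that callers `use` in the
first example just pulls in this extension. A minimal sketch, assuming the extension
above is defined in a module named `MyLibrary.Validator.Dsl`:

```elixir
defmodule MyLibrary.Validator do
  # Make the `fields` section available by default to any module
  # that calls `use MyLibrary.Validator`.
  use Spark.Dsl, default_extensions: [extensions: [MyLibrary.Validator.Dsl]]
end
```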

## What You Get Out of the Box

* 🔧 **Extensible Architecture** - Anyone can write extensions for your DSL,
  making it infinitely customizable
* 🧠 **Smart Autocomplete** - Built-in ElixirSense integration provides
  intelligent code completion and inline documentation in your editor
* 📚 **Auto Documentation** - Generate comprehensive documentation for your DSL
  automatically, including all options and usage examples
* ⚡ **Developer Tools** - Mix tasks for formatting, code generation, and
  maintaining `locals_without_parens` automatically
* 🔄 **Compile-time Processing** - Use transformers to modify DSL structure
  during compilation and verifiers to validate correctness
* 🎯 **Type Safety** - Rich schema validation ensures DSL usage is correct at
  compile time with helpful error messages
* 🔍 **Introspection** - Built-in tools to inspect and query DSL definitions
  programmatically at runtime (see the sketch after this list)

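For example, once `MyApp.PersonValidator` from the quick example compiles, its DSL can
be queried at runtime. A minimal sketch using the low-level introspection helpers (an
`Info` module generated with `Spark.InfoGenerator` would usually wrap this):

```elixir
# Fetch every entity declared in the `fields` section of the DSL module.
fields = Spark.Dsl.Extension.get_entities(MyApp.PersonValidator, [:fields])

Enum.map(fields, & &1.name)
# => [:name, :email]
```
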
## Installation

Add `spark` to your list of dependencies in `mix.exs`:

```elixir
def deps do
  [
    {:spark, "~> 2.3"}
  ]
end
```

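Then run `mix deps.get` to fetch the dependency.
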
## Getting Started

The best way to get started is with our comprehensive tutorial that walks you
through building a complete DSL from scratch:

📖 **[Get Started with Spark](documentation/tutorials/get-started-with-spark.md)** -
Build a data validator DSL step by step

### Quick Start Checklist

1. **Define your DSL structure** using `Spark.Dsl.Section` and `Spark.Dsl.Entity`
2. **Create your extension** with `use Spark.Dsl.Extension`
3. **Build your DSL module** that users will import
4. **Add transformers and verifiers** for advanced behavior
5. **Generate helper functions** with `Spark.InfoGenerator`

Each step is covered in detail in the tutorial above; steps 4 and 5 are sketched briefly below.
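
As a taste of steps 4 and 5, here is a minimal, hedged sketch of a verifier and an info
module for the validator DSL above. The module names (`MyLibrary.Validator.Verifiers.VerifyRequired`,
`MyLibrary.Validator.Info`) are illustrative, the logic is simplified from what the
tutorial builds, and it assumes the `fields` section also declares the `required` option
used in the quick example:

```elixir
defmodule MyLibrary.Validator.Verifiers.VerifyRequired do
  # A verifier runs after the DSL is compiled; it can inspect the final
  # DSL state and return errors, but it cannot modify the state.
  use Spark.Dsl.Verifier

  @impl true
  def verify(dsl_state) do
    field_names =
      dsl_state
      |> Spark.Dsl.Verifier.get_entities([:fields])
      |> Enum.map(& &1.name)

    required = Spark.Dsl.Verifier.get_option(dsl_state, [:fields], :required) || []

    case Enum.reject(required, &(&1 in field_names)) do
      [] ->
        :ok

      missing ->
        {:error,
         Spark.Error.DslError.exception(
           module: Spark.Dsl.Verifier.get_persisted(dsl_state, :module),
           path: [:fields, :required],
           message: "these required fields are not defined: #{inspect(missing)}"
         )}
    end
  end
end

defmodule MyLibrary.Validator.Info do
  # Generates introspection helpers (for example `fields/1` and
  # `fields_required/1`) from the extension's sections.
  use Spark.InfoGenerator, extension: MyLibrary.Validator.Dsl, sections: [:fields]
end
```

Transformers follow the same pattern with `use Spark.Dsl.Transformer` and a `transform/1`
callback, except they run at compile time and may return a modified DSL state.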

## Documentation

### 📚 Guides & Tutorials

- **[Get Started with Spark](documentation/tutorials/get-started-with-spark.md)** -
  Complete tutorial building a validator DSL
- **[Writing Extensions](documentation/how_to/writing-extensions.md)** -
  Deep dive into extension development
- **[Setup Autocomplete](documentation/how_to/setup-autocomplete.md)** -
  Configure editor integration
- **[Split Up Large DSLs](documentation/how_to/split-up-large-dsls.md)** -
  Organize complex DSL definitions
- **[Use Source Annotations](documentation/how_to/use-source-annotations.md)** -
  Leverage location tracking for better errors

### 🔧 API Reference

- **[HexDocs](https://hexdocs.pm/spark)** - Complete API documentation
- **Core Modules**: `Spark.Dsl.Extension`, `Spark.Dsl.Entity`, `Spark.Dsl.Section`
- **Advanced Features**: `Spark.Dsl.Transformer`, `Spark.Dsl.Verifier`, `Spark.InfoGenerator`

## Production Ready

Spark is battle-tested and powers all DSLs in the [Ash Framework](https://ash-hq.org),
handling complex real-world applications with thousands of DSL definitions.
Whether you're building configuration DSLs, workflow orchestrators, or
domain-specific languages for your business logic, Spark provides the foundation
for production-grade solutions.

<!-- ex_doc_ignore_start -->
## Contributing

We welcome contributions! Please see our [contributing guidelines](CONTRIBUTING.md)
and feel free to open issues or submit pull requests.
<!-- ex_doc_ignore_end -->

## Links

- **[GitHub](https://github.com/ash-project/spark)** - Source code and issue tracking
- **[Hex.pm](https://hex.pm/packages/spark)** - Package repository
- **[HexDocs](https://hexdocs.pm/spark)** - API documentation
- **[Ash Framework](https://ash-hq.org)** - See Spark in action
- **[Discord](https://discord.gg/HTHRaaVPUc)** - Community chat
- **[Forum](https://elixirforum.com/c/elixir-framework-forums/ash-framework-forum)** - Discussion forum

<!-- ex_doc_ignore_start -->
## License

MIT - see [`LICENSES/MIT.txt`](LICENSES/MIT.txt) for details.
<!-- ex_doc_ignore_end -->