ai is a Python framework for building networks of connected neurons.
Each neuron is a function with typed inputs and a single output.
Connections are allowed only between compatible types, e.g., a neuron with an int output can connect to a neuron with an int input.
The learn function uses an A* search algorithm to reach a target value.
ai runs on a CPU rather than a GPU, needs at most 500 MB of RAM, is deterministic, and is explainable. A model fits in a few MB.
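As a rough, self-contained illustration of these two ideas only (typed neurons and an A*-style best-first search toward a target value), here is a minimal sketch in plain Python; the Neuron class, the learn function, and the example neurons are all hypothetical and do not reflect the framework's actual API:

```python
# Hypothetical sketch, not the ai framework's real code:
# neurons are typed functions; connections are allowed only when types match;
# a best-first search chains neurons until the target value is reached.
import heapq
import itertools

class Neuron:
    """A named function with an input type and an output type."""
    def __init__(self, name, fn, in_type, out_type):
        self.name = name
        self.fn = fn
        self.in_type = in_type
        self.out_type = out_type

    def accepts(self, value):
        # A connection is allowed only when the types are compatible.
        return isinstance(value, self.in_type)

NEURONS = [
    Neuron("double", lambda x: x * 2, int, int),
    Neuron("inc",    lambda x: x + 1, int, int),
    Neuron("to_str", str,             int, str),  # str output: excluded by the int-only search below
]

def learn(start, target, max_depth=8):
    """A*-style best-first search: priority = path length + |value - target|.
    The heuristic is not admissible, so the path found is not necessarily the shortest."""
    tie = itertools.count()                       # unique tie-breaker for the heap
    heap = [(abs(start - target), 0, next(tie), start, [])]
    while heap:
        _, depth, _, value, path = heapq.heappop(heap)
        if value == target:
            return path                           # names of the chained neurons
        if depth == max_depth:
            continue
        for n in NEURONS:
            if n.accepts(value) and n.out_type is int:   # stay in int space for a numeric target
                new_value = n.fn(value)
                priority = depth + 1 + abs(new_value - target)
                heapq.heappush(heap, (priority, depth + 1, next(tie),
                                      new_value, path + [n.name]))
    return None

print(learn(3, 14))   # ['double', 'double', 'inc', 'inc']  (3 -> 6 -> 12 -> 13 -> 14)
```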
Here are the final results of this approach on the ARC AGI benchmark:
- Total number of tasks solved: 446/1000
- Success rate: 44.6 %
- Total execution time (on a CPU with 12 logical processors): 15 hours
- Size of the data to analyze: 170 MB
A C++ version of the engine has also been developed and appears to be 5 times faster.
Here are some example test files:
- hello_world.py: This adds functions related to str, searches for connections expressing the relationship x + y + z from the input values, and shows the generalization with a user value.
- test_ai1.py: This adds functions related to numbers and str and displays the connections for the number 103.
- test_ai2.py: This adds functions related to int and displays all possible connections starting from the number 18.
- test_ai3.py: This adds functions related to int, searches for connections expressing the relationship x * y + z from input values (chosen to be unambiguous), and shows the generalization with user values (see the sketch after this list).
- test_ai4.py: This adds functions related to np.ndarray, searches for connections expressing the relationship x * y + z from input values, and shows the generalization with user values.
- test_full_ai.py: This adds functions related to all currently implemented types, activates certain types of neurons depending on the case, and displays either the connections of numbers between 100 and 999 or all possible connections of depth 2.
- test_number.py: The goal is to find the connections that form any number from elementary building blocks. Learning takes about 20 seconds for a total of 147 neurons.
- test_expression.py: The goal is to find the connections that form any expression from elementary building blocks. After learning the numbers, learning takes about 30 seconds for a total of 193 neurons.
- test_word.py: This example aims to reconstruct French words from syllables.
- test_sentence.py: This example aims to reconstruct simple sentences from words.
- test_sympy.py: This example aims to find the symbolic transformation between an input expression and an output expression.
- test_cv2.py: This example aims to find the transformation between an input image and an output image.
- test_syracuse.py: This example aims to find the Syracuse sequence from intermediate steps.
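As referenced in the test_ai3.py entry above, the x * y + z relationship can be pictured as a search over compositions of elementary operations that reproduces all input/output examples and then generalizes to new values. The sketch below is only a hypothetical brute-force illustration of that idea, not the framework's code:

```python
# Hypothetical sketch of the kind of search test_ai3.py performs (not the
# framework's real code): given (x, y, z) -> result examples, find which
# composition of elementary int operations reproduces all of them, then
# apply the discovered composition to new user values.
import itertools
import operator

OPS = {"+": operator.add, "-": operator.sub, "*": operator.mul}

# Examples chosen to be unambiguous for the relationship x * y + z.
EXAMPLES = [((2, 3, 5), 11), ((4, 7, 1), 29), ((6, 2, 9), 21)]

def search():
    # Try every ((x op1 y) op2 z) composition of two elementary operations.
    for (n1, f1), (n2, f2) in itertools.product(OPS.items(), repeat=2):
        candidate = lambda x, y, z, f1=f1, f2=f2: f2(f1(x, y), z)
        if all(candidate(*args) == out for args, out in EXAMPLES):
            return f"(x {n1} y) {n2} z", candidate
    return None, None

expr, fn = search()
print(expr)        # (x * y) + z
print(fn(5, 6, 7)) # generalization to user values: 37
```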
A basic GUI (gui.py) is available for manipulating a model. The neuron graph can be visualized in 2D or 3D, with colors.
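gui.py is the reference tool for this; purely as an illustration of what a colored 2D/3D neuron-graph view can look like, here is a small standalone sketch using plotly (one of the listed dependencies), with made-up nodes and edges rather than a real ai model:

```python
# Standalone plotly sketch of a colored 3D node/edge view; the nodes, edges,
# and colors below are invented for illustration only.
import plotly.graph_objects as go

nodes = {"input": (0, 0, 0), "double": (1, 0, 1), "inc": (1, 1, 0), "output": (2, 0, 0)}
edges = [("input", "double"), ("double", "inc"), ("inc", "output")]
colors = {"input": "green", "double": "blue", "inc": "blue", "output": "red"}

# One line trace per edge, one marker trace for all nodes.
edge_traces = [
    go.Scatter3d(
        x=[nodes[a][0], nodes[b][0]],
        y=[nodes[a][1], nodes[b][1]],
        z=[nodes[a][2], nodes[b][2]],
        mode="lines",
        line=dict(color="gray"),
        showlegend=False,
    )
    for a, b in edges
]
node_trace = go.Scatter3d(
    x=[p[0] for p in nodes.values()],
    y=[p[1] for p in nodes.values()],
    z=[p[2] for p in nodes.values()],
    mode="markers+text",
    text=list(nodes),
    marker=dict(size=8, color=[colors[n] for n in nodes]),
)

go.Figure(edge_traces + [node_trace]).show()
```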
| Task | My engine (CPU) | LLM (estimated, API/GPU) | Comment |
|---|---|---|---|
| Simple numeric (e.g., 2+3) | 0.01 s ✅ | ~1 s | Instantaneous and exact, versus API latency and a risk of trivial errors. |
| Complex numeric (e.g., 3.1*x + y) | 0.01 s ✅ | ~1–2 s | Constant time and explainable; an LLM can yield an equivalent but not necessarily identical expression. |
| Symbolic diff (e.g., differentiating sin(x)*exp(x)) | 0.70 s ✅ | ~2–5 s ❌ | Exact and explainable results; LLMs often hallucinate or fail. |
| Word reconstruction (e.g., reconstructing a word from syllables) | 0.11 s ✅ | ~1–3 s ❌ | Deterministic and reproducible; an LLM generates variants that are not guaranteed. |
✅ = exact and deterministic
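The symbolic-differentiation row (and test_sympy.py) relies on exact results; the closed-form derivative it refers to can be checked independently with sympy, one of the listed dependencies (this is not the ai engine itself):

```python
# Independent check of the exact derivative from the symbolic-diff row,
# using sympy directly rather than the ai engine.
import sympy as sp

x = sp.symbols("x")
expr = sp.sin(x) * sp.exp(x)
derivative = sp.diff(expr, x)
print(derivative)   # exp(x)*sin(x) + exp(x)*cos(x)  (term order may vary)
print(sp.simplify(derivative - (sp.sin(x) + sp.cos(x)) * sp.exp(x)) == 0)  # True
```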
conda create --name gt -c conda-forge graph-tool
conda activate gt
pip install bs4
pip install colour
pip install dill
pip install matplotlib
pip install numpy
pip install opencv-python
pip install pandas
pip install plotly
pip install pyqtgraph
pip install pyside6
pip install requests
pip install sympy
pip install textdistance
pip install torch