This project translates *Tensor Logic: The Language of AI* into a runnable, well-tested Python codebase. It serves two complementary purposes:
- Learning companion – walk through each section of the paper with annotated code and executable experiments.
- Reusable library – import `tensor_logic` into your own projects to build neural, symbolic, kernel, graphical, or embedding-based systems using tensor equations.
```shell
python3 -m venv .venv
source .venv/bin/activate
pip install -e .
python3 -m pytest  # run the full validation suite
```

If you use pre-commit, install the hooks once via `pre-commit install`.
For optional tutorial dependencies (Apple MLX + Weights & Biases logging), install:
```shell
pip install -e .[examples]
```

The `docs/` directory contains a guided reading of the paper with links into the code:
- `docs/paper_companion.md` – section-by-section outline mapping paper concepts to modules.
- `docs/library_usage.md` – short recipes that exercise the core APIs.
- `docs/tutorials/neurosymbolic_recommendation.md` – build a neuro-symbolic recommendation pipeline end-to-end.
- `docs/tutorials/mlx_xor_classifier.md` – train a neural network with MLX and bridge the weights back into tensor logic.
Prefer working directly in notebooks? Each test under `tests/` doubles as an executable example; start with `tests/models/test_neural.py` to explore Section 4.1.
| Paper Section | Code | Tests | Highlights |
|---|---|---|---|
| §3 Tensor Logic | `tensor_logic/core/{tensor,ops,equation,program}.py` | `tests/core/` | Named-axis tensors, joins, projections, inference, derivatives |
| §4.1 Neural | `tensor_logic/models/neural.py` | `tests/models/test_neural.py` | MLPs, convolutions, GNN layer, single-head attention |
| §4.2 Symbolic | `tensor_logic/models/symbolic.py` | `tests/models/test_symbolic.py` | Recursive ancestor reasoning with tensor joins |
| §4.3 Kernel | `tensor_logic/models/kernel.py` | `tests/models/test_kernel.py` | Linear kernel construction and prediction |
| §4.4 Graphical | `tensor_logic/models/graphical.py` | `tests/models/test_graphical.py` | Bayesian marginal inference via tensor equations |
| §5 Embeddings | `tensor_logic/models/embedding.py` | `tests/models/test_embedding.py` | Superposition reasoning in embedding space |
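To get a feel for the §3 building blocks before opening the library, here is a rough sketch in plain NumPy (deliberately *not* the `tensor_logic` API): a tensor equation such as `Grandparent(x, z) = Parent(x, y) Parent(y, z)` is a join over the shared axis `y` followed by a projection onto `(x, z)`, which `np.einsum` expresses with named axes.

```python
import numpy as np

# Boolean relation Parent(x, y) over three entities: 0 -> 1 -> 2.
parent = np.array([
    [0.0, 1.0, 0.0],
    [0.0, 0.0, 1.0],
    [0.0, 0.0, 0.0],
])

# Join on the shared axis y, then project onto (x, z):
# Grandparent(x, z) = sum_y Parent(x, y) * Parent(y, z).
grandparent = np.einsum("xy,yz->xz", parent, parent)

# Clamp back to {0, 1} to stay in Boolean semantics.
grandparent = (grandparent > 0).astype(float)
print(grandparent[0, 2])  # → 1.0: entity 0 is a grandparent of entity 2
```

The library's equations follow the same join-then-project pattern, with the named axes made explicit in each `Tensor`.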
```python
from tensor_logic import Tensor, TensorStore, TensorProgram
from tensor_logic.models.neural import (
    DenseLayerSpec,
    build_mlp_program,
)

store = TensorStore()
store.add(Tensor("X", ("i",), [1.0, -2.0, 0.5]))
store.add(Tensor("W1", ("h", "i"), [[0.5, -0.3, 0.8], [-0.2, 0.4, 0.1]]))
store.add(Tensor("W2", ("o", "h"), [[1.0, -1.5]]))

program = build_mlp_program(
    store=store,
    input_name="X",
    input_axes=("i",),
    layer_specs=[
        DenseLayerSpec(weight_name="W1", output_name="Hidden"),
        DenseLayerSpec(weight_name="W2", output_name="Output"),
    ],
)
program.forward()
print(store.get("Output").data)
```

See `docs/library_usage.md` for more examples spanning symbolic reasoning, kernel machines, graphical models, and embedding-based queries.
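As a sanity check on the quickstart numbers, the same forward pass can be reproduced with plain NumPy. This sketch assumes the dense layers are purely linear; whether `DenseLayerSpec` applies a nonlinearity by default is not shown above, so the library's `Output` may differ if an activation is involved.

```python
import numpy as np

x = np.array([1.0, -2.0, 0.5])                        # X over axis i
w1 = np.array([[0.5, -0.3, 0.8], [-0.2, 0.4, 0.1]])   # W1 over axes (h, i)
w2 = np.array([[1.0, -1.5]])                          # W2 over axes (o, h)

hidden = np.einsum("hi,i->h", w1, x)       # Hidden(h) = W1(h, i) X(i)
output = np.einsum("oh,h->o", w2, hidden)  # Output(o) = W2(o, h) Hidden(h)
print(hidden, output)  # hidden ≈ [1.5, -0.95], output ≈ [2.925]
```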
- Python ≥ 3.9 (see `pyproject.toml` for dependencies)
- Run `python3 -m pytest` before proposing changes
- Keep docstrings and inline comments focused on intent; the repository is meant to be read alongside the paper
- Contributions should extend `docs/` to explain new ideas or experiments