GaleMind ML Inference Server v0.1 - A high-performance machine learning inference server providing both REST and gRPC APIs.
- Rust (1.70+): Install from rustup.rs
- Make: Required for using the Makefile commands
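If Rust is not installed yet, the standard installer from rustup.rs can be used; the snippet below is a minimal sketch and assumes a Unix-like shell:

```bash
# Install the Rust toolchain via rustup (from rustup.rs)
curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh

# Verify the toolchain meets the 1.70+ requirement
rustc --version
cargo --version
```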
CI/CD:

- Pushing a commit to the `main` or `develop` branch triggers a Docker image build and (upon success) a push to `galemindzen`'s private Docker Hub repository.
- Pushing a `v*` tag onto any commit triggers a Docker image build for that commit and (upon success) a push to `galemindzen`'s private Docker Hub repository;
  - if that "*" ends in `+k8s`, the image is also deployed onto GaleMind's Linode Kubernetes cluster.
To get started:

- Clone the repository:

  ```bash
  git clone <repository-url>
  cd galemind-server
  ```

- Install Rust dependencies:

  ```bash
  cargo build
  ```

Using the Makefile:

```bash
# Build the entire project (includes format and test)
make all
# Run tests only
make test
# Format code
make format
# Run the server
make run
```

Or using cargo directly:

```bash
# Build the project
cargo build
# Build for production (optimized)
cargo build --release
# Run tests
cargo test
# Format code
cargo fmt
```

Set the required environment variables in the .env file (recommended):

```bash
export MODELS_DIR=/path/to/your/models
```
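For reference, a minimal .env file with the same setting might look like the sketch below; the path is a placeholder, and whether an `export` prefix is needed depends on how the Makefile loads the file:

```bash
# .env (illustrative example; adjust the path to your local model directory)
MODELS_DIR=/path/to/your/models
```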
Using the Makefile (automatically loads environment variables from .env):

```bash
make run
```

Or using cargo directly:

```bash
cargo run -p galemind start
```

The server supports the following command-line options:

```bash
cargo run -p galemind start \
--rest-host 0.0.0.0 \
--rest-port 8080 \
--grpc-host 0.0.0.0 \
--grpc-port 50051
```

- REST API: Available at `http://localhost:8080` (default)
- gRPC API: Available at `localhost:50051` (default)
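Once the server is up, a quick way to confirm both endpoints are reachable is sketched below; the REST routes and gRPC services are not documented here, so this only checks that the default ports are listening (the grpcurl call additionally assumes server reflection is enabled):

```bash
# Check that the REST server responds on its default port
curl -i http://localhost:8080/

# List gRPC services on the default port (requires grpcurl and server reflection)
grpcurl -plaintext localhost:50051 list
```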
The Makefile provides the following commands:

| Command | Description |
|---|---|
| `make all` | Format code, run tests, and build the project |
| `make test` | Run all tests |
| `make format` | Format code using `cargo fmt` |
| `make run` | Start the GaleMind server |
This is a Rust workspace containing:
- `src/galemind/` - Main server application
- `src/grpc_server/` - gRPC server implementation
- `src/rest_server/` - REST API server implementation
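Because this is a workspace, individual members can be built or tested on their own; the sketch below assumes the package names match the directory names above (only `galemind` is confirmed by the run command earlier):

```bash
# Build or test a single workspace member
cargo build -p galemind
cargo test -p rest_server   # package name assumed from the directory layout
```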
See the LICENSE file for details.