Anna-Bele/COLT
# COLT: Lightweight Multi-LLM Collaboration through Shared MCTS Reasoning for Model Compilation

Detailed implementations of this project live in the folder `python/tvm/meta_schedule/search_strategy`. The project builds on TVM, an open-source compiler stack for deep learning systems, distributed under the Apache-2.0 license.
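At the heart of any MCTS-based search strategy is a selection rule that trades off exploiting high-scoring branches against exploring rarely visited ones, typically UCT. The following is a minimal, self-contained sketch of generic UCT selection (illustrative names, not COLT's actual code):

```python
import math

def uct_score(node_value, node_visits, parent_visits, c=1.4):
    """Upper Confidence bound for Trees: exploitation term + exploration bonus."""
    if node_visits == 0:
        return float("inf")  # always try unvisited children first
    return node_value / node_visits + c * math.sqrt(math.log(parent_visits) / node_visits)

class Node:
    def __init__(self):
        self.value = 0.0   # cumulative reward from rollouts through this node
        self.visits = 0

def select_child(children, parent_visits):
    """Pick the child maximizing the UCT score."""
    return max(children, key=lambda n: uct_score(n.value, n.visits, parent_visits))
```

A candidate schedule with a high average reward is revisited often, while the `sqrt(log(N)/n)` bonus guarantees that under-explored alternatives still get sampled.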

## Usage

1. Install TVM and configure the environment as described in TVM's documentation.
2. Configure the strategy: initialize it with your chosen hyperparameters.

   ```python
   my_strategy = MCTSSearchPyFull(
       population_size=3,
       init_measured_ratio=0,
       init_min_unmeasured=3,
       max_fail_count=20,
       genetic_num_iters=3,
       genetic_mutate_prob=0.85,
       genetic_max_fail_count=2,
       trace_commit=True,
       mcts_num_threads=1,
       mcts_num_rollouts_per_expansion=1,
       use_llm=True,
       llm_budget=500,
   )
   ```
3. Run tuning: pass the strategy object to `tune_tir` as a parameter.

   ```python
   database = ms.tune_tir(
       mod=MyModule,
       target="llvm --num-cores=1",
       max_trials_global=64,
       num_trials_per_iter=64,
       work_dir="./tune_tmp",
       strategy=my_strategy,
   )
   ```
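The hyperparameters above suggest the overall shape of the search: `genetic_num_iters` rounds of mutation applied with probability `genetic_mutate_prob`, with an LLM consulted for at most `llm_budget` proposals before falling back to cheaper random mutations. A schematic, self-contained sketch of such a loop (hypothetical names and simplified logic, not COLT's actual implementation; `genetic_max_fail_count` and trace handling are omitted):

```python
import random

def evolve(population, llm_propose, random_propose, score,
           num_iters=3, mutate_prob=0.85, llm_budget=500):
    """Schematic genetic loop: mutate candidates each round, preferring
    LLM-generated proposals until the LLM call budget is exhausted."""
    remaining = llm_budget
    for _ in range(num_iters):
        next_pop = []
        for cand in population:
            if random.random() >= mutate_prob:
                next_pop.append(cand)                  # carry over unchanged
                continue
            if remaining > 0:
                remaining -= 1
                next_pop.append(llm_propose(cand))     # spend budget on the LLM
            else:
                next_pop.append(random_propose(cand))  # cheap fallback mutation
        # keep the highest-scoring candidates for the next round
        population = sorted(next_pop, key=score, reverse=True)[:len(population)]
    return population
```

In TVM's meta-schedule setting, the candidates would be schedule traces and `score` a learned cost model; here they are stand-ins to show how the budget and mutation-probability knobs interact.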
