Howdy,
I am trying to build the project, but I run into a couple of issues:
The first issue, which is probably causing the second: in the Cargo.toml file, the build fails on llama-core and endpoints. Those entries have `path` properties pointing to files that do not appear to be part of the project, so I removed the paths and ran the build again.
The second time around, the build fails again, this time while compiling. The error is:
```
   Compiling llama-core v0.14.1
error[E0308]: mismatched types
    --> /Users/jperedo/.cargo/registry/src/index.crates.io-6f17d22bba15001f/llama-core-0.14.1/src/chat.rs:1873:56
     |
1873 | match chat_prompt.build_with_tools(&mut chat_request.messages, None) {
     |                   ---------------- ^^^^^^^^^^^^^^^^^^^^^^^^^^ expected `ChatCompletionRequestMessage`, found a different `ChatCompletionRequestMessage`
     |                   |
     |                   arguments to this method are incorrect
     |
     = note: `ChatCompletionRequestMessage` and `ChatCompletionRequestMessage` have similar names, but are actually distinct types
note: `ChatCompletionRequestMessage` is defined in crate `endpoints`
    --> /Users/jperedo/.cargo/registry/src/index.crates.io-6f17d22bba15001f/endpoints-0.12.0/src/chat.rs:1342:1
     |
1342 | pub enum ChatCompletionRequestMessage {
     | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
note: `ChatCompletionRequestMessage` is defined in crate `endpoints`
    --> /Users/jperedo/.cargo/registry/src/index.crates.io-6f17d22bba15001f/endpoints-0.13.1/src/chat.rs:1345:1
     |
1345 | pub enum ChatCompletionRequestMessage {
     | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
     = note: perhaps two different versions of crate `endpoints` are being used?
note: method defined here
    --> /Users/jperedo/.cargo/registry/src/index.crates.io-6f17d22bba15001f/chat-prompts-0.11.2/src/chat/mod.rs:44:8
     |
44   | fn build_with_tools(
```

Any idea how I can get around this?
Here is my version of the Cargo.toml file after removing the `path` properties:
```toml
[package]
name = "sd-api-server"
version = "0.1.0"
edition = "2021"

[dependencies]
clap = { version = "4.4.6", features = ["cargo", "derive"] }
anyhow = "1.0.80"
log = { version = "0.4.21", features = ["std", "kv", "kv_serde"] }
wasi-logger = { version = "0.1.2", features = ["kv"] }
hyper_wasi = { version = "0.15", features = ["full"] }
tokio_wasi = { version = "1", features = ["full"] }
serde = { version = "1.0", features = ["derive"] }
serde_json = "1.0"
thiserror = "1"
llama-core = { version = "0.14.1", features = ["logging"] }
endpoints = { version = "0.12.0" }
uuid = { version = "1.4", features = ["v4", "fast-rng", "macro-diagnostics"] }
multipart-2021 = "0.19.0"
```

Thanks!
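One thing I noticed: the error output shows both `endpoints-0.12.0` and `endpoints-0.13.1` in the dependency graph, which suggests `llama-core` 0.14.1 depends on `endpoints` 0.13.1 while my manifest pins 0.12.0. Since 0.12 and 0.13 are semver-incompatible, Cargo keeps both, and the two `ChatCompletionRequestMessage` enums become distinct types. A possible workaround would be aligning my pinned version with the one `llama-core` expects; this is a guess based on the error output, and `cargo tree --duplicates` should confirm which crate pulls in which version:

```toml
# Hypothetical fix sketch: align endpoints with the version that
# llama-core 0.14.1 appears to require (0.13.1, per the error output),
# so only one copy of the crate ends up in the dependency graph.
llama-core = { version = "0.14.1", features = ["logging"] }
endpoints = { version = "0.13.1" }
```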