When I use model="meta.llama3-8b-instruct-v1:0", I get:

ValueError: Model meta.llama3-8b-instruct-v1:0 is not supported. Supported models are: ['anthropic.claude-v2']

Isn't there support for other models such as Llama 3 and Mistral? And should the chat format for these models be specified via os.environ["BEDROCK_PROMPT"] = "vicuna"?
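For context, here is a minimal sketch of how I can reach the same model directly with boto3, outside this library. It assumes standard AWS credentials and a region where meta.llama3-8b-instruct-v1:0 is enabled, and it uses the Llama 3 instruct prompt format (which is not the vicuna format, hence my question about BEDROCK_PROMPT).

```python
# Sketch: calling the Bedrock Llama 3 model directly with boto3 for comparison.
# Assumes AWS credentials are configured and the model is enabled in this region.
import json
import boto3

client = boto3.client("bedrock-runtime", region_name="us-east-1")

# Llama 3 instruct prompt format (different from the vicuna chat format).
prompt = (
    "<|begin_of_text|><|start_header_id|>user<|end_header_id|>\n\n"
    "What is AWS Bedrock?<|eot_id|>"
    "<|start_header_id|>assistant<|end_header_id|>\n\n"
)

response = client.invoke_model(
    modelId="meta.llama3-8b-instruct-v1:0",
    contentType="application/json",
    accept="application/json",
    body=json.dumps({
        "prompt": prompt,
        "max_gen_len": 512,
        "temperature": 0.5,
        "top_p": 0.9,
    }),
)

# The response body is a streaming payload; the generated text is under "generation".
result = json.loads(response["body"].read())
print(result["generation"])
```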