Description
ValueError: The chat model does not have an 'llm' attribute. Available attributes: ['__abstractmethods__', '__class__', '__delattr__', '__dict__', '__dir__', '__doc__', '__eq__', '__format__', '__ge__', '__getattribute__', '__gt__', '__hash__', '__init__', '__init_subclass__', '__le__', '__lt__', '__module__', '__ne__', '__new__', '__reduce__', '__reduce_ex__', '__repr__', '__setattr__', '__sizeof__', '__slots__', '__str__', '__subclasshook__', '__weakref__', '_abc_impl', 'chat']
Traceback:
File "/root/miniconda3/envs/chatbot/lib/python3.10/site-packages/streamlit/runtime/scriptrunner/script_runner.py", line 534, in _run_script
exec(code, module.__dict__)
File "/root/chatbot-main/chatbot-main/web.py", line 195, in <module>
for res in client.completion(prompt, cur_llm_mode, cur_agent_mode, cur_dialogue_mode, "DEFAULT"):
File "/root/chatbot-main/chatbot-main/chat/chat_client.py", line 23, in completion
raise ValueError(f"The chat model does not have an 'llm' attribute. Available attribu
I changed the API key and the v2ray port in run.sh; when I start it, it responds with this error.
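The raise at chat_client.py line 23 suggests the client checks the model object for an `llm` attribute before generating, and the model being constructed only exposes a `chat` method. A minimal sketch of that failure mode, assuming a `hasattr` guard (the names `ChatModel` and `completion` here are hypothetical stand-ins, not the repo's actual code):

```python
class ChatModel:
    """Hypothetical stand-in for the configured chat model: it exposes
    a `chat` method but no `llm` attribute, matching the dir() output
    in the error message above."""

    def chat(self, prompt: str) -> str:
        return "..."


def completion(model, prompt: str) -> str:
    # A guard like this would produce the reported ValueError whenever the
    # selected model class lacks an `llm` attribute (sketch, not repo code).
    if not hasattr(model, "llm"):
        raise ValueError(
            f"The chat model does not have an 'llm' attribute. "
            f"Available attributes: {dir(model)}"
        )
    return model.llm.generate(prompt)


try:
    completion(ChatModel(), "hi")
except ValueError as e:
    print("'llm' attribute missing:", "'llm' attribute" in str(e))
```

If the sketch matches the real code path, the fix is a configuration one: the mode selected in web.py must map to a model class that actually sets `self.llm`, so it is worth checking that the api-key/model settings in run.sh select a supported `cur_llm_mode`.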