
Does this tool support DeepSeek Model? #28

@Hollow-cat

Description


First of all, I'm very grateful that you developed this tool and made it open source. I ran into a problem when using it to connect to the DeepSeek model.

I created `mcp-server-config.json` in my project:

```json
{
  "systemPrompt": "You are an AI assistant helping a software engineer...",
  "llm": {
    "provider": "deepseek",
    "model": "DeepSeek-R1",
    "api_key": "xxxxx",
    "base_url": "https://xxx/api"
  }
}
```
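Before digging into the network layer, it can help to confirm the file parses and carries the keys the CLI reads. A stdlib-only sketch (the required key names are assumptions inferred from the config above, not the tool's documented schema):

```python
import json
from pathlib import Path

# Key names assumed from the config shown above.
REQUIRED_LLM_KEYS = ("provider", "model", "api_key", "base_url")

def validate_config(path: str) -> dict:
    """Load mcp-server-config.json and sanity-check its llm section."""
    # json.loads raises json.JSONDecodeError on malformed JSON
    # (e.g. an unterminated string or a missing closing brace).
    cfg = json.loads(Path(path).read_text(encoding="utf-8"))
    llm = cfg.get("llm", {})
    missing = [k for k in REQUIRED_LLM_KEYS if k not in llm]
    if missing:
        raise KeyError(f"llm section is missing: {missing}")
    return cfg
```

Running this before launching the CLI separates "the config never parsed" from "the config parsed but the endpoint is unreachable".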

Then I ran the command `llm "hello"` and got this error:

```
  File "D:\UserData\004MCP\mcp-cli-test\.venv\Lib\site-packages\httpx\_transports\default.py", line 118, in map_httpcore_exceptions
    raise mapped_exc(message) from exc
httpx.ConnectError

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "D:\UserData\004MCP\mcp-cli-test\.venv\Lib\site-packages\mcp_client_cli\cli.py", line 252, in handle_conversation
    async for chunk in agent_executor.astream(
    ...<8 lines>...
                break
  File "D:\UserData\004MCP\mcp-cli-test\.venv\Lib\site-packages\langgraph\pregel\__init__.py", line 2732, in astream
    async for _ in runner.atick(
    ...<7 lines>...
            yield o
  File "D:\UserData\004MCP\mcp-cli-test\.venv\Lib\site-packages\langgraph\pregel\runner.py", line 392, in atick
    _panic_or_proceed(
    ~~~~~~~~~~~~~~~~~^
        futures.done.union(f for f, t in futures.items() if t is not None),
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        timeout_exc_cls=asyncio.TimeoutError,
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        panic=reraise,
        ^^^^^^^^^^^^^^
    )
    ^
  File "D:\UserData\004MCP\mcp-cli-test\.venv\Lib\site-packages\langgraph\pregel\runner.py", line 499, in _panic_or_proceed
    raise exc
  File "D:\UserData\004MCP\mcp-cli-test\.venv\Lib\site-packages\langgraph\pregel\retry.py", line 128, in arun_with_retry
    return await task.proc.ainvoke(task.input, config)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\UserData\004MCP\mcp-cli-test\.venv\Lib\site-packages\langgraph\utils\runnable.py", line 672, in ainvoke
    input = await asyncio.create_task(
            ^^^^^^^^^^^^^^^^^^^^^^^^^^
        step.ainvoke(input, config, **kwargs), context=context
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    )
    ^
  File "D:\UserData\004MCP\mcp-cli-test\.venv\Lib\site-packages\langgraph\utils\runnable.py", line 431, in ainvoke
    ret = await asyncio.create_task(coro, context=context)
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\UserData\001 work\TPDS\004MCP\mcp-cli-test\.venv\Lib\site-packages\langgraph\prebuilt\chat_agent_executor.py", line 763, in acall_model
    response = cast(AIMessage, await model_runnable.ainvoke(state, config))
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\UserData\001 work\TPDS\004MCP\mcp-cli-test\.venv\Lib\site-packages\langchain_core\runnables\base.py", line 3075, in ainvoke
    input = await coro_with_context(part(), context, create_task=True)
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\UserData\001 work\TPDS\004MCP\mcp-cli-test\.venv\Lib\site-packages\langchain_core\runnables\base.py", line 5429, in ainvoke
    return await self.bound.ainvoke(
           ^^^^^^^^^^^^^^^^^^^^^^^^^
    ...<3 lines>...
    )
    ^
  File "D:\UserData\001 work\TPDS\004MCP\mcp-cli-test\.venv\Lib\site-packages\langchain_core\language_models\chat_models.py", line 391, in ainvoke
    llm_result = await self.agenerate_prompt(
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    ...<8 lines>...
    )
    ^
  File "D:\UserData\001 work\TPDS\004MCP\mcp-cli-test\.venv\Lib\site-packages\langchain_core\language_models\chat_models.py", line 957, in agenerate_prompt
    return await self.agenerate(
           ^^^^^^^^^^^^^^^^^^^^^
        prompt_messages, stop=stop, callbacks=callbacks, **kwargs
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    )
    ^
  File "D:\UserData\001 work\TPDS\004MCP\mcp-cli-test\.venv\Lib\site-packages\langchain_core\language_models\chat_models.py", line 915, in agenerate
    raise exceptions[0]
  File "D:\UserData\001 work\TPDS\004MCP\mcp-cli-test\.venv\Lib\site-packages\langchain_core\language_models\chat_models.py", line 1072, in _agenerate_with_cache
    async for chunk in self._astream(messages, stop=stop, **kwargs):
    ...<7 lines>...
        chunks.append(chunk)
  File "D:\UserData\001 work\TPDS\004MCP\mcp-cli-test\.venv\Lib\site-packages\langchain_openai\chat_models\base.py", line 1105, in _astream
    response = await self.async_client.create(**payload)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\UserData\001 work\TPDS\004MCP\mcp-cli-test\.venv\Lib\site-packages\openai\resources\chat\completions\completions.py", line 2028, in create
    return await self._post(
           ^^^^^^^^^^^^^^^^^
    ...<45 lines>...
    )
    ^
  File "D:\UserData\004MCP\mcp-cli-test\.venv\Lib\site-packages\openai\_base_client.py", line 1742, in post
    return await self.request(cast_to, opts, stream=stream, stream_cls=stream_cls)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\UserData\004MCP\mcp-cli-test\.venv\Lib\site-packages\openai\_base_client.py", line 1516, in request
    raise APIConnectionError(request=request) from err
openai.APIConnectionError: Connection error.
```

I can use the same base_url and API key in other tools, like Cline in VS Code. Did I make a mistake somewhere? I hope you can help clear this up.

System: Windows 10
Python: 3.13
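Since the same credentials work in Cline, one way to narrow this down is to probe the endpoint from the same Python environment the CLI runs in, bypassing httpx/openai entirely. A stdlib-only sketch; the `/models` path and the placeholder values are assumptions, so substitute your real `base_url` and `api_key`:

```python
import urllib.error
import urllib.request

# Hypothetical placeholders; use the values from your mcp-server-config.json.
BASE_URL = "https://xxx/api"
API_KEY = "xxxxx"

def probe(base_url: str, api_key: str) -> int:
    """Return an HTTP status code if the endpoint is reachable at all.

    Any response, even 401 or 404, proves DNS/TCP/TLS work, which would point
    at the CLI; a URLError means the failure is at the network level (DNS,
    proxy, firewall), matching the httpx.ConnectError in the traceback above.
    """
    req = urllib.request.Request(
        f"{base_url.rstrip('/')}/models",
        headers={"Authorization": f"Bearer {api_key}"},
    )
    try:
        with urllib.request.urlopen(req, timeout=10) as resp:
            return resp.status
    except urllib.error.HTTPError as e:
        return e.code  # reachable, but the server rejected the request

# probe(BASE_URL, API_KEY)  # run with your real values
```

If this probe succeeds while the CLI still fails, system proxy settings are worth checking: httpx honors `HTTP_PROXY`/`HTTPS_PROXY` environment variables, and other tools may route around the proxy differently on Windows.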
