
Change to local Ollama but still require API_KEY #13

@wkholy

I changed the setting from openai to ollama and ran `ollama serve`. I also set the environment variable OPENAI_API_KEY='12434' to satisfy the key check. When I ran the command following the instructions, it showed the error below.

raise self._make_status_error_from_response(err.response) from None
openai.AuthenticationError: Error code: 401 - {'error': {'message': 'Incorrect API key provided: 12434. You can find your API key at https://platform.openai.com/account/api-keys.', 'type': 'invalid_request_error', 'param': None, 'code': 'invalid_api_key'}}
Error running command: Command '['python', 'database/script/faiss_command_help.py', '--database_path=./database']' returned non-zero exit status 1.

I would like to ask whether the API key is still needed for any purpose other than the deep learning part. I would also appreciate a tutorial on nozzle profile optimization or on simulation with multiphase settings.
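
For reference, here is a minimal sketch of how I would expect an OpenAI-compatible client to be pointed at a local Ollama server. The base_url, the placeholder key value, and the model name are assumptions on my part, not taken from this repository's code:

```python
# Minimal sketch (not this repository's code): pointing the OpenAI Python
# client at Ollama's OpenAI-compatible endpoint instead of api.openai.com.
# The base_url, placeholder key, and model name below are assumptions.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:11434/v1",  # Ollama's default local endpoint
    api_key="ollama",                      # Ollama ignores the key, but the client requires one
)

response = client.embeddings.create(
    model="nomic-embed-text",              # assumed embedding model already pulled into Ollama
    input="example text to embed",
)
print(len(response.data[0].embedding))
```

If the script that builds the FAISS database still constructs its client without a base_url, it will keep calling api.openai.com and reject the dummy key, which would explain the 401 above.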
