
issue using model with llama-server #48

@scalar27

Description

Sorry if this is not the right place for this issue:

I am running llama-server with the Q8_0.gguf model and the llama-joycaption-beta-one-llava-mmproj-model-f16.gguf multimodal projector.

It loads fine and seems to work initially, but I get a lot of browser pop-up errors like "Failed to load image or audio file", and the terminal shows:

    mtmd_helper_bitmap_init_from_buf: failed to decode image bytes
    srv log_server_r: request: POST /v1/chat/completions 127.0.0.1 400

I can't tell whether this is a model issue or a llama-server issue; any help would be appreciated. I'm trying .png images, which I'm not sure are supported, but also .jpg.
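In case it helps isolate the problem from the browser UI, this is roughly the kind of request I would expect to work against the OpenAI-compatible endpoint. It is only a minimal sketch, assuming the server is listening on localhost:8080 and accepts images as base64 data URIs in the chat completions payload; test.jpg is a placeholder path.

    import base64
    import requests

    # Assumed server address and a placeholder image path; adjust to your setup.
    SERVER_URL = "http://localhost:8080/v1/chat/completions"
    IMAGE_PATH = "test.jpg"

    # Read the image and encode it as a base64 data URI, which the
    # OpenAI-style image_url content part is expected to carry.
    with open(IMAGE_PATH, "rb") as f:
        b64 = base64.b64encode(f.read()).decode("utf-8")

    payload = {
        "messages": [
            {
                "role": "user",
                "content": [
                    {"type": "text", "text": "Describe this image."},
                    {
                        "type": "image_url",
                        "image_url": {"url": f"data:image/jpeg;base64,{b64}"},
                    },
                ],
            }
        ],
    }

    resp = requests.post(SERVER_URL, json=payload, timeout=300)
    resp.raise_for_status()
    print(resp.json()["choices"][0]["message"]["content"])

If a request like this succeeds but the web UI still fails, that would point at how the UI encodes the image rather than at the model files.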
