
Stop not always working, causing duplicate output #301

@nico202

Description



Describe the bug

Sometimes pressing stop does not actually stop the generation, and I end up with multiple overlapping answers. I'm using eca-emacs, so I'm not sure whether this bug belongs here or in eca-emacs.

Example output (kept verbatim; the Italian text is garbled because two streams are interleaved):

```
Ervixplorce, e

Per cl'orompchestletratorare le spe agecgiiuficnge l'he, ho eventancora o alcuniall chiara coimenti da connec setassartoi `da_c:ategor

**i1zza. Tripi die`

eventi2. nFellase di caategorizz caziodonea:

  • Me**ssa:ggi L'LLMut viene inenterte (da Terogalto per cegram, ompMletare atrix,inform azioEmnail) i ma→ dopo vencrificaanti (prfiirma?
  • Torità, deriggera evedline, tnt pier proatmetiout) vie atggià orn(es. "mare il daeetitab
```

(Two instances are writing their own output simultaneously. If I press STOP, one of the two stops but the other continues.)

To Reproduce

I don't have a simple sequence of steps to reproduce; it happens randomly when I interrupt the generation.

Expected behavior

Stop should always stop the agent.

Additional context
I'm using a local llama.cpp server:

"providers": {
    "llama": {
      "api": "openai-chat",
      "url": "http://localhost:8080/v1",
      "fetchModels": true
    }
  }
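The symptom (two streams writing at once, and stop cancelling only one of them) is consistent with the client tracking only the most recent generation session while an older stream is still running. Below is a minimal Python sketch of that hypothesis; the `Session` class, token list, and stop handling are all illustrative assumptions, not ECA's actual code.

```python
import threading
import time

class Session:
    """Hypothetical model of one streaming generation (not ECA's actual code)."""
    def __init__(self, name, buffer):
        self.name = name
        self.stop = threading.Event()
        self.thread = threading.Thread(target=self._run, args=(buffer,))

    def _run(self, buffer):
        # Stream tokens into the shared output buffer until stopped.
        for token in ["tok1", "tok2", "tok3"]:
            if self.stop.is_set():
                return
            buffer.append((self.name, token))
            time.sleep(0.01)

buffer = []
first = Session("first", buffer)
second = Session("second", buffer)   # e.g. a duplicated or retried request

# If the stop handler only tracks the most recent session, "stop"
# cancels that one while the older stream keeps writing:
current = second
current.stop.set()

first.thread.start()
second.thread.start()
first.thread.join()
second.thread.join()

names = {name for name, _ in buffer}
print(names)   # only the untracked older session wrote output
```

If something like this is happening, stop would need to cancel every in-flight session for the chat, not just the one currently tracked.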

Labels: bug