Prerequisites
- Write a descriptive title.
- Make sure you are able to repro it on the latest version.
- Search the existing issues.
Steps to reproduce
When using PowerShell AI Shell with the gpt-4.1 model, everything works as expected.
However, switching the model to o3 or gpt-5 causes the agent to fail with the following error:
```
ERROR: Agent failed to generate a response: HTTP 400 (invalid_request_error: unsupported_parameter)
Parameter: max_tokens
Unsupported parameter: 'max_tokens' is not supported with this model. Use 'max_completion_tokens' instead.
```
It appears that PowerShell AI Shell still sends the `max_tokens` parameter, which the o3 and gpt-5 models do not accept; they require `max_completion_tokens` instead.
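For context, the failure can be reproduced outside AI Shell whenever the underlying OpenAI .NET SDK puts the limit on the wire as `max_tokens`. The snippet below is a minimal sketch, not AI Shell code; it assumes the SDK 2.x `ChatClient`/`ChatCompletionOptions` surface and an `OPENAI_API_KEY` environment variable, and whether it reproduces depends on which wire name the SDK version serializes `MaxOutputTokenCount` as:

```csharp
using System;
using OpenAI.Chat;

// Minimal repro sketch (not AI Shell code; assumes OpenAI .NET SDK 2.x).
var client = new ChatClient(
    model: "o3",
    apiKey: Environment.GetEnvironmentVariable("OPENAI_API_KEY"));

var options = new ChatCompletionOptions
{
    // On SDK versions that serialize this as "max_tokens", o3/gpt-5 reject the
    // request with HTTP 400 unsupported_parameter, matching the error above.
    MaxOutputTokenCount = 1024
};

ChatCompletion completion = await client.CompleteChatAsync(
    new ChatMessage[] { new UserChatMessage("hello") }, options);

Console.WriteLine(completion.Content[0].Text);
```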
Expected behavior
The shell should either:
- Automatically use `max_completion_tokens` for models that require it, or
- Adapt parameters based on the selected model (see the sketch below).
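One possible shape for that adaptation, purely as a sketch (the helper and the model-name prefix check are assumptions for illustration, not AI Shell's actual code): choose the token-limit field from the configured model name.

```csharp
using System;
using System.Collections.Generic;

internal static class TokenLimitHelper
{
    // Assumption for illustration: o-series and gpt-5 models accept only
    // "max_completion_tokens", while older chat models accept "max_tokens".
    internal static bool RequiresMaxCompletionTokens(string model) =>
        model.StartsWith("o", StringComparison.OrdinalIgnoreCase) ||
        model.StartsWith("gpt-5", StringComparison.OrdinalIgnoreCase);

    // Emits whichever request field the selected model supports.
    internal static Dictionary<string, object> BuildTokenLimit(string model, int limit) =>
        new()
        {
            [RequiresMaxCompletionTokens(model) ? "max_completion_tokens" : "max_tokens"] = limit
        };
}
```

With this, `BuildTokenLimit("gpt-5", 1024)` yields `{"max_completion_tokens": 1024}` while `BuildTokenLimit("gpt-4.1", 1024)` keeps `max_tokens`; the same switch could equally be applied wherever the shell maps its chat options onto the OpenAI request.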
Actual behavior
The request fails with an HTTP 400 error when using o3 or gpt-5.
Error details
```
ERROR: Agent failed to generate a response: HTTP 400 (invalid_request_error: unsupported_parameter)
Parameter: max_tokens
Unsupported parameter: 'max_tokens' is not supported with this model. Use 'max_completion_tokens' instead.
   at OpenAI.ClientPipelineExtensions.ProcessMessageAsync(ClientPipeline pipeline, PipelineMessage message, RequestOptions options)
   at OpenAI.Chat.ChatClient.CompleteChatAsync(BinaryContent content, RequestOptions options)
   at OpenAI.Chat.ChatClient.<>c__DisplayClass10_0.<<CompleteChatStreamingAsync>b__0>d.MoveNext()
--- End of stack trace from previous location ---
   at OpenAI.AsyncSseUpdateCollection`1.GetRawPagesAsync()+MoveNext()
   at OpenAI.AsyncSseUpdateCollection`1.GetRawPagesAsync()+System.Threading.Tasks.Sources.IValueTaskSource<System.Boolean>.GetResult()
   at System.ClientModel.AsyncCollectionResult`1.GetAsyncEnumerator(CancellationToken cancellationToken)+MoveNext()
   at System.ClientModel.AsyncCollectionResult`1.GetAsyncEnumerator(CancellationToken cancellationToken)+MoveNext()
   at System.ClientModel.AsyncCollectionResult`1.GetAsyncEnumerator(CancellationToken cancellationToken)+System.Threading.Tasks.Sources.IValueTaskSource<System.Boolean>.GetResult()
   at Microsoft.Extensions.AI.OpenAIChatClient.FromOpenAIStreamingChatCompletionAsync(IAsyncEnumerable`1 updates, CancellationToken cancellationToken)+MoveNext()
   at Microsoft.Extensions.AI.OpenAIChatClient.FromOpenAIStreamingChatCompletionAsync(IAsyncEnumerable`1 updates, CancellationToken cancellationToken)+MoveNext()
   at Microsoft.Extensions.AI.OpenAIChatClient.FromOpenAIStreamingChatCompletionAsync(IAsyncEnumerable`1 updates, CancellationToken cancellationToken)+System.Threading.Tasks.Sources.IValueTaskSource<System.Boolean>.GetResult()
   at Microsoft.Extensions.AI.FunctionInvokingChatClient.GetStreamingResponseAsync(IEnumerable`1 messages, ChatOptions options, CancellationToken cancellationToken)+MoveNext()
   at Microsoft.Extensions.AI.FunctionInvokingChatClient.GetStreamingResponseAsync(IEnumerable`1 messages, ChatOptions options, CancellationToken cancellationToken)+MoveNext()
   at Microsoft.Extensions.AI.FunctionInvokingChatClient.GetStreamingResponseAsync(IEnumerable`1 messages, ChatOptions options, CancellationToken cancellationToken)+System.Threading.Tasks.Sources.IValueTaskSource<System.Boolean>.GetResult()
   at AIShell.OpenAI.Agent.ChatService.GetStreamingChatResponseAsync(String input, IShell shell, CancellationToken cancellationToken)
   at Spectre.Console.Status.<>c__DisplayClass17_0`1.<<StartAsync>b__0>d.MoveNext() in /_/src/Spectre.Console/Live/Status/Status.cs:line 120
--- End of stack trace from previous location ---
   at Spectre.Console.Progress.<>c__DisplayClass32_0`1.<<StartAsync>b__0>d.MoveNext() in /_/src/Spectre.Console/Live/Progress/Progress.cs:line 138
--- End of stack trace from previous location ---
   at Spectre.Console.Internal.DefaultExclusivityMode.RunAsync[T](Func`1 func) in /_/src/Spectre.Console/Internal/DefaultExclusivityMode.cs:line 40
   at Spectre.Console.Progress.StartAsync[T](Func`2 action) in /_/src/Spectre.Console/Live/Progress/Progress.cs:line 121
   at Spectre.Console.Status.StartAsync[T](String status, Func`2 func) in /_/src/Spectre.Console/Live/Status/Status.cs:line 117
   at AIShell.Kernel.Host.RunWithSpinnerAsync[T](Func`1 func, String status, Nullable`1 spinnerKind)
   at AIShell.OpenAI.Agent.OpenAIAgent.ChatAsync(String input, IShell shell)
   at AIShell.Kernel.Shell.RunREPLAsync()
```
Environment data
- PowerShell AI Shell
- Model: o3 / gpt-5
Version
1.0.0-preview.8