feat: call fetchToken in startChat to speed up the first message response #504

Open

jrobinson01 wants to merge 1 commit into googleapis:main from jrobinson01:async-startchat

Conversation

@jrobinson01

In contrast to the Gemini SDK, sending the first message in the Vertex SDK is really slow because the auth token has to be retrieved first. I'm not sure why the token fetch itself is slow, but this PR adds the fetchToken call to the ChatSession.startChat method. This makes startChat slower, but results in a nicer UX (IMO) because the user waits for a chat UI to load/initialize instead of waiting for their first response.

BREAKING CHANGE: startChat is now async in order to await fetching the auth token
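A minimal sketch of the pattern this PR describes, using hypothetical names rather than the actual SDK code: startChat becomes an async factory that awaits the auth token up front, so the first sendMessage call no longer pays the token-retrieval latency.

```typescript
// Hypothetical stand-in for the SDK's auth call; the real one
// contacts an auth server, which is the slow step described above.
async function fetchToken(): Promise<string> {
  return "fake-token";
}

class ChatSession {
  private constructor(private readonly token: string) {}

  // BREAKING: startChat is now async. Previously the token was
  // fetched lazily, delaying the first sendMessage instead.
  static async startChat(): Promise<ChatSession> {
    const token = await fetchToken(); // pay the auth cost here, once
    return new ChatSession(token);
  }

  async sendMessage(msg: string): Promise<string> {
    // Token is already in hand; no auth round-trip on first message.
    return `echo(${msg}) via ${this.token}`;
  }
}
```

Callers that previously did `const chat = model.startChat()` would now need `const chat = await model.startChat()`, which is what makes this a breaking change.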
@product-auto-label bot added the "api: aiplatform" label on Mar 21, 2025