This repository has been archived by the owner on Oct 23, 2023. It is now read-only.

Problems with context of conversation using ChatGPT API #22

Answered by om4csaba
MarkOdinSon asked this question in Q&A
Hi there,

You're correct that the Chat web app and the API work differently, particularly regarding token limits. The web app manages a larger context for you, while the API gives you full control over the chat history. To stay within the token limit, you have a couple of options:

- Exclude older messages from the prompt data: truncate or remove some messages so the request fits within the token limit.
- Summarize earlier messages to compress them and reduce the token count.
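The truncation option above can be sketched roughly as follows. The `count_tokens` heuristic here is a placeholder assumption; real code should use an actual tokenizer (e.g. OpenAI's tiktoken) to count tokens accurately:

```python
def count_tokens(message):
    # Placeholder heuristic: ~1 token per 4 characters, plus per-message overhead.
    # Replace with a real tokenizer (e.g. tiktoken) for accurate counts.
    return len(message["content"]) // 4 + 4

def trim_history(messages, max_tokens):
    """Drop the oldest non-system messages until the total fits max_tokens."""
    system = [m for m in messages if m["role"] == "system"]
    rest = [m for m in messages if m["role"] != "system"]
    while rest and sum(count_tokens(m) for m in system + rest) > max_tokens:
        rest.pop(0)  # discard the oldest user/assistant message first
    return system + rest

history = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "First question " * 50},
    {"role": "assistant", "content": "First answer " * 50},
    {"role": "user", "content": "Latest question?"},
]
trimmed = trim_history(history, max_tokens=100)
```

This keeps the system message pinned and drops the oldest turns first; the summarization option would instead replace the dropped turns with a single condensed message.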

For general advice and token-counting examples, refer to OpenAI's openai-cookbook repository.

We leave it to application developers to decide how to handle historical data, but I recommend exploring https://community.openai.com for more relevant discussions.

Answer selected by om4james