I get an error when I ask too many questions on one webpage:
“Error while calling OpenAI: This model’s maximum context length is 4097 tokens, however you requested 4145 tokens (2145 in your prompt; 2000 for the completion). Please reduce your prompt; or completion length.”
Usually I have to refresh and load a new webpage. Is there a better way to solve this?
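One common workaround, instead of refreshing the page, is to trim the oldest turns of the conversation before each request so the prompt plus the reserved completion length stays under the model's context limit. The sketch below is a minimal illustration of that idea; the `trim_history` helper and the 4-characters-per-token estimate are assumptions for the example (a real tokenizer such as tiktoken gives exact counts):

```python
# Sketch: keep the prompt under the model's context limit by dropping the
# oldest messages before each request. The ~4-chars-per-token figure is a
# rough heuristic, not an exact count.

def estimate_tokens(text: str) -> int:
    # Rough assumption: about 4 characters per token for English text.
    return max(1, len(text) // 4)

def trim_history(messages: list, max_prompt_tokens: int) -> list:
    """Drop the oldest messages until the estimated total fits the budget.
    Always keeps the most recent message."""
    trimmed = list(messages)
    while (len(trimmed) > 1 and
           sum(estimate_tokens(m["content"]) for m in trimmed) > max_prompt_tokens):
        trimmed.pop(0)  # discard the oldest turn first
    return trimmed

# Example: a 4097-token model with 2000 tokens reserved for the completion
# leaves roughly 2097 tokens of budget for the prompt.
history = [
    {"role": "user", "content": "x" * 10000},  # a long earlier turn
    {"role": "user", "content": "latest question"},
]
short = trim_history(history, 2097)
```

Alternatively, lowering the requested completion length (the 2000-token `max_tokens` value in the error message) also frees room for a longer prompt, at the cost of shorter answers.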
