llama-gpt/components/Chat
Latest commit 537957d5f5 by Mckay Wrigley (co-authored by Alan Pogrebinschi <alanpog@gmail.com>), 2023-03-20 22:02:24 -06:00

Token based and model conditional limits (#36)

* use tiktoken for api limit
* model conditional char limits on frontend
* adjust for completion tokens
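The commit above moves from character-based to token-based limits, reserves room for completion tokens, and varies the cap per model. A minimal sketch of that idea, using a rough 4-characters-per-token heuristic in place of a real tiktoken encoder; the `MODEL_TOKEN_LIMITS` values, `trimMessages`, and `approxTokenCount` are hypothetical names for illustration, not the app's actual code:

```typescript
// Hypothetical per-model context limits in tokens (illustrative values).
const MODEL_TOKEN_LIMITS: Record<string, number> = {
  "gpt-3.5-turbo": 4096,
  "gpt-4": 8192,
};

// Rough stand-in for a tiktoken encoder: ~4 characters per token on average.
function approxTokenCount(text: string): number {
  return Math.ceil(text.length / 4);
}

// Reserve space for the completion, then drop oldest messages until the
// remaining prompt fits the model's context window.
function trimMessages(
  messages: string[],
  model: string,
  completionTokens = 1000,
): string[] {
  const limit = (MODEL_TOKEN_LIMITS[model] ?? 4096) - completionTokens;
  const kept: string[] = [];
  let used = 0;
  // Walk from newest to oldest so the most recent context survives.
  for (let i = messages.length - 1; i >= 0; i--) {
    const cost = approxTokenCount(messages[i]);
    if (used + cost > limit) break;
    used += cost;
    kept.unshift(messages[i]);
  }
  return kept;
}
```

A production version would encode each message with tiktoken for an exact count instead of the character heuristic, since tokenization varies with content and model.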
Chat.tsx         Token based and model conditional limits (#36)  2023-03-20 22:02:24 -06:00
ChatInput.tsx    Token based and model conditional limits (#36)  2023-03-20 22:02:24 -06:00
ChatLoader.tsx   fix loader position                             2023-03-18 22:21:11 -06:00
ChatMessage.tsx  User message should not render as Markdown      2023-03-19 19:38:54 +08:00
ModelSelect.tsx  Add GPT-4 support (#25)                         2023-03-20 03:53:00 -06:00
Regenerate.tsx   error handling (#27)                            2023-03-20 07:17:58 -06:00