llama-gpt/components/Chat
Latest commit: 00c6c72270 by Thomas LÉVEIL, 2023-03-28 21:10:47 -06:00
feat: add DEFAULT_MODEL environment variable (#280)
* set the model maxLength setting in the models definition
* set the model tokenLimit setting in the models definition
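The commit introduces a DEFAULT_MODEL environment variable and moves the per-model maxLength and tokenLimit settings into the models definition. A minimal TypeScript sketch of how such a setup might look is shown below; the identifiers, numeric limits, and fallback logic are illustrative assumptions, not the exact code from the commit.

```typescript
// Sketch of a models definition that carries maxLength and tokenLimit per model,
// plus a DEFAULT_MODEL fallback read from the environment.
// Names and values here are assumptions for illustration, not the commit's exact code.

export interface OpenAIModel {
  id: string;
  name: string;
  maxLength: number;  // maximum length of a user message, in characters
  tokenLimit: number; // maximum context size, in tokens
}

export enum OpenAIModelID {
  GPT_3_5 = 'gpt-3.5-turbo',
  GPT_4 = 'gpt-4',
}

// Each model bundles its own limits, so chat components can read them from the
// selected model instead of hard-coding a single global value.
export const OpenAIModels: Record<OpenAIModelID, OpenAIModel> = {
  [OpenAIModelID.GPT_3_5]: {
    id: OpenAIModelID.GPT_3_5,
    name: 'GPT-3.5',
    maxLength: 12000,
    tokenLimit: 4000,
  },
  [OpenAIModelID.GPT_4]: {
    id: OpenAIModelID.GPT_4,
    name: 'GPT-4',
    maxLength: 24000,
    tokenLimit: 8000,
  },
};

// Model used when the client has no saved selection; overridable at deploy time
// through the DEFAULT_MODEL environment variable.
export const DEFAULT_MODEL: string =
  process.env.DEFAULT_MODEL || OpenAIModelID.GPT_3_5;
```

Components touched by the commit (ModelSelect.tsx, SystemPrompt.tsx, ChatInput.tsx) would presumably fall back to DEFAULT_MODEL when a conversation has no stored model, and enforce the selected model's maxLength and tokenLimit on input and prompt size.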
| File | Last commit | Date |
| --- | --- | --- |
| Chat.tsx | feat: add DEFAULT_MODEL environment variable (#280) | 2023-03-28 21:10:47 -06:00 |
| ChatInput.tsx | feat: add DEFAULT_MODEL environment variable (#280) | 2023-03-28 21:10:47 -06:00 |
| ChatLoader.tsx | feat: add in prettier and format code for consistency (#168) | 2023-03-25 23:13:18 -06:00 |
| ChatMessage.tsx | fix: resolve Enter event conflict during CJK IME (#253) | 2023-03-28 02:46:16 -06:00 |
| ErrorMessageDiv.tsx | Prompts (#229) | 2023-03-27 09:38:56 -06:00 |
| ModelSelect.tsx | feat: add DEFAULT_MODEL environment variable (#280) | 2023-03-28 21:10:47 -06:00 |
| PromptList.tsx | make all chat area components tabbable (accessibility) (#246) | 2023-03-28 02:35:57 -06:00 |
| Regenerate.tsx | make all chat area components tabbable (accessibility) (#246) | 2023-03-28 02:35:57 -06:00 |
| SystemPrompt.tsx | feat: add DEFAULT_MODEL environment variable (#280) | 2023-03-28 21:10:47 -06:00 |
| VariableModal.tsx | Prompts (#229) | 2023-03-27 09:38:56 -06:00 |