Commit Graph

17 Commits

Author SHA1 Message Date
Ivan Fioravanti ea1d09244a
Feature request: Adding temperature as parameter (#513)
* Adding temperature as parameter

* NEXT_PUBLIC_ prefix added

* add spacing

---------

Co-authored-by: Ivan Fioravanti <>
Co-authored-by: Mckay Wrigley <mckaywrigley@gmail.com>
2023-04-13 05:09:03 -06:00
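The change above (#513) threads a temperature setting through a NEXT_PUBLIC_-prefixed environment variable so it is available client-side in Next.js. A minimal TypeScript sketch of that wiring, using hypothetical names (DEFAULT_TEMPERATURE, buildChatBody) rather than the actual code from the PR:

```typescript
// Hypothetical constants module.
// NEXT_PUBLIC_ variables are inlined by Next.js at build time, so this
// default is readable in the browser as well as on the server.
export const DEFAULT_TEMPERATURE: number = parseFloat(
  process.env.NEXT_PUBLIC_DEFAULT_TEMPERATURE || '1',
);

// Hypothetical request builder: a per-conversation temperature, if set,
// overrides the environment default before the OpenAI API call.
export function buildChatBody(
  model: string,
  messages: { role: 'system' | 'user' | 'assistant'; content: string }[],
  temperature?: number,
) {
  return { model, messages, temperature: temperature ?? DEFAULT_TEMPERATURE };
}
```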
Mckay Wrigley 6500db9c1c
MAJOR REFACTOR (#494)
* move index to home folder, create state and context files and barrel folder

* Sanity check commit: reducer added to home.tsx, manual QA all working

* WIP: promptBar

* fix missing json parse on folders and prompts

* split context and add promptbar context

* add context to nested prompt components and componentize Folder component

* remove log

* Create buttons folder and componentize sidebar action button

* tidy up prompt handlers

* componentized sidebar

* added back chatbar component to left-side sidebar

* monster commit: componentized the common code between chatbar and promptbar into new component Sidebar and added context to both bars

* add useFetch service

* added prettier import sort to keep imports ordered and easier to identify

* added react query and useFetch to work with RQ

* added apiService, errorService and reactQuery

* add callback and tidy up error service

* refactor chat and child components to useContext

* fix extra calls and bad calls to models endpoint

* minor import cleanup

---------

Co-authored-by: jc.durbin <jc.durbin@ardanis.com>
2023-04-10 21:10:18 -06:00
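The refactor above (#494) moves shared UI state into React context backed by a reducer, which the componentized sidebar, chatbar, and promptbar all consume. A minimal sketch of that pattern under assumed names (HomeContext, homeReducer); it is illustrative, not the actual code from the PR:

```typescript
import { Dispatch, createContext, useReducer } from 'react';

// Assumed shape of the shared state; the real state holds far more fields.
interface HomeState {
  loading: boolean;
  showChatbar: boolean;
  showPromptbar: boolean;
}

// Single generic action: each dispatch updates one field of the state.
type HomeAction = {
  type: 'change';
  field: keyof HomeState;
  value: HomeState[keyof HomeState];
};

function homeReducer(state: HomeState, action: HomeAction): HomeState {
  return { ...state, [action.field]: action.value };
}

// Context exposing both the state and the dispatch to nested components.
export const HomeContext = createContext<{
  state: HomeState;
  dispatch: Dispatch<HomeAction>;
} | null>(null);

// Custom hook used by the page component that owns the provider;
// children read and update state via useContext(HomeContext).
export function useHomeReducer(initial: HomeState) {
  const [state, dispatch] = useReducer(homeReducer, initial);
  return { state, dispatch };
}
```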
Mckay Wrigley a89308d03a revert 2023-04-02 08:02:00 -06:00
Mckay Wrigley 1dc4f86df5 change output limit 2023-04-02 06:59:47 -06:00
Jason Banich b7b6bbaaca
add react-hot-toast and surface OpenAI API errors to users (#328) 2023-04-01 23:05:07 -06:00
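#328 above surfaces API failures to the user instead of failing silently. A minimal sketch of that idea with react-hot-toast, assuming a fetch-based helper and an OpenAI-style error payload; the endpoint path and response shape are assumptions:

```typescript
import toast from 'react-hot-toast';

// Hypothetical helper: POST to the chat endpoint and, on failure, show the
// error message returned by the OpenAI API as a toast.
export async function sendChatRequest(body: unknown): Promise<Response | null> {
  const response = await fetch('/api/chat', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify(body),
  });

  if (!response.ok) {
    const data = await response.json().catch(() => null);
    toast.error(data?.error?.message ?? response.statusText);
    return null;
  }
  return response;
}
```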
Abror Aliboyev 462ca9bb04
fix for openai token limit error (#350) 2023-04-01 22:46:32 -06:00
Thomas LÉVEIL 00c6c72270
feat: add DEFAULT_MODEL environment variable (#280)
* feat: add DEFAULT_MODEL environment variable

* set the model maxLength setting in the models definition

* set the model tokenLimit setting in the models definition
2023-03-28 21:10:47 -06:00
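#280 above makes the default model configurable and moves the maxLength and tokenLimit settings into the model definitions. A minimal sketch of what such a definition could look like; the model IDs and limits shown are assumptions for illustration:

```typescript
// Hypothetical per-model settings: maxLength bounds the prompt textarea in
// characters, tokenLimit reflects the model's context window in tokens.
export interface OpenAIModel {
  id: string;
  maxLength: number;
  tokenLimit: number;
}

export const OpenAIModels: Record<string, OpenAIModel> = {
  'gpt-3.5-turbo': { id: 'gpt-3.5-turbo', maxLength: 12000, tokenLimit: 4096 },
  'gpt-4': { id: 'gpt-4', maxLength: 24000, tokenLimit: 8192 },
};

// DEFAULT_MODEL comes from the environment, falling back to gpt-3.5-turbo.
export const DEFAULT_MODEL: string =
  process.env.DEFAULT_MODEL || 'gpt-3.5-turbo';
```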
Mckay Wrigley 34c79c0d66
Prompts (#229) 2023-03-27 09:38:56 -06:00
Simon Holmes d6973b9ccc
feat: add in prettier and format code for consistency (#168) 2023-03-25 23:13:18 -06:00
Alan P 1a4b4401ee
include prompt in token count (#104) 2023-03-23 15:51:51 -06:00
Mckay Wrigley 0d6ff739a2
add custom system prompt (#39) 2023-03-21 01:39:32 -06:00
Mckay Wrigley 537957d5f5
Token based and model conditional limits (#36)
* use tiktoken for api limit

* model conditional char limits on frontend

* adjust for completion tokens

---------

Co-authored-by: Alan Pogrebinschi <alanpog@gmail.com>
2023-03-20 22:02:24 -06:00
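#36 above replaces flat character limits with token counting via tiktoken, conditioned on the model and adjusted to leave room for the completion; #104 later ensures the prompt itself is counted too. A minimal sketch using @dqbd/tiktoken's encoding_for_model; the function name and the 1000-token completion reserve are assumptions:

```typescript
import { encoding_for_model } from '@dqbd/tiktoken';

interface Message {
  role: 'system' | 'user' | 'assistant';
  content: string;
}

// Keep as many recent messages as fit under the model's token limit,
// counting the system prompt and reserving space for the completion.
export function limitMessagesByTokens(
  systemPrompt: string,
  messages: Message[],
  tokenLimit: number,
  reservedForCompletion = 1000, // assumption: tokens kept free for the reply
): Message[] {
  const encoding = encoding_for_model('gpt-3.5-turbo');
  try {
    let tokenCount = encoding.encode(systemPrompt).length;
    const kept: Message[] = [];

    // Walk backwards so the most recent messages survive truncation.
    for (let i = messages.length - 1; i >= 0; i--) {
      const tokens = encoding.encode(messages[i].content).length;
      if (tokenCount + tokens + reservedForCompletion > tokenLimit) break;
      tokenCount += tokens;
      kept.unshift(messages[i]);
    }
    return kept;
  } finally {
    encoding.free();
  }
}
```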
Mckay Wrigley 7810a3e7dc
Add GPT-4 support (#25)
* mobile ui updates

* fixes sidebar btn

* return if null

* mobile input blur

* handle mobile enter key

* new convo name

* new delete mechanism

* test height

* revert

* change padding

* remove overflow

* check relative

* padding

* done

* retry

* test

* test

* should work now

* test

* test

* more

* max h

* revert

* done
2023-03-20 03:53:00 -06:00
Xiangxuan Liu 7c9e552a5c Correct the improper context being used when it exceeds the limit. 2023-03-19 14:19:43 +08:00
Mckay Wrigley e6449998ef add api key 2023-03-18 22:19:19 -06:00
Mckay Wrigley ce331a1bbd boom 2023-03-15 04:24:09 -06:00
Mckay Wrigley a6503fb498 chatbot-ui starter 2023-03-13 19:21:14 -06:00