| File | Last commit message | Last commit date |
| --- | --- | --- |
| chat.ts | Initialize the project to use self-hosted Llama model | 2023-08-15 20:53:37 +07:00 |
| data.ts | Initialize the project to use self-hosted Llama model | 2023-08-15 20:53:37 +07:00 |
| env.ts | Initialize the project to use self-hosted Llama model | 2023-08-15 20:53:37 +07:00 |
| error.ts | Initialize the project to use self-hosted Llama model | 2023-08-15 20:53:37 +07:00 |
| export.ts | Initialize the project to use self-hosted Llama model | 2023-08-15 20:53:37 +07:00 |
| folder.ts | Initialize the project to use self-hosted Llama model | 2023-08-15 20:53:37 +07:00 |
| google.ts | Initialize the project to use self-hosted Llama model | 2023-08-15 20:53:37 +07:00 |
| index.ts | Initialize the project to use self-hosted Llama model | 2023-08-15 20:53:37 +07:00 |
| openai.ts | Add support for 13B and 70B models, workflow, readme | 2023-08-15 23:11:39 +07:00 |
| plugin.ts | Initialize the project to use self-hosted Llama model | 2023-08-15 20:53:37 +07:00 |
| prompt.ts | Initialize the project to use self-hosted Llama model | 2023-08-15 20:53:37 +07:00 |
| settings.ts | Initialize the project to use self-hosted Llama model | 2023-08-15 20:53:37 +07:00 |
| storage.ts | Initialize the project to use self-hosted Llama model | 2023-08-15 20:53:37 +07:00 |