* Extract optional dependencies
* Separate local mode into `llms-llama-cpp` and `embeddings-huggingface` for clarity
* Support Ollama embeddings
* Upgrade to llamaindex 0.10.14; remove legacy use of `ServiceContext` in `ContextChatEngine`
* Fix vector retriever filters