* Extract optional dependencies
* Separate local mode into llms-llama-cpp and embeddings-huggingface for clarity
* Support Ollama embeddings
* Upgrade to llamaindex 0.10.14. Remove legacy use of ServiceContext in ContextChatEngine (see the sketch below)
* Fix vector retriever filters
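The llamaindex 0.10.x changes above (no more `ServiceContext`, filters applied at the retriever level) can be illustrated with a minimal sketch. This is not the project's actual code: it uses mock LLM/embedding classes so it runs without any model backend, and it assumes `ContextChatEngine.from_defaults` accepts an `llm` keyword in llama-index 0.10.x; in the real setup the components would be Ollama- or llama.cpp-backed.

```python
# Hypothetical sketch of llama-index >= 0.10 usage without ServiceContext,
# plus metadata filters on the vector retriever.
from llama_index.core import Document, MockEmbedding, Settings, VectorStoreIndex
from llama_index.core.chat_engine import ContextChatEngine
from llama_index.core.llms import MockLLM
from llama_index.core.vector_stores import ExactMatchFilter, MetadataFilters

# Global defaults replace the removed ServiceContext (mock backends for the sketch).
Settings.llm = MockLLM()
Settings.embed_model = MockEmbedding(embed_dim=8)

index = VectorStoreIndex.from_documents(
    [
        Document(text="Ollama can also serve embedding models.", metadata={"doc_id": "a"}),
        Document(text="llama.cpp runs models locally.", metadata={"doc_id": "b"}),
    ]
)

# Retriever-level metadata filters: only nodes with doc_id == "a" are considered.
retriever = index.as_retriever(
    similarity_top_k=2,
    filters=MetadataFilters(filters=[ExactMatchFilter(key="doc_id", value="a")]),
)

# ContextChatEngine built without service_context; the llm keyword is assumed
# to be supported by from_defaults in llama-index 0.10.x.
chat_engine = ContextChatEngine.from_defaults(retriever=retriever, llm=Settings.llm)
print(chat_engine.chat("Which backend serves embeddings?"))
```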
| Name |
|---|
| actions/install_dependencies |
| docker.yml |
| fern-check.yml |
| preview-docs.yml |
| publish-docs.yml |
| release-please.yml |
| stale.yml |
| tests.yml |