* Update the version of llama_index used to fix transient openai errors
* Update poetry.lock file
* Make `local` mode the default
| Name |
|---|
| chat |
| chunks |
| completions |
| embeddings |
| health |
| ingest |
| __init__.py |
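
The listing above shows the submodules of this API package (chat, chunks, completions, embeddings, health, ingest). Below is a minimal sketch of how router modules like these could be mounted on a FastAPI app; the module layout, router names, and endpoint are assumptions for illustration, not the project's actual wiring.

```python
from fastapi import APIRouter, FastAPI

# Hypothetical router; in the real package each submodule would define its own.
health_router = APIRouter(prefix="/health", tags=["health"])


@health_router.get("/")
def health() -> dict:
    """Simple liveness probe."""
    return {"status": "ok"}


def create_app() -> FastAPI:
    app = FastAPI()
    app.include_router(health_router)
    # Additional routers (chat, chunks, completions, embeddings, ingest)
    # would be included here in the same way.
    return app


app = create_app()
```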