* Extract optional dependencies
* Separate local mode into llms-llama-cpp and embeddings-huggingface for clarity
* Support Ollama embeddings
* Upgrade to llamaindex 0.10.14. Remove legacy use of ServiceContext in ContextChatEngine
* Fix vector retriever filters
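The new Ollama embeddings support boils down to calling Ollama's embeddings endpoint. As a stdlib-only sketch of the request/response shape (the server URL and `nomic-embed-text` model name are illustrative; any running Ollama server and pulled embedding model work):

```python
import json

# Illustrative endpoint: a local Ollama server's embeddings route.
OLLAMA_URL = "http://localhost:11434/api/embeddings"

def embeddings_payload(model: str, prompt: str) -> str:
    """Build the JSON body sent to Ollama's embeddings endpoint."""
    return json.dumps({"model": model, "prompt": prompt})

body = embeddings_payload("nomic-embed-text", "hello world")

# The server replies with a JSON object holding one vector, e.g.:
sample_response = '{"embedding": [0.1, -0.2, 0.3]}'
vector = json.loads(sample_response)["embedding"]
print(len(vector))  # a real model returns hundreds of dimensions
```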
* Add an OpenAI-compatible LLM mode
This mode behaves the same as the openai mode, except that it allows setting custom models not
supported by OpenAI. It can be used with any tool that serves models through an OpenAI-compatible API.
Implements #1424
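Concretely, "OpenAI compatible" means the server exposes the same REST routes and JSON shapes as OpenAI's API. A stdlib-only sketch of a chat request against such a server (the base URL and model name are placeholders, not values from this change):

```python
import json

# Placeholder base URL: point this at any OpenAI-compatible server
# (e.g. vLLM, LM Studio, llama.cpp's HTTP server).
API_BASE = "http://localhost:8000/v1"

def chat_request(model: str, messages: list) -> tuple:
    """Build the URL and JSON body for an OpenAI-style chat completion call."""
    url = f"{API_BASE}/chat/completions"
    body = json.dumps({"model": model, "messages": messages})
    return url, body

url, body = chat_request("my-custom-model",
                         [{"role": "user", "content": "Hello"}])
print(url)
```

Because the route and payload match OpenAI's, any OpenAI client can be pointed at such a server by overriding its base URL.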
* Refactor documentation architecture
Split into several tabs and sections
* Fix Fern's docs.yml after PR review
Thank you Danny!
Co-authored-by: dannysheridan <danny@buildwithfern.com>
* Re-add quickstart in the overview tab
It went missing after a refactoring of the doc architecture
* Documentation writing
* Adapt Makefile to fern documentation
* Do not create overlapping page names in fern documentation
This was causing 500 errors. Thank you to @dsinghvi for the troubleshooting and the help!
* Add a readme to help understand how the Fern documentation works and how to add new pages
* Rework the welcome view
Redirect users directly to the installation guide, with links for people who are not familiar with browsing documentation.
* Simplify the quickstart guide
* PR feedback on installation guide
A lot of refactoring can still be done there
* PR feedback on ingestion
* PR feedback on ingestion splitting
* Rename section on LLM
* Fix missing word in list of LLMs
---------
Co-authored-by: dannysheridan <danny@buildwithfern.com>