private-gpt/private_gpt/settings
icsy7867 e21bf20c10
feat: prompt_style applied to all LLMs + extra LLM params. (#1835)
* Moved prompt_style into the main LLM settings, since all LLMs from llama_index can utilize it. Also added temperature, context window size, max_tokens, and max_new_tokens to the openailike implementation to keep its settings consistent with the other implementations.

* Removed prompt_style from llamacpp entirely

* Fixed settings-local.yaml to include prompt_style in the LLM settings instead of llamacpp.
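Based on the change described above, the relocated setting might look like this in settings-local.yaml (a sketch; the values and the field names other than prompt_style are assumptions, not taken from the actual file):

```yaml
llm:
  mode: llamacpp
  prompt_style: "llama2"   # moved here from the llamacpp section
  temperature: 0.1         # assumed example value
  context_window: 3900     # assumed example value
  max_new_tokens: 256      # assumed example value
```

With prompt_style under the top-level llm section, every backend reads it from one place instead of each implementation defining its own copy.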
2024-04-30 09:53:10 +02:00
__init__.py Next version of PrivateGPT (#1077) 2023-10-19 16:04:35 +02:00
settings.py feat: prompt_style applied to all LLMs + extra LLM params. (#1835) 2024-04-30 09:53:10 +02:00
settings_loader.py fix(tests): load the test settings only when running tests 2024-01-09 12:03:16 +01:00
yaml.py fix: fix pytorch version to avoid wheel bug (#1123) 2023-10-27 20:27:40 +02:00