private-gpt/private_gpt/components/llm
Latest commit: icsy7867, e21bf20c10
feat: prompt_style applied to all LLMs + extra LLM params. (#1835)
* Moved prompt_style to the main LLM settings, since all LLMs from llama_index can utilize it. I also added temperature, context window size, max_tokens, and max_new_tokens to the openailike implementation to help ensure its settings are consistent with the other implementations.

* Removed prompt_style from llamacpp entirely

* Fixed settings-local.yaml to include prompt_style in the LLM settings instead of llamacpp.
2024-04-30 09:53:10 +02:00
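
The relocation described in the commit might look like this in settings-local.yaml. This is a hedged sketch based only on the commit description: prompt_style now sits under the top-level llm section rather than under llamacpp, and the exact key names and values shown here are assumptions.

```yaml
# Sketch (assumed keys/values): prompt_style configured in the shared
# `llm:` section so every llama_index-backed LLM picks it up.
llm:
  prompt_style: "llama2"   # previously lived under the llamacpp section
  temperature: 0.1         # extra LLM params mentioned in the commit
  context_window: 3900
  max_new_tokens: 256

llamacpp:
  # prompt_style removed from here entirely
```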
custom            | fix(llm): special tokens and leading space (#1831)                 | 2024-04-04 14:37:29 +02:00
__init__.py       | Next version of PrivateGPT (#1077)                                 | 2023-10-19 16:04:35 +02:00
llm_component.py  | feat: prompt_style applied to all LLMs + extra LLM params. (#1835) | 2024-04-30 09:53:10 +02:00
prompt_helper.py  | feat: Upgrade to LlamaIndex to 0.10 (#1663)                        | 2024-03-06 17:51:30 +01:00