private-gpt/private_gpt/components/llm
Latest commit 9c192ddd73 by Gianni Acquisto: Added max_new_tokens as a config option to llm yaml block (#1317)
* added max_new_tokens as a configuration option to the llm block in settings

* Update fern/docs/pages/manual/settings.mdx

Co-authored-by: lopagela <lpglm@orange.fr>

* Update private_gpt/settings/settings.py

Add default value for max_new_tokens = 256

Co-authored-by: lopagela <lpglm@orange.fr>

* Addressed location of docs comment

* reformatting from running 'make check'

* remove default config value from settings.yaml

---------

Co-authored-by: lopagela <lpglm@orange.fr>
2023-11-26 19:17:29 +01:00
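
In concrete terms, the change above makes the llm block in the settings accept a max_new_tokens option, with a default of 256 defined in private_gpt/settings/settings.py (the explicit value was dropped from settings.yaml, so only the code-side default remains). Below is a minimal sketch of how such a setting might be declared, assuming a pydantic-based settings model as PrivateGPT uses; the LLMSettings class and the surrounding field shown here are illustrative, not the exact file contents.

    # Illustrative sketch only, not the exact contents of settings.py.
    from pydantic import BaseModel, Field

    class LLMSettings(BaseModel):
        mode: str = "local"  # placeholder for the existing llm fields
        max_new_tokens: int = Field(
            256,
            description="Maximum number of tokens the LLM is allowed to generate.",
        )

    # The llm block in settings.yaml can then override the default, e.g.:
    #
    #   llm:
    #     mode: local
    #     max_new_tokens: 512
    #
    # Omitting the key falls back to the default of 256 declared above.

On the component side, llm_component.py presumably threads this value through to the underlying LLM constructor (a max_new_tokens-style keyword argument), which is what makes the generation length configurable per deployment.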
custom             Feature/sagemaker embedding (#1161)                                  2023-11-05 16:16:49 +01:00
__init__.py        Next version of PrivateGPT (#1077)                                   2023-10-19 16:04:35 +02:00
llm_component.py   Added max_new_tokens as a config option to llm yaml block (#1317)    2023-11-26 19:17:29 +01:00
prompt_helper.py   Multi language support - fern debug (#1307)                          2023-11-25 14:34:23 +01:00