* Added max_new_tokens as a configuration option to the llm block in settings (a sketch of the change follows the list below)
* Update fern/docs/pages/manual/settings.mdx
Co-authored-by: lopagela <lpglm@orange.fr>
* Update private_gpt/settings/settings.py
Add default value for max_new_tokens = 256
Co-authored-by: lopagela <lpglm@orange.fr>
* Addressed review feedback on the location of the docs comment
* Reformatted code after running 'make check'
* Removed the default config value from settings.yaml, since the default now lives in settings.py
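
A minimal sketch of the settings.py side of this change, assuming the llm block is modeled with a Pydantic BaseModel; the class name LLMSettings and the description wording are illustrative, not copied verbatim from the repository:

```python
# Illustrative sketch only: field name and default follow this PR,
# class/description names are assumptions.
from pydantic import BaseModel, Field


class LLMSettings(BaseModel):
    max_new_tokens: int = Field(
        256,
        description=(
            "Maximum number of tokens the LLM is allowed to generate "
            "in a single completion. Defaults to 256."
        ),
    )


if __name__ == "__main__":
    # No override (e.g. no llm.max_new_tokens key in settings.yaml):
    # the field falls back to the default of 256.
    print(LLMSettings().max_new_tokens)  # 256
    # An explicit value, as it would arrive from the YAML llm: block, wins.
    print(LLMSettings(max_new_tokens=512).max_new_tokens)  # 512
```

With the default declared on the Pydantic field, an explicit llm.max_new_tokens entry in settings.yaml only needs to be present when overriding it, which is why the hard-coded value could be dropped from settings.yaml.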
---------
Co-authored-by: lopagela <lpglm@orange.fr>