* Moved `prompt_style` to the main LLM settings, since every LLM from llama_index can use it. Also added `temperature`, context window size, `max_tokens`, and `max_new_tokens` to the openailike implementation so its settings stay consistent with the other implementations.
* Removed `prompt_style` from `llamacpp` entirely.
* Fixed `settings-local.yaml` to set `prompt_style` in the LLM settings instead of under `llamacpp` (see the sketch below).
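
A minimal sketch of the resulting `settings-local.yaml` layout under these changes; apart from the placement of `prompt_style`, the specific keys and values shown are illustrative, not taken from this PR:

```yaml
# settings-local.yaml (excerpt) -- values are hypothetical examples
llm:
  mode: llamacpp
  prompt_style: "mistral"   # moved here from the llamacpp section
  temperature: 0.1          # illustrative value
  context_window: 3900      # illustrative value
  max_new_tokens: 512       # illustrative value

llamacpp:
  # prompt_style no longer lives in this section
  llm_hf_repo_id: TheBloke/Mistral-7B-Instruct-v0.2-GGUF   # hypothetical model
```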