Commit 1083a08d authored by Grießhaber Daniel

remove max_tokens parameter from create_completion

parent b4f9fd21
Related merge requests: !2 remove is_chat argument, !1 Refactor models
@@ -75,7 +75,6 @@ class LLMModel(ABC):
     ) -> tuple[str, ModelUsage]:
         if chat is None:
             chat = self.chat
-        max_tokens = kwargs.pop("max_tokens", self.options.max_tokens)
         # create prompt
         prompt = prompt_prefix + prompt + prompt_suffix + prompt_appendix
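For context, the removed line used the common `kwargs.pop` pattern: a per-call keyword argument overrides a configured default, and popping it keeps the remaining kwargs clean for forwarding elsewhere. A minimal sketch of that pattern as it appeared before this commit; the `Options` class, its default value, and the method body here are illustrative assumptions, not the project's actual definitions:

```python
from dataclasses import dataclass


@dataclass
class Options:
    # Hypothetical stand-in for the model's configured options.
    max_tokens: int = 256


class LLMModel:
    def __init__(self, options: Options) -> None:
        self.options = options

    def create_completion(self, prompt: str, **kwargs) -> str:
        # The pattern from the removed line: pop "max_tokens" from the
        # per-call kwargs, falling back to the configured default.
        max_tokens = kwargs.pop("max_tokens", self.options.max_tokens)
        # Illustrative return value only; the real method builds a prompt
        # and queries the model.
        return f"completion for {prompt!r} (max_tokens={max_tokens})"


model = LLMModel(Options())
print(model.create_completion("hello"))                 # default: 256
print(model.create_completion("hello", max_tokens=32))  # per-call override
```

After this commit, `create_completion` no longer accepts the override, so `max_tokens` is governed solely by the model's configured options.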