History log of /plugin/aichat/Model/Ollama/ChatModel.php (Results 1 – 4 of 4)
Revision Date Author Comments
# 7be8078e 15-Apr-2025 Andreas Gohr <andi@splitbrain.org>

allow models to have a zero token limit

This allows configuring completely unknown models. For these models no
token limit is known and we simply do not apply any. Instead we trust
that the model will either be large enough to handle our input or at
least throw useful error messages.


# d481c63c 25-Feb-2025 Andreas Gohr <andi@splitbrain.org>

ollama: remove thinking part from deepseek answers


# 4dd0657e 06-Feb-2025 Andreas Gohr <andi@splitbrain.org>

allow setting arbitrary models

We now initialize a model configuration even if we have no info in
model.json, using some default values for the token limits.

Models can implement the loadUnknownModelInfo() method to fetch the
info from an API if such a thing exists. Currently implemented for
Gemini and Ollama.


# 074b7701 22-Apr-2024 Andreas Gohr <andi@splitbrain.org>

added support for groq