OAIClient (OpenAICompatible) Parameter Guide (v4)
This page explains how parameter settings actually work in Agently v4, based on source behavior in the OpenAI-compatible requester.
`OpenAICompatible`, `OpenAI`, and `OAIClient` are aliases of the same model requester plugin in v4.
1. Where to set parameters
| Layer | API | Typical keys | Scope |
|---|---|---|---|
| Global / agent settings | `Agently.set_settings("OpenAICompatible", {...})` | `base_url`, `model`, `request_options`, `stream` | Defaults for subsequent requests |
| Per-request override | `agent.options({...})` | `temperature`, `top_p`, `max_tokens`, `tools` | Current request only |
Both sources eventually become request body fields.
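How the two layers combine can be sketched with plain dicts. This is an illustrative model only, not Agently's actual merge code; the key names mirror the examples on this page:

```python
# Illustrative sketch: plugin-level request_options and per-request options
# both end up as fields of the same request body.
plugin_settings = {
    "model": "gpt-4o-mini",
    "request_options": {"temperature": 0.2, "top_p": 0.9},
}
per_request_options = {"temperature": 0.7, "max_tokens": 300}

# Per-request options win over request_options for overlapping keys.
request_body = {
    "model": plugin_settings["model"],
    **plugin_settings["request_options"],
    **per_request_options,
}
print(request_body)
# {'model': 'gpt-4o-mini', 'temperature': 0.7, 'top_p': 0.9, 'max_tokens': 300}
```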
2. Effective precedence in source logic
In `OpenAICompatible.generate_request_data()`:

- `model` is force-written from plugin settings: `plugins.ModelRequester.OpenAICompatible.model` > `default_model[model_type]` (so `agent.options({"model": ...})` is overwritten).
- `stream` is force-written from plugin settings: an explicit `plugins...stream`, or the defaults `chat`/`completions` = `true`, `embeddings` = `false` (so `agent.options({"stream": ...})` is overwritten).
- Other request options: `agent.options({...})` overrides `request_options`.
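The precedence rules above can be summarized as a small function. This is a simplified model of the described behavior, not the actual source:

```python
# Simplified model of the precedence described above (not Agently's real code).
def effective_request(plugin_settings, request_options, per_request):
    body = dict(request_options)   # plugin-level request_options first
    body.update(per_request)       # agent.options({...}) overrides them
    # model and stream are force-written from plugin settings last:
    body["model"] = plugin_settings["model"]
    body["stream"] = plugin_settings.get("stream", True)  # chat default: true
    return body

body = effective_request(
    {"model": "gpt-4o-mini"},
    {"temperature": 0.2},
    {"temperature": 0.7, "model": "ignored", "stream": False},
)
print(body["model"], body["stream"], body["temperature"])
# gpt-4o-mini True 0.7
```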
3. Recommended setup for `temperature`/`top_p`
3.1 Global defaults
```python
from agently import Agently

Agently.set_settings("OpenAICompatible", {
    "base_url": "https://api.openai.com/v1",
    "api_key": "YOUR_API_KEY",
    "model": "gpt-4o-mini",
    "request_options": {
        "temperature": 0.2,
        "top_p": 0.9,
        "max_tokens": 800
    }
})
```

3.2 Per-request overrides
```python
agent = Agently.create_agent()
result = (
    agent
    .input("Explain RAG in one paragraph")
    .options({
        "temperature": 0.7,
        "max_tokens": 300
    })
    .start()
)
```

4. Two different tools mechanisms
4.1 Agently tooling flow (`@agent.tool_func` + `agent.use_tools`)
This is Agently's own orchestration: tool judging, tool execution, and injection of the results into `action_results`.
```python
from agently import Agently

agent = Agently.create_agent()

@agent.tool_func
def add(a: int, b: int) -> int:
    return a + b

agent.use_tools(add)
print(agent.input("Use tool to calculate 12+34").start())
```

4.2 Provider-native OpenAI tools (pass-through)
Put raw `tools`/`tool_choice` in `request_options` or `agent.options()`:
```python
agent.options({
    "tools": [
        {
            "type": "function",
            "function": {
                "name": "get_weather",
                "description": "Get current weather",
                "parameters": {
                    "type": "object",
                    "properties": {
                        "city": {"type": "string"}
                    },
                    "required": ["city"]
                }
            }
        }
    ],
    "tool_choice": "auto"
})
```

These two approaches can coexist, but they are not the same mechanism.
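With pass-through tools, executing any returned tool calls is your responsibility (Agently's orchestration does not run them). A minimal sketch of dispatching an OpenAI-style `tool_calls` entry, with a hypothetical local `get_weather` handler:

```python
import json

# Hypothetical local handler matching the tool schema above.
def get_weather(city: str) -> str:
    return f"Sunny in {city}"

HANDLERS = {"get_weather": get_weather}

# Shape of a tool call as returned by OpenAI-compatible chat completions.
tool_call = {
    "id": "call_1",
    "type": "function",
    "function": {"name": "get_weather", "arguments": '{"city": "Paris"}'},
}

fn = tool_call["function"]
result = HANDLERS[fn["name"]](**json.loads(fn["arguments"]))

# Feed the result back as a "tool" role message in the follow-up request.
tool_message = {"role": "tool", "tool_call_id": tool_call["id"], "content": result}
print(tool_message["content"])
# Sunny in Paris
```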
5. URL, model type, and message shaping
- `full_url` overrides `base_url + path_mapping[model_type]`.
- `model_type=chat` sends `messages`.
- `model_type=completions` sends `prompt`.
- `model_type=embeddings` sends `input` and defaults to non-streaming.
- `strict_role_orders` affects role normalization.
- Attachment prompts auto-enable `rich_content`.
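The URL selection can be sketched like this. Note that the `path_mapping` values below are assumptions modeled on typical OpenAI-compatible endpoints, not read from Agently's source:

```python
# Illustrative sketch of full_url vs base_url + path_mapping[model_type].
# These path values are assumptions, not Agently's actual mapping.
PATH_MAPPING = {
    "chat": "/chat/completions",
    "completions": "/completions",
    "embeddings": "/embeddings",
}

def resolve_url(settings, model_type):
    if settings.get("full_url"):   # full_url wins outright
        return settings["full_url"]
    return settings["base_url"].rstrip("/") + PATH_MAPPING[model_type]

print(resolve_url({"base_url": "https://api.openai.com/v1"}, "chat"))
# https://api.openai.com/v1/chat/completions
print(resolve_url({"base_url": "x", "full_url": "https://example.com/custom"}, "chat"))
# https://example.com/custom
```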
6. Auth choices
Supported ways:

- `api_key`
- `auth` (`api_key` / `headers` / `body`)
```python
Agently.set_settings("OpenAICompatible", {
    "base_url": "https://api.example.com/v1",
    "auth": {
        "headers": {
            "Authorization": "Bearer xxx",
            "X-Project": "demo"
        }
    }
})
```

7. Fast troubleshooting
- `temperature` not working: use `request_options` or `agent.options()` instead of root-level `options`.
- `stream` not working: the plugin-level `stream` overrides the request-level value.
- `model` not working: `model` is force-written at the plugin level.
- `tools` not working: confirm whether you want Agently tool orchestration or provider-native tools.
- Endpoint errors: try `full_url` first.
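For the first item, the common mistake is putting sampling parameters at the root of the settings dict instead of under `request_options`. An illustrative sanity check (not part of Agently):

```python
# Illustrative check: sampling params belong under "request_options",
# not at the root of the settings dict.
def misplaced_sampling_keys(settings):
    sampling = {"temperature", "top_p", "max_tokens"}
    return sorted(sampling & set(settings))  # keys wrongly placed at root

bad = {"base_url": "https://api.example.com/v1", "temperature": 0.2}
good = {"base_url": "https://api.example.com/v1",
        "request_options": {"temperature": 0.2}}
print(misplaced_sampling_keys(bad))   # ['temperature']
print(misplaced_sampling_keys(good))  # []
```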
8. Debugging
```python
from agently import Agently

Agently.set_settings("debug", True)
# or:
# Agently.set_settings("runtime.show_model_logs", True)
```

You will see request-stage fields like `request_url`, `request_options`, and `stream`.
9. Source references
- `agently/builtins/plugins/ModelRequester/OpenAICompatible.py`
- `agently/builtins/plugins/PromptGenerator/AgentlyPromptGenerator.py`
- `agently/builtins/agent_extensions/ToolExtension.py`
- `agently/core/Agent.py`
Related docs
- OpenAI model setup: /en/models/openai
- Model settings overview: /en/model-settings
- Full settings reference: /en/settings