POST /mltk/aicommander_metadata
Live-test a provider configuration without saving it.
Capabilities: edit_ai_commander_config, list_ai_commander_config
Description. Live-test a provider configuration without saving it. The handler issues a probe request against the provider using the supplied or previously-saved credentials and reports latency, HTTP status, and a snippet of the response.
Body (JSON). Same shape as POST /mltk/aicommander (above). The handler uses servicesettings.access_token.value; if that is absent, it falls back to the saved secret keyed by connection_name + id.
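The token-resolution order above can be sketched as follows. This is a minimal illustration, not the handler's actual code: the function and argument names are hypothetical, and the real logic lives in bin/ai_commander/ai_commander_util.py.

```python
def resolve_access_token(body, saved_secrets):
    """Prefer the token supplied in the request body; otherwise fall back
    to the secret saved under connection_name + id (hypothetical key scheme
    mirroring the description above)."""
    token = (
        body.get("servicesettings", {})
            .get("access_token", {})
            .get("value")
    )
    if token:
        return token
    conn = body.get("modelsettings", {}).get("connection_name", {}).get("value", "")
    conf_id = body.get("id", "")
    return saved_secrets.get(conn + conf_id)
```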
Response (200 on success).
{
"message": "Successfully connected to OpenAI. Model 'gpt-4o-mini' is accessible.",
"status": "success",
"provider": "openai",
"model": "gpt-4o-mini",
"response_content": "Hello! How can I assist you today?",
"response_time": 1.23,
"http_status_code": 200
}
Response (failure).
{
"message": "Authentication failed. Please check your OpenAI API key.",
"status": "fail",
"provider": "openai",
"model": "gpt-4o-mini",
"http_status_code": 401,
"response_time": 0.45
}
Example.
curl -sk -u "$SPLUNK_USER:$SPLUNK_PASSWORD" -X POST \
  -H "Content-Type: application/json" \
  --data @- "$SPLUNK_HOST/servicesNS/nobody/Splunk_ML_Toolkit/mltk/aicommander_metadata?output_mode=json" <<'JSON'
{
"service": "OpenAI",
"model": "gpt-4o-mini",
"servicesettings": {
"endpoint": { "value": "https://api.openai.com/v1/chat/completions" },
"access_token": { "value": "sk-..." }
},
"modelsettings": {
"max_tokens": { "value": 50 },
"connection_name":{ "value": "default" }
}
}
JSON
Prerequisites by provider.
| Provider | Required fields |
|---|---|
| OpenAI | endpoint, access_token |
| Azure OpenAI | endpoint, access_token, azure_api_version |
| Anthropic | endpoint, access_token |
| Bedrock | region, aws_access_key_id, aws_access_token, role_arn |
| Splunk Hosted Models | Valid SCS token, configured in mlspl.conf |
| Ollama | endpoint (token typically not required) |
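A client can pre-check the table's requirements before calling the endpoint. The dictionary below restates the table; the provider keys and the helper are illustrative assumptions (the authoritative validation is server-side, per the source files listed below), and the Splunk Hosted Models row is omitted because its prerequisite lives in mlspl.conf rather than in servicesettings.

```python
# Table-required servicesettings fields per provider (keys are assumed
# identifiers; they may not match the handler's internal provider names).
REQUIRED_FIELDS = {
    "openai": ["endpoint", "access_token"],
    "azure_openai": ["endpoint", "access_token", "azure_api_version"],
    "anthropic": ["endpoint", "access_token"],
    "bedrock": ["region", "aws_access_key_id", "aws_access_token", "role_arn"],
    "ollama": ["endpoint"],
}

def missing_fields(provider, servicesettings):
    """Return the table-required fields with no non-empty value in
    servicesettings (each field is a {"value": ...} wrapper, as in the
    request body above)."""
    required = REQUIRED_FIELDS.get(provider, [])
    return [f for f in required
            if not servicesettings.get(f, {}).get("value")]
```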
Source. bin/rest_handlers/aicommander.py, bin/rest_handlers/aicommander_metadata.py, bin/ai_commander/ai_commander_util.py, bin/ai_commander/llm_factory.py.