REST API Reference

AI Commander (LLM Providers)

Endpoints in the AI Commander (LLM Providers) group of the /mltk/** REST API.

These endpoints configure LLM provider connections (OpenAI, Azure OpenAI, Anthropic, Bedrock, Groq, Gemini, Ollama, Splunk Hosted Models) and test connectivity to them.

Endpoint                                               Methods             Source file
/mltk/aicommander                                      GET, POST, DELETE   bin/rest_handlers/aicommander.py
/mltk/aicommander/{service}/{connection_name}/{model}  GET, DELETE         bin/rest_handlers/aicommander.py
/mltk/aicommander_metadata                             GET, POST           bin/rest_handlers/aicommander_metadata.py
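A minimal sketch of calling the listing endpoint above. The /mltk/aicommander path comes from the table; the /services/ prefix, management port 8089, bearer-token auth, and the output_mode=json parameter are standard Splunk REST conventions assumed to apply here, not confirmed by this page.

```python
# Sketch only: host, port, and token are placeholders; the /services/ prefix
# and output_mode=json are assumed Splunk REST conventions.
import json
import urllib.request
from typing import Optional

def aicommander_url(host: str, port: int,
                    service: Optional[str] = None,
                    connection: Optional[str] = None,
                    model: Optional[str] = None) -> str:
    """Build the URL for /mltk/aicommander, optionally scoped to a single
    {service}/{connection_name}/{model} resource (sections 2.2 and 2.4)."""
    url = f"https://{host}:{port}/services/mltk/aicommander"
    if service and connection and model:
        url += f"/{service}/{connection}/{model}"
    return url + "?output_mode=json"

def list_connections(host: str, port: int, token: str) -> dict:
    """GET the configured LLM provider connections (section 2.1)."""
    req = urllib.request.Request(
        aicommander_url(host, port),
        headers={"Authorization": f"Bearer {token}"},
    )
    with urllib.request.urlopen(req) as resp:  # needs a reachable splunkd
        return json.load(resp)
```

The same URL builder covers the scoped GET and DELETE paths by passing all three of service, connection, and model.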

Backing store. Configuration is persisted in the KV store collection mltk_ai_commander_collection; API keys and other secrets are held in the Splunk passwords store (storage/passwords).
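For locating the stored secrets, storage/passwords is Splunk's standard secrets endpoint. A sketch of the URL involved; the Splunk_ML_Toolkit app namespace and the nobody owner are assumptions about where this app writes its entries, not stated on this page.

```python
# Sketch only: the app namespace "Splunk_ML_Toolkit" and owner "nobody"
# are assumed; storage/passwords itself is a standard Splunk endpoint.
def passwords_url(host: str, port: int, app: str = "Splunk_ML_Toolkit") -> str:
    return (f"https://{host}:{port}/servicesNS/nobody/{app}"
            "/storage/passwords?output_mode=json")
```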

Capabilities. Write paths require both the edit_ai_commander_config and list_ai_commander_config capabilities; read paths accept either one.

Endpoints

Section  Method  Endpoint                                               Page
2.1      GET     /mltk/aicommander                                      get-aicommander
2.2      GET     /mltk/aicommander/{service}/{connection_name}/{model}  get-aicommander-service-connection-name-model
2.3      POST    /mltk/aicommander                                      post-aicommander
2.4      DELETE  /mltk/aicommander/{service}/{connection_name}/{model}  delete-aicommander-service-connection-name-model
2.5      GET     /mltk/aicommander_metadata                             get-aicommander-metadata
2.6      POST    /mltk/aicommander_metadata                             post-aicommander-metadata
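As an example of the scoped delete (section 2.4), the request below fills the {service}/{connection_name}/{model} path segments. Host, port, and token are placeholders, and the /services/ prefix and bearer-token auth are assumed Splunk conventions.

```python
# Sketch only: builds (but does not send) a DELETE request for one
# configured model, following the path pattern in the tables above.
import urllib.request

def build_delete_request(host: str, port: int, token: str,
                         service: str, connection: str,
                         model: str) -> urllib.request.Request:
    url = (f"https://{host}:{port}/services/mltk/aicommander/"
           f"{service}/{connection}/{model}")
    return urllib.request.Request(
        url, method="DELETE",
        headers={"Authorization": f"Bearer {token}"},
    )
```

Passing the result to urllib.request.urlopen would then issue the call against a reachable splunkd; the caller needs the write capabilities noted above.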
