# Step LLM (:llm)
## Quick Answer
The :llm step makes model calls via an :llm-provider connection. Use this reference for the :llm step schema, prompt/message patterns, templates, and model-call configuration.
## Canonical Shape
Core fields:
| Field | Type | Required | Notes |
|---|---|---|---|
| :connection | keyword/string | Recommended | Slot or connection id |
| :messages | vector | Yes* | Explicit chat messages |
| :prompt | string/map | Yes* | Prompt shorthand |
| :system | string | No | System prompt shorthand |
| :input | map | No | Canonical input envelope |
| :template | keyword | No | :llm-prompt template id |
| :data | map | No | Template data |
| :model | string | No | Model override |
| :provider | keyword/string | No | Provider override |
| :temperature, :top-p, :stop | scalar | No | Generation controls |
| :max-tokens | int | No | Response token cap |
| :output / :response-format | map/keyword | No | Structured output config |
| :json-schema | map | No | Structured JSON schema |
| :tools | map/vector | No | Agentic tools config |
| :openai | map | No | OpenAI Responses-specific options (:responses) |
| :available-steps | vector | No | Auto-tool step set |
| :max-iterations | int | No | Agentic loop bound |
| :auth | map | No | Explicit auth if not using connection |
* Provide :messages, :prompt/:system, or :input.
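For calls that don't need a template, the two input shorthands look like this. A minimal sketch: the :ai connection slot, the model, and the {:role ... :content ...} message-map shape are assumptions, since the table above only specifies :messages as a vector:

```clojure
;; Explicit chat messages (assumed message-map shape):
'(flow/step :llm :classify
  {:connection :ai
   :messages [{:role :system :content "You are a strict classifier."}
              {:role :user :content "Classify this ticket: 'refund request'"}]})

;; Equivalent :prompt/:system shorthand:
'(flow/step :llm :classify
  {:connection :ai
   :system "You are a strict classifier."
   :prompt "Classify this ticket: 'refund request'"})
```

The two styles are mutually exclusive on a single step.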
## Limits And Behavior
- Use either :messages or :prompt/:system.
- Prefer templates for long prompts.
- :tools belongs on the step config, not inside templates.
- For most production flows, bind a reusable :llm-provider connection via :requires.
- :max-tokens policy:
  - platform-managed keys are capped by the platform limit
  - user-provided keys are not platform-clamped (provider/account limits still apply)
- The OpenAI hosted shell can be configured via :openai {:responses ...}.
- Phase 1 limits:
  - :openai {:responses {:transport :websocket}} is not supported.
  - :openai {:responses {:shell {:environment {:type :local}}}} is not supported.
  - OpenAI Responses-only options require direct OpenAI Responses routing (not Azure/custom Chat-compatible endpoints).
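The :requires binding recommended above might look like the following sketch. The exact shape of a :requires entry is an assumption; the schema table only documents :connection as a slot or connection id:

```clojure
;; Hypothetical flow-definition excerpt binding an :llm-provider slot:
;; :requires [{:slot :ai :type :llm-provider}]

;; Steps then reference the slot instead of a hard-coded connection id:
'(flow/step :llm :draft
  {:connection :ai   ;; resolves the :ai slot bound via :requires
   :prompt "Draft a polite follow-up email."})
```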
## Canonical Example
```clojure
;; In the flow definition:
;; :templates [{:id :summary
;;              :type :llm-prompt
;;              :system "You are concise."
;;              :prompt "Summarize in 3 bullets:\n{{text}}"}]
;; :functions [{:id :llm-input
;;              :language :clojure
;;              :code "(fn [input] {:text (:text input)})"}]
'(let [prepared (flow/step :function :prepare-llm-input
                  {:ref :llm-input
                   :input (flow/input)})
       result (flow/step :llm :summarize
                {:connection :ai
                 :model "gpt-4o-mini"
                 :template :summary
                 :data prepared
                 :output {:format :json}
                 :tools {:mode :propose
                         :allowed ["http" "db" "function"]}
                 :max-iterations 5})]
   result)
```
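When the response must match a fixed shape, :output and :json-schema can be combined. A hedged sketch assuming :json-schema accepts a standard JSON Schema map; the schema content and step name here are illustrative:

```clojure
'(flow/step :llm :extract-invoice
  {:connection :ai
   :prompt "Extract the invoice number and total from this email."
   :output {:format :json}
   ;; Illustrative JSON Schema map (assumed standard JSON Schema keywords):
   :json-schema {:type "object"
                 :properties {:invoice-number {:type "string"}
                              :total {:type "number"}}
                 :required ["invoice-number" "total"]}})
```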
## OpenAI Responses Hosted Shell (Phase 1)
```clojure
'(flow/step :llm :support-agent
  {:connection :ai
   :model "gpt-5.2"
   :prompt "Analyze this support email and draft a reply."
   :openai {:responses {:shell {:environment {:type :container_auto
                                              :network-policy {:type :allowlist
                                                               :allowed-domains ["gmail.googleapis.com"]}}}
                        :tool-choice "required"
                        :store false}}})
```