# Output Artifacts (Final Output Viewers)
## Quick Answer
Use this guide to shape flow final outputs for UI rendering (markdown, media, and groups) using stable viewer envelopes.
Flows always produce a final output: the value returned by the :flow form.
The Breyta UI shows this final output as a user-facing artifact (separate from debug inspection):
- On the run page, **Run data -> Artifacts** opens a dedicated artifact sidepeek.
- The canonical deep-link remains the run Output page.
- **Request** remains a debug entrypoint and is intentionally separate from artifacts.
This document describes how flow authors can shape the final output for good presentation.
## Where users see output
For each run, output can be accessed from two user-facing surfaces:
| Surface | Purpose |
|---|---|
| Run page sidepeek (Artifacts) | Primary, quick in-context output inspection. |
| Output route (`/:workspace-id/runs/:flow-slug/:run-id/output`) | Canonical full-page output view and shareable deep-link. |
When a run has no output yet:
- Running/pending runs show **Output not available yet**.
- Terminal runs show **No output captured**.
## The viewer envelope (recommended)
Return an envelope map with these namespaced keys:
| Key | Meaning |
|---|---|
| `:breyta.viewer/kind` | Viewer type to render (allowlisted). |
| `:breyta.viewer/value` | Value payload for that viewer. |
| `:breyta.viewer/options` | Optional viewer config (title, alt text, etc.). |
Example: Markdown report

```clojure
{:breyta.viewer/kind :markdown
 :breyta.viewer/options {:title "Summary"}
 :breyta.viewer/value "# Report\n\nHello."}
```
Example: image (typically a Breyta-generated signed URL)

```clojure
{:breyta.viewer/kind :image
 :breyta.viewer/options {:title "Screenshot" :alt "Screenshot"}
 :breyta.viewer/value "https://example.com/image.png"}
```
Example: audio/video (typically a Breyta-generated signed URL)

```clojure
{:breyta.viewer/kind :audio
 :breyta.viewer/options {:title "Audio"}
 :breyta.viewer/value "https://example.com/audio.wav"}
```

```clojure
{:breyta.viewer/kind :video
 :breyta.viewer/options {:title "Video"}
 :breyta.viewer/value "https://example.com/video.mp4"}
```
## Breyta-managed media (GCS, no public URL required)
In Breyta, images/audio/video are usually stored inside the Breyta system (blob storage, backed by GCS). The browser still needs a `src` URL that it can fetch, but that URL does not need to be public — it can be a time-limited signed URL generated by Breyta.
Recommended pattern: persist the media as a blob and return the persisted blob result (Breyta will mint/refresh a signed URL when rendering the run Output page):
```clojure
(let [download (flow/step :http :download-video
                          {:connection :api
                           :path "/video.mp4"
                           :method :get
                           :persist {:type :blob
                                     :tier :ephemeral
                                     :content-type "video/mp4"}})]
  {:breyta.viewer/kind :video
   :breyta.viewer/options {:title "Video"}
   :breyta.viewer/value download})
```
Tip: you can also return the persisted blob result directly and let the UI infer the viewer (it uses `:content-type` and fetches a signed URL as needed):
```clojure
(let [download (flow/step :http :download-audio
                          {:connection :api
                           :path "/audio.wav"
                           :persist {:type :blob
                                     :tier :ephemeral
                                     :content-type "audio/wav"}})]
  download)
```
For HTTP-downloaded media, prefer `:tier :ephemeral` on the `:http` step unless the artifact is intentionally durable and should live like a retained file beyond the immediate workflow.
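For contrast, a durable download can simply omit `:tier`, falling back to the retained default noted later in this guide. A minimal sketch (the `:api` connection and `/report.pdf` path are placeholder assumptions):

```clojure
;; Sketch of a durable download: omitting :tier leaves the blob on the
;; retained default, so the file outlives the immediate workflow.
;; The :api connection and /report.pdf path are placeholder assumptions.
(flow/step :http :download-report
           {:connection :api
            :path "/report.pdf"
            :method :get
            :persist {:type :blob
                      :content-type "application/pdf"}})
```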
Notes:
| Note | Implication |
|---|---|
| Signed URLs expire. | UI refreshes signed URLs at render time. |
| Need long-lived download links. | Generate fresh signed URL via new run or dedicated download flow. |
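The "dedicated download flow" mentioned in the second note can be as small as re-persisting the media and returning the blob result, so the UI mints a fresh signed URL each time the flow is run. A hedged sketch (the connection and path are placeholder assumptions):

```clojure
;; Sketch: a minimal "download" flow. Returning the persisted blob result
;; lets the UI mint a fresh signed URL whenever the Output page renders.
;; The :api connection and /video.mp4 path are placeholder assumptions.
(let [blob (flow/step :http :refetch-video
                      {:connection :api
                       :path "/video.mp4"
                       :method :get
                       :persist {:type :blob
                                 :tier :ephemeral
                                 :content-type "video/mp4"}})]
  blob)
```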
## Multi-part output (group, recommended)
Use `:group` when you want to return multiple artifacts in one run. In v1 this is the recommended pattern for multi-artifact output:
```clojure
{:breyta.viewer/kind :group
 :breyta.viewer/items
 [{:breyta.viewer/kind :markdown
   :breyta.viewer/options {:title "Summary"}
   :breyta.viewer/value "# Hello"}
  {:breyta.viewer/kind :image
   :breyta.viewer/options {:title "Image"}
   :breyta.viewer/value "https://example.com/image.png"}
  {:breyta.viewer/kind :raw
   :breyta.viewer/options {:title "Raw"}
   :breyta.viewer/value {:ok true :data [1 2 3]}}]}
```
## Inference (optional, no envelope)
If you don’t use an envelope, the UI may infer a media viewer from common shapes:
```clojure
{:url "https://example.com/file.png" :content-type "image/png"}
{:signed-url "https://example.com/file.wav" :content-type "audio/wav"}
```
If inference doesn’t match what you want, wrap the value in an explicit envelope.
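For example, to force the image viewer rather than relying on shape inference, wrapping the URL is a one-line change:

```clojure
;; Explicit envelope: the UI renders this with the image viewer regardless
;; of what inference would have picked for the bare URL string.
{:breyta.viewer/kind :image
 :breyta.viewer/options {:title "File" :alt "File"}
 :breyta.viewer/value "https://example.com/file.png"}
```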
## Supported viewers (currently)
| Viewer | Use |
|---|---|
| `:raw` | Fallback for arbitrary structured output. |
| `:text` | Plain text content. |
| `:markdown` | Rich text rendering. |
| `:image` | Image URL/blob rendering. |
| `:audio` | Audio URL/blob rendering. |
| `:video` | Video URL/blob rendering. |
| `:group` | Multi-part output in one envelope. |
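The `:text` viewer has no example elsewhere in this guide; it follows the same envelope shape as the others. A minimal sketch (the payload is illustrative):

```clojure
;; Minimal :text envelope — plain text content, rendered as-is
;; rather than parsed as markdown.
{:breyta.viewer/kind :text
 :breyta.viewer/options {:title "Log excerpt"}
 :breyta.viewer/value "build ok\n0 warnings"}
```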
## JSON compatibility
If the final output is JSON (string keys), the UI also recognizes:
| JSON key | Meaning |
|---|---|
| `"breyta.viewer/kind"` | Viewer type. |
| `"breyta.viewer/value"` | Viewer payload value. |
| `"breyta.viewer/options"` | Optional viewer config. |
| `"breyta.viewer/items"` | Multi-part items for `:group`. |
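A JSON final output equivalent to the markdown envelope earlier in this guide might look like the following sketch (assuming the viewer kind is given as a plain string, per the key mapping above):

```json
{
  "breyta.viewer/kind": "markdown",
  "breyta.viewer/options": {"title": "Summary"},
  "breyta.viewer/value": "# Report\n\nHello."
}
```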
## Guidance
| Guidance | Why |
|---|---|
| Prefer explicit envelopes for end-user-facing flows. | Produces predictable rendering behavior. |
| Keep outputs reasonably sized. | UI truncates large raw outputs by default. |
| Prefer URLs/resource refs for media over huge inline strings. | Better performance and reliability; Data URIs suit only small demos. |
## Related
## Generating images from AI APIs (base64 responses)
Many image generation APIs (OpenAI, Google Imagen, Stability AI, etc.) return images as base64-encoded strings rather than URLs. You cannot pass a base64 string directly as `:breyta.viewer/value` — the UI needs either a URL or a Breyta blob reference.

For APIs that can exceed the 256 KB inline result limit, do not rely on the old inline-first pattern (HTTP -> function reads `:body` directly). Start by persisting the HTTP step, then use `:load` in the downstream function step to hydrate the persisted response before decoding the base64.
The working pattern is three steps:
- Call and persist the image API response — HTTP step with `:persist {:type :blob}`
- Load, decode, and persist the binary image — Function step with `:input {:resp resp}` and `:load [:resp]`
- Return a viewer envelope — use the persisted image blob as the viewer value
```clojure
(let [;; Step 1: Call and persist the image API response
      resp (flow/step :http :generate-image
                      {:type :http
                       :connection :image-api
                       :method :post
                       :path "/images/generations"
                       :timeout 120
                       :json {"model" "gpt-image-1.5"
                              "prompt" prompt
                              "size" "1024x1024"
                              "quality" "low"
                              "output_format" "jpeg"}
                       :persist {:type :blob
                                 :tier :ephemeral
                                 :filename "image-response.json"}})
      ;; Step 2: Load the persisted HTTP response, decode base64 → binary bytes, persist as blob
      ;; breyta.sandbox/base64-decode-bytes must be called with full namespace
      img (flow/step :function :save-image
                     {:type :function
                      :input {:resp resp}
                      :load [:resp]
                      :persist {:type :blob
                                :filename "image.jpeg"
                                :content-type "image/jpeg"}
                      :code '(fn [{:keys [resp]}]
                               (-> resp
                                   :body
                                   :data
                                   first
                                   :b64_json
                                   breyta.sandbox/base64-decode-bytes))})]
  ;; Step 3: Return the persisted image blob as the viewer value
  {:breyta.viewer/kind :image
   :breyta.viewer/options {:title "Generated image"}
   :breyta.viewer/value img})
```
Use `:tier :ephemeral` on the HTTP response persist when that response is just a temporary handoff. The derived image blob persisted from the function step uses the retained default today.
For multiple images, use :group:
```clojure
{:breyta.viewer/kind :group
 :breyta.viewer/items [{:breyta.viewer/kind :image
                        :breyta.viewer/value landscape-img
                        :breyta.viewer/options {:title "Landscape"}}
                       {:breyta.viewer/kind :image
                        :breyta.viewer/value portrait-img
                        :breyta.viewer/options {:title "Portrait"}}]}
```
### Why not return the base64 string directly as the blob?
Storing a base64 string with `:content-type "image/jpeg"` creates a text blob, and the UI will not render it as an image. The `breyta.sandbox/base64-decode-bytes` call in the function step converts it to actual binary bytes first.
### Inline result size limit
Base64 image data is typically 100–300 KB per image, and gpt-image-1 / gpt-image-1.5 JPEG responses at 1024x1024 often exceed the 256 KB inline limit on a single image. Persist the HTTP step first when response size is uncertain or routinely large, then hydrate it with `:load` in the function step.
See also: `breyta.sandbox` helpers in the Step Function reference.
### API response formats — common gotchas
Different image APIs return data differently. Make sure you're reading from the right path:
| API | Base64 response path | Notes |
|---|---|---|
| OpenAI gpt-image-1 / gpt-image-1.5 | `(-> resp :body :data first :b64_json)` | Always base64. `output_format` must be `"jpeg"`, `"png"`, or `"webp"` — not `"b64_json"` (DALL-E syntax). |
| OpenAI DALL-E 3 | `(-> resp :body :data first :b64_json)` | Only when `response_format: "b64_json"`. Default returns a URL — skip the decode step. |
| Google Imagen (Vertex AI) | `(-> resp :body :predictions first :bytesBase64Encoded)` | Content type in `(-> resp :body :predictions first :mimeType)`. |
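Adapting Step 2 of the example above for Google Imagen is mostly a matter of swapping the response path per the table. A sketch (the step name and `resp` binding are assumptions carried over from the earlier example, and `:content-type` should follow the response's `mimeType` when it varies):

```clojure
;; Sketch: decoding an Imagen (Vertex AI) response instead of OpenAI.
;; Only the response path and output metadata change versus Step 2 above;
;; resp is assumed to be the persisted HTTP step result from Step 1.
(flow/step :function :save-imagen-image
           {:type :function
            :input {:resp resp}
            :load [:resp]
            :persist {:type :blob
                      :filename "image.png"
                      :content-type "image/png"}
            :code '(fn [{:keys [resp]}]
                     (-> resp
                         :body
                         :predictions
                         first
                         :bytesBase64Encoded
                         breyta.sandbox/base64-decode-bytes))})
```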