
Build, test, and deploy multi-step generative AI workflows visually using a drag-and-drop canvas
Amazon Bedrock Prompt Flows is a fully managed capability within Amazon Bedrock that enables builders to visually create, test, and deploy multi-step generative AI workflows by connecting prompts, foundation models, and AWS services on a drag-and-drop canvas. It eliminates the need to write orchestration code, allowing teams to chain prompt templates, add conditional logic, and integrate data sources into cohesive AI pipelines. Prompt Flows is designed for rapid prototyping and production deployment of complex LLM-powered workflows with built-in versioning and alias management.
Visually orchestrate multi-step generative AI workflows by connecting foundation models, prompts, and AWS services without writing custom orchestration logic.
Visual drag-and-drop canvas
Build flows graphically in the AWS Console without writing orchestration code
Prompt node integration
Directly embed Bedrock Prompt Management templates as nodes within a flow
Foundation Model node
Invoke any Bedrock-supported foundation model as a step in the workflow
Condition node (branching logic)
Add if/else conditional logic to route flow execution based on model outputs or input values
Iterator node
Loop over arrays of inputs, processing each element through downstream nodes
Collector node
Aggregate outputs from iterator loops back into a single array for downstream processing
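The Iterator and Collector nodes work as a fan-out/fan-in pair: the Iterator feeds each array element through the downstream nodes one at a time, and the Collector reassembles the per-element results into a single array. A plain-Python analogy of that semantics (the `summarize` step is a hypothetical stand-in for whatever nodes sit between the two):

```python
def iterator_collector(items, process):
    """Fan each element out through `process` (the downstream nodes),
    then collect the per-element outputs back into one list."""
    collected = []
    for item in items:                   # Iterator node: one pass per element
        collected.append(process(item))  # downstream nodes run per element
    return collected                     # Collector node: aggregated array

# Hypothetical per-element step, standing in for a foundation model node
def summarize(doc):
    return doc[:20]  # placeholder transformation

print(iterator_collector(["alpha document", "beta document"], summarize))
# → ['alpha document', 'beta document']
```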
Lambda function node
Invoke AWS Lambda functions mid-flow for custom business logic or data transformation
Knowledge Base retrieval node
Query Amazon Bedrock Knowledge Bases (RAG) as a step within the flow
Agent node
Invoke a Bedrock Agent as a node within a Prompt Flow for nested agentic behavior
Storage node (S3)
Read from or write to Amazon S3 as a flow step
Flow versioning and aliases
Publish immutable versions and use aliases for deployment management (e.g., prod, staging)
Flow testing in console
Test flows interactively in the Bedrock console before publishing
InvokeFlow API
Invoke flows programmatically via the Bedrock Agents Runtime API (bedrock-agent-runtime) with streaming support
Streaming responses
InvokeFlow supports response streaming for real-time output delivery
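A minimal boto3 sketch of a streaming InvokeFlow call. The flow ID, alias ID, and input node name are placeholders (node names are flow-specific; "FlowInputNode" is an assumption, not a guaranteed default):

```python
def build_flow_inputs(document):
    """Shape the `inputs` payload expected by InvokeFlow: the document is
    delivered to the flow's input node."""
    return [{
        "content": {"document": document},
        "nodeName": "FlowInputNode",   # assumption: your flow's input node name
        "nodeOutputName": "document",
    }]

def invoke_flow_streaming(flow_id, alias_id, document):
    """Yield flow output documents as they stream back."""
    import boto3  # lazy import so the payload helper works without the SDK
    client = boto3.client("bedrock-agent-runtime")
    response = client.invoke_flow(
        flowIdentifier=flow_id,
        flowAliasIdentifier=alias_id,
        inputs=build_flow_inputs(document),
    )
    # responseStream yields events as nodes complete; flowOutputEvent
    # carries an output node's document payload
    for event in response["responseStream"]:
        if "flowOutputEvent" in event:
            yield event["flowOutputEvent"]["content"]["document"]
```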
IAM-based access control
Flows respect IAM policies; each node's permissions are governed by the flow's execution role
Encryption at rest and in transit
Supports AWS KMS customer-managed keys for data encryption
CloudWatch logging and tracing
Flow execution traces and logs available in CloudWatch for debugging and audit
Cross-region inference (limitation)
Prompt Flows invokes models in the same region; cross-region routing requires separate architecture patterns
RAG-augmented multi-step flow
High frequency: Use a Knowledge Base retrieval node to fetch relevant context, pass it to a foundation model node for generation, then optionally route output through a condition node based on confidence or content — the canonical RAG pipeline without custom code
Custom logic injection mid-flow
High frequency: Insert Lambda nodes between model calls to perform data validation, format transformation, database lookups, or business rule enforcement — extends flow capability beyond what prompt engineering alone can achieve
Centralized prompt governance in flows
Medium frequency: Reference versioned prompt templates from Bedrock Prompt Management as nodes in a flow — ensures prompt changes are tracked, versioned, and reusable across multiple flows without duplication
Nested agent orchestration
Medium frequency: Invoke a Bedrock Agent as a node within a Prompt Flow when a workflow step requires autonomous tool use and multi-turn reasoning — combines the visual orchestration of Flows with the agentic power of Agents
Document processing pipeline
Medium frequency: Read documents from S3 as flow input, process through multiple model nodes for extraction/summarization/classification, then write structured results back to S3 — batch document intelligence without custom orchestration code
Serverless AI API endpoint
Medium frequency: Expose a Prompt Flow as a REST API by fronting InvokeFlow calls with API Gateway + Lambda — provides a managed, scalable endpoint for web/mobile applications to trigger AI workflows
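A hedged sketch of the Lambda side of that pattern, assuming an API Gateway proxy integration and `FLOW_ID` / `FLOW_ALIAS_ID` environment variables (both names are illustrative). It collects the flow's output events into one JSON response:

```python
import json
import os

def make_response(outputs):
    """Shape an API Gateway proxy response from the flow outputs."""
    return {"statusCode": 200, "body": json.dumps({"outputs": outputs})}

def lambda_handler(event, context):
    """Forward the request body to InvokeFlow and return aggregated output."""
    import boto3  # lazy import so make_response stays usable without the SDK
    client = boto3.client("bedrock-agent-runtime")
    body = json.loads(event.get("body") or "{}")
    response = client.invoke_flow(
        flowIdentifier=os.environ["FLOW_ID"],
        flowAliasIdentifier=os.environ["FLOW_ALIAS_ID"],
        inputs=[{
            "content": {"document": body.get("input", "")},
            "nodeName": "FlowInputNode",   # assumption: flow-specific node name
            "nodeOutputName": "document",
        }],
    )
    outputs = [
        e["flowOutputEvent"]["content"]["document"]
        for e in response["responseStream"]
        if "flowOutputEvent" in e
    ]
    return make_response(outputs)
```

For real-time delivery to browsers you would stream the events out instead of collecting them, e.g. via Lambda response streaming.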
Prompt Flows is for VISUAL ORCHESTRATION of multi-step workflows — if a question describes 'connecting multiple prompts and services without writing orchestration code' or 'drag-and-drop AI pipeline,' the answer is Prompt Flows, not Agents or Step Functions
You are NOT charged for the Prompt Flows service itself — costs come entirely from underlying resources (tokens, Lambda, Knowledge Bases). If an exam question asks about cost optimization in Prompt Flows, the answer focuses on reducing model token consumption or Lambda duration, NOT on reducing 'flow executions' as a billable unit
When a scenario describes 'visually connecting multiple AI steps without writing orchestration code,' always choose Prompt Flows over Agents, Step Functions, or custom Lambda chains
Prompt Flows has ZERO orchestration cost — you only pay for what each node consumes (tokens, Lambda, Knowledge Base, S3). Cost optimization means reducing token usage, not flow executions.
Flows = deterministic pre-defined path; Agents = autonomous runtime reasoning. This distinction drives almost every Flows vs. Agents question on AIF-C01.
Know the node types cold: Input, Output, Prompt, Foundation Model, Condition, Iterator, Collector, Lambda, Knowledge Base, Agent, Storage (S3). Exam questions may describe a scenario and ask which node type to use — Iterator+Collector is the key pattern for processing arrays/lists
Versioning in Prompt Flows is IMMUTABLE — once you publish a version, it cannot be edited. Use ALIASES to point traffic to specific versions and enable safe deployment strategies (blue/green, canary). This mirrors Lambda versioning behavior exactly.
Prompt Flows uses an IAM EXECUTION ROLE that must have permissions for every AWS service each node calls — if a flow fails with an access error, the fix is updating the flow's execution role, not the caller's IAM policy
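For example, a flow whose nodes invoke a model, a Lambda function, and a Knowledge Base needs an execution role along these lines (a sketch only; all ARNs are placeholders, and you should scope resources down in practice):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": "bedrock:InvokeModel",
      "Resource": "arn:aws:bedrock:us-east-1::foundation-model/*"
    },
    {
      "Effect": "Allow",
      "Action": "lambda:InvokeFunction",
      "Resource": "arn:aws:lambda:us-east-1:111122223333:function:my-flow-step"
    },
    {
      "Effect": "Allow",
      "Action": "bedrock:Retrieve",
      "Resource": "arn:aws:bedrock:us-east-1:111122223333:knowledge-base/*"
    }
  ]
}
```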
When a question asks about DEBUGGING or OBSERVABILITY for Prompt Flows, the answer involves CloudWatch Logs and the execution trace — Prompt Flows emits detailed node-level trace data that shows inputs/outputs at each step, similar to X-Ray traces but native to Bedrock
Prompt Flows supports STREAMING via InvokeFlow — if a question asks how to deliver real-time, token-by-token output from a flow to an end user, streaming InvokeFlow is the answer, not polling or async callbacks
Common Mistake
Amazon Bedrock Prompt Flows and Amazon Bedrock Agents are interchangeable — both orchestrate AI workflows, so either works for any use case
Correct
They serve fundamentally different purposes: Prompt Flows is a VISUAL, deterministic workflow orchestrator for chaining pre-defined steps — you define the exact sequence. Agents is an AUTONOMOUS, dynamic orchestrator that decides at runtime which tools to call and in what order to achieve a goal. Use Flows when the workflow is known; use Agents when the AI must reason about what to do next.
This is the #1 confusion on the AIF-C01 exam. The key differentiator is deterministic vs. autonomous — Flows = you define the path, Agents = the model decides the path
Common Mistake
Prompt Flows charges a per-execution fee in addition to underlying service costs
Correct
There is NO separate orchestration charge for Prompt Flows. You only pay for what the nodes consume: foundation model tokens, Lambda invocations, Knowledge Base queries, S3 operations, etc. The flow orchestration itself is free.
Candidates often assume a managed orchestration service has its own pricing tier (like Step Functions charges per state transition). Bedrock Prompt Flows does not — this affects cost estimation questions
Common Mistake
You need to write Python or JavaScript code to build a Prompt Flow
Correct
Prompt Flows is a LOW-CODE/NO-CODE visual builder. You build flows on a canvas in the AWS Console by dragging and connecting nodes. Code is only needed if you add a Lambda node for custom logic — and even then, the flow orchestration itself requires no code.
The entire value proposition of Prompt Flows is eliminating orchestration code. Questions testing service selection often hinge on whether the solution requires coding expertise
Common Mistake
Published flow versions can be updated in place when you need to fix a prompt or change a node
Correct
Flow versions are IMMUTABLE once published. To make changes, you edit the flow draft, then publish a NEW version. Aliases can then be updated to point to the new version, enabling zero-downtime deployments.
Immutability is a core concept tested in deployment/versioning questions. Candidates who don't know this may choose wrong answers about 'updating a version' vs. 'creating a new version'
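The publish-then-repoint workflow looks roughly like this with the boto3 control-plane client (`bedrock-agent`); IDs and the alias name are placeholders, so treat this as a sketch rather than a definitive recipe:

```python
def routing_config(version):
    """Alias routing payload: point 100% of traffic at one immutable version."""
    return [{"flowVersion": str(version)}]

def publish_and_repoint(flow_id, alias_id, alias_name):
    """Publish the current draft as a new immutable version, then move the
    alias to it — a zero-downtime repoint, mirroring Lambda alias updates."""
    import boto3  # lazy import: routing_config stays usable without the SDK
    client = boto3.client("bedrock-agent")
    version = client.create_flow_version(flowIdentifier=flow_id)["version"]
    client.update_flow_alias(
        flowIdentifier=flow_id,
        aliasIdentifier=alias_id,
        name=alias_name,
        routingConfiguration=routing_config(version),
    )
    return version
```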
Common Mistake
Prompt Flows can only connect to other Bedrock services — it cannot integrate with non-Bedrock AWS services
Correct
Prompt Flows integrates with AWS Lambda, Amazon S3, Amazon Bedrock Knowledge Bases, Amazon Bedrock Agents, and Bedrock foundation models. The Lambda node is the universal integration point — through Lambda, a flow can connect to virtually any AWS service or external API.
Understanding the Lambda node as an extensibility escape hatch is critical for architecture questions about integrating flows with DynamoDB, RDS, third-party APIs, etc.
FLOWS = 'Find, Link, Orchestrate, Without Scripting' — the four things Prompt Flows does for you visually
Node types memory trick — 'I Can Find Lost Kittens And Sleep' = Input, Condition, Foundation-model, Lambda, Knowledge-base, Agent, Storage (output implied)
Versioning rule: 'PUBLISH = PERMANENT' — once published, a version is frozen forever; aliases are the flexible pointer layer on top
Pricing reminder: 'FLOWS are FREE to orchestrate, you pay for what FLOWS through them' — costs live in the nodes, not the canvas
CertAI Tutor · AIF-C01 · 2026-03-07