Basically, Azure Prompt Flow is your easy button for building apps with large language models (LLMs). It lets you quickly connect the LLM, your specific instructions (prompts), and any extra code into a single workflow. This makes creating, testing, tweaking, and launching your AI applications far faster and less complicated.
🛍️ Generative AI Order Flow: Inventory Perspective
Here is a simple order-processing use case that illustrates how an AI Agent, built within Azure AI Foundry, can handle an order request, interact with an inventory system, and provide a smart, context-aware response. At the core is the Prompt Flow, orchestrating the LLM's intelligence with external tools and data.
Step-by-Step Processing: Order Flow with Generative AI
Now let's take a deeper look at which Microsoft Foundry components could be used under the hood to automate each of these agents and integrate them with Microsoft services.
Here, Prompt Flow acts as the crucial middle layer, allowing the LLM (the Brain) to utilize the Tool (the Hands) to interact with the real-world business system before generating a final, informed response.
🎯 LLM vs. Prompt Flow: The Communication Mechanism
In Azure AI Foundry's Prompt Flow, the interaction between the intelligent model and the deterministic workflow is managed through Function Calling (or Tool Calling).
1. The LLM's Role: The Decision Maker (Step 3)
The LLM's job is not to execute code, but to determine the best course of action based on the user's request.
Pre-Step: The Prompt Flow gives the LLM (e.g., GPT-4o) a list of available tools in the System Prompt. This list includes the function signature for the Inventory Check Tool, specifying its name (inventory_check), description, and required parameters (item_id, quantity); a sketch of such a schema appears after the example output below.
Action (Step 3): When the user asks, "Do you have 15 units of the 'Delta-9' widget?", the LLM realizes it cannot answer this question from its general training data.
The "Hint": Instead of generating a natural language answer, the LLM generates a structured output: a JSON object that represents the function call. This is the signal to the Orchestrator.
Example LLM Output (The Hint/Signal):
JSON
{
  "function": "inventory_check",
  "arguments": {
    "item_id": "Delta-9 widget",
    "quantity": 15
  }
}
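To make the Pre-Step concrete, here is a minimal sketch of what that tool list could look like, assuming the OpenAI-style function-calling schema. The field values are illustrative, and the exact shape your flow passes to the deployment may differ.
Python
# A sketch of the tool definition handed to the model in the Pre-Step.
# Assumes the OpenAI-style function-calling schema; descriptions are illustrative.
inventory_tools = [
    {
        "type": "function",
        "function": {
            "name": "inventory_check",
            "description": "Check current stock levels for an item in the inventory system.",
            "parameters": {
                "type": "object",
                "properties": {
                    "item_id": {
                        "type": "string",
                        "description": "Identifier or name of the item, e.g. 'Delta-9 widget'.",
                    },
                    "quantity": {
                        "type": "integer",
                        "description": "Number of units the user is asking about.",
                    },
                },
                "required": ["item_id", "quantity"],
            },
        },
    }
]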
2. The Prompt Flow Orchestrator's Role: The Action Agent (Step 4)
The Prompt Flow is the deterministic, reliable workflow engine built to interpret and execute the LLM's instructions.
Action (Step 4): The Prompt Flow Orchestrator receives the LLM's output. It sees that the output is not a final text response, but the structured JSON object requesting a function call.
Execution: The Orchestrator immediately intercepts this JSON, extracts the function name (inventory_check) and the parameters, and then triggers the actual execution of the pre-defined Inventory Check Tool, which might be a Python script, a Logic App, or an API call (see the sketch after this list).
Why the Orchestrator? This separation of concerns ensures safety, determinism, and control. The LLM is a probabilistic engine and should not be trusted to enforce security or complex workflow logic on its own. The Prompt Flow provides the necessary control plane (auditing, retries, security context, and routing) to safely interact with the sensitive, deterministic systems of record (like SAP).
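For illustration, here is a minimal sketch of the Inventory Check Tool as a Prompt Flow Python tool. The tool decorator comes from the promptflow SDK (the import path varies by SDK version), and query_inventory_system is a hypothetical stand-in for your real backend call (REST endpoint, SAP connector, Logic App trigger):
Python
from promptflow.core import tool  # older SDK versions: from promptflow import tool

def query_inventory_system(item_id: str) -> int:
    """Hypothetical stand-in for the real system-of-record lookup."""
    stock = {"Delta-9 widget": 42}  # illustrative data only
    return stock.get(item_id, 0)

@tool
def inventory_check(item_id: str, quantity: int) -> dict:
    """Check stock and return a structured result the flow can feed back to the LLM."""
    available = query_inventory_system(item_id)
    return {
        "item_id": item_id,
        "requested": quantity,
        "available": available,
        "in_stock": available >= quantity,
    }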
In case you are wondering whether the lookup itself executes inside the LLM or within the flow orchestrator, let's recognize the two distinct steps involved:
LLM Decides: The LLM uses its intelligence to decide what needs to be done ("I need to call the inventory tool with these arguments").
Prompt Flow Does: The Prompt Flow uses its deterministic logic to take that LLM decision and execute the underlying system integration securely and reliably.
The LLM is the intent router, and the Prompt Flow is the execution engine.
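To round this out, here is a hedged sketch of that decide/do split in plain Python. In a real flow, the DAG and its conditional activation handle this routing, so treat the hand-rolled router and the TOOL_REGISTRY name as illustrative only; it reuses the inventory_check tool sketched above.
Python
import json
from typing import Any, Callable, Dict

# Hypothetical registry mapping tool names the LLM may emit to real implementations.
TOOL_REGISTRY: Dict[str, Callable[..., Any]] = {"inventory_check": inventory_check}

def handle_llm_output(llm_output: str) -> str:
    """Route the model's output: execute a tool call, or pass text through unchanged."""
    try:
        call = json.loads(llm_output)
    except json.JSONDecodeError:
        return llm_output  # natural-language answer: the LLM already finished the job

    if not isinstance(call, dict):
        return llm_output  # valid JSON, but not the function-call shape shown earlier

    func = TOOL_REGISTRY.get(call.get("function", ""))
    if func is None:
        return llm_output  # JSON, but not a recognized tool call

    # Deterministic execution happens here, under the flow's control;
    # this is also where auditing, retries, and the security context belong.
    result = func(**call.get("arguments", {}))
    return json.dumps(result)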