To create a Large Language Model (LLM) application with prompt flow, you need to understand prompt flow’s core components.
Understand a flow
Prompt flow is a feature within Azure Machine Learning that allows you to author flows. Flows are executable workflows that often consist of three parts:
- Inputs: Represent data passed into the flow, which can be of different data types such as strings, integers, or booleans.
- Nodes: Represent tools that perform data processing, task execution, or algorithmic operations.
- Outputs: Represent the data produced by the flow.

Similar to a pipeline, a flow can consist of multiple nodes that can use the flow’s inputs or any output generated by another node. You can add a node to a flow by choosing one of the available types of tools.
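The relationship between inputs, nodes, and outputs can be sketched in plain Python. This is illustrative only; real flows are authored in the prompt flow editor, and the node names below (`extract_keywords`, `build_summary`) are hypothetical:

```python
# Illustrative sketch of a flow: inputs feed nodes, a node can consume
# another node's output, and the final result becomes the flow output.
# Node names are hypothetical, not part of prompt flow itself.

def extract_keywords(text: str) -> list[str]:
    # Node 1: consumes the flow input "text".
    return [word for word in text.split() if len(word) > 4]

def build_summary(text: str, keywords: list[str]) -> str:
    # Node 2: consumes the flow input "text" AND node 1's output.
    return f"{len(text.split())} words, keywords: {', '.join(keywords)}"

# Inputs: data passed into the flow (a string, in this case).
flow_inputs = {"text": "Prompt flow links executable nodes into a workflow"}

# Nodes run in dependency order, like a pipeline.
keywords = extract_keywords(flow_inputs["text"])
summary = build_summary(flow_inputs["text"], keywords)

# Outputs: data produced by the flow.
flow_outputs = {"summary": summary}
print(flow_outputs["summary"])
```

Note how `build_summary` depends on both a flow input and another node's output, which is exactly the pipeline-like wiring described above.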
Explore the tools available in prompt flow
Three common tools are:
- LLM tool: Enables custom prompt creation utilizing Large Language Models.
- Python tool: Allows the execution of custom Python scripts.
- Prompt tool: Prepares prompts as strings for complex scenarios or integration with other tools.
Each tool is an executable unit with a specific function. You can use a tool to perform tasks like summarizing text or making an API call. You can use multiple tools within one flow, and you can use the same tool multiple times.
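The Python and Prompt tools can be imitated with plain functions. In prompt flow itself, a Python tool is a function registered through the SDK's tool decorator and a Prompt tool renders a template into a string; the sketch below uses standalone stand-ins with hypothetical names so it runs without the SDK:

```python
# Stand-ins for two tool types (hypothetical names, not the real SDK).

def prompt_tool(template: str, **values: str) -> str:
    # Prompt-tool stand-in: prepares a prompt as a string from a template,
    # ready to pass to another tool (for example, an LLM tool).
    return template.format(**values)

def word_count_tool(text: str) -> int:
    # Python-tool stand-in: a custom script that performs a task.
    return len(text.split())

# The same tool can be used multiple times within one flow:
summary_prompt = prompt_tool("Summarize this text: {text}", text="A long article...")
question_prompt = prompt_tool("Answer the question using {text}", text="some context")

print(summary_prompt)
print(word_count_tool(summary_prompt))
```

Here `prompt_tool` is invoked twice with different inputs, and its output feeds `word_count_tool`, mirroring how tools compose inside a flow.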