## inference:models
Lists all available models from the configured inference provider.
### Options
| Name | Type | Description |
|---|---|---|
| provider | string | The inference provider whose models to list (e.g. "ollama", "openai") |
### Outputs
| Name | Type | Description |
|---|---|---|
| provider | string | The provider that was queried |
| models | int | The number of models available from the provider |
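To illustrate the step's input/output contract, here is a minimal Python sketch. The catalog data and function name are hypothetical; a real implementation would query the configured provider's API rather than a hard-coded table:

```python
# Hypothetical sketch of the inference:models contract:
# given a provider name, return the provider queried and its model count.
PROVIDER_CATALOG = {
    # Stand-in data for illustration; a real backend would query the
    # provider (e.g. Ollama's /api/tags or OpenAI's /v1/models endpoint).
    "ollama": ["llama3.1:8b", "mistral:7b"],
    "openai": ["gpt-4o", "gpt-4o-mini"],
}

def list_models(provider: str) -> dict:
    """Return the outputs described above: provider and model count."""
    models = PROVIDER_CATALOG.get(provider)
    if models is None:
        raise ValueError(f"unknown inference provider: {provider}")
    return {"provider": provider, "models": len(models)}

print(list_models("ollama"))  # {'provider': 'ollama', 'models': 2}
```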
How is this guide?
## inference
Makes LLM inference requests using the configured provider and model. Supports tools, LLM parameter settings, and cost calculation with conversion to the user's currency. The `prompt` field accepts either a template path or an inline prompt string; both can render dynamic prompts with variables.
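The dual behavior of the `prompt` field (template path vs. inline string) can be sketched as follows. The file-detection heuristic and the `$`-style placeholder syntax are assumptions for illustration, not the tool's documented template format:

```python
import os
import string

def render_prompt(prompt: str, variables: dict) -> str:
    """Render a prompt: if `prompt` names an existing file, load the
    template from disk; otherwise treat it as an inline template string.
    Uses $-style placeholders (an assumption for illustration)."""
    if os.path.isfile(prompt):
        with open(prompt) as f:
            prompt = f.read()
    return string.Template(prompt).substitute(variables)

print(render_prompt("Summarize $topic in $n bullet points.",
                    {"topic": "LLM inference", "n": 3}))
# Summarize LLM inference in 3 bullet points.
```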
## Map
Transform a list by running sub-steps per item and collecting results.
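The Map step's per-item fan-out can be sketched as a plain function that runs a sub-step over each item and collects the results in order; the callable-based interface here is illustrative only, not the step's actual configuration syntax:

```python
from typing import Callable, Iterable, List, TypeVar

T = TypeVar("T")
R = TypeVar("R")

def map_step(items: Iterable[T], sub_step: Callable[[T], R]) -> List[R]:
    """Run `sub_step` once per input item and collect results in order."""
    return [sub_step(item) for item in items]

print(map_step([1, 2, 3], lambda n: n * n))  # [1, 4, 9]
```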