LLM Node
AI & Knowledge
Overview
The LLM node sends a prompt to a large language model and returns the model's response. Supported models include Gemini 2.0 Pro (Enterprise); provider credentials are supplied through secrets such as OPENAI_API_KEY or ANTHROPIC_API_KEY.
Configuration
| Property | Type | Default | Description |
|---|---|---|---|
| model | string | — | Identifier of the language model to call, for example Gemini 2.0 Pro (Enterprise). |
| apiKeySecret | string | — | Name of the secret that holds the provider API key, for example OPENAI_API_KEY or ANTHROPIC_API_KEY. |
| systemPrompt | string | — | System prompt that sets the model's role and behavior. |
| userPrompt | string | — | User prompt, i.e. the message the model responds to. |
| temperature | string | — | Sampling temperature; lower values give more deterministic output. |
| maxTokens | string | — | Maximum number of tokens the model may generate. |
| responseSchema | string | — | Schema describing the expected structure of the response (see the sketch below this table). |
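Because responseSchema is typed as a string, it is most likely a serialized JSON Schema (or a similar schema format) describing the structure the model should return. The sketch below is an assumption about that format, not a documented contract:

```json
{
  "type": "object",
  "properties": {
    "summary": { "type": "string" },
    "sentiment": { "type": "string", "enum": ["positive", "neutral", "negative"] }
  },
  "required": ["summary", "sentiment"]
}
```

If the node enforces such a schema, the generated text should be parseable JSON with exactly these fields, which makes downstream handling predictable.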
Output Variables
Reference these variables in downstream nodes using the {{node.field}} syntax.
| Variable | Type | Description |
|---|---|---|
| {{llm.text}} | any | The text generated by the model. |
| {{llm.raw_response}} | any | The raw, unmodified response returned by the provider. |
| {{llm.tokens_used}} | any | The number of tokens consumed by the request. |
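As a hypothetical illustration of the {{node.field}} syntax, a downstream node could interpolate these variables into its own configuration. The node fields below are invented for the example; only the {{llm.*}} references come from this page:

```json
{
  "to": "team@example.com",
  "subject": "LLM summary ({{llm.tokens_used}} tokens used)",
  "body": "{{llm.text}}"
}
```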
Examples
Basic Usage
Example configuration (values are illustrative; the model name is taken from the overview above, and GEMINI_API_KEY is a placeholder for whichever secret stores your provider key):

```json
{
  "model": "Gemini 2.0 Pro (Enterprise)",
  "apiKeySecret": "GEMINI_API_KEY",
  "systemPrompt": "You are a concise technical writer.",
  "userPrompt": "Summarize the following release notes in three bullet points.",
  "temperature": "0.3",
  "maxTokens": "512",
  "responseSchema": ""
}
```

Here responseSchema is left empty; see the structured-output sketch under Configuration for a populated value.
Tips & Best Practices
- Ensure all required fields are configured.
- Check the output variables to see the data available to downstream nodes.
Try LLM in Your Workflow
Create a free account and start building AI-powered workflows with the LLM node.