# Credentials
Credentials store encrypted secrets — API keys, passwords, OAuth tokens, SSH keys — that agents and tools need to authenticate with external services.
## Managing Credentials
Go to Settings → Credentials to create, edit, or delete credentials.
Each credential has:
| Field | Description |
|---|---|
| Name | Display name (e.g., "OpenAI Production Key") |
| Key | Environment variable name used for injection (e.g., OPENAI_API_KEY) |
| Category | What the credential is used for (see below) |
| Type | The credential format (see below) |
| Fields | The actual secret values (encrypted at rest) |
### Categories
| Category | Used for |
|---|---|
| llm_provider | LLM API keys (OpenAI, Anthropic, etc.) |
| mcp_service | MCP server authentication |
| runtime | Sandbox runtime access |
| communication | Email, Slack, messaging services |
| ssh_key | SSH key pairs |
| custom | Anything else |
### Credential Types
| Type | Fields |
|---|---|
| API Token | Single token value |
| API Key Pair | Key + secret pair |
| Username / Password | Username + password |
| Email SMTP | Host, port, username, password |
| OAuth 2.0 | Client ID, client secret, tokens |
| AWS SES | Access key, secret key, region |
| SSH Key | Private key, optional passphrase |
| Custom | Arbitrary key-value pairs |
## How Credentials Are Used
Credentials are injected as environment variables when the engine runs a workflow. The credential's Key field becomes the environment variable name, and the secret value is decrypted and passed to the engine process.
This means agents and tools access credentials the same way any application reads env vars — no special SDK or API required.
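The injection described above can be sketched in a few lines. This is an illustrative model, not ORQO's actual engine code: the helper merges decrypted secrets into a child process's environment, and the "workflow" is a stand-in child process that reads the variable back the ordinary way. The credential name and value are made up.

```python
import os
import subprocess
import sys

def run_with_credentials(secrets: dict) -> str:
    """Sketch of engine-side injection: merge decrypted secrets into the
    child environment, then let the child read them as plain env vars."""
    env = {**os.environ, **secrets}
    # Stand-in "workflow": a child process that echoes one injected variable.
    child = subprocess.run(
        [sys.executable, "-c",
         "import os; print(os.environ['OPENAI_API_KEY'])"],
        env=env, capture_output=True, text=True, check=True,
    )
    return child.stdout.strip()

print(run_with_credentials({"OPENAI_API_KEY": "sk-example-not-a-real-key"}))
```

On the tool side, nothing more than `os.environ["OPENAI_API_KEY"]` (or the equivalent in any language) is needed — which is the point of env-var injection.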
## Platform Keys vs. Your Own Keys
Credentials have two modes:
- Platform key — ORQO provides the API key. No secret fields needed. Usage is deducted from your subscription's credit balance. This is the default for new organizations — you can start running workflows immediately.
- Your own key — You provide your own API key from the provider. Usage is billed directly by the provider. Your platform credits are not consumed.
When a credential is set to use the platform key, it displays "Platform key" in the UI instead of a masked token value.
## Linking to LLM Configs
LLM configurations reference a credential to authenticate with the model provider. When you create an LLM config, you select both a model and the credential that holds the API key for that provider. If the credential uses the platform key, no additional setup is needed.
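As a sketch of how that linkage plays out at runtime — the config shape and field names below are illustrative assumptions, not ORQO's actual schema — an LLM config only needs to record which credential Key to use; the engine resolves the secret from the environment it injected:

```python
import os

# Illustrative config only -- "model" and "credential_key" are assumed
# field names, not ORQO's real schema.
llm_config = {
    "model": "gpt-4o",
    "credential_key": "OPENAI_API_KEY",  # the credential's Key field
}

def resolve_api_key(config: dict) -> str:
    """Fetch the decrypted secret that was injected as an env var."""
    key = config["credential_key"]
    try:
        return os.environ[key]
    except KeyError:
        raise RuntimeError(f"Credential env var {key!r} is not set") from None
```

A missing variable fails loudly rather than passing an empty key to the provider, which makes misconfigured credentials easier to diagnose.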