Credentials

Credentials store encrypted secrets — API keys, passwords, OAuth tokens, SSH keys — that agents and tools need to authenticate with external services.

Managing Credentials

Go to Settings → Credentials to create, edit, or delete credentials.

Each credential has:

| Field | Description |
| --- | --- |
| Name | Display name (e.g., "OpenAI Production Key") |
| Key | Environment variable name used for injection (e.g., OPENAI_API_KEY) |
| Category | What the credential is used for (see below) |
| Type | The credential format (see below) |
| Fields | The actual secret values (encrypted at rest) |
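To make the fields concrete, a credential record might look like the following sketch. The attribute names and values here are illustrative, not the exact ORQO schema:

```python
# Hypothetical credential record — attribute names are illustrative only.
credential = {
    "name": "OpenAI Production Key",   # display name shown in the UI
    "key": "OPENAI_API_KEY",           # env var name used for injection
    "category": "llm_provider",        # what the credential authenticates
    "type": "api_token",               # credential format (single token)
    "fields": {"token": "sk-..."},     # secret values, encrypted at rest
}
```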

Categories

| Category | Used for |
| --- | --- |
| llm_provider | LLM API keys (OpenAI, Anthropic, etc.) |
| mcp_service | MCP server authentication |
| runtime | Sandbox runtime access |
| communication | Email, Slack, messaging services |
| ssh_key | SSH key pairs |
| custom | Anything else |

Credential Types

| Type | Fields |
| --- | --- |
| API Token | Single token value |
| API Key Pair | Key + secret pair |
| Username / Password | Username + password |
| Email SMTP | Host, port, username, password |
| OAuth 2.0 | Client ID, client secret, tokens |
| AWS SES | Access key, secret key, region |
| SSH Key | Private key, optional passphrase |
| Custom | Arbitrary key-value pairs |
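As a rough sketch, each type corresponds to a set of secret fields. The mapping below illustrates plausible shapes; the field names are assumptions, not the exact ORQO schema:

```python
# Illustrative field sets per credential type — names are assumptions,
# not the real schema. "Custom" is omitted because its fields are arbitrary.
CREDENTIAL_TYPE_FIELDS = {
    "api_token": ["token"],
    "api_key_pair": ["key", "secret"],
    "username_password": ["username", "password"],
    "email_smtp": ["host", "port", "username", "password"],
    "oauth2": ["client_id", "client_secret", "access_token", "refresh_token"],
    "aws_ses": ["access_key", "secret_key", "region"],
    "ssh_key": ["private_key", "passphrase"],  # passphrase is optional
}
```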

How Credentials Are Used

Credentials are injected as environment variables when the engine runs a workflow. The credential's Key field becomes the environment variable name, and the secret value is decrypted and passed to the engine process.

This means agents and tools access credentials the same way any application reads env vars — no special SDK or API required.
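For example, a tool or agent process launched by the engine reads an injected secret with plain environment variable access (OPENAI_API_KEY here assumes a credential whose Key is set to that name):

```python
import os

# The engine injects the decrypted secret under the credential's Key,
# so reading it is ordinary env-var access — no SDK or API call needed.
api_key = os.environ.get("OPENAI_API_KEY", "")
print("key present:", bool(api_key))
```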

Platform Keys vs. Your Own Keys

Credentials have two modes:

  • Platform key — ORQO provides the API key. No secret fields needed. Usage is deducted from your subscription's credit balance. This is the default for new organizations — you can start running workflows immediately.
  • Your own key — You provide your own API key from the provider. Usage is billed directly by the provider. Your platform credits are not consumed.

When a credential is set to use the platform key, it displays "Platform key" in the UI instead of a masked token value.

Linking to LLM Configs

LLM configurations reference a credential to authenticate with the model provider. When you create an LLM config, you select both a model and the credential that holds the API key for that provider. If the credential uses the platform key, no additional setup is needed.
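The link between an LLM config and a credential can be sketched as a simple lookup: the config names a credential, and the credential supplies the env var that will carry the API key. All names and structures below are hypothetical, for illustration only:

```python
# Hypothetical shapes — the real ORQO schema may differ.
llm_config = {
    "model": "gpt-4o",
    "credential": "OpenAI Production Key",  # credential holding the API key
}

credentials = {
    "OpenAI Production Key": {
        "key": "OPENAI_API_KEY",       # env var injected at runtime
        "category": "llm_provider",
    },
}

def resolve_env_var(config, creds):
    """Return the env var name the referenced credential would inject."""
    cred = creds[config["credential"]]
    return cred["key"]

print(resolve_env_var(llm_config, credentials))  # → OPENAI_API_KEY
```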