Onpilot AI engages a small set of third-party service providers ("Sub-Processors") to deliver the Services. Each Sub-Processor is bound by contract to confidentiality and data protection obligations no less protective than those in our Data Processing Agreement.
## What This Page Covers
This page lists every Sub-Processor that may process Customer Data in connection with the Onpilot AI managed SaaS. It does not list infrastructure providers used purely for our corporate operations (for example, accounting or HR) where no Customer Data is processed.
## Current Sub-Processors
The Sub-Processors below are engaged for all Onpilot AI customers on the managed SaaS tier:
| Sub-Processor | Purpose | Location | Customer Data Processed |
|---|---|---|---|
| Microsoft Azure | Cloud infrastructure (compute, storage, managed Postgres) and Azure OpenAI inference for the managed model tier | Canada Central | All Customer Data; LLM message content for managed-tier inference (zero-data-retention configuration) |
| Cloudflare | DNS, CDN, DDoS protection, and secure tunnel between edge and origin | Global | Request metadata only; no Customer Data at rest |
| WorkOS | Authentication, single sign-on, directory sync, and multi-factor authentication | United States | End-user identifiers and authentication metadata |
| Stripe | Subscription billing and payment processing | United States | Billing contact details, payment instrument metadata (Stripe holds card data; Onpilot AI never receives full PAN) |
| Resend | Transactional email delivery (verification, alerts) | United States | Recipient email addresses and email content |
| Infisical | Secrets and credentials management | United States | No Customer Data; Onpilot AI service credentials only |
| Composio | Third-party tool integration broker (when a customer connects an external app to an agent) | United States | Connection metadata, OAuth tokens for the external apps a customer connects, and the data the agent reads or writes through those tools |
## BYOM (Bring Your Own Model) Providers
Customers on Business and Enterprise tiers may opt to route inference to their own large-language-model provider account. When a customer enables BYOM, the selected provider becomes a Sub-Processor for that customer's data only. That relationship is governed by the customer's direct contract with the vendor, and Onpilot AI passes those contractual guarantees through.
| Provider | Tier | Data Retention Default |
|---|---|---|
| Anthropic | Business / Enterprise BYOM | Zero data retention available on enterprise plans |
| OpenAI | Business / Enterprise BYOM | Zero data retention available on enterprise plans |
| Azure OpenAI Service (customer's tenant) | Enterprise BYOM | Per customer's Azure agreement |
| AWS Bedrock | Enterprise BYOM | Per customer's AWS agreement |
## Notification of Changes
Onpilot AI will provide at least 30 days' advance notice before adding a new Sub-Processor that processes Customer Data, by:
- Updating this page with the proposed addition
- Notifying the Controller's designated contact by email (the address on file for billing or DPA correspondence)
Emergency replacements (for example, when a Sub-Processor is discontinued or fails to meet our security standards) may occur on shorter notice; in that case, we will notify Controllers as soon as practicable.
## How to Object
A Controller may object to the addition of a new Sub-Processor on reasonable, data-protection-related grounds within the notice period by contacting us at the address below. We will work in good faith to find an alternative arrangement. If no resolution can be reached, the Controller may terminate the affected Service in accordance with the termination provisions of the underlying agreement.
## Contact
For questions about Sub-Processors, our DPA, or your data protection rights, email privacy@onpilot.ai.
See also our Data Processing Agreement and Privacy Policy.
© 2026 Onpilot AI. All rights reserved.