DocuBench

Subprocessors

The following third-party subprocessors may be used to operate the Service. Each entry lists the provider's role, the categories of data processed, the purpose of processing, and the processing location.

Last Updated: February 9, 2026

Amazon Web Services (AWS)

  • Role: Cloud infrastructure and AI model hosting (via Amazon Bedrock)
  • Data processed: Account data; usage metadata; User Content submitted for inference
  • Purpose: Hosting, storage, networking, and model inference via Bedrock
  • Processing location: U.S. regions
  • Notes: Bedrock provides access to models hosted by AWS; model availability and data handling depend on AWS/Bedrock terms and configured region.

Google Cloud Platform (GCP)

  • Role: Cloud infrastructure and AI model hosting (including managed model services)
  • Data processed: Account data; usage metadata; User Content submitted for inference
  • Purpose: Hosting and model inference
  • Processing location: U.S. regions
  • Notes: Model/service availability and data handling depend on GCP terms and configured region.

Anthropic

  • Role: Direct model API provider
  • Data processed: User Content submitted for inference; usage metadata
  • Purpose: Model inference
  • Processing location: Per Anthropic terms
  • Notes: Used when the Service routes requests to Anthropic models directly, rather than via AWS Bedrock or GCP.

OpenAI

  • Role: Direct model API provider
  • Data processed: User Content submitted for inference; usage metadata
  • Purpose: Model inference
  • Processing location: Per OpenAI terms
  • Notes: Used when the Service routes to OpenAI models directly.

Stripe (if applicable)

  • Role: Payment processing
  • Data processed: Billing contact information; transaction metadata
  • Purpose: Subscription billing
  • Processing location: Per Stripe terms
  • Notes: Payment card details are collected and stored by Stripe; we do not store full card numbers.