Amazon S3 / Cloud Storage
Store, retrieve, and manage files across 13+ cloud storage providers. Agents upload processed documents, generate secure links, manage buckets, and handle bulk operations across Amazon S3, Cloudflare R2, DigitalOcean Spaces, and more.
What This Integration Enables
The S3 integration is provider-agnostic: it works with Amazon S3, Cloudflare R2, DigitalOcean Spaces, Backblaze B2, MinIO, Wasabi, and 7+ additional providers through the same action set. Agents use it as the document layer in multi-step workflows: processed documents land here, files are retrieved from here for downstream processing, and temporary access links are generated here for delivery.
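Provider-agnostic access generally comes down to pointing an S3-compatible client at a different endpoint URL. A minimal sketch using boto3 (the endpoint formats for R2, Spaces, and a local MinIO are assumptions based on each provider's published S3-compatible endpoints; FlowRunner's own wiring may differ):

```python
def client_config(provider: str, account_id: str = "", region: str = "us-east-1") -> dict:
    """Map a provider name to S3-compatible client settings.

    A None endpoint_url falls through to boto3's default AWS endpoint.
    """
    endpoints = {
        "aws": None,
        "r2": f"https://{account_id}.r2.cloudflarestorage.com",
        "spaces": f"https://{region}.digitaloceanspaces.com",
        "minio": "http://localhost:9000",  # typical local MinIO default
    }
    if provider not in endpoints:
        raise ValueError(f"unknown provider: {provider}")
    return {"service_name": "s3", "region_name": region,
            "endpoint_url": endpoints[provider]}


def make_client(provider: str, **kwargs):
    """Build a client for any supported provider from the same config."""
    import boto3  # deferred so client_config works without boto3 installed
    return boto3.client(**client_config(provider, **kwargs))
```

Because every provider speaks the same wire protocol, the rest of the action set (upload, presign, delete, and so on) is identical regardless of which endpoint the client was created against.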
Use Case Scenarios
Document Processing Archive
An AP agent processes vendor invoices through Parseur and PDF.co. After each invoice is validated and entered into Acumatica, the agent uploads the original PDF to S3 with a structured path: `invoices/[vendor-id]/[year]/[invoice-number].pdf`. It stores metadata: vendor name, amount, and Acumatica bill ID. When an auditor requests an invoice 18 months later, the agent retrieves it from S3 using the bill ID as a reference. No manual filing. No lost documents.
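The archive step above can be sketched as a key-builder plus an upload with custom metadata. This is an illustrative boto3-style sketch, not FlowRunner's internal code; the metadata key names are assumptions (S3 stores them as `x-amz-meta-*` headers, and values must be strings):

```python
def invoice_key(vendor_id: str, year: int, invoice_number: str) -> str:
    """Structured archive path: invoices/[vendor-id]/[year]/[invoice-number].pdf"""
    return f"invoices/{vendor_id}/{year}/{invoice_number}.pdf"


def archive_invoice(s3, bucket: str, pdf_bytes: bytes, vendor_id: str, year: int,
                    invoice_number: str, vendor_name: str, amount: str,
                    bill_id: str) -> str:
    """Upload the original PDF with searchable metadata; returns the object key."""
    key = invoice_key(vendor_id, year, invoice_number)
    s3.put_object(
        Bucket=bucket,
        Key=key,
        Body=pdf_bytes,
        ContentType="application/pdf",
        # Custom metadata rides along with the object; string values only.
        Metadata={"vendor-name": vendor_name, "amount": amount,
                  "acumatica-bill-id": bill_id},
    )
    return key
```

Storing the Acumatica bill ID as object metadata is what makes the later audit lookup possible without a separate index.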
Secure Contract Delivery
A contract is signed in DocuSign. The agent downloads the executed document, uploads it to S3 with a structured path under the deal ID, and generates a presigned URL valid for 7 days. It emails the presigned URL to the client. The client downloads their signed contract securely without requiring S3 credentials or bucket access. The URL expires automatically.
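The delivery link is a standard presigned GET URL. A sketch under the same boto3 assumption as above (the `contracts/` path scheme is illustrative; note that AWS Signature Version 4 caps presigned URLs at 7 days, i.e. 604,800 seconds):

```python
SECONDS_PER_DAY = 86_400


def presign_days(days: int) -> int:
    """Convert a validity window in days to the ExpiresIn seconds value."""
    return days * SECONDS_PER_DAY


def contract_download_url(s3, bucket: str, deal_id: str, filename: str,
                          days: int = 7) -> str:
    """Generate a time-limited GET link for the signed contract."""
    key = f"contracts/{deal_id}/{filename}"
    return s3.generate_presigned_url(
        "get_object",
        Params={"Bucket": bucket, "Key": key},
        ExpiresIn=presign_days(days),  # 7 days = 604800 seconds, the SigV4 maximum
    )
```

The URL embeds its own signature and expiry, which is why the client needs no credentials and the link dies on its own.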
Batch Report Distribution
Every month-end, the finance agent generates P&L, balance sheet, and AR aging reports from QuickBooks and Xero. Each report is uploaded to S3 under `reports/[year]/[month]/`. The agent generates presigned URLs for each report and sends them to the appropriate stakeholders via email. The finance team accesses current reports from a consistent location without anyone emailing spreadsheets.
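The month-end fan-out is a loop over the same two primitives: upload, then presign. A hedged sketch (report filenames and the zero-padded month convention are assumptions; any S3-compatible client with `put_object` and `generate_presigned_url` works):

```python
def report_prefix(year: int, month: int) -> str:
    """Month-end location: reports/[year]/[month]/ with a zero-padded month."""
    return f"reports/{year}/{month:02d}/"


def distribute_reports(s3, bucket: str, year: int, month: int,
                       reports: dict, link_days: int = 7) -> dict:
    """Upload each named report and return {report name: presigned URL}."""
    links = {}
    for name, data in reports.items():
        key = report_prefix(year, month) + name
        s3.put_object(Bucket=bucket, Key=key, Body=data)
        links[name] = s3.generate_presigned_url(
            "get_object",
            Params={"Bucket": bucket, "Key": key},
            ExpiresIn=link_days * 86_400,
        )
    return links
```

The zero-padded month keeps `List Objects` output sorted chronologically, which is what makes "a consistent location" work in practice.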
Human-in-Loop Highlight
When an agent is about to delete files above a defined age or size threshold, whether a single object or a bulk batch, it does not act silently. It sends a Slack message to the storage administrator: "Preparing to delete 47 files from [bucket/path]. These files are 90+ days old and match the cleanup policy. Confirm deletion or cancel?" The administrator reviews the list and confirms. Bulk deletions have human awareness before they execute. Data does not disappear without a logged approval.
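The guard pattern above splits cleanly into "select candidates" and "compose the confirmation" before any delete call is made. A minimal sketch, assuming `List Objects`-style entries with `Key` and `LastModified` fields (the Slack send itself is omitted; the message wording follows the example in this section):

```python
from datetime import datetime, timedelta, timezone


def stale_keys(objects, max_age_days: int = 90, now=None):
    """Filter list-objects-style entries down to keys past the retention age."""
    now = now or datetime.now(timezone.utc)
    cutoff = now - timedelta(days=max_age_days)
    return [o["Key"] for o in objects if o["LastModified"] < cutoff]


def confirmation_message(keys, bucket: str, prefix: str,
                         max_age_days: int = 90) -> str:
    """Compose the human-in-loop prompt sent to the storage administrator."""
    return (f"Preparing to delete {len(keys)} files from {bucket}/{prefix}. "
            f"These files are {max_age_days}+ days old and match the cleanup "
            "policy. Confirm deletion or cancel?")
```

Nothing in this path calls a delete action; the destructive step only runs after the administrator's reply comes back, which is what makes the approval loggable.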
Agent Capabilities (12 actions)
Bucket Management (3 actions)
- **Create Bucket** Creates a new storage bucket. Used in environment provisioning workflows: when a new client or project is onboarded, the agent creates a dedicated bucket with appropriate naming and configuration.
- **Delete Bucket** Removes an empty bucket. Used in project archival and cleanup workflows.
- **List Buckets** Returns all buckets in the account. Used in administrative and reporting workflows.
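A provisioning workflow built from these three actions typically validates the bucket name first, since S3 rejects names outside its naming rules. A sketch with a deliberately simplified validator (the full AWS rules also allow dots and forbid IP-formatted names; the `flowrunner-` naming scheme is hypothetical):

```python
import re

# 3-63 chars, lowercase letters/digits/hyphens, must start and end
# with a letter or digit. Simplified subset of the S3 naming rules.
BUCKET_NAME = re.compile(r"^[a-z0-9][a-z0-9-]{1,61}[a-z0-9]$")


def valid_bucket_name(name: str) -> bool:
    return bool(BUCKET_NAME.match(name))


def provision_client_bucket(s3, client_slug: str) -> str:
    """Create a dedicated bucket for a newly onboarded client, idempotently."""
    name = f"flowrunner-{client_slug}"  # hypothetical naming scheme
    if not valid_bucket_name(name):
        raise ValueError(f"invalid bucket name: {name}")
    existing = [b["Name"] for b in s3.list_buckets()["Buckets"]]
    if name not in existing:
        s3.create_bucket(Bucket=name)
    return name
```

Checking `list_buckets` first makes re-running the onboarding workflow safe; note that on AWS, regions other than us-east-1 additionally require a `CreateBucketConfiguration` argument.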
Object Operations (9 actions)
- **Upload Object** Stores a file in a specified bucket and path. Used in document workflow endpoints: processed invoices, signed contracts, generated reports, and extracted data files land here after processing.
- **Upload Object from URL** Downloads a file from a URL and stores it in the bucket. Used when files need to be transferred from external sources directly to storage without routing through the agent's processing layer.
- **Get Object Metadata** Returns file metadata without downloading the file: size, content type, last modified date, custom metadata. Used for verification and auditing without the overhead of a full download.
- **Check Object Exists** Verifies whether a specific file exists in a bucket. Used as a guard before creating files to avoid overwrites, or before retrieval to handle missing files gracefully.
- **List Objects** Returns the contents of a bucket or path prefix. Used in batch processing workflows where agents need to find and process all files in a location.
- **Copy Object** Copies a file from one location to another within or across buckets. Used in archival workflows where active files are copied to long-term storage.
- **Delete Object** Removes a specific file. Used in cleanup workflows after processing or in data governance workflows with retention policies.
- **Delete Multiple Objects** Removes a list of files in a single operation. Used for bulk cleanup after batch processing.
- **Get Presigned URL** Generates a time-limited, secure URL for a specific file. Used when files need to be delivered to external parties without granting permanent access: a secure link to a signed contract, a temporary download link for an invoice, a time-limited access URL for an audit package.
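Two of the actions above have non-obvious mechanics worth sketching: the existence check is a metadata-only HEAD request, and bulk deletion is capped at 1,000 keys per request, so larger lists must be chunked. A boto3-style sketch, hedged as illustrative:

```python
def object_exists(s3, bucket: str, key: str) -> bool:
    """Guard check via a metadata-only HEAD request (nothing is downloaded)."""
    try:
        s3.head_object(Bucket=bucket, Key=key)
        return True
    except s3.exceptions.ClientError as e:
        if e.response["Error"]["Code"] in ("404", "NoSuchKey"):
            return False
        raise  # other errors (403, throttling) should surface, not hide


def chunked(keys, size: int = 1000):
    """DeleteObjects accepts at most 1,000 keys per request."""
    return [keys[i:i + size] for i in range(0, len(keys), size)]


def delete_keys(s3, bucket: str, keys):
    """Bulk-delete an arbitrary number of keys in compliant batches."""
    for batch in chunked(keys):
        s3.delete_objects(Bucket=bucket,
                          Delete={"Objects": [{"Key": k} for k in batch]})
```

Using HEAD as the guard keeps the check cheap even for large files, and the batching helper is why "Delete Multiple Objects" stays a single agent action regardless of list size.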
Start building with Amazon S3 / Cloud Storage
$100 in credits. No card required. Connect in minutes.