To initiate a bulk export query, use the /bulkexport/submit endpoint.

The /bulkexport/submit request body includes the Elasticsearch query that you want to run, along with additional information about the S3 storage location, the file format and size, and the file compression options.

Access to the /bulkexport/submit endpoint can be restricted to control the S3 destinations that are permitted. A user submitting a bulk export query must have an appropriate chaos:query:export:submit permission for the target S3 destination in addition to any existing permissions required for the query itself.


A sample submit request follows. The fields are described after the example.

POST /bulkexport/submit
{
  "params": { ... your Elasticsearch query ... },
  "exportOptions": {
    "queryName": "optional name",
    "compression": "NONE" | "GZIP",
    "destination": "{BUCKET NAME}/{S3 PATH PREFIX}",
    "unitFileSize": 100,
    "fileFormat": "JSON" | "CSV"
  },
  "type": "elastic"
}

Description of Fields

  • params – Type or paste your Elasticsearch query in this field.
  • exportOptions – The options include the following fields:
    • queryName – An optional name for the export that appears in the Bulk Export UI menu.
    • compression – Specify the compression to use for the exported files: GZIP (default) or NONE.
    • destination – Specify the identifier of the S3 bucket where you want to store the exported result set files.
    • unitFileSize – Specify the maximum size, in MB, of each exported file. The default is 100 MB.
    • fileFormat – Specify the format of the exported files as JSON (default) or CSV.
  • type – Set the type of bulk query export to elastic.
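The fields above can be assembled into a request body programmatically. The following sketch builds the JSON payload for a submit call; the helper name and the sample query and destination values are assumptions for illustration, not part of the API.

```python
import json

def build_submit_request(query, destination, query_name=None,
                         compression="GZIP", unit_file_size=100,
                         file_format="JSON"):
    """Assemble a /bulkexport/submit request body (illustrative helper)."""
    export_options = {
        "compression": compression,       # "NONE" or "GZIP"
        "destination": destination,       # "{BUCKET NAME}/{S3 PATH PREFIX}"
        "unitFileSize": unit_file_size,   # maximum file size in MB
        "fileFormat": file_format,        # "JSON" or "CSV"
    }
    if query_name is not None:
        export_options["queryName"] = query_name
    return {
        "params": query,                  # your Elasticsearch query
        "exportOptions": export_options,
        "type": "elastic",                # bulk query export type
    }

# Hypothetical values shown for illustration only:
body = build_submit_request(
    query={"query": {"match_all": {}}},
    destination="my-bucket/exports/2024",
    query_name="all-docs",
)
print(json.dumps(body, indent=2))
```

POST the resulting body to /bulkexport/submit using whatever HTTP client and authentication your deployment requires.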


The response for a successful submit request returns an identifier that can be used with the /bulkexport/status endpoint to poll for progress and completion status.

  "id": "{TRACKING UID}"  



RBAC Controls for Submit

To submit a bulk export request, your account must have permission to use the bulk export feature itself, and must have permission to read and write to the designated S3 bucket in your customer account.

  "Actions": [  
  "Effect": "Allow",  
  "Resources": [  
  "Version": "1.0"