Event export

The system logs all fulfillmenttools events and makes them available in a daily export. The daily event files are saved in a Google Cloud Storage bucket. This enables users to leverage data from their fulfillment process for additional processing and analytics.

This functionality is not activated by default. To enable it, contact the Professional Service team at [email protected].

Accessing the bucket

Google offers multiple client libraries for accessing Google Cloud Storage buckets, and you can also use the Google Cloud Storage API directly. Another alternative is the gsutil command-line tool, which ships with the Google Cloud SDK alongside the gcloud CLI.

Accessing the bucket requires a Google service account. Upon request, fulfillmenttools provides a service account key for authentication. This service account only grants access to this specific bucket; it cannot be used to access the fulfillmenttools API or its client applications.

The service account key is delivered as a file named {YOUR_TENANT_NAME}-<key_id>.json, which contains a JSON representation of the service account.

Anyone in possession of this key file can access the bucket, so make sure to store it securely. See the official Google documentation on service account keys for further information.
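Besides the client libraries, the bucket can also be queried through the Cloud Storage JSON API directly. The following is a minimal sketch, assuming the service account has already been activated in gcloud (see the gsutil example below) and using the bucket name placeholders from this guide:

# Obtain a short-lived OAuth2 access token for the active service account
$ TOKEN=$(gcloud auth print-access-token)

# List the objects in the export bucket via the Cloud Storage JSON API
$ curl -s -H "Authorization: Bearer ${TOKEN}" \
    "https://storage.googleapis.com/storage/v1/b/ocff-<TENANT_NAME>-<TIER>-public-events-export-bucket/o"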

File and folder structure

The bucket is named ocff-{YOUR-TENANT-NAME}-{TIER}-public-events-export-bucket (for example, ocff-mytenant-pre-public-events-export-bucket) and is only accessible by authorized accounts.

For each day, the system creates a folder in the bucket using the pattern YYYY-MM-DD, which contains the events of the corresponding day.
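Since the folder names follow this fixed date pattern, paths can be constructed in scripts. A small sketch using GNU date to list the previous day's export:

# Build the folder name for the previous day, e.g. 2024-02-01 (GNU date syntax)
$ FOLDER=$(date -d "yesterday" +%Y-%m-%d)

# List the event files exported for that day
$ gsutil ls "gs://ocff-<TENANT_NAME>-<TIER>-public-events-export-bucket/${FOLDER}/"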

Bucket folder structure: screenshot of a Google Cloud Storage bucket showing folders named by date.

Each folder contains one or more gzipped files. Each of these files contains multiple lines, where one line corresponds to one event in the fulfillmenttools platform.
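Because each line is one event, the files can be inspected with standard command-line tools. A sketch that counts the events in one file, assuming the files are stored as plain gzip objects (the path is taken from the listing example below):

# Stream a gzipped export file, decompress it, and count the events it contains
$ gsutil cat gs://ocff-<TENANT_NAME>-<TIER>-public-events-export-bucket/2024-02-01/000000000000.json.gz \
    | gunzip -c | wc -l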

Files inside a date folder: screenshot of the gzipped export files.

Each line of a file is a JSON object containing metadata fields (id, eventName, created) and the event payload.

{
  "id": "b0a0e08e-d7d0-4694-ad6c-bbf47b3fc1c6",
  "eventName": "order-created_event-v1",
  "payload": {
    ...  // Event
  },
  "created": "2024-02-01T09:20:39.411067+00:00"
}
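For working with these newline-delimited JSON records, a tool such as jq is convenient. A sketch that filters a downloaded and decompressed file by event name, using the example event name from above:

# Keep only order-created events from a downloaded export file
$ gunzip -c 000000000000.json.gz \
    | jq -c 'select(.eventName == "order-created_event-v1")'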

Retention

The system stores data in the buckets for 90 days. After this period, the data is automatically deleted.
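If events are needed for longer, copy them elsewhere before they expire, for example with a scheduled gsutil rsync into a local archive. A sketch using the bucket name placeholders from this guide:

# Mirror the export bucket into a local archive folder (run e.g. daily via cron)
$ gsutil -m rsync -r gs://ocff-<TENANT_NAME>-<TIER>-public-events-export-bucket ./events-archive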

gsutil example

The following example demonstrates how to copy files from a Google Cloud Storage bucket using Google's gsutil command-line tool.

Check the official Google documentation about gsutil for further details.

Activate the service account

The service account's email address is located in the client_email field of the provided key file. Use this address to activate the service account:

$ gcloud auth activate-service-account <SERVICE_ACCOUNT_EMAIL_ID> --key-file=<PATH_TO_KEY_FILE>
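To avoid copying the address manually, it can be read from the key file with jq, as in this sketch:

# Read the service account email address from the key file
$ EMAIL=$(jq -r .client_email <PATH_TO_KEY_FILE>)

# Activate the service account using the extracted address
$ gcloud auth activate-service-account "${EMAIL}" --key-file=<PATH_TO_KEY_FILE>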

The command gcloud auth list lists all credentialed accounts and marks the active one. The gcloud config list command can be used to check the current configuration:
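# Show all credentialed accounts; the active account is marked with an asterisk
$ gcloud auth list

# Show the current gcloud configuration (active account, project, etc.)
$ gcloud config list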

List files in the bucket

The following commands list the contents of the bucket:

# List all date folders in the bucket
$ gsutil ls gs://ocff-<TENANT_NAME>-<TIER>-public-events-export-bucket

gs://ocff-mytenant-pre-public-events-export-bucket/2024-01-16/
gs://ocff-mytenant-pre-public-events-export-bucket/2024-01-17/
gs://ocff-mytenant-pre-public-events-export-bucket/2024-01-18/
gs://ocff-mytenant-pre-public-events-export-bucket/2024-01-19/
...

# List all files for a specific date
$ gsutil ls gs://ocff-<TENANT_NAME>-<TIER>-public-events-export-bucket/2024-02-01

gs://ocff-mytenant-pre-public-events-export-bucket/2024-02-01/000000000000.json.gz
gs://ocff-mytenant-pre-public-events-export-bucket/2024-02-01/000000000001.json.gz

Download the whole bucket

Individual files or the entire bucket can be downloaded to a local folder. To copy the entire bucket recursively:

$ gsutil cp -r gs://ocff-<TENANT_NAME>-<TIER>-public-events-export-bucket ./
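To fetch only a single day or a single file instead of the whole bucket, point gsutil cp at the corresponding folder or object, as in this sketch using the example date from above:

# Download all event files for a single day
$ gsutil -m cp -r gs://ocff-<TENANT_NAME>-<TIER>-public-events-export-bucket/2024-02-01 ./

# Download one specific file
$ gsutil cp gs://ocff-<TENANT_NAME>-<TIER>-public-events-export-bucket/2024-02-01/000000000000.json.gz ./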
