Event export

All fulfillmenttools events are logged and made available in a daily export. The daily event files are saved to a Google Cloud Storage bucket, enabling you to leverage the data from your fulfillment process for further processing and analytics.

This functionality is not activated by default. If you would like to use it, please contact us at [email protected].

Accessing the bucket

Google offers multiple client libraries for accessing Google Cloud Storage buckets, and you can also use the Google Cloud Storage API directly. Another option is the gcloud CLI from the Google Cloud SDK.

You will need a Google service account to use any of these methods. On request, fulfillmenttools will provide you with a service account key that you use for authentication. The service account can only be used to access this particular bucket; it cannot be used to access the fulfillmenttools API or the client applications.

The service account key will be a file named {YOUR_TENANT_NAME}-<key_id>.json. This file contains a JSON representation of the service account.

Since this file contains the key that grants access to the bucket, make sure to store it securely. See the official Google documentation for further information.
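
As an illustration, here is a minimal sketch using Google's Python client library (google-cloud-storage). The key file and bucket names are placeholders for your tenant-specific values:

from google.cloud import storage  # pip install google-cloud-storage

# Authenticate with the service account key file provided by fulfillmenttools
# (the file name is a placeholder for your tenant-specific key file).
client = storage.Client.from_service_account_json("mytenant-<key_id>.json")

# List the contents of your tenant's export bucket
# (the bucket name is a placeholder, see "File and folder structure" below).
for blob in client.list_blobs("ocff-mytenant-pre-public-events-export-bucket"):
    print(blob.name)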

File and folder structure

The bucket is named ocff-{YOUR_TENANT_NAME}-<tier>-public-events-export-bucket and can be accessed in a variety of ways. Access to this bucket is restricted, so only authorized accounts can access this data.

For each day, a folder (named with the pattern YYYY-MM-DD) is created in the bucket, containing that day's events:

(Image: bucket folder structure)

Each folder contains one or more gzipped files. Each file contains multiple lines; each line corresponds to one event in the fulfillmenttools platform:

(Image: files inside a daily folder)

Each line of the file is a JSON object containing some metadata fields and the event payload:

{
  "id": "b0a0e08e-d7d0-4694-ad6c-bbf47b3fc1c6",
  "eventName": "order-created_event-v1",
  "payload": {
    ...  // Event
  },
  "created": "2024-02-01T09:20:39.411067+00:00"
}
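
Since each file is gzip-compressed and contains one JSON object per line, it can be processed with standard tooling. A minimal Python sketch, assuming a downloaded file named 000000000000.json.gz:

import gzip
import json

# Each line of the decompressed file is one event object.
with gzip.open("000000000000.json.gz", "rt", encoding="utf-8") as f:
    for line in f:
        event = json.loads(line)
        print(event["id"], event["eventName"], event["created"])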

Retention

Data in the bucket is retained for 90 days; after that, it is deleted.

gsutil example

Here's an example of how to copy files from a Google Cloud Storage bucket using Google's gsutil CLI.

Check the official Google documentation about gsutil for further details.

Activate the Service Account

The service account's email address can be found in the provided key file under client_email. Use it to activate the service account:

$ gcloud auth activate-service-account <service account email id> --key-file=<Path to the key file>

You can use gcloud auth list to list all credentialed accounts and check the active one. Also, you may want to check your setup using gcloud config list.

List files in the bucket

Here's how to list the contents of the bucket:

$ gsutil ls gs://ocff-<tenant_name>-<tier>-public-events-export-bucket

gs://ocff-mytenant-pre-public-events-export-bucket/2024-01-16/
gs://ocff-mytenant-pre-public-events-export-bucket/2024-01-17/
gs://ocff-mytenant-pre-public-events-export-bucket/2024-01-18/
gs://ocff-mytenant-pre-public-events-export-bucket/2024-01-19/
...

$ gsutil ls gs://ocff-<tenant_name>-<tier>-public-events-export-bucket/2024-02-01

gs://ocff-mytenant-pre-public-events-export-bucket/2024-02-01/000000000000.json.gz
gs://ocff-mytenant-pre-public-events-export-bucket/2024-02-01/000000000001.json.gz

Download the whole bucket

You can either download individual files or the whole bucket to a local folder:

$ gsutil cp -r gs://ocff-<tenant_name>-<tier>-public-events-export-bucket ./
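
If you prefer a client library over gsutil, a single day's folder can be downloaded programmatically as well. A sketch in Python (key file and bucket names are placeholders):

from pathlib import Path
from google.cloud import storage

client = storage.Client.from_service_account_json("mytenant-<key_id>.json")

# Download all export files of one day into a local folder of the same name.
prefix = "2024-02-01/"
Path(prefix).mkdir(parents=True, exist_ok=True)
for blob in client.list_blobs("ocff-mytenant-pre-public-events-export-bucket", prefix=prefix):
    if blob.name.endswith("/"):  # skip folder placeholder objects
        continue
    blob.download_to_filename(blob.name)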
