Public Event Export

How to retrieve public event export files from Google Cloud Storage

The architecture of the fulfillmenttools platform follows the MACH principles. Business features are implemented using independent microservices which send and consume events.

You can get access to all of these (business) events from the platform via a daily export. The export files are stored in a cloud storage bucket, enabling you to leverage the data from your fulfillment process for additional processing and analytics. Note that this functionality is not active by default on your tenant. If you would like to use it, please get in touch.

The bucket is named ocff-<tenant_name>-<tier>-public-events-export-bucket and can be accessed in a variety of ways. Access is restricted so that only authorized accounts can read this data.

Accessing the Bucket

Multiple client libraries exist for accessing Google Cloud Storage buckets, and you can also use the Cloud Storage API directly. Another option is the gcloud CLI from the Google Cloud SDK.
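
For illustration, here is a minimal sketch using the Python client library (google-cloud-storage); the bucket name and day folder are placeholders, and authentication is assumed to be set up with the service account key described below:

from google.cloud import storage  # pip install google-cloud-storage

# Assumes credentials are available, e.g. via the GOOGLE_APPLICATION_CREDENTIALS
# environment variable pointing at the provided service account key file.
client = storage.Client()

# Replace with your tenant's bucket name.
bucket_name = "ocff-mytenant-pre-public-events-export-bucket"

# List all export files for a single day folder.
for blob in client.list_blobs(bucket_name, prefix="2024-02-01/"):
    print(blob.name, blob.size)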

To use any of these methods, you need a Google service account. On request, fulfillmenttools will provide you with a service account key that you use for authentication. The service account can only be used to access this particular bucket; it cannot be used to access the fulfillmenttools API or the client apps.

The service account key will be a file named ocff-<tenant_name>-<tier>-<key_id>.json. This file contains a JSON representation of the service account.

Since this key file grants access to the bucket, make sure to store it securely. See the official documentation for further information.
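
If you work with a client library, you can also load the key file directly instead of relying on environment variables. Here is a minimal Python sketch, using an illustrative file name that follows the pattern above:

from google.cloud import storage

# Create a client that authenticates as the service account from the key file.
client = storage.Client.from_service_account_json("ocff-mytenant-pre-abc123.json")

# This client can only access the public event export bucket.
bucket = client.bucket("ocff-mytenant-pre-public-events-export-bucket")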

File / Folder Structure

For each day, a folder following the pattern YYYY-MM-DD is created in the bucket, containing the events of that day.

Each folder contains one or more gzipped files. Each of these files contains multiple lines; each line corresponds to one event in the fulfillmenttools platform.

Each line of the file is a JSON object containing some metadata fields and the event payload:

{
  "id": "b0a0e08e-d7d0-4694-ad6c-bbf47b3fc1c6",
  "eventName": "order-created_event-v1",
  "payload": {
    ...  // the event
  },
  "created": "2024-02-01T09:20:39.411067+00:00"
}
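
To process an export, download a file and read it line by line. The following Python sketch uses illustrative file and object names; adjust them to your tenant:

import gzip
import json

from google.cloud import storage

client = storage.Client.from_service_account_json("ocff-mytenant-pre-abc123.json")
bucket = client.bucket("ocff-mytenant-pre-public-events-export-bucket")

# Download one gzipped export file from a day folder.
blob = bucket.blob("2024-02-01/000000000000.json.gz")
blob.download_to_filename("000000000000.json.gz")

# Each line is one JSON object with the structure shown above.
with gzip.open("000000000000.json.gz", mode="rt", encoding="utf-8") as f:
    for line in f:
        event = json.loads(line)
        print(event["id"], event["eventName"], event["created"])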

Retention

Data in the bucket is retained for 90 days; after that, it is deleted.

Example

Here's an example of how to copy files from the Cloud Storage bucket using the gcloud and gsutil CLI tools:

Activate the Service Account

The service account's email address can be found in the provided key file under client_email. Use it to activate the service account:

$ gcloud auth activate-service-account <service account email> --key-file=<path to the key file>

You can use gcloud auth list to list all credentialed accounts and check the active one. Also, you may want to check your setup using gcloud config list.

List Files in the Bucket

Here's how to list the contents of the bucket:

$ gsutil ls gs://ocff-<tenant_name>-<tier>-public-events-export-bucket

gs://ocff-mytenant-pre-public-events-export-bucket/2024-01-16/
gs://ocff-mytenant-pre-public-events-export-bucket/2024-01-17/
gs://ocff-mytenant-pre-public-events-export-bucket/2024-01-18/
gs://ocff-mytenant-pre-public-events-export-bucket/2024-01-19/
...

$ gsutil ls gs://ocff-<tenant_name>-<tier>-public-events-export-bucket/2024-02-01

gs://ocff-mytenant-pre-public-events-export-bucket/2024-02-01/000000000000.json.gz
gs://ocff-mytenant-pre-public-events-export-bucket/2024-02-01/000000000001.json.gz

Download the Whole Bucket

You can either download individual files or the whole bucket to a local folder:

$ gsutil cp -r gs://ocff-<tenant_name>-<tier>-public-events-export-bucket ./

Check the official documentation on Google's gsutil tool for further details.
