Public Event Export
The architecture of the fulfillmenttools platform follows the MACH principles. Business features are implemented using independent microservices which send and consume events.
You can access all of these business events from the platform via a daily export. The export files are stored in a cloud storage bucket, which enables you to use the data from your fulfillment process for additional processing and analytics. Note that this functionality is not active by default on your tenant. If you would like to use it, please get in touch.
The bucket is named ocff-<tenant_name>-<tier>-public-events-export-bucket and can be accessed in a variety of ways. Access to this bucket is restricted so that only authorized accounts can access this data.
Multiple client libraries exist for accessing Google Cloud Storage buckets, and you can also use the Cloud Storage API directly. Another alternative is the gcloud CLI from the Google Cloud SDK.
To use any of these methods you need a Google service account. On request, fulfillmenttools will provide you with a service account key that you use for authentication. This service account can only be used to access this particular bucket; it cannot be used to access the fulfillmenttools API or the client apps.
The service account key is a file named ocff-<tenant_name>-<tier>-<key_id>.json, which contains a JSON representation of the service account. Since this key grants access to the bucket, make sure to store it securely. See the official documentation for further information.
For each day, a folder following the pattern YYYY-MM-DD is created in the bucket, containing the events of the corresponding day.
Each folder contains one or more gzipped files. Each of these files contains multiple lines; one line corresponds to one event in the fulfillmenttools platform.
Each line of the file is a JSON object containing some metadata fields and the event payload:
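A hypothetical example of such a line, pretty-printed here for readability (the field names and values are illustrative assumptions, not the platform's exact schema):

```json
{
  "eventId": "3f2504e0-4f89-41d3-9a0c-0305e82c3301",
  "eventType": "order-created",
  "occurredAt": "2024-01-15T08:30:00.000Z",
  "tenant": "<tenant_name>",
  "payload": {
    "orderId": "c7b1f4e2"
  }
}
```

Since the files are gzipped and contain one JSON object per line, a command like zcat <file>.gz | head -n 1 is a quick way to inspect a single event locally.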
Data in the bucket is stored for 90 days; after that, it is deleted.
Here's an example of how to copy files from a Cloud Storage bucket using the gsutil CLI tool:
Activate the Service Account
The service account's email address can be found in the provided key file under client_email.
Use this to switch to the service account:
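A minimal sketch, assuming the key file name from above (the account argument can be omitted because gcloud reads client_email from the key file):

```bash
# Activate the service account with the downloaded key file.
# gcloud reads the account's email (client_email) from the file.
gcloud auth activate-service-account \
    --key-file=ocff-<tenant_name>-<tier>-<key_id>.json
```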
You can use gcloud auth list to list all credentialed accounts and check which one is active. Also, you may want to verify your setup using gcloud config list.
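For example:

```bash
# Show all credentialed accounts; the active account is marked with '*'.
gcloud auth list

# Show the active configuration (account, project, etc.).
gcloud config list
```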
List Files in the Bucket
Here's how to list the contents of the bucket:
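A sketch using gsutil ls with the bucket name from above (the day folder is a placeholder):

```bash
# List the day folders at the top level of the export bucket.
gsutil ls gs://ocff-<tenant_name>-<tier>-public-events-export-bucket

# List the files of a single day.
gsutil ls gs://ocff-<tenant_name>-<tier>-public-events-export-bucket/2024-01-15/
```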
Download the Whole Bucket
You can either download individual files or the whole bucket to a local folder:
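For example (the -m flag enables parallel transfers; the local folder and day are placeholders):

```bash
# Download a single day's folder into the current directory.
gsutil -m cp -r gs://ocff-<tenant_name>-<tier>-public-events-export-bucket/2024-01-15 .

# Download the entire bucket into a local folder.
gsutil -m cp -r gs://ocff-<tenant_name>-<tier>-public-events-export-bucket ./events-export
```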
Check the official documentation on Google's gsutil tool for further details.