# Event export

The system logs all [fulfillmenttools events](https://docs.fulfillmenttools.com/documentation/getting-started/eventing/available-events) and makes them available in a daily export. The daily event files are saved in a [Google Cloud Storage](https://cloud.google.com/storage/docs) bucket, enabling users to leverage data from their fulfillment process for additional processing and analytics.

{% hint style="info" %}
Event export functionality isn't activated by default. Use the [contact form](https://ocfulfillment.atlassian.net/servicedesk/customer/portal/1/group/11/create/39) to ask us to enable it for you.
{% endhint %}

## Accessing the bucket

[Google offers multiple client libraries](https://cloud.google.com/storage/docs/reference/libraries) for accessing Google Cloud Storage buckets, and you can also use the [Google Cloud Storage API](https://cloud.google.com/storage/docs/apis) directly. Another alternative is to use the [gcloud CLI](https://cloud.google.com/sdk/docs/install) from the Google Cloud SDK.

Accessing the bucket requires a [Google service account](https://cloud.google.com/docs/authentication#service-accounts). Upon request, fulfillmenttools provides a service account key for authentication. This service account grants access only to this specific bucket. It can't be used to access the fulfillmenttools API or its client applications.

The service account key will be a file named `{projectId}-{key_id}.json`. This file contains a JSON representation of the service account.

{% hint style="info" %}
This file contains the service account key and allows access to the bucket, so make sure to store it securely. See the [official Google documentation](https://cloud.google.com/iam/docs/best-practices-for-managing-service-account-keys?_ga=2.192306863.-481233758.1659361152) for further information.
{% endhint %}

## File and folder structure

The bucket is named `{projectId}-{TIER}-public-events-export-bucket` and is only accessible by authorized accounts.

For each day, the system creates a folder in the bucket with the pattern `YYYY-MM-DD` that contains the events for that day.

<figure><img src="https://content.gitbook.com/content/Lrrr5jgTsDuR38gNJIrm/blobs/roI2mVP3IA8DM7gg5UH5/Bildschirmfoto%202024-02-06%20um%2009.11.16.png" alt="Screenshot of a Google Cloud Storage bucket showing folders named by date."><figcaption><p>Bucket folder structure</p></figcaption></figure>
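Since folder names follow this date pattern, the path for a given day can be constructed with standard shell tools. A minimal sketch, using a hypothetical bucket name:

```shell
# Hypothetical bucket name following the {projectId}-{TIER} naming pattern
BUCKET="gs://ocff-mytenant-pre-public-events-export-bucket"

# Daily folders are named after the pattern YYYY-MM-DD
DAY=$(date -u +%Y-%m-%d)
echo "${BUCKET}/${DAY}/"
```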

Each folder contains one or more gzipped files, and each line within these files corresponds to a single event in fulfillmenttools.

<figure><img src="https://content.gitbook.com/content/Lrrr5jgTsDuR38gNJIrm/blobs/CQmw3IzrHv8BiTBQLzTu/Bildschirmfoto%202024-02-06%20um%2009.13.24.png" alt="Screenshot of gzipped JSON files inside a date folder."><figcaption><p>Files inside folder</p></figcaption></figure>

Each line of a file is a JSON object containing metadata fields (`id`, `eventName`, `created`) and the event `payload`.

```json
{
  "id": "b0a0e08e-d7d0-4694-ad6c-bbf47b3fc1c6",
  "eventName": "order-created_event-v1",
  "payload": {
    ...  // Event
  },
  "created": "2024-02-01T09:20:39.411067+00:00"
}
```
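Since every line is a standalone JSON object, the files can be processed line by line with standard tools. A minimal sketch (the file name and both event lines are fabricated samples, with emptied payloads for brevity):

```shell
# Build a sample export file with two fabricated event lines
printf '%s\n' \
  '{"id":"b0a0e08e-d7d0-4694-ad6c-bbf47b3fc1c6","eventName":"order-created_event-v1","payload":{},"created":"2024-02-01T09:20:39.411067+00:00"}' \
  '{"id":"5f6d9c2a-0c1e-4e5b-8f3a-2d7b1a9c4e88","eventName":"order-cancelled_event-v1","payload":{},"created":"2024-02-01T09:21:03.000000+00:00"}' \
  | gzip > 000000000000.json.gz

# Decompress and print the eventName of every line
gunzip -c 000000000000.json.gz \
  | python3 -c 'import json, sys
for line in sys.stdin:
    print(json.loads(line)["eventName"])'
# -> order-created_event-v1
# -> order-cancelled_event-v1
```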

## Retention

The system stores data in the buckets for 90 days. After this period, the data is automatically deleted.

## gcloud CLI example

The following example demonstrates how to copy files from a Google Cloud Storage bucket using the [Google Cloud CLI](https://cloud.google.com/sdk/docs/initializing).

{% hint style="info" %}
Check the official [gcloud storage documentation](https://cloud.google.com/sdk/gcloud/reference/storage) for further details.
{% endhint %}

### Activate the service account

The service account email ID is located in the `client_email` field of the provided key file. Use this ID to activate the service account:

```sh
$ gcloud auth activate-service-account {SERVICE_ACCOUNT_EMAIL_ID} --key-file={PATH_TO_KEY_FILE}
```
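The email ID itself can be extracted from the key file with a one-liner. A sketch using a fabricated, truncated sample key (real key files contain additional fields, including the private key):

```shell
# Fabricated sample key file; in practice, use the file provided by fulfillmenttools
cat > key-file.json <<'EOF'
{"type": "service_account", "client_email": "events-export@ocff-mytenant.iam.gserviceaccount.com"}
EOF

# Read the service account email ID from the client_email field
SERVICE_ACCOUNT_EMAIL_ID=$(python3 -c 'import json; print(json.load(open("key-file.json"))["client_email"])')
echo "$SERVICE_ACCOUNT_EMAIL_ID"
# -> events-export@ocff-mytenant.iam.gserviceaccount.com
```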

Run `gcloud auth list` to list all credentialed accounts and see which one is active, and `gcloud config list` to review the current configuration.

### List files in the bucket

The following commands list the contents of the bucket:

```sh
# List all date folders in the bucket
$ gcloud storage ls gs://{projectId}-{TIER}-public-events-export-bucket

gs://ocff-mytenant-pre-public-events-export-bucket/2024-01-16/
gs://ocff-mytenant-pre-public-events-export-bucket/2024-01-17/
gs://ocff-mytenant-pre-public-events-export-bucket/2024-01-18/
gs://ocff-mytenant-pre-public-events-export-bucket/2024-01-19/
...

# List all files for a specific date
$ gcloud storage ls gs://{projectId}-{TIER}-public-events-export-bucket/2024-02-01

gs://ocff-mytenant-pre-public-events-export-bucket/2024-02-01/000000000000.json.gz
gs://ocff-mytenant-pre-public-events-export-bucket/2024-02-01/000000000001.json.gz
```

### Download the whole bucket

Individual files or the entire bucket can be downloaded to a local folder:

```sh
$ gcloud storage cp --recursive gs://{projectId}-{TIER}-public-events-export-bucket {LOCAL_FOLDER}
```
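Once downloaded, the files can be processed locally. The sketch below simulates a downloaded date folder (in practice it would be created by `gcloud storage cp`) and counts the events for that day:

```shell
# Simulate a downloaded date folder with one fabricated event file
mkdir -p export/2024-02-01
printf '%s\n' '{"id":"1","eventName":"order-created_event-v1","payload":{},"created":"2024-02-01T09:20:39.411067+00:00"}' \
  | gzip > export/2024-02-01/000000000000.json.gz

# Count the total number of events across all files of that day
gunzip -c export/2024-02-01/*.json.gz | wc -l
```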
