link:http://aws.amazon.com/sqs/[Amazon SQS] queue. The S3 Event Notifications
should be configured to funnel events to a single SQS queue per table. The
Delta S3 Loader will take **all** messages from a single queue and insert those
into a single table.

Additionally, for source buckets that contain _multiple_ types of data, you can use link:https://docs.aws.amazon.com/AmazonS3/latest/userguide/notification-how-to-filtering.html[filtering on event notifications] to scope delivery to specific object key prefixes or suffixes.
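As a sketch, a bucket notification configuration that routes only objects under one prefix to an SQS queue might look like the following (the queue ARN, prefix, and suffix values are placeholders):

[source,json]
----
{
  "QueueConfigurations": [
    {
      "QueueArn": "arn:aws:sqs:us-east-1:123456789012:example-queue",
      "Events": ["s3:ObjectCreated:*"],
      "Filter": {
        "Key": {
          "FilterRules": [
            { "Name": "prefix", "Value": "admin_console/" },
            { "Name": "suffix", "Value": ".json" }
          ]
        }
      }
    }
  ]
}
----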
For example, consider a bucket named `audit_logs` that contains data under the prefixes:

* `databricks/workspaceId=123/*.json`
* `tableau/*.json`
* `admin_console/domain=github.com/*.json`

A deployment of Delta S3 Loader to _only_ process the `admin_console` events
into a Delta table would require the following event configuration:
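Since the loader's exact configuration schema is not shown here, the sketch below uses illustrative field names (assumptions, not the real keys) to convey the shape of such a configuration:

[source,yaml]
----
# Field names are illustrative -- consult the Delta S3 Loader reference for the actual schema.
table_uri: "s3://delta-tables/audit_logs/admin_console" # <1>
partition_columns: # <2>
  - "domain"
queue_arn: "arn:aws:sqs:us-east-1:123456789012:admin-console-events" # <3>
----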
<1> Specify a destination Delta Lake table path in S3.
<2> Annotate the partition columns to help the loader partition data properly.
<3> Specify the input SQS queue by its ARN.

== Environment Variables

When running in an AWS Lambda, Delta S3 Loader should be configured solely with environment variables. In standalone mode, the daemon can be configured with command-line options _or_ environment variables.

|===
| Name | Required | Description

| `RUST_LOG`
| No
| Define the log level for the process: `error`, `warn`, `info`, `debug`.
|===
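For example, in standalone mode the log level could be raised in the shell before launching the daemon (the binary name below is an assumption, shown commented out):

```shell
# Raise the log level for this session; valid values: error, warn, info, debug.
export RUST_LOG=info
echo "RUST_LOG=${RUST_LOG}"

# Hypothetical launch of the daemon -- substitute the actual binary name/path:
# ./delta-s3-loader
```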

=== Authentication/Authorization

Delta S3 Loader assumes that the right AWS environment variables, such as
`AWS_ACCESS_KEY_ID` and `AWS_SECRET_ACCESS_KEY` are defined in the environment.
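As an illustration, the credentials could be exported before starting the process. The values below are AWS's documented example credentials, not real keys, and depending on your SDK setup a region variable may also be needed:

```shell
# Placeholder credentials for illustration only -- never hard-code real keys.
export AWS_ACCESS_KEY_ID="AKIAIOSFODNN7EXAMPLE"
export AWS_SECRET_ACCESS_KEY="wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY"
export AWS_DEFAULT_REGION="us-east-1"
echo "credentials configured for key ${AWS_ACCESS_KEY_ID}"
```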
Delta S3 Loader itself is not responsible for
authentication/authorization, so please consult the