[Spark] Add an integration test for DynamoDB Commit Owner
#### Which Delta project/connector is this regarding?
- [X] Spark
- [ ] Standalone
- [ ] Flink
- [ ] Kernel
- [ ] Other (fill in here)
## Description
Adds an integration test for the DynamoDB Commit Owner. It covers the following scenarios:
- Automated DynamoDB table creation
- Concurrent reads and writes
- Table upgrade and downgrade
The first half of the test is heavily borrowed from `dynamodb_logstore.py`.
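To give a feel for the concurrent read/write scenario, here is a minimal sketch (not the PR's actual test code: the table path, worker and commit counts, and Spark session setup are all placeholder assumptions; the real test additionally wires up the DynamoDB commit owner configs):

```python
import os
import threading

from pyspark.sql import SparkSession

# Placeholder session setup: the real test also configures the DynamoDB
# commit owner and S3 credentials, which are omitted here.
spark = SparkSession.builder.appName("commit-owner-sketch").getOrCreate()

# Reuses the S3_BUCKET/RUN_ID environment variables described under
# "How was this patch tested?" below.
table_path = f"s3a://{os.environ['S3_BUCKET']}/commit-owner-test-{os.environ['RUN_ID']}"

def writer(worker_id: int, num_commits: int) -> None:
    # Blind appends from several threads: the commit owner must serialize
    # them into distinct Delta versions without losing any commit.
    for i in range(num_commits):
        df = spark.createDataFrame([(worker_id, i)], ["worker", "commit"])
        df.write.format("delta").mode("append").save(table_path)

threads = [threading.Thread(target=writer, args=(w, 10)) for w in range(5)]
for t in threads:
    t.start()
for t in threads:
    t.join()

# 5 workers x 10 single-row commits: a consistent read must see all 50 rows.
assert spark.read.format("delta").load(table_path).count() == 50
```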
## How was this patch tested?
The test runs successfully against real DynamoDB and S3. Set the following environment variables (after configuring the credentials in `~/.aws/credentials`):
```bash
export S3_BUCKET=<bucket_name>
export AWS_PROFILE=<profile_name>
export RUN_ID=<random_run_id>
export AWS_DEFAULT_REGION=<region_that_matches_configured_ddb_region>
```
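For reference, a test script could pick these up along the following lines (a sketch; the path format is made up, and `AWS_PROFILE` is consumed by the `ProfileCredentialsProvider` itself rather than by the script):

```python
import os

s3_bucket = os.environ["S3_BUCKET"]        # bucket holding the test table
run_id = os.environ["RUN_ID"]              # isolates paths/tables per run
region = os.environ["AWS_DEFAULT_REGION"]  # must match the DynamoDB table's region

# Hypothetical path layout, keyed by run id so reruns don't collide.
table_path = f"s3a://{s3_bucket}/dynamodb-commit-owner-it-{run_id}"
```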
Ran the test:
```bash
# The first --dbb-conf entry sets the credentials provider for Delta's
# DynamoDB client; the second configures the S3A filesystem for table data.
./run-integration-tests.py --use-local --run-dynamodb-commit-owner-integration-tests \
    --dbb-conf io.delta.storage.credentials.provider=com.amazonaws.auth.profile.ProfileCredentialsProvider \
    spark.hadoop.fs.s3a.aws.credentials.provider=com.amazonaws.auth.profile.ProfileCredentialsProvider
```
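As an optional sanity check on the automated table creation scenario, the commit-owner table can be described with boto3 from outside the test. The table name below is a placeholder; the real name is determined by the test run:

```python
import os

import boto3

ddb = boto3.client("dynamodb", region_name=os.environ["AWS_DEFAULT_REGION"])
# Placeholder table name: substitute whatever name the test run created.
desc = ddb.describe_table(TableName="delta-commit-owner-test-table")
assert desc["Table"]["TableStatus"] == "ACTIVE"
```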