AU-05 - Audit Process - Migrate logs to S3 buckets
💡 Summary
As a system architect, I need to ensure that log data is stored in a secure location approved for ATO validation.
Motivation and context
Migrate log deliveries from CloudTrail / CloudWatch to the DHS-required S3 bucket.
Acceptance criteria
- [x] Capture evidence from the current CloudTrail and CloudWatch log groups identifying log delivery
- [x] Move log delivery for all scripts and components from CloudTrail / CloudWatch to S3
- [x] Ensure CloudTrail logs data events in addition to management events
- [x] Application logs (CloudWatch) and CloudTrail logs are delivered to the prescribed bucket(s)
- [x] Log retrieval tests complete successfully
- [ ] Data can be retrieved and viewed in new Audit system
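The two retrieval criteria above can be smoke-tested with a short script. This is a hedged sketch, not the project's actual test: the bucket name, account ID, and region are placeholders, and it assumes the standard CloudTrail key layout (`AWSLogs/<account>/CloudTrail/<region>/<YYYY>/<MM>/<DD>/`).

```python
import datetime

def cloudtrail_prefix(account_id, region, day):
    """Build the standard CloudTrail S3 key prefix for one day.

    CloudTrail writes objects under
    AWSLogs/<account>/CloudTrail/<region>/<YYYY>/<MM>/<DD>/.
    """
    return (f"AWSLogs/{account_id}/CloudTrail/{region}/"
            f"{day.year:04d}/{day.month:02d}/{day.day:02d}/")

def retrieval_test(bucket, account_id, region, day):
    """List that day's CloudTrail objects and fetch one to prove read access."""
    # boto3 imported here so cloudtrail_prefix stays testable without AWS creds.
    import boto3
    s3 = boto3.client("s3")
    resp = s3.list_objects_v2(
        Bucket=bucket, Prefix=cloudtrail_prefix(account_id, region, day))
    keys = [obj["Key"] for obj in resp.get("Contents", [])]
    assert keys, "no CloudTrail logs delivered for this day"
    s3.get_object(Bucket=bucket, Key=keys[0])  # end-to-end read check
    return keys
```

Running `retrieval_test("example-audit-bucket", "123456789012", "us-east-1", datetime.date.today())` against the real bucket would confirm delivery and readability for today's logs.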
CloudWatch and CloudTrail
@Matthew-Grayson / @colin-tim can you please also revert the changes in https://github.com/cisagov/crossfeed/compare/d72c02c041cd9aea7a0c561a3997ce65a0c46044...master? Then we can start afresh with PRs for this task.
In Crossfeed commercial, CloudTrail logging of data events is now actively managed by Terraform, including CloudWatch log delivery and storage in a dedicated CloudTrail S3 bucket. We still need to implement delivery of the remaining CloudWatch logs to their respective buckets, and all of these tasks still need to be implemented in Crossfeed GovCloud.
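For illustration, the data-event configuration that Terraform manages amounts to the following, expressed here with boto3's `put_event_selectors`; the trail name and bucket ARNs are assumptions, not the project's actual values.

```python
def data_event_selectors(bucket_arns):
    """Event selectors capturing management events plus S3 object-level
    (data) events for the given bucket ARNs."""
    return [{
        "ReadWriteType": "All",
        "IncludeManagementEvents": True,
        "DataResources": [{
            "Type": "AWS::S3::Object",
            # A trailing "/" scopes logging to objects within each bucket.
            "Values": [arn + "/" for arn in bucket_arns],
        }],
    }]

def apply_selectors(trail_name, bucket_arns):
    """Attach the selectors to an existing trail (hypothetical names)."""
    import boto3
    cloudtrail = boto3.client("cloudtrail")
    cloudtrail.put_event_selectors(
        TrailName=trail_name,
        EventSelectors=data_event_selectors(bucket_arns))
```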
@epicfaace We're seeing that there's no built-in way to continuously export CloudWatch logs to an S3 bucket.
Does this guide to creating a Lambda function for it make sense? https://medium.com/dnx-labs/exporting-cloudwatch-logs-automatically-to-s3-with-a-lambda-function-80e1f7ea0187
On the commercial side, CloudWatch log groups are now backed up to S3 daily using a Lambda function called `cloudwatchToS3`.
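A minimal sketch of what a `cloudwatchToS3`-style Lambda does: export each log group's last 24 hours of events to S3 via `CreateExportTask`. The destination bucket name is a placeholder, and the actual Lambda in the repo may differ.

```python
import time

def export_window(now_ms=None, hours=24):
    """Return a (from, to) pair of millisecond timestamps for the export."""
    if now_ms is None:
        now_ms = int(time.time() * 1000)
    return now_ms - hours * 3600 * 1000, now_ms

def build_export_task(log_group, bucket, from_ms, to_ms):
    """Build the kwargs for logs.create_export_task()."""
    return {
        "logGroupName": log_group,
        "fromTime": from_ms,
        "toTime": to_ms,
        "destination": bucket,
        # Derive a flat S3 prefix from the log group name.
        "destinationPrefix": log_group.strip("/").replace("/", "-"),
    }

def handler(event, context):
    # boto3 is available in the Lambda runtime; imported here so the
    # helpers above stay testable without AWS credentials.
    import boto3
    logs = boto3.client("logs")
    frm, to = export_window()
    for page in logs.get_paginator("describe_log_groups").paginate():
        for group in page["logGroups"]:
            task = build_export_task(
                group["logGroupName"], "example-audit-bucket", frm, to)
            # CloudWatch Logs allows one running export task per account at a
            # time; a production version would poll describe_export_tasks and
            # wait for each task to finish before starting the next.
            logs.create_export_task(**task)
```

Scheduling the handler with an EventBridge (CloudWatch Events) daily rule would match the "backed up to S3 daily" behavior described above.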