ddbimport
Step Function not installing
Hi,
ddbimport looks great. I used it from my local PC and it works OK, just slowly, so I'm trying the Step Function version and getting stuck.
C:\Users\antho\Desktop>ddbimport -install -stepFnRegion=eu-west-2
{"level":"info","ts":1641669772.9100273,"caller":"cmd/main.go:215","msg":"installing ddbimport Step Function","v":""}
{"level":"info","ts":1641669776.592445,"caller":"cmd/main.go:255","msg":"uploading Lambda zip","v":""}
{"level":"info","ts":1641669824.615106,"caller":"cmd/main.go:285","msg":"zip upload complete","v":""}
{"level":"info","ts":1641669824.615106,"caller":"cmd/main.go:288","msg":"updating ddbimport stack","v":""}
{"level":"info","ts":1641669825.3889296,"caller":"cmd/main.go:316","msg":"ddbimport step function succesfully deployed","v":""}
Seems OK, no errors, but running the remote import dies with:
{"level":"fatal","ts":1641670138.7249568,"caller":"cmd/main.go:352","msg":"ddbimport state machine not found. Have you deployed the ddbimport Step Function?"}
If I log in to the AWS Console, there's nothing in the Step Functions admin, so it looks like it's not installed. I've looked at the code, but I'm not a Go developer, so I'm getting bogged down.
Thanks
Hi, just a tip, since you mention that uploading is slow: make sure your table has its read/write capacity set to "On-Demand"; otherwise AWS will throttle your import requests.
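If the table was created in provisioned mode, it can be switched to on-demand from the AWS CLI. A minimal sketch (the table name `mytable` is a placeholder, not from this thread):

```shell
# Switch an existing DynamoDB table to on-demand (pay-per-request) billing.
# Replace "mytable" with the name of your import target table.
aws dynamodb update-table \
  --table-name mytable \
  --billing-mode PAY_PER_REQUEST
```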
Hi @Tony-OAA - sorry to hear the Step Function isn't getting installed there. The installation creates a CloudFormation stack, which you should be able to see in the AWS Console.
The most common problems are:
- Installing in the wrong account (the tool uses your local AWS creds or environment, so it's pretty easy to install it in the wrong account).
- Installing into, or looking in the wrong region (I think everyone who's ever worked on AWS has done this!).
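To rule both of those out, you can check which account and region your local credentials resolve to, and whether the state machine shows up there. These are standard AWS CLI calls; `eu-west-2` matches the region used in the install command above:

```shell
# Which AWS account are my local credentials actually pointing at?
aws sts get-caller-identity

# Which region is my default profile configured for?
aws configure get region

# Is the ddbimport state machine visible in the region I installed into?
aws stepfunctions list-state-machines --region eu-west-2
```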
It could be that the CloudFormation stack failed in some way, so I'd be interested in seeing the output of the CloudFormation stack.
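If the stack exists but failed, its event log usually says why. A sketch, assuming the stack is named `ddbimport` (check the actual stack name in the CloudFormation console):

```shell
# Current status of the stack (e.g. CREATE_COMPLETE, ROLLBACK_COMPLETE).
aws cloudformation describe-stacks \
  --stack-name ddbimport \
  --query 'Stacks[0].StackStatus' \
  --region eu-west-2

# Full event history, including the reason for the first failure.
aws cloudformation describe-stack-events \
  --stack-name ddbimport \
  --region eu-west-2
```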
I'm happy to catch up on a video chat to troubleshoot after UK working hours, e.g. 6pm.
For me, I needed to replace `"S3Key": "/ddbimport.zip"` in the CloudFormation template with `"S3Key": "ddbimport.zip"`.
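In other words, the Lambda code reference in the generated template should end up looking roughly like this (a sketch; the bucket name is a hypothetical placeholder, and the fix is the S3 key without the leading slash):

```json
{
  "Code": {
    "S3Bucket": "my-ddbimport-bucket",
    "S3Key": "ddbimport.zip"
  }
}
```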
Ah, that's good to know @thuantan2060, thanks!
Yesterday, AWS released a feature to create a new table containing data - https://aws.amazon.com/about-aws/whats-new/2022/08/amazon-dynamodb-supports-bulk-imports-amazon-s3-new-dynamodb-tables/
I haven't tried it out yet, but I suspect it will be the best and cheapest way to do it.
Write request units in DynamoDB cost $1.25 per million, and each unit covers up to 1 KB, so importing data via writes costs around $1.25 per GB.
Reading through the launch details, data import from S3 costs around $0.15 per GB of uncompressed data, and doesn't consume any write capacity or use Lambda functions, so it should be cheaper.
I've no idea how you'd debug the AWS solution, how fast it is, or how you'd deal with issues, though.
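As a rough sanity check on those two prices (assuming a 1 GB import made up of 1 KB items, so roughly one million write request units):

```shell
# Cost via write request units: $1.25 per million WRUs, one WRU per 1 KB item.
awk 'BEGIN { printf "$%.2f per GB via writes\n", 1.25 * 1000000 / 1000000 }'

# Cost via the S3 bulk-import feature: a flat per-GB rate on uncompressed data.
awk 'BEGIN { printf "$%.2f per GB via S3 import\n", 0.15 }'
```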