
SLS Offline DynamoDB Streams: handler function triggered only once

mcopik opened this issue 4 years ago • 2 comments

Hi!

I've been trying to use the offline plugins to perform local testing of my lambda integration with DynamoDB streams. Unfortunately, it seems that I can't achieve consistent behavior. I began by starting both DynamoDB and the plugin instance in a single run of sls offline:

custom:
  serverless-offline:
    host: 0.0.0.0
    port: 4000
    dontPrintOutput: false
  dynamodb:
    start:
      port: 8000
      migrate: true
      noStart: false
    stages:
      - ${self:provider.stage}
  serverless-offline-dynamodb-streams:
    apiVersion: '2013-12-02'
    endpoint: http://0.0.0.0:8000
    region: us-east-1
    skipCacheInvalidation: false

functions:
  writer:
    handler: aws/writer.handler
    environment:
      VERBOSE_LOGGING: ${env:FK_VERBOSE}
    events:
      - http:
          path: writer
          method: POST
      - stream:
          enabled: true
          type: dynamodb
          batchSize: 1
          startingPosition: LATEST
          maximumRetryAttempts: 3
          arn:
            Fn::GetAtt:
              - writerTable
              - StreamArn

resources:
  Resources:
    writerTable:
      Type: AWS::DynamoDB::Table
      Properties:
        TableName: ${self:provider.environment.DYNAMODB_TABLE}-write-queue
        ProvisionedThroughput:
          ReadCapacityUnits: 5
          WriteCapacityUnits: 5
        AttributeDefinitions:
          - AttributeName: key
            AttributeType: S
          - AttributeName: timestamp
            AttributeType: S
        KeySchema:
          - AttributeName: key
            KeyType: HASH
          - AttributeName: timestamp
            KeyType: RANGE
        StreamSpecification:
          StreamViewType: NEW_IMAGE

However, with this setup the handler was triggered only for the second item inserted into the database. I read about the issue with adding streams to an empty DB (#84), so I changed the workflow: I started the DB, added items, and then started offline against the existing DDB instance. This time the Lambda was invoked for the very first item, but never again afterwards. I'm fairly sure my Lambda terminates correctly, since I see its last output before it exits. I also tried the NEW_AND_OLD_IMAGES stream view type, and nothing changed.
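
For anyone trying to reproduce this: inserting test items directly into the local table can look roughly like the sketch below (the endpoint, region, and key schema come from the config above; the table name, credentials, and attribute values are placeholders):

import time
import boto3

# Point boto3 at the local DynamoDB instance started alongside sls offline.
dynamodb = boto3.resource(
    "dynamodb",
    endpoint_url="http://0.0.0.0:8000",
    region_name="us-east-1",
    aws_access_key_id="local",       # local DynamoDB accepts any credentials
    aws_secret_access_key="local",
)
table = dynamodb.Table("my-service-dev-write-queue")  # placeholder name

for i in range(3):
    # Each put produces one stream record; with batchSize: 1 every record
    # should trigger a separate handler invocation.
    table.put_item(Item={"key": f"item-{i}", "timestamp": str(time.time())})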

All entries used the same partition key, so they should land in a single shard. I varied the keys as well, but the behavior stayed the same.

mcopik avatar Jan 08 '21 00:01 mcopik

I had the same issue; make sure you call the callback in your handler:

// The exported Lambda handler: calling callback() completes the
// invocation so the stream poller can deliver the next batch.
module.exports.handler = (event, context, callback) => {
  console.log('handler')
  console.log('event', JSON.stringify(event))
  callback()
}

dmitriy-baltak avatar Mar 31 '21 12:03 dmitriy-baltak

@Alivejke Thanks for the tip! However, I'm running Python functions, and there are no callbacks there.
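
In Python there is no callback to invoke; the handler simply returns. A rough equivalent of the snippet above (sketch only, assuming aws/writer.py defines the handler named in serverless.yml) would be:

# aws/writer.py -- returning from the handler completes the invocation;
# the Python runtime has no callback parameter.
import json


def handler(event, context):
    print("handler")
    print("event", json.dumps(event))
    # Response is used by the http event; stream invocations ignore it.
    return {"statusCode": 200}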

mcopik avatar Apr 17 '21 22:04 mcopik