amplify-category-api

Is there a way to specify the DynamoDB table name when creating an API resource using Amplify?

Open djkingh01 opened this issue 5 years ago • 40 comments

Note: If your question is regarding the AWS Amplify Console service, please log it in the official AWS Amplify Console forum

**Which Category is your question related to?** amplify

**What AWS Services are you utilizing?** AppSync

**Provide additional details e.g. code snippets**

djkingh01 avatar Mar 08 '19 00:03 djkingh01

Currently, whenever I create resources using Amplify, it concatenates a long random string onto the name. How can this be prevented?

djkingh01 avatar Mar 08 '19 00:03 djkingh01

You can specify the resource names that you are prompted for during the execution of the add command. Other names are auto-generated for you, and you are not advised to change them because they may be referenced by other resources. The CLI uses long random strings to prevent accidental name clashes; for example, a Todo model gets a table named along the lines of `Todo-<apiId>-<env>`.

UnleashedMind avatar Mar 08 '19 02:03 UnleashedMind

In my understanding, the resources prompted for during add command execution are the project name and env. This has little to do with the generated table names, so it does not solve my problem. Generated table names are created from the type names in the GraphQL schema. We have a use case where we use the same table for storing multiple types. How can I achieve that using Amplify?

Additionally, when we deploy to production we want to control our table names.

djkingh01 avatar Mar 08 '19 14:03 djkingh01

@djkingh01 Currently the @model directive expects to contain a single data type; however, we are looking at generalizing this to allow overloaded indexes and multiple types per table, but there is no set date on when this will be available. That being said, you are able to write your own resolvers that target your own data sources within the CLI using custom stacks.

The process to do this would be to

  1. Create a DynamoDB Table resource within a custom CloudFormation stack.
  2. Create a scoped-down IAM role that has access to your table and its indexes, as well as a trust policy for appsync.amazonaws.com.
  3. Create an AppSync Datasource resource that points to your IAM role and DynamoDB table.
  4. Add resolvers that target your Datasource. These resolvers can return whatever type of data is stored in your DynamoDB table.

See https://aws-amplify.github.io/docs/cli/graphql#api-category-project-structure for more info.
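To make steps 1-3 concrete, here is a minimal sketch of such a custom stack in CloudFormation YAML. The logical names, the table name `MyTodoTable`, the pk/sk key schema, and the `AppSyncApiId` parameter wiring are illustrative assumptions, not values the CLI generates for you:

```yaml
# Sketch only: names and key schema are illustrative, not Amplify defaults.
AWSTemplateFormatVersion: "2010-09-09"
Parameters:
  AppSyncApiId:
    Type: String
    Description: The AppSync API id (assumed to be passed in by the CLI)

Resources:
  # 1. A DynamoDB table with an explicit, user-controlled name.
  TodoTable:
    Type: AWS::DynamoDB::Table
    Properties:
      TableName: MyTodoTable
      BillingMode: PAY_PER_REQUEST
      AttributeDefinitions:
        - AttributeName: pk
          AttributeType: S
        - AttributeName: sk
          AttributeType: S
      KeySchema:
        - AttributeName: pk
          KeyType: HASH
        - AttributeName: sk
          KeyType: RANGE

  # 2. A scoped-down role that AppSync can assume to reach the table.
  TodoTableAccessRole:
    Type: AWS::IAM::Role
    Properties:
      AssumeRolePolicyDocument:
        Version: "2012-10-17"
        Statement:
          - Effect: Allow
            Principal:
              Service: appsync.amazonaws.com
            Action: sts:AssumeRole
      Policies:
        - PolicyName: TodoTableAccess
          PolicyDocument:
            Version: "2012-10-17"
            Statement:
              - Effect: Allow
                Action:
                  - dynamodb:GetItem
                  - dynamodb:PutItem
                  - dynamodb:UpdateItem
                  - dynamodb:DeleteItem
                  - dynamodb:Query
                  - dynamodb:Scan
                Resource:
                  - !GetAtt TodoTable.Arn
                  - !Sub "${TodoTable.Arn}/index/*"

  # 3. The AppSync data source that your resolvers will target.
  TodoDataSource:
    Type: AWS::AppSync::DataSource
    Properties:
      ApiId: !Ref AppSyncApiId
      Name: TodoTableDataSource
      Type: AMAZON_DYNAMODB
      ServiceRoleArn: !GetAtt TodoTableAccessRole.Arn
      DynamoDBConfig:
        TableName: !Ref TodoTable
        AwsRegion: !Ref "AWS::Region"
```

Resolvers added in step 4 would then reference `TodoTableDataSource` by name.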

mikeparisstuff avatar Mar 08 '19 21:03 mikeparisstuff

I have a follow-up question as I was trying out the solution you suggested. If I follow your solution, will I be able to declare entities using schema.graphql and auto-generate any code, or will I have to do everything manually?

djkingh01 avatar Mar 12 '19 12:03 djkingh01

@djkingh01 For now, yes, you would have to implement everything yourself. We do want to support the use case where you can adopt an existing table for an @model type, but it has not yet been implemented. We have looked at a few approaches including expanding @model with new arguments as well as adding a new directive such as @table. If you are interested and want to contribute a potential design, I would be happy to review it.
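For anyone curious what adopting an existing table could look like in a schema, here is a purely hypothetical sketch; `@table` and its `name` argument are a proposed design under discussion, not an implemented directive:

```graphql
# Hypothetical syntax: @table is a proposed directive and does not exist yet.
# The table name is a placeholder for an existing, user-controlled table.
type Todo @model @table(name: "MyExistingTodoTable") {
  pk: String!
  sk: String!
  displayName: String
  notes: String
}
```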

P.S. We have also discussed opening up the CLI such that you can write and run your own transformers as part of the normal workflow. I'm curious how many people would want this feature?

mikeparisstuff avatar Mar 12 '19 17:03 mikeparisstuff

I believe a lot of people would be interested. My use case is that we have an existing DynamoDB table and CRUD code with a pk/sk hierarchical access pattern in a single table. Amplify is a great way to easily migrate our backend to GraphQL and reduce our testing footprint. All we need is @sk and @datasource support in some way: @sk so that the generated resolvers use a pk/sk pair for queries, and @datasource so that I can define the data source manually or via a custom CloudFormation template. Then we could use Amplify to define and author the schema and go live. I am sure I am not the only one with this problem. In its current form the system only works for people who have new projects, and even then I am not sure how many can have tables in production with randomly generated table names that they cannot control.

djkingh01 avatar Mar 15 '19 12:03 djkingh01

It took me a long time to get these right, so I am sharing them for others' benefit.

  1. Below is a Todo schema.graphql file that supports a primary key (pk) and a sort key (sk).
  2. Resolvers for each operation follow below.

```graphql
type Todo { pk: String! sk: String! displayName: String notes: String }

input TodoKey { pk: String! sk: String! }

input CreateTodoInput { pk: String! sk: String! displayName: String notes: String }

type ModelTodoConnection { items: [Todo] nextToken: String }

input ModelTodoFilterInput { pk: ModelStringFilterInput sk: ModelStringFilterInput displayName: ModelStringFilterInput notes: ModelStringFilterInput and: [ModelTodoFilterInput] or: [ModelTodoFilterInput] not: ModelTodoFilterInput }

input ModelBooleanFilterInput { ne: Boolean eq: Boolean }

input ModelFloatFilterInput { ne: Float eq: Float le: Float lt: Float ge: Float gt: Float contains: Float notContains: Float between: [Float] }

input ModelIDFilterInput { ne: ID eq: ID le: ID lt: ID ge: ID gt: ID contains: ID notContains: ID between: [ID] beginsWith: ID }

input ModelIntFilterInput { ne: Int eq: Int le: Int lt: Int ge: Int gt: Int contains: Int notContains: Int between: [Int] }

enum ModelSortDirection { ASC DESC }

input ModelStringFilterInput { ne: String eq: String le: String lt: String ge: String gt: String contains: String notContains: String between: [String] beginsWith: String }

type Mutation { createTodo(input: CreateTodoInput!): Todo updateTodo(input: UpdateTodoInput!): Todo deleteTodo(input: TodoKey!): Todo }

type Query { getTodo(query: TodoKey!): Todo listTodos(query: ModelTodoFilterInput, filter: ModelTodoFilterInput, limit: Int, nextToken: String): ModelTodoConnection }

type Subscription { onCreateTodo: Todo @aws_subscribe(mutations: ["createTodo"]) onUpdateTodo: Todo @aws_subscribe(mutations: ["updateTodo"]) onDeleteTodo: Todo @aws_subscribe(mutations: ["deleteTodo"]) }

input UpdateTodoInput { pk: String sk: String displayName: String notes: String }
```

a. CreateTodo

```vtl
## [Start] Prepare DynamoDB PutItem Request. **
$util.qr($context.args.input.put("createdAt", $util.time.nowISO8601()))
$util.qr($context.args.input.put("updatedAt", $util.time.nowISO8601()))
$util.qr($context.args.input.put("__typename", "Todo"))
{
  "version": "2017-02-28",
  "operation": "PutItem",
  "key": {
    "pk": $util.dynamodb.toDynamoDBJson($util.defaultIfNullOrBlank($ctx.args.input.pk, $util.autoId())),
    "sk": $util.dynamodb.toDynamoDBJson($util.defaultIfNullOrBlank($ctx.args.input.sk, $util.autoId()))
  },
  "attributeValues": $util.dynamodb.toMapValuesJson($context.args.input),
  "condition": {
    "expression": "attribute_not_exists(#pk) and attribute_not_exists(#sk)",
    "expressionNames": { "#pk": "pk", "#sk": "sk" }
  }
}
## [End] Prepare DynamoDB PutItem Request. **
```

b. DeleteTodo

```vtl
#if( $authCondition )
  #set( $condition = $authCondition )
  $util.qr($condition.put("expression", "$condition.expression AND attribute_exists(#pk) AND attribute_exists(#sk)"))
  $util.qr($condition.expressionNames.put("#pk", "pk"))
  $util.qr($condition.expressionNames.put("#sk", "sk"))
#else
  #set( $condition = {
    "expression": "attribute_exists(#pk) and attribute_exists(#sk)",
    "expressionNames": { "#pk": "pk", "#sk": "sk" }
  } )
#end
#if( $versionedCondition )
  $util.qr($condition.put("expression", "($condition.expression) AND $versionedCondition.expression"))
  $util.qr($condition.expressionNames.putAll($versionedCondition.expressionNames))
  #set( $expressionValues = $util.defaultIfNull($condition.expressionValues, {}) )
  $util.qr($expressionValues.putAll($versionedCondition.expressionValues))
  #set( $condition.expressionValues = $expressionValues )
#end
{
  "version": "2017-02-28",
  "operation": "DeleteItem",
  "key": {
    "pk": $util.dynamodb.toDynamoDBJson($ctx.args.input.pk),
    "sk": $util.dynamodb.toDynamoDBJson($ctx.args.input.sk)
  },
  "condition": $util.toJson($condition)
}
```

c. UpdateTodo

```vtl
#if( $authCondition && $authCondition.expression != "" )
  #set( $condition = $authCondition )
  $util.qr($condition.put("expression", "$condition.expression AND attribute_exists(#pk) AND attribute_exists(#sk)"))
  $util.qr($condition.expressionNames.put("#pk", "pk"))
  $util.qr($condition.expressionNames.put("#sk", "sk"))
#else
  #set( $condition = {
    "expression": "attribute_exists(#pk) and attribute_exists(#sk)",
    "expressionNames": { "#pk": "pk", "#sk": "sk" },
    "expressionValues": {}
  } )
#end

## Automatically set the updatedAt timestamp. **
$util.qr($context.args.input.put("updatedAt", $util.time.nowISO8601()))
$util.qr($context.args.input.put("__typename", "Todo"))

## Update condition if type is @versioned **
#if( $versionedCondition )
  $util.qr($condition.put("expression", "($condition.expression) AND $versionedCondition.expression"))
  $util.qr($condition.expressionNames.putAll($versionedCondition.expressionNames))
  $util.qr($condition.expressionValues.putAll($versionedCondition.expressionValues))
#end

#set( $expNames = {} )
#set( $expValues = {} )
#set( $expSet = {} )
#set( $expAdd = {} )
#set( $expRemove = [] )

#foreach( $entry in $util.map.copyAndRemoveAllKeys($context.args.input, ["pk","sk"]).entrySet() )
  #if( $util.isNull($entry.value) )
    #set( $discard = $expRemove.add("#$entry.key") )
    $util.qr($expNames.put("#$entry.key", "$entry.key"))
  #else
    $util.qr($expSet.put("#$entry.key", ":$entry.key"))
    $util.qr($expNames.put("#$entry.key", "$entry.key"))
    $util.qr($expValues.put(":$entry.key", $util.dynamodb.toDynamoDB($entry.value)))
  #end
#end

#set( $expression = "" )
#if( !$expSet.isEmpty() )
  #set( $expression = "SET" )
  #foreach( $entry in $expSet.entrySet() )
    #set( $expression = "$expression $entry.key = $entry.value" )
    #if( $foreach.hasNext() )
      #set( $expression = "$expression," )
    #end
  #end
#end
#if( !$expAdd.isEmpty() )
  #set( $expression = "$expression ADD" )
  #foreach( $entry in $expAdd.entrySet() )
    #set( $expression = "$expression $entry.key $entry.value" )
    #if( $foreach.hasNext() )
      #set( $expression = "$expression," )
    #end
  #end
#end
#if( !$expRemove.isEmpty() )
  #set( $expression = "$expression REMOVE" )
  #foreach( $entry in $expRemove )
    #set( $expression = "$expression $entry" )
    #if( $foreach.hasNext() )
      #set( $expression = "$expression," )
    #end
  #end
#end

#set( $update = {} )
$util.qr($update.put("expression", "$expression"))
#if( !$expNames.isEmpty() )
  $util.qr($update.put("expressionNames", $expNames))
#end
#if( !$expValues.isEmpty() )
  $util.qr($update.put("expressionValues", $expValues))
#end

{
  "version": "2017-02-28",
  "operation": "UpdateItem",
  "key": {
    "pk": { "S": "$context.args.input.pk" },
    "sk": { "S": "$context.args.input.sk" }
  },
  "update": $util.toJson($update),
  "condition": $util.toJson($condition)
}
```

d. ListTodos with support for queries

```vtl
#* Using limit together with a filter expression can cause issues:
   DynamoDB applies the limit before the filter, so a page can contain
   fewer matching results than the limit. *#
#set( $limit = $util.defaultIfNull($context.args.limit, 10) )
{
  "version": "2017-02-28",
  "operation": "Query",
  "query": #if( $context.args.query ) $util.transform.toDynamoDBFilterExpression($ctx.args.query) #else null #end,
  "filter": #if( $context.args.filter ) $util.transform.toDynamoDBFilterExpression($ctx.args.filter) #else null #end,
  "limit": $limit,
  "nextToken": #if( $context.args.nextToken ) "$context.args.nextToken" #else null #end
}
```

e. GetTodo

```vtl
{
  "version": "2017-02-28",
  "operation": "GetItem",
  "key": {
    "pk": $util.dynamodb.toDynamoDBJson($ctx.args.query.pk),
    "sk": $util.dynamodb.toDynamoDBJson($ctx.args.query.sk)
  }
}
```
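For completeness, here is how a client would exercise these resolvers against the schema above; the key values are made-up examples, since pk/sk formats depend on your own single-table design:

```graphql
query GetOneTodo {
  getTodo(query: { pk: "USER#1", sk: "TODO#2019-03-19" }) {
    pk
    sk
    displayName
    notes
  }
}
```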

djkingh01 avatar Mar 19 '19 13:03 djkingh01

I created one stack for the custom table, data source, and role. How do I point to this stack from another stack where the schema definitions are created?

djkingh01 avatar Mar 20 '19 20:03 djkingh01

@mikeparisstuff

> Currently the @model directive expects to contain a single data type; however, we are looking at generalizing this to allow overloaded indexes and multiple types per table, but there is no set date on when this will be available. That being said, you are able to write your own resolvers that target your own data sources within the CLI using custom stacks.

My team is very interested in this feature. We recently had a Well-Architected Review with an AWS Solutions Architect and our AWS rep, and one of the recommendations that came out of it was that we should consolidate all our types into one big table. We were told that is the proper usage of DynamoDB. I know you guys build features based on demand, so I am just commenting to vote for it.

Thanks for all your hard work!

timoteialbu avatar Apr 06 '19 19:04 timoteialbu

+1. I'd like to use the auto-generated code against a different table than the Amplify CLI default as well. This is needed for https://github.com/aws-amplify/amplify-cli/issues/1037 and aws-amplify/amplify-category-api#453.

Would also like an option to create the table in a different region than the amplify default.

hisham avatar Apr 08 '19 22:04 hisham

> @djkingh01 For now, yes, you would have to implement everything yourself. We do want to support the use case where you can adopt an existing table for an @model type, but it has not yet been implemented. We have looked at a few approaches including expanding @model with new arguments as well as adding a new directive such as @table. If you are interested and want to contribute a potential design, I would be happy to review it.
>
> P.S. We have also discussed opening up the CLI such that you can write and run your own transformers as part of the normal workflow. I'm curious how many people would want this feature?

@mikeparisstuff I am interested in the @table directive (specifying an existing table for @model, to support the best practice of single-table design). Single-table is a "multiple types per table" design.

  1. Are you any further along on this design? Can I get a beta version?
  2. Are you opening up the CLI to custom transformers/directives?

clinicalinkgithub avatar Jul 20 '19 13:07 clinicalinkgithub

+1

romislovs avatar Oct 23 '19 11:10 romislovs

+1

andreikrasnou avatar Oct 24 '19 11:10 andreikrasnou

+1

jon144 avatar Oct 31 '19 16:10 jon144

+1

dtelaroli avatar Nov 14 '19 16:11 dtelaroli

+1

mnishiguchi avatar Dec 08 '19 02:12 mnishiguchi

+1

alexkates avatar Dec 17 '19 14:12 alexkates

+1

EphemeralX avatar Jan 21 '20 03:01 EphemeralX

+1

givenm avatar Apr 19 '20 15:04 givenm

+1

justinsamuel92 avatar Apr 23 '20 22:04 justinsamuel92

+1

akash-jose avatar Apr 24 '20 03:04 akash-jose

+1. I think it would be pretty useful if we could specify the table name.

MehdiTAZI avatar May 10 '20 17:05 MehdiTAZI

+1 is this released yet? :)

vladimirpekez avatar May 18 '20 20:05 vladimirpekez

+1

Hernanm0g avatar Jun 13 '20 04:06 Hernanm0g

+1

castri1 avatar Jul 03 '20 22:07 castri1

+1 🔥🔥🔥 !!

MauriceWebb avatar Jul 22 '20 17:07 MauriceWebb

+1

Sparkboxx avatar Sep 30 '20 19:09 Sparkboxx

+1

wedwards-inirv avatar Oct 26 '20 20:10 wedwards-inirv