If you are new to AWS CDK, please do look at my previous articles in this series one by one.
In case you missed my previous article, you can find it at the links below.
Original post at Dev Post: aws-cdk-101-sam-local-to-test-and-debug-lambda-function
Reposted at dev.to @aravindvcyber: aws-cdk-101-sam-local-to-test-and-debug-lambda-function-1afj
Also, we have started to develop an open-source project which we will use to play around with refactoring the architecture, learn CDK at the same time, and provide something useful for our community. Find more about it in the article below.
Original project post at Dev Post: aws-cdk-101-projects-cdk-stackresourcedrift-events-forwarded-to-cool-slack-posts-event-forwarder
Reposted project post at dev.to: aws-cdk-101-projects-cdk-stackresourcedrift-events-forwarded-to-cool-slack-posts-event-forwarder-1m0m
DynamoDB local with sam invoke
Earlier in this series, we saw how to set up local integration testing of the lambda using sam invoke. DynamoDB local is not only a cool integration setup to inspect things locally, but it also helps you understand the DynamoDB data model using NoSQL Workbench.
So why do we need this now?
In our previous article, we mentioned that we have sam local to test the function locally, but we did not have a local DynamoDB. In this article we set that up and bring it into the same docker network, to fix the error below.
Thankfully, the error Slack hook posted the exception clearly to my Slack channel.
Docker compose for DynamoDB local
Let us use docker compose to spin up DynamoDB local.
DynamoDB local container
I prefer the docker approach to setting up DynamoDB local since it is very predictable and lets me use a docker-compose yml file alongside.
In the yaml snippet below, you can find a simple DynamoDB docker container bound to the cloud network and the local volume ddb-data, both of which need to be pre-created (see the next two sections).
version: "3.5"
services:
  dynamo:
    container_name: local-ddb
    image: amazon/dynamodb-local
    networks:
      - cloud
    ports:
      - "8000:8000"
    volumes:
      - ddb-data:/home/dynamodblocal
    working_dir: /home/dynamodblocal
    command: "-jar DynamoDBLocal.jar -sharedDb -dbPath ."

networks:
  cloud:
    external: true

volumes:
  ddb-data:
    external: true
Create the docker network
Here let us define a docker network to be shared by our local cloud containers.
docker network create -d bridge cloud
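To confirm the network exists before wiring containers to it, you can list and inspect it with the standard docker commands, shown here purely as a sanity check.

docker network ls
docker network inspect cloud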
Create the docker volume
Here let us create a new docker volume, bind-mounted to a local folder of our choice.
docker volume create \
--driver local \
--opt type=none \
--opt o=bind \
--opt device=*******Your Local Volume folder*********** \
ddb-data
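You can verify that the volume was created with the expected bind options before using it:

docker volume inspect ddb-data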
Start and stop DynamoDB local
Navigate to the directory where you have defined the docker-compose file, open a terminal, and run the command below to start it.
docker-compose up -d
And to stop the container, run the command below.
docker-compose down
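If something does not come up as expected, a quick sanity check with standard docker commands is to confirm the container is running and peek at its logs:

docker ps --filter "name=local-ddb"
docker logs local-ddb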
Validate the local ddb connection
Run the command below in your local terminal, making sure the port matches what you defined in your docker-compose file above.
aws dynamodb list-tables --endpoint-url http://localhost:8000
You may get some results like the below.
{
    "TableNames": [
    ]
}
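If the command instead complains about missing credentials or region: DynamoDB local accepts any credentials, but the AWS CLI still needs something configured. Exporting dummy values for the local session is enough (do not use these against real AWS):

export AWS_ACCESS_KEY_ID=dummy
export AWS_SECRET_ACCESS_KEY=dummy
export AWS_DEFAULT_REGION=us-east-1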
Create the local table
Before that, if the table already exists locally, you can delete it as shown below.
aws dynamodb delete-table --table-name eventStores9 --endpoint-url http://localhost:8000
Let us create the table by copying the schema from the cloud table, by running the describe-table command below.
aws dynamodb describe-table --table-name eventStores9 --profile av > eventStores9.json
Note that in the above step I have not used the local endpoint, but rather my AWS named profile av to connect to the cloud. In your case, use your own profile and write the output to the file eventStores9.json.
Now run the command below using the file created above (note that it is expected to throw an error).
aws dynamodb create-table --cli-input-json file://eventStores9.json --endpoint-url http://localhost:8000
When you get the above error, you simply have to remove the offending nodes from the JSON file you are using (the read-only metadata that describe-table returns but create-table does not accept). A cleaned-up version of the file has already been checked in to this project to help you identify them.
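If you prefer to script the cleanup instead of editing by hand, a rough sketch using jq (already part of this project's toolchain) is shown below. It assumes a provisioned-capacity table with only LSIs, so adjust the field list to match your schema (for example, add GlobalSecondaryIndexes if you use them).

jq '.Table | {
      TableName,
      AttributeDefinitions,
      KeySchema,
      LocalSecondaryIndexes: [.LocalSecondaryIndexes[] | {IndexName, KeySchema, Projection}],
      ProvisionedThroughput: {
        ReadCapacityUnits: .ProvisionedThroughput.ReadCapacityUnits,
        WriteCapacityUnits: .ProvisionedThroughput.WriteCapacityUnits
      }
    }' eventStores9.json > eventStores9.clean.json && mv eventStores9.clean.json eventStores9.json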
Then you can run the same command again; if it does not throw any error, it will create the table and print the table description as output.
aws dynamodb create-table --cli-input-json file://eventStores9.json --endpoint-url http://localhost:8000
Validate by listing the tables locally :cherry_blossom:
aws dynamodb list-tables --endpoint-url http://localhost:8000
You may get some results like the below.
{
    "TableNames": [
        "eventStores9"
    ]
}
And there you have it: your local DynamoDB table is ready to help you.
NoSQL Workbench
docs.aws.amazon.com/amazondynamodb/latest/d..
You may also try this tool, which can help you model DynamoDB much faster and even push the model to the cloud or to a local connection. However, this tool does not support LSIs in the data modeling stage; hopefully that becomes available soon. If it supported LSIs, we could have directly imported the CloudFormation template (without parameters) to create the schema in the tool itself. I tried it, but it ignores the LSI, which is why I used the AWS CLI to generate the table.
Still, this tool can help us with the Operation Builder feature, which connects to the DynamoDB tables directly, helps with a range of other operations, and even generates some code snippets for the requests you design.
Create a new database connection
Inspect the metadata and LSI and GSI definitions
With this interface, you can directly query, inspect and manipulate the local table data during testing and debugging.
Further, you can also use LSI and GSI in your operations seamlessly.
Code changes to the function
We need code changes to the function to enable the local ddb connection. Just like how we detected AWS_SAM_LOCAL earlier, we can do the same here in the dynamodb-util.ts file. Take note of the hostname: it is the container name we defined in the docker-compose yaml.
if (process.env.AWS_SAM_LOCAL) {
  options.endpoint = 'http://local-ddb:8000'
}
const dynamo = AWSXRay.captureAWSClient(new DynamoDB(options));
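For context, here is a minimal sketch of how the surrounding dynamodb-util.ts could look; everything except the endpoint override above is an assumption about the rest of the file, based on the AWS SDK v2 client and X-Ray capture used in this series.

// dynamodb-util.ts — a minimal sketch, assuming the AWS SDK v2 DynamoDB client and aws-xray-sdk
import { DynamoDB } from 'aws-sdk';
import * as AWSXRay from 'aws-xray-sdk';

const options: DynamoDB.ClientConfiguration = {};

// When running under sam local invoke, point the client at the DynamoDB local container,
// reachable by its container name over the shared docker network.
if (process.env.AWS_SAM_LOCAL) {
  options.endpoint = 'http://local-ddb:8000';
}

export const dynamo = AWSXRay.captureAWSClient(new DynamoDB(options));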
Run script changes
We already know that the sam CLI scripts are part of the npm run scripts, as demonstrated in the previous article. Here we need to include the docker network while using sam local invoke, so the function can reach the other containers in the same network.
{
  "sam:i": "sam local invoke $npm_config_fname --docker-network cloud -e $npm_config_event | jq .",
  "sam:i:debug": "sam local invoke $npm_config_fname --docker-network cloud -d 9999 -e $npm_config_event --log-file logs/sam-debug-logs.txt --debug | jq ."
}
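Assuming the same npm_config convention from the previous article, invoking one of these scripts would look roughly like this (the function name here is a placeholder):

npm run sam:i --fname=<YourFunctionName> --event=events/event_stack_create_complete.json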
This should help to connect the function with local dynamodb as expected.
Test event results in DynamoDB
I have run all the test events prepared in the last article (in the events folder), and I can see the results in the local table.
npm run t:1:1 -- --event=events/event_stack_create_complete.json
npm run t:1:1 -- --event=events/event_stack_delete_complete.json
npm run t:1:1 -- --event=events/event_stack_update_complete.json
npm run t:1:1 -- --event=events/event_resouce_create_complete.json
npm run t:1:1 -- --event=events/event_drift_detection_complete.json
Scan dynamodb items
Normal Query
Query using LSI_STATUS
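If you would rather check the same data from the CLI than from the Workbench views above, a scan against the local endpoint works directly; the query through the LSI below is only a sketch with placeholder attribute names, so substitute your table's real key and status attribute names.

aws dynamodb scan --table-name eventStores9 --endpoint-url http://localhost:8000

aws dynamodb query --table-name eventStores9 --index-name LSI_STATUS \
  --key-condition-expression "pk = :p AND #s = :s" \
  --expression-attribute-names '{"#s": "status"}' \
  --expression-attribute-values '{":p": {"S": "<partition-key-value>"}, ":s": {"S": "<status-value>"}}' \
  --endpoint-url http://localhost:8000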
Debugging the data in vscode
npm run d:1:func1 -- --event=events/event_stack_create_complete.json
Verify that execution passes through the sam local check (when AWS_SAM_LOCAL is true) and that it overrides the DynamoDB endpoint.
With this approach, we will be able to monitor data moving in and out of the local DynamoDB table.
If you don't want to debug the transpiled js when using typescript with webpack, you can add sourceMapPathOverrides to the attach configuration in your launch config, as discussed in the previous article.
{
  "sourceMapPathOverrides": {
    "meteor://💻app/*": "${workspaceFolder}/*",
    "webpack:///./~/*": "${workspaceFolder}/node_modules/*",
    "webpack://?:*/*": "${workspaceFolder}/*"
  }
}
Then you can directly place breakpoints in the typescript files and debug them cleanly, as follows.
Sample typescript debugging screenshots
Sample typescript debugging screenshots with a specific point of interest
Conclusion
This will be extremely useful when you repeatedly iterate and make code changes to the processor, since you can use the test events to verify it swiftly against the local DynamoDB setup and integration.
We will be talking about more similar engineering concepts as we refactor and refine the event forwarder project. Keep following for similar posts on engineering with IaC primarily using AWS CDK and Serverless.
Also, feel free to contribute to the progress of the solution below with your comments and issues; you could even raise a PR if you feel it can help our community.
Original project post at Dev Post: aws-cdk-101-projects-cdk-stackresourcedrift-events-forwarded-to-cool-slack-posts-event-forwarder
Reposted project post at dev.to: aws-cdk-101-projects-cdk-stackresourcedrift-events-forwarded-to-cool-slack-posts-event-forwarder-1m0m
We have our next article on serverless and IaC coming up, do check it out.
Thanks for supporting!
It would be great if you would like to Buy Me a Coffee, to help boost my efforts.
Original post at Dev Post: aws-cdk-101-dynamodb-local-setup-and-integrating-with-sam-invoke
Reposted at dev.to @aravindvcyber: aws-cdk-101-dynamodb-local-setup-and-integrating-with-sam-invoke-527f