
Fri 23 Oct 2020

Export DynamoDB Table to S3 Bucket Using Lambda Function

Tags: aws lambda function dynamodb s3

DynamoDB is a great NoSQL service from AWS. Often we need to export data from a DynamoDB table.

First, let us review the use case: our Lambda function will read a table from DynamoDB and export it as JSON to S3.

Using boto3 resource
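
A minimal sketch of such a handler, assuming it scans the full table and that the event carries the fields shown in the test payload below (the handler name and the /tmp staging file are assumptions, not a definitive implementation):

import json
import boto3

dynamodb = boto3.resource('dynamodb')
s3 = boto3.client('s3')

def lambda_handler(event, context):
    table = dynamodb.Table(event['TableName'])

    # Scan the whole table; keep paginating while LastEvaluatedKey is present,
    # since a single scan call returns at most 1 MB of data
    response = table.scan()
    items = response['Items']
    while 'LastEvaluatedKey' in response:
        response = table.scan(ExclusiveStartKey=response['LastEvaluatedKey'])
        items.extend(response['Items'])

    # Write the items to Lambda's writable /tmp, then upload to S3
    path = '/tmp/' + event['filename']
    with open(path, 'w') as f:
        json.dump(items, f)
    s3.upload_file(path, event['s3_bucket'], event['s3_object'])

    return {'count': len(items)}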


We can create a payload to test our Lambda function:

{
   "TableName": "DynamoDB_Table_name",
   "s3_bucket": "s3_bucket_name",
   "s3_object": "s3_object_name",
   "filename": "output.json"
}

However, we might encounter an error for certain values, because the boto3 resource returns DynamoDB numbers as Python Decimal objects:

TypeError: Object of type Decimal is not JSON serializable. 

We can fix this with a custom JSONEncoder class in our Lambda function.
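
A minimal sketch of such an encoder (the class name DecimalEncoder is illustrative):

import json
from decimal import Decimal

class DecimalEncoder(json.JSONEncoder):
    # Turn DynamoDB Decimal values into plain JSON numbers
    def default(self, obj):
        if isinstance(obj, Decimal):
            # Whole numbers become ints, everything else becomes floats
            return int(obj) if obj % 1 == 0 else float(obj)
        return super().default(obj)

The handler can then serialize with json.dump(items, f, cls=DecimalEncoder) and the TypeError goes away.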

Using boto3 client

Another way to export data is to use the boto3 client, which is a low-level interface to AWS services.
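
A rough sketch of a client-based handler, again assuming the same event fields as the test payload above:

import json
import boto3

client = boto3.client('dynamodb')
s3 = boto3.client('s3')

def lambda_handler(event, context):
    # The low-level client returns DynamoDB JSON, e.g. {"name": {"S": "abc"}}
    paginator = client.get_paginator('scan')
    items = []
    for page in paginator.paginate(TableName=event['TableName']):
        items.extend(page['Items'])

    # The type-wrapped values are plain strings, so no Decimal issue here
    path = '/tmp/' + event['filename']
    with open(path, 'w') as f:
        json.dump(items, f)
    s3.upload_file(path, event['s3_bucket'], event['s3_object'])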

However, the boto3 client generates DynamoDB JSON, with every attribute value wrapped in a type descriptor. A simple Python script can convert it back to normalized JSON using the dynamodb_json library:

import json
import sys

from dynamodb_json import json_util as json2

filename = sys.argv[1]
output = sys.argv[2]

# Read the raw DynamoDB JSON produced by the boto3 client
with open(filename) as f:
    data = json.load(f)

# Strip the type descriptors ({"S": ...}, {"N": ...}) back to plain values
data_new = json2.loads(data)

with open(output, 'w') as outfile:
    json.dump(data_new, outfile)
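
Assuming the script is saved as convert.py, running python convert.py export.json normalized.json (the file names are just examples) writes plain JSON with the type descriptors stripped.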

