If you have problems with the Serverless Framework or another CloudFormation setup because the tables it tries to create already exist, there is another option (sketched with the AWS CLI below):
- delete the tables (make a backup)
- deploy
- delete the tables (no backup)
- restore the tables from the backup
Note: The restore can take hours.
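A minimal sketch of that flow with on-demand backups via the AWS CLI; the backup name here is arbitrary and your deploy command will differ:
TABLE=essential-events
# Back up the table, then delete it so the deploy can create it fresh
BACKUP_ARN=$(aws dynamodb create-backup --table-name $TABLE --backup-name $TABLE-before-deploy | jq -r '.BackupDetails.BackupArn')
aws dynamodb delete-table --table-name $TABLE
aws dynamodb wait table-not-exists --table-name $TABLE
# ... deploy the stack here ...
# Drop the freshly created empty table and restore the backup in its place
aws dynamodb delete-table --table-name $TABLE
aws dynamodb wait table-not-exists --table-name $TABLE
aws dynamodb restore-table-from-backup --target-table-name $TABLE --backup-arn $BACKUP_ARN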
If you really want to do a dump and restore, continue reading…
Directly into the local DynamoDB
To get your data locally from an AWS DynamoDB table, you should first spin up a local DynamoDB server:
docker run -p 8000:8000 amazon/dynamodb-local
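By default the container keeps everything in memory, so the data vanishes when it stops. If you want the local data to survive restarts, something along these lines should work (the host path is just an example):
docker run -p 8000:8000 \
  -v "$PWD/dynamodb-data:/home/dynamodblocal/data" \
  amazon/dynamodb-local \
  -jar DynamoDBLocal.jar -sharedDb -dbPath /home/dynamodblocal/data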
Let’s pull the schema from AWS and create the table locally.
TABLE=essential-events
aws dynamodb describe-table \
  --table-name $TABLE | \
  jq '.Table | del(.TableId, .TableArn, .ItemCount, .TableSizeBytes, .CreationDateTime, .TableStatus, .ProvisionedThroughput.NumberOfDecreasesToday)' | \
  tee $TABLE-schema.json
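For a simple table with just a hash key, the trimmed schema file ends up looking roughly like this (the id key is only an illustration; your attributes will differ):
{
  "AttributeDefinitions": [
    { "AttributeName": "id", "AttributeType": "S" }
  ],
  "TableName": "essential-events",
  "KeySchema": [
    { "AttributeName": "id", "KeyType": "HASH" }
  ],
  "ProvisionedThroughput": {
    "ReadCapacityUnits": 5,
    "WriteCapacityUnits": 5
  }
}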
aws dynamodb create-table --cli-input-json file://$TABLE-schema.json --endpoint-url http://localhost:8000
aws dynamodb list-tables --endpoint-url http://localhost:8000
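If everything went well, the new table shows up in the listing:
{
    "TableNames": [
        "essential-events"
    ]
}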
Now that we have the local DynamoDB running, we can read from AWS and import the data into the local instance.
The tricky part is that the cloud is hostile: batch-write-item accepts at most 25 items and no more than 16 MB per call, and so on…
So instead of a nice export/import one-liner, we have to write a short script.
This script reads 25 records at a time and imports them into the local DynamoDB.
TABLE=essential-events
maxItems=25
# First page: scan, transform into a batch-write request, import into the local DynamoDB
DATA=$(aws dynamodb scan --table-name $TABLE --max-items $maxItems)
echo "$DATA" | jq ".Items | {\"$TABLE\": [{\"PutRequest\": { \"Item\": .[]}}]}" > inserts.jsons
aws dynamodb batch-write-item --request-items file://inserts.jsons --endpoint-url http://localhost:8000
# Follow the pagination token until the scan is exhausted (jq -r prints null once the token is gone)
nextToken=$(echo "$DATA" | jq -r '.NextToken')
while [[ "${nextToken}" != "null" ]]
do
  DATA=$(aws dynamodb scan --table-name $TABLE --max-items $maxItems --starting-token $nextToken)
  echo "$DATA" | jq ".Items | {\"$TABLE\": [{\"PutRequest\": { \"Item\": .[]}}]}" > inserts.jsons
  aws dynamodb batch-write-item --request-items file://inserts.jsons --endpoint-url http://localhost:8000
  nextToken=$(echo "$DATA" | jq -r '.NextToken')
done
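To sanity-check the import, you can compare item counts on both sides with a COUNT scan (on large tables the CLI paginates, so treat this as a rough check):
aws dynamodb scan --table-name $TABLE --select COUNT | jq '.Count'
aws dynamodb scan --table-name $TABLE --select COUNT --endpoint-url http://localhost:8000 | jq '.Count'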
Saving the export to files
As this will probably be the development table, we would like to do multiple imports from the exported data. Here is a version of the above script that stores the data in files, so that you can re-import it later.
TABLE=essential-events
maxItems=25
index=0
# First page: scan and save to a numbered file
DATA=$(aws dynamodb scan --table-name $TABLE --max-items $maxItems)
((index+=1))
echo "$DATA" > "$TABLE-$index.json"
# Follow the pagination token, writing one file per page
nextToken=$(echo "$DATA" | jq -r '.NextToken')
while [[ "${nextToken}" != "null" ]]
do
  DATA=$(aws dynamodb scan --table-name $TABLE --max-items $maxItems --starting-token $nextToken)
  ((index+=1))
  echo "$DATA" > "$TABLE-$index.json"
  nextToken=$(echo "$DATA" | jq -r '.NextToken')
done
Now we have a bunch of $TABLE-NNN.json files. When I want to do the import into the local DynamoDB, I run:
# The [0-9] glob keeps $TABLE-schema.json out of the import
for x in "$TABLE"-[0-9]*.json; do
  cat $x | jq ".Items | {\"$TABLE\": [{\"PutRequest\": { \"Item\": .[]}}]}" > inserts.jsons
  aws dynamodb batch-write-item --request-items file://inserts.jsons --endpoint-url http://localhost:8000
done
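One caveat both import loops gloss over: batch-write-item can return UnprocessedItems instead of failing, and those writes would be silently lost here. A sketch of a retry loop (the returned map has the same shape as --request-items, so it can be fed straight back in):
RESPONSE=$(aws dynamodb batch-write-item --request-items file://inserts.jsons --endpoint-url http://localhost:8000)
# Re-submit anything DynamoDB did not process the first time
while [[ $(echo "$RESPONSE" | jq '.UnprocessedItems | length') -gt 0 ]]; do
  echo "$RESPONSE" | jq '.UnprocessedItems' > retry.jsons
  RESPONSE=$(aws dynamodb batch-write-item --request-items file://retry.jsons --endpoint-url http://localhost:8000)
done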
How to “truncate” a DynamoDB table
TABLE=essential-events
aws dynamodb describe-table \
  --table-name $TABLE | \
  jq '.Table | del(.TableId, .TableArn, .ItemCount, .TableSizeBytes, .CreationDateTime, .TableStatus, .ProvisionedThroughput.NumberOfDecreasesToday)' | \
  tee $TABLE-schema.json
aws dynamodb delete-table --table-name $TABLE
# delete-table is asynchronous; wait for it to finish before recreating
aws dynamodb wait table-not-exists --table-name $TABLE
aws dynamodb create-table --cli-input-json file://$TABLE-schema.json
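If you cannot afford to drop and recreate the table (for example when a stack references it), the alternative is to batch-delete the items themselves. A rough sketch, assuming a simple primary key named id; adjust the projection to your key schema and loop over the pagination token as in the export script above:
# Fetch only the key attributes and turn them into delete requests
aws dynamodb scan --table-name $TABLE \
  --projection-expression "id" --max-items 25 | \
  jq ".Items | {\"$TABLE\": [{\"DeleteRequest\": {\"Key\": .[]}}]}" > deletes.jsons
aws dynamodb batch-write-item --request-items file://deletes.jsons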
Credits for the truncation go to https://medium.com/@samnco/deleting-content-in-dynamodb-from-the-cli-831ce5ab083c