All Questions Tagged with amazon-dynamodb amazon-s3
452 questions

0 votes · 1 answer · 29 views
Redshift COPY with JSON PATH Case Sensitivity
I want to copy a file from S3 to Redshift. Content is in camelCase:
{"accountId":{"S":"1acb4"}}
Redshift columns are all lowercase. This is marshalled DynamoDB JSON, so ...
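A jsonpaths file sidesteps the case mismatch entirely, since explicit paths replace field-name matching and can reach inside the marshalled {"S": ...} wrapper; COPY's JSON 'auto ignorecase' option is the alternative when the JSON isn't nested. A minimal sketch via the Redshift Data API, where the cluster, database, user, role ARN, and S3 paths are all assumptions:

import boto3

client = boto3.client("redshift-data")

# jsonpaths.json (uploaded to S3) maps each column positionally, so the
# camelCase field names never have to match the lowercase column names:
#   {"jsonpaths": ["$['accountId']['S']"]}
sql = """
COPY mytable
FROM 's3://my-bucket/dump/'
IAM_ROLE 'arn:aws:iam::123456789012:role/RedshiftCopyRole'
FORMAT AS JSON 's3://my-bucket/jsonpaths.json';
"""

client.execute_statement(
    ClusterIdentifier="my-cluster",  # assumed cluster
    Database="dev",
    DbUser="awsuser",                # assumed IAM-auth user
    Sql=sql,
)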
0 votes · 1 answer · 24 views
Add a Single-Column CSV to a DynamoDB Table via S3
I have a large CSV file of 265,000,000 domain names, one domain per line. I want to upload this file to AWS S3 and then import it into DynamoDB. I have some questions.
Can I make the domain on ...
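At this scale, DynamoDB's S3 import feature (which bulk-loads into a new table) is the heavyweight option; for smaller files a plain BatchWriteItem loop also works. A minimal boto3 sketch, assuming a table named domains with a string partition key domain and an assumed bucket/key:

import boto3

s3 = boto3.client("s3")
table = boto3.resource("dynamodb").Table("domains")  # assumed table name

# Stream the CSV line by line; batch_writer groups puts into
# 25-item BatchWriteItem calls and retries unprocessed items.
obj = s3.get_object(Bucket="my-bucket", Key="domains.csv")
with table.batch_writer() as batch:
    for raw in obj["Body"].iter_lines():
        domain = raw.decode("utf-8").strip()
        if domain:
            batch.put_item(Item={"domain": domain})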
0 votes · 0 answers · 11 views
Can I use a single DynamoDB table for state locking in multiple terraform projects? [duplicate]
I currently have multiple projects and I'm using one S3 bucket for my state backend. The state files for these projects are separated by paths within the S3 bucket. Now, I want to implement DynamoDB ...
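One lock table can serve every project: Terraform's S3 backend keys each lock item on the state file's full path, so different projects never collide. The backend only requires a table whose partition key is a string attribute named LockID; a boto3 sketch that creates it (the table name is an assumption):

import boto3

dynamodb = boto3.client("dynamodb")

# One shared lock table; Terraform writes one item per state path,
# so multiple projects can point their backends at the same table.
dynamodb.create_table(
    TableName="terraform-locks",  # assumed name
    AttributeDefinitions=[{"AttributeName": "LockID", "AttributeType": "S"}],
    KeySchema=[{"AttributeName": "LockID", "KeyType": "HASH"}],
    BillingMode="PAY_PER_REQUEST",
)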
0 votes · 1 answer · 22 views
Return the account name of the most successful creator on the platform, in terms of total income, from an AWS DynamoDB table
I want to run a query to find the account name with the highest income, but I haven't been able to do so in the AWS PartiQL editor.
The table looks like:
Account Country Account Name Account Type Premium ...
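DynamoDB's PartiQL dialect has no aggregate functions (MAX, SUM, GROUP BY) and no ORDER BY over a full-table scan, so the usual workaround is to scan and aggregate client-side. A sketch, where the table name and the AccountName/Income attribute names are assumptions based on the question:

import boto3

table = boto3.resource("dynamodb").Table("Accounts")  # assumed name

# Paginate through the whole table, then pick the max client-side.
items, start_key = [], None
while True:
    kwargs = {"ExclusiveStartKey": start_key} if start_key else {}
    page = table.scan(**kwargs)
    items.extend(page["Items"])
    start_key = page.get("LastEvaluatedKey")
    if not start_key:
        break

top = max(items, key=lambda item: item.get("Income", 0))
print(top["AccountName"])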
0 votes · 2 answers · 67 views
How To Control Object Type When Importing Large File From S3 To DynamoDB
I have a CSV of about 900,000 rows sitting in an S3 bucket, and within that CSV I have two columns, phone and ttl.
I am able to successfully import this CSV into a new DynamoDB table; however, I am ...
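The CSV flavor of DynamoDB's S3 import treats every column as a String, so a numeric ttl has to arrive in a typed format. One approach is to rewrite the rows as DynamoDB JSON lines, where each attribute carries an explicit type descriptor (the file names here are assumptions):

import csv
import json

# Re-emit each CSV row as one DynamoDB JSON line so "ttl" imports
# as a Number ("N") instead of a String.
with open("input.csv", newline="") as src, open("items.json", "w") as dst:
    for row in csv.DictReader(src):
        item = {
            "Item": {
                "phone": {"S": row["phone"]},
                "ttl": {"N": row["ttl"]},  # "N" values are written as strings
            }
        }
        dst.write(json.dumps(item) + "\n")
# Upload items.json to S3 and run the import with InputFormat=DYNAMODB_JSON.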
0 votes · 1 answer · 384 views
AWS Glue Python Jobs VS AWS Glue Spark Jobs
I have a use case where I need to create an AWS Glue ETL job that updates an existing DynamoDB table with data stored in S3 objects. A few properties to be considered are:
Currently the dataset is approx 40 MB ...
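At roughly 40 MB, a Glue Python shell job is usually the better fit: it's plain Python with no Spark cluster spin-up, and boto3 alone can read the object and write to DynamoDB. A minimal sketch, with assumed bucket, key, and table names:

import csv
import io

import boto3

s3 = boto3.client("s3")
table = boto3.resource("dynamodb").Table("target-table")  # assumed name

body = s3.get_object(Bucket="my-bucket", Key="data.csv")["Body"].read()
rows = csv.DictReader(io.StringIO(body.decode("utf-8")))

# Each put replaces the whole item; use update_item instead if only
# some attributes should change.
with table.batch_writer() as batch:
    for row in rows:
        batch.put_item(Item=row)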
0 votes · 1 answer · 38 views
Create custom IAM policy to hide S3 objects based on a catalog hosted in DynamoDB
I have an S3 bucket containing objects, and I have a catalog in DynamoDB with information about the S3 objects, such as:
id (the primary key), corresponding to the object's path in S3
creation ...
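IAM policy evaluation can't consult a DynamoDB table, so the catalog has to be enforced in code. A common pattern is to broker all access through a function that checks the catalog and hands out short-lived presigned URLs; a sketch where the table name, key shape, and allowed_users attribute are assumptions:

import boto3

s3 = boto3.client("s3")
catalog = boto3.resource("dynamodb").Table("object-catalog")  # assumed

def get_object_url(object_id, caller):
    # Check the DynamoDB catalog first; only known, permitted objects
    # get a presigned URL, so everything else stays hidden.
    entry = catalog.get_item(Key={"id": object_id}).get("Item")
    if not entry or caller not in entry.get("allowed_users", []):
        return None
    return s3.generate_presigned_url(
        "get_object",
        Params={"Bucket": "my-bucket", "Key": entry["id"]},
        ExpiresIn=300,  # URL valid for five minutes
    )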
0 votes · 0 answers · 55 views
Upload a PySpark DataFrame to S3 as an ION-format file
Current issues:
I can write a PySpark DataFrame to S3 as a compressed CSV file. However, I want to test out DynamoDB import and see if I can upload ION-format files to S3. The reason is that I have ...
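Spark has no native ION writer, so one route is to serialize each row with the ion-python library (pip install amazon.ion) and save the result as text; DynamoDB's ION import expects one { Item: {...} } document per line. A sketch assuming flat rows of scalar columns, with placeholder S3 paths:

import amazon.ion.simpleion as ion  # pip install amazon.ion
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()
df = spark.read.parquet("s3://my-bucket/input/")  # assumed source

def to_ion(rows):
    # Serialize each row as a text ION document; nested or float
    # columns would need extra handling before this step.
    for row in rows:
        yield ion.dumps({"Item": row.asDict()}, binary=False)

df.rdd.mapPartitions(to_ion).saveAsTextFile("s3://my-bucket/ion-output/")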
-1 votes · 1 answer · 166 views
Efficient solution to load millions of records into DynamoDB
I'm looking for suggestions from experts on how to design a data migration to DynamoDB. The number we are expecting here is 10 to 15 million records. All these records will be ...
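At 10 to 15 million records, DynamoDB's S3 import is usually the cheapest path: it bulk-loads a new table directly from a dump without consuming write capacity. A boto3 sketch, where the bucket, prefix, table name, and key schema are all assumptions:

import boto3

dynamodb = boto3.client("dynamodb")

# Creates "migrated-table" and bulk-loads it from the S3 dump;
# poll describe_import to track progress.
dynamodb.import_table(
    S3BucketSource={"S3Bucket": "my-bucket", "S3KeyPrefix": "dump/"},
    InputFormat="DYNAMODB_JSON",
    InputCompressionType="GZIP",
    TableCreationParameters={
        "TableName": "migrated-table",
        "AttributeDefinitions": [{"AttributeName": "pk", "AttributeType": "S"}],
        "KeySchema": [{"AttributeName": "pk", "KeyType": "HASH"}],
        "BillingMode": "PAY_PER_REQUEST",
    },
)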
0 votes · 0 answers · 22 views
AWS - Make multiple HTTP requests in pipeline resolver function
I have this pipeline resolver that has two functions: the first function scans a DynamoDB table.
The second function reads the data from the scan and then makes an HTTP request to an S3 bucket (I did it this way ...
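An AppSync HTTP data source issues exactly one request per function, so fanning out to several S3 objects is easier with a Lambda data source in that pipeline slot: one invocation can make as many calls as it needs. A Python handler sketch; the event shape (a key list from the previous function) and bucket name are assumptions:

import boto3

s3 = boto3.client("s3")

def handler(event, context):
    # One Lambda invocation can fetch any number of S3 objects,
    # unlike an HTTP datasource function (one request each).
    results = []
    for key in event["prev"]["result"]["keys"]:  # assumed event shape
        obj = s3.get_object(Bucket="my-bucket", Key=key)
        results.append(obj["Body"].read().decode("utf-8"))
    return results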
0 votes · 1 answer · 152 views
Best way of monitoring exports from AWS DynamoDB to the S3 bucket
I'm trying to add monitoring for exports from AWS DynamoDB to the S3 bucket. I tried to find AWS CloudWatch metrics or logs related to exports but didn't spot any. I'm using the ...
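Table exports don't publish dedicated CloudWatch metrics, so one workable fallback is to poll DescribeExport yourself and alert on the terminal status. A sketch:

import time

import boto3

dynamodb = boto3.client("dynamodb")

def wait_for_export(export_arn):
    # Poll until the export leaves IN_PROGRESS, then report
    # COMPLETED or FAILED to whatever alerting you use.
    while True:
        desc = dynamodb.describe_export(ExportArn=export_arn)
        status = desc["ExportDescription"]["ExportStatus"]
        if status != "IN_PROGRESS":
            return status
        time.sleep(60)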
0 votes · 1 answer · 162 views
Bulk load parquet data from S3 to a DynamoDB table
UPDATE 02/06:
So, for now I was able to accomplish said task as below:
%python
from pyspark.sql import functions as F
import zlib
import json
import boto3
from datetime import datetime
from pyspark ...
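For reference, the core of this approach can be reduced to reading the parquet with Spark and writing each partition through a batch writer. A sketch with assumed paths and table name; note the boto3 resource layer rejects Python floats, so float columns need converting to Decimal first:

import boto3
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()
df = spark.read.parquet("s3://my-bucket/parquet/")  # assumed path

def write_partition(rows):
    # One client per partition; batch_writer batches 25 puts at a time.
    table = boto3.resource("dynamodb").Table("target-table")  # assumed
    with table.batch_writer() as batch:
        for row in rows:
            batch.put_item(Item=row.asDict())

df.foreachPartition(write_partition)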
1 vote · 1 answer · 697 views
Export DynamoDb data to a newly created S3 bucket
I'm trying to figure out how to export DynamoDB tables to a newly created S3 bucket. The bucket size is around 700 TB (700,000 GB). I have looked at different solutions which I have ...
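The built-in point-in-time export handles this natively, and S3 buckets have no size limit, so 700 TB in one new bucket is fine. The table needs point-in-time recovery enabled first; a sketch with assumed names and ARNs:

import boto3

s3 = boto3.client("s3")
dynamodb = boto3.client("dynamodb")

# Outside us-east-1, create_bucket also needs CreateBucketConfiguration.
s3.create_bucket(Bucket="my-export-bucket")

# Requires point-in-time recovery to be enabled on the source table.
dynamodb.export_table_to_point_in_time(
    TableArn="arn:aws:dynamodb:us-east-1:123456789012:table/my-table",
    S3Bucket="my-export-bucket",
    ExportFormat="DYNAMODB_JSON",
)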
0 votes · 2 answers · 104 views
AWS Lambda Functions Using Serverless Framework: SQS Queue Empty, DynamoDB Table Not Updating
I'm currently working on a project where I've set up AWS Lambda functions using the Serverless Framework to process jobs from an SQS queue and create entries in a DynamoDB table. The deployment seems ...
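When the queue drains but the table stays empty, the handler itself is worth ruling out first: an SQS-triggered Lambda receives a batch under event["Records"], and each message body has to be parsed and written explicitly. A minimal sketch with an assumed table name:

import json

import boto3

table = boto3.resource("dynamodb").Table("jobs")  # assumed name

def handler(event, context):
    # SQS delivers a batch; write one item per message. If this runs
    # but nothing lands, check the function's dynamodb:PutItem permission.
    for record in event["Records"]:
        job = json.loads(record["body"])
        table.put_item(Item=job)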
0 votes · 1 answer · 251 views
Efficient method to import data from S3 to DynamoDB while masking specific values of DynamoDB attributes
I have a DynamoDB dump in S3, and I am trying to create new DynamoDB tables from the dump. In the source table I have a particular attribute of type Map that I want to mask in the destination table.
I am ...
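Since the dump is one DynamoDB-JSON item per line, a rewrite pass can mask the Map before the import ever sees it. A sketch where the attribute name and file paths are assumptions; every leaf value under the Map is replaced with a fixed mask:

import gzip
import json

MASK_ATTR = "secrets"  # assumed name of the Map attribute to mask

def mask_map(av):
    # Recurse through nested Maps; replace every leaf with a masked string.
    if "M" in av:
        return {"M": {k: mask_map(v) for k, v in av["M"].items()}}
    return {"S": "****"}

with gzip.open("dump.json.gz", "rt") as src, gzip.open("masked.json.gz", "wt") as dst:
    for line in src:
        record = json.loads(line)
        if MASK_ATTR in record["Item"]:
            record["Item"][MASK_ATTR] = mask_map(record["Item"][MASK_ATTR])
        dst.write(json.dumps(record) + "\n")
# Point a DynamoDB S3 import at the masked copy to build the new table.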