
All Questions

-3 votes
1 answer
11 views

How can I upload a CSV file from a local folder to a specific folder within an S3 bucket using Python?

If I have a CSV file stored in a specific local folder, how can I upload it to a specific folder within an S3 bucket using Python?
Buddhadeb Mondal
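A minimal sketch of one common answer: build the destination key from the "folder" prefix plus the file's base name, then call boto3's `upload_file`. The bucket, folder, and path names here are placeholders, and credentials are assumed to come from the usual boto3 chain (environment variables, `~/.aws`, or an instance role).

```python
import os

def s3_key_for(folder: str, local_path: str) -> str:
    """Join the target 'folder' prefix and the file's base name into an S3 key."""
    return f"{folder.strip('/')}/{os.path.basename(local_path)}"

def upload_csv(local_path: str, bucket: str, folder: str) -> str:
    """Upload a local CSV into a 'folder' (key prefix) of an S3 bucket."""
    import boto3  # credentials resolved by boto3's normal lookup chain
    key = s3_key_for(folder, local_path)
    boto3.client("s3").upload_file(local_path, bucket, key)
    return key
```

For example, `upload_csv("/data/report.csv", "my-bucket", "incoming/csv")` would create the object `incoming/csv/report.csv`.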
0 votes
1 answer
35 views

AWS boto3 can't create a bucket - Python

I'm facing an issue where my code fails to create a bucket in AWS using boto3 in Python. Below is my code: import boto3 s3 = boto3.resource('s3') def create_bucket(bucket_name, region='us-east-...
Kusuma ningrat
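A frequent cause of this failure is the region handling: `create_bucket` must include a `CreateBucketConfiguration` with a `LocationConstraint` for every region except `us-east-1`, which must *not* receive one. A sketch of that quirk, with the bucket name as a placeholder:

```python
def create_bucket_kwargs(bucket_name: str, region: str) -> dict:
    """Build create_bucket arguments; us-east-1 must NOT get a LocationConstraint."""
    kwargs = {"Bucket": bucket_name}
    if region != "us-east-1":
        kwargs["CreateBucketConfiguration"] = {"LocationConstraint": region}
    return kwargs

def create_bucket(bucket_name: str, region: str = "us-east-1") -> None:
    import boto3
    s3 = boto3.client("s3", region_name=region)
    s3.create_bucket(**create_bucket_kwargs(bucket_name, region))
```

Remember too that bucket names are globally unique, so `BucketAlreadyExists` errors are unrelated to the region logic.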
0 votes
0 answers
33 views

Looking to skip over specific "subfolder" when using list_objects_v2 in AWS Lambda Function

I have an AWS Lambda function that will "move" objects in my S3 bucket to another location in the same bucket (changes the key) after a certain amount of time. I have this location, the ...
BarbaricBrew
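One way to approach this: `list_objects_v2` has no "exclude" parameter, so list everything under the working prefix and filter out keys under the subfolder you want left alone. A sketch, with the bucket and prefixes as placeholders:

```python
def keys_outside_prefix(keys, skip_prefix: str):
    """Drop keys under skip_prefix (the 'subfolder' that should be left alone)."""
    return [k for k in keys if not k.startswith(skip_prefix)]

def list_movable_keys(bucket: str, prefix: str, skip_prefix: str):
    import boto3
    paginator = boto3.client("s3").get_paginator("list_objects_v2")
    keys = []
    for page in paginator.paginate(Bucket=bucket, Prefix=prefix):
        keys.extend(obj["Key"] for obj in page.get("Contents", []))
    return keys_outside_prefix(keys, skip_prefix)
```

If the skipped subfolder sits directly under the prefix, passing `Delimiter="/"` to the paginator is an alternative that groups subfolders into `CommonPrefixes` instead of listing their contents.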
0 votes
0 answers
13 views

Connect to multiple AWS accounts using ARN in Python

Friends, I need help with "Connect to multiple AWS accounts using ARN" in Python; I found the link below for reference. https://docs.aws.amazon.com/IAM/latest/UserGuide/tutorial_cross-account-with-...
Pratik Asthana
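The usual pattern behind that tutorial is STS `AssumeRole`: call it once per target account's role ARN and build a separate `boto3.Session` from each set of temporary credentials. A sketch (the role ARN and session name are placeholders):

```python
def session_kwargs(assume_role_response: dict) -> dict:
    """Map an STS AssumeRole response onto boto3.Session keyword arguments."""
    creds = assume_role_response["Credentials"]
    return {
        "aws_access_key_id": creds["AccessKeyId"],
        "aws_secret_access_key": creds["SecretAccessKey"],
        "aws_session_token": creds["SessionToken"],
    }

def session_for_role(role_arn: str, session_name: str = "cross-account"):
    import boto3
    resp = boto3.client("sts").assume_role(
        RoleArn=role_arn, RoleSessionName=session_name
    )
    return boto3.Session(**session_kwargs(resp))
```

Calling `session_for_role` once per account ARN yields independent sessions, each scoped to that account's role.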
1 vote
1 answer
109 views

Reading data from S3 in pyflink

I want to set up a datastream in PyFlink that reads data from an S3 bucket and does some processing with it, but I'm unable to read files from the bucket. I have provided a minimal code snippet and ...
Sumit Kashyap
0 votes
1 answer
210 views

Extract a very large zip file on AWS S3 using Lambda Functions

I'm trying to read a very large zip file in an S3 bucket and extract its data into another S3 bucket using the code below as a Lambda function: import json import boto3 from io import BytesIO import ...
Ali AzG
0 votes
1 answer
168 views

How to process files in an S3 bucket one by one through AWS Lambda until no more files remain

The scenario is this: I am expecting to receive files from various sources through AppFlow into an S3 bucket location. The files may come in at different times of the day, multiple times. I am ...
Healer77Om
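One workable pattern is a drain loop: list a batch, process and delete each object, and stop when a listing comes back empty. Deleting (or moving) each processed object is what makes "until no more files remain" terminate. A sketch, with the core loop separated out so the S3 details stay at the edge:

```python
def drain(list_batch, process, delete):
    """Repeatedly list, process, and delete objects until a listing is empty."""
    handled = 0
    while True:
        keys = list_batch()
        if not keys:
            return handled
        for key in keys:
            process(key)
            delete(key)
            handled += 1

def drain_bucket(bucket: str, prefix: str, process):
    import boto3
    s3 = boto3.client("s3")

    def list_batch():
        resp = s3.list_objects_v2(Bucket=bucket, Prefix=prefix, MaxKeys=100)
        return [o["Key"] for o in resp.get("Contents", [])]

    return drain(list_batch, process, lambda k: s3.delete_object(Bucket=bucket, Key=k))
```

Watch the Lambda's 15-minute timeout: for large backlogs it is safer to let an S3 event trigger invoke the function per object, or to re-invoke the function when time runs low.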
0 votes
0 answers
20 views

Resource object filter is returning prefix without file

I have this issue, but don't know why it is happening: s3 = boto3.resource('s3') bucket = s3.Bucket('bucket_name') for obj in bucket.objects.filter(Prefix='prefix/a/'): print(obj) s3.ObjectSummary(...
imatiasmb
0 votes
0 answers
33 views

Python reading jpg file from s3 unidentified image error

#boto3 version = 1.28.3 #botocore version = 1.31.65 import boto3 from PIL import Image from io import BytesIO s3 = boto3.client('s3') new_obj = s3.get_object(Bucket=bucket, Key="path/to/...
data_person
0 votes
1 answer
70 views

How to use Python's S3FS to delete a file that contains dashes and brackets in the filename?

I have a file named data_[2022-10-03:2022-10-23].csv.gzip in S3, inside a bucket and folder named s3://<bucket_name>/data/cache/ I am attempting to delete this file using S3FS. When I attempt to ...
Edy Bourne
2 votes
1 answer
209 views

Read the latest S3 parquet files partitioned by date key using Polars

I have parquet files stored in an S3 location which are partitioned by date key. Using Polars, I need to read the parquet file(s) from the latest date key folder. Here is an example of my S3 structure: ...
Balaji Venkatachalam
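One approach: list the date-key "folders" with a delimiter, pick the lexicographically greatest (ISO dates sort correctly as strings), and point Polars at that prefix. Assumptions: the partition layout is `date=YYYY-MM-DD/`-style prefixes, and `polars.read_parquet` reads `s3://` globs via its cloud/fsspec support.

```python
def latest_partition(prefixes):
    """Pick the max date-keyed prefix, e.g. 'date=2024-06-01/'; ISO dates sort lexically."""
    return max(prefixes)

def read_latest(bucket: str, root: str):
    import boto3
    import polars as pl
    resp = boto3.client("s3").list_objects_v2(
        Bucket=bucket, Prefix=root, Delimiter="/"
    )
    prefixes = [p["Prefix"] for p in resp.get("CommonPrefixes", [])]
    return pl.read_parquet(f"s3://{bucket}/{latest_partition(prefixes)}*.parquet")
```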
0 votes
0 answers
25 views

Abort an in-progress S3 bucket upload using boto3

I want to implement an S3 bucket folder upload API and an abort API. The upload API should show the progress of the upload, and the abort API should abort an upload that's currently in progress and cancel ...
code_monkey
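For cancellation, the low-level multipart API is the piece that supports it: `create_multipart_upload` returns an `UploadId`, and `abort_multipart_upload` with that id cancels the transfer and discards the parts already sent. A sketch of tracking one abortable upload plus a progress helper (the class shape is illustrative, not a boto3 API):

```python
def percent_done(sent: int, total: int) -> int:
    """Integer progress percentage, clamped to 0-100."""
    if total <= 0:
        return 0
    return min(100, (sent * 100) // total)

class AbortableUpload:
    """Track an in-progress multipart upload so it can be aborted on demand."""

    def __init__(self, bucket: str, key: str):
        import boto3
        self.s3 = boto3.client("s3")
        self.bucket, self.key = bucket, key
        self.upload_id = self.s3.create_multipart_upload(
            Bucket=bucket, Key=key
        )["UploadId"]

    def abort(self):
        self.s3.abort_multipart_upload(
            Bucket=self.bucket, Key=self.key, UploadId=self.upload_id
        )
```

Progress reporting can feed `percent_done` from the `Callback` parameter that the high-level transfer methods accept, which is invoked with the number of bytes transferred so far.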
-1 votes
1 answer
280 views

Write a file (Python) to S3 from Databricks without using boto3?

I have my access key and secret key for AWS, but I would like to write the file to the S3 bucket without exposing my access key and secret key in the Python code.
harry
0 votes
2 answers
34 views

Amazon SageMaker: deploy a single cell of code to a higher instance

I am currently running code in an Amazon SageMaker Jupyter notebook (not JupyterLab, just a plain Jupyter notebook) on the 'ml.t3.2xlarge' instance. There is one line of code shown below, where I am ...
Sar99
1 vote
1 answer
161 views

Lambda hangs while uploading to S3, while uploading from a local server works just fine

The lambda is in a public subnet, but the S3 bucket is public regardless. The lambda has the FullS3Access IAM role, and I tried making an endpoint for S3 in my VPC, to no avail. The bucket's only ...
Nikolaisyl
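The classic cause of this symptom: a Lambda attached to a VPC has no route to S3's public endpoint unless the subnet has a NAT gateway or an S3 gateway endpoint whose route-table entry covers that subnet (the bucket being public is irrelevant; the function still needs a network path). While diagnosing, tight client timeouts turn the silent hang into a quick, loggable error. A sketch (the timeout values are illustrative):

```python
def client_config_kwargs(connect_timeout: int = 5, read_timeout: int = 10) -> dict:
    """Tight timeouts make a missing network path fail fast instead of hanging."""
    return {
        "connect_timeout": connect_timeout,
        "read_timeout": read_timeout,
        "retries": {"max_attempts": 2},
    }

def make_s3_client():
    import boto3
    from botocore.config import Config
    return boto3.client("s3", config=Config(**client_config_kwargs()))
```

If a connect timeout fires inside the VPC but not locally, the route table or endpoint association is the place to look next.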
