
Questions tagged [amazon-s3]

Amazon S3 (Simple Storage Service) is an online object storage service from Amazon Web Services. QUESTIONS MUST BE ABOUT PROGRAMMING. Questions about general S3 support, functionality, configuration, etc. are OFF-TOPIC.

-1 votes
0 answers
7 views

How Can I Reduce the LCP Metric When Serving Images from an S3 Bucket

I am currently serving images from an Amazon S3 bucket for a web application and utilizing S3 Transfer Acceleration to speed up the delivery. Despite this, I’m experiencing a high Largest Contentful ...
0 votes
0 answers
6 views

Discussion: Data ingestion from SharePoint to Snowflake

Hi, I'm looking for suggestions to ingest data from SharePoint into Snowflake. Currently we have a Power Automate workflow which triggers the data copy from SharePoint to S3. Then we use the Snowpipe COPY command ...
-1 votes
1 answer
33 views

pd.to_datetime() not consistently working to convert objects

I have been working with this data (CSV) that exists in an AWS S3 bucket. When I pull the data I have to transform all the columns to their correct dtypes. All other dtypes are converting properly ...
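For excerpts like the one above, the usual first debugging step is to make the failing rows visible instead of letting the conversion silently misbehave. A minimal sketch, assuming a hypothetical column with mixed valid and invalid date strings:

```python
import pandas as pd

# Hypothetical column: one valid date, one impossible date, one non-date.
s = pd.Series(["2024-01-15", "2024-02-30", "not a date"])

# errors="coerce" turns unparseable values into NaT instead of raising,
# so the rows that block the dtype conversion can be inspected directly.
parsed = pd.to_datetime(s, errors="coerce")
print(parsed.isna().sum())  # count of rows that failed to parse
print(s[parsed.isna()])     # the offending original values
```

Filtering the original series by `parsed.isna()` usually reveals whether the problem is a second date format, whitespace, or genuinely bad values.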
29 votes
2 answers
13k views

Permanently restore Glacier to S3

I'm wondering whether there is an easy way to permanently restore Glacier objects to S3. It seems that you can only restore Glacier objects for a certain amount of time, which you specify when restoring to S3. ...
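This question comes up often: a Glacier restore is always temporary, and the standard pattern for making it permanent is to restore the object and then copy it onto itself with a non-archival storage class. A minimal sketch assuming boto3, with placeholder bucket and key names:

```python
def make_restore_permanent(s3, bucket, key, days=2):
    """Permanently move an archived S3 object back to the STANDARD class.

    `s3` is a boto3 S3 client (e.g. boto3.client("s3")); bucket and key
    are placeholders here.
    """
    # Step 1: request a temporary restore of the archived object.
    s3.restore_object(
        Bucket=bucket,
        Key=key,
        RestoreRequest={"Days": days,
                        "GlacierJobParameters": {"Tier": "Standard"}},
    )
    # Step 2: once the restore has completed (in a real job you would
    # poll head_object between these calls; a restore can take hours),
    # an in-place copy with a new storage class rewrites the object out
    # of Glacier for good.
    s3.copy_object(
        Bucket=bucket,
        Key=key,
        CopySource={"Bucket": bucket, "Key": key},
        StorageClass="STANDARD",
    )
```

In practice the two steps are separated: poll `head_object` until its `Restore` field reports `ongoing-request="false"` before issuing the copy.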
161 votes
41 answers
386k views

The AWS Access Key Id does not exist in our records

I created a new Access Key and configured that in the AWS CLI with aws configure. It created the .ini file in ~/.aws/config. When I run aws s3 ls it gives: A client error (InvalidAccessKeyId) ...
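A frequent cause of `InvalidAccessKeyId` after a fresh `aws configure` is a stale key set in environment variables, which take precedence over the files under `~/.aws`. A small diagnostic sketch (the function name is hypothetical):

```python
import os

def aws_env_overrides():
    """Return any AWS credential variables set in the environment.

    Environment variables override ~/.aws/credentials and ~/.aws/config
    in the credential resolution chain, so an old AWS_ACCESS_KEY_ID
    exported in a shell profile can shadow the freshly configured key.
    """
    names = ("AWS_ACCESS_KEY_ID", "AWS_SECRET_ACCESS_KEY",
             "AWS_SESSION_TOKEN", "AWS_PROFILE")
    return {n: os.environ[n] for n in names if n in os.environ}

# If this prints a non-empty dict, those values are what the CLI and
# SDKs actually use, regardless of what `aws configure` wrote to disk.
print(aws_env_overrides())
```

An empty result points the search elsewhere, e.g. at the wrong profile being selected or at an access key that was deactivated in IAM.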
0 votes
0 answers
10 views

Uploading from OneDrive/Sharepoint in Laravel App

I'm looking for a solution to allow users to authenticate then upload files from their Sharepoint/OneDrive to S3 within a Laravel application. Has anyone been able to accomplish this before? I've only ...
0 votes
2 answers
31 views

Too many "Authorized committer" errors after upgrading to Pyspark==3.5.1

The problem: I have recently upgraded my apps to run on Spark 3.5.1 + YARN 3.3.6, and I am observing frequent failures mentioning "Authorized committer". The apps run PySpark and I observe the error ...
0 votes
0 answers
10 views

How to connect Athena ODBC using AWS Learner Lab

I downloaded the Athena ODBC driver to connect to Amazon Athena. When I go to configure a data source in ODBC Data Sources (64-bit) on my own computer, I can't seem to get the correct secret access key or ...
0 votes
1 answer
27 views

AWS Serverless framework, stack deploy stuck "CREATE_IN_PROGRESS"

I have a problem while using the Serverless Framework to deploy my stack to AWS. I'm running serverless deploy --stage tst --region eu-west-3 --verbose. Then 2 of my functions get stuck in "...
0 votes
1 answer
16 views

How much data in S3 will be backed up the first time?

Let's say I have an S3 bucket which I created and uploaded files to a year ago. I kept it unchanged for a year. Now I turn on backup on it using continuous backup and set the total retention period ...
1 vote
2 answers
1k views

How to set a BucketPolicy through CloudFormation after the April 2023 ACL restriction

I'm following testdrive.io's serverless-fastapi course. It uses CloudFormation to set up the bucket and bucket policy. The bucket is created just fine; however, there are issues creating the bucket ...
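For context on this class of failure: buckets created after April 2023 disable ACLs and block public access by default, so templates that relied on ACLs break. The usual fix is to grant access via a bucket policy and relax only the policy-related public access block flags. A hedged CloudFormation sketch with placeholder resource names (not the course's actual template):

```yaml
Resources:
  MyBucket:
    Type: AWS::S3::Bucket
    Properties:
      # ACL-based flags stay locked; only policy-based access is opened.
      PublicAccessBlockConfiguration:
        BlockPublicAcls: true
        IgnorePublicAcls: true
        BlockPublicPolicy: false
        RestrictPublicBuckets: false
  MyBucketPolicy:
    Type: AWS::S3::BucketPolicy
    Properties:
      Bucket: !Ref MyBucket
      PolicyDocument:
        Version: "2012-10-17"
        Statement:
          - Effect: Allow
            Principal: "*"
            Action: s3:GetObject
            Resource: !Sub "${MyBucket.Arn}/*"
```

With `BlockPublicPolicy: true` (the default), the `PutBucketPolicy` call that CloudFormation makes for the policy resource is rejected, which is the typical cause of this stack failure.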
17 votes
4 answers
13k views

java.net.URI get host with underscores

I got strange behavior from that method: import java.net.URI; URI url = new URI("https://pmi_artifacts_prod.s3.amazonaws.com"); System.out.println(url.getHost()); // returns null URI url2 = ...
0 votes
2 answers
1k views

How can I import a very large CSV into DynamoDB?

So I have a very large CSV file in my S3 bucket (2M+ lines) and I want to import it into DynamoDB. What I tried: Lambda. I managed to get the Lambda function to work, but only around 120k lines were ...
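The stalling-at-120k symptom described above usually means the job ran into Lambda's timeout or wrote items one request at a time. The standard approach is to stream rows and write through DynamoDB's batch writer. A minimal sketch assuming boto3, with the table resource passed in (all names are placeholders):

```python
def load_rows(table, rows):
    """Write an iterable of dict rows into a DynamoDB table.

    `table` is a boto3 DynamoDB Table resource. batch_writer() groups
    items into 25-item BatchWriteItem requests and retries unprocessed
    items; streaming `rows` (e.g. from csv.DictReader over the S3 object
    body) keeps memory flat instead of loading the whole file.
    """
    count = 0
    with table.batch_writer() as batch:
        for row in rows:
            batch.put_item(Item=row)
            count += 1
    return count
```

For a 2M-row file, a long-running environment (ECS, Glue, or S3's native "Import from S3" feature) avoids Lambda's 15-minute ceiling entirely; if Lambda must be used, splitting the file and fanning out is the common workaround.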
-3 votes
1 answer
11 views

How can I upload a CSV file from a local folder to a specific folder within an S3 bucket using Python?

If I have a CSV file stored in a specific local folder, how can I upload it to a specific folder within an S3 bucket using Python?
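A minimal sketch answering the question above, assuming boto3; the bucket, folder, and path names are placeholders. The key point is that S3 has no real folders, so the "folder" is just a key prefix:

```python
import os

def upload_csv_to_folder(s3, local_path, bucket, folder):
    """Upload a local CSV into a 'folder' (key prefix) of an S3 bucket.

    `s3` is a boto3 S3 client (e.g. boto3.client("s3")). A prefix such
    as "reports/" in the object key is what the S3 console displays as
    a folder.
    """
    key = f"{folder.strip('/')}/{os.path.basename(local_path)}"
    s3.upload_file(local_path, bucket, key)
    return key

# Hypothetical usage:
# s3 = boto3.client("s3")
# upload_csv_to_folder(s3, "/tmp/data.csv", "my-bucket", "reports/")
```

`upload_file` handles multipart uploads automatically for large files, which is why it is usually preferred over `put_object` for this task.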
3 votes
1 answer
1k views

Testing Thanos object storage upload without waiting 2 hours

I was working on a docker-compose file using Prometheus to remote-write data to a Thanos receiver. This data would then be queried by Thanos Querier as well as uploaded to MinIO. When configuring the ...
