I have a 3rd party library that calls a delegate every time there is a buffer of data, and I need to get this buffer into a stream and send the stream to S3.

So I use a FileStream to write to a temporary file:

parser.FileHandler += (buffer, bytes) =>
{
    fileStream.Write(buffer, 0, bytes);
};

...and then the file is uploaded to S3:

fileStream.Position = 0; // rewind so the upload reads from the start
TransferUtility transferUtility = new(client);
transferUtility.UploadAsync(fileStream, BucketName, filename).Wait();
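
For reference, fileStream is just a stream over a temporary file, created along these lines (using DeleteOnClose so the temp file cleans itself up after the upload):

var fileStream = new FileStream(
    Path.GetTempFileName(),
    FileMode.Create, FileAccess.ReadWrite, FileShare.None,
    bufferSize: 81920,
    FileOptions.DeleteOnClose); // file is removed when the stream is disposed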

MemoryStream is not an option since the files being sent could be very large.

I can't help thinking there is a better way: how can I get the buffered chunks into S3 directly, without a temporary file or a MemoryStream?

But it seems impossible because the library sending the data only provides chunks, not a stream.

  • You could implement a specialized Stream class, fed by the parser, that lets transferUtility read the data as it arrives (see the sketch after these comments). Commented Jul 6 at 7:28
  • S3 allows chunked (multipart) uploads; it might be worth uploading each buffer as a separate part. Commented Jul 7 at 0:29
  • Thanks, but S3 only allows up to 10,000 parts, and I don't know what chunk sizes will be sent to me; they could be as small as 1 KB.
    – Andy
    Commented Jul 7 at 7:24
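
Following up on the first comment, here is a minimal sketch of such a specialized Stream: a read-only, forward-only stream backed by a producer/consumer queue. The delegate enqueues a copy of each chunk and the upload reads the chunks as they arrive. All names here (ChunkQueueStream, Enqueue, Complete) are invented for illustration, and note that TransferUtility may require a seekable stream or a known content length depending on the SDK version; if it rejects a forward-only stream, the low-level multipart API (InitiateMultipartUploadAsync / UploadPartAsync) is the fallback.

using System;
using System.Collections.Concurrent;
using System.IO;
using System.Threading;

public sealed class ChunkQueueStream : Stream
{
    // Bounded so a slow upload applies back-pressure to the parser.
    private readonly BlockingCollection<byte[]> _chunks = new(boundedCapacity: 64);
    private byte[] _current;
    private int _offset;

    // Called from the parser delegate. Copies the chunk, since the
    // library may reuse its buffer after the callback returns.
    public void Enqueue(byte[] buffer, int count)
    {
        var copy = new byte[count];
        Array.Copy(buffer, copy, count);
        _chunks.Add(copy);
    }

    // Called when the parser is finished, so Read can report end-of-stream.
    public void Complete() => _chunks.CompleteAdding();

    public override int Read(byte[] buffer, int offset, int count)
    {
        // Fetch the next chunk once the current one is exhausted.
        if (_current == null || _offset == _current.Length)
        {
            if (!_chunks.TryTake(out _current, Timeout.Infinite))
                return 0; // adding completed and queue drained => end of stream
            _offset = 0;
        }
        int n = Math.Min(count, _current.Length - _offset);
        Array.Copy(_current, _offset, buffer, offset, n);
        _offset += n;
        return n;
    }

    public override bool CanRead => true;
    public override bool CanSeek => false;
    public override bool CanWrite => false;
    public override long Length => throw new NotSupportedException();
    public override long Position
    {
        get => throw new NotSupportedException();
        set => throw new NotSupportedException();
    }
    public override void Flush() { }
    public override long Seek(long offset, SeekOrigin origin) => throw new NotSupportedException();
    public override void SetLength(long value) => throw new NotSupportedException();
    public override void Write(byte[] buffer, int offset, int count) => throw new NotSupportedException();
}

Wiring it up with the names from the question (parser.Run() is a stand-in for however the library is actually driven):

var chunkStream = new ChunkQueueStream();
parser.FileHandler += (buffer, bytes) => chunkStream.Enqueue(buffer, bytes);

// Start the upload first so it consumes chunks while the parser produces them.
var upload = transferUtility.UploadAsync(chunkStream, BucketName, filename);
parser.Run();
chunkStream.Complete();
upload.Wait();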
