Uploading large amounts of data

I need to use large amounts of data (~200GB), which cannot be that uncommon. What is the recommended method for dealing with this? Spinning up a GPU instance and copying directly into it is out of the question, as I do not have a Gbps upload.

So the data has to be uploaded to some service and transferred from there, which is doable, but feels wasteful, as it gets deleted afterwards and there is some cost, particularly egress. Does Lambda have some solution, so I do not have to move the data unnecessarily?

All right, "How to Transfer Data to Lambda Cloud GPU Instances" claims that Lambda does not offer persistent storage (although this may be a bit out of date, as there is a Storage item in my account). In that case, a question for everyone: what cloud solution do you use? Backblaze looks good, but I am not sure about data transfer speed.

@comodoro what did you figure out for uploading large data? I have the same issue.

As I do not need the data very often, I went with Backblaze, splitting it into chunks and verifying them after download. Speed is okay; unpacking the data actually takes longer than transferring it. The disadvantage is egress cost, so for frequent use of the data I would consider Wasabi.
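For anyone finding this later, the chunk-and-verify workflow can be sketched roughly like this (file names such as `data.tar` are placeholders, and the small stand-in file is just for demonstration; the upload/download step in between would use whatever client your provider offers, e.g. the B2 CLI or rclone):

```shell
# Stand-in for the real archive (replace with your actual ~200GB tarball).
head -c 10M /dev/urandom > data.tar

# Before upload: split into fixed-size chunks and record checksums.
split -b 1M data.tar data.tar.part.
sha256sum data.tar.part.* > checksums.sha256

# ...upload data.tar.part.* plus checksums.sha256 to the bucket...

# After downloading the chunks on the GPU instance: verify, then reassemble.
sha256sum -c checksums.sha256
cat data.tar.part.* > data_restored.tar
cmp data.tar data_restored.tar && echo OK
```

Chunking also makes interrupted transfers cheap to resume, since only the failed pieces need to be re-fetched.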