Streaming uploads
Apparently the Foundation library already performs multipart uploads for very large files; this is why we can't use Backblaze or Cloudflare with Matrix Media Repo: they don't support the Amazon S3 checksum headers for multipart uploads.
Since we are already doing streaming multipart uploads, it would be nice to avoid reading the whole file into memory and then storing a second copy of it once it's encrypted.
Instead, we should figure out how to create a special InputStream that reads data and encrypts it on the fly, and provide this encrypted stream to the URLSession.
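A minimal sketch of what that could look like, assuming a CryptoKit-based scheme: the class name `EncryptingInputStream`, the 64 KiB chunk size, and the per-chunk AES-GCM framing are all illustrative assumptions, not an existing API. The stream wraps a plaintext InputStream and only ever holds one chunk of ciphertext in memory at a time.

```swift
import Foundation
import CryptoKit

// Hypothetical sketch, not a production implementation. A real subclass
// would also need to override streamStatus, streamError, delegate,
// property(forKey:), and the run-loop scheduling methods, and would need
// chunk framing/nonce handling that the decrypting side understands.
final class EncryptingInputStream: InputStream {
    private let inner: InputStream
    private let key: SymmetricKey
    private var buffer = Data()   // ciphertext not yet handed to the caller
    private var finished = false

    init(wrapping inner: InputStream, key: SymmetricKey) {
        self.inner = inner
        self.key = key
        super.init(data: Data())  // InputStream requires a designated init
    }

    override var hasBytesAvailable: Bool { !finished || !buffer.isEmpty }
    override func open() { inner.open() }
    override func close() { inner.close() }

    override func read(_ pointer: UnsafeMutablePointer<UInt8>,
                       maxLength len: Int) -> Int {
        // Refill the ciphertext buffer by encrypting one plaintext chunk.
        if buffer.isEmpty && !finished {
            var plain = [UInt8](repeating: 0, count: 64 * 1024)
            let n = inner.read(&plain, maxLength: plain.count)
            if n > 0 {
                // One AES-GCM sealed box per chunk, for illustration only.
                if let sealed = try? AES.GCM.seal(Data(plain[0..<n]), using: key),
                   let combined = sealed.combined {
                    buffer.append(combined)
                }
            } else {
                finished = true   // inner stream hit EOF (or errored)
            }
        }
        let count = min(len, buffer.count)
        buffer.copyBytes(to: pointer, count: count)
        buffer.removeFirst(count)
        return count              // 0 once finished and drained → EOF
    }
}
```

To feed this to URLSession, we could return it from the task delegate's `urlSession(_:task:needNewBodyStream:)` callback (or set it as `request.httpBodyStream`), so the session pulls encrypted bytes on demand instead of us materializing the whole encrypted file first.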