When uploading big files, they are split into chunks, uploaded one by one, and then re-assembled into the original file on the server. If you reach the “Processing files” stage, that normally means all the chunks are uploaded and the re-assembly has started…
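A minimal sketch of the idea (illustration only, not the server's actual implementation): the file is split into fixed-size chunks, each chunk is transferred separately, and the server concatenates them back in order. The 10 MiB chunk size below is just an assumed example value.

```python
CHUNK_SIZE = 10 * 1024 * 1024  # assumed 10 MiB chunk size for illustration

def split_into_chunks(data: bytes, chunk_size: int = CHUNK_SIZE) -> list[bytes]:
    """Split a byte blob into consecutive chunks of at most chunk_size."""
    return [data[i:i + chunk_size] for i in range(0, len(data), chunk_size)]

def assemble_chunks(chunks: list[bytes]) -> bytes:
    """Re-assemble the chunks in order -- the 'Processing files' stage."""
    return b"".join(chunks)

original = b"x" * (25 * 1024 * 1024)        # a 25 MiB "file"
chunks = split_into_chunks(original)
assert len(chunks) == 3                     # 10 + 10 + 5 MiB
assert assemble_chunks(chunks) == original  # round-trip is lossless
```

The assembly step is where a slow CPU or a request timeout can bite: the chunks are all on disk, but joining them takes longer than the server allows.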
The most common error at this stage is “Error when assembling chunks”, and most of the time it means slow processing: the CPU is too weak to finish the job in the allotted time, so the request times out before assembly completes…
I haven’t seen cases where a 1.4 GB file can be uploaded but a 1.7 GB file cannot.
Must have something to do with S3 as storage…
This is most likely one of the chunks…
EDIT:
The discussion here might help (it covers changing the chunk size)… But S3 is something of an exception…
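For reference, on a Nextcloud server the chunk size can usually be adjusted with `occ`; this assumes your version still honors the `files` app’s `max_chunk_size` setting (check the admin docs for your release before relying on it):

```shell
# Set the maximum chunk size to 20 MiB (value is in bytes).
# Run as the web-server user from the Nextcloud install directory.
sudo -u www-data php occ config:app:set files max_chunk_size --value 20971520

# Setting it to 0 disables chunking entirely (uploads go up in one piece).
sudo -u www-data php occ config:app:set files max_chunk_size --value 0
```

Larger chunks mean fewer pieces to re-assemble, which can help with assembly timeouts, at the cost of each individual request being bigger.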