Error: video upload failed. Error: Server returned unexpected status code - 413
For my backend I'm using Node.js. At the moment I'm receiving a buffered file via a POST from the frontend, so to upload the video I have to use a stream. But for videos larger than 100 MB I have to use chunked uploads.
At the moment I have this:
cloudinary.uploader.upload_chunked_stream(
  {
    public_id: this.getPublicId(userId),
    resource_type: "video",
    transformation: [{ height: "640", crop: "scale", quality: "80" }],
    format: "mp4",
    folder: process.env.ENVIRONMENT === "DEV" ? "test" : "production",
  },
  (error, result) => {
    if (result) {
      // do something
    } else {
      // do something
    }
  }
).end(arrayBuffer);
There is absolutely no documentation about this function, so I thought I would ask here.
Whenever I try to upload a video above 100 MB with this method I get this error:
Error: video upload failed. Error: Server returned unexpected status code - 413
Does anybody have an idea why this is?
-
Hi Thijmen,
Would you mind trying the following code?
cloudinary.uploader.upload_large(
  {
    public_id: this.getPublicId(userId),
    resource_type: "video",
    transformation: [{ height: "640", crop: "scale", quality: "80" }],
    format: "mp4",
    folder: process.env.ENVIRONMENT === "DEV" ? "test" : "production",
  },
  (error, result) => {
    if (result) {
      // do something
    } else {
      // do something
    }
  }
).end(arrayBuffer);
Thanks for letting me know if that works for you.
Best,
Loic
-
Hi Loic,
First off, thanks for your reply.
As upload_large requires a path as its first parameter, it will not work. It also returns a promise, so I can't call .end() on it.
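For reference, the documented shape of upload_large takes a file path as its first argument, roughly like this (the path and chunk size here are just illustrative):
cloudinary.uploader.upload_large("/tmp/video.mp4", {
  resource_type: "video",
  chunk_size: 20 * 1000 * 1000, // ~20 MB per chunked request
}, (error, result) => {
  console.log(error, result);
});
Since our file only exists as a buffer in memory, there is no path to pass here.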
-
Would you be able to provide us with the cloud name of the account? If you prefer, you can open a ticket with us internally and share it that way. I want to look over the logs for further details about the request.
With that said, are you not able to get the path of the file for upload_large?
-
The file is getting transferred from an app to the backend as an application/octet-stream, so at this point the only way for us is to use a buffer. We are working on improving our app and also changing this, but sadly, for the moment this is the way it goes.
The cloud name of the account is citylegends. It is a paid account, so uploading up to 300 MB should be OK.
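For context, the buffer arrives on our side roughly like this (a simplified sketch assuming an Express-style backend; the route name and size limit are illustrative):
const express = require("express");
const app = express();

// Accept raw binary bodies up to 300 MB; express.raw() exposes req.body as a Buffer
app.post("/upload", express.raw({ type: "application/octet-stream", limit: "300mb" }), (req, res) => {
  const arrayBuffer = req.body; // the whole video as a Node Buffer
  // ...this buffer is what we hand to the Cloudinary upload code shown earlier
  res.sendStatus(202);
});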
-
Hi Thijmen,
I've been looking through the logs for your account, but there are so many uploads it's hard to see the wood for the trees, so to speak.
Could you please give us an approximate timeframe for when you encountered this issue, as well as your timezone? That'll help us pinpoint the problem.
Thanks.
-
Hi Danny,
I just recreated the issue for you.
Our time zone is Amsterdam, so CEST or UTC/GMT +2 hours. I recreated the issue in the following timeframes:
09/10/2021, 11:17:10 AM
09/10/2021, 11:18:10 AM
Thanks
-
Hi Thijmen,
Thanks for preparing those examples.
On our side, I can see the corresponding API calls, and what I see in both cases is that we successfully received a chunk of the file containing 20 MB, and then the next request was detected as being for approximately 104 MB; that second request received the 413 error.
I can't see from your example code what could cause a problem like that, though.
Is your example file 104 MB, or 124 MB? If it's 124 MB, that would suggest the first 20 MB chunk succeeded and the remaining 104 MB was then sent as a single oversized request. Either way, it seems like the chunking mechanism isn't working correctly for some reason.
What's creating the arrayBuffer in your example? Previous use cases I've seen for this used the "streamifier" library, which I believe has worked for other customers with similar use cases.
Thanks,
Stephen
-
Hi Stephen,
The file is 137 MB. But does Cloudinary handle the chunking?
This is the exact code I used the last time I reproduced the issue:
const stream = cloudinary.uploader.upload_stream(
  {
    public_id: this.getPublicId(userId),
    resource_type: "video",
    transformation: [{ height: "640", crop: "scale", quality: "90" }],
    format: "mp4",
    folder: process.env.CLOUDINARY_FOLDER,
  },
  (error, result) => {
    if (result) {
      resolve({
        // do something
      });
    } else {
      // do something
    }
  },
);
streamifier.createReadStream(arrayBuffer).pipe(stream);
-
Hi Thijmen,
That example code is using upload_stream rather than upload_chunked_stream, so I'm actually surprised we saw two different requests on our side; I thought we'd get the entire file in one request, which would fail with a terminated connection or with an HTTP 413 error.
Does it work like this? (In this example, the stream comes from reading a local file with fs rather than from streamifier):
var cloudinary = require('cloudinary').v2;
var fs = require('fs');

// SDK config is set via the CLOUDINARY_URL environment variable in this example
var options = { resource_type: 'auto', tags: ['test', 'tickets', 'delete_me'], folder: 'tmp/tickets' };

let cld_upload_stream = cloudinary.uploader.upload_chunked_stream(
  options,
  function (error, result) {
    console.log(error, result);
  }
);

var file_reader = fs.createReadStream('/Users/stephen/Downloads/<FILE>.mov').pipe(cld_upload_stream);

Regards,
Stephen
-
Hi Stephen,
I'm very sorry for the confusion; when I reproduced the errors I did indeed use the function upload_chunked_stream.
I copied the current code we have and put it back to upload_stream, and we only allow files up to 100 MB for now. When I use upload_chunked_stream I get the 413 error.
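(Roughly, the interim guard looks like this; the constant name and error message are just illustrative:)
const MAX_UPLOAD_BYTES = 100 * 1024 * 1024; // temporary 100 MB cap while we use upload_stream
if (arrayBuffer.byteLength > MAX_UPLOAD_BYTES) {
  throw new Error("Video exceeds the current 100 MB upload limit");
}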
-
OK, that's confusing; there shouldn't be a difference between the example I sent and an almost-identical example where the stream comes from your arrayBuffer rather than from streamifier's output. Do you see the same problem with a standalone test like my example code? Is it possible the issue is related to how the buffer is created and passed around in your code?
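As a quick debugging sketch, logging something like this right before the upload call would confirm what the buffer actually contains (the expected size is taken from your earlier message):
// Verify the value really is a Buffer of the expected size before piping it to Cloudinary
console.log('is Buffer:', Buffer.isBuffer(arrayBuffer));
console.log('byte length:', arrayBuffer.byteLength); // should be roughly 137 MB
console.log('first bytes:', arrayBuffer.slice(0, 12)); // MP4 files contain 'ftyp' near the start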
-
I will try your exact code and see if there are any differences.
I will come back to you when I have more results. Thanks for all the help so far.