Correct way of uploading from buffer?
Hi,
I have been researching for hours and I found two solutions (using Node.js) for passing a buffer instead of an actual image file. The documentation says the uploader accepts a buffer, and that may work for other platforms, but not for Node -- related issues can be found on npm. The only way I have successfully uploaded images is with these two methods:
-
Hi,
Thanks for your message.
You can indeed upload from a buffer and the way to do this is via the upload_stream method. You'll first need to turn the buffer into a readable stream, which I'm doing using the streamifier library in my example. I'm also setting the file to be uploaded into the "foo" folder, but of course, you can add any other upload parameters to that object that you need.
let cloudinary = require("cloudinary").v2;
let streamifier = require('streamifier');

let cld_upload_stream = cloudinary.uploader.upload_stream(
  {
    folder: "foo"
  },
  function (error, result) {
    console.log(error, result);
  }
);

streamifier.createReadStream(req.file.buffer).pipe(cld_upload_stream);

May I please ask you to try this and let me know how it goes.
Best regards,
Aleksandar
-
Thank you very much for your reply, Aleksandar; finally someone is helping. I have tried this function before, but not with the pipe method. I have two issues with it this way:
cloudinary.v2.uploader.upload_stream({ format: 'jpg' }, (err, res) => {
  if (err) {
    console.log(err);
  } else {
    console.log(`Upload succeed: ${res}`);
    // filteredBody.photo = result.url;
  }
}).end(req.file.processedImage);
The problem with it is that when I try to add the URL to the photo so I can store it in the DB, it returns a blank result. So I tried using an async function and awaiting the result of upload_stream, but it seems not to return a promise, even though it supposedly returns one.
I will now try your solution, but do you perhaps know if this function returns a promise, or where I can find out more about it (it seems not to be in the docs)?
-
It should indeed work with Promises too. Perhaps you can also try something along the lines of:
let cloudinary = require("cloudinary").v2;
let streamifier = require('streamifier');

let uploadFromBuffer = (req) => {
  return new Promise((resolve, reject) => {
    let cld_upload_stream = cloudinary.uploader.upload_stream(
      {
        folder: "foo"
      },
      (error, result) => {
        if (result) {
          resolve(result);
        } else {
          reject(error);
        }
      }
    );

    streamifier.createReadStream(req.file.buffer).pipe(cld_upload_stream);
  });
};

// e.g. inside an async route handler:
let result = await uploadFromBuffer(req);

I'll take a look internally to see whether we have notes on why it isn't present in the main documentation. On a similar note, the callback approach is mentioned in the README of the repository - https://github.com/cloudinary/cloudinary_npm#cloudinaryupload_stream
-
I really want to say thank you for your efforts in helping me! I really appreciate it.
1. What I meant about returning promises is: if I await
cloudinary.uploader.upload
I obtain very different results than if I await
cloudinary.uploader.upload_stream
Awaiting uploader.upload returns the successful upload info (url, public_id, etc.), whereas awaiting upload_stream doesn't.
After your response I understood that maybe I wasn't passing the right parameters.
2. I know this may be a very noob question, but is buffering an image the right way to go? I want users to upload an image and then send it to Cloudinary without it ever being stored on the server, but having read only a few posts about buffering images with Cloudinary, I'm wondering whether I'm on the right track.
3. When images are buffered to Cloudinary, does this take up server memory or the client's browser memory? Maybe buffering images is more harmful than saving to the server and then deleting the file from the file system.
I hope I'm not being a pain with so many questions, but I really like Cloudinary and want to make things work well.
-
Hi,
Based on your last comment, I would just like to confirm: when you say "After your response, I understood that maybe I wasn't passing the right parameters", do you mean that you've managed to resolve the problem with the missing response in the Promise of the upload_stream() method by fixing an issue you had with the parameters? Is that correct?

To answer your second question: based on what you are trying to achieve, I would say the best approach is to embed our Upload Widget in your site's code, so your clients upload their media directly from their browser to Cloudinary's servers.
With the buffer stream approach, the file indeed isn't saved on your server, but it does go through your server's memory, meaning it is still uploaded from your server and not from the client-side.
To avoid any usage of your server's resources, the way to go is either to use the Upload Widget mentioned above, or to do a direct upload from the browser using our jQuery library, whose upload functionality is based on blueimp's jQuery File Upload.
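For reference, a minimal client-side sketch of embedding the Upload Widget could look something along these lines (the cloud name, the unsigned upload preset, and the button id are placeholders you would replace with your own values):

// In the browser, after loading https://upload-widget.cloudinary.com/global/all.js
const widget = cloudinary.createUploadWidget(
  { cloudName: 'your-cloud-name', uploadPreset: 'your-unsigned-preset' },
  (error, result) => {
    if (!error && result && result.event === 'success') {
      // The upload happened directly from the browser; the response
      // includes the delivery URL you can then send to your backend.
      console.log('Uploaded:', result.info.secure_url);
    }
  }
);

document.getElementById('upload_button')
  .addEventListener('click', () => widget.open());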
I hope this answers all of your questions :)
-
Hi,
What is working for me is:
//converting buffer to usable format
const imageFile = dataUri.format(req.file.processedImage.info.format, req.file.processedImage.data);

const result = await cloudinary.uploader.upload(imageFile.content, {
  public_id: req.file.filename,
  folder: "profile/"
});

// using the result to save to database
-
Let's take a step back and try using the upload_stream() method to upload a locally stored image with a promise.
Please try and use the code below to see if you get the response printed to your console once the promise has been fulfilled:
var cloudinary = require("cloudinary").v2;
var fs = require("fs");

let streamUpload = (file_name) => {
  return new Promise((resolve, reject) => {
    let stream = cloudinary.uploader.upload_stream(
      (error, result) => {
        if (result) {
          resolve(result);
        } else {
          reject(error);
        }
      }
    );

    fs.createReadStream(file_name).pipe(stream);
  });
};

async function async_func(file_name) {
  let result = await streamUpload(file_name);
  console.log(result);
}

let file_name = __dirname + '/my_picture.jpg';
async_func(file_name);

Just please make sure that you first temporarily add a photo named 'my_picture.jpg' to the same directory as the script, or alternatively change the path in the file_name variable.
This has been tested on my machine, and the response received was similar to the response from the regular upload() method.
Once we ensure this works for you as expected, we will move on to replacing the local file upload with a buffer image.

Best,
Raz
-
Hi Raz,
I'll keep this short to make it easier to continue:
Yes it worked!
-
Excellent, thank you for the update, I'm glad to hear that.
So now let's proceed with installing Streamifier via npm, as Aleksandar previously suggested.

Once installed, kindly try changing the upload of the local file to an upload of the buffer:
First, add:
let streamifier = require('streamifier');
And then make sure you keep the same code, but replace the local file with the actual buffer from the request.
let streamUpload = (req) => {
  return new Promise((resolve, reject) => {
    let stream = cloudinary.uploader.upload_stream(
      (error, result) => {
        if (result) {
          resolve(result);
        } else {
          reject(error);
        }
      }
    );

    streamifier.createReadStream(req.file.buffer).pipe(stream);
  });
};

async function async_func(req) {
  let result = await streamUpload(req);
  console.log(result);
}

async_func(req);

In case you are still experiencing any issues with getting back the upload response when running the above, please open a support ticket at support@cloudinary.com and one of our agents will be happy to further help you with getting this issue resolved.
Best,
Raz
-
Thank you all for your help! Outstanding service.
-
You're most welcome! Happy that we could help 🙂
-
Great support indeed.
Do you plan to return a promise from
cloudinary.uploader.upload_stream
to simplify code?
-
Hey Neckaros,
Thanks for your feedback :)
We do intend to return a promise from upload_stream() and some other methods as well. This is expected to happen in one of our upcoming releases; however, we currently don't have an ETA to share yet.
Thank you for your patience and understanding 🙏🏻.
Best,
Raz
-
To be able to use async/await with Cloudinary, you can do something like this:
const cloudinary = require("cloudinary").v2;
const { promisify } = require("util");
const cloudinaryUploadLarge = promisify(cloudinary.uploader.upload_large).bind(
  cloudinary
);

let response = await cloudinaryUploadLarge(name, {
  resource_type: "raw",
  public_id: yourPublicId, // your public id here
});
console.log(response);
-
I found that the solution of passing the two parameters (error, result) doesn't work anymore, because the error is now returned on the response, so I changed this part like this:
let stream = cloudinary.uploader.upload_stream(
(result) => {
if (!result.error) {
resolve(result);
} else {
reject(result.error);
}
}
);
-
@Aspiiire YOUTUBE
The callback has error as the first parameter and result as the second. This is the expected behavior.

let stream = cloudinary.uploader.upload_stream(
(error, result) => {
if (result) {
resolve(result);
} else {
reject(error);
}
}
);
-
In cloudinary.config.ts:

import { v2 as cloudinary } from 'cloudinary';

cloudinary.config({
  cloud_name: process.env.CLOUDINARY_CLOUD_NAME,
  api_key: process.env.CLOUDINARY_API_KEY,
  api_secret: process.env.CLOUDINARY_API_SECRET,
});

export { cloudinary };

Function to upload an image using a buffer:

import { UploadApiResponse } from 'cloudinary';
import streamifier from 'streamifier';
import { cloudinary } from './config/cloudinary.config';

async uploadImage(buffer: Buffer): Promise<UploadApiResponse> {
  return new Promise((resolve, reject) => {
    const uploadStream = cloudinary.uploader.upload_stream(
      {
        folder: 'events',
        upload_preset: 'ml_default',
      },
      (error: Error, result: UploadApiResponse) => {
        if (result) resolve(result);
        else reject(error);
      },
    );

    streamifier.createReadStream(buffer).pipe(uploadStream);
  });
}
-
I am hitting a "file too large" error when uploading a video via a stream. How can you increase the upload size beyond 100MB? Do you just specify the chunk size, or do you have to use something like the upload_large method?
-
Hi Adam.
Thanks for getting in touch.
This is actually an account limitation. The maximum file size for video is in fact 100MB on our free plan; however, if you contact us via support@cloudinary.com we may be able to come up with a solution for you.
Thanks,
-Danny
-
And could you suggest a way to do this with a larger stream (larger than 100MB), in chunks?
-
Hi Norbert,
If you're using Cloudinary's Node SDK, as the other commenters on this thread were, there's a dedicated method in the SDK for that, named `upload_large_stream`.
If you're implementing the upload yourself, there's a guide here with the specific requirements for how the chunks should be structured: https://support.cloudinary.com/hc/en-us/articles/208263735-Guidelines-for-self-implementing-chunked-upload-to-Cloudinary
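As a rough sketch of how that method could be used, assuming your SDK version exposes it (the file path, chunk_size, and resource_type below are only illustrative) and reusing the promise-wrapper pattern from earlier in this thread:

const cloudinary = require('cloudinary').v2;
const fs = require('fs');

// Sketch only: upload_large_stream takes upload options (including chunk_size)
// plus a callback, and returns a writable stream you pipe the large file into.
const uploadLargeStream = (filePath) =>
  new Promise((resolve, reject) => {
    const stream = cloudinary.uploader.upload_large_stream(
      { resource_type: 'video', chunk_size: 6000000 },
      (error, result) => (result ? resolve(result) : reject(error))
    );
    fs.createReadStream(filePath).pipe(stream);
  });

uploadLargeStream('./my_large_video.mp4').then(console.log).catch(console.error);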
Thanks,
Stephen
-
Hi,
I'm trying to upload a large file with the upload_large_stream method, but I'm getting
cloudinary.uploader.upload_large_stream is not a function.
here is my code:
let cld_upload_stream = cloudinary.uploader.upload_large_stream(
  {
    folder: video,
    format: 'mp4',
    resource_type: "video",
    chunk_size: 6000000,
  },
  (error: any, result: any) => {
    console.log('result', result);
    if (result) {
      resolve(result);
    } else {
      console.log('stream error', error);
      reject(error);
    }
  }
);

streamifier.createReadStream(file.buffer).pipe(cld_upload_stream);
-
Hi Ibrar,
I'd like to confirm that you are using the Cloudinary Node SDK for development. The `upload_large_stream` method is unique to the Node SDK and not available for use outside of it.
-
Yes, I'm using Cloudinary Node SDK version 1.26.3.
-
Hello Ibrar,
Thank you for getting back.
I understand you have opened a ticket regarding this issue.
I did review the ticket, and you are able to use `upload_large_stream`, but now it seems you are reaching the 100MB maximum video file size limit for Free accounts. More on limits here: https://cloudinary.com/pricing/compare-plans
Can we continue communicating via the ticket?
Thanks,
Thomas
-
I'm using a Plus plan and still getting cloudinary.uploader.upload_large_stream is not a function.
-
Hi Ibrar,
Thanks for getting back.
So can you try `cloudinary.uploader.upload_chunked_stream` instead? I just tested this on a 110MB image via Node.js and it uploaded fine.
I'm looking forward to your response.
Kind Regards,
Thomas
-
Hey Thomas,
Would you be kind enough to share a code snippet? Also, what are the options that we need to pass?
I am actually running into the same problem and for some reason the SDK is not doing chunking correctly.
It creates the first chunk as almost 19MB, and then the second chunk holds the rest of the data as a buffer. I am passing resource_type as 'raw' after trying various times with 'video' to no avail.

return new Promise((resolve, reject) => {
Readable.from(file.buffer).pipe(
this.cloudinary.uploader.upload_chunked_stream(
{
...defaultOptions,
...options,
},
(error: UploadApiErrorResponse, result: UploadApiResponse) => {
if(error) {
console.log(error);
reject(error);
}
console.log('CLOUDINARY_SERVICE RESULT IN CALLBACK>>>>>>>>', result);
resolve(result);
}
)
);
});
-
Hi Abdul,
Thanks for getting back.
Here is a code snippet I used for testing:
require('dotenv').config();
const cloudinary = require('cloudinary').v2;
const fs = require('fs');
const { performance } = require('perf_hooks');
var startTime = performance.now()
let stream = cloudinary.uploader.upload_chunked_stream({timeout:1800000, resource_type: 'video'},
function(error, result){
console.log(result, error);
timer();
}
);
fs.createReadStream('myvideo.mp4').pipe(stream);
function timer() {
  var endTime = performance.now();
  console.log(`Took ${endTime - startTime} milliseconds`);
}

So when using upload_chunked_stream, we seem to default to a resource_type of raw, so set this to video as per the snippet above.
The chunk size of about 20MB sounds correct; that is the default, but it is configurable via the chunk_size parameter.
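For example, a variant of the snippet above with an explicit chunk size could look like this (the values are only illustrative):

// Sketch: same pattern as above, overriding the default ~20MB chunk size.
let chunkedStream = cloudinary.uploader.upload_chunked_stream(
  { resource_type: 'video', timeout: 1800000, chunk_size: 50000000 }, // ~50MB chunks
  function (error, result) {
    console.log(result, error);
  }
);

fs.createReadStream('myvideo.mp4').pipe(chunkedStream);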
Does your upload complete at all? And what is your upload speed?
Can you also test using the standard upload_large method and check if you get the same issue with that? Here is a code snippet for upload_large:
cloudinary.uploader.upload_large("myvideo.mp4",
{timeout:1800000, resource_type: 'video'},
function(error, result) { console.log(result, error); });

For reference, all of my tests were run via the examples in this repo: https://github.com/cloudinary-training/cld-fundamentals-for-developers
I'm looking forward to your response.
Kind Regards,
Thomas