There are a variety of ways to migrate your existing images to Cloudinary, using either the UI or the API.
Via the API
Cloudinary's upload API - The upload API supports only a single file per request, but it allows a high concurrency rate, so you can write a script that uses multiple threads (up to 10) to upload many files at once. You can also make asynchronous calls and tell Cloudinary to perform the upload in the background by adding the 'async' parameter and setting it to 'true'. We recommend using one of our SDKs for this purpose.
Note that the upload API method is not rate limited. That said, there is a concurrency limit when performing many simultaneous operations: you can perform up to 10 simultaneous upload requests. If you need to upload using more than 10 threads, please contact our support. A minimal sketch of this approach follows.
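For illustration, here is a minimal Node.js sketch of the multi-threaded approach, assuming your credentials are available via the CLOUDINARY_URL environment variable; the file list is a hypothetical placeholder.

```js
// Minimal concurrency sketch; assumes CLOUDINARY_URL is set in the environment.
const cloudinary = require('cloudinary').v2;

// Hypothetical list of local files to migrate.
const files = ['img/photo1.jpg', 'img/photo2.jpg', 'img/photo3.jpg'];

const CONCURRENCY = 10; // Cloudinary's simultaneous-upload limit.

async function migrate(files) {
  const queue = [...files];
  // Start up to CONCURRENCY workers that drain the shared queue.
  const workers = Array.from({ length: CONCURRENCY }, async () => {
    while (queue.length > 0) {
      const file = queue.shift();
      try {
        // You can additionally pass { async: true } to have Cloudinary
        // process the upload in the background (the response then
        // reports a pending status instead of the final asset details).
        const result = await cloudinary.uploader.upload(file);
        console.log(`Uploaded ${file} -> ${result.secure_url}`);
      } catch (err) {
        console.error(`Failed ${file}: ${err.message}`);
      }
    }
  });
  await Promise.all(workers);
}

migrate(files);
```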
To speed up the upload process, you can pass a file's existing HTTP/HTTPS, S3, or Google Cloud Storage URL as the file parameter of the upload API instead of sending the actual data. This makes the migration much faster because Cloudinary retrieves the images directly from the specified location, so your code doesn't need to download the files and re-upload them to us.
To specify S3 or Google Cloud Storage URLs, please ensure that the bucket can be accessed by Cloudinary and that you've specified which Cloudinary accounts should be allowed to copy images from the bucket. More info here.
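As a sketch, both forms look like this in Node.js (the URLs and public ID below are hypothetical placeholders, and the s3:// form assumes the bucket has already been shared with your account as described above):

```js
const cloudinary = require('cloudinary').v2;

// Fetch from any public HTTP/HTTPS location (placeholder URL).
cloudinary.uploader
  .upload('https://example.com/images/sample.jpg', { public_id: 'migrated/sample' })
  .then(result => console.log(result.secure_url));

// For a bucket you've shared with Cloudinary, an s3:// (or gs://)
// URL works the same way (placeholder bucket and path).
cloudinary.uploader
  .upload('s3://my-bucket/images/sample.jpg')
  .then(result => console.log(result.secure_url));
```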
If you are using Ruby, Cloudinary's Ruby GEM includes a Migrator tool that manages the migration process for you automatically.
Cloudinary CLI - Using one of the File Management commands, such as 'migrate' to upload a list of external media URLs, or 'upload_dir' to upload a local folder to Cloudinary (see the example below).
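For example, assuming the CLI is installed and configured (the folder name below is a placeholder):

```sh
# Upload every file under a local folder to your Cloudinary account.
cld upload_dir ./local_images

# 'migrate' uploads a list of external media URLs; check its exact
# arguments with:
cld migrate --help
```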
Via the Media Library UI
Media Library upload widget - Selecting multiple files at once, including dragging and dropping folders onto the Media Library window.
The Media Library uses the upload API method, so it is not rate limited either.
Auto-upload remote resources - This feature allows you to map a folder in your Cloudinary account to a corresponding folder on your existing server, or in an existing Amazon S3 or Google Cloud Storage bucket. The images are then uploaded lazily, i.e., each image is fetched and uploaded the first time an end user requests it (see the example below).
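As a sketch, assuming you've mapped a Cloudinary folder named 'remote_media' to a base URL such as https://example.com/images/ in your upload settings (both hypothetical), a delivery URL for an image under that folder triggers the upload on first access:

```js
// Assumes CLOUDINARY_URL is set in the environment.
const cloudinary = require('cloudinary').v2;

// Requesting this URL makes Cloudinary fetch and store
// https://example.com/images/sample.jpg on the first access,
// then serve the stored copy on subsequent requests.
console.log(cloudinary.url('remote_media/sample.jpg'));
// e.g. http://res.cloudinary.com/<cloud_name>/image/upload/remote_media/sample.jpg
```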
Comments
Hi Orly!
I'm using Node.js. How can I upload all the images in a folder?
I'm working on a project where the client should be able to dump all the images inside a folder.
Thanks
Hi,
When uploading directly from the browser, you can set the 'multiple' attribute and allow drag & drop of multiple files. On the other hand, the server-side upload API requires uploading one file at a time.
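For example, here is a minimal browser-side sketch using Cloudinary's upload widget (the cloud name and upload preset are placeholders, and the widget script needs to be loaded on the page):

```js
// Assumes https://upload-widget.cloudinary.com/global/all.js is loaded.
const widget = cloudinary.createUploadWidget(
  {
    cloudName: 'my-cloud',     // placeholder
    uploadPreset: 'my-preset', // placeholder unsigned upload preset
    multiple: true,            // allow selecting/dropping several files
  },
  (error, result) => {
    if (!error && result && result.event === 'success') {
      console.log('Uploaded:', result.info.secure_url);
    }
  }
);
widget.open();
```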
Feel free to open a support ticket and share more information regarding your specific use-case so we can advise further.
Is it possible to add Azure Blobs or Storage accounts the same way S3 works?
Hi,
It's on our roadmap to support other sources for our auto-upload feature.
In the meantime, it's possible to auto upload from any publicly available remote location (i.e., http/https).
Hi guys
This seems a bit like a hole in your otherwise outstanding and unique offering. Your clients might have directories with a LOT of images. I have found that the initial bulk uploading is exceptionally painful.
Ideally one could drag and drop a tgz or zip file into Cloudinary, which then extracts and stores the images. Doesn't seem too hard to do? Accept .tgz and zip files which get extracted?
Secondly, your API should ideally also accept tgz / zip files to extract and store.
Hi Rode,
Thank you for this suggestion, I will open a feature request on your behalf.
Cheers,
Yakir
Here is a tutorial for uploading multiple images with Cloudinary and Node:
https://tycodez.hashnode.dev/upload-multiple-images-to-cloudinary-node-mongodb-express-ck2qn9jgq006qiws1ck5ci9gi
Thank you very much for sharing this :)