To migrate your existing images to Cloudinary, you can write a short script that traverses your images and uploads them one by one using Cloudinary's upload API.
Cloudinary has client integration libraries for many development languages and frameworks, which simplify the calls to our upload API and allow you to integrate Cloudinary uploads into your existing workflows.
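For example, a minimal migration sketch using the Node.js SDK might look like the following. The folder path, credentials, and public ID logic are illustrative assumptions, not a prescribed setup:

// npm install cloudinary
const cloudinary = require('cloudinary').v2;
const fs = require('fs');
const path = require('path');

// Placeholder credentials -- replace with your own account details.
cloudinary.config({
  cloud_name: 'my_cloud',
  api_key: 'my_key',
  api_secret: 'my_secret'
});

// Walk a local folder and upload each file one by one.
async function migrateFolder(dir) {
  for (const name of fs.readdirSync(dir)) {
    const file = path.join(dir, name);
    if (fs.statSync(file).isDirectory()) {
      await migrateFolder(file); // recurse into subfolders
      continue;
    }
    const result = await cloudinary.uploader.upload(file, {
      public_id: path.parse(name).name // assumption: filename as the public ID
    });
    console.log(`Uploaded ${file} -> ${result.secure_url}`);
  }
}

migrateFolder('./images').catch(console.error);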
If you are using Ruby, Cloudinary's Ruby GEM includes a Migrator tool that manages the migration process for you automatically: https://cloudinary.com/documentation/rails_image_and_video_upload#migrating_assets_to_cloudinary
To speed up the upload process, you can provide a file's existing HTTP/HTTPS, S3, or Google Cloud Storage URL as the file parameter of our upload API instead of sending the actual data. This makes the migration much faster because we retrieve the images from the specified location, rather than your code having to download each file and upload its contents to us.
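For instance, with the Node.js SDK (configured as above), fetching from a remote URL could look like this; the URL and public ID are placeholders:

// Cloudinary fetches the file from the URL -- no local download needed.
cloudinary.uploader.upload('https://www.example.com/images/flower.jpg', {
  public_id: 'migrated/flower' // illustrative public ID
}).then(result => console.log(result.secure_url));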
To specify S3 or Google Cloud Storage URLs, please ensure that the bucket can be accessed by Cloudinary and that you've specified which Cloudinary accounts should be allowed to copy images from the bucket: https://cloudinary.com/documentation/upload_images#private_storage_url
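Once the bucket has been whitelisted as described in the link above, the call is the same, just with an s3:// (or gs://) URL; the bucket name and key here are placeholders:

// Upload directly from a whitelisted private S3 bucket.
cloudinary.uploader.upload('s3://my-migration-bucket/photos/flower.jpg', {
  public_id: 'migrated/flower-from-s3' // illustrative public ID
}).then(result => console.log(result.secure_url));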
Another option to help you migrate existing resources automatically is our auto-upload remote resources feature, which allows you to link a folder in your Cloudinary account with a corresponding folder on your existing server or in an existing Amazon S3 or Google Cloud Storage bucket.
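As a sketch of how this works, assuming an auto-upload mapping from the Cloudinary folder remote_media to https://www.example.com/images/ (an illustrative mapping, configured in your console's upload settings), the first request for a delivery URL under that folder makes Cloudinary retrieve and store the remote asset:

// Assumption: 'remote_media' is mapped to https://www.example.com/images/
// via an auto-upload mapping. Requesting this URL for the first time
// fetches example.com/images/sample.jpg into your account automatically.
const url = cloudinary.url('remote_media/sample.jpg');
console.log(url);
// e.g. https://res.cloudinary.com/my_cloud/image/upload/remote_media/sample.jpg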
Finally, using our Media Library UI you can also upload multiple files at once, including dragging and dropping folders onto the Media Library window: https://cloudinary.com/documentation/dam_upload_store_assets#upload_options
Comments
Hi Orly!
I'm using Node.js. How can I upload all the images in a folder?
I'm working on a project where the client should be able to dump all the images inside a folder.
Thanks
Hi,
When uploading directly from the browser, you can set the 'multiple' attribute and allow drag & drop of multiple files. The server-side upload API, on the other hand, requires uploading one file at a time.
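For example, here is a minimal Node.js sketch that fires several single-file uploads in parallel (the file list is a placeholder):

// Each call uploads one file, but the calls can run concurrently.
const files = ['img/a.jpg', 'img/b.jpg', 'img/c.jpg'];
Promise.all(files.map(f => cloudinary.uploader.upload(f)))
  .then(results => results.forEach(r => console.log(r.secure_url)));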
Feel free to open a support ticket and share more information regarding your specific use-case so we can advise further.
Is it possible to add Azure Blobs or Storage accounts the same way S3 works?
Hi,
It's on our roadmap to support other sources for our auto-upload feature.
In the meantime, it's possible to auto-upload from any publicly available remote location (i.e., HTTP/HTTPS).
Hi guys
This seems a bit like a hole in your otherwise outstanding and unique offering. Your clients might have directories with a LOT of images. I have found that the initial bulk uploading is exceptionally painful.
Ideally one could drag and drop a .tgz or .zip file into Cloudinary, which then extracts and stores the images. Doesn't seem too hard to do? Accept .tgz and .zip files which get extracted?
Secondly, your API should ideally also accept tgz / zip files to extract and store.
Hi Rode,
Thank you for this suggestion, I will open a feature request on your behalf.
Cheers,
Yakir
Here is a tutorial for uploading multiple images with Cloudinary and Node:
https://tycodez.hashnode.dev/upload-multiple-images-to-cloudinary-node-mongodb-express-ck2qn9jgq006qiws1ck5ci9gi
Thank you very much for sharing this :)