New cool feature in s3cmd

Sebastian Otaegui

Amazon S3 is an online file storage web service offered by Amazon Web Services. Amazon S3 exposes its service through REST, SOAP, and BitTorrent interfaces.

s3cmd is a command-line tool, written in Python, for interacting with the Amazon S3 service.
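
If you are new to the tool, a few typical invocations look like this (the bucket and file names are placeholders of my choosing):

    # list all of your buckets
    s3cmd ls

    # upload a local file to a bucket
    s3cmd put report.pdf s3://my-bucket/report.pdf

    # download it back
    s3cmd get s3://my-bucket/report.pdf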

Until recently, version 1.5.0 of s3cmd was in alpha, and you could only get it from source or from third-party package repositories.

Version 1.5.0 was released about three weeks ago and adds some very cool features; one of the new options in this release is support for multipart uploads.

How do I install s3cmd?

There are a few different options, a couple of which are discussed on the http://s3tools.org website.

  • If you are on OS X and use Homebrew, you can run brew install s3cmd, which will pull and install version 1.5.0 of the tool.
  • If you are on Linux and use Linuxbrew, you can also run brew install s3cmd.
  • If you are using neither, you can install s3cmd with pip install s3cmd==1.5.0a3 (for some reason the PyPI repository still has an outdated release).
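
Whichever route you take, you can sanity-check the install and set up your AWS credentials right away (both flags below are standard s3cmd options):

    # confirm the installed version
    s3cmd --version

    # interactive setup: prompts for your AWS access key and secret
    s3cmd --configure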

What is the new feature mentioned in the title?

Well, Amazon S3 has some limitations when uploading large files: a single PUT request can upload an object of at most 5 GB, so larger files must be sent in parts.

With the new version of s3cmd, you can add the new option to the put command, as in s3cmd put --continue-put; a large file will be split into parts, and the s3cmd tool will upload each part individually to be joined on the server side by Amazon S3. The output of this command looks like the log below (truncated for brevity):

[Truncated s3cmd put output showing the file uploaded in parts]
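
As a rough sketch, a multipart upload invocation might look like the following (my-bucket and bigfile.iso are placeholder names; --multipart-chunk-size-mb sets the part size in megabytes):

    # upload a large file in parts; S3 joins the parts server side
    s3cmd put --multipart-chunk-size-mb=15 bigfile.iso s3://my-bucket/bigfile.iso

    # if the upload is interrupted, resume it instead of starting over
    s3cmd put --continue-put bigfile.iso s3://my-bucket/bigfile.iso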

So, if you already use s3cmd, go upgrade to the latest version ASAP!

Editor's note: Spantree now recommends using the AWS CLI (aws s3 ...) instead of s3cmd, as it is more actively maintained.
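
For comparison, the rough AWS CLI equivalent of the upload above would be the following (same placeholder names as before):

    # aws s3 cp performs multipart uploads automatically for large files
    aws s3 cp bigfile.iso s3://my-bucket/bigfile.iso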