
S3 static media uploader

Author: phlex
Posted: December 1, 2008
Language: Python
Version: 1.0
Score: -1 (after 1 rating)

This is a bastardisation of a few of the Amazon S3 file uploader scripts that are floating around the web. It uses Boto, but it would be fairly easy to switch to the Amazon-supplied S3 library available for download from their site. It's mostly based on this and this.

It's fairly limited in what it does (I didn't bother os.walk-ing the directory structure), but I use it to quickly upload updated CSS or JavaScript. I'm sure it's a mess code-wise, but it does the job.
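If you do want subdirectories handled, something along the lines of the sketch below would do it. This is a hypothetical variant, not part of the snippet: it assumes the imports, constants and helpers (yui_file, store_in_s3, compress_string, formatdate) from the full listing further down, and walks MEDIA_ROOT instead of the current directory.

# Hypothetical recursive variant: walk MEDIA_ROOT instead of the current directory.
# Assumes the imports, constants and helper functions from the full script below.
def update_s3_recursive():
    for dirpath, dirnames, filenames in os.walk(MEDIA_ROOT):
        # Don't descend into version-control folders.
        dirnames[:] = [d for d in dirnames if d.lower() != '.svn']
        for filename in filenames:
            if filename.lower() == '.ds_store':
                continue
            content_type = mimetypes.guess_type(filename)[0]
            if content_type in MEDIA_TYPES:
                content = yui_file(dirpath, filename)
            else:
                content = open(os.path.join(dirpath, filename), 'rb').read()
            headers = {'Content-Type': content_type or 'text/plain'}
            expires = (datetime.now() + timedelta(days=365*2)).timetuple()
            headers['Expires'] = formatdate(time.mktime(expires), usegmt=True)
            if headers['Content-Type'] in MEDIA_TYPES:
                headers['Content-Encoding'] = 'gzip'
                content = compress_string(content)
            # The S3 key mirrors the path relative to MEDIA_ROOT.
            s3_filename = os.path.join(dirpath, filename).replace(MEDIA_ROOT, '').strip('/')
            store_in_s3(s3_filename, content, headers)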

The script first runs the files through the YUI Compressor, then gzips them before uploading to S3. It also preserves the path structure of the files relative to your MEDIA_ROOT directory. Hopefully someone might find this useful.
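For what it's worth, django.utils.text.compress_string just gzips a string in memory, so if you'd rather not import Django for a deploy script, a roughly equivalent standard-library version (a sketch, Python 2 style to match the snippet) is:

import gzip
from cStringIO import StringIO

# Roughly what django.utils.text.compress_string does: gzip a string in memory.
def compress_string(s):
    zbuf = StringIO()
    zfile = gzip.GzipFile(mode='wb', compresslevel=6, fileobj=zbuf)
    zfile.write(s)
    zfile.close()
    return zbuf.getvalue()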

To use it, fill in your Amazon details, download the YUI Compressor, cd into the folder you want to upload to S3, and run the script: python /path/to/s3_uploader.py

import mimetypes
import os
import time

from datetime import datetime, timedelta
from email.utils import formatdate

from django.utils.text import compress_string

from your_project.settings import MEDIA_ROOT

# Fill in your AWS credentials, bucket and YUI Compressor path before running.
AWS_ACCESS_KEY_ID = 'YOUR_AWS_ACCESS_KEY'
AWS_SECRET_ACCESS_KEY = 'YOUR_AWS_SECRET_KEY'
BUCKET_NAME = 'YOUR_AWS_BUCKET_NAME'
YUI_COMPRESSOR = 'java -jar /path/to/yuicompressor.jar %s -o %s'
# Content types that get minified and gzipped before upload.
MEDIA_TYPES = ('text/css', 'application/javascript', 'application/x-javascript',)

try:
    from boto.s3.connection import S3Connection
    from boto.s3.key import Key
except ImportError:
    raise ImportError('Please install Boto => http://code.google.com/p/boto/')


class BuildError(Exception):
    """Raised when the YUI Compressor exits with a non-zero status."""
    pass

def update_s3():
    """Upload every file in the current directory to S3, minifying and
    gzipping CSS/JavaScript along the way."""
    this_path = os.path.abspath('.')
    # Key prefix: the current path relative to MEDIA_ROOT, so the S3 keys
    # mirror the local directory layout.
    this_folder = this_path.replace(MEDIA_ROOT, '').strip('/')
    for item in os.listdir('.'):
        if item.lower() in ('.svn', '.ds_store',):
            continue
        filename = os.path.normpath(item)
        content_type = mimetypes.guess_type(filename)[0]
        if os.path.isfile(filename):
            if content_type in MEDIA_TYPES:
                # CSS/JS goes through the YUI Compressor first.
                content = yui_file(this_path, filename)
            else:
                content = open('%s/%s' % (this_path, filename,), 'rb').read()
            headers = {}
            headers['Content-Type'] = content_type if content_type else 'text/plain'
            # Far-future Expires header (two years) so browsers cache aggressively.
            expires = (datetime.now() + timedelta(days=365*2)).timetuple()
            headers['Expires'] = formatdate(time.mktime(expires), usegmt=True)
            if headers['Content-Type'] in MEDIA_TYPES:
                headers['Content-Encoding'] = 'gzip'
                content = compress_string(content)
            s3_filename = '%s/%s' % (this_folder, filename,) if this_folder else filename
            store_in_s3(s3_filename, content, headers)

def store_in_s3(filename, content, headers):
    """Push a single file up to the bucket and make it publicly readable."""
    conn = S3Connection(AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY)
    b = conn.create_bucket(BUCKET_NAME)  # returns the bucket, creating it if it doesn't exist
    k = Key(b)
    k.key = filename
    k.set_contents_from_string(content, headers, replace=True)
    k.set_acl('public-read')

def yui_file(this_path, filename):
    """Run a CSS/JS file through the YUI Compressor and return the minified contents."""
    old_file = '%s/%s' % (this_path, filename,)
    new_file = '/tmp/%s' % filename
    status = os.system(YUI_COMPRESSOR % (old_file, new_file,))
    if status != 0:
        raise BuildError('Woops! Something went wrong with the YUI Compressor!')
    compressed = open(new_file, 'rb').read()
    os.remove(new_file)
    return compressed

if __name__ == "__main__":
    update_s3()
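
Once it has run, a quick sanity check is to pull one of the uploaded keys back down with boto and look at its metadata. A sketch, using the same connection details as above (the key name here is just an example):

# Sanity check: HEAD one of the uploaded keys and inspect its metadata.
from boto.s3.connection import S3Connection

conn = S3Connection(AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY)
bucket = conn.get_bucket(BUCKET_NAME)
key = bucket.get_key('css/style.css')  # example key name
if key is not None:
    print key.content_type      # e.g. 'text/css'
    print key.content_encoding  # 'gzip' for CSS/JS uploaded by the script
    print key.size              # compressed size in bytes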

More like this

  1. Template tag - list punctuation for a list of items by shapiromatron 3 months, 1 week ago
  2. JSONRequestMiddleware adds a .json() method to your HttpRequests by cdcarter 3 months, 2 weeks ago
  3. Serializer factory with Django Rest Framework by julio 10 months, 1 week ago
  4. Image compression before saving the new model / work with JPG, PNG by Schleidens 11 months ago
  5. Help text hyperlinks by sa2812 11 months, 3 weeks ago
