
Run and cache only one instance of a heavy request

Author:
farnsworth
Posted:
August 17, 2010
Language:
Python
Version:
1.2
Score:
0 (after 0 ratings)

I have many heavy views that run slowly when accessed at the same time from multiple threads. I wrote this decorator to allow only one thread to run a view at a time and to cache the returned result. Other threads wait for the first thread to finish and use the response from the cache once the executing thread has stored it there.

I also borrowed the idea from MintCache of refreshing a stale cache entry in the background while returning the cached (stale) response until the fresh one is ready.

Usage:

 @single_cacheable(cache_timeout=60,
                   stale_timeout=30,
                   key_template='my_heavy_view-{arg1}-{arg2}')
 def heavy_view(request, arg1, arg2):
     response = HttpResponse()

     # ... your code here

     # the cache timeout may also be set from inside the view
     response._cache_timeout = cache_time

     return response

The "key_template" argument is a template for the cache key. Some of my views take an additional "cache_time" parameter that is set from the parent page; request.path differs for those pages, but they must be cached independently of this parameter.

The "key_template" in the example uses named arguments; if your view takes only positional parameters, use 'my_heavy_view-{0}-{1}-{...}', where {0}, {1}, ... are the view's positional arguments (everything after request).
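As an aside, the key template is expanded with Python's built-in str.format(), so named and positional fields behave exactly as in ordinary string formatting. A standalone illustration (not part of the snippet itself):

```python
# The decorator builds the cache key by calling str.format() on the
# view's arguments (everything after request), so both styles work:
named_key = "sc_" + "my_heavy_view-{arg1}-{arg2}".format(arg1="news", arg2=7)
positional_key = "sc_" + "my_heavy_view-{0}-{1}".format("news", 7)

print(named_key)       # prints sc_my_heavy_view-news-7
print(positional_key)  # prints sc_my_heavy_view-news-7
```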

The line that sets the "key" variable may be changed to key = "sc_" + request.get_full_path() if you want to use the full URL path as the cache key.

from time import time
from threading import Lock
from django.core.cache import cache
from django.utils.cache import patch_response_headers

def single_cacheable(cache_timeout=60, stale_timeout=60, key_template=''):
    def paramed_decorator(func):
        func.__func_lck = Lock()
        def decorated(request, *args, **kw):
            lck = func.__func_lck
            if cache_timeout != 0 and request.method == "GET":
                key = "sc_" + key_template.format(*args, **kw)
                val = cache.get(key)
                ctout = cache_timeout
                if val is None:
                    acquired = lck.acquire(False)
                    try:
                        if not acquired:  # another thread holds the lock
                            lck.acquire(True)  # wait until it finishes
                            val = cache.get(key)  # it may have cached the response already
                            if val is None:
                                resp = func(request, *args, **kw)
                                if hasattr(resp, '_cache_timeout'):
                                    ctout = resp._cache_timeout
                                patch_response_headers(resp, ctout)
                            else:
                                refresh_tm, resp = val
                                return resp
                        else:  # we are the first thread to acquire the lock
                            resp = func(request, *args, **kw)
                            if hasattr(resp, '_cache_timeout'):
                                ctout = resp._cache_timeout
                            patch_response_headers(resp, ctout)
                        cache.set(key, (ctout + time(), resp), ctout + stale_timeout)
                    finally:
                        lck.release()
                else:
                    refresh_tm, resp = val
                    if time() > refresh_tm and refresh_tm > 0:  # entry is stale
                        if not lck.acquire(False):  # another thread is already refreshing
                            return resp  # serve the stale copy meanwhile
                        try:
                            cache.set(key, (0, resp), stale_timeout)  # mark the entry as being refreshed
                            resp = func(request, *args, **kw)  # compute a fresh value
                            if hasattr(resp, '_cache_timeout'):
                                ctout = resp._cache_timeout
                            patch_response_headers(resp, ctout)
                            cache.set(key, (ctout + time(), resp), ctout + stale_timeout)
                        finally:
                            lck.release()
                    else:
                        return resp

                return resp

            else:
                with lck:  # serialize non-cacheable requests as well
                    resp = func(request, *args, **kw)  # run the view

            return resp
        decorated.__doc__ = func.__doc__
        decorated.__dict__ = func.__dict__
        return decorated
    return paramed_decorator

