Snippet List
The standard memcache client uses pickle as its serialization format. It can be handy to use JSON instead, especially when another component (e.g. a backend service) doesn't understand pickle but does understand JSON.
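A minimal sketch of the idea, assuming the python-memcached client (the class and method names below are illustrative, not the snippet's actual code):

import json
import memcache  # python-memcached

class JsonMemcacheClient(object):
    """Thin wrapper that stores values as JSON text instead of pickle."""

    def __init__(self, servers=('127.0.0.1:11211',)):
        self._client = memcache.Client(list(servers))

    def set(self, key, value, time=0):
        # JSON on the wire, so non-Python consumers can read the entry too
        return self._client.set(key, json.dumps(value), time)

    def get(self, key):
        raw = self._client.get(key)
        return None if raw is None else json.loads(raw)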
- memcache
- cache
- json
- memcached
- pickle
A very simple decorator that caches both on the instance and in memcached:
@method_cache(3600)
def some_intensive_method(self):
    return  # do intensive stuff
Alternatively, if you just want to keep it per request and forgo memcaching, just do:
@method_cache()
def some_intensive_method(self):
    return  # do intensive stuff
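For reference, here is a rough sketch of how such a decorator can work, caching first on the instance (so repeat calls within a request are free) and then in Django's cache; the key format and attribute name are assumptions, not the snippet's actual code:

from functools import wraps
from django.core.cache import cache

def method_cache(seconds=0):
    def decorator(func):
        @wraps(func)
        def wrapper(self, *args, **kwargs):
            attr = '_cached_' + func.__name__
            # per-instance cache: free for repeat calls on the same object
            if hasattr(self, attr):
                return getattr(self, attr)
            # NOTE: arguments are ignored in the key, matching the
            # no-argument usage shown above; 'pk' assumes a model-ish object
            key = 'method_cache:%s.%s:%s' % (
                self.__class__.__name__, func.__name__,
                getattr(self, 'pk', id(self)))
            result = cache.get(key) if seconds else None
            if result is None:
                result = func(self, *args, **kwargs)
                if seconds:
                    cache.set(key, result, seconds)
            setattr(self, attr, result)
            return result
        return wrapper
    return decorator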
- memcache
- cache
- decorator
- memcached
- decorators
- caching
This solves the problem of losing session data when you restart memcached: you use a separate memcached instance for sessions, one that you rarely restart.
Use the above code and add the following to your settings.py:
SESSION_ENGINE = "kwippyproject.session_backend"
SESSION_CACHE = 'memcached://127.0.0.1:11200/'
(The above assumes that your sessions' memcached instance is running on port 11200.)
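For context, a guess at what the referenced session_backend module might look like: a cache-based session store bound to its own memcached instance via the SESSION_CACHE setting (this assumes the old-style get_cache() that accepts a backend URI):

# kwippyproject/session_backend.py
from django.conf import settings
from django.core.cache import get_cache
from django.contrib.sessions.backends.cache import SessionStore as CacheSessionStore

# a cache object pointed at the dedicated, rarely-restarted memcached
session_cache = get_cache(settings.SESSION_CACHE)

class SessionStore(CacheSessionStore):
    def __init__(self, session_key=None):
        super(SessionStore, self).__init__(session_key)
        self._cache = session_cache  # override the default cache with ours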
Feel free to contact me in case you need help.
By [Dipankar sarkar](http://dipankar.name)
[email protected]
- django
- python
- memcached
- sessions
Request-phase cache middleware that checks that the cache server is running, and starts it if it is not. It runs on every request: it first checks whether it can get a known item out of the cache; if that fails, it tries to set it. If that fails as well, it decides the server has probably crashed, so it attempts to connect to the server directly, and if the connection fails it launches a new server.
This is probably not useful on large-scale multi-server deployments, since those likely have their own monitoring for crashed services, but I am using it in a shared hosting environment where I have to run my own copy of memcached manually and cannot set up proper service monitoring, so I use this just to make sure the cache server is still running.
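A rough sketch of this approach (old-style middleware; the key name, host/port and memcached command line below are assumptions, not the snippet's actual code):

import socket
import subprocess
from django.core.cache import cache

MEMCACHED_HOST, MEMCACHED_PORT = '127.0.0.1', 11211

class EnsureMemcachedMiddleware(object):
    def process_request(self, request):
        if cache.get('memcached_heartbeat') is not None:
            return None  # cache answered, nothing to do
        cache.set('memcached_heartbeat', '1', 60)
        if cache.get('memcached_heartbeat') is not None:
            return None  # the set worked, server is fine
        # get and set both failed: see if anything is listening at all
        try:
            socket.create_connection((MEMCACHED_HOST, MEMCACHED_PORT), 1).close()
        except socket.error:
            # nothing there, launch our own memcached daemon
            subprocess.Popen(['memcached', '-d', '-m', '64',
                              '-l', MEMCACHED_HOST, '-p', str(MEMCACHED_PORT)])
        return None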
- middleware
- cache
- memcached
This is intended as an alternative to http://www.djangosnippets.org/snippets/155/
Put this in your own cache.py and import it instead of django.core.cache, then use it the same way. We left out the "add" function, but it shouldn't be too hard to write if you want it.
From the above post: "The purpose of this caching scheme is to avoid the dog-pile effect. Dog-piling is what normally happens when your data for the cache takes more time to generate than your server is answering requests per second. In other words, if your data takes 5 seconds to generate and you are serving 10 requests per second, then when the data expires the normal cache schemes will spawn 50 attempts at regenerating the data before the first request completes. The increased load from the 49 redundant processes may further increase the time it takes to generate the data. If this happens then you are well on your way into a death spiral.
MintCache works to prevent this scenario by using memcached to keep track of not just an expiration date, but also a stale date. The first client to request data past the stale date is asked to refresh the data, while subsequent requests are given the stale but not-yet-expired data as if it were fresh, with the understanding that it will get refreshed in a 'reasonable' amount of time by that initial request."
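A minimal sketch of that stale-date scheme, written as a drop-in get/set pair over django.core.cache (not the snippet's exact code; the 30-second grace period is an assumption):

import time
from django.core.cache import cache as _cache

MINT_DELAY = 30  # extra seconds the stale value is kept around

def set(key, value, timeout=300):
    # remember when the value goes stale, and keep the entry in memcached
    # for MINT_DELAY seconds beyond that
    refresh_at = time.time() + timeout
    _cache.set(key, (value, refresh_at), timeout + MINT_DELAY)

def get(key):
    packed = _cache.get(key)
    if packed is None:
        return None
    value, refresh_at = packed
    if time.time() > refresh_at:
        # first caller past the stale date: push the deadline forward so
        # everyone else keeps getting the stale value, then report a miss
        # so this caller regenerates the data
        _cache.set(key, (value, time.time() + MINT_DELAY), MINT_DELAY)
        return None
    return value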
- cache
- memcached
- caching
- mintcache
The cache_smart template tag is a drop-in replacement for Django's default cache tag, with the added bonus of being more resistant to the dog-pile/stampede effect.
This snippet uses an extra cache entry to store the stale time, so the cached value doesn't have to be packed into a tuple and pickled/unpickled just to carry that extra value.
If this extra entry returns None (i.e. it has expired), the tag pushes the stale timeout 30 seconds into the future, so further calls simply return the old value while this request regenerates the new one. A rough sketch of this two-key idea follows after the warning below.
**Warning:** don't use both cache template tags!
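The sketch of the two-key idea, written as a plain helper rather than as the actual template tag (the function and key names here are made up for illustration):

from django.core.cache import cache

def smart_get_or_set(key, generate, timeout=300, grace=30):
    """Return the cached value for `key`, regenerating it via `generate()`
    while concurrent requests keep serving the old value."""
    stale_key = key + '.stale'
    value = cache.get(key)
    if value is not None and cache.get(stale_key) is not None:
        return value  # value present and not yet past its stale time
    # stale marker has expired (or first fill): push it `grace` seconds into
    # the future so other requests keep returning the old value meanwhile
    cache.set(stale_key, '1', grace)
    value = generate()
    cache.set(key, value, timeout + grace)  # value outlives the stale marker
    cache.set(stale_key, '1', timeout)      # marker expires first
    return value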
I've been working with a data set where a single object won't fit into memcached's 1 MB slab limit. These two functions have been useful to me for debugging the size of a data structure once pickled, and for checking whether the pickled data structure is greater than 1 MB.
These functions assume CACHE_BACKEND is memcached, obviously.
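A guess at what such helpers could look like (the names are made up): measure the pickled size of an object and check it against memcached's default 1 MB item limit:

import pickle

MEMCACHED_ITEM_LIMIT = 1024 * 1024  # memcached's default 1 MB item size

def pickled_size(obj):
    """Size in bytes of the object once pickled, roughly what the
    memcached client will try to store."""
    return len(pickle.dumps(obj, pickle.HIGHEST_PROTOCOL))

def fits_in_memcached(obj, limit=MEMCACHED_ITEM_LIMIT):
    """True if the pickled object is small enough for a single item."""
    return pickled_size(obj) <= limit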
MintCache is a caching engine for Django that allows you to get by with stale data while you freshen your breath, so to speak.
The purpose of this caching scheme is to avoid the dog-pile effect. Dog-piling is what normally happens when your data for the cache takes more time to generate than your server is answering requests per second. In other words, if your data takes 5 seconds to generate and you are serving 10 requests per second, then when the data expires the normal cache schemes will spawn 50 attempts at regenerating the data before the first request completes. The increased load from the 49 redundant processes may further increase the time it takes to generate the data. If this happens then you are well on your way into a death spiral.
MintCache works to prevent this scenario by using memcached to keep track of not just an expiration date, but also a stale date. The first client to request data past the stale date is asked to refresh the data, while subsequent requests are given the stale but not-yet-expired data as if it were fresh, with the understanding that it will get refreshed in a 'reasonable' amount of time by that initial request.
I don't think Django has a mechanism for registering alternative cache engines, or if it does I jumped past it somehow. Here's an excerpt from my cache.py where I've just added it alongside the existing code. You'll have to hook it in yourself for the time being. ;-)
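One way to "hook it in yourself" might look like the following in your own cache.py; the MintCacheClass name, its import path and its (host, params) constructor are assumptions modelled on the signature of Django's old cache backends, not the author's actual excerpt:

from django.conf import settings
from mintcache import MintCacheClass  # wherever you pasted the snippet

# e.g. CACHE_BACKEND = 'memcached://127.0.0.1:11211/'
_host = settings.CACHE_BACKEND.split('://', 1)[1].rstrip('/')
cache = MintCacheClass(_host, {})

# elsewhere: from myproject.cache import cache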
More discussion [here](http://www.hackermojo.com/mt-static/archives/2007/03/django-mint-cache.html).
11 snippets posted so far.