
Memcache Basics On Google App Engine

This week is packed with great content on Google Cloud Platform. I will summarize some of the talks we are having in our office. The first talk I gave was on leveraging Memcache when developing with Google App Engine. Here is the summary:

What is Memcache?

Memcache is an in-memory key-value data store. You use put(key, value) and get(key) to save and fetch data; both the key and the value can be anything that is serializable. It’s important to remember that on GAE, Memcache is a shared service accessed via the App Engine APIs. It is not a RAM-based caching layer sitting on a single server, but a service your application reaches over the network on every read and write.
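
For example, on the Java runtime the low-level API looks roughly like this (a minimal sketch; the class, keys and values are invented for illustration):

    import com.google.appengine.api.memcache.Expiration;
    import com.google.appengine.api.memcache.MemcacheService;
    import com.google.appengine.api.memcache.MemcacheServiceFactory;

    public class GreetingCache {
        private final MemcacheService cache = MemcacheServiceFactory.getMemcacheService();

        // Store a value for one hour; key and value only need to be serializable.
        public void save(String userId, String greeting) {
            cache.put("greeting:" + userId, greeting, Expiration.byDeltaSeconds(3600));
        }

        // Fetch it back; get() returns null on a cache miss,
        // and every call here is a network hop to the shared Memcache service.
        public String load(String userId) {
            return (String) cache.get("greeting:" + userId);
        }
    }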

Why Do We Use Memcache?

  • Improve performance
  • Reduce cost

In this case we can ‘kill two birds with one stone’, as the two go hand in hand. You can think of each hit in Memcache as a free call: if your hit ratio is good, you will see better performance and your cost will go down dramatically.
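
As a rough illustration of why hits matter, here is the classic read-through pattern, which only pays for a datastore read on a miss (a sketch using the Java low-level Memcache and Datastore APIs; the entity kind and key scheme are made up):

    import com.google.appengine.api.datastore.DatastoreService;
    import com.google.appengine.api.datastore.DatastoreServiceFactory;
    import com.google.appengine.api.datastore.Entity;
    import com.google.appengine.api.datastore.EntityNotFoundException;
    import com.google.appengine.api.datastore.KeyFactory;
    import com.google.appengine.api.memcache.MemcacheService;
    import com.google.appengine.api.memcache.MemcacheServiceFactory;

    public class ProductRepository {
        private final MemcacheService cache = MemcacheServiceFactory.getMemcacheService();
        private final DatastoreService datastore = DatastoreServiceFactory.getDatastoreService();

        // Every cache hit avoids a slower, billed datastore read.
        public Entity getProduct(long id) throws EntityNotFoundException {
            String cacheKey = "product:" + id;
            Entity product = (Entity) cache.get(cacheKey);   // hit -> effectively a "free" call
            if (product == null) {                           // miss -> pay for one datastore read
                product = datastore.get(KeyFactory.createKey("Product", id));
                cache.put(cacheKey, product);                // the next readers hit the cache
            }
            return product;
        }
    }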

What Do We Use Memcache For?

  • Caching your modules/blocks/data in front of your datastore/DB.
  • Caching entities for low-latency reads.
  • Integration with most ORM frameworks: ndb, Objectify, etc.
  • Caching for read-heavy operations.
  • User authentication tokens and session data.
  • API calls or other computation results: semi-durable shared state across app instances, sessions, counters, etc.
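
For example, a shared counter across all instances of your app can be kept with the atomic increment() call (a sketch with the Java low-level API; names are invented, and remember the value can be evicted at any time):

    import com.google.appengine.api.memcache.MemcacheService;
    import com.google.appengine.api.memcache.MemcacheServiceFactory;

    public class PageViewCounter {
        private final MemcacheService cache = MemcacheServiceFactory.getMemcacheService();

        // Atomic increment shared by all app instances; starts at 0 if the key is absent.
        // This is semi-durable shared state: an eviction resets the count,
        // so flush it to the datastore periodically if the number really matters.
        public long countView(String pageId) {
            return cache.increment("views:" + pageId, 1L, 0L);
        }
    }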

How to Use Memcache? Which APIs?
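
GAE exposes Memcache through the low-level APIs (Python and Java), and on Java also through the open-standard JCache (JSR 107) API. As a rough tour of the Java low-level API (a sketch, not the exact code from the talk), including the batch and asynchronous variants mentioned in the takeaways below:

    import com.google.appengine.api.memcache.AsyncMemcacheService;
    import com.google.appengine.api.memcache.MemcacheService;
    import com.google.appengine.api.memcache.MemcacheServiceFactory;

    import java.util.Arrays;
    import java.util.HashMap;
    import java.util.Map;
    import java.util.concurrent.Future;

    public class MemcacheApiTour {
        public void tour() throws Exception {
            // Synchronous low-level API.
            MemcacheService sync = MemcacheServiceFactory.getMemcacheService();
            sync.put("a", 1);
            Object a = sync.get("a");

            // Batch operations: one round trip for several keys.
            Map<String, Integer> batch = new HashMap<>();
            batch.put("b", 2);
            batch.put("c", 3);
            sync.putAll(batch);
            Map<String, Object> values = sync.getAll(Arrays.asList("a", "b", "c"));

            // Asynchronous API: returns Futures, so the network call can overlap other work.
            AsyncMemcacheService async = MemcacheServiceFactory.getAsyncMemcacheService();
            Future<Object> futureA = async.get("a");
            Object later = futureA.get();
        }
    }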

Caveats And Solutions

  • Memcache Is Not Transactional -> Use getIdentifiable() and putIfUntouched(…) for optimistic locking (see the sketch after this list).
  • Memcache Is Volatile - It’s important to handle a cache miss gracefully! You may wish to implement write-through logic by backing Memcache with the datastore in your application.
    Psst… Objectify and ndb do it for you.
  • Memcache Is A Limited Resource
    • Only cache what is useful and necessary! You might want good logging in place so your application can ‘learn’ over time which ~20% of items serve ~80% of the calls.
    • Your application should function without Memcache -> Test it both with and without it, and make sure all your features still work even when this layer is down.
    • Dedicated Memcache can help when you need more control.
  • Dedicated Memcache
    • If you want more predictable performance and control over the cache size, you can choose to work with this option. You enable dedicated Memcache from the admin console and set a fixed cache capacity that is exclusive to your application; you are then billed per GB of cache per hour.
    • Please remember – whether shared or dedicated, Memcache is not durable storage. It’s always smart to plan for your application to function without Memcache.
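
Here is the optimistic-locking sketch promised above, using getIdentifiable() and putIfUntouched() from the Java low-level API (the key scheme and retry count are made up for illustration):

    import com.google.appengine.api.memcache.MemcacheService;
    import com.google.appengine.api.memcache.MemcacheService.IdentifiableValue;
    import com.google.appengine.api.memcache.MemcacheServiceFactory;

    public class OptimisticCounter {
        private final MemcacheService cache = MemcacheServiceFactory.getMemcacheService();

        // Read-modify-write without losing concurrent updates:
        // putIfUntouched() only succeeds if nobody changed the value since getIdentifiable().
        public boolean addPoints(String userId, int points) {
            String key = "score:" + userId;
            for (int attempt = 0; attempt < 3; attempt++) {
                IdentifiableValue current = cache.getIdentifiable(key);
                if (current == null) {
                    // Cache miss: handle it gracefully (e.g. seed the value) instead of failing.
                    return cache.put(key, points, null,
                            MemcacheService.SetPolicy.ADD_ONLY_IF_NOT_PRESENT);
                }
                int updated = (Integer) current.getValue() + points;
                if (cache.putIfUntouched(key, current, updated)) {
                    return true;
                }
                // Someone else won the race; retry with the fresh value.
            }
            return false;
        }
    }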

Key Takeaways

  • Memcache is supported natively in GAE -> Take advantage of it to improve your GAE application’s performance and cost.
  • Memcache supports the open-standard JCache API.
  • Many advanced features are available through the GAE Memcache APIs to suit your application’s needs: batch, atomic, and asynchronous operations.
  • Seamless integration with the GAE Datastore is available in a few libraries, like Python’s ndb and Java’s Objectify.
  • Frequently read, rarely written data is the best fit for Memcache.
  • Handle Memcache’s volatility in your application.
  • Use Memcache wisely, it is not an unlimited resource.
  • Be strong.

Full slides with more details:
