Nginx Caching Functionality

In this post we will explore some of the caching capabilities of nginx.

Previously we have explored some functionality of nginx as a load balancer. Below are the articles –

1> Configure node.js application server with nginx
2> Configuring Load Balancer with Nginx and Node.js
3> Nginx as a Load Balancer – some details as we explored

We expect the reader to be at an intermediate level with web applications, with a need to work on application performance tuning.

Now, some discussion of nginx's caching capabilities. First, the diagram –

Web Caching through Nginx

  • Nginx can serve static files efficiently without sending any request to the web / application server.
  • Nginx can work as a cache server on top of web / application servers.

Nginx proxies requests to web / application servers (via HTTP, FastCGI etc. – though we have used HTTP only). Serving static files directly from nginx improves application performance, while dynamic requests are passed on to the application servers. Nginx can also act as a load balancer and a caching server at the same time.
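As a minimal sketch of such a proxying setup (the upstream name node_app and the backend address 127.0.0.1:3000 are assumptions for illustration, not our actual settings) –

  upstream node_app {
       server 127.0.0.1:3000;                  # assumed Node.js application server
  }

  server {
       listen 80;

       location / {
            proxy_pass http://node_app;        # dynamic requests go to the application server over HTTP
            proxy_set_header Host $host;
            proxy_set_header X-Real-IP $remote_addr;
       }
  }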

In a caching server, static requests as well as many HTTP GET and HEAD requests can be cached, depending on the application's needs.

Some functionality of the Cache Server:

  • Send the HTTP request to the application server if the request does not need to be cached or the cache time has expired (see the sketch after this list)
  • Serve responses to HTTP requests from the cache or from the application server, as needed
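As a sketch of how this behaviour can be expressed in the configuration (the cache zone is defined later with proxy_cache_path, the upstream is the assumed one from the sketch above, and the times are purely illustrative, not taken from our setup) –

  location / {
       proxy_cache cache_zone;                 # cache zone defined via proxy_cache_path (see below)
       proxy_cache_methods GET HEAD;           # only GET and HEAD responses are cached (the nginx default)
       proxy_cache_valid 200 302 10m;          # cache successful responses for 10 minutes
       proxy_cache_valid 404 1m;               # cache 404 responses briefly
       proxy_pass http://node_app;             # assumed upstream, as in the sketch above
  }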

Now, some examples of serving static files, taken from our previous example configuration –

server {
      …
      location ~ ^/(images/|img) {
           root /nodeapps/nodeexpress4mongoangular;
           access_log off;
           expires max;
      }
}

Here static files are served from the /nodeapps/nodeexpress4mongoangular path: requests matching <<root-path>>/images or <<root-path>>/img are handled by this block. Nginx expects the corresponding images or img folder to exist under the root path configured in the nginx configuration file, so a request for, say, /images/logo.png would be served from /nodeapps/nodeexpress4mongoangular/images/logo.png.

Here are some variations of configuration –

  location ~* \.(json|xml) {
           expires -1;
  }
Here no caching is done, as expires is set to -1.
  location ~* \.(jpg|jpeg|png) {
           expires 1M;
           add_header Cache-Control "public";
  }
Here caching is for 1 month. add_header Cache-Control "public" means that any cache, not only the user's browser, may store those resources.
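For contrast, a purely illustrative variant (not part of our configuration): user-specific resources could be marked private, so that only the end user's browser, and not shared caches such as proxies or CDNs, may store them –

  location ~* \.(pdf) {
       expires 1h;
       add_header Cache-Control "private";     # only the browser may cache, unlike "public"
  }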
proxy_cache_path /tmp/nginx keys_zone=cache_zone:20m inactive=120m;
Here proxy_cache_path sets the location on disk where the cached files are stored.
keys_zone names the shared memory zone (cache_zone), which is referred to in further configuration, and 20m means 20 megabytes of shared memory are allocated for the cache keys and metadata.
inactive=120m means that a cached entry is deleted if it is not requested again within 120 minutes of being served.
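For reference, a slightly fuller variant of this directive is sketched below; levels, max_size and use_temp_path are optional parameters we did not use in our configuration, shown only for illustration –

  # levels=1:2 spreads cached files over two directory levels,
  # max_size=1g caps the cache on disk at 1 gigabyte (illustrative value)
  proxy_cache_path /tmp/nginx levels=1:2 keys_zone=cache_zone:20m
                   max_size=1g inactive=120m use_temp_path=off;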
proxy_cache_key "$scheme$request_method$host$request_uri";
The above defines the cache key, that is, how a request URL is mapped to its cached content.
  location / {
       proxy_cache cache_zone;
       add_header X-Proxy-Cache $upstream_cache_status;
       …
  }

Here the cache_zone defined above is referred to for proxy caching. This directive needs to be configured when nginx is to be used as a cache server.
add_header X-Proxy-Cache $upstream_cache_status adds a useful header that can be included as a configuration directive to check whether the resources configured for caching are actually served from the cache or not.
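Putting these directives together, a minimal combined sketch is shown below; the upstream name and backend address are assumptions for illustration, not our actual settings –

  # in the http { } context
  proxy_cache_path /tmp/nginx keys_zone=cache_zone:20m inactive=120m;
  proxy_cache_key "$scheme$request_method$host$request_uri";

  upstream node_app {
       server 127.0.0.1:3000;                  # assumed application server address
  }

  server {
       listen 80;

       location / {
            proxy_cache cache_zone;                           # use the zone defined above
            add_header X-Proxy-Cache $upstream_cache_status;  # e.g. HIT, MISS or BYPASS
            proxy_pass http://node_app;
       }
  }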
So the above are the main configurations we have used in our nginx setup. Currently we are working on serving dynamic requests with proper caching, which we will cover in our next post/s.
Reference: A very useful article about nginx caching is here.
If you find this article helpful, you can connect with us on Google+ and Twitter.
