Configure Varnish Cache and Speed Up Your Application Load Time by 10x

Updated on July 7, 2021

10 Min Read

Irrespective of the size and complexity of your application, users will stay, and keep coming back, if they enjoy the speed at which your dynamic content is rendered. Improved performance, quick page delivery, and rapid server response are among the factors that keep users hooked to your application.

These are the contributing factors for any successful and popular web application. While several techniques and strategies can help speed up modern web applications, caching has proven to be one of the most valuable.

Every request to a page on your website communicates directly with your server: each time a user visits a page, your front end makes an HTTP call to fetch the appropriate content, and your server must respond accordingly.

For a small-scale web application this might not be much, but once your application starts to scale, the sheer number of requests to your web server gradually becomes too much to handle. Much of this load is unnecessary.

Serving a web page consumes a lot of resources, especially when the page is dynamically generated, as with PHP. Therefore, in this Varnish cache tutorial, I will show you how to increase the speed of your PHP application without scaling vertically or horizontally, simply by using Varnish.

What Is Varnish Cache and Why Use It?

Varnish Cache is a web application accelerator, also known as a caching HTTP reverse proxy. It acts as a middleman between your client (i.e. the user) and your web server: instead of your web server listening for every request directly, Varnish assumes that responsibility.

When a request comes in for the first time, Varnish directs it to the web server for an appropriate response. Varnish caches this response before sending it on to the client. Any subsequent request for the same content is then served directly from the Varnish cache instead of going to the web server at all. With this in place, your web application can manage a huge number of concurrent requests from many users, since most of them never reach the server. The result is a dramatic increase in performance.

Varnish uses the Varnish Configuration Language (VCL) to let you modify its behavior by adding logic that manipulates requests. You can also manipulate responses coming back from the web server, remove cookies, or add headers to the response.
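As a small taste of VCL, here is an illustrative sketch of both ideas. Stripping all cookies and the X-Cache debug header are assumptions chosen for demonstration, not part of the demo project, and this excerpt assumes a backend is defined elsewhere in default.vcl:

```vcl
vcl 4.0;

sub vcl_recv {
    # Strip cookies from incoming requests so pages become cacheable.
    # Only safe when the application does not rely on sessions.
    unset req.http.Cookie;
}

sub vcl_deliver {
    # Add a debug header telling the client whether this was a cache hit.
    if (obj.hits > 0) {
        set resp.http.X-Cache = "HIT";
    } else {
        set resp.http.X-Cache = "MISS";
    }
}
```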

Varnish workflow

GET /some-web-page (client) → Cache (MISS) → GET /some-web-page (server) → Cache → back to the client

Let's assume this first request to the page takes about 200ms.

Once the content of that page has been cached, the flow changes entirely: a request from the client for the same page no longer hits the server, because Varnish has already cached it. Take a look at the illustration below:

GET /some-web-page (client) → Cache (HIT) → back to the client

A subsequent request to the page takes about 10ms.

Awesome, right?

Hence, the idea is to reduce the number of requests sent to your backend server as much as possible. This, in return, increases the page rendering speed of your web application.
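Once the setup below is complete, you can observe the difference yourself with curl's built-in timing. This is only a sketch: YOUR_SERVER_IP is a placeholder, and the exact numbers will vary with your server and network.

```shell
# First request: a cache miss, so Varnish forwards it to the backend.
curl -s -o /dev/null -w "first:  %{time_total}s\n" http://YOUR_SERVER_IP/

# Second request: a cache hit, served straight from memory.
curl -s -o /dev/null -w "second: %{time_total}s\n" http://YOUR_SERVER_IP/
```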

Getting started with Varnish Configuration

Now that you have had a basic introduction to Varnish and its caching capabilities, it's time to configure Varnish and use it for a PHP application. Getting started with Varnish is easy. To set everything up properly, we will pull an existing application from GitHub, deploy it to a DigitalOcean server, and then configure Varnish.

Next, set up a free account on DigitalOcean. Once that is complete, click on the "Create" button and, from the dropdown, select "Droplets" to create a new droplet.

Scroll down and select Ubuntu 16.04. You will also need to choose a droplet size; the smallest size should suffice for this Varnish cache tutorial.

Scroll down to add an SSH key if you have one. This will let you log in to the server easily later; otherwise, you will need to check your inbox for your droplet's default password. Next, add a hostname. I have named mine varnish-demo. Now click on Create to start creating the droplet.

Once the process is complete, you will see the new droplet in the list of droplets. Go ahead and SSH into your droplet using its IP address.


Install Nginx web server

You can now update the package list and install Nginx. Run the following commands:

sudo apt update
sudo apt install -y nginx
  • Note: the -y flag automatically supplies a default answer to any question asked during the installation.

On Ubuntu 16.04, Nginx is pre-configured to start running upon installation, so once the installation is complete, you can visit the IP address of your droplet in the browser:

http://server_domain_or_ip

Pull the demo from GitHub

I will use Git to pull the sample project from GitHub; it comes pre-installed on Ubuntu 16.04. To check whether it is already installed on your server, run:

git --version

If you get a message that Git is not installed, run the following command to install it:

sudo apt install -y git

Next, you have to use the same folder that houses the default page for the Nginx web server, so navigate to the public directory:

cd /var/www

Remove the html folder, recreate it and move into it:

rm -rf html

mkdir html

cd html

Now, clone the repository:

git clone https://github.com/yemiwebby/varnish-demo.git

With this in place, you now have the sample project on your server. It won't be accessible yet, though, because we have not installed PHP. Let's do that.

You Might Also Like: Using Memcached with PHP

Installing PHP and Configuring Nginx to Use the PHP Processor

Nginx does not include native PHP processing like some other web servers do. We will install PHP-FPM and instruct Nginx to pass all PHP requests to it. Navigate back to your home directory and run the following commands to install the PHP-FPM module:

$ cd ~

$ sudo apt install php-fpm

Once the installation process is complete, you can now configure Nginx to use the installed PHP processor. Open the default Nginx server block configuration file with:

sudo nano /etc/nginx/sites-available/default

and replace its content with:

server {
       listen 80 default_server;
       listen [::]:80 default_server;

       root /var/www/html;
       index index.php;

       server_name YOUR_SERVER_IP;

       location / {
               try_files $uri $uri/ =404;
       }

       location ~ \.php$ {
               include snippets/fastcgi-php.conf;
               fastcgi_pass unix:/var/run/php/php7.0-fpm.sock;
       }

       location ~ /\.ht {
               deny all;
       }
}
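Before reloading, it is worth validating the edited file; nginx has a built-in syntax check that reports configuration errors without touching the running server:

```shell
# Test the Nginx configuration for syntax errors before reloading.
sudo nginx -t
```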

When you are done, reload Nginx to apply the changes:

$ sudo systemctl reload nginx

Now, you can visit the page in your favorite web browser using your droplet's IP address:

http://ip_address

Enabling caching using Varnish

If you inspect the page and view the HTTP headers in the network tab, you will not notice any trace of Varnish, because it is not yet installed or configured to handle requests. Let's change that by installing Varnish on our web server. To do this, type the following command:

$ sudo apt install -y varnish

This will install Varnish Cache on your server. After completing the installation, start and enable Varnish with:

systemctl start varnish

systemctl enable varnish

Varnish listens on port 6081 by default. To confirm this, open your browser and visit your droplet's IP address again, but this time append :6081 to it.

Nginx, the web server responsible for serving your web application's content, runs on port 80. For Varnish to take over the job of caching and rendering pages as quickly as possible, it needs to sit between your web app's clients and the server. This ensures that every request to your server, which will mostly arrive on port 80, is intercepted by Varnish, and depending on whether it is a cache hit or a miss, Varnish acts on it accordingly and returns the appropriate response.


I will swap the ports, configuring Varnish on port 80 and Nginx on port 8080. To do this, edit your site's Nginx configuration file again. In our case, this file is /etc/nginx/sites-available/default. Change both occurrences of 80 to 8080:

server {
       listen 8080 default_server;
       listen [::]:8080 default_server;

       root /var/www/html;
}

Port 80 is now free for Varnish to use. Reload your Nginx configuration with:

sudo systemctl reload nginx

and check the availability of your website using your IP address with port 8080 appended to it:

When Varnish was installed earlier, two configuration files were also created on the server. They are:

  • /etc/default/varnish
  • /etc/varnish/default.vcl

I will use these files for configuration tasks such as opening a port for Varnish and manipulating requests. Now, open /etc/default/varnish:

sudo nano /etc/default/varnish

   # Listen on port 6081, administration on localhost:6082, and forward to
   # one content server selected by the vcl file, based on the request.
   #
   DAEMON_OPTS="-a :6081 \
                -T localhost:6082 \
                -f /etc/varnish/default.vcl \
                -S /etc/varnish/secret \
                -s malloc,256m"

Find the DAEMON_OPTS block above and change -a :6081 to -a :80.
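If you prefer a non-interactive edit, the same change can be made with sed; the backup path here is my own convention, not from the original setup:

```shell
# Back up the defaults file, then rewrite the listen address in place.
sudo cp /etc/default/varnish /etc/default/varnish.bak
sudo sed -i 's/-a :6081/-a :80/' /etc/default/varnish
```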

Next, open the file /etc/varnish/default.vcl:

sudo nano /etc/varnish/default.vcl

and check that the default backend is set to port 8080, since this is where Nginx now serves from. Find the block below and make sure .port is set to "8080":

   # Default backend definition. Set this to point to your content server.
   backend default {
       .host = "127.0.0.1";
       .port = "8080";
   }
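After editing, you can ask varnishd to compile the VCL without starting the daemon; a syntax error will be reported immediately instead of surfacing later at restart:

```shell
# Compile the VCL to C and discard the output; a non-zero exit status
# means default.vcl contains an error.
sudo varnishd -C -f /etc/varnish/default.vcl > /dev/null
```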

The next thing is to copy the varnish.service file into the systemd directory. This will enable systemd to start Varnish on port 80. Run the command below:

$ sudo cp /lib/systemd/system/varnish.service /etc/systemd/system/

Next, open the copied file:

$ sudo nano /etc/systemd/system/varnish.service

and look for the ExecStart line:

[Unit]
Description=Varnish HTTP accelerator
Documentation=https://www.varnish-cache.org/docs/4.1/ man:varnishd

[Service]
Type=simple
LimitNOFILE=131072
LimitMEMLOCK=82000
ExecStart=/usr/sbin/varnishd -j unix,user=vcache -F -a :6081 -T localhost:6082 -f /etc/varnish/default.vcl -S /etc/varnish/secret -s malloc,256m
ExecReload=/usr/share/varnish/reload-vcl
ProtectSystem=full
ProtectHome=true
PrivateTmp=true
PrivateDevices=true

[Install]
WantedBy=multi-user.target

Now change the -F -a :6081 to -F -a :80. Save and exit the file.

If you check the network statistics using the netstat command:

root@varnish-demo:~# netstat -plntu
Active Internet connections (only servers)
Proto Recv-Q Send-Q Local Address      Foreign Address  State   PID/Program name
tcp        0      0 0.0.0.0:6081       0.0.0.0:*        LISTEN  13257/varnishd
tcp        0      0 127.0.0.1:6082     0.0.0.0:*        LISTEN  13257/varnishd
tcp        0      0 0.0.0.0:8080       0.0.0.0:*        LISTEN  2440/nginx -g daemo
tcp        0      0 0.0.0.0:22         0.0.0.0:*        LISTEN  1648/sshd
tcp6       0      0 :::6081            :::*             LISTEN  13257/varnishd
tcp6       0      0 :::8080            :::*             LISTEN  2440/nginx -g daemo
tcp6       0      0 :::22              :::*             LISTEN  1648/sshd

you will notice that Varnish is still listening on port 6081. Change that by reloading systemd and restarting Varnish:

systemctl daemon-reload

systemctl restart varnish

and also restart Nginx.

sudo service nginx restart

At this point, if you visit your droplet's IP address, you will notice that the website still works perfectly, but inspecting the HTTP headers will show that Varnish is installed and serving the page.
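You can confirm the same thing from the command line by fetching the headers twice; the behavior described in the comments is typical, but the exact values will differ on your server:

```shell
# X-Varnish appears on every response Varnish handles; on a cache hit
# it carries two transaction IDs, and the Age header climbs above 0.
curl -I http://YOUR_SERVER_IP/
curl -I http://YOUR_SERVER_IP/
```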

You Might Also Like: Integrate PHP Opcache & Make Your Application Win Big With Performance

An overview of Varnish internals

Varnish runs through a number of routines while processing and caching your web application's content. Let's take a look at some of the most important ones, in the order they are hit, and what each does to ensure that content is delivered at the speed of light, which in turn improves the performance of your web application:

  • sub vcl_recv {}: the first routine Varnish hits the moment it receives a request from your client. Here you can decide whether to serve the request from the cache or pass it on to the backend.
  • sub vcl_backend_fetch {}: called just before a request is sent to the backend server; this is your last chance to modify the backend request.
  • sub vcl_backend_response {}: called after a response has been retrieved from the backend server. You can also make modifications to the response in this block.
  • sub vcl_deliver {}: the last routine VCL hits before sending a response to the client. Here you can easily do some cleanup, like removing whatever you don't want the client to see.
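As an illustration of vcl_backend_response, the sketch below forces a minimum cache lifetime for successful responses; the two-minute TTL is an arbitrary value chosen for demonstration:

```vcl
sub vcl_backend_response {
    # Cache successful backend responses for at least two minutes,
    # even if the backend sends no caching headers of its own.
    if (beresp.status == 200) {
        set beresp.ttl = 2m;
    }
}
```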

Q: How to clear Varnish cache?

A: You can purge the entire Varnish cache from the command line using the ban command:
varnishadm "ban req.url ~ ." # matches all URLs
You can also ban particular cached objects by matching the hostname:
varnishadm "ban req.http.host == xxx.com"

Q: How to disable Varnish cache?

A: If your backend runs Apache, you can effectively disable Varnish for your website by editing the .htaccess file with the following line:
Header add "Cache-Control" "no-cache"
This temporarily stops Varnish from caching your website. You can enable caching again later by removing that line.

Q: How to purge Varnish cache with CLI?

A: You can also clear the Varnish cache by simply restarting the Varnish service. Because the cache is stored in memory by default, a restart purges everything:
/etc/init.d/varnish restart
or
service varnish restart

Q: How to check if the Varnish cache is working or not?

A: To check whether your Varnish service is working, look for the X-Varnish header in responses; its presence tells you Varnish is handling the request. To see whether requests are actually hitting the cache, check the Age header (a value above 0 means the response came from the cache), or add a custom X-Cache header in VCL for a more detailed view.
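You can also watch the hit/miss counters directly on the server with varnishstat; the counter names below are those used by Varnish 4.x:

```shell
# Print the main cache counters once and exit.
varnishstat -1 -f MAIN.cache_hit -f MAIN.cache_miss
```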

Q: Varnish cache vs Memcached: Which one is better?

A: Both Varnish and Memcached are useful in particular caching domains; however, there are some differences between the two.
Memcached is an in-memory, distributed backend for caching application data, such as database query results, and requires small changes to your application's read path.
Varnish, by contrast, sits in front of the web server as a reverse proxy, caches whole HTTP responses, and requires no changes to your application code.


Final Words

In this post, I have barely scratched the surface of what you can do with Varnish. It is powerful enough to do much more: apart from caching your web application's content and handling requests, Varnish can also act as a load balancer, among other things.

Check the official documentation to learn more about Varnish and the other configurations you can carry out with it. Feel free to leave a comment, question, or suggestion in the comment section below.


Owais Khan

Owais works as a Marketing Manager at Cloudways (managed hosting platform) where he focuses on growth, demand generation, and strategic partnerships. With more than a decade of experience in digital marketing and B2B, Owais prefers to build systems that help teams achieve their full potential.
