How to Speed Up Your Website

Website speed is one of the most important aspects of a user’s browsing experience, especially for e-commerce websites. One study found that 57% of visitors abandon a page after waiting 3 seconds for it to load, and Amazon famously estimated that every additional 100ms of page load time cost it about 1% in revenue. In other words, unoptimized page loading times cost you visitors and profit. This tutorial shows how to improve your website speed in a few simple steps. If you are a webmaster, it will help you gain visitors (and revenue); if you are a web developer, these simple tips can leave your clients extremely satisfied.

How does page loading work?

The first step in loading a page is a request — your browser sends a request to the website’s server asking for a particular web page. Since the website’s name must first be resolved to an IP address, this whole process takes around 300ms, depending on your internet connection and the server’s location. If there is a redirect, the process has to be carried out twice. After receiving the request, the server runs its scripts. Two speed problems can appear at this point: slow script execution and slow database queries. The latter is more likely, especially on larger websites. When the script finishes, the output is sent to the user’s browser. Large amounts of textual data can hurt page loading times, as can CSS files, JavaScript files and images. Read on to find out how to fully optimize the speed of your website.

Best tips for speeding up your website

Parallelization and serving static content from cookieless domains

Let’s start from the beginning — optimizing the server connections. As mentioned above, redirects slow down your website, so avoid them whenever possible. Another important point about server connections is that each CSS file or image is a separate request. Although these requests are queued, a limited number of them can run at the same time (modern browsers typically allow around six parallel connections per hostname). Some requests, such as classic JavaScript files, block and cannot be parallelized at all. In practice, this means it pays to serve images and static content (CSS and JavaScript files) from another web address, e.g. a subdomain. Keep in mind that a subdomain counts as a different address whose IP must also be resolved, so the gain from parallelization must outweigh the roughly 300ms lost on the extra lookup. Another advantage of serving images from a subdomain is that you can keep it cookie-free. Cookies are sent back to the server with every request to their domain, even requests for images. Imagine a page with 50 images and 2KB of cookie data — that is 100KB of unnecessary upload traffic. The solution to this problem is simple:

  1. Create a subdomain for static content (e.g. static.example.com).
  2. Set the cookie domain in your scripts to the main host only (e.g. www.example.com), so cookies are never sent to the subdomain.
  3. Move the CSS, JavaScript and image files to the subdomain and update their URLs in your scripts.
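Step 2 can be sketched in PHP as follows. The domain names are hypothetical placeholders — the point is to scope the cookie to the main host rather than to a wildcard like .example.com, which would cover the static subdomain too:

```php
<?php
// Hypothetical domains: the application runs on www.example.com and static
// files live on static.example.com. Scoping the session cookie to
// www.example.com (NOT '.example.com', which matches every subdomain)
// keeps cookie bytes off requests for images, CSS and JavaScript.
$cookieDomain = 'www.example.com';

setcookie('session_id', 'abc123', [
    'expires'  => time() + 3600,
    'path'     => '/',
    'domain'   => $cookieDomain,
    'httponly' => true,
]);
```

The array-of-options form of setcookie requires PHP 7.3 or later; on older versions pass the domain as the fifth positional argument.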

Script and database optimization

You can measure script performance by comparing microtime values at the beginning and the end of the script. Usually the problem lies not in the script itself but in the database. You can check SQL query performance by running the query in phpMyAdmin. If you notice slow queries, try to break complex ones into simpler ones, avoid JOIN statements where you can, and never use “ORDER BY RAND()” to fetch random rows. Set up a cron job that runs an OPTIMIZE TABLE query regularly, and check that the database query cache is turned on.
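The microtime technique mentioned above can be sketched in a few lines; here usleep() stands in for the real work being measured:

```php
<?php
// Minimal timing sketch: compare microtime(true) before and after the
// suspect section of the script (e.g. a database query).
$start = microtime(true);

usleep(10000); // placeholder for real work (~10 ms)

$elapsed = microtime(true) - $start;
printf("Section took %.1f ms\n", $elapsed * 1000);
```

Wrap individual queries this way to find the slow one before rewriting it.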

Output optimization

Script minification

My first piece of advice for reducing the data sent back from the server is to use pagination. Show 10 or 20 products per page — the server returns less data and the query executes faster. Next, minify your HTML, JavaScript and CSS. Minification removes whitespace, line breaks and comments to save space; minified scripts are often less than half the size of the originals. Many online tools can minify JavaScript and CSS for you. HTML minification is harder, since websites usually generate dynamic content.
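A pagination query boils down to LIMIT/OFFSET. In this sketch an in-memory SQLite database stands in for the real product table, and the table and column names are illustrative:

```php
<?php
// Pagination sketch: fetch one page of rows instead of the whole table.
$pdo = new PDO('sqlite::memory:');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
$pdo->exec('CREATE TABLE products (id INTEGER PRIMARY KEY, name TEXT)');
for ($i = 1; $i <= 45; $i++) {
    $pdo->exec("INSERT INTO products (name) VALUES ('Product $i')");
}

$perPage = 20;
$page    = 3;                       // would normally come from $_GET['page']
$offset  = ($page - 1) * $perPage;  // skip the first 40 rows

$stmt = $pdo->prepare(
    'SELECT id, name FROM products ORDER BY id LIMIT :limit OFFSET :offset'
);
$stmt->bindValue(':limit',  $perPage, PDO::PARAM_INT);
$stmt->bindValue(':offset', $offset,  PDO::PARAM_INT);
$stmt->execute();
$rows = $stmt->fetchAll(PDO::FETCH_ASSOC);
// Page 3 contains the last 5 of the 45 rows (ids 41-45).
```

The same LIMIT/OFFSET syntax works in MySQL; note that very large offsets get slow, so keyset pagination (WHERE id > :lastId) is worth considering for deep pages.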

Gzip compression

Additional script size reductions can be achieved with compression. Some very old browsers do not support it, but all modern browsers do. For Windows hosting and IIS web servers, compression is enabled in the server settings. If you are using the more popular Apache server on Linux hosting, add the following code to your .htaccess file:

# compress text, html, JavaScript, css, xml:
AddOutputFilterByType DEFLATE text/plain
AddOutputFilterByType DEFLATE text/html
AddOutputFilterByType DEFLATE text/xml
AddOutputFilterByType DEFLATE text/css
AddOutputFilterByType DEFLATE application/xml
AddOutputFilterByType DEFLATE application/xhtml+xml
AddOutputFilterByType DEFLATE application/rss+xml
AddOutputFilterByType DEFLATE application/javascript
AddOutputFilterByType DEFLATE application/x-javascript

# Or, compress certain file types by extension:
SetOutputFilter DEFLATE

Shared hosting plans may not give you access to .htaccess. In that case, use the PHP solution and add the following code to the top of your script:
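A minimal version of that snippet, assuming the zlib extension is available, buffers all output through ob_gzhandler, which compresses the response only when the browser’s Accept-Encoding header advertises gzip support:

```php
<?php
// Must run before the script produces any output.
// ob_gzhandler inspects Accept-Encoding and gzips only when supported.
if (!ob_start('ob_gzhandler')) {
    ob_start(); // zlib unavailable - fall back to plain buffering
}
```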


Although compression increases CPU load, it does wonders for website speed — a file’s size can be reduced by up to 75%, which saves bandwidth as well. If you are unsure whether your website uses compression, you can check it with one of the many online gzip test tools.

Leverage browser caching

Browser caching is just as important as compression and minification. All static content is cached by default, but cache lifetimes vary, so it is important to push cache expiry further into the future. This tweak is also done in the .htaccess file:

ExpiresActive On
ExpiresByType image/jpg "access 1 year"
ExpiresByType image/jpeg "access 1 year"
ExpiresByType image/gif "access 1 year"
ExpiresByType image/png "access 1 year"
ExpiresByType text/css "access 1 month"
ExpiresByType application/pdf "access 1 month"
ExpiresByType text/x-javascript "access 1 month"
ExpiresByType application/x-shockwave-flash "access 1 month"
ExpiresByType image/x-icon "access 1 year"
ExpiresDefault "access 2 days"

The problem with long browser caching is that users might see stale data after you change your CSS or JavaScript files. An elegant solution to this problem (used by Google as well) is to embed a version stamp in static file names. Whenever the file changes, its name changes too, so the browser treats it as a completely new file, discards the cached copy and fetches the new one from the server.
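A hypothetical helper along these lines uses the file’s modification time as the version stamp. This sketch uses a query string for simplicity; embedding the stamp in the filename itself (with a rewrite rule) is closer to Google’s approach, since some proxies refuse to cache URLs with query strings:

```php
<?php
// Hypothetical helper: tie the asset URL to the file's modification time,
// so any change to the file produces a new URL and bypasses the old cache.
function asset_url(string $path): string
{
    $version = file_exists($path) ? filemtime($path) : 0;
    return $path . '?v=' . $version;
}

// In a template you would echo asset_url('css/style.css')
// inside the href attribute of the stylesheet link tag.
```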

The order of JavaScript and CSS files in document head

As mentioned before, JavaScript files cannot be parallelized. In practice, placing JavaScript before CSS in the document head can prevent the CSS files from downloading fully in parallel, so always put stylesheet files before JavaScript files in the document head. It is also good practice to avoid inline style and script blocks in HTML — keep them in external files.

Asynchronous requests

These days websites carry a lot of third-party content: Google Maps, Google AdSense, the Facebook Like button and other social widgets, etc. Load them asynchronously whenever possible, so that problems on their servers cannot drag down the speed of your website.

Image optimization

Today’s websites contain many images, so images play a significant role in website speed. Image formats are already compressed, so gzip gains you little there, but images can be resized. Scale them in an image editing program or on the fly in PHP, but never use HTML to scale images down — you would be downloading a large image only to display a small one, instead of downloading a small one and showing it as is. Specifying image width and height in HTML, on the other hand, improves rendering time. Finally, small images should be merged into one sprite and displayed using CSS background-position, which reduces the number of server requests.
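Resizing on the fly in PHP can be sketched with the GD extension (assumed available; the function name and quality setting are illustrative):

```php
<?php
// Sketch of on-the-fly JPEG downscaling with the GD extension.
function resize_jpeg(string $src, string $dst, int $newWidth): void
{
    [$width, $height] = getimagesize($src);
    $newHeight = (int) round($height * $newWidth / $width); // keep aspect ratio

    $source = imagecreatefromjpeg($src);
    $target = imagecreatetruecolor($newWidth, $newHeight);
    imagecopyresampled($target, $source, 0, 0, 0, 0,
                       $newWidth, $newHeight, $width, $height);
    imagejpeg($target, $dst, 85); // quality 85 is a reasonable size/quality trade-off
}
```

In production, cache the resized file to disk rather than resampling on every request.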


Using a cookieless domain can cut more than 100KB of request data in the example above. Minification reduces the size of scripts by about 50%, and gzip compression makes files up to 75% smaller than the uncompressed versions. Since some of these methods take significant effort to implement in a dynamic environment, implement only those that help you reach your speed goals with as little time and effort as possible.


About Our Editorial Process

At DevX, we’re dedicated to tech entrepreneurship. Our team closely follows industry shifts, new products, AI breakthroughs, technology trends, and funding announcements. Articles undergo thorough editing to ensure accuracy and clarity, reflecting DevX’s style and supporting entrepreneurs in the tech sphere.
