Web performance

According to the 2002 book Web Performance Tuning by Patrick Killelea, some early techniques were to use simple servlets or CGI, increase server memory, and look for packet loss and retransmission.

One major point that Souders made in 2007 is that at least 80% of the time that it takes to download and view a website is controlled by the front-end structure.[7]

Front-end optimization (FEO) concentrates on reducing file sizes and "minimizing the number of requests needed for a given page to load."

This reduces HTTP requests and the number of "round trips" required to load a web page.

Web pages are constructed from code files such as JavaScript and Hypertext Markup Language (HTML).

Content delivery networks (CDNs) use dedicated web caching software to store copies of documents passing through their system.
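The caching behavior described above can be sketched as a small in-memory store: the first request for a document goes to the origin server and the copy is kept, and later requests are served from the stored copy until it expires. The class name, TTL handling, and origin-fetch callback are illustrative assumptions, not how any particular CDN is implemented.

```javascript
// Minimal sketch of the caching idea behind a CDN edge server.
class EdgeCache {
  constructor(ttlMs) {
    this.ttlMs = ttlMs;
    this.store = new Map(); // url -> { body, expires }
  }

  get(url, fetchFromOrigin) {
    const hit = this.store.get(url);
    if (hit && hit.expires > Date.now()) {
      // Cache hit: serve the stored copy, no origin round trip.
      return { body: hit.body, cached: true };
    }
    // Cache miss (or expired): fetch from origin and keep a copy.
    const body = fetchFromOrigin(url);
    this.store.set(url, { body, expires: Date.now() + this.ttlMs });
    return { body, cached: false };
  }
}

module.exports = { EdgeCache };
```

Because repeat requests never reach the origin server until the copy expires, the cache both reduces origin load and shortens response times for users near the edge.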

Minification removes comments and extra spaces and crunches variable names in order to minimize code, decreasing file sizes by as much as 60%.
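A very rough sketch of the comment- and whitespace-stripping part of minification is shown below. Production minifiers (such as Terser or UglifyJS) parse the code and also safely rename local variables; a regex pass like this is only illustrative and would break on many real programs (for example, comment-like text inside strings).

```javascript
// Naive illustration of minification: strip comments, collapse
// whitespace. NOT safe for production use.
function naiveMinify(source) {
  return source
    .replace(/\/\*[\s\S]*?\*\//g, "") // remove block comments
    .replace(/\/\/[^\n]*/g, "")       // remove line comments
    .replace(/\s+/g, " ")             // collapse runs of whitespace
    .trim();
}

module.exports = { naiveMinify };
```

Even this crude pass shrinks heavily commented source noticeably; variable renaming, which requires a real parser, accounts for much of the remaining savings.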

These changes, such as reductions in pixel complexity or color gradations, are transparent to the end-user and do not noticeably affect perception of the image.

In recent years, several metrics have been introduced that help developers measure various aspects of the performance of their websites.[13]

Modules to measure metrics such as Time to First Byte (TTFB), First Contentful Paint (FCP), Largest Contentful Paint (LCP), and First Paint (FP) are provided with major frontend JavaScript libraries such as React,[21] NuxtJS[22] and Vue.[18][19][20]
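As an example of one such metric, Time to First Byte can be derived from a Navigation Timing entry as the delay between the start of the navigation and the arrival of the first response byte. In a browser the entry would come from `performance.getEntriesByType("navigation")`; here a plain object stands in for it so the calculation is self-contained, and the function name is an assumption for illustration.

```javascript
// Sketch: derive TTFB (in milliseconds) from a Navigation Timing
// entry. responseStart marks the arrival of the first byte of the
// response; startTime marks the start of the navigation.
function timeToFirstByte(navEntry) {
  return navEntry.responseStart - navEntry.startTime;
}

module.exports = { timeToFirstByte };
```

In a real page this might be used as `timeToFirstByte(performance.getEntriesByType("navigation")[0])`, with the result reported to an analytics endpoint.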

In addition to this, tools such as the Network Monitor in Mozilla Firefox help provide insight into network-level slowdowns that might occur during transmission of data.[24]