How to review Core Web Vitals scores in bulk

PageSpeed Insights is a popular tool among SEOs and web developers. Powered by Lighthouse, it's one of the best resources available for identifying and fixing speed issues to improve user experience. However, manually checking each URL via the website is slow and tedious. Therefore, I have created a script to check the Core Web Vitals for a large number of URLs in bulk using the PageSpeed Insights API. No coding knowledge is required!


What are Core Web Vitals?

Core Web Vitals are a set of metrics that Google uses to measure the user experience of a web page based on speed and performance. At some point in 2021, Google will officially begin using Core Web Vitals as a ranking factor. Currently, they are split into three aspects:

  • Largest Contentful Paint (LCP) – How quickly the visible part of the page loads
  • First Input Delay (FID) – How quickly the browser can respond when the user interacts with a clickable element
  • Cumulative Layout Shift (CLS) – How much the elements on the page move around during loading
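
For reference, Google's published "good" thresholds are 2.5 seconds for LCP, 100 milliseconds for FID, and 0.1 for CLS. Below is a minimal Python sketch of how a measured value could be checked against those thresholds; the helper function and example values are illustrative only and are not part of the script:

    # Google's published "good" thresholds for each Core Web Vital.
    # This helper is purely illustrative and not part of the bulk script.
    GOOD_THRESHOLDS = {
        'LCP': 2.5,   # seconds
        'FID': 100,   # milliseconds
        'CLS': 0.1,   # unitless layout-shift score
    }

    def is_good(metric, value):
        """Return True if the measured value meets the 'good' threshold."""
        return value <= GOOD_THRESHOLDS[metric]

    print(is_good('LCP', 1.9))   # True: loads within 2.5 seconds
    print(is_good('CLS', 0.25))  # False: too much layout shift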

Monitoring these metrics is important for providing a positive user experience. To check them in bulk across a list of URLs, just follow these simple steps:

Step 1 – Get API Key

To enable the script to work for large numbers of URLs, obtaining an API key is recommended. This is a unique identifier that can easily be pasted into the script to authenticate the requests it makes to the PageSpeed Insights API.

To obtain an API Key, just visit this page and then click on “Get a Key”.

Step 2 – Copy this script

Just open this script and save a copy.

It won’t work until you set up the API key. To do this, just replace the text “INSERT API KEY HERE” with your unique API key. Once done, you’re good to go!
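
For context, the script builds each request to the PageSpeed Insights runPagespeed endpoint with the key appended as a query parameter, roughly like this (a simplified sketch based on the request line in the script; the example URL and variable names are illustrative):

    import urllib.request

    API_KEY = 'INSERT API KEY HERE'  # replace with your own key
    url = 'https://www.example.com/'  # one URL from your list

    # Fetch mobile PageSpeed Insights results for a single URL
    pagespeed_results = urllib.request.urlopen(
        'https://www.googleapis.com/pagespeedonline/v5/runPagespeed'
        '?url={}&strategy=mobile&key={}'.format(url, API_KEY)
    ).read().decode('UTF-8')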

Step 3 – Upload URLs

Before running the script, you will need to save the list of URLs you want to test as a CSV file, with just one column and no header. Once set up, just run the script by pressing CTRL+ENTER and wait for it to download the results.

Please note that this may take a while, as the PageSpeed Insights API has to run for each individual URL. Just start the script and get on with your usual tasks whilst waiting for it to complete.
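
If you are wondering what happens behind the scenes here: the script runs in Google Colab, and the file prompt comes from Colab's upload widget. A rough sketch of that step, assuming pandas is used to read the single-column CSV (the variable names are illustrative and may differ from the actual script):

    import io
    import pandas as pd
    from google.colab import files  # provides the "Choose Files" prompt

    # Prompt for the CSV of URLs (one column, no header)
    uploaded = files.upload()
    csv_name = next(iter(uploaded))
    urls = pd.read_csv(io.BytesIO(uploaded[csv_name]), header=None)[0].tolist()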


Field Data vs. Lab Data

PageSpeed Insights reports two sets of results: field data and lab data. Field data uses results from real-world Google Chrome usage over the previous 28 days and gives a more accurate overview of how users actually experience the site. Lab data is based on a live test run from Google's servers at the time of the request. Lab data is preferred for analysing page speed because improvements can be observed more directly.

For this script, Largest Contentful Paint and Cumulative Layout Shift are reported from lab data. However, First Input Delay is based on field data, as it is not available in lab data.
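
In the API's JSON response, lab metrics sit under lighthouseResult while field metrics sit under loadingExperience. The extraction looks roughly like this (reconstructed from the tracebacks quoted in the comments below, assuming pagespeed_results holds the raw response from the request shown earlier; note that loadingExperience is missing from the response when Google holds no field data for a URL, which raises a KeyError):

    import json

    pagespeed_results_json = json.loads(pagespeed_results)

    # Lab data: taken from the Lighthouse audits
    largest_contentful_paint = pagespeed_results_json['lighthouseResult']['audits'][
        'largest-contentful-paint']['displayValue'].replace(u'\xa0', u'')
    cumulative_layout_shift = pagespeed_results_json['lighthouseResult']['audits'][
        'cumulative-layout-shift']['displayValue']

    # Field data: only present when Google has real-user data for the URL
    first_input_delay = str(round(
        pagespeed_results_json['loadingExperience']['metrics'][
            'FIRST_INPUT_DELAY_MS']['distributions'][2]['proportion'] * 1000, 1)) + 'ms'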

Stay tuned, there’s more to come

Currently, this script only reports on the primary Core Web Vitals metrics. Future versions will also report on the opportunities for speed improvements, allowing pages to be analysed in bulk in greater depth, so stay tuned!

Enjoyed this script? Why not try this script to scrape Google Trends.


11 thoughts on “How to review Core Web Vitals scores in bulk”

    • Thanks for your feedback, will try and add a screenshot later. Could you please give some more detail on where you got stuck? Is there an error message? If you’re using a small screen, you may need to scroll down to see the file upload button.

  1. Hi David, thank you very much for the article. While implementing the above steps, I am getting an error at:

    pagespeed_results = urllib.request.urlopen('https://www.googleapis.com/pagespeedonline/v5/runPagespeed?url={}&strategy=mobile&key= Mykey '\ ## Replace key with API Key
    .format(url)).read().decode('UTF-8')

    I have replaced my key wherever it is required.

    I have uploaded my URLs and a red cursor appears when I try to run the code. Could you please guide me on this?

    Thanks & Regards,
    Deepak

    • Hi David, I did have a play around using Google Sheets a while back. Getting the list of URLs from a sheet is fairly straightforward as I recall, but I couldn’t find a way to export the results back into the sheet. I’ll keep working on it when I have time, but in the meantime you’ll just have to copy and paste from a CSV.

  2. Hi David.
    Thank you for this.
    I am getting an error here early in the process:

    —————–
    KeyError Traceback (most recent call last)
    in ()
    21
    22 largest_contentful_paint = pagespeed_results_json['lighthouseResult']['audits']['largest-contentful-paint']['displayValue'].replace(u'\xa0', u'') # Largest Contenful Paint
    —> 23 first_input_delay = str(round(pagespeed_results_json['loadingExperience']['metrics']['FIRST_INPUT_DELAY_MS']['distributions'][2]['proportion'] * 1000, 1)) + 'ms' # First Input Delay
    24 cumulative_layout_shift = pagespeed_results_json['lighthouseResult']['audits']['cumulative-layout-shift']['displayValue'] # CLS
    25

    KeyError: 'loadingExperience'
    ————————-

    Thanks.

    Luke

    • Hi Luke, I just ran the script and it looks to be working for me. Have you set up an API key? Does the error come up straight away or after crawling a number of URLs?

  3. Hello,

    I get the following error:

    KeyError Traceback (most recent call last)
    in ()
    21
    22 largest_contentful_paint = pagespeed_results_json['lighthouseResult']['audits']['largest-contentful-paint']['displayValue'].replace(u'\xa0', u'') # Largest Contenful Paint
    —> 23 first_input_delay = str(round(pagespeed_results_json['loadingExperience']['metrics']['FIRST_INPUT_DELAY_MS']['distributions'][2]['proportion'] * 1000, 1)) + 'ms' # First Input Delay
    24 cumulative_layout_shift = pagespeed_results_json['lighthouseResult']['audits']['cumulative-layout-shift']['displayValue'] # CLS
    25

    KeyError: 'loadingExperience'

    • Hi Rachel, I just ran the script and it looks to be working for me. Have you set up an API key? Does the error come up straight away or after crawling a number of URLs?

  4. Hello David,

    I replaced “INSERT API KEY HERE” with my API key.
    “Choose Files” is greyed out.
    I do “Run Cell” and “Choose Files” is no longer greyed out.
    I choose a CSV containing a few URLs (a single column and no header).
    After choosing the CSV, it indicates “100% done”, and then immediately afterwards an error appears (no URL analysis is carried out).

