
How do we scrape a SERP?


Scraping a SERP means obtaining the top 10 or so results from a Google search for any word and putting them in a document.

A recent Flat 101 project called for us to repeatedly obtain the first 100 results of a Google search for a series of keywords.

What could we do to speed up the process as much as possible?

There are several perfectly valid approaches, but a word of warning before you try them yourself: be careful when scraping results pages continuously, because Google can detect you and bombard you with captchas for an indefinite period. It is therefore a good idea to buy proxies and use them for this process.
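To illustrate the idea, here is a minimal sketch of how proxy rotation plus a politeness delay might be organised. The proxy addresses and the delay are assumptions; replace them with the proxies you have actually bought:

```python
import time
from itertools import cycle

# Hypothetical proxy list -- replace with the proxies you have bought.
PROXIES = [
    "http://proxy-1.example.com:8080",
    "http://proxy-2.example.com:8080",
    "http://proxy-3.example.com:8080",
]

def polite_fetch_plan(keywords):
    """Pair each keyword with the next proxy in rotation.

    Rotating proxies and pausing between requests reduces the chance
    of Google flagging the traffic and serving captchas.
    """
    proxy_pool = cycle(PROXIES)
    return [(keyword, next(proxy_pool)) for keyword in keywords]

for keyword, proxy in polite_fetch_plan(["buy shoes", "running shoes", "trail shoes"]):
    # Here you would issue the actual search request through `proxy`,
    # then wait a few seconds before the next one, e.g.:
    # time.sleep(10)
    print(keyword, "->", proxy)
```

The actual request and sleep are left as comments: the point is only that each search goes out through a different proxy and never back-to-back.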

As I am not a technician, I only use external tools that can make my work easier without having to bother my colleagues in development.

Tool 1: Mozbar for your browser

Using a browser add-on can be a quick solution if you don’t need to handle too many keywords or a large number of results.

Mozbar offers you a wealth of SEO information, but also the option to export it as a CSV:

(Screenshot: exporting the SERP with Mozbar)

If we use “Export to CSV”, it will automatically download a file with the first 10 results of our search in comma-separated format, ready to open in Excel.

Pro Tip: To obtain more than 10 results, go to Google’s search settings and set the default number of results shown per page to 10, 20, 30, 40, 50 or 100.
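As a quick alternative to changing the settings, the number of results per page has historically also been controllable through the `num` URL parameter. This parameter is not officially documented, so treat it as an assumption worth verifying; building such a URL might look like:

```python
from urllib.parse import urlencode

def google_search_url(keyword, num_results=100):
    """Build a Google search URL asking for `num_results` per page.

    The `num` parameter has historically been honoured by Google,
    but it is undocumented and may stop working at any time.
    """
    params = {"q": keyword, "num": num_results}
    return "https://www.google.com/search?" + urlencode(params)

print(google_search_url("flat 101"))
# https://www.google.com/search?q=flat+101&num=100
```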

Tool 2: Oscraper for Chrome

Oscraper is another browser extension whose free version offers the same function as Mozbar. There is also a 17-dollar version that gives you a series of editable options to fine-tune the resulting document:

Extract the URL with or without http

Extract the domain with or without www

Extract only the domain without the internal folders

Extract the URLs of the AdWords ads as well

Exclude domains that you are not interested in
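If you only have the free version, most of those cleanup options can be reproduced afterwards with a short script. A minimal sketch using the standard library (the exact behaviour of Oscraper’s paid options is an assumption here):

```python
from urllib.parse import urlparse

def clean_result(url, keep_www=False, domain_only=True):
    """Normalise a SERP result URL, mimicking the export options:
    strip the scheme, optionally strip "www.", optionally drop
    the internal folders.
    """
    parsed = urlparse(url)
    host = parsed.netloc
    if not keep_www and host.startswith("www."):
        host = host[len("www."):]
    if domain_only:
        return host
    return host + parsed.path

print(clean_result("https://www.flat101.es/blog/seo/"))
# flat101.es
print(clean_result("https://www.flat101.es/blog/seo/", domain_only=False))
# flat101.es/blog/seo/
```

Excluding domains you are not interested in is then just a list comprehension over the cleaned results.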

Tool 3: Simple SERP Scraper

We first saw this at the MeasureCamp Madrid event, in the talk by MJ Cachón. It is a free desktop tool available at Urlprofiler.com, and it opens up plenty of possibilities:

(Screenshot: Simple SERP Scraper)

Once we have installed the tool on our machine, whether Mac or Windows, we can see the main options:

Which version of Google we want to extract

How many results

How much delay we want between searches

Finally, we can enter the keywords we want to measure and we are ready to roll. The option to use our proxies to avoid potential problems is clearly displayed.

Do you know any other way? Maybe with an import into Excel? Feel free to make your contribution…




