How high you rank in search engines matters for your business. A high ranking for the right keywords can drive immense traffic to your website without you having to buy advertising, and it’s a great way to increase adoption of your product. As a business owner you want to stay up to date with your SEO rankings so you appear at the top of the results and are the first link people click when searching for your type of product.
Today we’ll show some pitfalls of SEO and how we can optimize and automate our ranking checks to get the most out of our SEO and get the organic growth flowing in.
Ranking algorithms have evolved a lot, and it’s no longer always straightforward to rank number one. In 2019, for example, Google launched its new BERT algorithm, which got a lot of attention as every SEO professional needed to learn (and probably is still learning) how to optimize for it. Where before you could get by with optimizing keyword usage, it’s now also about the positioning of those keywords and whether the content you serve is relevant.
Besides this, technical scores also play a role: think of mobile optimization, optimal network delivery, and UX changes. For example, Google also looks at how much your page shifts around while loading (because of cookie banners, wrong image heights, …). Luckily there are tools like Google Search Console, Lighthouse and many more that help us with this.
Google Search Console
The Google Search Console is your new go-to address for tracking your SEO changes. It includes a lot of relevant information to get that high ranking in the first place. A good rule of thumb here (and perhaps most of the time) is: is it red? Then it needs fixing. Everything that is red, whether it’s a page with an error or a failing Core Web Vitals check, will impact your ranking and thus cost you a lot of potential income.
First you’ll have to verify that you own the site; there are several ways to do this. We used the DNS record, as it’s easy to remove afterwards and doesn’t require a re-deploy. If you don’t have the technical know-how, you can always ask your site admin to do this for you.
After setting up the Search Console for the first time, it’s ideal to submit your sitemaps. This helps Google identify the pages that need to be crawled and ensures your pages get taken up in its crawling database. From then on it’s a waiting game; crawling takes time, so try to get it (mostly) right the first time.
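If you’ve never written one, a sitemap is just an XML file listing the URLs you want crawled. A minimal example looks like the following (the URLs here are made up for illustration):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2020-06-01</lastmod>
  </url>
  <url>
    <loc>https://example.com/pricing</loc>
    <lastmod>2020-06-01</lastmod>
  </url>
</urlset>
```

You submit its URL under the Sitemaps section of the Search Console.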
The Search Console also has a great Performance dashboard that shows how many impressions and clicks you get for each keyword. Some keywords may show up that you aren’t optimizing for, but that’s okay!
An important metric here is the clicks-to-impressions ratio, better known as the click-through rate (CTR). When impressions are high but clicks are low, it can be for a few reasons:
- Your site ranks low
- Your title and description don’t invite a click
Luckily we’ll learn further on how we can track this.
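To put a number on that ratio: the CTR is just clicks divided by impressions. A quick Python sketch, with made-up keyword numbers standing in for what you’d read off the Performance dashboard:

```python
def ctr(clicks: int, impressions: int) -> float:
    """Click-through rate as a percentage; 0 when there are no impressions."""
    if impressions == 0:
        return 0.0
    return round(100 * clicks / impressions, 2)

# Hypothetical numbers pulled from the Performance dashboard
keywords = {
    "ai scraper": {"clicks": 120, "impressions": 1500},
    "web scraping tool": {"clicks": 8, "impressions": 4000},
}

for kw, stats in keywords.items():
    print(kw, ctr(stats["clicks"], stats["impressions"]), "%")
# ai scraper 8.0 %
# web scraping tool 0.2 %
```

A keyword with thousands of impressions but a CTR well under 1% is exactly the kind of candidate for a better title and description.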
There are many search engines out there; however, optimizing for the top three targets more than 96% of the total search engine market! The top three are Google, Bing and Yahoo!. As you might have guessed, Google takes the crown with a massive 91% usage share, which makes targeting Google alone worthwhile.
The image shows the total usage share of search engines as of June 2020, credits to StatCounter.
Here we clearly see that optimizing SEO specifically for Google is very important! Imagine being number one for a keyword on the engine that 91% of people who browse the internet use!
Luckily Google is also the provider with the best toolset to get the SEO optimization done right!
The BERT algorithm
It’s new, it’s shiny, and… nobody knows it well. Introduced in 2019, it’s still something we don’t know how to optimize for 100%. There are some great tutorials out there that tell you more about it.
You probably want to identify keywords to increase organic growth and get those clicks rolling in. But how do you identify the keywords that will drive those clicks? There are plenty of better tutorials out there on identifying keywords, so I’ll leave it up to them.
Staying up-to-date with rankings
Google crawls the web many times a day, and your rankings can change daily. This means that you won’t always know your ranking for each keyword and would have to search for it every day, over and over again. You could outsource this, but you’d still need the analysis. So how do you stay up to date with what Google or Bing is showing?
This is where web scraping comes in: it’s a very simple way to check your ranking every hour, every day, or however frequently you want.
For example, over at Scraper.AI we want to rank #1 for certain keywords. One of these is “ai scraper”. But it took a while to get there, and checking all our keywords daily would just be a pain, so we automated the process.
We went to the URL where we monitor our keyword. For Google we ended up with something like this: https://www.google.com/search?q=ai+scraper, while the Bing alternative is https://www.bing.com/search?q=ai+scraper.
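Building those monitoring URLs by hand gets tedious once you track more than a couple of keywords, so it’s worth generating them. A small sketch using Python’s standard library; the two search endpoints are the public ones shown above, everything else is illustrative:

```python
from urllib.parse import urlencode

# Public search endpoints for the two engines we monitor
SEARCH_ENDPOINTS = {
    "google": "https://www.google.com/search",
    "bing": "https://www.bing.com/search",
}

def search_url(engine: str, keyword: str) -> str:
    """Return the search-results URL to monitor for a keyword."""
    base = SEARCH_ENDPOINTS[engine]
    return f"{base}?{urlencode({'q': keyword})}"

print(search_url("google", "ai scraper"))
# https://www.google.com/search?q=ai+scraper
```

Feed a list of keywords through this and you have every URL your scraper needs to visit.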
Scraping Search results
Selecting results can be as easy as in the following screenshot. In this case we launched the Scraper.AI extension, went to “Select element” and hovered over the items we were interested in: all non-ad URLs, titles and descriptions. If you want more than ten results, you can also hit the page selector and click the next-page button.
I’m interested in the Title, URL and Description of the rankings
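If you’d rather script the extraction yourself, the same idea works with Python’s standard library. Real search-result markup is far messier and changes frequently (which is exactly why a managed scraper helps), so this sketch parses a deliberately simplified page where each result is a link with a `result` class:

```python
from html.parser import HTMLParser

class ResultParser(HTMLParser):
    """Collects (url, title) pairs from <a class="result"> links."""
    def __init__(self):
        super().__init__()
        self.results = []
        self._in_result = False
        self._href = None
        self._title = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "a" and attrs.get("class") == "result":
            self._in_result = True
            self._href = attrs.get("href")
            self._title = []

    def handle_data(self, data):
        if self._in_result:
            self._title.append(data)

    def handle_endtag(self, tag):
        if tag == "a" and self._in_result:
            self.results.append((self._href, "".join(self._title).strip()))
            self._in_result = False

# A made-up, simplified results page for demonstration
page = """
<a class="result" href="https://scraper.ai">Scraper.AI - AI powered scraping</a>
<a class="result" href="https://example.com">Some competitor</a>
"""
parser = ResultParser()
parser.feed(page)
print(parser.results)
```

The position of your own domain in that list is your ranking for the keyword.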
Now I can get daily updates of my results and check how the SEO changes to my webpage are affecting my ranking. For Scraper.AI we take the delta of the ranking after each scrape to see this increase.
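Computing that delta between two scrape runs is straightforward once you store each run’s positions per keyword. A sketch with hypothetical positions:

```python
def ranking_delta(previous: dict, current: dict) -> dict:
    """Positions moved per keyword; positive means we climbed.

    A keyword that wasn't in the previous run gets a delta of None.
    """
    deltas = {}
    for kw, position in current.items():
        if kw in previous:
            deltas[kw] = previous[kw] - position  # rank 3 -> 1 is +2
        else:
            deltas[kw] = None
    return deltas

# Hypothetical positions from two scrape runs, a day apart
yesterday = {"ai scraper": 3, "web scraping tool": 12}
today = {"ai scraper": 1, "web scraping tool": 14}
print(ranking_delta(yesterday, today))
# {'ai scraper': 2, 'web scraping tool': -2}
```

A negative delta right after an SEO change is the signal to investigate.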
We also scraped the descriptions to see why our competition is potentially doing better. This way we get a better understanding of the competition and might gain insight into which keywords they’re using and scoring well for.
Scraping the Google Search Console
Next to the results, it’s also very relevant to get the Performance data for each keyword. Let’s head over to the Google Search Console, go to the Performance tab and start scraping that list. Some of the most relevant metrics here are:
- The last update time, so we know when these metrics were updated
- The queries, to get our keywords
- The countries, perhaps we need some localization?
- The devices, might want to optimize more for mobile or tablet?
- Dates, optional, in case you want to make a fancy graph yourself. Though we can also just scrape the graph image 😃
An example of a Google Search Console
This is where most scrapers fall short: scraping pages that require authentication is not always supported and will require you to send cookies. Luckily, over at Scraper.AI we support authenticated page scraping.
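Under the hood, an authenticated scrape is just a normal request that carries your session cookies. A minimal stdlib sketch; the cookie name and value here are placeholders (you’d copy the real ones from your browser’s dev tools), and the request is only built, not sent:

```python
from urllib.request import Request

def authenticated_request(url: str, cookies: dict) -> Request:
    """Build a GET request that sends session cookies along."""
    header = "; ".join(f"{name}={value}" for name, value in cookies.items())
    return Request(url, headers={"Cookie": header})

req = authenticated_request(
    "https://search.google.com/search-console",
    {"SID": "placeholder-session-id"},  # hypothetical cookie, not a real one
)
print(req.get_header("Cookie"))
# SID=placeholder-session-id
```

Note that session cookies expire, so any automation built this way needs the cookies refreshed periodically.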
We always recommend logging in as a user that has the least permissions needed to complete the task. At respectable companies like ours the data is heavily encrypted and stored away, but a least-permission approach increases security even more.
Ironing out the errors
While we’re in the Search Console, it’s very helpful to also identify the errors that might arise. For example, our sitemap is rather important. Head over to Sitemaps and scrape the table of submitted sitemaps. This way we can automatically track when something goes wrong and the number of URLs a sitemap contains.
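That URL count is easy to check on your own side too: parse the sitemap and count its entries, then alert when the number drops unexpectedly. A sketch on an inline sample (a real run would fetch your live sitemap URL instead):

```python
import xml.etree.ElementTree as ET

# The standard sitemap XML namespace
SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def count_urls(sitemap_xml: str) -> int:
    """Number of <url> entries in a sitemap document."""
    root = ET.fromstring(sitemap_xml)
    return len(root.findall(f"{SITEMAP_NS}url"))

sample = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc></url>
  <url><loc>https://example.com/pricing</loc></url>
</urlset>"""

print(count_urls(sample))
# 2
```

Comparing this number against what the Search Console reports as discovered catches broken sitemap generation early.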
A lot has changed in the SEO world, and a lot of manual work is involved in getting that number one spot and being the king/queen of a specific search result. Only then will you be able to get the most traffic for that keyword. Afterwards it’s up to you to convert that traffic into paying users!
Tracking changes to your SEO rankings used to require a lot of manual tasks and time. Web scraping can decrease the work and time considerably and deserves a place in your workflow.
Not only can we track the increase in keyword hits, we can also track errors and optimize our SEO results.
Hey 👋, my name is Maxim from https://scraper.ai. Let us know what you think in the comments below!