SEO Automations That Can Make Your Life Easier
SEO is an area of Digital Marketing that is often perceived as requiring disproportionate amounts of manual, repetitive human work.
This perception turns away a lot of aspiring SEOs from the domain because they think it demands too much effort.
And I am not gonna lie to you: compared to other aspects of Digital Marketing, it does require a lot of effort.
But SEO isn't as labor-intensive as it is made out to be.
There are tools, mechanisms, and hacks that can save a ton of time; I think automation is the more precise word to use here.
Here are some automations and hacks that I have found extremely helpful, organized around situations that arise day to day.
Situation #1: You Want to Generate a Meta Tag Recommendations Spreadsheet
The traditional way to go about this is to open each and every page, check its source code (or a meta Chrome extension), and copy the existing meta tags one by one. But there are two smarter ways that should be leveraged instead; I prefer the second one.
The 1st way: you scan the website with Screaming Frog (free or paid version) and it will list every URL it crawled with its respective Meta Title and Meta Description. You export that and begin creating recommendations in additional columns of that spreadsheet.
Using Screaming Frog's Internal tab, you can export all the data, including your Meta Titles & Meta Descriptions
In the free version, segregating and sorting the data is a challenge when you want to do this only for a specific type of landing page, for example /Product/ landing pages. That's where the 2nd way comes into play.
The 2nd way involves 2 tools: one is the Scrape Similar Chrome extension, which scrapes data using XPath, and the other is a Google Spreadsheet with a custom formula
First, you open the sitemap containing the links you want, right-click one of the links, and click "Scrape similar"
In two simple steps, I have copied all the links from that specific sitemap
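By the way, if you can't use the extension, a plain spreadsheet formula can often pull the links straight out of an XML sitemap too. This is a sketch under assumptions (the sitemap URL here is a placeholder; the local-name() trick is needed because XML sitemaps declare a namespace that a plain //loc path in IMPORTXML won't match):

=IMPORTXML("https://example.com/sitemap.xml","//*[local-name()='loc']")

Paste it into any cell and it fills the column below with every URL listed in that sitemap.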
Now I paste these links into the Google Spreadsheet, and then in another column, using a formula, I can extract their Meta Titles and Meta Descriptions
Let me show you
In this spreadsheet you can see how I have extracted the title tag from the URL using the formula
The formula is
=IMPORTXML(A2,"//title")
A2 is the cell location; replace it with the cell that holds your URL
In this spreadsheet you can see that I have extracted the meta description using the formula, and the formula is
=IMPORTXML(A2,"//meta[@name='description']/@content")
You drag these formulas down the rows and you get the data for every URL. Pretty cool, right?
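A caveat: IMPORTXML can slow down or start throwing errors on large URL lists. If that happens, a small Apps Script custom function can do the same job in one go. This is a minimal sketch of my own (not part of the original workflow); the regexes assume fairly standard HTML where the meta description's name attribute comes before its content attribute:

function getTitleAndDescription(url) {
  // Fetch the page without throwing on 4xx/5xx responses
  var response = UrlFetchApp.fetch(url.trim(), { 'muteHttpExceptions': true });
  var html = response.getContentText();
  // Naive regex extraction; good enough for most standard pages
  var title = (html.match(/<title[^>]*>([^<]*)<\/title>/i) || [])[1] || "";
  var description = (html.match(/<meta[^>]+name=["']description["'][^>]+content=["']([^"']*)["']/i) || [])[1] || "";
  // Returning a 2D array fills two adjacent cells: Title | Description
  return [[title, description]];
}

With that added in the Script Editor, =getTitleAndDescription(A2) fills the title and the description into two side-by-side columns.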
Situation #2: You want to check the HTTP status codes of the URLs
Checking the HTTP status codes of web pages is an important practice that we SEOs have to perform from time to time to weed out 404s and temporary redirection pages, so that Google's crawl budget isn't wasted on unnecessary pages and you can make the most of the crawl budget you do get.
One way to go about it is to wait for Google Search Console to highlight these errors in the Coverage report as it performs its crawl, or you can use a tool like Screaming Frog (the free version will crawl only 500 URLs, and that count includes image files, paginated pages, and JS & CSS files)
There is a way to get the complete data for free, and this too can be accomplished via a Google Spreadsheet
Here's how it's done
First, you paste the URLs into the spreadsheet; then you add a script in the Apps Script editor; and finally a formula does the extraction
The formula to use is =getStatusCode(A2), which simply calls the function defined in the script below. The great thing is that it also surfaces 301 redirects: because the script doesn't follow redirects, a URL that eventually resolves to 200 OK will still show up here as 301 if a redirect sits in front of it
Here is the script that you will need to add in the Script Editor
function getStatusCode(url) {
  var options = {
    'muteHttpExceptions': true, // don't throw on 4xx/5xx; hand back the response instead
    'followRedirects': false    // report the 301/302 itself rather than the final destination
  };
  var url_trimmed = url.trim();
  var response = UrlFetchApp.fetch(url_trimmed, options);
  return response.getResponseCode();
}
Source for the script → Adham El Banhawy
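Because the script stops at the first response, a 301 row tells you a redirect exists but not where it points. As a small addition of my own (not part of the original script), a companion function can read the redirect target from the Location header:

function getRedirectLocation(url) {
  var options = { 'muteHttpExceptions': true, 'followRedirects': false };
  var response = UrlFetchApp.fetch(url.trim(), options);
  // Only redirect responses (3xx) carry a Location header
  var code = response.getResponseCode();
  if (code >= 300 && code < 400) {
    var headers = response.getHeaders();
    return headers['Location'] || headers['location'] || "";
  }
  return "";
}

Put =getRedirectLocation(A2) in the next column and you see the redirect target right beside the status code.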
Situation #3: You want to check which pages of a site are indexed/ not indexed on Google
One way to go about this is to run the Google search operator site:example.com; Google will show the number of results, and you can download the SERP report using SEOQuake. But that only reveals the indexed pages, not the unindexed ones; it takes a few extra steps to work those out.
There is a way through Google Spreadsheet to get to this data sooner.
This is the formula I used to check which pages from my blog were indexed (all of them, it appears): =checkIfPageIsIndexed(A2). Again, the formula simply calls the function defined in the script.
And the script you need to power this formula is
function checkIfPageIsIndexed(url) {
  // Query Google with the site: operator for this exact URL
  url = "https://www.google.com/search?q=site:" + url;
  var options = { 'muteHttpExceptions': true, 'followRedirects': false };
  var response = UrlFetchApp.fetch(url, options);
  var html = response.getContentText();
  // Google shows this message when a site: query returns no results
  if (html.match(/Your search -.*- did not match any documents./)) {
    return "URL is Not Indexed";
  }
  return "URL is Indexed";
}
Source of the script is Black Hat World
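One note from my side: each cell triggers its own request to Google, and rapid-fire requests can get temporarily blocked. A small wrapper (my own hypothetical addition, not from the original source) lets a single formula such as =checkIndexationForRange(A2:A15) walk a whole range with a short pause between requests; keep the range modest, since Sheets custom functions time out after about 30 seconds:

function checkIndexationForRange(urls) {
  // When called with a range like A2:A15, urls arrives as a 2D array
  return urls.map(function(row) {
    if (!row[0]) return [""];
    Utilities.sleep(1000); // brief pause so the requests to Google aren't rapid-fire
    return [checkIfPageIsIndexed(row[0])];
  });
}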
Situation #4: Extract All the People Also Ask Questions
For this I use a free SEO tool called "SEO Ruler"
Here is how it's done
First, you open the software and go to SERP Extract, add your query, and perform the Google search. Then, below the results, you choose PAA (question) and tap Auto Click PAA (10 times); that way ten questions will be clicked, which automatically triggers even more questions for you to extract
As you can see in the image of the SEO Ruler dashboard, to the left of Auto Click you have the option to extract; once you click on that, you can extract the whole list of People Also Ask questions into an Excel sheet at once.
It took a lot of effort to put this together for the LinkedIn Tribe, so please do share it if it added any value to you or if you learned anything new. Moving ahead, I will be sharing more SEO Automations with you all.