4 Underrated SEO Tactics That Your Competitors Aren't Using.
Dheeraj Satta
Chief Commercial Officer (CCO) at AWS Group, Driving Revenue Growth, Oversight of Corporate Strategy, Product Development and Crisis Management
These SEO spells aren't cast nearly enough (but they totally should be).
While many things tend to get worse with time, search engines tend to get better, at least for users: what they strive for is to give users exactly what they want. The SEO world, in turn, has to scramble to keep up with the search engines' aspirations. Sometimes it feels like you have to jump, count to three, and sneeze while drinking virgin tears under a full moon.
Keywords, backlinks, mobile friendliness, and many other SEO must-dos are buzzing in our ears, and our hearts long for something fresh, fast, and fiery. So we thought, why not call for some brainy authority advice? Done! Not so long ago, we asked some SEO superheroes about underrated SEO tactics that people should focus on.
As a result, almost everyone elaborated on a different facet of quality in SEO. Quality is a many-sided thing indeed, and content quality is the thorniest side of all. It is well known that content is now crucial for ranking, and all the newly built Google roads lead this way. However, the concept of content quality can seem so vague that in many cases it is hard even to begin the analysis, because the measurements are not clear enough. But hey, we are all here, and that's a good reason to find some tips together.
1. Optimizing content for comprehensiveness.
When we ask a search engine to find something, how does it know what to look for, given that it is not human? It analyzes a number of factors, one of them being LSI, or Latent Semantic Indexing: a technology that helps Google understand the connections between words, notions, and web pages. In what way?
The search engine analyzes tons of pages for their content. In the course of this analysis, it learns the relationships between terms and the contexts they usually appear in. This, in turn, lets it build expectations as to which terms are likely to appear in a given context. Then, using RankBrain, Google picks results according to these expectations. So, all in all, to rank better you do not have to repeat a few keywords over and over again. You have to build a cloud of associations around your main topic and deliberately use words and concepts from that cloud throughout your content.
Effect on rankings.
So how can we be sure that comprehensive content impacts rankings at all? Check this out. Backlinko did a grand study (1 million Google results!) to measure different factors that may influence the SERPs. For this particular factor of comprehensive content, they used MarketMuse to examine 10,000 URLs from their data set for "Topical Authority". They discovered that content that covers a topic in depth ranks much higher in the SERPs than shallow content.
You see, the assumption that Google prefers results with more comprehensive content is now more than just an assumption.
How to optimize?
TF-IDF (term frequency times inverse document frequency) weighs how often a term appears on a page against how common that term is across a larger set of pages, so it surfaces the terms that actually characterize a topic. By using the TF-IDF formula, you can find the terms and notions you need, and see how to use them, to improve the relevance and rankings of your web page. WebSite Auditor's TF-IDF tool searches for terms and concepts on the pages of your top-ranking competitors, then selects the most important and relevant ones based on how the competitors use them.
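For a quick feel of how such an analysis works under the hood, here is a minimal sketch in Python, assuming scikit-learn is installed. The page texts are placeholders you would scrape yourself, and this is only an illustration of the general technique, not WebSite Auditor's actual implementation:

```python
# A toy TF-IDF gap analysis: which terms do top-ranking competitors
# weight heavily that your own page barely uses? (All texts below are
# placeholders; in practice you would scrape them from the SERPs.)
from sklearn.feature_extraction.text import TfidfVectorizer

competitor_texts = [
    "plain text of competitor page one ...",
    "plain text of competitor page two ...",
    "plain text of competitor page three ...",
]
my_text = "plain text of your own page ..."

# TF-IDF: a term's frequency on a page, discounted by how common
# the term is across the whole set of pages.
vectorizer = TfidfVectorizer(stop_words="english", ngram_range=(1, 2))
matrix = vectorizer.fit_transform(competitor_texts + [my_text])
terms = vectorizer.get_feature_names_out()

competitor_avg = matrix[:-1].mean(axis=0).A1  # average competitor weight
my_weights = matrix[-1].toarray()[0]          # your page's weights

# The biggest gaps are candidate terms to work into your content.
gaps = sorted(zip(terms, competitor_avg - my_weights),
              key=lambda pair: -pair[1])[:20]
for term, gap in gaps:
    print(f"{term:30s} {gap:.3f}")
```

Bigrams are included (ngram_range=(1, 2)) so that multi-word notions like "crawl budget" can surface alongside single terms.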
2. Schema markup.
Coming back to quality and our SEO superheroes, one more highlight came from Bill Slawski, Director of Search at Go Fish Digital, on Schema markup:
"People aren't using Structured Data markup as described at Schema.org and on the Google Developers pages to earn information on rich Knowledge panels and rich snippets. These are sources of information about how search works and where it will be growing in the future that are worth paying a lot more attention to."
The exciting thing is that Bing, Google, Yahoo, and Yandex came together and, in spite of their competition, launched Schema.org, a markup based on the microdata specification. It is basically a uniform format that calls for data consistency across search engines.
Though Schema markup is easy to implement, it seems that few websites bother to use it. According to a study from Searchmetrics, less than one percent of websites use Schema markup.
Yet, we should think a few steps ahead. When giants like these cooperate on a standard, it is safe to assume the standard has great potential to stay with us for a long time. Of course, this does not mean everyone should drop all other formats in favor of this new one. However, we should shift in this direction step by step, as Schema will continue to be supported and developed.
Effect on rankings.
Yes, Google keeps saying that Schema markup is not used in its ranking algos. At the same time, it hints that this feature has enough potential to become part of the ranking algorithm over time.
According to the Moz study based on the analysis of 17,600 keyword search results from Google.com (US), Schema markup has a small direct correlation with rankings.
Indirect correlations are no less powerful. See for yourself. With the help of this specific semantic vocabulary, some content gets indexed and returned in a different way, telling search engines what the data actually means. Search output gets more efficient and relevant. Users get more satisfied, as they are often able to pick the most suitable result right on the SERPs. As a positive consequence, website owners receive higher CTRs and lower bounce rates.
Additional recommendation: regularly check the CTR of your search results in Google Search Console, say, once a month, to see whether your optimization is going in the right direction. Besides, track your rankings with Rank Tracker to check whether increases in clicks have translated into higher rankings.
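If you would rather pull those CTR numbers programmatically than eyeball them in the interface, a rough sketch against the Search Console API could look like this. It assumes you have already obtained OAuth credentials (the creds object) for a verified property; the site URL and dates are placeholders:

```python
# A minimal sketch: pull page-level CTR and average position from the
# Google Search Console API. Assumes `creds` holds valid OAuth
# credentials for your verified property (obtaining them is not shown).
from googleapiclient.discovery import build

service = build("searchconsole", "v1", credentials=creds)

response = service.searchanalytics().query(
    siteUrl="https://www.example.com/",  # placeholder: your property
    body={
        "startDate": "2024-01-01",       # placeholder date range
        "endDate": "2024-01-31",
        "dimensions": ["page"],
        "rowLimit": 25,
    },
).execute()

for row in response.get("rows", []):
    print(f"{row['keys'][0]}: CTR {row['ctr']:.1%}, "
          f"avg. position {row['position']:.1f}")
```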
How to optimize?
1. Choose a suitable type. The bonus of Schema markup is that you can create a so-called rich snippet that will appear in the search listings. This markup signals to search engines that a particular page is about a certain thing from one of the approved categories:
- People
- Products
- Events
- Businesses and Organizations
- Articles
- Restaurants
- TV episodes and ratings
- Book Reviews
- Movies
- Music
- Software Applications
- Recipes
Searchmetrics has also studied the usage frequency of each type of Schema markup.
2. Edit the code. You can do it manually, of course, or with the help of Google's Structured Data Markup Helper (for an idea of what the finished markup looks like, see the sketch after these steps).
3. Test the markup with the help of Google's Structured Data Testing Tool.
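For illustration, here is a minimal sketch of step 2 done by hand: a small Python script that emits a JSON-LD snippet for the Article type. All the field values are made-up placeholders; paste the printed script block into the page's head and validate it with the testing tool from step 3:

```python
# Generate a JSON-LD structured data snippet for a Schema.org Article.
# Every value below is a placeholder used purely for illustration.
import json

article = {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "4 Underrated SEO Tactics",
    "author": {"@type": "Person", "name": "Jane Doe"},
    "datePublished": "2018-05-01",
    "image": "https://www.example.com/images/cover.jpg",
}

# Embed the printed block in the page's <head>.
print('<script type="application/ld+json">')
print(json.dumps(article, indent=2))
print("</script>")
```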
3. Crawl budget.
The most vital things are rarely right on the surface where you can see and tweak them. If you have a website to manage, you need to dig in to see whether everything is operating properly. Search engines crawl through millions of pages to show the best results for our queries, and these results heavily depend on your site's crawl budget, where crawl budget is the number of the site's pages that a search engine can and wants to crawl.
Effect on rankings.
Google says that crawl budget is not a ranking factor. However, crawl budget determines how frequently the important pages of your website are crawled (and whether some pages are crawled at all). So this factor can indirectly influence your ranking positions.
Gary Illyes from Google wrote a blog post titled What Crawl Budget Means for Googlebot, where he mentioned the following factors that have an impact on the crawl budget of a website:
1. Popularity, as popular URLs are crawled more often.
2. Freshness, as the updated pages are crawled more often.
And we believe that the following things have enough power over the crawl budget as well:
3. The number and structure of internal links to a page, as a healthy structure of links within the same domain makes crawling more efficient.
4. The number of backlinks, as the backlinks pass on link juice to the website.
5. Site speed, as speedy sites are crawled more efficiently.
How to optimize?
To get a healthy internal link structure, you should check and fix the things listed below. WebSite Auditor is ready to help you at each stage of the endeavor, so I have illustrated each step with the options from our software (open your WebSite Auditor project, jump to Site Structure > Site Audit or Site Structure > Pages); for a do-it-yourself taste of these checks, see the sketch right after this list:
- Robots.txt file. This file should be accurate and kept up to date. You do not want to waste your crawl budget on privacy policies or expired promotions. And you do want your crawl budget to reach your important pages, so make sure none of them are restricted from crawling when they are not supposed to be.
- Click depth. Bear in mind that your significant pages should be no more than three clicks away from the homepage. This may not always be possible for larger sites. If you run one of those, make sure users can find what they're looking for by utilizing internal search and various widgets (think "similar posts" and "popular posts" for a blog or "you may also like" for an e-commerce store).
- Redirected links. A big number of redirects increases load time and wastes the crawl budget. A chain of three or more redirects is a sign to do some optimizing.
- Orphan pages. Such pages are not linked to from other pages of the site. Even if search engines are more or less able to find these pages through the sitemap, for users they are practically non-existent. You can either delete these pages if they have no value, or link to them from other pages (rebuild your WebSite Auditor project by going to Site Structure > Pages and hitting the Rebuild Project button; at Step 2 of the rebuild, check the Search for orphan pages box and proceed).
- Duplicate pages. Nothing prevents search engines from analyzing such pages, so the crawl budget gets wasted on them. Thus, if possible, get rid of duplicate pages.
- Broken links. When a crawler finds a link to a 4XX/5XX page, you lose a unit of your crawl budget. Moreover, broken links make for an unpleasant user experience.
- Image alt attributes. Search engines cannot understand your images unless you provide alt attributes. In other words, image alt attributes serve the same role that anchor text serves for text links. If the images on your website are important, make them visible and understandable.
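Here is that promised sketch: a rough single-page check in Python, assuming the requests and beautifulsoup4 packages are installed, with a placeholder URL. It flags broken links, long redirect chains, and images missing alt text; WebSite Auditor automates the same kinds of checks site-wide:

```python
# A toy crawl-budget spot check for a single page: broken links,
# redirect chains of three or more hops, and images without alt text.
import requests
from bs4 import BeautifulSoup
from urllib.parse import urljoin

page_url = "https://www.example.com/"  # placeholder: your page
soup = BeautifulSoup(requests.get(page_url, timeout=10).text, "html.parser")

for link in soup.find_all("a", href=True):
    url = urljoin(page_url, link["href"])
    try:
        resp = requests.get(url, allow_redirects=True, timeout=10)
    except requests.RequestException as exc:
        print(f"Unreachable: {url} ({exc})")
        continue
    if resp.status_code >= 400:        # a 4XX/5XX link wastes crawl budget
        print(f"Broken link ({resp.status_code}): {url}")
    elif len(resp.history) >= 3:       # long redirect chain
        print(f"{len(resp.history)} redirects on the way to {url}")

for img in soup.find_all("img"):
    if not img.get("alt"):             # alt text acts like anchor text
        print(f"Image missing alt text: {img.get('src')}")
```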
Now you see: if you would like Google to crawl more of your important pages and retrieve better results, go and optimize your crawl budget as if it were a VIP ranking factor.
4. Going social.
For dessert, let's take the words of Tim Soulo, Head of Marketing at Ahrefs, whom we also asked about underrated SEO tactics:
"Apart from keeping all content on your website fresh and awesome, one other underrated tactic is spreading the word about it. The good old "if you publish, they will come" notion. That never happens. Even the best content needs a ton of promotion to thrive."
If you have a website, it is a nice thought to build a bridge to wider audiences. By far the best bridge material is social media. People of all professions, statuses, and geographic locations use social media, each in their own way and dose. And the phenomenon goes like this: we tend to treat the people we follow as friends, and their recommendations as the best option for a particular kind of product or service.
Effect on rankings.
Back in 2010, Google's Matt Cutts published a video in which he said that social signals were a ranking factor. After that, a variety of studies seemed to prove the same thing. You can look at Searchmetrics' Rank Correlation for 2013 and the case studies shown in this infographic from Quicksprout. They showed that social signals such as Facebook Likes and Google +1s were among the factors most strongly correlated with higher rankings in the SERPs.
After these studies, many concluded that these social signals were a cause of higher rankings. But what a bitter blow we all received when, in 2014, the same Matt Cutts clearly stated in a video that Google does not consider social signals a ranking factor.
Cooler heads came around and pointed out that a correlated factor does not have to be a causal factor. For example, Matt Cutts says in his video that websites with high social activity also tend to be so good that they attract other signals, ones that actually do have ranking power.
The reasons for leaving social media out of the ranking mechanism are quite understandable, though. The signals that come from social media accounts are highly volatile, and there are so many of them that indexing them all is a monumental task.
Still, there are some sunny sides to all this dismay.
- Social media pages are indexed. When Matt Cutts says that Google indexes social media pages, he means that when something happens on Facebook or Twitter and that page can be crawled, it can pop up in the SERPs just as any other page can.
- Social media channels are search engines. Another exciting thing is that social media channels are search engines in their own right. These days, people look things up not only in Google and Bing; they can do it right from the search box of any social media channel. And since the social media accounts of even corporate giants have a bit of a personal touch and rich visual content, a user can be 100% satisfied with such a search result.
- There is still Bing. Moreover, while Google backtracked on the issue of social media signals, there is no evidence that Bing has done the same. So Bing's statement still holds true:
"We do look at the social authority of a user. We look at how many people you follow, how many follow you, and this can add a little weight to a listing in regular search results."
Finally, a cherry on the cupcake: in his Pubcon Las Vegas 2013 keynote speech, Matt Cutts said that social signals should not be looked at for "short term" benefits (as a direct ranking signal) but rather as a "long term" play. What does that mean?
Social media does not seem likely to disappear any time soon; on the contrary, it keeps establishing a bigger presence in our lives. So with time, search engines will have to incorporate these signals into their rankings, at least as an indication of which sources can be trusted.
How to optimize?
- Choose your platform. People now spend more and more time on social media, especially Twitter, Facebook, and Instagram. If you have a business and you are not on social yet, consider taking the dive, as all your potential customers and users are there to be found.
- Of course, not every social account suits a particular kind of business or activity. Twitter might not be as efficient for you as a blog or an Instagram account. How do you figure out what is more suitable? The good thing about social media is that you can sneak a peek at the way your competitors do it. Moreover, you can see their followers and the accounts they are subscribed to. If you are interested in exploring this subject further, you can refer to the following related post.
- Make the first move. The first moves toward social inclusion are pretty simple: you create a public profile and add a link to your website, so people who are interested can click through and go further. Once you have established your social presence, understand that adding lots of friends and lots of hashtags won't make you popular overnight.
- If you want quality backlinks and worthy mentions, you have to be choosy about whom you befriend and which blogs you ask to post about you. One niche influencer who recommends you is worth more than a hundred random subscriptions and blog posts.
- Make contact. Remember that feeds are real-time now: things come and go at great speed, and what matters most is an immediate response. To fuel this response, engage with your audience and react to comments, both positive (thank people) and negative (suggest a solution).
- For starters, while your following is modest, you can do this manually. But as your social media reach grows, you may want to use a social media monitoring platform (like Awario) to track mentions of your brand and respond to them in time.
It is mainly this interaction, with all its mistakes, slips, and confusions, that will help you build your reputation, form a clearer picture of what you do and who your audience is, and see where to lay the paths for improvement.
The best thing is that right now you do not have to optimize your social media accounts for any ranking signals. Your social presence, the connections you make, and the traffic you get are signals in their own right, and with the help of this presence Google will adapt its algorithms.
You see that the tactics covered here are not among the well-known ranking factors confirmed by Google. In spite of that, they can influence the ranking factors, directly or indirectly, trigger higher awareness of your brand, and make you more visible in the SERPs.
However, as everything comes and goes, what you truly should do is listen to what your gut tells you. Some new wave of algorithms may feel like a disaster for your rankings. But things change while their cornerstones do not. If you have made a real effort to brush up your website and you understand how the things you implemented really work (if you do not, please take the time to deal with this nuisance; our tools and blog to the rescue!), then you will quickly adapt to the changes, as they are usually not as serious as they seem.