How to increase your SEO traffic in just 30 days - Part 2
Achin Gupta
Associate Director - Product | Finding PMFs(0-1) and Scaling legacy products | Working Across Gig Economy, EdTech & Marketplaces
I hope you were able to follow the first 7 days and made the most of them. If you haven't yet, please give that a read first.
How to increase your SEO traffic in just 30 days - Part 1
Day 8 - USE ANCHOR TEXTS TO INCREASE RELEVANCE
Anchor texts describe a link and tell the user what to expect from it. Rather than clicking on a raw URL, users click on keywords they actually understand and are then taken to the URL behind the anchor text.
Ideally, the corresponding keyword of the landing page should always be used in the anchor text of internal links. The more webpages use the same keyword to point to a subpage, the more signals the search engine receives that this landing page must be highly relevant for that keyword. This, in turn, means that the page will rank better for this and similar keywords. Avoid using non-descriptive anchor texts (e.g., “here”, “more”, etc.) in your internal links and focus on keywords instead.
Hands-on tips:
- Try to use the same anchor text when you link to a landing page.
- Make sure the anchor text matches the content of the landing page.
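As a sketch, this is what the difference looks like in the markup of an internal link (the URL and keyword here are placeholders):

```html
<!-- Non-descriptive anchor text: tells users and search engines nothing -->
<a href="/guides/keyword-research.html">Click here</a>

<!-- Descriptive anchor text: the landing page's keyword is the link text -->
<a href="/guides/keyword-research.html">keyword research guide</a>
```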
Day 9 - KEEP CLICK PATHS SHORT
Website users want to get to their desired page as fast as possible. That means you should keep your click paths as short as possible.
The click path is the route a user takes to reach their desired page. Think of an online shopping experience: the user might start on the homepage and end at the shopping cart. Their click path is the sequence of pages they have to go through to reach the desired product and buy it. The length of the click path plays a key role in the navigation on your website.
Search engines also benefit from short click paths when crawling. If the Google bot goes through your website and manages to access all sub-pages within just a few clicks, it can use its limited crawl budget to scan and index more pages. Optimizing the click path pays off for both usability and search engine crawling!
As a rule of thumb, every sub-page should be accessible with no more than 3 clicks.
Hands-on tips:
- Add breadcrumb navigation to enable your users to orient themselves within the click path.
- Limit the length of the click path to a maximum of 4 clicks.
- Use a smart filter and search function on your website to avoid long click paths.
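The three-click rule of thumb can also be checked programmatically. A minimal sketch, assuming you already have your site's internal-link structure as a dictionary (the pages below are made up):

```python
from collections import deque

def click_depths(links, start="/"):
    """Breadth-first search from the homepage: the depth of each page
    is the minimum number of clicks needed to reach it."""
    depths = {start: 0}
    queue = deque([start])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in depths:  # first visit = shortest click path
                depths[target] = depths[page] + 1
                queue.append(target)
    return depths

# Hypothetical internal-link graph: page -> pages it links to
site = {
    "/": ["/shop", "/blog"],
    "/shop": ["/shop/shoes"],
    "/shop/shoes": ["/shop/shoes/red-sneakers"],
}
depths = click_depths(site)
print(depths["/shop/shoes/red-sneakers"])  # 3 clicks -- at the recommended limit
```

Pages that come back with a depth above three are candidates for better linking, e.g. from the homepage or a category page.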
Day 10 - IMPROVE YOUR SITE’S ACCESSIBILITY
Troubleshooting technical errors and ensuring that your website is always accessible is one of the biggest hurdles to manage when thinking about good SEO practices over time.
You can use a sitemap.xml file to inform search engines about all URLs on your website. This sitemap can be read by search engines and contains a list of all important URLs and metadata on the website. Google bot uses this list as a basis to go through the website and review the corresponding URLs. The sitemap.xml file always has the same structure:
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.mywebsite.com/firstpage.html</loc>
    <priority>1.0</priority>
    <changefreq>weekly</changefreq>
    <lastmod>2020-06-12</lastmod>
  </url>
</urlset>
Both the XML version and the character encoding are specified in the file. The URLs can be supplemented with additional metadata, e.g. the frequency with which the URL changes (<changefreq>) or the date of the last modification (<lastmod>).
The sitemap.xml can be created using most content management systems. There are also special sitemap generators available for creating the file.
After creating the file, you should upload it to the Google Search Console. Google then checks the sitemap.xml for accuracy. However, there is no guarantee that all webpages given in the sitemap will be crawled and indexed. This is up to the search engine.
Hands-on tips:
- Regularly update your sitemap.xml.
- Always adjust your sitemap.xml whenever you change URLs or edit content.
- Check the status codes of the webpages using the sitemap and fix any accessibility errors.
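A sitemap.xml in the format shown above can also be generated with a short script. A minimal sketch using only Python's standard library (the URL and dates are placeholders):

```python
import xml.etree.ElementTree as ET

# Official sitemap protocol namespace
NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(entries):
    """entries: list of (loc, lastmod, changefreq, priority) tuples."""
    urlset = ET.Element("urlset", xmlns=NS)
    for loc, lastmod, changefreq, priority in entries:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = loc
        ET.SubElement(url, "priority").text = priority
        ET.SubElement(url, "changefreq").text = changefreq
        ET.SubElement(url, "lastmod").text = lastmod
    return ET.tostring(urlset, encoding="unicode")

sitemap = build_sitemap([
    ("https://www.mywebsite.com/firstpage.html", "2020-06-12", "weekly", "1.0"),
])
print(sitemap)
```

In practice you would feed this function the list of URLs exported from your CMS or database, write the result to `sitemap.xml`, and place it in your site's root directory.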
Day 11 - TELL SEARCH ENGINES WHAT TO CRAWL
The robots.txt is a text file that tells search engine crawlers which directories to crawl (allow) and which not to crawl (disallow). Well-behaved bots access the robots.txt file before crawling the website.
Using the robots.txt file helps you ensure that search engines can find all the important content on your website. If important pages, resources, or JavaScript elements are excluded from crawling, search engines will not be able to index your website correctly.
Below is the simplest form of robots.txt:
User-agent: *
Disallow:
In this case, the instructions apply to all bots (*), and the empty Disallow line means there are no crawling restrictions. After creating the robots.txt file, you should save it in the root directory of your website.
If you do not want a specific area of the website to be crawled, you should specify this using a “disallow” in the file.
User-agent: *
Disallow: /thisdirectory
Hands-on tips:
- Use a robots.txt file to give instructions to search engines.
- Make sure that important areas of your website are not excluded from crawling.
- Regularly check the robots.txt file and its accessibility.
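Whether a given URL is blocked by your robots.txt can be verified with Python's standard library. A small sketch using the rules from the example above (normally the file would be fetched from your site's root):

```python
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
# Parse the same rules as the robots.txt example above
rp.parse([
    "User-agent: *",
    "Disallow: /thisdirectory",
])

print(rp.can_fetch("*", "/thisdirectory/page.html"))  # False -- blocked
print(rp.can_fetch("*", "/public/page.html"))         # True -- allowed
```

Running such a check over the URLs in your sitemap is a quick way to catch important pages that are accidentally excluded from crawling.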
Orientation & performance of the website
Your website should be thematically oriented to specific keywords for optimal rankings. The website should fulfil the needs of your users. At the same time, it is also important for your web content to load quickly in order to guarantee user satisfaction.
Day 12 - DO YOUR KEYWORD RESEARCH
Keyword research helps you identify keywords that appeal to a target audience and expand the reach of your content.
When you use keyword research tools, it helps you identify what kinds of content users are looking for on any given topic. Always make sure to prioritize time for keyword research.
When selecting keywords, you should also keep the purpose of your website in mind. Opt for transactional keywords if the main intention is sales, or informational keywords if your website aims to provide readers with important information.
Below are some of the recommended tools to help you research appropriate keywords:
- Google Keyword Planner: The Keyword Planner is part of the AdWords advertising program. You need a valid AdWords account to use this free tool. You can start searching for keywords and suitable ideas as soon as you register. You can also enter websites and view suitable keywords based on their content. The tool also shows you information about the monthly search volume.
- Google Trends: This free tool shows you how often certain search terms are entered over time. The tool also shows you possible peaks in demand. Google Trends is well-suited for seasonal and event-related keywords.
- Google Search: When you google anything, Google provides suggestions as you type, based on the most heavily searched keywords that your current search matches. This includes long-tail keyword suggestions based on your short-tail entry. On a tight time or money budget, take advantage of this easy solution!
- Ubersuggest: Ubersuggest is a classic keyword research tool. It goes through the recommendations of Google Suggest and shows you the most appropriate search terms.
Day 13 - ENSURE NEAT WEBSITE NAVIGATION
A navigation menu helps users easily find what they're looking for on a website. A well-structured navigation menu is key for the user experience. The navigation structure is also important for search engines, since it helps them determine how important a URL is. The navigation on your website should be structured logically so that users can find their way around without problems. Long dwell times from your users can help improve your search rankings, so the user experience is an important SEO consideration!
Hands-on tips:
- Use anchor texts in navigation elements. These help search engines understand the subject of the landing page better.
- Identify pages that have a high bounce rate and take measures to reduce it.
- Use breadcrumb navigation for a better overview.
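Breadcrumbs can also be exposed to search engines as structured data. A sketch that builds schema.org BreadcrumbList markup as JSON-LD (the page names and URLs below are made up):

```python
import json

def breadcrumb_jsonld(trail):
    """trail: ordered list of (name, url) pairs from homepage to current page."""
    return json.dumps({
        "@context": "https://schema.org",
        "@type": "BreadcrumbList",
        "itemListElement": [
            {"@type": "ListItem", "position": i, "name": name, "item": url}
            for i, (name, url) in enumerate(trail, start=1)
        ],
    }, indent=2)

markup = breadcrumb_jsonld([
    ("Home", "https://www.mywebsite.com/"),
    ("Shop", "https://www.mywebsite.com/shop"),
    ("Shoes", "https://www.mywebsite.com/shop/shoes"),
])
print(markup)  # embed in a <script type="application/ld+json"> tag
```

The generated JSON-LD goes into the page's HTML, where it helps search engines display the breadcrumb trail in search results.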
Day 14: IMPROVE YOUR SITE’S LOADING SPEED
A webpage’s loading speed is very important for its ranking. Users don’t want to spend time waiting for a page to load; they want to see your content immediately. Webpages with high loading times have high bounce rates, and high bounce rates often result in poor rankings.
The website’s loading speed is even more important for mobile users since the lower bandwidths can further delay the loading of a website.
There are many technical ways to optimize loading time. You can use Google PageSpeed Insights to check how fast your website loads.
Hands-on tips:
- Check the page speed of your website.
- Identify (very) slow pages and find out the cause.
- Avoid using giant image files and compress images to the smallest size that still looks good.
- Optimize CSS and JavaScript files. You can save these in external files on the server for performance reasons.
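As one illustration of the last tip, comments and whitespace can be stripped from CSS before it is served. A deliberately naive sketch; a real build pipeline would use a dedicated minifier instead:

```python
import re

def minify_css(css):
    """Very naive CSS minifier: strips comments and collapses whitespace.
    Real projects should use a dedicated minification tool."""
    css = re.sub(r"/\*.*?\*/", "", css, flags=re.S)  # drop /* comments */
    css = re.sub(r"\s+", " ", css)                   # collapse whitespace
    css = re.sub(r"\s*([{}:;,])\s*", r"\1", css)     # tighten punctuation
    return css.strip()

css = """
/* main styles */
body {
    margin: 0;
    font-family: sans-serif;
}
"""
print(minify_css(css))  # body{margin:0;font-family:sans-serif;}
```

Smaller CSS and JavaScript files mean fewer bytes over the wire, which directly improves loading speed, especially on mobile connections.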
I hope you are able to make the best of these tips and the initial 7-day planner. I'll be sharing again next week with 7 new strategies complementing this week's.
Please do feel free to ask me any questions on this. I'll be more than glad to assist you.
Do let me know how Week 1's strategy impacted your SEO.