SEO Hygiene for B2B Software Companies with Deepcrawl

Technical SEO reports come in different shapes and forms, and are probably the most popular SEO service out there. Everyone in digital marketing offers them, and they range from a 25-page tech SEO audit, to monthly or quarterly crawls with a few prioritized issues, to before-and-after crawls for big web projects like migrations and rebrands.

All of these formats can be very impactful for the organic visibility of the business, especially if it has been a while since the last check. The longer this period, the bigger the chance that serious issues have piled up, and fixing them would bring significant improvements. That only sounds like good news at first glance. In reality, you have been ranking below your potential and carrying technical SEO and UX issues on the website for a while, which translates into lost rankings, traffic and revenue.

The way to avoid this is to set up a website crawling system that enables you to regularly crawl, review and prioritize key issues based on impact and effort. Then communicate these recommendations to the right team for execution.

Read on to see how to set up such an SEO hygiene system.

Choose the Right Tooling

For this setup you will need crawling software like Deepcrawl or OnCrawl, Google Sheets or Excel, and very basic knowledge of Regex.

At Verto Digital we use Deepcrawl and Google Sheets because these are best-in-class tools that cover a wide range of SEO use cases. Both are web-based, which enables a whole new level of features, like always working with the latest document version and having a full archive of previous versions.

However, you can use any web crawler and spreadsheet with very similar results.

These are paid tools, so you will need to allocate a budget for them, but it is worth the investment. Having such a system in place not only takes care of SEO and UX web hygiene, but is also invaluable in web projects like migrations, rebrands and redesigns.

Segment Your Website by Execution Teams

B2B software websites tend to have a very similar structure. There is the marketing website, which includes the product, demo, blog, webinar, whitepaper and similar pages. It is usually owned by the marketing organization.

Then there are the content-rich sections like the Documentation, Community, Forums, Investors and Newsroom, which are owned by separate teams.

This is where the challenge comes from.

Many of the SEO factors are sitewide, so all sections of the website need basic SEO and UX hygiene.

This is a very difficult task to complete across a number of different teams.

To make it even more complex, orchestrating this effort is a recurring task, so you definitely need to simplify as much as possible.

Map the crawling reports to the organizational structure. This will help get each team’s attention and accountability.

Setting up each report is very easy with basic regular expression (Regex) knowledge. You can follow this logic:

  • Map all tech support teams in your organization and the public pages they support. Typically these are the Main Site (the marketing website), the Documentation, the Community section, the Investors website, etc.
  • Decide on the crawling frequency for each section. If your Documentation team can only work on fixing SEO issues every other month, there is no need to send them monthly reports. Set up a quarterly report. On the other side of the spectrum, if you need a list of landing pages for ad campaigns crawled while you are actively spending on advertising, you can set them up for weekly or even daily crawls.
  • Write the Include and Exclude Regex rules for each segment. You only need to crawl pages people can actually work on, not bogus pages with endless strings of parameters. To find the parameters, UTMs and similar URL noise worth excluding, go through your Google Analytics account and note which query parameters show up in your page URLs.

  • Set up the Include and Exclude rules in Deepcrawl. This is the point where you actually apply the Regex; a short sketch of how such rules behave follows this list.
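
To make this concrete, here is a minimal sketch in Python of how a pair of Include and Exclude regex lists decides whether a URL belongs to a segment. The domain and patterns are made up for illustration; Deepcrawl applies its own matching internally, this only shows the logic.

    import re

    # Hypothetical rules for a "Documentation" segment. The domain and the
    # patterns are illustrative, not taken from a real configuration.
    INCLUDE = [r"^https://docs\.example\.com/"]        # crawl only the docs subdomain
    EXCLUDE = [r"[?&](utm_[a-z]+|sessionid|sort)=",    # tracking and sorting parameters
               r"/search/"]                            # internal search result pages

    def in_segment(url: str) -> bool:
        # A URL belongs to the segment if it matches at least one Include
        # rule and none of the Exclude rules.
        included = any(re.search(p, url) for p in INCLUDE)
        excluded = any(re.search(p, url) for p in EXCLUDE)
        return included and not excluded

    print(in_segment("https://docs.example.com/install/linux"))               # True
    print(in_segment("https://docs.example.com/install/linux?utm_source=x"))  # False
    print(in_segment("https://community.example.com/thread/123"))             # False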

For the marketing website, assuming you want to crawl everything on the root domain that is not part of some other team’s responsibilities, keep the Include rule open and use the Exclude rule extensively for everything you do not want in the crawl, both subdomains and parameters.
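
As a rough sketch of that setup (with hypothetical subdomains and parameter names), the marketing-site segment leaves the Include side open and lets the Exclude rules do the heavy lifting:

    import re

    # Hypothetical marketing-site rules: the Include rule stays open (the whole
    # root domain), and the Exclude rules drop other teams' subdomains plus
    # tracking and session parameters. All names here are made up.
    EXCLUDE = [
        r"^https://(docs|community|investors)\.example\.com/",  # other teams' sections
        r"[?&](utm_[a-z]+|gclid|fbclid|sessionid)=",             # URL parameter noise
    ]

    def keep_in_marketing_crawl(url: str) -> bool:
        return not any(re.search(p, url) for p in EXCLUDE)

    print(keep_in_marketing_crawl("https://www.example.com/product/demo"))          # True
    print(keep_in_marketing_crawl("https://docs.example.com/api/"))                 # False
    print(keep_in_marketing_crawl("https://www.example.com/blog/?utm_source=ads"))  # False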

If you want to crawl a certain subdomain, say the Community website, then use the Include rule to include the Community website only, and the Exclude rule to remove all unnecessary URL parameters from its URLs. For user-generated content such as Community, Forums and Q&A platforms, this is critical: they are notorious for generating massive amounts of bogus URLs, so you need to make sure you only crawl what needs crawling.
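
A matching sketch for the Community segment (again with made-up patterns) narrows the Include rule to the subdomain and strips the parameterised duplicates that user-generated platforms tend to produce:

    import re

    # Hypothetical Community-segment rules: Include narrows the crawl to the
    # community subdomain, Exclude drops the parameterised duplicate URLs.
    INCLUDE = [r"^https://community\.example\.com/"]
    EXCLUDE = [r"[?&](sort|filter|page|lang|utm_[a-z]+)="]

    def keep_in_community_crawl(url: str) -> bool:
        return (any(re.search(p, url) for p in INCLUDE)
                and not any(re.search(p, url) for p in EXCLUDE))

    print(keep_in_community_crawl("https://community.example.com/t/how-to-install"))           # True
    print(keep_in_community_crawl("https://community.example.com/t/how-to-install?sort=new"))  # False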

Deepcrawl and similar tools have lots of very useful additional settings you will want to configure, but that is a topic for another post. For the segmentation task, we are done.

Run, Review and Prioritize the Recommendations

Run the crawling reports as often as you have the resources to execute the recommendations. There is no need for a weekly or a monthly crawl if your developers can only work on them twice a year. Monthly or quarterly reports are okay for most businesses.

If quarterly reports also feel like a stretch, you have a bigger problem to solve than which reports to run: you need to find technical help to support your website. I would focus on that first.

Once this step is completed, all SEO recommendations coming from these reports need to be reviewed, prioritized and aligned with the web team’s workload. Crawlers report every case where a set of pages deviates from a certain standard. That is not necessarily an SEO issue, but it will always result in tens of recommendations.

The resulting report is a long list of flagged issues; in our case, 46 of them.

You do not need to fix all 46. They are not even all issues; some are just deviations from a certain threshold.

Reviewing and prioritizing them means finding the top three that impact organic visibility the most, and it is good practice to fix just three recommendations each month.
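
If it helps to formalise the triage, here is a minimal sketch of the impact-versus-effort sort we would otherwise do in a spreadsheet, expressed in Python. The issue names and scores are invented for illustration, not real crawl output.

    # Score each crawler finding by expected SEO impact versus implementation
    # effort (both on a 1-10 scale here) and keep the top three for this month.
    findings = [
        {"issue": "Broken internal links",       "impact": 9,  "effort": 3},
        {"issue": "Missing meta descriptions",   "impact": 4,  "effort": 2},
        {"issue": "Non-indexable landing pages", "impact": 10, "effort": 5},
        {"issue": "Thin paginated archives",     "impact": 3,  "effort": 6},
        {"issue": "Slow mobile templates",       "impact": 8,  "effort": 8},
    ]

    top_three = sorted(findings, key=lambda f: f["impact"] / f["effort"], reverse=True)[:3]
    for f in top_three:
        print(f["issue"])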

Execute the Recommendations

Executing the recommendations comes down to a few very simple steps, none of which needs much explanation.

Documenting the recommendations

We document these in a Google Doc that serves as a log of all recommendations over time, with the most recent on top. This makes it easy to see what the issues were in any given month and helps a lot with communication.

Communicating them to the web team

The best way to do this is to hold a 30-60 minute call for each report with a representative of the tech team that will be executing the recommendations for that segment. Explain the highlights of the report and why these three items are a priority from an SEO standpoint, and give them the opportunity to ask questions. These are the people who built and support the website; they know it best, so listen carefully, because you can learn a lot on these calls.

QA the results on the next crawl

Quality assurance is very simple here because another comparison crawl will run next month or quarter. All changes since the last crawl will be fully crawled and documented so you will be able to assess how the recommendations have been executed.

Combined with a monthly assessment of traffic and rankings, it is easy to check how impactful these recommendations have been. Most of all, you will know that, in terms of technical SEO, you are doing just fine.
