SEO Hygiene for B2B Software Companies with Deepcrawl
Lily Grozeva
Driving SEO for B2B tech companies like Neo4j, Cribl, and Extreme Networks. Excited about helping brands transition from classic Google SEO to AI Search.
Technical SEO reports exist in different shapes and forms, and they are probably the most popular SEO service out there. Everyone in digital marketing offers them, and they can range from a 25-page tech SEO audit, to monthly or quarterly crawls with a few prioritized issues, to before-and-after crawls for big web projects like migrations and rebranding.
All these formats can be very impactful for the organic visibility of the business, especially if it has been a while since the last check. The longer that period, the bigger the chance that serious issues have piled up and that fixing them would bring significant improvements. That only sounds like good news at first glance. In reality, the site has been ranking below its potential, with technical SEO and UX issues sitting on the website for a while, which translates into lost rankings, traffic and revenue.
The way to avoid this is to set up a website crawling system that lets you regularly crawl, review and prioritize key issues based on impact and effort, then communicate the resulting recommendations to the right team for execution.
Read on to learn how to set up such an SEO hygiene system.
Choose the Right Tooling
For this setup you will need crawling software such as Deepcrawl or OnCrawl, Google Sheets or Excel, and very basic knowledge of regular expressions (regex).
At Verto Digital we use Deepcrawl and Google Sheets because they are best-in-class tools that cover a range of SEO use cases. Both are web-based, which enables a whole new level of features, like always working with the latest document version and keeping a full archive of previous versions.
However, you can use any web crawler and spreadsheet with very similar results.
These are paid tools, so you will need to allocate a budget for them, but it is worth the investment. Having such a system in place not only takes care of SEO and UX hygiene but is also invaluable in web projects like migrations, rebrands and redesigns.
Segment Your Website by Execution Teams
B2B software websites tend to have a very similar structure. There is the marketing website, which includes the product, demo, blog, webinar, whitepaper and similar pages, and is usually owned by the marketing organization.
Then there are the content-rich sections like Documentation, Community, Forums, Investors and Newsroom, which are owned by other, separate teams.
This is where the challenge comes from.
Many SEO factors are sitewide, so all sections of the website need basic SEO and UX hygiene.
That is a very difficult task to complete when several different teams are involved.
To make it even more complex, orchestrating this effort is a recurring task, so you definitely need to simplify as much as possible.
Map the crawl reports to the organizational structure. This helps secure each team’s attention and accountability.
Setting up each report requires only basic regex knowledge. You can follow this logic.
For the marketing website, assuming you want to crawl everything on the root domain that is not part of some other team’s responsibilities, keep the Include rule open and use the Exclude rule extensively for everything you do not want in the crawl, both subdomains and URL parameters.
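As a rough illustration, the Exclude rule for the marketing-site crawl might contain one pattern per line along these lines. The hostnames and parameter names are placeholders, and the lines starting with # are explanatory comments rather than part of the patterns:

# Subdomains owned by other teams
docs\.example\.com
community\.example\.com
forums\.example\.com
investors\.example\.com
# Tracking and session parameters
[?&]utm_[a-z]+=
[?&](sessionid|gclid)=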
If you want to crawl a specific subdomain, say the Community website, use the Include rule to include only the Community website, and the Exclude rule to remove all unnecessary URL parameters from its URLs. For user-generated content such as Community, Forums and Q&A platforms, this is critical: they are notorious for generating massive amounts of bogus URLs, so you need to make sure you only crawl what needs crawling.
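Again with placeholder names, the Include rule could hold a single pattern that matches only the Community subdomain, and the Exclude rule could strip out the noisiest user-generated parameters. The parameter names here are hypothetical, so check your own crawl data for the real offenders:

# Include
https?://community\.example\.com/
# Exclude
[?&](sort|order|filter|page|reply_to)=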
Deepcrawl and similar tools have lots of other useful settings you will want to configure, but that is another post. For the segmentation task, we are done.
Run, Review and Prioritize the Recommendations
Run the crawling reports as often as you have the resources to execute the recommendations. There is no need for a weekly or a monthly crawl if your developers can only work on them twice a year. Monthly or quarterly reports are okay for most businesses.
If quarterly reports also feel like a stretch, you have a bigger problem to solve than which reports to run: you need to find technical help to support your website. I would focus on that first.
Once this step is completed, all SEO recommendations coming out of these reports need to be reviewed, prioritized and aligned with the web team’s workload. Crawlers report every case where a set of pages deviates from a certain standard. That is not necessarily an SEO issue, but it will always result in tens of recommendations.
In a recent crawl, for example, the report flagged 46 issues. You do not need to fix all 46. They are not even all issues; some are just deviations from a certain threshold.
Reviewing and prioritizing them means finding the top three that impact organic visibility the most, and it is good practice to fix just three recommendations each month.
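As a minimal sketch of that impact-versus-effort triage, here is one way to score and rank the flagged items in Python. The issue names and scores are made up for illustration, not real crawler output:

# Hypothetical example: rank flagged issues by impact vs. effort and pick the top three.
issues = [
    {"name": "Broken internal links", "impact": 5, "effort": 2},
    {"name": "Missing canonical tags", "impact": 4, "effort": 1},
    {"name": "Thin title tags", "impact": 3, "effort": 2},
    {"name": "Images missing alt text", "impact": 2, "effort": 3},
]

# High impact first, then low effort as a tie-breaker.
top_three = sorted(issues, key=lambda i: (-i["impact"], i["effort"]))[:3]
for issue in top_three:
    print(f'{issue["name"]}: impact {issue["impact"]}, effort {issue["effort"]}')

The same scoring works just as well in a spreadsheet column; the point is to make the trade-off explicit before anything goes to the web team.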
Execute the Recommendations
Executing the recommendations comes down to a few very simple steps, none of which need much explanation.
Documenting the recommendations
We document these in a Google Doc that serves as a log of all recommendations over time, with the most recent on top. This makes it easy to see what the issues were in any given month and helps a lot with the communication part.
Communicating them to the web team
The best way to do this is to hold a 30-60 minute call for each report with a representative of the team that will be executing the recommendations for that segment. Explain the highlights of the report, why these three items are a priority from an SEO standpoint, and give them the opportunity to ask questions. These are the people who built and support the website; they know it best, so also listen carefully, because you can learn a lot on these calls.
QA the results on the next crawl
Quality assurance is very simple here because another comparison crawl will run next month or quarter. All changes since the last crawl will be fully crawled and documented, so you will be able to assess how the recommendations have been executed.
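A minimal sketch of that check, assuming you export the URLs flagged for a given issue from each crawl as a CSV with a url column (the file names and column name are hypothetical and depend on your crawler's export format):

# Hypothetical example: compare two crawl exports of URLs flagged for the same issue
# and see which ones were fixed since the last crawl.
import csv

def flagged_urls(path):
    with open(path, newline="") as f:
        return {row["url"] for row in csv.DictReader(f)}

previous = flagged_urls("missing_canonical_q1.csv")
current = flagged_urls("missing_canonical_q2.csv")

print(f"Fixed since last crawl: {len(previous - current)}")
print(f"Still outstanding: {len(previous & current)}")
print(f"Newly flagged: {len(current - previous)}")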
In combination with a monthly assessment of traffic and rankings, it will be easy to check how impactful these recommendations have been, but most of all, you can be sure that in terms of technical SEO you are doing just fine.