Lumar (formerly Deepcrawl)


Software Development

New York, NY · 21,436 followers

Find, prioritize, and fix website issues at scale — it’s easy with Lumar.

About us

The technical health of a website plays a major role in both user experience and search engine visibility, which are key influences on any business's brand awareness, customer satisfaction, and sales. Lumar is the platform of choice for the enterprise companies and teams behind complex, revenue-driving websites around the globe, helping them identify the issues preventing their sites from reaching their full potential: technical SEO, web accessibility, site speed, and even bespoke metrics that capture practically anything from the HTML of a web page.

Lumar re-imagines the way you manage technical SEO, accessibility, and other website health issues. It enables proactive, connected workflows to identify, track, and fix issues across your domains and ease your development process. The Lumar platform gives you total control over your crawl strategies, eliminates data overload, simplifies monitoring and reporting, and automates QA testing. The results? Better prioritization, faster time to fix, and less time wasted fixing preventable issues. And ultimately, reduced costs.

Website
https://www.lumar.io
Industry
Software Development
Company size
51-200 employees
Headquarters
New York, NY
Type
Privately Held
Specialties
SEO, Website Monitoring, Web Architecture, Link Analysis, Custom Extraction, Diffing, Hreflang, Search Engine Optimisation, ROI, Organic Search, Website Architecture, Website Performance, Digital Marketing, Online Marketing, Web Performance, Accessibility, Website Accessibility, and Website Health

Products

Locations

  • Primary

    109 S 5th St

    900 Broadway

    New York, NY 11249, US

    Get directions
  • Warnford Court, 29 Throgmorton Street

    London EC2N 2AT, GB

    Get directions

Employees at Lumar (formerly Deepcrawl)

Updates

  • Lumar (formerly Deepcrawl) reposted this

    View Richard Barrett's profile

    Professional Services Director at Lumar & Google Gold Product Expert

    It's the end of Feb already! I wanted to sign off this performance month with a post about Lighthouse and how to use it to get actions for your website.

    First off, what is Lighthouse? It's an open-source tool built into Chrome that can be used to generate performance metrics for the active page (https://lnkd.in/eP3z4fxj). It's totally free to use and generates pretty detailed reports about exactly which items on the page are causing problems, along with suggestions on how to fix them and the potential benefit. Lighthouse is a great starting point for any sort of performance audit you want to conduct. Its only real downside is that using it in Chrome means you can only test one page at a time. More on that later (there's a scripted workaround sketched after this post).

    To get started with it in the browser, head to a page that you want to run a report on and open the Developer Tools (F12 on Windows, Option + Cmd + J on Mac). At the top of this new panel you will see a bunch of tabs, one of which should be "Lighthouse" - if you don't see it immediately, you may need to click the >> icon and select it from the dropdown. From here there are various options you can tweak depending on what you want to test, and a big blue button to start analysing. Clicking this button generates a report for the current page; make sure you leave the tab open, as shifting focus while it is calculating can mess up the report.

    Once the report is complete you will be presented with four scores, each out of 100. These indicate where problems were detected on the page, and each one breaks down into subsections with specific details about what is impacting the score. You also get screenshots of individual items, as well as the code that is triggering the effects.

    Typically the way I use Lighthouse is to gather details for multiple pages and group identical items together, the purpose being to tackle one problem that is impacting multiple pages. This is helpful for prioritising bigger site-wide impacts, and provides better justification when you are pitching these fixes to developers. Generally speaking, I would prioritise things that impact page loading, as these tend to have the biggest effect on users and can quickly compound when multiple things delay the page actually loading.

    Hopefully this was useful - and if not, hopefully this terrible drawing will make it worth your time! Hope you had a great February. Next month I will be taking on the big topic of AI (LLMs), so buckle up or something.

    • [Image: no alt text provided]
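    As a workaround for the one-page-at-a-time limit mentioned above, Lighthouse can also be scripted from Node. A minimal sketch, assuming the `lighthouse` and `chrome-launcher` npm packages are installed; the URL list is illustrative:

    ```typescript
    // Batch Lighthouse audits from Node, sharing one headless Chrome
    // instance across runs. URLs below are placeholders.
    import lighthouse from 'lighthouse';
    import * as chromeLauncher from 'chrome-launcher';

    const urls = ['https://example.com/', 'https://example.com/pricing'];

    async function auditAll(): Promise<void> {
      const chrome = await chromeLauncher.launch({ chromeFlags: ['--headless'] });
      try {
        for (const url of urls) {
          const result = await lighthouse(url, {
            port: chrome.port,
            output: 'json',
            onlyCategories: ['performance'],
          });
          // Scores come back as 0-1; multiply by 100 to match the DevTools UI.
          const score = (result?.lhr.categories.performance.score ?? 0) * 100;
          console.log(`${url}: performance ${score.toFixed(0)}/100`);
        }
      } finally {
        await chrome.kill();
      }
    }

    auditAll().catch(console.error);
    ```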
  • Lumar (formerly Deepcrawl) reposted this

    View Andrew Levey's profile

    Head of Product Marketing

    Working out how to effectively use AI in your SEO workflows? Well, here's the webinar for you! Find out how AI can streamline technical SEO, content optimization, and reporting - and the best ways to enhance (not replace!) human expertise with AI, with real-world examples and info on AI tools and techniques. Book your space here: https://lnkd.in/e4GS_eBm #techSEO #technicalSEO

    (Webinar Sign-Up) Smart Ways to *Actually* Use AI in Your SEO Workflows


    https://www.lumar.io

  • Lumar (formerly Deepcrawl) reposted this

    View Richard Barrett's profile

    Professional Services Director at Lumar & Google Gold Product Expert

    Continuing on from last week's post on web performance, I wanted to move into the most common point of discussion - Core Web Vitals.

    Core Web Vitals are interesting because Google has indicated that they have a direct impact on ranking - and generally, when Google openly says something impacts ranking, it's because they are trying to encourage all web owners to improve these parts of their sites.

    Core Web Vitals break down into three pieces: how long it takes your main content to load (LCP), how much parts of your site move around after it has loaded (CLS), and how quickly your site responds to interaction (INP).

    An especially important factor with CWV is that these are user-generated metrics, so a page won't have any values if not enough people visit it using the Chrome browser. This is vital to understand, as what is quick for a server to do might be painfully slow for a user on a mobile network. (There's a sketch of how these field metrics get collected after this post.)

    LCP is typically linked to overall server performance; it's rare for the main content to load slowly because of just one isolated element. This usually leads to looking at what is causing delays - if you are using a CMS like WordPress, then plugins are usually the main culprit. More complex websites might have slow data retrieval from databases, or caching problems.

    CLS is usually down to poor web space definition - specifically, an element loading onto a page without the space it will occupy having been set, so that when it pops in, everything around it gets pushed around. This leads to users clicking on the wrong items and is generally very frustrating. Typically it is solved by defining the dimensions an item will take up on the page, which means that even if the element is very slow to load, it won't interfere with the rest of the page.

    INP is the newest of the Core Web Vitals, having replaced First Input Delay (FID). Where FID looked only at the first interaction, INP calculates a score based on all interactions with a web page. Ultimately this metric was always about how responsive your site is to interaction, and that hasn't changed; now that all interactions are considered, though, it's a bit more representative of how a site actually performs when a user is clicking. Problems here are likely to be fundamental, such as pulling data on click events rather than pre-populating common actions like navigation bars, or actions that fetch new data often rather than relying on caches.

    Hopefully that was useful for understanding the underlying causes of problems within each of the CWV metrics. There is a lot more complexity to these, and to the way the final score is generated, which I want to go into next week, particularly with CrUX. Please enjoy my terrible picture of sharing your browsing with a bot.

    • [Image: no alt text provided]
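    Since Core Web Vitals are field metrics, they are typically collected from real visitors in the browser. A minimal sketch using Google's `web-vitals` npm package; the `/analytics` endpoint is a placeholder, not a real API:

    ```typescript
    // Collect the three Core Web Vitals from real users and beacon them
    // to a (placeholder) analytics endpoint.
    import { onCLS, onINP, onLCP, type Metric } from 'web-vitals';

    function report(metric: Metric): void {
      // `rating` is 'good' | 'needs-improvement' | 'poor' per Google's thresholds.
      const body = JSON.stringify({
        name: metric.name,
        value: metric.value,
        rating: metric.rating,
      });
      // sendBeacon survives page unload, which matters here: the final CLS
      // and INP values are only known once the user leaves the page.
      if (!(navigator.sendBeacon && navigator.sendBeacon('/analytics', body))) {
        fetch('/analytics', { body, method: 'POST', keepalive: true });
      }
    }

    onLCP(report); // how long the main content took to load
    onCLS(report); // how much the page shifted around after loading
    onINP(report); // how slowly the page responded to interactions
    ```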
  • Lumar (formerly Deepcrawl) reposted this

    View Richard Barrett's profile

    Professional Services Director at Lumar & Google Gold Product Expert

    Following last week's post on Core Web Vitals (CWV), I wanted to go into more detail about CrUX dashboards, as this is the main data provided by Google for monitoring your website's performance.

    If you are not familiar with CrUX, it is the Chrome User Experience data recorded when a Chrome user visits your website; the CWV are gathered and then passed to Google to use within its rankings. If you want to view your scores, you can see them in Search Console, but you can also generate your own dashboard via Google: https://lnkd.in/eKyadaxR. There is also a relatively new site intended to help dig into the data a bit more and understand the numbers: https://lnkd.in/eqvBUB4p

    The strength of these reports is in the overall trending for your website: if an issue was introduced and it impacts a large section of the popular pages, you should see an increase in negative scores - and vice versa for positive impacts. The weakness of these reports is that they don't tell you which pages are being used to calculate the scores, or how many. This limits their usefulness, as more investigation is required before you can start to take action.

    There are a few ways to use these reports. The primary use case is as an overall site health view, allowing you to trend the progress of your sitewide CWV over 12 months. This helps you keep an eye on broad positive and negative changes - the ones you want to catch as quickly as possible. A secondary use case is to track device performance, as the reports can be split by desktop and mobile; the difference between your mobile and desktop performance can give you a big clue as to where problems are.

    If you want to get more specific and do a page-by-page analysis, you can extract this information from the Google CrUX API (sketched after this post) or run a Lighthouse report for a page - but keep in mind that a Lighthouse report runs from your local machine. *warning: self-promotion* If you want to do this at scale, you can use Lumar, as we have Lighthouse crawls and the ability to pull in real user metrics (RUM) from the CrUX API.

    A final note about CrUX dashboards: because of the way Google lets you access the data, the information is effectively public, meaning you can just pop in a competitor's site and look at their data for the last 12 months - but they can do the same to your website. Useful to know if you are trying to figure out where you stand in the competitive landscape.

    Hope that was useful, and if you made it this far, as always, here is a terrible drawing (it's a Chrome UI joke, and not a good one).

    • [Image: no alt text provided]
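    For the page-by-page analysis mentioned above, the CrUX API can be queried directly. A minimal sketch, assuming a Google API key with the Chrome UX Report API enabled; the key and page URL are placeholders:

    ```typescript
    // Query the Chrome UX Report (CrUX) API for one page's field data.
    // API key and URL below are placeholders.
    const API_KEY = 'YOUR_API_KEY';
    const ENDPOINT = `https://chromeuxreport.googleapis.com/v1/records:queryRecord?key=${API_KEY}`;

    async function queryCrux(pageUrl: string): Promise<void> {
      const res = await fetch(ENDPOINT, {
        method: 'POST',
        headers: { 'Content-Type': 'application/json' },
        // Swap `url` for `origin` to get sitewide numbers; `formFactor`
        // splits the data by device, as the CrUX dashboards do.
        body: JSON.stringify({ url: pageUrl, formFactor: 'PHONE' }),
      });
      if (!res.ok) {
        // 404 usually means the page lacks enough Chrome traffic to have data.
        throw new Error(`CrUX API returned ${res.status}`);
      }
      const { record } = await res.json();
      // p75 is the value Google uses to judge each metric against its thresholds.
      for (const [name, data] of Object.entries(record.metrics)) {
        const p75 = (data as { percentiles?: { p75?: number | string } }).percentiles?.p75;
        console.log(name, 'p75 =', p75);
      }
    }

    queryCrux('https://example.com/').catch(console.error);
    ```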
  • View the Lumar (formerly Deepcrawl) organization page

    21,436 followers

    Lyssa-Fêe Crump: "SEO & web accessibility have shared goals. We want the same things. We all want clear site structure, to make it easier for search engines and users. We want descriptive links that use keyword relevance. We want faster loading times that improve UX. We want alt text and media descriptions. We want consistent navigation. We want accessible forms. We want readable content that benefits everybody. There is a lot of overlap between the accessibility standards and SEO guidelines, and the good news is, you might already be doing a lot of this without knowing." #webaccessibility #techseo #wtsfest

    View Natalie Stubbs's profile

    Senior Technical SEO at Lumar (formerly Deepcrawl)

    Accessibility is more than a tick-box exercise. It's also about more than your ROI (although the potential benefits to your bottom line can certainly help sweeten the deal!). Loved this talk on the importance of accessibility for every website and every user, from the inspiring Lyssa-Fêe Crump. #WTSFest

    • [3 images: no alt text provided]

Similar pages

View jobs

Funding

Lumar (formerly Deepcrawl): 6 total rounds

Last round

Unknown

US$5,999,738.00

See more on Crunchbase