User Testing with Quant UX, Part 3.

In this part you will learn more about heatmaps and the analytic canvas.

After getting some insights about our project, we are now diving deeper into screen KPIs and their meaning. Quant UX offers powerful indicators that show you how your test users handled the prototype.

Analytic Canvas

To open the analytic canvas, select the “Heatmaps” tab on the prototype overview page. Then select one of the heatmaps or click the “Analytic Canvas” button in the right corner.

What are heatmaps?

Heatmaps visualize where the users have clicked or moved the cursor. The more the users click on a certain area, the hotter (more reddish) the area gets. Thus, the elements in the area are likely important for the user.

Click Heatmaps

When you review click heatmaps, you should analyze them in the context of your use cases. Before you created the interface, you identified and prioritized user tasks and designed the interface accordingly. The primary elements should be easy to find, and you expect them to be used a lot.

If the primary elements are hot, your hypothesis was most likely right and the users behave as you expected. If the primary elements are cold, this usually indicates a problem: the users might not be able to find the elements, or they do not want to use the function. Unexpected hot areas indicate that the users behave differently than you thought.

Mouse Heatmaps

Cursor heatmaps work differently from click heatmaps. The longer the cursor rests over a certain part of the screen, the hotter it gets. Research shows some correlation between cursor movement and eye gaze. This means long hover times over a specific area can indicate strong user interest, but they can also mean that the user simply did not move the mouse. Often these heatmaps reflect a “reading pattern”, which typically takes an F-shaped form.

User Journey

The user journey shows how the users have navigated through the prototype. By default, the different journeys are merged, and common paths are shown in a warmer color. You can deselect the merge option in the properties panel to show the individual flows. In the properties section, you can also see the list of all user tests. You can toggle their visibility and also launch the screen recordings. The merged graph in our example shows that most of our testers took the right path in our planned user journey.


Scroll Visibility

The scroll visibility shows, for each screen, which parts of the screen were shown to the users. This is important if you have longer screens. Parts below the fold (the bottom of the viewport) are usually seen less often and are therefore shown in colder colors. The scroll visibility helps you detect whether the users explored the entire screen.


Scroll Time

The scroll time shows on which parts of the screen the users have spent most of their time. The more time the users spend on a given section, the warmer the color.

Screen Views

The view heatmap shows how many times a screen was seen by the users in relation to the other screens. Cold colors indicate that the majority of users have not seen the screen, which could be an indicator that the navigation is broken.

Dwell Time

The dwell time indicates how much time the users have spent on a screen. If you have, for instance, a screen where the users have to fill out a form, that screen is usually hot.

UI Element KPIs

When you select a widget (UI element) or a screen, you can also see certain KPIs that are related to it.


Widget clicks

The widget clicks tell you how many times a certain widget was clicked. This KPI relates directly to the heatmaps. The gauge shows the absolute number of clicks; the position of the ring shows the relation to all other widgets in the prototype.

Example: During the test, 100 clicks were recorded by 5 users. Widget A was clicked 20 times. The relative frequency is therefore 20%.
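To make the calculation concrete, here is a minimal Python sketch using the hypothetical numbers from the example (this is not Quant UX code); the same ratio applies to the other click-based KPIs below.

total_clicks = 100     # all clicks recorded across the 5 test users
widget_a_clicks = 20   # clicks that landed on widget A

relative_frequency = widget_a_clicks / total_clicks
print(f"Relative click frequency of widget A: {relative_frequency:.0%}")  # prints 20%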

First clicks

The first clicks tell you how many times a certain widget was clicked directly after a screen was loaded. The first clicks show which elements catch the most attention of the users. The gauge shows the absolute number, and the position visualizes the relation to the screen loads.

Example: A screen has two elements, A and B. The screen was loaded 10 times and 4 times element B was clicked immediately afterwards. The relative frequency is thus 40%.

Time before click

The time before click tells you how many seconds the users took on average until they interacted with the given element for the first time. In general, elements at the top of a screen should have shorter times than elements at the bottom.

Example: A screen is loaded and after 10s the user interacts with element A. In a second test, the user clicks on the element after only 2s. The average time before click is therefore 6s. If the time before click is too long, it indicates that the learning curve for your users is steep.
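As a minimal sketch, assuming the two hypothetical per-test timings from the example, the average could be computed like this (plain Python, not Quant UX internals):

times_before_click = [10, 2]  # seconds until the first interaction with element A, per test

average_time = sum(times_before_click) / len(times_before_click)
print(f"Average time before click: {average_time:.0f}s")  # prints 6s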

Test coverage

The test coverage tells you how many times a screen was tested. This metric indicates how easy the screen is to find. The gauge shows the absolute number of screen tests in the middle. The position of the ring indicates the relative test ratio.

Example: Your prototype has two screens and was tested by two users. The first user saw both screens, whereas the second user saw only the first screen. This means there are two tests. The relative frequency of the first screen is 100% because it was tested by every user, whereas the relative frequency of the second screen is 50%.
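A minimal Python sketch of this example, assuming each user is represented by the set of screens they saw (hypothetical data, not Quant UX internals):

screens_seen = [
    {"screen_1", "screen_2"},  # the first user saw both screens
    {"screen_1"},              # the second user saw only the first screen
]

total_users = len(screens_seen)
for screen in ("screen_1", "screen_2"):
    tests = sum(1 for seen in screens_seen if screen in seen)
    print(f"{screen}: {tests} tests, coverage {tests / total_users:.0%}")
# prints: screen_1: 2 tests, coverage 100%
# prints: screen_2: 1 tests, coverage 50%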

Dwell time

The average dwell time tells you how much time the users have spent on a screen on average. A high number might indicate that the users had to perform a lot of interactions, e.g. fill out a form. However, it can also indicate that the users had some problems, for instance finding the right elements. The gauge shows the absolute dwell time and also puts it in relation to the total test duration.

Example: Five tests were done, each taking exactly 60 seconds. The users spent 20, 30, 30, 30 and 40 seconds on the first screen. The average dwell time is 30 seconds, and the relative dwell time is 50% ((20 + 30 + 30 + 30 + 40) / (5 * 60)).
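Here is a minimal Python sketch of the dwell-time calculation, using the hypothetical numbers from the example (not Quant UX code):

test_duration = 60                   # seconds per test
dwell_times = [20, 30, 30, 30, 40]   # seconds spent on the first screen in each of the 5 tests

average_dwell = sum(dwell_times) / len(dwell_times)
relative_dwell = sum(dwell_times) / (len(dwell_times) * test_duration)
print(f"Average dwell time: {average_dwell:.0f}s")   # prints 30s
print(f"Relative dwell time: {relative_dwell:.0%}")  # prints 50%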

Screen views

The screen views tell you how many times a screen was shown. If this number is much higher than the overall “Test Views” (users), this indicates that the users came back to this screen often. The gauge shows the absolute number in the middle. The position of the ring indicates the relative frequency.

Example: Your prototype has two screens and was tested by two users. The first user saw both screens, whereas the second user saw only the first screen. This means there were three screen loads. The relative frequency of the first screen is thus 67% and of the second 33%.
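A minimal sketch of the screen-view counting, assuming each user's journey is a list of loaded screens (hypothetical data, not Quant UX internals):

journeys = [
    ["screen_1", "screen_2"],  # the first user loaded both screens
    ["screen_1"],              # the second user loaded only the first screen
]

all_loads = [screen for journey in journeys for screen in journey]
for screen in ("screen_1", "screen_2"):
    views = all_loads.count(screen)
    print(f"{screen}: {views} views, {views / len(all_loads):.0%} of all screen loads")
# prints: screen_1: 2 views, 67% of all screen loads
# prints: screen_2: 1 views, 33% of all screen loads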

Screen background clicks

The background clicks tell you how many times the users have clicked on the screen itself, and not on a widget. A high number often indicates problems, for instance that the users expect certain elements to be clickable. The gauge shows the absolute number, and the position indicates the relative frequency with respect to all clicks on the screen.

Example: During the test, 100 events were recorded by three users on a given screen A. 10 of these events were clicks on the screen background. The relative frequency is therefore 10%.

Screen widget clicks

The screen widget clicks tell you how many times the users have clicked on the UI elements of a screen. The number indicates how much “work” the users performed on that screen. The gauge shows the absolute number, and the position indicates the relative frequency with respect to all clicks on the screen.

Example: During the test, 100 events were recorded by three users on screen A. 90 events were on the five widgets of the screen. The relative frequency is therefore 90%.

This is it for now. If you have trouble, feedback, or further questions, join our community: https://spectrum.chat/quant-ux

