The Xa11ytaire Experiment Part 2: The game’s at the Play Store and waiting for your feedback!
The Xa11ytaire Android app, with the Voice Access feature showing numbers to enable speech input to play the game.

This article invites your feedback on using the experimental Xa11ytaire game app with Android’s TalkBack, Voice Access, and Switch Access features.

 

Introduction

A while back I undertook an exploration into how a game of solitaire might be played with various input and output methods on Windows, the summary of that being The Sa11ytaire Experiment - End of Part 1. Since then, I wanted to explore Azure Custom Vision running locally on my Android phone, so I ported my Windows app to Xamarin, as described at The Sa11ytaire Experiment Part 3: Xa11ytaire and the local Azure AI. And just recently I had the opportunity to discuss the accessibility of that Xamarin.Android app at a Microsoft event, and that led to The Xa11ytaire Experiment – Setting the scene, with the goal of polishing up my Android app to a point where I can get feedback on its accessibility.

And this is where you come in.

The Xamarin app is now available at the Google Play Store, as Xa11ytaire.

The app’s still a work in progress, but I’m hoping it’s reached a point where people can try it out with the TalkBack screen reader, Voice Access, and Switch Access, and can provide feedback to help me prioritize improvements to the app.

Short videos of various input methods being used with the app are:

  1. The Xa11ytaire Android app being played with TalkBack
  2. The Xa11ytaire Android app being played with Voice Access
  3. The Xa11ytaire Android app being played with Switch Access

(Apologies for the lack of audio descriptions on these videos at the moment.) 

 

The most important lesson for me during this experiment

When I first worked on the Sa11ytaire app, I took the classic layout of a solitaire game, and considered how I might make it accessible. When I later started working on the Xa11ytaire app, I took the Sa11ytaire app and ported its design as-is to run on my small phone. I then interacted with that app through its visuals and touch, and felt it seemed to work pretty much as expected. While something about the app experience didn’t feel great, I told myself I’d done what I wanted to do. That is, get that traditional solitaire game working on my phone.



Figure 1: The original Xa11ytaire app layout, with lots of space taken up with individual face-down cards.


Then I started interacting with the app using a variety of input methods.

  1. TalkBack: When I swiped through the elements in the app, I moved through every face-down card in each list. Is that really the most efficient experience I can deliver? Why should I have to encounter (say) five identical “face-down card” elements in a list, and count them myself, rather than a single element representing the pile of face-down cards, whose accessible data can convey how many face-down cards it contains?
  2. Voice Access: Numbers appeared beside all the face-down cards, and so the cognitive load for the player increased, with no associated benefit. A face-down card can’t be moved, so why show numbers on every individual face-down card, when all it means is that the player has to spend more time deciding which of the Voice Access numbers on the screen are actually of interest? Why not have a single element representing all the face-down cards in a list, and reduce the count of Voice Access numbers shown for the face-down cards from (say) five to one?
  3. Switch Access: When navigating through the elements in a specific list, by default the highlight moves through all the face-down cards. Given that I can’t move those face-down cards, that’s hardly an efficient experience. Why not have a single element representing all the face-down cards in a list, and reduce the number of moves it takes to reach a face-up card of interest using Switch Access from (say) five to one?


There certainly seemed to be a pattern going on there. It would significantly improve the experience with TalkBack, Voice Access and Switch Access if I consolidated the set of face-down cards in each list, to be a single element.

And this is where the important lesson for me is in all this. Once I’d made that change to consolidate the face-down items, I realized that I’d not only improved the experience for players using TalkBack, Voice Access and Switch Access, but also for players leveraging the visuals shown in the app and using touch. Previously there was non-trivial space being used up by all the face-down cards in the app. And that meant that right at the heart of the game there was a fair amount of screen real estate used for elements that can’t be interacted with. It also meant that as the lists grow longer during the game, they may need to start scrolling simply due to all the face-down cards in the lists.

The change I decided to make following my experiences with TalkBack, Voice Access and Switch Access improved the experience for sighted players using touch, yet I might not have made the change were it not for the fact that I explored the experience with the various input methods.

So looking back, rather than me simply taking whatever UI design I had at the start, and trying to make it accessible, I really should have taken a step back and considered what UI design would enable the most efficient experience for all players, regardless of how they interact with their device.



Figure 2: The redesigned Xa11ytaire app with the consolidated face-down cards. TalkBack says “5 face down, in list pile 6” when encountering the face-down element highlighted in the screenshot.

 

I did say it’s a work in progress

Just a few quick notes on the current version of the app.

  • Yeah, it crashes once in a while.
  • The selection visual highlights need a little tidying up.
  • I need to work on a few more rules about exactly where cards can be placed.
  • I’ve not yet done thorough testing on screen resolutions other than what my phone supports.

I’ll continue to improve the app, and try my best to account for your feedback at the same time.

 

What about all the other interesting aspects of input and output in the app?

For this release of the app, I’ve been mostly considering the screen reader, speech input, and switch input experience. So I’ve not focused on other topics such as the use of colors in the app, or the experience when a physical keyboard is being used. All being well, I’ll get to that in a later release.

But that said, Magnification seems to work great without any specific work on my part.



Figure 3: The Xa11ytaire app running on my Android phone, with the phone’s Magnification feature turned on.

 

The Screen Reader experience

The most important thing I can do when setting off on the journey of building an accessible app is to use the standard controls that come with the UI framework I’m using, be that Win32, WinForms, WPF, UWP XAML, or, as in this case, Xamarin.Forms. I’ll always have less work to do around accessibility if I’m enhancing the experience delivered through standard controls, rather than trying to patch up inaccessible custom UI to be accessible.

For this app, every interactable element is a Button, an ImageButton, or an item in a ListView.

My next step is to enhance the UI through use of the Xamarin AutomationProperties.

AutomationProperties.Name

This is the concise, helpful, unique-amongst-its-peers, localized string that conveys the purpose of the control. If the control has visual text showing on it, then often the Name property is the same as that visual text. As it happens, none of the controls in the Xa11ytaire app show visual text, and so the Name property must be set on each control.

This is done in a couple of ways in the Xa11ytaire app.

The Names of the “Restart” Button and the dealt card pile ListViews never need to change, and so these are set to localized strings in XAML. For example, to set the Name of the second dealt card pile ListView:

AutomationProperties.Name="{i18n:TranslateExtension Text=Pile2}"

 (Localization in the app was implemented using the approach described at Localization.)

Note: One might ask why a ListView needs a Name given that you generally interact with the items in the ListView, not with the ListView itself. The answer is that a screen reader can announce the name of a ListView that the items are in, when the items are encountered. The game’s going to be much harder to play without knowing which ListView contains the item you’re interacting with, so it’s really important that the ListViews have helpful Names. (And note that TalkBack’s Verbosity influences exactly how much information gets included in an announcement when encountering an element.)
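In context, the attribute sits on the ListView element itself. A minimal sketch (the x:Name and ItemsSource here are illustrative assumptions, not the app’s actual markup; the AutomationProperties.Name usage matches the approach above):

```xml
<!-- Illustrative sketch: x:Name and ItemsSource are assumptions, but
     AutomationProperties.Name is set exactly as described above. -->
<ListView x:Name="CardPile2"
          ItemsSource="{Binding Pile2Cards}"
          AutomationProperties.Name="{i18n:TranslateExtension Text=Pile2}" />
```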

The other approach to setting the Name on an element is to use binding. Many of the controls in the game show visuals specific to the state of the control (for example, which playing card is represented by an item in a ListView). In this game, I added properties to the classes to which these controls are bound, and those properties provide the strings for the bound AutomationProperties.Name properties.

For example, for the Name of the target card piles, which might be (say) “Clubs pile” when the target pile is empty, and later some specific clubs playing card when the target pile is not empty, I have the following XAML:

AutomationProperties.Name="{Binding CardPileAccessibleName, Mode=OneWay}"
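On the C# side, the bound property lives on the class representing the card pile. A minimal sketch of what such a property might look like (the class shape, field names, and Card type here are assumptions, not the app’s actual code):

```csharp
using System.ComponentModel;

// Illustrative sketch: a card type exposing its accessible name.
public class Card
{
    public string AccessibleName { get; set; } // e.g. "3 of Clubs"
}

// Illustrative sketch: a pile exposing an accessible name that changes
// with its state. Raising PropertyChanged lets the OneWay binding to
// AutomationProperties.Name pick up the new name.
public class CardPile : INotifyPropertyChanged
{
    public event PropertyChangedEventHandler PropertyChanged;

    private Card topCard; // null while the pile is empty (assumption)

    public string Suit { get; set; } // e.g. "Clubs"

    public string CardPileAccessibleName =>
        topCard == null
            ? $"{Suit} pile"           // e.g. "Clubs pile"
            : topCard.AccessibleName;  // e.g. "3 of Clubs"

    public void SetTopCard(Card card)
    {
        topCard = card;
        PropertyChanged?.Invoke(this,
            new PropertyChangedEventArgs(nameof(CardPileAccessibleName)));
    }
}
```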


AutomationProperties.HelpText

This localized text provides supplemental helpful information about the element, where that information would not be appropriate to include in the concise Name of the element. (The Name conveys the purpose of the element, not supplemental information such as advice on how to interact with it.)

As it happens, I only added HelpText to one element, and even then it’s conditional. I bound the AutomationProperties.HelpText to a “HelpText” property that I added to the items in the ListViews that hold the dealt cards, as follows:

AutomationProperties.HelpText="{Binding HelpText, Mode=OneWay}"

By default, I set that HelpText to be an empty string, given that it typically didn’t seem particularly valuable to add any HelpText. But when the collection of movable cards in a pile becomes empty, the ListView is left with one item that becomes a placeholder for a King to be dropped on. For someone not too familiar with the game of solitaire, it might seem confusing as to why other cards can’t be moved to that list. So when the list is in that state, I set the HelpText to “Only a king can be placed here.”
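A minimal sketch of how that conditional property might look on the item class (the property and resource names here are assumptions, not the app’s actual code):

```csharp
// Illustrative sketch: HelpText is empty by default, and only provides
// guidance when the item is the empty-pile placeholder for a King.
// IsEmptyPilePlaceholder and AppResources.OnlyKingHelpText are assumed names.
public string HelpText =>
    IsEmptyPilePlaceholder
        ? AppResources.OnlyKingHelpText // localized "Only a king can be placed here."
        : string.Empty;
```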

If you have thoughts on how other HelpText might be useful in the app, I can update it accordingly.


AutomationProperties.IsInAccessibleTree

Some of the UI in the game involves a containing element, and then a bunch of elements within that container. For example, the items in the ListViews include a Grid which can contain such things as a StackLayout, Image, Frame and Label. However, when a sighted player views a playing card, they consider it a single entity. It is (say) the 3 of Diamonds, not a couple of containers with an image in them. Given that semantically, that collection of UI elements is the 3 of Diamonds, it must be conveyed as such to all players of the game. And that means as a player using a screen reader navigates through the UI in the game, they should encounter the entity that is the 3 of Diamonds, and not the individual Xamarin UI elements that go to make up that entity visually.

By default, the elements that make up some composite UI can be exposed to screen readers such that the player encounters many of those elements individually while navigating through the UI. This can be an irritating distraction, and the app should avoid delivering that experience where practical. By setting AutomationProperties.IsInAccessibleTree to false on the elements the player would not want to encounter, the screen reader will bypass them when navigating through the UI in the app.

AutomationProperties.IsInAccessibleTree="False"

While this is a really helpful feature of the AutomationProperties class, you’ll want to be sure that no helpful information is being conveyed by these elements which is not being conveyed elsewhere.
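As a sketch, the item template for a playing card might look something like this, with the inner visual elements hidden from the accessible tree while the containing Grid carries the card’s accessible name (the element structure and binding names here are illustrative assumptions, not the app’s actual markup):

```xml
<!-- Illustrative sketch: the Grid is the single entity the screen reader
     encounters; its inner visuals are excluded from the accessible tree. -->
<ViewCell>
    <Grid AutomationProperties.Name="{Binding CardAccessibleName, Mode=OneWay}">
        <StackLayout AutomationProperties.IsInAccessibleTree="False">
            <Frame AutomationProperties.IsInAccessibleTree="False">
                <Image Source="{Binding CardImageSource}"
                       AutomationProperties.IsInAccessibleTree="False" />
            </Frame>
        </StackLayout>
    </Grid>
</ViewCell>
```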

 

AutomationProperties.LabeledBy

The LabeledBy property can be a handy way of having one element’s accessible name set from another element’s data. Classic examples would be where a visual label precedes an edit control, combobox or list. The LabeledBy property could be used to have that edit control, combobox or list have its accessible name set from the preceding label. This avoids the need to add another string to your app that needs localizing.
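As a sketch of that classic pattern in Xamarin.Forms XAML (the element names here are illustrative; this markup isn’t from the Xa11ytaire app):

```xml
<!-- Illustrative sketch: the Entry takes its accessible name from the
     preceding Label, so no second localized string is needed. -->
<Label x:Name="playerNameLabel" Text="Player name" />
<Entry AutomationProperties.LabeledBy="{x:Reference playerNameLabel}" />
```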

As it happens, due to the UI design of the app, I didn’t leverage AutomationProperties.LabeledBy.



Figure 4: The TalkBack screen reader highlighting the 3 of Diamonds in the Xa11ytaire app, in preparation for moving it to the 4 of Spades.

 

Note: When TalkBack moves into a ListView and encounters an item, depending on its current Verbosity setting, it may attempt to include the count of items in the ListView in its announcement. At the moment, TalkBack always seems to announce an item count that’s two greater than the actual count of items in the ListView. I’ve not been able to prevent this, and perhaps this behavior is due to active issue Talkback reports incorrect number of list items.

 

The speech input experience

With the Voice Access feature in Android, I can have numbers shown visually by the elements in the app, and then speak the number of the element that I want to interact with. This all seemed to work by default, which was a joy to discover.

I have found that if I turn Voice Access on while a game is in progress, the items in the ListViews don’t get numbers shown on them, so I can’t play the game. Instead I need to start the game after turning on Voice Access.

I’ve not been able to find any way to control whether numbers appear by specific elements in the app. For example, the two cards that are partially obscured when turning over the three next cards are disabled and the player cannot interact with them. Yet the Voice Access numbers still appear on them. That’s not a big deal in this case, but if I had a choice, I think I’d ask for the numbers to not appear on those disabled elements, in order to reduce the cognitive load for someone playing the game.

 


Figure 5: Voice Access presenting numbers by all elements in the Xa11ytaire app.

 

The switch device experience

I’ve really barely started this part of the experiment, but I feel it’s worth sharing my experiences all the same. The only switch device I have that can connect to my Android phone is my Xbox Adaptive Controller (XAC). But as I understand things, there are no official resources saying that the XAC is compatible with an Android device. If that’s the case, then while I’d like to try out the device with my phone, I can’t feel confident that things will work.

Indeed, once I’d connected the XAC and phone via Bluetooth, and set the two big switches on the XAC to be the Next and Select buttons to navigate through and interact with the app, I couldn’t get Switch Access to work in the game. I can’t know exactly what the problem is, but the end result was that I couldn’t reliably navigate through the app, nor interact with all the elements in it.

The only combination of Switch Access settings that seemed to have potential for me was to have Switch Access navigate automatically through the app, while I used a single XAC button to select the highlighted element. While that mostly let me play the game, I still found that in response to some actions in the game, the big Switch Access menu of actions would appear, and I’d have to work to dismiss it before I could continue playing Xa11ytaire.

So really, the game does not seem playable with this switch device. I’d still value any feedback from players interested in this interaction method though, in case I get myself set up with a physical switch device which seems more compatible with my phone.

And as with Voice Access, I didn’t have to take any action as a dev to enable this form of input. Interestingly, Switch Access seems to bypass the disabled partially obscured cards near the top of the app, and that seems helpful.

And again, as with Voice Access, Switch Access doesn’t seem aware of all the elements in the app if Switch Access is started while the app’s running. The game needs to be restarted in the Xa11ytaire app before Switch Access will move to all the elements in the game.

 


Figure 6: An Xbox Adaptive Controller being used to move a 2 of Spades on to a 3 of Diamonds in the Xa11ytaire app.

 

Summary

I really have found this a most fascinating experiment so far, and I’m pleased that the exploration into how the app might be played with a screen reader, speech input, and switch input, led to a better experience for all players.

This is only the start of the experiment though. Please do let me know your thoughts on how the experience might be improved further, and I can try to update the app accordingly.

I’m looking forward to hearing from you!

Guy

