
Automated Usability Testing Tools


Faculty of Natural Sciences

Department of Computer Science

CSE2202: Human Computer Interaction

Assignment #1

April 30th, 2022

Tariq Harris 1037257


 Introduction 

An automated usability testing tool should take into account the properties of an application's graphical user interface, the sequence of actions users perform as they complete specific tasks, the users' behaviour and comments, and a description of those tasks. The tool should assess both the static and dynamic properties of the interface, as well as the navigational burden, and make recommendations for changes or templates that would improve usability. The results should be quick and simple to interpret, and they should be understandable to people who are not specialized testers.


2(a)  Definition of Key Terms

  • Usability - IEEE (1990) defines usability as "the ease with which a user can operate, prepare inputs for, and interpret outputs of a system or component," and it consists of five major attributes, as outlined by Le Peuple and Scane (2003) and Nielsen (1993): learnability, efficiency, memorability, errors, and satisfaction.

  • Usability Testing - There are numerous usability testing methodologies available, each with its own set of pros and cons. Heuristic evaluations are relatively inexpensive, quick, and simple to perform; however, Le Peuple and Scane (2003), Scholtz (2006), and Spolsky (2001) claim that such evaluations are likely to identify only about half of the actual problems, with a significant number of false problems raised and actual problems missed. User testing, when done correctly, is likely to identify the majority of the major usability issues because it involves real users attempting real tasks and is very useful for collecting feedback on subjective aspects of usability (such as satisfaction and aesthetic appeal).
           Attributes of Usability Testing include:

  • Ease of use – To check how easily a user can use the different functionalities of the application.

  • Ease of learning – To check how easily users can learn to use the application.

  • Memorability – To check how easily the user can remember the different flows of the application after exploring it for the first time.

  • Error rate – While using the application, how often do users make mistakes and how easily can they recover from those errors.

  • Level of satisfaction – This is a subjective attribute that deals with the satisfaction or the general opinion users have about the product.
2(b)  Two Usability Testing Tools for this Assignment 

  • Morae - Morae is a four-module commercial usability testing software package developed by TechSmith. The Recorder is installed on the test user's machine and records screen and system activity; the Remote Viewer allows testers to control the Recorder remotely and to view, hear, and annotate the recordings in real time; and the Manager Analysis module allows testers to isolate segments of a recording and provides a search editor to locate user actions quickly. Metrics such as time on task, number of clicks or pages viewed, and delay times are calculated automatically from time-stamped, indexed events on the video. The Manager Presentation module is used to edit, annotate (textually or with audio), and title video clips for use in Morae or in other programs such as Microsoft PowerPoint, optionally with videos not recorded in Morae. By enabling more efficient collection and use of recorded data, Morae automates, and significantly reduces the cost of, conducting and analyzing a user test.

  • Watchfire Bobby (WebXACT) - Watchfire Bobby is a popular web accessibility testing tool, with a limited version available for free online as WebXACT. Bobby navigates a website (both local pages and web pages behind a firewall) to determine whether each page satisfies various accessibility requirements, such as readability by screen readers and the provision of text equivalents for all images, animated elements, and audio and video displays.
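Morae's automatically calculated metrics (time on task, number of clicks or pages viewed) can be illustrated from a time-stamped event log. The sketch below is a minimal illustration assuming a hypothetical log format; the event names and structure are assumptions, not Morae's actual data model.

```python
from datetime import datetime

# Hypothetical time-stamped event log of the kind Morae indexes on its
# recordings; the format and event names here are illustrative only.
events = [
    ("2022-04-30 10:00:00", "task_start"),
    ("2022-04-30 10:00:05", "click"),
    ("2022-04-30 10:00:12", "page_view"),
    ("2022-04-30 10:00:20", "click"),
    ("2022-04-30 10:01:30", "task_end"),
]

def _parse(ts):
    return datetime.strptime(ts, "%Y-%m-%d %H:%M:%S")

def task_metrics(events):
    """Derive time on task and interaction counts from the event log."""
    start = next(_parse(t) for t, e in events if e == "task_start")
    end = next(_parse(t) for t, e in events if e == "task_end")
    return {
        "time_on_task_s": (end - start).total_seconds(),
        "clicks": sum(1 for _, e in events if e == "click"),
        "pages_viewed": sum(1 for _, e in events if e == "page_view"),
    }

print(task_metrics(events))
# → {'time_on_task_s': 90.0, 'clicks': 2, 'pages_viewed': 1}
```

Because the events are already indexed by time, such metrics come almost for free, which is where the cost saving over manual video review comes from.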
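One of the accessibility requirements Bobby/WebXACT checks, the provision of text equivalents for images, can be sketched with a simple HTML scan. This is a minimal illustration using Python's standard-library parser, not the actual implementation of either tool.

```python
from html.parser import HTMLParser

class AltTextChecker(HTMLParser):
    """Flags <img> tags lacking a text equivalent (alt attribute),
    one of the checks accessibility tools like Bobby/WebXACT perform."""
    def __init__(self):
        super().__init__()
        self.problems = []

    def handle_starttag(self, tag, attrs):
        if tag == "img" and "alt" not in dict(attrs):
            # Record the image source so the tester can locate it.
            self.problems.append(dict(attrs).get("src", "<unknown>"))

page = ('<html><body>'
        '<img src="logo.png">'
        '<img src="chart.png" alt="Sales chart">'
        '</body></html>')
checker = AltTextChecker()
checker.feed(page)
print(checker.problems)  # → ['logo.png']
```

A real checker would also crawl linked pages and apply many more rules, but the per-page, per-rule structure is the same.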

2(c)  Design for the experiment to test tools in 2(b)

To evaluate an application's usability, or to promote good usability given a specific problem or scenario, the tool must collect and interpret information about various aspects of the application. This information ranges from the specific to the general: from details of the HDA's control-level implementation, to user interaction sequences with its forms, to high-level descriptions of the problem domain.

  • GUI Description - XML or other scripting languages can be used to fully describe an HDA's interface in the same way that HTML describes the layout of web pages. A parser should be included in the tool to interpret such scripts and extract relevant data for analysis.
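Such a parser can be sketched with Python's standard XML library. The element and attribute names below are assumptions for illustration; they are not a real HDA description schema.

```python
import xml.etree.ElementTree as ET

# Illustrative GUI description in XML, analogous to how HTML describes
# web-page layout; the schema here is hypothetical.
gui_xml = """
<form name="login">
  <button text="OK" width="80" height="24" enabled="true" visible="true"/>
  <dropdown items="12" width="120" height="24"/>
</form>
"""

def extract_controls(xml_text):
    """Parse a GUI description and collect each control's properties
    for later analysis (e.g. the checks listed in Table 1)."""
    root = ET.fromstring(xml_text)
    return [(child.tag, dict(child.attrib)) for child in root]

for tag, props in extract_controls(gui_xml):
    print(tag, props)
```

The extracted (control, properties) pairs can then be checked against usability rules, for example flagging buttons that are too small to click comfortably.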



Control / component   Properties
Button                Width, height, text font type and size, position, enabled/disabled, visible/not visible
Table                 Width, height, position, number of columns, width of each column, data type for each column, text font type and size, enabled/disabled, visible/not visible
Menu                  Number of items, greatest depth (number of levels)
Dropdown list         Width, height, position, number of items

Table 1: GUI information examples

  • Recording user interaction sequences - Observing and video/audio-recording a user's interactions with an HDA, or with an equivalent working prototype, is the traditional method of capturing how a user interacts with an HDA. The time and resources required for data collection and subsequent analysis make this one of the most expensive processes in a usability evaluation. Moreover, while most tape media are adequate for displaying images of people, they lack the resolution required to legibly reproduce what appears on computer displays. In addition, the presence of observers, cameras, and other recording equipment frequently distracts or intimidates test users, causing them to perform poorly. A different recording procedure is clearly required.
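One alternative recording procedure is software-side capture: the application instruments its own event handlers and appends time-stamped entries to a log, avoiding cameras entirely. The sketch below is a minimal illustration; the class and method names are assumptions, not any existing tool's API.

```python
import time

class InteractionLogger:
    """Minimal sketch of software-side interaction capture: instead of
    taping the screen, instrumented event handlers record time-stamped
    (control, action) entries that an analysis module can index later."""
    def __init__(self):
        self.log = []  # list of (timestamp, control, action) tuples

    def record(self, control, action):
        self.log.append((time.monotonic(), control, action))

    def time_on_task(self):
        """Elapsed seconds between the first and last recorded event."""
        if len(self.log) < 2:
            return 0.0
        return self.log[-1][0] - self.log[0][0]

    def count(self, action):
        """How many times a given action (e.g. 'click') occurred."""
        return sum(1 for _, _, a in self.log if a == action)

logger = InteractionLogger()
logger.record("username_field", "focus")
logger.record("ok_button", "click")
print(logger.count("click"), f"{logger.time_on_task():.3f}s")
```

Because capture happens inside the application, it is unobtrusive (no observer effect) and records control-level detail that video resolution cannot guarantee.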
  • User Behaviour and Comments - The traditional video recording practices provide useful footage of user behavior. Body language reveals the user's thoughts and feelings as they attempt the tasks; frowning, pauses, uncertain actions, and other expressions of confusion or frustration strongly suggest poor usability. The comments and thoughts of the users themselves are, of course, a good indicator of how usable the AUT was for the users. Although these tasks would undoubtedly be difficult to fully automate, according to Lee and Grice (2004), they should still be included in the usability evaluation of HDAs and thus the tool should aim to better support them.

  • Task Definition - Providing a means for developers to input task definitions would be a significant challenge, as these can be broad in scope and either generic or very specific to a particular HDA. However, for the purposes of templates or style guides, a more generic classification should suffice.
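Such a generic classification could be captured in a small structured record. The field names below are hypothetical, chosen only to illustrate the idea of matching tasks against templates.

```python
from dataclasses import dataclass, field

@dataclass
class TaskDefinition:
    """Hypothetical generic task classification for template or
    style-guide matching; the fields are illustrative assumptions."""
    name: str
    category: str            # e.g. "data entry", "search", "navigation"
    expected_steps: int      # rough navigational budget for the task
    controls_used: list = field(default_factory=list)

task = TaskDefinition("submit expense claim", "data entry", 6,
                      ["form", "table", "button"])
print(task.category, task.expected_steps)  # → data entry 6
```

A tool could then compare an observed interaction sequence against the task's expected step budget and flag tasks whose navigational burden exceeds it.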

2(d)  Analysis of Data Collected and Discussion of Results

The effectiveness of two existing software tools designed to automate or improve the efficiency of usability testing is examined in this section. The two tools discussed are typical of the types of tools that are currently commercially available. Table 2 describes how well each tool meets the functional requirements discussed earlier.






It can be seen that Morae focuses primarily on automating some processes of an actual user test, but does not attempt to perform any evaluation; testers must still perform this work themselves. WebXACT assesses the GUI, but does not take other aspects into account. It should be noted, however, that none of these tools capture the description of the tasks that the HDA was intended to support, nor do they provide patterns of proven paradigms that promote good usability.

Conclusion

Automating aspects of usability testing can increase testing efficiency and make
integration with the development process easier. An ideal automated usability testing tool
should capture a variety of inputs, perform analyses on various aspects of usability,
clearly present results, be simple and flexible to use, and be usable throughout
development. None of the existing tools mentioned meet all of the requirements.
Notably, none of the tools can recommend good usability solutions; they can only
perform evaluations.


References

Institute of Electrical and Electronics Engineers (1990). IEEE Standard Computer Dictionary: A Compilation of IEEE Standard Computer Glossaries. New York, NY.


Le Peuple, J., & Scane, R. (2003): User Interface Design. Crucial, a division of Learning Matters Ltd.


Nielsen, J. (1993): Usability Engineering. Academic Press, Inc.


Scholtz, J. (2006): Usability Evaluation. http://www.itl.nist.gov/iad/IApapers/2004/Usability%20Evaluation_rev1.pdf


Spolsky, J. (2001): User Interface Design for Programmers. Apress, New York, NY.


Morae: Usability Testing for Software and Websites. http://www.techsmith.com/morae.asp


Watchfire: Accessibility Testing. http://www.watchfire.com/products/webxm/bobby.aspx
