
Cross Platform Usability: Evaluating Computing Tasks Performed on Multiple Platforms


Conference

2017 ASEE Annual Conference & Exposition

Location

Columbus, Ohio

Publication Date

June 24, 2017

Start Date

June 24, 2017

End Date

June 28, 2017

Conference Session

Emerging Computing and Information Technologies I

Tagged Division

Computing & Information Technology

Page Count

13

DOI

10.18260/1-2--28090

Permanent URL

https://peer.asee.org/28090


Paper Authors

Brian Patrick, Brigham Young University

Brian Patrick is an upperclassman at Brigham Young University pursuing a BS in Information Technology. He is primarily interested in studying cyber-physical systems and related applications. His hobbies include participating in the maker community and building enthusiast PCs.

Richard G. Helps, Brigham Young University

Richard Helps holds degrees in electrical engineering and a PhD in instructional technology. His work focuses primarily on cyber-physical systems in IT, with related interests in HCI and instructional design. He is a member of ACM SIGITE, IEEE-CS, and ASEE.


Abstract

Evaluating the usability of software across diverse computing hardware, including desktop computers, laptops, tablets, and smartphones, is becoming increasingly important as completing the same computing tasks on multiple platforms becomes commonplace in daily life. This shift in computing expectations emphasizes the ability to compute horizontally, performing the same tasks in different computing environments, rather than vertically, performing different tasks in the same computing environment. In this paper, we propose a method by which the usability of a given piece of software can be comparatively analyzed across all hardware platforms on which it is available. The methodology is based on Majrashi and Hamilton's twelve factors of usability and has been extended to include specific applications and metrics for the following computing platforms: desktop computers, laptops, tablets, and smartphones.
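As an illustration only (the paper defines the actual factors and metrics; the factor names below are placeholders, not Majrashi and Hamilton's twelve), such an evaluation might be organized as a factor-by-platform scoring matrix:

    from dataclasses import dataclass, field

    PLATFORMS = ["desktop", "laptop", "tablet", "smartphone"]

    # Placeholder labels only; the methodology itself uses Majrashi
    # and Hamilton's twelve usability factors.
    FACTORS = ["effectiveness", "efficiency", "learnability", "satisfaction"]

    @dataclass
    class UsabilityMatrix:
        """Holds a 0-100 score for each (factor, platform) pair."""
        scores: dict = field(default_factory=dict)

        def record(self, factor: str, platform: str, score: float) -> None:
            if factor not in FACTORS or platform not in PLATFORMS:
                raise ValueError(f"unknown factor or platform: {factor}, {platform}")
            self.scores[(factor, platform)] = score

        def across_platforms(self, factor: str) -> dict:
            # Compare one usability factor horizontally, across platforms.
            return {p: self.scores.get((factor, p)) for p in PLATFORMS}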

These metrics focus on collecting quantitative data about the usability of each platform. This quantitative emphasis contrasts with recent studies, which have focused extensively on qualitative data about the user experience. Combining the two types of study allows researchers to capture a more comprehensive picture of software usability across computing platforms.
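As a minimal sketch (not the authors' instrument; the metric names here are assumptions), quantitative measures such as completion time and error count could be captured per task and per platform like this:

    import time

    class TaskMetrics:
        """Quantitative measurements for one task on one platform."""

        def __init__(self, task: str, platform: str):
            self.task = task
            self.platform = platform
            self.errors = 0
            self._start = None

        def begin(self) -> None:
            self._start = time.perf_counter()

        def error(self) -> None:
            self.errors += 1  # e.g. a wrong tap, click, or navigation step

        def finish(self) -> dict:
            elapsed = time.perf_counter() - self._start
            return {
                "task": self.task,
                "platform": self.platform,
                "completion_time_s": round(elapsed, 2),
                "error_count": self.errors,
            }

Aggregating such records over participants and tasks yields the quantitative picture described above, which can then be set alongside qualitative user-experience findings.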

The methodology has been validated through an initial study that applied the tasks and performance metrics on each of the defined computing platforms. The results of these initial tests have allowed us to improve the testing methodology and to form hypotheses for further testing. When used in conjunction with qualitative research, the proposed methodology can provide a reliable approach to cross-platform usability analysis.

Some considerations for the educational design of cross-platform methodology are also discussed.

Patrick, B., & Helps, R. G. (2017, June). Cross Platform Usability: Evaluating Computing Tasks Performed on Multiple Platforms. Paper presented at the 2017 ASEE Annual Conference & Exposition, Columbus, Ohio. 10.18260/1-2--28090

ASEE holds the copyright on this document. It may be read by the public free of charge. Authors may archive their work on personal websites or in institutional repositories with the following citation: © 2017 American Society for Engineering Education. Other scholars may excerpt or quote from these materials with the same citation. When excerpting or quoting from Conference Proceedings, authors should, in addition to noting the ASEE copyright, list all the original authors and their institutions and name the host city of the conference.