Participatory Heuristic Evaluations of Jeliot Mobile: End-users evaluating usability of their mlearning application
Self-archived version: final draft
Citation: Hassan, Muhammad Mustafa; Tukiainen, Markku; Qureshi, Adnan N. (2020). Participatory Heuristic Evaluations of Jeliot Mobile: End-users evaluating usability of their mlearning application. 2019 4th Technology Innovation Management and Engineering Science International Conference (TIMES-iCON): Proceedings, 1-6. doi: 10.1109/TIMES-iCON47539.2019.9024452.
Usability evaluations (testing and inspection) are of utmost importance in the development of learner-centered educational technology. Poorly designed learning tools place an additional interface-learning load on learners' memory and cognition, which may significantly reduce attention to the main content-learning goal. Nonetheless, while discounted inspections (by usability experts) are commonly reported for such tools, expensive user-based testing is a rarity, mostly due to limited budgets and tight timeframes. In this work, the authors propose running discounted inspections with special end-user (domain expert) evaluators who have prior experience in usability. The authors postulate that putting discounted inspections into a participatory context combines the benefits of both types of usability evaluation. It also reduces the costs typically associated with expensive user-based testing, thus removing hindrances. The argument is put to the test via participatory heuristic evaluation experiments on a case tool, with a large number of redundant evaluating groups to cross-validate the results. Over 164 end-users with some prior usability experience, divided into 45 evaluating groups, evaluated the case tool. The outcomes are promising. The dual-persona evaluators find a significant number of domain-related problems (missed by usability experts in discounted methods) as well as interface-related problems (missed by domain experts in testing), suggesting their usefulness in a mixed method. The authors thus advise evaluating the usability of low-budget educational technology projects with the proposed method.
Subjects: participatory heuristic evaluation; dual-persona evaluators; mixed inspection and testing; usability evaluations; mlearning
Link to the original item: http://dx.doi.org/10.1109/TIMES-iCON47539.2019.9024452
Related items (by title, author, creator and subject):
- Synthesis and preclinical evaluation of [11C]MA-PB-1 for in vivo imaging of brain monoacylglycerol lipase (MAGL). Ahamed, Muneer; Attili, Bala; van Veghel, Daisy; Ooms, Maarten; Berben, Philippe; Celen, Sofie; Koole, Michel; Declercq, Lieven; Savinainen, Juha R.; Laitinen, Jarmo T.; Verbruggen, Alfons; Bormans, Guy (Elsevier BV, 2017). MAGL is a potential therapeutic target for oncological and psychiatric diseases. Our objective was to develop a PET tracer for in vivo quantification of MAGL. We report [11C]MA-PB-1 as an irreversible MAGL inhibitor PET ...
- Age-related changes of cortical excitability and connectivity in healthy humans: non-invasive evaluation of sensorimotor network by means of TMS-EEG. Ferreri, Florinda; Guerra, Andrea; Vollero, Luca; Ponzo, David; Määttä, Sara; Mervaala, Esa; Iannello, Giulio; Di Lazzaro, Vincenzo (Elsevier BV, 2017). The sensorimotor cortical system undergoes structural and functional changes across its lifespan. Some of these changes are physiological and parallel the normal aging process, while others might represent pathophysiological ...
- Lee, K. A.; Hautamäki, V.; Kinnunen, T.; Larcher, A.; Zhang, C.; Nautsch, A.; Stafylakis, T.; Liu, G.; Rouvier, M.; Rao, W.; Alegre, F.; Ma, J.; Mak, M. W.; Sarkar, A. K.; Delgado, H.; Saeidi, R.; Aronowitz, H.; Sizov, A.; Sun, H.; Nguyen, T. H.; Sahidullah, Md; Vestman, V.; Halonen, M.; Kanervisto, A.; et al. (ISCA, the International Speech Communication Association, 2017). The 2016 speaker recognition evaluation (SRE'16) is the latest edition in the series of benchmarking events conducted by the National Institute of Standards and Technology (NIST). I4U is a joint entry to SRE'16 as the ...