
This is an Open Access publication published under CSC-OpenAccess Policy.
Generating a Domain Specific Inspection Evaluation Method through an Adaptive Framework: A Comparative Study on Educational Websites
Roobaea S. AlRoobaea, Ali H. Al-Badi, Pam J. Mayhew
Pages - 88 - 116     |    Revised - 15-05-2013     |    Published - 30-06-2013
Volume - 4   Issue - 2    |    Publication Date - May / June 2013
Keywords: Heuristic Evaluation (HE), User Testing (UT), Domain Specific Inspection (DSI), Adaptive Framework, Educational Domain
The growth of the Internet and related technologies has enabled the development of a new breed of dynamic websites and applications that are growing rapidly in use and that have had a great impact on many businesses. These websites need to be continuously evaluated and monitored to measure their efficiency and effectiveness, to assess user satisfaction, and ultimately to improve their quality. Nearly all studies to date have used Heuristic Evaluation (HE) and User Testing (UT), which have become the accepted methodologies for the usability evaluation of User Interface Design (UID); however, the former is general and unlikely to encompass all usability attributes for every website domain, while the latter is expensive, time-consuming, and misses consistency problems. To address these shortcomings, a new evaluation method is developed that uses the traditional evaluations (HE and UT) in novel ways.
The lack of a methodological framework that can be used to generate a domain-specific evaluation method, which can then be used to improve the usability assessment process for a product in any chosen domain, represents a gap in usability testing. This paper proposes an adaptive framework and evaluates it by generating an evaluation method for assessing and improving the usability of a product, called Domain Specific Inspection (DSI), and then analysing that method empirically by applying it to the educational domain. Our experiments show that the adaptive framework is able to build a formative and summative evaluation method that provides optimal results with regard to the identification of comprehensive usability problem areas and relevant usability evaluation method (UEM) metrics, with minimal input in terms of the cost and time usually spent on employing UEMs.
CITED BY (8)  
1 Alqahtani, M. A., Alhadreti, O., AlRoobaea, R. S., & Mayhew, P. J. (2015). Investigation into the Impact of the Usability Factor on the Acceptance of Mobile Transactions: Empirical Study in Saudi Arabia. International Journal of Human Computer Interaction (IJHCI), 6(1), 1.
2 AlRoobaea, R., & Mayhew, P. J. (2014, August). The impact of usability on e-marketing strategy in international tourism industry. In Science and Information Conference (SAI), 2014 (pp. 961-966). IEEE.
3 Alroobaea, R., & Mayhew, P. J. (2014, August). How many participants are really enough for usability studies?. In Science and Information Conference (SAI), 2014 (pp. 48-56). IEEE.
4 AlRoobaea, R., Al-Badi, A. H., & Mayhew, P. J. (2013). The Impact of the Combination between Task Designs and Think-Aloud Approaches on Website Evaluation. Journal of Software and Systems Development, Vol. 2013, Article ID 172572, DOI: 10.5171/2013.172572.
5 AlRoobaea, R., Al-Badi, A. H., & Mayhew, P. J. (2013). Generating a Domain Specific Inspection Evaluation Method through an Adaptive Framework. International Journal of Advanced Computer Science and Applications (IJACSA), 4(6).
6 AlRoobaea, R. S., Al-Badi, A. H., & Mayhew, P. J. (2013). Generating an Educational Domain Checklist through an Adaptive Framework for Evaluating Educational Systems. International Journal of Advanced Computer Science and Applications (IJACSA), 4(8).
7 Alroobaea, R. S., Al-Badi, A. H., & Mayhew, P. J. (2013, November). Generating a Domain Specific Checklist through an Adaptive Framework for Evaluating Social Networking Websites. International Journal of Advanced Computer Science and Applications (IJACSA), Special Issue on Extended Papers from Science and Information Conference (p. 25).
8 AlRoobaea, R., Al-Badi, A. H., & Mayhew, P. J. The Impact of the Combination between Task Designs and Think-Aloud Approaches on Website Evaluation.
Miss Roobaea S. AlRoobaea
Faculty of Computer Science and Information Technology, Taif University, Saudi Arabia, and School of Computing Sciences, University of East Anglia, Norwich - United Kingdom
Mr. Ali H. Al-Badi
Department of Information Systems, Sultan Qaboos University - Oman
Dr. Pam J. Mayhew
University of East Anglia - United Kingdom