A Web-Based Group Decision Support System for the Selection and Evaluation of Educational Multimedia

Mohammed N. A. Abdelhakim

Shervin Shirmohammadi

Department of Management Information Systems, Alain University of Science and Technology, United Arab Emirates

Multimedia Communications Research Laboratory (MCRLab) University of Ottawa, Canada

[email protected]

[email protected]

ABSTRACT

Multimedia is now a relatively mature field, having advanced over more than two decades. The combination of text, images, video, animation, etc., in both presentational and conversational form, is now a standard part of most computer applications, web-based and stand-alone alike. As such, Educational Multimedia (EMM) can be a great tool to improve teaching and learning. However, EMM selection and evaluation for higher education is a complex and interdisciplinary problem characterized by uncertainty, dynamics, explicit and implicit knowledge and constraints, and the involvement of different stakeholders. In this work, we use a domain-based, web-oriented Group Decision Support System (GDSS) for the selection and continuous evaluation of EMM for educational providers. We investigate the viability of developing and validating a web-based GDSS, integrated with knowledge management (KM) and instructional design, as a support tool to overcome the difficulties in the selection and evaluation of EMM. The proposed solution manages and supports the six phases of planning, intelligence, design, choice, implementation, and evaluation. In addition to the design and implementation, a performance evaluation is also presented, using data collected from experts, instructors, and EMM producers. The results reveal that the proposed solution can successfully help educational consumers in selecting and evaluating EMM.

Categories and Subject Descriptors

K.3.1 [Computers and Education]: Computer Uses in Education - Computer-managed instruction (CMI); H.4.2 [Information Systems Applications]: Types of Systems - Decision support.

General Terms

Management, Design, Verification.

Keywords

Educational Multimedia, Instructional Design, Knowledge Management, Web-Based Group Decision Support System.

Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. EMME'07, September 28, 2007, Augsburg, Bavaria, Germany. Copyright 2007 ACM 978-1-59593-783-4/07/0009...$5.00.

1. INTRODUCTION AND BACKGROUND

Providing quality education has become a major concern of many organizations, especially colleges and universities. Driven by the need to compete in a more global market, many universities are turning to new educational technologies to expand their markets and improve the flexibility of their offerings. These technologies include the Internet and multimedia used in an educational context. The combination of the Internet and multimedia has led to the creation of content-rich and easy-to-access applications that can be very beneficial to education, be it self-paced learning, mobile learning (mLearning), distance education, or group-based tele-learning, all possible in both synchronous (real-time) and asynchronous (offline) modes. This has greatly expanded traditional static educational content into dynamic interactive material and beyond the physical classroom. Multimedia can be defined as a computer program that includes text along with at least one of the following: audio or sophisticated sound, music, video, photographs, 3-D graphics, animation, or high-resolution graphics [17]. Accordingly, Educational Multimedia (EMM) is the use of multimedia in an educational setting. It sets out to take advantage of the features of multimedia in order to enhance and improve teaching and learning as a communication and presentation tool. The use of EMM has already offered opportunities and facilities to improve teaching, learning and assessment, in the form of tutorials, hypermedia, drill & practice, simulations, games, tests and web-based learning [1].

EMM is usually packaged as instructional material (content), assessment tools, and a course management system. However, the production of these packages, and of multimedia in general, can be time consuming and costly [21]. It takes time and effort to produce and/or modify EMM content; therefore, proper selection and evaluation of EMM becomes an important economic factor for EMM providers and universities in order to save both time and money. But the selection and evaluation of EMM systems is a complex problem involving many interrelated factors, including the diversity of the required knowledge, the characteristics of the problem, the competencies and interests of the stakeholders, and transparency and communication. The diversity of required knowledge itself spans different areas such as Internet technology, pedagogy, educational evaluation, software engineering, and others. EMM selection and evaluation is hence characterized by uncertainty, dynamic changes of the environment, explicit and implicit criteria and constraints, and the involvement of different stakeholders [29]. While there are many selection methodologies and evaluation approaches involving hundreds of criteria, there is no comprehensive model that can solve this complex problem. In this paper, we propose a system, based on a Group Decision Support System (GDSS), to contribute towards overcoming this complexity. We present our model, design, and proof-of-concept implementation, as well as performance evaluations conducted with real education professionals and students.

2. RELATED WORK

While much effort has been spent on developing EMM content, sufficient attention to ensuring its quality has usually been lacking [20]. Many commercial EMM developers do not properly evaluate the effectiveness of their instructional packages [31]. In response to this problem, it has been suggested that educators conduct their own evaluations [11]. Educators attempting to use EMM, such as universities and colleges, need to be able to select, evaluate and know how to use EMM for instructional purposes effectively and efficiently. However, very often, users of EMM (instructors and students) are not fully aware of the characteristics, features, types and limitations of these resources. As a result, they are often unable to critically assess their usefulness [2]. The selection and use of EMM that turns out not to be effective leads to the loss of both economic resources and time, and to failure in effectively supporting the educational process. Most of the methods and techniques for Commercial Off The Shelf (COTS) selection have a "closed world" assumption, in the sense that support is not provided for interaction between different stakeholders [29]. Lievertz [16] also points out the need for a new generation of software selection methodologies.

The available EMM evaluation models have a number of gaps. According to Pham [20], "some models are not sufficiently explicit to allow easy implementation in practice. Others even appear contradictory and are not substantiated by validation studies". A few EMM evaluation models provide some basic help for developing selection criteria, but they do not provide comprehensive models for managing the whole selection and evaluation process [30][22][29].

At the same time, Decision Support Systems (DSS) have evolved and are now standard practice in many fields. DSS are interactive computer-based systems and subsystems intended to help decision makers use communications technologies, data, documents, knowledge and/or models to identify and solve problems and make decisions [23]. When such a system is used by a group of people, it is referred to as a Group Decision Support System (GDSS). Consequently, a web-based GDSS is "a computer-supported cooperative work that delivers decision support information or decision support tools using a 'thin-client' Web browser like Netscape Navigator or Internet Explorer" [23]. Such a system can be used as a methodical and computerised solution for solving EMM problems with the above-mentioned characteristics, and this is the direction taken in this research. It should be noted, however, that acquiring a DSS is not as simple as obtaining a word processor: DSS are usually designed to handle complex situations, very few fully ready DSS solutions are available, and a DSS must therefore usually be custom designed, developed and implemented for each specific application [30]. Those DSS systems that are available off the shelf can support the decision-making phases up to the choice phase and give some recommendations for implementation, but they cannot support all the required problem-solving phases, such as implementation and evaluation, that are needed by EMM systems. Therefore, in this paper we present the process of developing and validating a domain-specific, web-based GDSS to support solving the problem of selection and evaluation of EMM. Before discussing the details of the system, some background on knowledge management, its application to EMM, and instructional design must be explained. This is discussed next.

3. KNOWLEDGE MANAGEMENT AND EMM

Knowledge management (KM) involves human resources, enterprise organization and culture, as well as the information technology, methods and tools that support and enable it [19]. KM typically follows six cyclic steps: create, capture, refine, store, manage/update, and disseminate the knowledge [30]. After the knowledge is disseminated, new knowledge is created and the cycle continues. Another perspective on KM is given in [12], where most of the basic KM processes are defined as follows: "To increase/create knowledge includes to discover, to research, to read, and to study knowledge. To capture knowledge includes to write, and to record knowledge. To refine knowledge includes to verify, to correct, to update, to augment, to clarify, and to generalize knowledge. To share knowledge includes to present, to publish, to distribute, and to discuss knowledge. To apply knowledge includes to plan, to decide, to design, to build, and to solve problems". Consequently, Knowledge Management Systems (KMS) are tools aimed at supporting knowledge management. KMSs evolved from information management tools that integrated many aspects of computer-supported collaborative work (CSCW) environments, which can directly apply to EMM systems through instructor/student interactions, with information and document management systems [7]. A KMS provides support for many information functions, such as acquiring and indexing, capturing and archiving; finding and accessing; creating and annotating; combining, collating and modifying; and tracking [5].
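To make the six-step cycle above concrete, the following minimal sketch (in Python, purely illustrative; the names and structure are ours and are not part of the ASP/SQL Server system described later in this paper) models a single knowledge item moving through the cycle:

from dataclasses import dataclass, field
from typing import List

# The six cyclic KM steps named above, in order.
KM_CYCLE = ["create", "capture", "refine", "store", "manage/update", "disseminate"]

@dataclass
class KnowledgeItem:
    title: str
    stage: str = "create"
    history: List[str] = field(default_factory=list)

    def advance(self) -> str:
        """Move to the next KM step; after dissemination the cycle starts again."""
        self.history.append(self.stage)
        self.stage = KM_CYCLE[(KM_CYCLE.index(self.stage) + 1) % len(KM_CYCLE)]
        return self.stage

item = KnowledgeItem("EMM selection criterion: interactivity")
for _ in range(6):
    item.advance()           # one full pass through the cycle
print(item.stage)            # back to "create"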

The relationship between KM and EMM becomes evident when the focus is shifted to instructional design. It is intuitive that the use of EMM in higher education as a learning or teaching tool should be grounded in an educational background, and Instructional Design (ID) is the key background for this problem. Earlier research defines ID as a decision-making process of identifying the most effective instructional methods under given conditions to achieve optimal outcomes [9][27]. More recently, ID has been defined as a systematic approach to instructional planning that typically involves a project team analysing a problem situation, exploring alternative performance support or instructional solutions, and then planning, implementing, evaluating, and managing the solutions [28]. The four general stages of any instructional design process are a) analysis, b) design, c) development, and d) implementation, with evaluation being a continuous task. Based on these definitions, ID can essentially be thought of as a problem-solving process. Most modern instructional design models include the process of evaluation, and there are generally two forms of evaluation in instructional design models: formative evaluation and summative evaluation [4]. The evaluation methods used in part as a standard in this research are the summative evaluation procedures. In our system, the expert judgment phase of summative evaluation was used for the selection of EMM as quality control, and the field trial phase was used to assess the impact of using EMM. To assure quality, the system supports the EMM acquisition and evaluation process. Since EMM is a learning and teaching tool, instructors, administrators, and universities need to be able to use instructional design models to carefully define the educational problem, the learners' needs, and the goals of using EMM, and then consider how EMM can assist them in achieving those goals.

3.1 KM in Instructional Design

ID is a complex, collaborative activity involving teams whose members are often distributed across different locations. Consequently, it is recommended to use a KMS to support ID. For example, the European Commission Fifth Framework Adapt IT project used a KMS in the design and development phases of ID [7]. Since KM technology can be integrated into an instructional delivery framework, it has the potential to affect patterns of interaction among those who design and develop instruction, such as instructional designers, developers, content experts, system integrators, graphic artists, and media specialists. Specifically, a KMS enhances communication, coordination, and collaboration within a team while improving long-term productivity by facilitating access, archiving, retrieval, and reuse of a variety of learning objects and instructional resources [18].


Today, many new web technologies are used in e-learning systems, and many organizations use e-learning and e-training for staff development. In these systems, e-learning addresses individual learning while KM widens the focus to organizational learning, and both share a common strategy of creating a learning organization [24]. However, the source of knowledge in KM is much broader. More and more KMS tools are used in e-learning systems, since knowledge and learning within the organization are supposed to be used by employees to improve their comprehension of the way the corporation works. In this regard, Ikehara [13] argues that although learning starts with individuals, individual learning does not necessarily lead to organizational learning. This necessitates tighter integration between organizational learning and other strategies such as staff development. There is clearly a gap here that needs to be filled.

3.2 Knowledge Management in Higher Education

Traditionally, universities have been the sites of knowledge production, storage, dissemination, and authorization. The rapidly expanding use of technology in teaching and learning, and the transformed economic basis upon which universities are instituted, have caused universities to change the ways in which knowledge is produced, stored, disseminated, and authorized [26]. The degree to which these changes are informed by strategic reasoning is proposed as an indicator of success. Higher education institutions have "significant opportunities to apply knowledge management practices to support every part of their mission" [14].

Both Kirschner [15] and Hansen [10] provide approaches for strategic reasoning to develop technical solutions to manage the knowledge base upon which higher learning depends. They recommend the use of a web-based KMS as a required key technology. The capacity of such systems to be scalable provides a university with the ability to be more agile in its response to dynamic conditions. In our model, this recommendation has been implemented: KM has been integrated with instructional design and with the EMM selection and evaluation decision-making process.

In short, one can identify two main shortcomings in existing systems that our proposed system tries to alleviate. The first is the need for a comprehensive conceptual evaluation model to manage the process of selection and evaluation of EMM. The second is the need for an online solution to make the process easy, accessible, systematic, flexible and interactive. With this in mind, let us now have a closer look at the proposed model.

4. THE PROPOSED MODEL FOR EMM SELECTION AND EVALUATION

The proposed model, shown in Figure 1, is a web-based GDSS that integrates the above-mentioned requirements and alleviates the aforementioned shortcomings. The model aims to be systematic, interactive, ready and easy to use, and highly available; to support zero-administration clients; and to provide security and privacy. It is also easily integrated with other related processes in higher education, and it uses KM and a web-based GDSS to support higher education at the strategic, tactical, and operational levels. The proposed model is, in part, based upon the decision-phases model discussed in [30], where five stages of problem solving are given: intelligence, design, choice, implementation, and evaluation. However, in this research, after the problem was investigated in depth and decomposed into strategic management and tactical management issues, a new "planning" phase was proposed and added. In Beckman's [3] framework for KM, the following stages are defined for knowledge management: identify, capture, select, store, share, apply, and create. The model proposed here develops the EMM selection and evaluation criteria using a KMS. Furthermore, the model is designed to support the process of instructional design, as explained earlier, where the process of selection and evaluation of EMM is part of that process. The model also makes use of the systematic evaluation procedures for interactive multimedia for education and training [25], additional evaluation criteria specific to each EMM learning methodology [6], and the criteria for evaluating multimedia instructional courseware [8]. With the above taken into account, let us now have a closer look at the phases shown in Figure 1.

4.1 Phases

The proposed model manages the EMM selection and evaluation processes through a series of six sequential phases, or steps. These steps represent a series of decisions and actions to be taken by the top management at the higher education institution and by the selection and evaluation team, each step building on the one before. The six phases are planning, intelligence, design, choice, implementation and evaluation. There is a continuous flow of activities from planning to intelligence to design to choice to implementation to evaluation (the continuous bold line in Figure 1). However, at any phase there may be a return to a previous phase (feedback) for corrective measures. The problem-solving steps flow from identifying the problem, sanity checking, and developing the knowledge of who will do what in the planning phase, to defining the problem in the intelligence phase, designing the decision model in the design phase, selecting the EMM in the choice phase, assessing the EMM's impact in the implementation phase, and, finally, assessing the satisfaction of the EMM users with the selected EMM and validating the entire proposed solution in the evaluation phase. At the end, the specifications required for each phase are developed, and users of the model should customize the processes to fit their unique requirements.
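As an illustration of this phase flow (a sketch under our own naming; the actual system was implemented as ASP pages, as described in Section 4.3), the six phases, the forward flow, and the feedback rule can be expressed as follows:

from typing import Optional

# Illustrative only: the six phases of the proposed model, the forward flow,
# and optional feedback to an earlier phase for corrective measures.
PHASES = ["planning", "intelligence", "design", "choice", "implementation", "evaluation"]

def next_phase(current: str) -> Optional[str]:
    """Normal forward flow; returns None once the evaluation phase is completed."""
    i = PHASES.index(current)
    return PHASES[i + 1] if i + 1 < len(PHASES) else None

def feedback(current: str, target: str) -> str:
    """Feedback: at any phase the team may return to a previous phase."""
    if PHASES.index(target) >= PHASES.index(current):
        raise ValueError("feedback must return to an earlier phase")
    return target

phase = "planning"
while phase is not None:
    # ... carry out the decisions and actions of the current phase here ...
    phase = next_phase(phase)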


4.2 Users and their roles

In DSS, user profiling usually has the goal of characterizing users along dimensions relevant to the requirements, such as the users' experience, their attitudes towards and expectations of the DSS, and their cognitive abilities and style. The users' roles and profiles, summarized in Table 1, are: Chief Knowledge Officer (CKO)/researcher, external expert, internal expert, instructor, and student; the system supports each of these roles within the virtual knowledge community.

Figure 1. Problem-solving model for EMM selection and evaluation using GDSS


In brief, the roles are:

a) Chief Knowledge Officer (CKO)/researcher/team manager: the CKO is responsible for managing the knowledge and acts as the team leader.

b) External expert: an external expert is an education technologist, instructional designer, multimedia expert, information technology specialist, EMM producer, or developer. Such an expert is hired and works from outside the EMM-providing institute.

c) Internal expert: an internal expert has the same background as the external expert, except that s/he is a regular employee of the EMM-providing institute.

d) Instructor: an internal subject-matter expert or instructor who is going to use the EMM within the institute for a specific course.

e) Student: student users of the EMM, taking specific EMM-enabled courses.

Table 1. Users' roles and their relationship with the phases. The columns of the table are Phase, Process, Researcher/CKO, External Expert, Internal Expert, Instructor, and Student; an asterisk marks the roles that participate in each process. The processes per phase are:

Planning: define stakeholders; user management; develop EMM selection criteria; develop model evaluation criteria; manage the knowledge.

Intelligence: define the problem; manage the problem.

Design: propose alternatives; set main criteria; develop/reuse sub-criteria; vote to select promising EMM; construct the problem; publish the problem.

Choice: vote for each alternative; preview problem report; sensitivity analysis.

Implementation: conduct pre-test; conduct post-test; preview pre-/post-test and t-test analysis.

Evaluation: student satisfaction questionnaire; preview student satisfaction report; instructor satisfaction questionnaire; preview instructor satisfaction report; validate the model; preview model evaluation report; document lessons learned.
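To illustrate how such a phase/process/role mapping can be represented in software, the following is a hypothetical Python sketch (our own naming; only the voting assignment noted in Figure 3 is filled in, the complete role-to-process assignments being those of Table 1):

# Hypothetical sketch: the phases, their processes, and a participation check.
PHASE_PROCESSES = {
    "planning":       ["define stakeholders", "user management",
                       "develop EMM selection criteria",
                       "develop model evaluation criteria", "manage the knowledge"],
    "intelligence":   ["define the problem", "manage the problem"],
    "design":         ["propose alternatives", "set main criteria",
                       "develop/reuse sub-criteria", "vote to select promising EMM",
                       "construct the problem", "publish the problem"],
    "choice":         ["vote for each alternative", "preview problem report",
                       "sensitivity analysis"],
    "implementation": ["conduct pre-test", "conduct post-test",
                       "preview pre-/post-test and t-test analysis"],
    "evaluation":     ["student satisfaction questionnaire",
                       "instructor satisfaction questionnaire",
                       "validate the model", "document lessons learned"],
}

# Example assignment (illustrative): voting on alternatives is done by the
# CKO, the internal experts, and the instructors, as noted in Figure 3.
PARTICIPANTS = {
    ("choice", "vote for each alternative"): {"cko", "internal expert", "instructor"},
}

def may_perform(role: str, phase: str, process: str) -> bool:
    """True if the role is recorded as a participant in this phase/process."""
    return process in PHASE_PROCESSES.get(phase, []) and \
           role in PARTICIPANTS.get((phase, process), set())

print(may_perform("instructor", "choice", "vote for each alternative"))   # True
print(may_perform("student", "choice", "vote for each alternative"))      # False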


4.3 The web-based GDSS design and implementation

A prototyping development methodology was used to design, implement, and deploy the system at Ajman University of Science and Technology in the UAE. Microsoft SQL Server 2000 and Active Server Pages (ASP) were used for the database and for the web application modules, respectively. The system's activity diagram is illustrated in Figure 2, where the system users and their roles are shown above the activity flow lines. Figure 3 shows the class diagram that implements the EMM selection process. This process includes the intelligence, design and choice phases: the problem is defined, the solution is designed, and the selection is based on the validated EMM criteria. The system generates a decision analysis report for every problem based on these data. The team is supported by a KM and group decision model, and everyone participates in the various tasks as shown in the figure.

Figure 2. The system's activity diagram: the users' activities (logon, problem identification, planning, developing the EMM selection and system validation criteria, selecting and using the EMM, conducting the impact evaluation, assessing student and instructor satisfaction, and validating the proposed system) and the corresponding system support (logon validation, mapping the required knowledge, user management, the main menu, dynamic related links, and the debate forum).

Figure 3. EMM selection process (class diagram). The diagram relates the main entities of the selection process: the problem, the EMM alternatives that refine it, the EMM criteria and their weighted sub-criteria (with reusable validated sub-criteria), the voters, the selection process itself, the sensitivity analysis, and the resulting proposed solution. As noted in the diagram, the selection process ranks the alternatives according to their response to the sub-criteria, from the most to the least successful, and the best-ranked alternative is chosen as the proposed solution; the voters are the Chief Knowledge Officer, the internal experts, and the instructors.
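The ranking rule described in Figure 3 amounts to a weighted scoring of each alternative against the sub-criteria. The following minimal Python sketch is purely illustrative: the criterion names, ratings, and function names are ours, and the actual system performed this computation in its ASP/SQL Server back end.

# Minimal weighted-scoring sketch of the selection process in Figure 3:
# each voter rates every alternative against the weighted sub-criteria,
# and the alternatives are ranked by their aggregated score.

weights = {"interactivity": 0.4, "pedagogical fit": 0.35, "technical quality": 0.25}

# votes[voter][alternative][sub_criterion] = rating on a 1-5 scale (made-up data)
votes = {
    "cko":        {"EMM A": {"interactivity": 4, "pedagogical fit": 5, "technical quality": 3},
                   "EMM B": {"interactivity": 3, "pedagogical fit": 4, "technical quality": 4}},
    "instructor": {"EMM A": {"interactivity": 5, "pedagogical fit": 4, "technical quality": 4},
                   "EMM B": {"interactivity": 3, "pedagogical fit": 3, "technical quality": 5}},
}

def rank_alternatives(votes, weights):
    """Average the voters' ratings per sub-criterion, apply the weights, and sort."""
    alternatives = {alt for per_voter in votes.values() for alt in per_voter}
    scores = {}
    for alt in alternatives:
        total = 0.0
        for crit, w in weights.items():
            ratings = [votes[v][alt][crit] for v in votes]
            total += w * sum(ratings) / len(ratings)
        scores[alt] = round(total, 3)
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

ranking = rank_alternatives(votes, weights)
print(ranking)   # best-ranked alternative first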

5. SYSTEM EVALUATION

As mentioned before, the application was developed and deployed as a password-secured web site at http://gdssformultimedia.com and tested with an experimental group of students at Ajman University. The students used the selected EMM and were asked to assess their degree of satisfaction by answering web-based questionnaires provided by the system itself. The expert users and the manager were also asked to evaluate the system. In this section we present some of the collected evaluation reports and validation feedback.

5.1 Collected reports

Figure 4 shows the problem report. The report itself is a graphical user interface that displays the problem, the promising alternatives, and the main criteria and their weights. It presents the voting information and highlights the proposed solution based on the calculated and analysed data. The form also supports conducting a sensitivity analysis of the solution. Figure 5 shows the students' satisfaction report.
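A sensitivity analysis of this kind can be realised by perturbing a criterion weight and checking whether the recommended alternative changes. The sketch below is our own illustration and reuses the rank_alternatives function and the votes data from the sketch shown after Figure 3:

# Illustrative sensitivity check: vary one sub-criterion weight and observe
# whether the top-ranked alternative changes (assumes rank_alternatives and
# votes from the earlier sketch are available in the same module).

def sensitivity(votes, weights, criterion, deltas=(-0.1, 0.1)):
    """Return the winning alternative for the original and each perturbed weight setting."""
    outcomes = {"original": rank_alternatives(votes, weights)[0][0]}
    for d in deltas:
        perturbed = dict(weights)
        perturbed[criterion] = max(0.0, perturbed[criterion] + d)
        total = sum(perturbed.values())                    # re-normalise to sum to one
        perturbed = {k: v / total for k, v in perturbed.items()}
        outcomes[f"{criterion}{d:+.1f}"] = rank_alternatives(votes, perturbed)[0][0]
    return outcomes

print(sensitivity(votes, weights, "interactivity"))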

5.2 Analysis

The solution was applied and validated empirically. The study triangulated the data obtained from the literature, questionnaire surveys, and workshops. Data were collected from experts, instructors, students, and EMM producers. For the online questionnaire surveys, 55 users (experts and educators) were invited by email; 44 experts and educators accepted the invitation: 10 EMM producers, 15 external experts, 11 internal experts, and 8 instructors, plus of course the CKO/researcher. Figure 6 shows the evaluations collected from this group. Furthermore, 50 students participated in two groups, 25 as a control group and 25 as an experimental group, with the collected data shown in Section 5.1. Moreover, within the period 2003-2006, four workshops were conducted at the institute (Ajman University of Science and Technology), one at the nearby University of Sharjah, and one at the University of Salford. These workshops supported the prototyping process and were used for the formative and summative evaluation of the proposed solution.
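The pre-/post-test comparison between the control and experimental groups (the "t-test analysis" listed in Table 1) corresponds to a standard two-sample t-test. The following sketch uses made-up scores purely for illustration; the actual data are those collected in the study:

# Illustrative two-sample t-test on hypothetical post-test scores for the
# control and experimental groups (25 students each in the actual study).
from statistics import mean, variance
from math import sqrt

control      = [62, 70, 65, 58, 74, 66, 61, 69, 63, 67]   # hypothetical scores
experimental = [71, 78, 74, 69, 83, 75, 72, 80, 77, 76]

def t_statistic(a, b):
    """Welch's t-statistic for two independent samples."""
    va, vb = variance(a), variance(b)
    return (mean(b) - mean(a)) / sqrt(va / len(a) + vb / len(b))

print(f"t = {t_statistic(control, experimental):.2f}")
# A large positive t suggests the experimental group outperformed the control
# group; the p-value is read from the t-distribution with the Welch-Satterthwaite
# degrees of freedom.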

The data reveal that the proposed solution has successfully supported the selection and evaluation of EMM. The main factors for the success and acceptance of the proposed solution in this work were: top management support; meaningful stakeholder involvement; the advantages of a domain-specific web-based GDSS and knowledge management integrated with a proven instructional design model; comprehensive design and support; and the flexibility to meet decision makers' short-term and long-term objectives in solving a practical problem. This shows that a web-based knowledge management subsystem for continuously managing the EMM evaluation criteria can be beneficial to EMM providers and users. The continuously updated EMM criteria constitute actionable knowledge required for decision making, as well as a learning and research resource. In addition, encouraging all stakeholders to join and work within an informal knowledge network is a major challenge for knowledge management. Our system addresses three main obstacles that need to be overcome in the process of adopting web-based EMM for higher education: a) lack of time and commitment, b) lack of a formalized reward system for participants, and c) lack of a properly designed web-based GDSS and KM, especially for higher education problems.


Figure 4. Problem report

Figure 5. Students’ satisfaction report


Figure 6. Model evaluation report


6. CONCLUSIONS

Using EMM in higher education should start from, and be rooted in, educational technology plans that have clear goals and objectives. The selection and evaluation of EMM is a complex problem, and its solution should be comprehensive and interdisciplinary. Furthermore, it should support the strategic, tactical and operational levels of higher education. The proposed web-based GDSS has been a successful supportive tool for the decision-making processes required for selecting and evaluating EMM for higher education. The direct contribution of the proposed solution goes to higher education, by identifying the benefits that can be acquired through the implementation and use of a web-based GDSS integrated with KM and instructional design, and by unveiling the most commonly observed implementation issues and barriers related to this technology. The proposed model also contributes to other stakeholders, such as EMM and web-based GDSS developers and vendors. Our experience showed that integrating a continuously self-evaluating KM system into higher education processes is quite useful and highly recommended. More interdisciplinary and empirical research related to human factors is required to improve the design, implementation and evaluation of web-based GDSS and KM for higher education use.

7. REFERENCES

[1] Alessi, S. M., & Trollip, S. R., Multimedia for learning: Methods and development, Boston, MA: Allyn & Bacon, 2001.

[2] Avellis, G., "The ERMES Approach to Software Evaluation: Points to consider when appraising educational software", ACM Ubiquity, 1999, http://www.acm.org/ubiquity/views/g_avellis_1.html, last accessed June 2007.

[3] Beckman, T. J., "A Methodology for Knowledge Management", International Association of Science and Technology for Development (IASTED) AI and Soft Computing Conference, Banff, Canada, 1997.

[4] Dick, W., Carey, L., & Carey, J. O., The systematic design of instruction, 5th edition, NY: Addison-Wesley, 2001.

[5] Edmonds, G., & Pusch, R., "Creating shared knowledge: Instructional knowledge management systems", Educational Technology & Society, 5(1), 2002, http://ifets.ieee.org/periodical/vol_1_2002/.

[6] Foshay, R., & Muhammad, I., "A practical process for reviewing and selecting educational software", Technical


paper, Plato Learning Inc., 2000.

[7] Ganesan, R., Edmonds, G. S., & Spector, J. M., "The changing nature of instructional design for networked learning", in C. Jones & C. Steeples (Eds.), Networked learning in higher education (pp. 93-109), Berlin: Springer-Verlag, 2001.

[8] Gibbs, W. J., Graves, P. R., & Bernas, R. S., "Evaluation Guidelines for Multimedia Courseware", Journal of Research on Technology in Education, 34(1), 2-17, 2001.

[9] Glaser, R., "Components of a psychology of instruction: Toward a science of design", Review of Educational Research, 46, 1-24, 1976.

[10] Hansen, S., "Adoption of Web Delivery by Staff in Education Institutions: Issues, Stratagems and a Pilot Study", AusWeb99 - The Fifth Australian World Wide Web Conference, Ballina, Australia, 1999.

[11] Higgins, K., Boone, R., & Williams, D., "Evaluating educational software for special education", Intervention in School and Clinic, 36(2), 109-124, 2000.

[12] Horton, K., "Using E-Learning for Knowledge Management" (M310), ASTD 2002 International Conference, New Orleans, Louisiana, USA, June 3, 2002.

[13] Ikehara, H. T., "Implications of Gestalt Theory and Practice for the Learning Organization", The Learning Organization, no. 2, 63-69, 1999.

[14] Kidwell, J. J., Vander Linde, K. M., & Johnson, S. L., "Applying Corporate Knowledge Management Practices in Higher Education", in G. Bernbom (Ed.), Information Alchemy: The Art and Science of Knowledge Management, EDUCAUSE Leadership Series #3, San Francisco: Jossey-Bass, pp. 1-24, 2001.

[15] Kirschner, P., "The Design and Development of a Powerful Electronic Learning Environment: Pedagogical Basis and Technological Design", 19th ICDE World Conference on Open Learning and Distance Education, Vienna, Austria, June 20-24, 1999.

[16] Lievertz, F., "A new generation of software vendor selection methodology", Aris Corporation, 2001, http://www.lievertz.com/alfred/index.html, last accessed 25 April 2003.

[17] Maddux, C., Johnson, D., & Willis, J., Educational computing: Learning with tomorrow's technologies, Boston: Allyn and Bacon, 2001.

[18] Marshall, J. M., & Rossett, A., "Knowledge management for school-based educators", in J. M. Spector & T. M. Anderson (Eds.), Integrated and holistic perspectives on learning, instruction and technology: Understanding complexity (pp. 19-34), 2000.

[19] O'Leary, D. E., & Studer, R., "Knowledge Management: An Interdisciplinary Approach", IEEE Intelligent Systems, 16(1), January/February 2001.

[20] Pham, B., "Quality Evaluation of Educational Multimedia Systems", Australian Journal of Educational Technology, 14(2), 107-121, 1998, http://cleo.murdoch.edu.au/ajet/ajet14/pham.html, last accessed 25 August 2002.

[21] Phillips, R., Developers guide to interactive multimedia: A methodology for educational applications, Curtin University: Perth, WA, 1996.

[22] Power, D. J., & Kaparthi, S., "Building Web-based Decision Support Systems", Studies in Informatics and Control, 11(4), 2002.

[23] Power, D. J., Decision Support Systems: Concepts and Resources for Managers, Quorum Books/Greenwood Publishing, 2002.

[24] Ravet, S., "E-learning and Knowledge management", The Newsletter of PROMETEUS, No. 20, July-August 2002, pp. 2-6.

[25] Reeves, T. C., & Harmon, S. W., "Systematic Evaluation Procedures for Interactive Multimedia for Education and Training", in S. Reisman (Ed.), Multimedia Computing: Preparing for the 21st Century, Harrisburg, London: Idea Group Publishing, 472-505, 1994.

[26] Reid, I. C., "The web, knowledge management and Universities", The Sixth Australian World Wide Web Conference, 2000, http://ausweb.scu.edu.au/aw2k/papers/reid/paper.html, last accessed January 2005.

[27] Reigeluth, C. M. (Ed.), Instructional-design theories and models: An overview of their current status, Hillsdale, NJ: Erlbaum, 1983.

[28] Richey, R. C., Fields, D. C., & Foxon, M. (Eds.), Instructional design competencies: The standards (3rd ed.), Syracuse, NY: ERIC Clearinghouse on Information and Technology and the International Board of Standards for Training, Performance and Instruction, 2000.

[29] Ruhe, G., Eberlein, A., & Pfahl, D., "Trade-Off Analysis for Requirements Selection", International Journal of Software Engineering and Knowledge Engineering, 13(4), 345-366, 2003.

[30] Turban, E., & Aronson, J. E., Decision Support Systems and Intelligent Systems (6th edition), Prentice Hall, NJ, USA, 2001.

[31] Zane, T., & Frazer, C. G., "The extent to which software developers validate their claims", Journal of Research on Computing in Education, 24, 410-419, 1992.
