2015 48th Hawaii International Conference on System Sciences

Identifying criteria that should be considered when deciding the proportion of online to face-to-face components of a blended course

Ali Alammary
Monash University
[email protected]

Angela Carbone
Monash University
[email protected]

Abstract

Over the last decade blended learning has been growing in use and popularity, particularly in higher education, as it has proved to be an effective approach for accommodating an increasingly diverse student population whilst adding value to the learning environment through the incorporation of different online delivery methods. Despite this growing interest, designing a successful blended learning course is still challenging for many academics in the higher education sector. A major design problem is deciding the proportion of online to face-to-face components to incorporate into the blended course. This study contributes to addressing this problem by: (i) identifying criteria that teachers should consider when deciding the proportion of online components in their blended learning courses; and (ii) rating the importance of each of these criteria. It employs a two-round online modified Delphi survey to achieve its aims.

1. Introduction

A blended learning course¹ can be defined as a course that: (i) integrates different instructional methods, such as lectures, discussion groups and self-paced activities; and (ii) contains both face-to-face and online portions [1]. When it is thoughtfully designed and implemented, a blended learning course can be more effective than either an online or a face-to-face course [2]. However, designing an effective blended learning course is challenging for many teachers. A major design problem is deciding the proportion of online to face-to-face components to incorporate into the blended course. According to Duhaney [3] and Brunner [4], building a successful blend requires teachers to reach a harmonious balance between face-to-face and online components. In some courses more face-to-face than online will be appropriate, whilst others will tip the balance in favour of online components. Still other courses will mix the two models of instruction somewhat equally [5, 6].

Although a number of research studies have addressed blended learning design challenges [7, 8, 9, 10, 11], none has focused on the challenge of deciding the proportion of online to face-to-face components that should be incorporated into blended learning courses. To enhance the understanding of blended learning course design and contribute to the existing literature in this area, the current study aims to identify criteria that teachers should consider when deciding the proportion of online to face-to-face components to incorporate into their blended courses. The research questions driving this study are as follows: (1) What criteria should teachers consider when deciding the proportion of online to face-to-face components in their blended courses? (2) How important is each of these criteria to the design process?

The remainder of the paper is organized as follows. Section 2 discusses blended learning design challenges and the criteria that might affect the proportion of online to face-to-face components in a blended course. Section 3 explains the modified Delphi technique used to conduct the study. Section 4 presents the results obtained with the Delphi technique. Section 5 discusses these results. Section 6 concludes the paper and outlines future work.

2. Literature Review

Many teachers in higher education institutions, even those who might consider themselves experts in face-to-face instruction, return to being novice teachers as they face different kinds of challenges when they decide to design their blended courses. The first challenge is the large number of blended learning components that need to be considered when constructing a blended learning experience (e.g., lectures, classroom instruction, virtual classrooms,

¹ A "course" is a unit of teaching that typically lasts one academic term. It may also be called a unit, subject or topic.

1530-1605/15 $31.00 © 2015 Crown Copyright DOI 10.1109/HICSS.2015.19

Judy Sheard
Monash University
[email protected]


workshops, problem-based instruction, peer teaching, discussion groups, online learning communities and practice questions). Carman [12] stated that blended learning is most effective when it uses a combination of different components. However, Clark [7] argues that without a list of the possible blended learning components, teachers are simply relying on their own limited experience. The second challenge is deciding the most appropriate delivery method to achieve each course outcome. According to Mortera-Gutiérrez [13], designers of blended learning courses should try to maximize the benefits of traditional and online delivery methods by using each method for what it does best. With a large number of available delivery methods for a blended learning arrangement, the selection process becomes harder. Bersin [8] stated that one of the most difficult decisions teachers make when designing their blended learning courses is choosing the best delivery method for the content at hand. The last challenge, which is the focus of this paper, is deciding the proportion of online to face-to-face components of a blended course. Dziuban, Moskal and Hartman [6] stated that there are no defined standards to guide decisions as to how much or what part of a course should go online and what part should be taught in the traditional classroom. Vaughan [11] also found that there is no recognized formula for the reduction of class time or the use of technologies within a blended learning course. Harriman [10] suggested that the variety of technology combinations and the lack of patterns to follow create daunting challenges at the front end of the design process.

A number of criteria that might affect the decision about the proportion of online to face-to-face components in a blended course have been found in the literature. These criteria are related to the course, the students, the teacher, or the educational institution.

For course related criteria, course type (theoretical, practical or a combination) might be an important criterion that needs consideration. Díaz and Entonado [14] found that it is more satisfactory and efficacious to deliver theoretical content online than face-to-face. They also found that students studying online may find it difficult to cope with practical content. The number of students enrolled in the course might be another criterion worth considering. According to Bath and Bourke [15], the number of students can limit, but can also provide, opportunities for technology integration.

For student related criteria, teachers might need to consider their students' preferred learning styles (online or face-to-face). According to Limniou and Smith [16], it is important to provide students with a learning experience that matches their individual learning styles. Teachers might also need to look at their students' life situations and whether they have any outside commitments. Online components may suit students who seek additional education but who have commitments such as family and work [17]. Furthermore, teachers might need to consider students' access to campus. According to Vaughan [11], students who are not living in institution-owned housing may find it difficult to travel regularly to their schools and find available parking.

Alammary, Sheard and Carbone [1] discussed five teacher related criteria that might affect the proportion of online to face-to-face components in blended courses. These criteria are: (i) teacher's preferred teaching style; (ii) teacher's technological knowledge; (iii) teacher's confidence in integrating technology; (iv) teacher's experience in designing for blended learning; and (v) teacher's experience in teaching a traditional course.

For institutional related criteria, institutional support might be an important criterion that teachers should consider. Aycock, Garnham and Kaleta [18] pointed out that institutional support in the form of time release, technical support, funding and professional development is important for the successful integration of online components into the traditional face-to-face experience. Another institutional related criterion that might be worth considering is blended learning alignment with institutional goals (culture). According to Bath and Bourke [15], it is important for teachers to look at the institutional culture regarding teaching and learning to see whether blended learning fits that culture.

3. Methods

A two-round online modified Delphi survey was employed in this study. The Delphi method is a common technique for gathering data from experts through multiple rounds of questionnaires [19]. It employs a series of data collection and analysis techniques to reach consensus on a particular topic [20]. The traditional Delphi method normally has four rounds of feedback and modified questionnaires [19, 21]. The modified Delphi can have as few as two rounds [22, 23]. The first-round questions of a modified Delphi survey can be based on an extensive review of the literature [19].

Before inviting the participants, a decision was made to limit the number of rounds to two for two main reasons. Firstly, it was considered that a two-round survey would encourage more participants to participate in the study and would minimise their workload [24]. Secondly, the Round 1 survey was based upon an extensive and careful review of the available literature, and participants were presented with an initial list of criteria that could influence the decision problem. According to Snyder-Halpern, Thompson and Schaffer [22] and Martino [23], the number of Delphi rounds can be reduced to as few as two if participants are provided with an initial list of preselected items.

Table 1. Initial list of influential criteria

Course related criteria
1. Course type (theoretical, practical or a combination)
2. Number of students enrolled in the course

Student related criteria
1. Students' preferred learning style (online or face-to-face)
2. Students' life situation (any outside commitments, such as work or family)
3. Students' access to campus

Teacher related criteria
1. Teacher's preferred teaching style (online or face-to-face)
2. Teacher's technological knowledge
3. Teacher's confidence in integrating technology
4. Teacher's experience in designing for blended learning
5. Teacher's experience in teaching a traditional course

Institutional related criteria
1. Institutional support (e.g., time release, technical support, funding and professional development)
2. Blended learning alignment with institutional goals (culture)

3.1. Creating an initial list of influential criteria

To create an initial list of potentially influential criteria, we searched a number of databases that contain publications on e-learning and blended learning, such as the ACM Digital Library, ProQuest, Computer Database, ScienceDirect, IEEE Xplore and Google Scholar. The search terms used included: 'design' + 'hybrid course', 'design' + 'blended course', 'develop' + 'blended course', 'approach' + 'blended learning', 'approach' + 'hybrid course', 'blended learning' + 'model' and 'hybrid course' + 'model'. The word 'hybrid' was used because 'hybrid course' and 'blended course' are used interchangeably. The words 'develop', 'design', 'approach' and 'model' were all used to retrieve papers that might have discussed the topic of designing blended courses; the aim was to retrieve papers that may use different expressions to describe similar concepts. Only studies conducted in the context of higher education were reviewed. The final decision on whether to include or exclude a criterion from the list was based on the justification given for the importance of that criterion. Criteria that were mentioned in the literature without explanation were excluded from the list. Twelve criteria were found, and they were divided into four main categories: (i) course related criteria; (ii) student related criteria; (iii) teacher related criteria; and (iv) institutional related criteria (see Table 1).

3.2. Developing and piloting the survey

LimeSurvey was used to create the two-round survey and to collect responses. LimeSurvey is a free and open-source online survey application that allows users to: (i) use any web browser to develop their surveys; (ii) create a wide range of question types; (iii) track respondents; (iv) set an expiry date for their surveys; (v) view the results in different formats; and (vi) export results in various formats such as PDF, Excel and SPSS.

Prior to beginning the Round 1 survey, a pilot study was conducted. The pilot survey was intended to improve the internal validity of the survey, gain feedback about the questions and their clarity, and get rough estimates of the time and cost involved in the Delphi technique [25, 26]. Three researchers in educational technology were asked to participate in the pilot study. The pilot survey was distributed in an online format. Under each item in the survey, participants were provided with a textbox in which to comment on the clarity and relevance of that item. Suggestions were analyzed and a number of changes to the survey were made.

3.3. Expert panel recruitment

A critical step in any Delphi study is to identify and select experts who have a high level of knowledge in the area under study and can be representative of their profession [23]. A group of Australian and New Zealand experts who have in-depth knowledge and sound experience in both face-to-face and online teaching methods were invited to participate in this study. A purposive approach was adopted to select this group of experts. At first, 26 experts from different New Zealand and Australian universities who were known to the researchers formed an initial list. These experts are members of professional groups such as the Australasian Computing Education Conference (ACE) committee and the Monash Better Learning and Teaching team. They all have years of experience in course design and online delivery methods. After that, the researchers searched a number of New Zealand and Australian university websites to find more participants. Three criteria were used to identify experts: (1) experience in course design: participants were required to have been involved in designing at least one course; (2) experience with online delivery methods, such as a course website, online discussions, blogs or webcasts; and (3) a publication record in the field of educational technology in top-tier publication venues. Additionally, the researchers aimed to include experts from as many different disciplines as possible in order to examine the impact of the experts' discipline on criteria selection and rating.

A total of 48 experts were contacted by email and invited to participate in the study. Nineteen of them agreed to participate and completed the Round 1 survey. No specific set of characteristics differentiated the experts who participated from those who did not. It seems that the multiple rounds and the large number of statements in each round of the survey made some academics reluctant to participate. However, the 19 experts who agreed to participate represent a wide range of academic disciplines. The majority of them teach undergraduate and postgraduate courses, have more than five years of experience in course design, and have four years or more of experience with online delivery methods (see Table 2). They all have a number of publications in the field of educational technology. The list of experts is not exhaustive, as it included academics from New Zealand and Australian universities only, and other academics could perhaps have been added to the list. Despite this, the experts who participated provided adequate representativeness and a wide range of views [27].

Table 2. Experts involved in the Delphi study

Discipline¹                    No. of participants
1.  Information Technology     8
2.  Business                   5
3.  Education                  4
4.  Economics                  2
5.  Social Sciences            2
6.  Medicine                   1
7.  Human Sciences             1
8.  Library                    1
9.  Exercise Science           1
10. Management                 1
11. Engineering                1

Experience in course design
1. 1-5 years                   1
2. 6-10 years                  7
3. 11-20 years                 8
4. 20+ years                   3

Experience with online delivery
1. 1-3 years                   2
2. 4-6 years                   4
3. 7-10 years                  5
4. 10+ years                   8

Course level
1. Undergraduate               6
2. Postgraduate                3
3. Both                        10

¹ Some experts belong to more than one discipline

3.4. Round 1 survey

Participants were sent an e-mail containing a link to the Round 1 survey. The survey contained an overview explaining the design problem and the 12 criteria that had been identified in the literature. The criteria were divided into four sections: (i) course related criteria; (ii) student related criteria; (iii) teacher related criteria; and (iv) institutional related criteria. Participants were requested to rate the importance of each criterion on an ordinal scale of 1 to 5, where 1 is very unimportant and 5 is very important. They were also requested to add any additional criteria that may not have been included in the list.

Round 1 data analysis involved both qualitative and quantitative methods. The central tendency (mean) and level of dispersion (standard deviation) were used to present information concerning the criteria ratings. The mean was used to represent the group opinion, while the standard deviation was used to indicate the spread of responses from the expert panel [28]. A content analysis approach similar to that of Burnard [29] was used to analyse the open-ended questions. All comments from the returned Round 1 surveys were copied into a word processing document. Each statement was examined to decide whether it was a comment or a new criterion that the participant wanted included in the list. Then, criteria that were the same or similar were grouped together and a decision was made on whether they should be collapsed into one criterion and, if so, on the wording that should be used. The anonymized raw data and the final list of criteria were shared with two other academics who have experience in the educational technology field, to ensure that the collapsing process did not change the meaning of any criterion [28].
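The mean and standard deviation computation described above can be sketched with the Python standard library. The ratings below are hypothetical, not the study's data:

```python
from statistics import mean, stdev

# Hypothetical 1-5 Likert ratings from a 19-member panel for one criterion
ratings = [4, 5, 3, 4, 4, 5, 3, 4, 4, 2, 5, 4, 3, 4, 5, 4, 3, 4, 4]

group_mean = mean(ratings)   # represents the group opinion
group_sd = stdev(ratings)    # indicates the spread of panel responses

print(f"mean = {group_mean:.2f}, sd = {group_sd:.2f}")
```

For this illustrative sample the mean is 3.89 with a standard deviation of 0.81, i.e., moderately high importance with little disagreement.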


3.5. Round 2 survey

The same group of experts who participated in the Round 1 survey were sent an e-mail containing a link to the Round 2 survey. The Round 2 survey had two main sections. In the first section, participants were presented with a list containing the criteria that they had rated in Round 1. They were asked to reconsider their responses while taking into account the group mean response, which was indicated by underlined red text after each criterion. The participants were encouraged to consider adjusting their responses toward the group mean response. They were also informed that if they wished to rate an item more than one point away from the mean, they needed to provide a justification. In the second section, experts were presented with a list containing the additional criteria that had been suggested by respondents in Round 1. They were also requested to rate the importance of each of these criteria. Round 2 data analysis involved mainly quantitative methods. The central tendency (mean) and level of dispersion (standard deviation) were used to present information concerning the criteria ratings.
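The Round 2 rule that a rating more than one point away from the group mean requires a justification can be expressed as a simple check. The function name and example values below are illustrative, not part of the study:

```python
def needs_justification(rating: int, group_mean: float, threshold: float = 1.0) -> bool:
    """Return True when a Round 2 rating deviates from the group mean
    by more than the threshold (one point in this study)."""
    return abs(rating - group_mean) > threshold

# Illustrative: ratings of 2 and 3 against a group mean of 3.6
print(needs_justification(2, 3.6))  # True: 1.6 points below the mean
print(needs_justification(3, 3.6))  # False: within one point of the mean
```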

4. Results

The response rate for the first round was around 40% (19 out of 48). Fifteen of the respondents who completed the first round also completed the second round, a response rate of slightly less than 80%.

4.1. Round 1 results

Of the 12 criteria listed in the Round 1 survey, one criterion (8%) scored a mean importance rating over 4. The other 11 criteria (92%) scored mean importance ratings between 3 and 4. Overall, institutional support scored the highest mean (4.42) of all the criteria in the Round 1 survey. Eight criteria (67%) achieved a high level of consensus among the experts. A high level of consensus was determined to be reached when the standard deviation was less than or equal to 1 [30]. The other four criteria achieved a medium level of consensus, determined to be reached when the standard deviation was greater than 1 and less than or equal to 1.5.

Twenty-two new criteria were suggested by the experts: nine course related, eight student related, three teacher related and two institutional related. Based on the experts' feedback, the second criterion from the institutional related criteria, institutional support, was divided into four criteria: (i) time release; (ii) technical support; (iii) funding; and (iv) professional development.

4.2. Round 2 results

Of the 11 course related criteria listed in the Round 2 survey, one criterion (9%) scored a mean importance rating over 4. Seven criteria (64%) scored mean importance ratings between 3 and 4, and three criteria (27%) scored less than 3. The experts reached a medium level of consensus on almost all of these criteria (see Table 3).

Table 3. Course related criteria means and standard deviations

Criterion                                                      Mean   SD
1.  Course type (theoretical, practical or a combination)      3.47   1.09
2.  Number of students enrolled in the course                  3.13   1.15
3.  Intended learning outcomes                                 3      1.56
4.  Course level (undergraduate, postgraduate)                 2.6    1.4
5.  How students are enrolled (on campus, off campus, both)    3.6    1.45
6.  Offered across multiple campuses                           3.4    1.31
7.  Availability of technology to enable online delivery       4.07   1.34
8.  Duration of current face-to-face classes (lecture and
    tutorial combined)                                         2.73   1.18
9.  Accreditation: is the course to comply with an external
    standard                                                   3.27   1.48
10. Threat of MOOCS                                            2.6    1.58
11. The number of tutors/sessionals available for online
    delivery                                                   3.13   1.2

For the student related criteria, 11 criteria were presented to the experts. Two criteria (18%) scored mean importance ratings over 4, while the remainder (82%) scored between 3 and 4. The experts reached a high level of consensus on seven criteria and a medium level of consensus on the remainder (see Table 4).

Eight teacher related criteria were listed in the Round 2 survey. One (13%) scored a mean importance rating over 4, while the remainder (87%) scored between 3 and 4. The experts reached a high level of consensus on five criteria and a medium level on the remainder (see Table 5).
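The consensus bands used in the analysis (high when the standard deviation is at most 1, medium when it lies between 1 and 1.5) can be expressed as a small helper function. The function name and the "low" label for deviations above 1.5 are ours; the paper defines only the high and medium bands:

```python
def consensus_level(sd: float) -> str:
    """Map a criterion's standard deviation to the consensus bands
    used in the study: high (sd <= 1), medium (1 < sd <= 1.5);
    anything above 1.5 is labelled "low" here by assumption."""
    if sd <= 1.0:
        return "high"
    if sd <= 1.5:
        return "medium"
    return "low"

print(consensus_level(0.62))  # high
print(consensus_level(1.34))  # medium
print(consensus_level(1.58))  # low
```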

Table 4. Student related criteria means and standard deviations

Criterion                                                      Mean   SD
1.  Students' preferred learning style (online or
    face-to-face)                                              3.6    0.71
2.  Students' life situation (any outside commitments, such
    as work or family)                                         3.93   0.57
3.  Students' access to campus                                 4.07   0.44
4.  Students' technology literacy                              3.67   0.94
5.  Students' preparedness for study                           3.2    1.22
6.  Students' access to technology                             4      0.82
7.  Students' attendance requirements (e.g., international
    students on visas)                                         3.47   1.15
8.  Students' language proficiency                             3.33   1.07
9.  Students' expectations (misconceptions, for example that
    online work is easier/harder than face-to-face)            3.2    0.91
10. Level of students (i.e., year of study)                    3      1.03
11. Students' experience with online delivery                  3.07   0.77

5. Discussion

The Delphi technique was a good fit for exploring the complex and multifaceted problem of identifying criteria that teachers should consider when deciding the proportion of online to face-to-face components in their blended courses. It helped in building consensus among a panel of experts drawn from different academic disciplines who were geographically spread across Australia and New Zealand and would not have been able to participate in a face-to-face consensus method. It allowed the panel members to express their opinions and judgments privately, without feeling intimidated by other participants [19]. The statistical analysis techniques used to analyse their feedback helped ensure that the opinions of each panel member were well represented in the final results [31].

The study identified and rated the importance of criteria that teachers should consider when deciding the proportion of online to face-to-face components in their blended courses. These criteria were divided into four main categories: (1) course related criteria; (2) student related criteria; (3) teacher related criteria; and (4) institutional related criteria.

Experts who participated in this study perceived Availability of technology to enable online delivery as the most important course related criterion that needs to be considered. A possible explanation for this is that, whatever teachers' intentions regarding the proportion of online components, they will be limited by the amount and type of technology available to them. The second most important course related criterion was How students are enrolled (on campus, off campus, both). It seems that delivering a course to on- and off-campus students using a greater proportion of online components can bring a number of benefits to both students and teachers. It can help, for example, in creating a space for collaboration and live interaction for the whole group, and can reduce the teacher's workload, as materials are prepared and presented only once. The least important course related criteria were Course level (undergraduate, postgraduate) and Threat of MOOCS. These two criteria scored the lowest mean ratings of all the criteria in the Round 2 survey. This might indicate that (i) making a decision regarding the proportion of online components in an undergraduate blended course is no different from making the same decision for a postgraduate course; and (ii) the focus when designing a blended course should be on creating a better student learning experience rather than on responding to external factors such as the perceived threat of MOOCS.

Table 5. Teacher related criteria means and standard deviations

Criterion                                                      Mean   SD
1. Teacher's preferred teaching style (online or
   face-to-face)                                               3.07   0.85
2. Teacher's technological knowledge                           3.2    0.83
3. Teacher's confidence in integrating technology              3.27   0.68
4. Teacher's experience in designing for blended learning      3.87   0.62
5. Teacher's experience in teaching a traditional course       3.07   0.77
6. Teacher's workload                                          3.93   1.18
7. Teacher's willingness to try new teaching methods           4.2    1.11
8. Peer support and mentoring                                  3.87   1.02

For the institutional related criteria, seven criteria were presented to the experts. Two criteria (29%) scored mean importance ratings over 4, while the remainder (71%) scored between 3 and 4. The experts reached a high level of consensus on four criteria and a medium level on the other three (see Table 6).

Table 6. Institutional related criteria means and standard deviations

Criterion                                                      Mean   SD
1. Blended learning alignment with institutional goals
   (culture)                                                   3.87   0.62
2. Time release                                                3.6    1.2
3. Technical support                                           4.13   0.62
4. Funding                                                     3.2    1.11
5. Professional development                                    3.87   0.96
6. Teacher performance evaluation                              3.8    1.11
7. Supporting teaching innovation                              4.13   0.62


Experts also regarded Students’ access to campus as the most important student related criterion that need consideration, followed closely by Students' access to technology. A likely explanation of this is that while a blended learning course designer should try to increase students’ flexibility and convenience, they should not assume that all students have access to the different technologies required by the course. The least important student related criteria were Level of students (ie. year of study) and Students' experience with online delivery. This indicates that whether students are first year students or later year students and whether they have already experienced blended learning in their program of study or not, should have little impact on the proportion of online components in the blended course. For the teacher related criteria, Teacher's willingness to try new teaching methods was regarded as the most important criterion. It also scored the highest importance mean rating among all the other criteria in the Round 2 survey. This result is consistent with many other studies that suggest that to adopt technology, teachers need to be willing to take risk and be open to change [32, 33, 34]. The least important teacher related criteria were Teacher’s preferred teaching style (Online or faceto-face) and Teacher’s experience in teaching a traditional course. There are several possible explanations for this finding. For example, it may be the case that experts think that with a strong willingness, teachers can step out of their comfort zone and commit to new approaches. One expert commented: “The teachers’ preference should have no relevance, as we are doing the education for the benefit of the students, not the teachers. If some teachers don't want to do blended learning, then they can be replaced with others who do”. 
For the institutional related criteria, experts regarded both Technical support and Supporting teaching innovation as the most important criteria. These two criteria also scored the second highest importance mean rating among all the other criteria in the Round 2 survey. This finding highlights the significant importance of institutional support and provides insights into how this support should be channelled. The least important institutional related criterion was Funding. This finding was quite unexpected as it conflicts with other studies that emphasize the importance of funding for the successful design of blended learning courses [7, 18, 35]. It is also important to note that institutional related criteria achieved the highest importance rating. Six out of the seven criteria in this category scored mean importance rate of 3.6 or higher. This

emphasizes the crucial role that the institution plays in determining the proportion of online components that can be integrated in blended learning courses. It seems that an institution’s leaders might need to provide high level of technical support to facilitate the integration of online technologies into the traditional face-to-face experience. Professional development should also be tailored towards helping teachers in learning new teaching and technology skills so they can integrate more online components in their blended courses. Furthermore, institutional culture regarding blended learning might need to be changed. Senior administrators might need to view technology as a means of achieving the institution’s strategic goals. They might need to support teaching innovation. As one expert remarked: “It is very important for the institution not to put barriers in the way of online and blended courses. As an example my head of department complemented me on high student satisfaction with my online course. They then asked how many contact hours are there? Clearly they had no idea what an online course was and wanted to call all the students in to ask them in person if this was okay, which missed the whole point of an online course”. Another commented: “The intuition’s formal rules and unwritten culture may discourage on-line courses. For example, where teachers are paid by the contact hour for teaching and e-leaning has no contact hours.” Overall, experts reached a high level of agreement on the majority of student, teacher and institutional related criteria. However, there was a noticeable spread of opinion regarding course related criteria. Five criteria namely, Threat of MOOCS, Intended learning outcomes, Accreditation, How students are enrolled and Course level, had the lowest consensus compared to all other criteria in the Round 2 survey. Some explanation for this can be found in the literature. 
For example, regarding Intended learning outcomes, some studies [10, 36] highlighted the importance of learning outcomes in selecting delivery methods and suggested that certain types of learning outcomes might lend themselves best to certain delivery formats. On the other hand, other studies [37, 38] reported that there are no practical differences in learning between face-to-face and online delivery methods. This debate was reflected in the experts' ratings and comments. One expert commented: "Learning outcomes of all types and levels can be delivered with an on-line course", while another remarked: "Online tools should be chosen to maximise student-student interaction and collaboration, but not at the expense of face-to-face interaction".
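As an aside for readers who wish to replicate this kind of analysis, the mean importance ratings and consensus levels discussed above can be computed straightforwardly. The sketch below is illustrative only: the ratings are invented, not the study's data, and it uses the interquartile range (IQR) as a simple consensus measure, one of several options reviewed by von der Gracht [30]; an IQR of 1 or less on a 5-point scale is commonly read as consensus.

```python
# Illustrative sketch (hypothetical data, not this study's ratings):
# aggregating Delphi Round 2 ratings on a 1-5 importance scale.
from statistics import mean, quantiles

# Hypothetical ratings from ten experts for two criteria
ratings = {
    "Technical support": [5, 4, 5, 4, 5, 4, 5, 5, 4, 5],
    "Threat of MOOCs":   [1, 5, 2, 4, 1, 5, 2, 3, 1, 4],
}

for criterion, scores in ratings.items():
    q1, _, q3 = quantiles(scores, n=4)  # quartile cut points of the ratings
    iqr = q3 - q1                       # spread of the middle 50% of ratings
    consensus = "yes" if iqr <= 1 else "no"
    print(f"{criterion}: mean={mean(scores):.2f}, IQR={iqr:.2f}, "
          f"consensus={consensus}")
```

On the hypothetical data above, the first criterion shows a high mean with low spread (consensus), while the second shows the kind of divided opinion the experts exhibited for the course-related criteria.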

A final point worth mentioning is that an expert's discipline does not appear to affect their rating of the criteria. For example, of the four experts from the Information Technology discipline who participated in Round 2, two rated the Intended learning outcomes criterion high, while the other two rated it low. Similarly, the three experts from the Business discipline rated the Students' language proficiency criterion differently: two rated it high, while one rated it very low. Furthermore, the four experts who rated the Threat of MOOCs criterion high were from four different disciplines: Information Technology, Human Science, Business and Management.

6. Conclusion

This study identified criteria that teachers should consider when deciding the proportion of online to face-to-face components to incorporate into their blended courses. It measured the importance of each of these criteria and divided them into four categories: (i) course-related criteria; (ii) student-related criteria; (iii) teacher-related criteria; and (iv) institution-related criteria. From the outcome of our investigation, it is possible to conclude that the institution plays the most important role in determining the proportion of online to face-to-face components of blended courses. Further research is needed to: (i) analyze how the criteria identified in this study shape the design process; and (ii) test the impact of these criteria on the outcomes of blended courses. The results of this research will be used to inform the development of a toolkit that can help teachers easily design their blended courses.

7. References

[1] Alammary, A., Sheard, J., and Carbone, A., "Blended Learning in Higher Education: Three Different Design Approaches", Australasian Journal of Educational Technology, 30(4), 2014.
[2] Means, B., Toyama, Y., Murphy, R., Bakia, M., and Jones, K., "Evaluation of Evidence-Based Practices in Online Learning: A Meta-Analysis and Review of Online Learning Studies", 2010.
[3] Duhaney, D.C., "Blended Learning in Education, Training, and Development", Performance Improvement, 43(8), 2004, pp. 35-38.
[4] Brunner, D.L., "The Potential of the Hybrid Course Vis-à-Vis Online and Traditional Courses", Teaching Theology & Religion, 9(4), 2006, pp. 229-235.
[5] Dönmez, O., and Akar, P., "A Blended Learning Environment for a Course on Educational Software in the Framework of Project Management", IADIS International Conference e-Society 2005, 2005, pp. 473-477.
[6] Dziuban, C., Moskal, P., and Hartman, J., "Higher Education, Blended Learning, and the Generations: Knowledge Is Power: No More", Elements of Quality Online Education: Engaging Communities, Sloan Center for Online Education, Needham, MA, 2005.
[7] Clark, D., "Blended Learning: An Epic White Paper", http://www.epic.co.uk/assets/files/wp_blended_learning_2010.pdf, accessed 16 Apr 2012.
[8] Bersin, J., "Blended Learning: Selecting the Right Media. How Do You Decide Which Media to Use? Courseware? Powerpoint? Webinars? Job Aids? Which to Use When?", http://www.bersin.com/blog/post/2003/05/Blended-Learning---Selecting-the-Right-Media.aspx, accessed 25 Apr 2012.
[9] Tiirmaa-Oras, S., Pilt, L., Villems, A., and Ruul, K., "Easy Blending: Performance Support System for Blended Learning in Higher Education", http://www.ut.ee/blearn/orb.aw/class=file/action=preview/id=358631/EPSS+workshop_+Stockholm.pdf, accessed 1 Mar 2012.
[10] Harriman, G., "Blended Learning", http://www.grayharriman.com/blended_learning.htm#1, accessed 14 May 2012.
[11] Vaughan, N., "Perspectives on Blended Learning in Higher Education", International Journal on E-Learning (IJEL), 6(1), 2007, pp. 81.
[12] Carman, J.M., "Blended Learning Design: Five Key Ingredients", http://ipislam.edu.my/kplir/bahan/Blended-Learning-Design.pdf, accessed 29 May 2012.
[13] Mortera-Gutiérrez, F., "Faculty Best Practices Using Blended Learning in E-Learning and Face-to-Face Instruction", International Journal on E-Learning, 5(3), 2006, pp. 313-337.
[14] Díaz, L.A., and Entonado, F.B., "Are the Functions of Teachers in E-Learning and Face-to-Face Learning Environments Really Different?", Educational Technology & Society, 12(4), 2009, pp. 331-343.
[15] Bath, D., and Bourke, J., "Getting Started with Blended Learning", http://www.griffith.edu.au/__data/assets/pdf_file/0004/267178/Getting_started_with_blended_learning_guide.pdf
[16] Limniou, M., and Smith, M., "Teachers' and Students' Perspectives on Teaching and Learning through Virtual Learning Environments", European Journal of Engineering Education, 35(6), 2010, pp. 645-653.
[17] Leh, A.S.C., "Action Research on Hybrid Courses and Their Online Communities", Educational Media International, 39(1), 2002, pp. 31-38.
[18] Aycock, A., Garnham, C., and Kaleta, R., "Lessons Learned from the Hybrid Course Project", Teaching with Technology Today, 8(6), 2002, pp. 9-21.
[19] Hsu, C.-C., and Sandford, B.A., "The Delphi Technique: Making Sense of Consensus", Practical Assessment, Research & Evaluation, 12(10), 2007, pp. 1-8.
[20] Skulmoski, G.J., Hartman, F.T., and Krahn, J., "The Delphi Method for Graduate Research", Journal of Information Technology Education, 6, 2007, pp. 1.
[21] Custer, R.L., Scarcella, J.A., and Stewart, B.R., "The Modified Delphi Technique - a Rotational Modification", 1999.
[22] Snyder-Halpern, R., Thompson, C., and Schaffer, J., "Comparison of Mailed vs. Internet Applications of the Delphi Technique in Clinical Informatics Research", Proceedings of the AMIA Symposium, 2000, pp. 809.
[23] Martino, J.P., Technological Forecasting for Decision Making, McGraw-Hill, Inc., 1993.
[24] Hearnshaw, H., Harker, R., Cheater, F., Baker, R., and Grimshaw, G., "Expert Consensus on the Desirable Characteristics of Review Criteria for Improvement of Health Care Quality", Quality in Health Care, 10(3), 2001, pp. 173-178.
[25] Rubin, A., and Babbie, E., Brooks/Cole Empowerment Series: Essential Research Methods for Social Work, Cengage Learning, 2012.
[26] Van Teijlingen, E., and Hundley, V., "The Importance of Pilot Studies", Nursing Standard, 16(40), 2002, pp. 33-36.
[27] Turoff, M., "The Policy Delphi", in Linstone, H.A., and Turoff, M. (eds.): The Delphi Method: Techniques and Applications, Addison-Wesley, Reading, MA, 1975.
[28] Keeney, S., Hasson, F., and McKenna, H., Analysing Data from a Delphi and Reporting Results, 2011.
[29] Burnard, P., "A Method of Analysing Interview Transcripts in Qualitative Research", Nurse Education Today, 11(6), 1991, pp. 461-466.
[30] Von der Gracht, H.A., "Consensus Measurement in Delphi Studies: Review and Implications for Future Quality Assurance", Technological Forecasting and Social Change, 79(8), 2012, pp. 1525-1536.
[31] Dalkey, N.C., Brown, B.B., and Cochran, S., The Delphi Method: An Experimental Study of Group Opinion, Rand Corporation, Santa Monica, CA, 1969.
[32] Dexter, S., and Greenhow, C., "Expert Teachers' Technology Integration Knowledge", 85th Annual Meeting of the American Educational Research Association, 2003.
[33] Ertmer, P.A., and Ottenbreit-Leftwich, A.T., "Teacher Technology Change: How Knowledge, Confidence, Beliefs, and Culture Intersect", Journal of Research on Technology in Education, 42(3), 2010, pp. 255-284.
[34] Zhao, Y., Pugh, K., Sheldon, S., and Byers, J., "Conditions for Classroom Technology Innovations", The Teachers College Record, 104(3), 2002, pp. 482-515.
[35] Garrison, D.R., and Kanuka, H., "Blended Learning: Uncovering Its Transformative Potential in Higher Education", The Internet and Higher Education, 7(2), 2004, pp. 95-105.
[36] Hofmann, J., "Why Blended Learning Hasn't (Yet) Fulfilled Its Promises", in Bonk, C.J., and Graham, C.R. (eds.): Handbook of Blended Learning: Global Perspectives, Local Designs, Pfeiffer Publishing, San Francisco, USA, 2006, pp. 27-40.
[37] Aragon, S.R., Johnson, S.D., and Shaik, N., "The Influence of Learning Style Preferences on Student Success in Online Versus Face-to-Face Environments", The American Journal of Distance Education, 16(4), 2002, pp. 227-243.
[38] Neuhauser, C., "Learning Style and Effectiveness of Online and Face-to-Face Instruction", American Journal of Distance Education, 16(2), 2002.