
Interactive Convergence: Critical Issues in Multimedia

Edited by

Scott P Schaffer & Melissa Lee Price

Oxford, United Kingdom

Dr Robert Fisher, Series Editor

Advisory Board
Dr Margaret Sönser Breen
Revd Stephen Morris
Professor Margaret Chatterjee
Professor John Parry
Professor Michael Goodman
Dr David Seth Preston
Dr Jones Irwin
Professor Peter Twohig
Professor Asa Kasher
Professor S Ram Vemuri
Dr Owen Kelly
Professor Bernie Warren
Revd Dr Kenneth Wilson, O.B.E.

Volume 10
A volume in the Critical Issues project ‘Cybercultures’

First published 2005 by the Inter-Disciplinary Press Oxford, United Kingdom

© Inter-Disciplinary Press 2005

All rights reserved. No part of this book may be reprinted or reproduced or utilised in any form or by any electronic, mechanical, or other means, now known or hereafter invented, including photocopying and recording, or in any information storage or retrieval system, without permission in writing from the publishers.

ISBN: 1-904710-09-3

Contents

Preface  vii

Interactive Media Education in the U.S. and the U.K.
Tim Hudson and Kavita Hayton  1

The Difficulty in Communicating with Computers
Bertil Ekdahl and Lise Jensen  15

Accounting for User Needs and Motivations in Game Design
Lucy A. Joyner and Jim TerKeurst  31

Spatial Context of Interactivity
Stanislav Roudavski and François Penz  45

Interactive Multimedia = Whatever Intermedia
Julainne Sumich  67

Mixed-mode Communication Courses at a Multicultural Technikon
Dee Pratt  91

Construction, Consumption and Creation - The Convergence of Medium and Tool
Anders Kluge  117

Multi-disciplinary, Cross-cultural Community Building in University Multimedia Design Environments
Scott P. Schaffer and Melissa Lee Price  129

Is Electronic Community an Addictive Substance? An Ethnographic Offering from the EverQuest Community
Florence Chee and Richard Smith  137

When Identity Play Became Hooking Up: Cybersex, Online Dating and the Political Logic of Infection
Jeremy Kaye  157

Advising Student Software Development Teams for Maximum Performance
Randy S. Weinberg and Jennifer A. Tomal  179

Use of Interactive Multimedia to Improve the Lecturing Experience
Clive Chandler  195

Learning with Interactive Media: Characteristics of its Impact in Three Different Environments
Corine Fitzpatrick and Michael Mucciardi  209

Learning Language through Video Games: A Theoretical Framework, An Evaluation of Game Genres and Questions for Future Research
Jonathan deHaan  229

A Community Without a Vision Will Not Work
Bettina Dimai and Martin Ebner  241

No Sense of Cyberplace: Personal Jurisdiction in Internet Litigation
Lindsley Armstrong Smith  251

Preface

This book was inspired by a desire to discover methods and models for doing research related to multimedia design, development and evaluation. Such research is inherently multi-disciplinary and inherently messy. The use of multimedia to visually represent knowledge, information, and processes, and to support computer-based and networked environments, is now standard practice. Yet the models and frameworks used to guide multimedia design thinking are rooted in relatively obsolete communications and software design approaches. The emergence of dynamic, networked, web-centric environments has created new opportunities for theory-building and has opened new avenues of research into long-standing questions about the effect of multimedia on learning. Unfortunately, empirical evidence describing best practices and models for applying multimedia in various settings is relatively scant.

It was for this reason that the first Interactive Convergence conference was conceived. The conference, held in Prague during the summer of 2003, was devoted to dialogue about multimedia research across a wide variety of disciplines. Papers were presented by a diverse group of individuals representing more than 10 countries and many forms of multimedia research. Subjects covered were impressive in their range, from theory-building and conceptualizing to controlled studies examining multimedia effects, and almost everything in between!

Much of the focus of this collection is on teaching and learning in school settings, since these learning environments offer excellent opportunities to do research related to practice. Some chapters also focus on the design of multimedia environments and offer models for improving or radically altering the design process. In short, the chapters represent the best this conference had to offer. It is our fervent hope that this volume makes a small contribution to a burgeoning area of research.

Scott P. Schaffer and Melissa Lee Price

Locating Interactive Media Production: Reflections on New Media and its Teaching Contexts

Kavita Hayton and Tim Hudson

Abstract
The variety of courses in the UK teaching interactive or new media production is growing, each course team claiming to have come up with the perfect curriculum. This paper will offer perspectives on current interactive media production course design, including: a short historical context; research into common curriculum content; the nature of multi- and inter-disciplinary subjects; the role of technology and the industry in shaping the curriculum; and maintaining the integrity of the curriculum, discussing the lack of quality learning resources to support us and the future of single honours interactive media courses. It will also argue that a media department is the best ‘home’ for such study, and asks whether interactive media courses should be drawn back into broader media production programmes. For the purposes of this paper I will use the term interactive media to include multimedia and new media.

1. The Current Situation
A search on the UCAS web site http://www.ucas.ac.uk on 20th March 2003 found 826 degree courses that include ‘multimedia’. A further search found 819 including the words ‘new’, ‘digital’, ‘electronic’ and ‘interactive’ in combination with ‘media’. Bearing in mind that a single institution, particularly one with a modular scheme, may offer a whole raft of courses with shared components, the actual number of dedicated courses (and dedicated staff) may be smaller than these figures indicate. Universities with modular schemes and large student targets are using interactive media/digital media to enhance more conventional course offerings; I would argue that this is what University College Worcester is doing with Interactive Digital Media, http://everest.worc.ac.uk/cgi-bin/courses/course.p1?id=48. On the 22nd June 2003 there were 22 separate subjects that could be studied in conjunction with Interactive Digital Media at UCW (http://www.ucas.ac.uk). It is clear from the UCAS web site that interactive media is a popular, and therefore economically significant, area of media education. Not surprisingly, the trend continues at postgraduate level. The web sites Prospects (http://www.prospects.csu.ac.uk) and Hotcourses (http://www.hotcourses.com) both show a significant increase in new/interactive media MA courses since 1995. In 1995 the University of Luton set up its MA Media, Culture and Technology. Research into rival courses at the time revealed that the competition for courses dealing


specifically with multimedia at masters level in a media context came mainly from the University of Westminster and Middlesex University. A search on the 10th October 2002 found 33 Masters courses that offered interactive media or multimedia within a broad media context. This paper draws on personal experience of designing and teaching content for interactive media in three Higher Education institutions: the University of Luton (BA Media Production; MA Media, Culture and Technology); the University of the West of England (BA Timebased Media; MA Digital Media) and the University of Bournemouth (BA Interactive Media Production; MA Interactive Media). There will also be references to the MA Electronic Graphics at Coventry Polytechnic and the Royal College of Art in the early nineties, where I taught multimedia for a short time. The paper will also discuss a selection of other undergraduate and postgraduate course details available on the web, to explore further the variety of curricula offered under the subject interactive media.

2. Historical context
How did interactive media evolve as a subject area in British universities? In 1990 at Coventry Polytechnic, the curriculum for the MA Electronic Graphics was evolved rather than designed. Students were encouraged to learn the latest software and to use it to explore areas of personal interest. Based on the success/understanding of this model (and with the assistance of the course leader from the MA Electronic Graphics), the Royal College of Art in 1993 validated its first multimedia masters, Interactive MultiMedia, taught in conjunction with the London College of Printing (although the successful Computer-Related Design MA was already running at the RCA). Along with Middlesex Polytechnic, the RCA/LCP were at the forefront of teaching multimedia at Masters level in an art and design context in the early nineties. The University of Plymouth’s MediaLab Arts course began in 1992 and was one of the first undergraduate courses in this area, taking a multi-disciplinary approach from the start. MediaLab Arts is a BSc in the Institute of Digital Arts and Technology; it focuses on software as well as digital performance and design as key components of the course. Media courses took longer to incorporate multimedia production into their curricula. A few media/cultural studies writers began to look at the social and cultural impact of new media: Sherry Turkle (1985), The Second Self: Computers and the Human Spirit; Carolyn Marvin (1988), When Old Technologies Were New; Philip Hayward (1990), Culture, Technology and Creativity in the Late Twentieth Century; Jay Bolter (1991), Writing Space: The Computer, Hypertext, and the History of Writing; Philip Hayward and Tana Wollen, eds. (1993), Future Visions: New Technologies of the Screen; and Roger Silverstone (1994), Consuming Technologies: Media and Information in Domestic Spaces. These books were studied on


traditional media courses for their relevance to an emerging interest in the impact of computing and digitisation on society.

The most significant impact on the production aspects of course design was the invention of desktop digital video. I would argue that it was digital video software (Adobe Premiere™), and not hypertext software, that enabled multimedia or interactive media to be considered by Media and Art and Design course teams for the first time. Hypertext was anathema to the image-conscious students of the 1990s, and its study did not blossom in the UK until the advent of the World Wide Web. In 1993, along with other institutions of the time, the University of Luton had the foresight to see multimedia as an important part of a new media production curriculum. It was digital video in this case that inspired the design of the multimedia pathway. Adobe Premiere™ (1.0) enabled linear moving-image media to ‘unfix’ itself from its magnetic sprocket holes in the way that video had enabled moving-image production to be released from the literal sprocket holes of film. Being able to fracture linear time led students directly to investigating the production and consumption of media bytes, and subsequently to an interest in using programmes like MacroMind (now Macromedia) Director™. Using authoring software like Director, they could change the sequence of those media bytes and tell different or even new stories.

In 1996 Bournemouth introduced multimedia into its media production course, taking out TV, Animation and Radio, which became separate discipline areas. The course changed its name twice in the last five years before settling on the current Interactive Media Production. The original course change from Media to New Media at Bournemouth was designed to meet the perceived employment needs of the interactive TV industry in the late nineties, and was driven by TV lecturers interested in new technology. When the anticipated dramatic changes to the TV industry did not occur, the course re-focussed on the Internet and CD-ROM as delivery media.

Today computer science, art and design, and media studies can all provide an appropriate academic home for interactive media courses. I believe that media studies provides the most rounded and appropriate context for interactive media production. Media studies has a history of acknowledging the presence of the audience, and so has an advantage over art and design, where such a concept can often be oblique and imaginary. The teaching of interactive media in computer science may draw on theories of cognitive psychology and therefore acknowledge the user very directly, but barely acknowledges the influence of media institutions and products. Of course, there is now significant interchange between these broad subject areas, and this will be discussed too.


3. Curriculum Design
As long ago as 1990, Tim Oren, in The Art of Human Computer Interface Design (ed. Laurel, 1990)[1], set an agenda for the discussion of new media. Emphasising that any study of the new multimedia capabilities of computing should take into account “issues heretofore peripheral to computing”, he cited “the psychology of media, the evolution of genre and form” and “the societal implications of media biases” (Laurel, p.467). He also included issues of authorship and user engagement/reception (ibid. p.471). These issues continue to be extremely relevant in contemporary discussions around networked media forms. The Art of Human Computer Interface Design came out of the now defunct Apple Advanced Technology Group. They had the insight to place multimedia design and production into a critical context. They understood that the content produced by multimedia-enabling systems was as significant as the technology, and they understood the essential multidisciplinary nature of the production and reception of that content. As Oren pointed out, “the boundary between form and content is itself plastic” (ibid. p.469). It is because form and content are intertwined in interactive media that students need to learn different ways of creating and reading a variety of media forms, from two-dimensional picture space to the printed word to film-making.

4. Commonality and difference in the interactive media production curriculum
Studying a selection of 15 BA/BSc courses and 28 MA/MSc courses (see the list of institutions at the end of this paper) whose curricula are available on the web, one can begin to draw out common elements in the course content. The following information is based on searches done between the 24th June 2002 and 24th June 2003. A teaching context was established based on the faculty or departmental home of each course (Art and Design; Technology; or Media). A breakdown of topic titles was created for each teaching context in both undergraduate and postgraduate courses. Topic titles were edited to enable a more useful alphabetical list to be created; hence “Principles of Digital Media” becomes “Digital Media, Principles of”. Topics were then categorised as Theory, Production or Ambiguous. Topics were listed as ambiguous because the unit/module titles were unclear or because they seemed to be a genuine theory/practice subject. The percentages that follow are as accurate as one can be, allowing for flaws in the personal interpretation of a topic title. My hope is that this exercise forms the basis of further discussion and research.

A. Undergraduate interactive media taught in an Art and Design context: Not surprisingly, practical skills feature strongly here, with a total of 38 topics out of a list of 61 (62%). There are a high number of


theory/practice or ambiguous topics - 20 (32%) - and very few obviously non-practical topics - 3 (5%). Most common subjects at BA level: General interactive media/multimedia production skills; Animation; Video; Sound; Design, graphic and visualisation skills.

B. Postgraduate interactive media taught in an Art and Design context: Practical subjects feature even more strongly on Masters courses, 53 out of 80 topics (66%). Ambiguous or theory/practice subjects are fewer but still significant at 19 out of 80 (24%), and again there are few theory topics, 8 out of 80 (10%). Most common subjects at MA level: Design of interactive media; Software skills.

C. Undergraduate interactive media taught in a Technology context: Production skills feature strongly, 35 topics out of 60 (58%). The non-practical subjects mentioned vary from course to course but are mainly business and management skills; there were 9 (16%) in total, and the theory/practice or ambiguous topics number 16 (26%). Most common subjects at BA level: Multimedia authoring and design; Web authoring and design; Management skills; Sound.

D. Postgraduate interactive media taught in a Technology context: Production features more strongly at master’s level, 40 out of 61 topics (66%), and on the whole the subject is very focussed. There are few ambiguous topics, 9 out of 61 (15%), and there are slightly more theoretical topics, once again mainly based on business or management issues, 12 out of 61 (20%). Most common subjects at MA level: Multimedia/interactive media; Authoring; Systems; Technology; Marketing; Management; Business.

E. Undergraduate interactive media taught in a Media context: The study of the media itself features very strongly; there are 21 theory topics out of 51 (41%). There are 11 ambiguous or fused theory/practice topics (22%) and 19 production topics (37%), including two work placements. This excludes any accompanying optional units/modules. Most common subjects at BA level: Media studies; Multimedia production; Radio.

F. Postgraduate interactive media taught in a Media context: As with the undergraduate courses, this area is the most theory-rich, 31 out of 82 topics (38%). Once again this excludes any


accompanying optional units/modules. Production does feature more strongly than in the undergraduate courses, which could be due to the more specialised nature of master’s courses in general (remember these are all production courses). There are 35 production-based topics (43%). Ambiguous or theory/practice fused topics are 16 out of 82 (20%), very similar to the undergraduate courses. Most common subjects at MA level: Design and production of interactive/multimedia; Research methods.

Table showing percentages of each topic type set against Art and Design, Technology and Media contexts

Undergraduate
Context          Production   Theory   Ambiguous
Art and Design   62%          5%       32%
Technology       58%          16%      26%
Media            37%          41%      22%

Postgraduate
Context          Production   Theory   Ambiguous
Art and Design   66%          10%      24%
Technology       65%          20%      15%
Media            43%          38%      19%
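The percentage figures in this section are simple rounded shares of the topic counts reported for each teaching context. As a minimal illustrative sketch (the counts are those given above; the function name `shares` is my own), the calculation can be expressed as:

```python
# Illustrative sketch: compute the percentage share of each topic
# category (Production / Theory / Ambiguous) within a teaching context.
# The counts below are taken from the figures reported in this section.

def shares(counts):
    """Return each category's share of the total, as a rounded whole percentage."""
    total = sum(counts.values())
    return {category: round(100 * n / total) for category, n in counts.items()}

# Undergraduate, Media context: 19 production, 21 theory and
# 11 ambiguous topics out of 51 in total.
ug_media = shares({"Production": 19, "Theory": 21, "Ambiguous": 11})
print(ug_media)  # → {'Production': 37, 'Theory': 41, 'Ambiguous': 22}
```

Note that because each figure is rounded to a whole percentage independently, the three shares for a context may not sum to exactly 100.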

It is of interest to note that some of these media courses sit within a modular scheme (Luton, Sunderland, East London) where there is an extensive and interesting menu of complementary subjects for students to choose from. (This complementary menu of subjects is distinct from the non-media subject combinations mentioned earlier in connection with University College Worcester.) The centrality of production skills is apparent for all courses regardless of context. Sound is an integral part of Art and Design and Technology based courses, but virtually the only audio teaching in undergraduate Media appears as radio (this is borne out by personal experience too; sound design for multimedia is unfortunately poorly covered in our courses). The theory/practice ratio is the most balanced in Media; however, Art and Design courses may not advertise the content of contextual studies on their web sites, so this needs further investigation. The most emphasis on managing multimedia is in Technology based courses. Project management is often cited as a strength of media courses, but this is not borne out by this research. Project management and the study of the broader media, in terms of institutions and social impact, are virtually absent from the Art and Design courses mentioned here. It can be argued that the preponderance of theory in media based courses is historical, as many early media production courses grew out of


Media or Communication Studies (Media Production at Bournemouth grew out of an English and Media course). Underpinning the curriculum with media theory topics such as History of Mass Media; Interface Culture; Issues of Gender in Visual Culture; Language and Power etc. (actual unit titles) can only enrich rather than detract from student production work. It is highly debatable that the production work produced on these courses is of a lower quality due to lack of “hands on” experience. In my personal experience, excellent work produced on a media-based course is often more rounded and more carefully researched than its equivalent in an art and design context. Considering that these media students often have a heavily weighted dissertation at level 3 (at Bournemouth 33%, plus additional written analytical work 13%), gaining an overall 1st class honours degree is a real challenge.

5. Multi- and inter-disciplinary subjects and the possibility of entirely new topics
The list of course units/modules in this study confirms the multidisciplinary roots of interactive media, but are there new subject areas being taught that are genuinely interdisciplinary? There are only a few genuinely interdisciplinary topics in the 15 undergraduate courses mentioned earlier. Art and Design and Technology courses seem to have taken topics from each other, and have therefore expanded their multidisciplinary offerings, but there are not many interdisciplinary subjects. Masters courses appear to have more coherent and focussed curriculum content; there is less ambiguity in the subjects and certainly more interesting (and in some cases more adventurous) topic titles. The new Digital Futures MA at Plymouth University is a case in point: unit titles include Invisible Architecture, Histories and Futures, Liquid Media and Synthesis. Could these units be genuinely interdisciplinary?

The future of successful interactive media course design lies in being able to identify new hybrid subject areas rather than relying on the multidisciplinary approach we have now. For example, could information design for interactive media have the potential to become a hybrid subject - part graphic design, part cognitive psychology, part user interface design? What is the potential for an inter-disciplinary area in user/audience research? Perhaps it would involve the study of statistics linked with theories of audience reception and their engagement with networked interactive media? This could be a fusion of traditional theories of audience using statistics and demographics, linked to theories of identity and representation informed by cognitive psychology and the study of our relationship to technology/machines.


6. External influences on the curriculum - how much should we shape the curriculum to meet the needs of industry?
There have been several attempts to assess the future needs of the interactive media industry. Within these assessments, the role of universities in meeting these needs has also been discussed. The SEEDA publication Skills for the Digital Media Industry (June 2000)[2] states that there are few obvious skills gaps in the industry, but that highly skilled and versatile people are in great demand. This report also states that higher education has so far had very little to offer the digital media industry in terms of learning: “Qualifications are of secondary value” (SEEDA p.24). The report finds that higher education offers transferable and enduring skills, but that the new media industry is not currently interested in those (ibid. p.15). This is borne out by a report on young i-professionals, UK i-professionals - Education, Training and Development Audit (2002)[3], undertaken by Dr. Elaine England. The report is the result of 50 interviews conducted with young i-media (interactive media) professionals about their education, training and development. There was a majority viewpoint that client/business skills had been absent from their higher education, to the detriment of their training as professionals. Nearly all interviewed felt they had had to learn on the job: “The greatest number of comments related to the gap between business and study, and how to bridge this in some way” (England p.19). A further two reports - The Creative Industries Task Force Inquiry into Internet Use[4], published in February 2000 by the Department of Culture, Media and Sport, and the Identifying Functions relating to the Computer Games Industry project report, published by SkillSet in 2002[5] - show that recruitment is seen to be a major issue for the industry after its (then) recent rapid growth. SkillSet is keen to be involved with setting standards for any training needs associated with recruitment, and they do see Higher Education as having a role to play here.

The Media Employability Project 2002[6], run by four universities, conducted qualitative research by interviewing media students, graduates, employers and lecturers, and also found that employers valued the trainability and flexibility of their employees more than their technical skills. The report states that a “balance of intellectual analysis and media production elements, made graduates multi-skilled and flexible” and that “skill development happened across the theory-practice divide.” The report goes further to say that their research showed employers believed that “critical engagement with major thinkers, debates and intellectual paradigms was thought provoking and made students more open minded”. The skills valued by media employers are therefore concomitant with those valued by academics - the ability to contextualise one’s production, to be trainable and flexible, and to gain the transferable skills needed for management. Initial research shows that Technology based


courses are fulfilling this need in terms of offering management skills, but that Media courses are far ahead in teaching students how to place their work within the larger contexts of industry practice, current critical theory and analysis of personal development. However, the extent to which we design courses to supply the labour market is often determined by pressure from internal sources too. The growth in vocational courses (such as Foundation Degrees) and the need to engage with enterprise funding mean that new universities face real financial pressures to work closely with industry. Personal preference in course teams may also determine how explicitly vocational the curriculum is. We need to solve the conundrum of taking heed of industry needs whilst keeping academic integrity. Universities offer opportunities for personal growth and reflection first and foremost, and this should also be respected by industry.

7. How much is the curriculum shaped by technology?
There are a few significant problems associated with all technology based courses regardless of teaching context. The following is a list of issues related directly to hardware and software. Is the equipment used on the course of a high standard, and is it appropriate to the aspirations of the curriculum? Are the staff competent to teach students how to use the resources? How much control do course teaching teams have over equipment purchases? Are assessments testing software competence or other more ephemeral criteria? Are students choosing courses based on comparative technical resources, and is this because it is easier for people outside an institution to judge these resources than to judge teaching quality? While there is clearly more research to be done in this area, anecdotal evidence (my experience at Luton, UWE and Bournemouth) suggests that most equipment on degree courses is of a comparable standard, but that there is a greater difference in staff competences. Courses fortunate enough to have software demonstrators (as we do in Bournemouth) have staff dedicated to constantly upgrading their skills. In teaching situations where lecturers have been responsible for technical demonstrating, we have found that the burden of assessment, administration, curriculum design and constant meetings seriously affects our technical skills development.

8. Maintaining curriculum integrity - quality teaching resources
There are other difficulties facing interactive media course designers within any academic context. There is an impoverished supply of good academic sources and few records of the historical development of design for CD-ROM or the web. Compared with the sources we can draw on for the teaching of video and film production, for example, good books in the field of interactive media production are rare. A simple request to fellow


course leaders of interactive media in 7 different institutions for their favourite production books revealed that we are resourceful when it comes to choosing teaching materials, but also that most of our books were over 4 years old and some were very old indeed. This is their list:

Curt Cloninger, Fresh Styles for Web Designers: Eye Candy from the Underground (New Riders) 2001
Bob Cotton and Richard Oliver, Understanding Hypermedia 2000: Multimedia Origins, Internet Futures (Phaidon Press, London) 1997
Mark Elsom-Cook, Principles of Interactive Multimedia (McGraw-Hill Education) 2000
Elaine England and Andy Finney, Managing Multimedia (Addison-Wesley) 1996, revised 2001
Douglas Hofstadter, Gödel, Escher, Bach: An Eternal Golden Braid (Penguin Books) 1980, anniversary publication 2000
Bob Hughes, Dust and Magic: The Secrets of Successful Multimedia Design (Addison-Wesley) 1999
Richard Lanham, The Electronic Word: Democracy, Technology and the Arts (University of Chicago Press) 1995
Brenda Laurel, Computers as Theatre (Addison-Wesley) 1991
Brenda Laurel, ed., The Art of Human Computer Interface Design (Addison-Wesley) 1990
Lev Manovich, The Language of New Media (MIT Press) 2001
Mullet and Sano, Designing Visual Interfaces (Sun Microsystems Inc.) 1995
Janet Murray, Hamlet on the Holodeck: The Future of Narrative in Cyberspace (MIT Press) 1997
Donald A. Norman, The Design of Everyday Things (Basic Books) original 1988, revised ed. 2002
Robert Pirsig, Zen and the Art of Motorcycle Maintenance: An Inquiry into Values (Bodley Head) 1974, latest publication 1999
Oliver Sacks, The Man Who Mistook His Wife for a Hat (Picador) 1986
Tom Standage, The Mechanical Turk: The True Story of the Chess-playing Machine that Fooled the World (Allen Lane The Penguin Press) 2002
Edward Tufte, Envisioning Information (Graphics Press UK) 1990
Tay Vaughan, Multimedia: Making it Work (Osborne McGraw-Hill) 1994, revised 1998
Jeffrey Veen, The Art and Science of Web Design (New Riders Publishing) 2000
Lynda Weinman, Deconstructing Web Graphics 2.0 (New Riders Publishing) 1998
Jeffrey Zeldman, Designing with Web Standards (New Riders Publishing) 2003


This is an admirable list, but it shows clearly that there is a need for new books of comparable quality to Norman’s The Design of Everyday Things and Tufte’s Envisioning Information. We especially need books that bridge the gap between accepted production techniques and new media theories. For example: designing for two-way communication, constructing socially sensitive avatars and guides, writing multiple narratives, playing/gaming design, understanding and exploiting the role of memory, dealing with the history of human and machine relationships, and sound design for interactive media. In other words, books that help students apply new media theories to the making of work rather than the critiquing of it. There are literally thousands of software manuals and web design ‘how to’ books. These books should be handled with care. Students need to learn to discriminate between sources, and they need better books.

9. Keeping Subject Focus
One of the most gratifying aspects of the research into curriculum content was to see that there was more commonality than difference in the topics offered. Interactive media now has a unique if nascent academic identity: interactive media production skills; project management skills; design and visualisation skills; some sort of contextual understanding; and skills in using moving images appear in nearly all courses in any institution. These topics are not mutually exclusive: a good course in interactive media must include a version of all these skills. Can a course that runs as a joint degree with a quite different academic subject cover the full range of skills needed in 50% of the time? Joint degrees have been running successfully for many years, but the number of courses offered as combination degrees has increased greatly with the introduction of modular schemes in new universities. As the research shows, interactive media is already a multi-disciplinary subject in itself; does a further fracturing of curriculum content give students a partial view of the theories and practices of the subject? A joint degree does not often offer a genuine fusion of two subjects. The course team does not know who will take up what combination, so there are no taught units that specifically tie the two subject areas together. What does Interactive Digital Media with Plant Sciences prepare students for? This paper argues that there should be more subject fusion across current interactive media topics rather than a further dilution of the curriculum.

10. The future of single-honours interactive media courses
This paper began with an assumption: that it would be too difficult to finance single-honours interactive media courses. The teaching expertise needed is spread too broadly to include in a normal sized course


team of perhaps 10 lecturers/demonstrators (1 to 15 students in a BA course of 150). This original assumption was based on knowing and understanding the multidisciplinary nature of interactive media. However, it has been interesting to see the few but significant topics that show a promise of subject fusion and that take an interdisciplinary approach: Interface Culture; Typography for Screen; Discourses on Interaction; Enabling Technology; Digital Art Practices. My argument now would be for academics to work together to write those missing books, to design new units, to hold conferences, and to undertake practice-based research that addresses the issue of interactive media as a subject specialism in its own right; in other words, to create a genuinely interdisciplinary subject area. The role of moving images in networked media must also be reconsidered. The advances of broadband technology will ensure that video production skills have a new significance in interactive media production. "About 40% of UK homes now have Internet access. 8% of British Internet users now surf the net with a fast access/broadband connection, a rise of 60% when compared to six months ago. With almost one-third of Internet users claiming to be likely to adopt broadband at home, [Oftel] forecasts that in the region of 15% of home users will be surfing using a broadband connection in 12 months' time." Reported by Oftel (http://www.oftel.gov.uk/publications/research/ as at 2 November 2002). The transformation of traditional linear media into interactive forms (re-purposing material) will continue apace, and students will need expertise in old media too. Adaptation skills will need to be developed by lecturers and taught to students. It is in these subject areas that the media-based courses have an advantage: video production and the analysis of moving images are staples of media courses.
Adaptation, although relatively new as an academic subject area, is historically more akin to Media than to either Art and Design or Technology. At Bournemouth there is theory and practice teaching on adaptation, currently delivered in separate units to the same cohort of students. This would be an ideal subject to develop as an interdisciplinary topic, taking the analysis of existing texts straight into production. In conclusion, this paper has argued that the future development of the subject area of interactive media production depends on maintaining a balance between theory and practice, and has shown that currently Media provides this balance most successfully. However, both
Art and Design and Media can learn from Technology-based courses, where management and business units/modules are providing industry with the skills it needs. We should relish the opportunity to develop new subject areas, such as streaming video and adaptation, which need to be reshaped for interactive media. Course teams should re-conceptualise the current multidisciplinary approach of courses by writing and delivering entirely new interdisciplinary units/modules. In doing so, interactive media production can become a truly interdisciplinary academic subject in its own right.

Notes

1. Oren, Tim. "Designing a New Medium." In The Art of Human-Computer Interface Design, edited by Brenda Laurel, 467-479. Addison-Wesley, 1990.
2. SEEDA and Human Capital. Skills for the Digital Media Industry - Research and Recommendations for the South East of England Development Agency. Final Report, June 2000.
3. England, Elaine. UK i-professionals - Education, Training and Development Audit. ATSF Ltd. in conjunction with the BIMA, 2002. Details from www.atsf.co.uk/atsf
4. Department of Culture, Media and Sport. Creative Industries - Internet Inquiry: 'Snapshot of a Rolling Wave' - The Report of the Creative Industries Task Force Inquiry into the Internet. February 2000. PDF available at www.culture.gov/internetinqpdf/
5. SkillSet. Identifying Functions Relating to the Computer Games Industry project. Report published by SkillSet, 2002.
6. Media Employability Project, 2002. A joint project between the University of Sunderland, Sheffield Hallam University, De Montfort University and the University of Central England. The project's aims are: to identify skills and attributes (specific and transferable) which can be defined as enhancing the employability of Media Studies graduates; and to identify those elements of curriculum and pedagogic practice which deliver these skills.




The Difficulty in Communicating with Computers

Bertil Ekdahl & Lise Jensen
Abstract
In several areas of computer science, such as agent technology and artificial intelligence, it is tacitly understood that computers can be made intelligent enough to truly communicate knowledge. We argue that this is a vain expectation, and we show that computers will never be able to transfer semantics. They will only be able to transmit signals, or data, without understanding. Computers are not intentional devices. Our arguments are entirely linguistic and are the result of considering computers as linguistic systems. We also take a fresh look at the design process of computing systems and argue that here, too, the machine perspective should be abandoned in favour of a linguistic approach. This understanding implies that social and cultural considerations should be taken more seriously and be regarded as the most important property of the design process.

1. Introduction
In this paper we make a distinction between signalling and communication. The latter takes place in a context where knowledge is transferred to a receiver that is able to share that knowledge. Signalling proceeds on a level where the sender or receiver, or both, are unaware of the content of the signal.[1] Communication is a conscious action and proceeds particularly in language, but it may proceed in a way that superficially can be confused with mere signalling. For example, if you meet a colleague who pulls a face because she does not like you, you are not unconscious of the meaning; you have understood the message. Signalling may be very advanced without being categorized as communication in the sense above. For example,[2] the red colour of an apple signals that it is ready to be eaten. On the surface it may seem as if there is an intention, but that is only a delusion. The apple tree has no knowledge to transmit. The intention of the system is a consequence of natural selection, not of any individual activity or manifestation of will on the part of the apple tree.

[1] For extensive arguments, see Sjölander, 1997.
[2] The example is from Sjölander, 2002.

Furthermore, the animals that choose the red apples react innately to the colour. They do not possess any knowledge of the matter,
but they have access to an embedded, unconscious interpretation process which natural evolution has provided. Ideas and knowledge are communicated by transmitting a description, consisting of finite strings of symbols, which are furnished with meanings by an interpretation process. So, in order to communicate by means of a language, the associated interpretation process has to be mastered by both sender and receiver. Considering the interpretation process as part of the language is a holistic conception of language, meaning that a language cannot be fragmented into parts without losing its essence. For formal languages, the view is quite different. Here a language is defined as a set of strings of symbols, making a language a purely syntactical entity. Programming languages should be understood in this restricted sense. In this paper we argue that computers cannot communicate but only transmit signals, on a level comparable to that of the most rudimentary insects. This shortcoming cannot be overcome; it is an inevitable limitation, because the meaning (interpretation) of a description cannot be attached to the description itself. Moreover, we show that this communicative restriction between computers is also partly valid for communication between human beings and computers: people do not share the programmer's intended interpretation of the program. Hence, the most challenging effort in software design is to make the program's interpretation as explicit as possible, but this requires a good understanding of how language works.

2. The Machine Metaphor
According to Michael Denton (2002), almost all professional biologists today have adopted the mechanistic/reductionistic approach and assume that the basic parts of an organism are the primary essential things. The holistic (vitalistic) view is rejected and it is presumed that living organisms are no more than the sum of their parts.
The parts determine the whole, and a complete description of all properties of an organism may be had by characterizing its parts in isolation. This is the same approach that has permeated physics for a long time and has been very successful. It presupposes that nature can be described in isolated parts and then understood by putting those parts together. Is this fragmentable paradigm a property of nature, or is it a property of language? The question is reasonable, since every description is made in a language, be it natural or formal. Given the difficulties in quantum mechanics, there are reasons to argue that the fragmentable principle is applicable only in classical physics and not on the quantum level. It is this "macro" perspective that, according to Denton, pervades biology. This is also the approach taken in the field of 'artificial life' and in some
directions in cognitive science, for example the belief that the human brain can be emulated by neural nets and the belief that computers can learn like human beings. The unpleasant aspect of the holistic view is that it tells us what cannot be done, while the fragmentable view makes promises because it tries to tell us what can be done; it is thus a much more appealing view. We have seen that the mechanistic idea has had, and still has, a significant appeal in artificial intelligence and in so-called agent research. We think it is fair to say that no progress has been made, either in artificial intelligence or in agent research, concerning intelligent or autonomous agents.[3] Despite the discouraging results, the mechanistic view is still viable. For example, Pattie Maes' (1997) view is that agents differ from ordinary software in that an agent is personalized. This means, among other things, that an agent is proactive, which in her view means that it can take its own initiative rather than only react to events. According to Maes, another difference between current software and software agents is that agents can run autonomously while the user goes about doing other things. She also argues that the reason to call it an agent is that the software agent's actions are based on its knowledge of the user's preferences. Here Maes, though in good company, seems to overlook the very nature of autonomy. It is not only knowledge of the user that is of concern for an autonomous agent, but the possibility of referring to itself. Autonomy, with its reference to self, refers to some language, because reference is a linguistic phenomenon. In Maes' case, autonomy refers to a well-known language, viz. a programming language, or in this context we may speak of the programming language, since the expressive power is the same in all programming languages.
However, according to Tarski (1956), no language can completely free itself from external influences, meaning that a metalanguage is necessary to understand complete autonomy. Hence, the autonomous agents that Maes refers to are given an operational and objective description in a mathematical or formal language, which leaves the understanding of the autonomy outside the description. So here we see that the machine metaphor does not succeed in describing autonomy, because it leaves out of account the language in which the autonomy is described. Ray Kurzweil (1999) goes several steps further than Maes, arguing that by reverse engineering the brain we may create computers that are much more intelligent than the person whose brain is transferred. This is only a question of time, not of biological hurdles; we only have to await the progress in nanotechnology. Kurzweil (1999, pp. 220-222) predicts that in 2029,

[3] For a thorough criticism of the agent idea, see Ekdahl, 2001.


the vast majority of "computes" of nonhuman computing is now conducted on massively parallel neural nets, much of which is based on the reverse engineering of the human brain. Many - but less than a majority - of the specialized regions of the human brain have been "decoded" and their massively parallel algorithms have been deciphered. […] The machine-based nets are substantially faster and have greater computing and memory capacities and other refinements compared to their human analogues. [T]here is extensive use of communication using direct neural connections. This allows virtual, all-enveloping tactile communication to take place without entering a "total touch enclosure" […] The majority of communication does not involve a human. The majority of communication involving a human is between a human and a machine.

By 2099, Kurzweil anticipates, the reverse engineering of the human brain appears to be complete. Furthermore (p. 234):

Even among those human intelligences still using carbon-based neurons, there is ubiquitous use of neural implant technology, which provides enormous augmentation of human perceptual and cognitive abilities. Humans who do not utilize such implants are unable to meaningfully participate in dialogues with those who do.

Kurzweil's belief that computers can be as intelligent as humans is in no way unique; he shares it with many researchers in artificial intelligence, cognitive science and agent technology. However, this view is deeply flawed, resting on the belief that life can be reduced to physics. Life may instead be considered as the interplay between the genotype, that is, the genetic language, and its interpretation (the phenotype); accordingly, we may recognize life as language[4] and reformulate the question to whether language can be reduced to physics. We will show that this question cannot be answered affirmatively and, consequently, that computers will not be able to communicate in the sense discussed above. The non-reducibility of language to physics also implies restrictions on the "communication" between computers and humans.

[4] Compare with Löfgren, 1981.


3. Conception of Language
The human race is the only species that creates its own reality. Knowledge of the surroundings is in many respects a consequence of our existential perceptions. We do not know the world but have to create it in terms of existential objects. Our perceptive mechanism is connected to the cognitive structure. Meaning comes out of the cerebral description mechanism and is consequently perceptually grounded. That is why one may, at one's own discretion, create a mock world, good or bad. There would be no point in having a well-developed faculty for conceptualising the surroundings if we were not capable of communicating these concepts. Here language comes into play. Language allows us to communicate ideas, which implies the possibility of having a social meaning. Without language, we each live in our own separate mental world. With language, we can share the world inhabited by others, and it is possible to influence the mind of another individual. Considering the aim of language as a conveyor of cognitively created concepts, how is it possible that subjective meanings become objective facts? The same question is raised by Gärdenfors (1993, p. 290):

But, if everybody can mandate his own cognitive meaning, how can we then talk about the meaning of an expression? And how can somebody be wrong about the meaning?

Gärdenfors claims that meanings are individually constructed, but the social meaning of a locution is not determined by the mental conceptual structure of a single individual. Instead the joint (social) construction, together with a semantic power structure, determines the social meaning. Language is the glue of the social behaviour of human beings. It allows us to exchange knowledge amongst ourselves so that the whole community becomes wrapped up in the same set of beliefs. Without language, this "same set of beliefs" has to be part of the species' phenotype and built in by evolution.
However, a language is superior to the built-in case, both because the built-in set is of necessity restricted to a small part of the environment, and because a language makes it possible for the individuals themselves to change the social meaning. The built-in cases can be regarded as "algorithmically" steered in the sense that the behaviour follows a strict algorithmic pattern. When we try to describe a language we cannot proceed in the same way as we do when we describe physical phenomena. A language cannot be detached from the ideas and social surroundings in which it has evolved. Thus, we are stuck in a self-referential predicament from which we cannot liberate ourselves. This is a problem that does not appear when
we investigate a physical reality. Here we encounter a reality that is nonlinguistic; that is, the sentences in which the physical reality is described do not belong to the domain of physics. The fragmentation method, so successfully applied in physics, is not appropriate for language. We think that the lack of a true understanding of this self-referential state of affairs is the reason why researchers still believe that computers can be compared with human thinking and can even surpass human intelligence. However, when we describe language, for example in education, we treat it in the same way we treat physical phenomena; namely, we try to objectify that which can be distinguished as observable entities. Such objects exhibit easily separable form and order, such as grammar and sentences. With the semantics of a language it is different. When we leave the form and turn to meanings, we rely on an understanding beyond the language in use. We presuppose and refer to an interpretation in a social context in which the semantics can be explained. Unlike classical physics, language is impossible to objectify completely in itself. Understanding of a language is understanding of both form and meaning in a complementary conception in which fragmentation into parts does not succeed. It is a thesis of Lars Löfgren (1994, p. 158) that for every language, sentences and interpretations are complementary in the sense that both significantly enter the very idea of language and that neither can replace the other entirely. In its general conception, language is a whole of complementary description and interpretation processes. In this conception language consists of four parts: description, interpretation, interpretation process and description process. In the holistic conception of language, mastering a language is primarily knowledge of its interpretation and description processes. The undescribability of the interpretation process may be illustrated by the following example.
Suppose you found a message sent in a bottle, and suppose furthermore that it was written in Swahili and you did not know Swahili. Then it would be of no help for the message to start with "this is written in Swahili", written of course in Swahili. There is no way in which the sender can communicate to the receiver that the message is written in a particular language. The language already has to be comprehended in order to know that the message is in fact written in a language. This comprehension of language is the knowledge of the interpretation process in the language. In language, ideas are communicated by transmitting strings of symbols that must be finite, otherwise the transmission would not terminate and the message would never be accepted. Also, a description is always time-independent. How then can it be that infinity can be described by finite strings, and change by constancy? The answer is that the interpretation process, which goes from description to reality, brings nondescribable, generative properties to the description.

Bertil Ekdahl & Lise Jensen

21

We may argue for the processual conception of language in the following way. Suppose that in a language we may, in the classical sense, consider its description D and its interpretation M as preconceived, and R, classically conceived, as a relation between D and M. Then classical definability, that is, definability in the mathematical sense, prevents R from belonging to D or to M. Thus it is vain to ask for a language which can describe itself. If instead R is regarded as a process, we can define R in terms of a schema, S, that tells how S is going to operate on the description D. (We may here compare with the instructions of a Turing machine.) In this case we do not meet any contradiction. The complementarity in language, as formulated by Löfgren, can also be compared with Niels Bohr's (1934, 1958) view of complementarity in the language of quantum mechanics. There is a quantum wholeness, which implies that the world cannot be analysed into independent and separately existent parts. Bohr treated the entire process of observation as a single phenomenon, a whole that is not further analysable. For Bohr it was evident that the mathematics of quantum mechanics is not capable of providing an unambiguous description of an individual quantum process. Bohr also realized that no new concepts would remedy the situation: there will always be ambiguity in describing the reality of the individual quantum process. This can be seen as the physical counterpart of Gödel's first incompleteness theorem, which says that there is normally no complete axiomatization of a theory.[5] There will always be true sentences that cannot be proved to be so. Therefore, there is no way to understand what happens in such a process. Only in the Newtonian limit is it possible to obtain a picture of what is happening.
When it was objected to Bohr that reality is more fundamental than language and lies beneath language, Bohr answered, "We are suspended in language in such a way that we cannot say what is up and what is down" (quoted in Löfgren 1994, p. 159). The interpretation of a language gives us its concepts, and the concepts we perceive give us the language. That is why we cannot objectify language as is possible in (classical) physics. We cannot go outside the language but have to stay within it; we are imprisoned in our own language.

[5] If the theory contains at least ordinary arithmetic.

4. Computer Languages
According to Wigner (1960), in mathematics the principal emphasis is on the invention of concepts. This seems reasonable if we consider the development of mathematics, particularly over the last seventy years. For
example, in about 1850, George Boole was developing methods for solving certain differential equations by applying algebraic methods; nowadays, engineering and science students typically learn some of these methods in their first or second year at university. Despite the evolution of mathematics, its most characteristic feature since Aristotle has been the method of proof. After all, the whole idea of mathematics is that of proving theorems: for example, the question whether a certain relation between the sides of a right-angled triangle could be proved in Euclidean geometry. This question was answered positively by Pythagoras (Pythagoras' theorem). In a precise formulation of mathematics it is necessary to explain when some particular formula is considered a logical consequence of a set of premises. When we say that Pythagoras' theorem is a logical consequence of Euclid's geometry, we must have a proof that everyone can agree upon and that is independent of how the symbols used are interpreted. In order to achieve this it is necessary to detach semantics from formal mathematical reasoning: "strict formalization of a theory involves the total abstraction from the meaning, the result being called a formal system […]" (Kleene, 1952, pp. 61-62). The purpose of a formal system is to be able to conclude things that are generally applicable and completely independent of particular interpretations. (We may contrast this with political speech, in which conclusions are based on a certain interpretation that can be the object of criticism.) Consequently, the emphasis of a formal system is on formal proofs and the mechanism to realize them. Of course, when a formal system is constructed there is an intended interpretation for the system, but once the system is established we must not use this knowledge. Now, a formal system, as an abstract formal proof system, consists of three parts:

(i) A formal (symbol) language in which things are stated.
(ii) Some postulates (axioms) to start from.
(iii) Rules of inference: the proof mechanism.
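As a concrete toy instance of (i)-(iii) (an illustration added here, not an example from the chapter), consider Hofstadter's MIU system from Gödel, Escher, Bach: the formal language is the set of strings over the symbols M, I and U; the single postulate is the string "MI"; and four purely syntactic rewriting rules serve as the rules of inference. A sketch in Python shows that "proving theorems" is then nothing but mechanical string manipulation, with no appeal to meaning:

```python
def successors(s):
    """Apply every applicable inference rule of the MIU system to the
    string s, purely by pattern matching on its form."""
    out = set()
    if s.endswith("I"):                 # rule 1: xI  -> xIU
        out.add(s + "U")
    if s.startswith("M"):               # rule 2: Mx  -> Mxx
        out.add("M" + s[1:] * 2)
    for i in range(len(s) - 2):         # rule 3: III -> U
        if s[i:i + 3] == "III":
            out.add(s[:i] + "U" + s[i + 3:])
    for i in range(len(s) - 1):         # rule 4: UU  -> (deleted)
        if s[i:i + 2] == "UU":
            out.add(s[:i] + s[i + 2:])
    return out

# Mechanically enumerate theorems, starting from the single axiom "MI".
theorems = {"MI"}
for _ in range(4):
    theorems |= {t for s in theorems for t in successors(s)}

print("MIU" in theorems, "MII" in theorems)  # True True
```

The program applying these rules has no access to any interpretation of M, I and U; in the terminology of this chapter, it signals rather than communicates.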

Some comments are in place. First, a formal language is supposed to be completely specified when all its symbols and formulas are specified. This is not a language in the colloquial sense; it makes language a purely syntactic object. Defining language in this sense is of course a direct consequence of the idea that formal systems can be manipulated without reference to meaning. However, most, probably all, formal languages have a meaning, but this meaning lies outside the system itself. Second, some formulas in the language are called postulates (axioms) and are the point of departure for the proof mechanism. From the point of view of the formal system, these formulas may be arbitrarily chosen, but normally they are selected with a certain interpretation in mind in which the
formulas are considered true. Third, the inference rules must generate conclusions that are logical consequences of the set of postulates; no one should be able to refute inferred conclusions on formal grounds. We will not dwell further on formal systems. What is of interest for us is that the notions of formal system and computer are equivalent. In terms of Turing machines, Kurt Gödel, a forerunner in the analysis of formal systems, expressed the correspondence in the following way (Wang, 1996, p. 204):

[A] formal system is nothing but a mechanical procedure for producing theorems. The concept of formal system requires that reasoning be completely replaced by "mechanical operations" on formulas in just the sense made clear by Turing's machines. […] Single-valued Turing machines yield an exactly equivalent concept of formal system.

For our purposes, we consider computers as Turing machines.[6] The equivalence then tells us that a computing system is a completely mechanical system in which no semantics is involved, and furthermore in which no semantics can be involved, because that would break the rules of a formal system. Moreover, the equivalence implies that a computer will be able to determine semantic differences by syntactic means only in cases where there can be no ambiguities, and often there is no relation between meaning and syntactic form. For example, the differences between the following statements, easily recognized by humans, will never be entirely grasped by any computer (Devlin, 1997, p. 269):

- Susan saw the man in the park with a dog.
- Susan saw the man in the park with a statue.
- Susan saw the man in the park with a telescope.
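The point can be made concrete with a small sketch (illustrative code added here; the noun-masking rule is invented for the illustration): if the content nouns are masked out, the three sentences collapse into one and the same syntactic skeleton, so no procedure that inspects form alone can recover the different attachments a human reader effortlessly makes.

```python
SENTENCES = [
    "Susan saw the man in the park with a dog",
    "Susan saw the man in the park with a statue",
    "Susan saw the man in the park with a telescope",
]

def syntactic_skeleton(sentence):
    """Replace every content noun with the placeholder N, leaving only
    the syntactic frame of the sentence."""
    nouns = {"Susan", "man", "park", "dog", "statue", "telescope"}
    return " ".join("N" if word in nouns else word
                    for word in sentence.split())

skeletons = {syntactic_skeleton(s) for s in SENTENCES}
print(skeletons)  # {'N saw the N in the N with a N'}
```

All three sentences yield a single skeleton: whatever distinguishes them for a human reader is not present in their form.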

Now, the question whether computers are able to communicate can easily be answered by asking the equivalent question for formal systems. Since no semantics is involved in formal systems, communication in the sense discussed above, namely communication in which understanding is involved, is obviously impossible. Computers are merely signalling devices, and whether a computer "grasps" a received signal depends on an agreement between the programmers of the computing system. As illustrated in figure 4.1, communication between computers may be considered, at best, as communication settled beforehand by the programmers.

[6] The difference is that Turing machines have infinite memory.


[Figure 4.1: "Communicating" computers. Two computers, each running its own program, exchange signals (data); the "agreement" between them is settled by the programmers.]

On the computing level, the program for receiving the data can always be considered a select-statement:

    input(signal)
    select (depending on the signal)
        action 1: …
        action 2: …
        …
        action n: …
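A runnable sketch of this select-statement (added for illustration; the signal codes and action strings are invented) makes the point concrete: the receiving program maps each incoming signal to an action fixed in advance by the programmers, dispatching purely on the form of the signal.

```python
# Hypothetical signal codes and actions, agreed upon beforehand by the
# programmers of sender and receiver.
ACTIONS = {
    0x01: "action 1: acknowledge",
    0x02: "action 2: store payload",
    0x03: "action 3: reject",
}

def receive(signal):
    """Dispatch on the bare form of the signal; an unrecognized signal
    falls through to a default branch. No meaning is consulted."""
    return ACTIONS.get(signal, "default: ignore")

print(receive(0x02))  # action 2: store payload
print(receive(0xFF))  # default: ignore
```

The table lookup is the entire "understanding" the receiver has: change the agreed codes on one side only, and the signals are still accepted, but the prearranged correspondence is silently lost.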

As is clear, this corresponds to the simplest of all automata, the finite automaton, for which no one can reasonably claim that semantics is involved.

5. Design of Computer Systems
Computer systems hold a unique position among engineered systems. They exist only as a "drawing" (formal system or program), say on a sheet of paper, and are not materialized in other forms, as are other engineered products such as machines and buildings.[7]

[7] We are even doubtful whether software design really should count as an engineered product.


Formal systems are only momentarily "visible", as the actions of the system, and it is the inference rules that give rise to those actions. To each inference rule there is attached an action, namely the production of new sentences from old, and this action goes beyond what can be completely described in the language. Inference rules can therefore not be described in terms of axioms (sentences in the language). To understand the difference between an inference rule and the attached action we may compare with a rifle: in order to shoot, the rule says that you have to pull the trigger, but the action itself is not part of the rule; the action cannot be applied by the system (the rifle) itself. Applying an attached action to a rule is in fact an act of interpretation, so we may say that the computer is an interpreter of the language of the formal system.

An action is real and vanishes as soon as it is accomplished. The next time the computer interprets an instruction (applies an inference rule) a new action takes place, independently of the previous action; the old action is not even involved. Thus it is not the different actions that make up the system; the system is still the description on the "sheet of paper". Hence we cannot "see" a computing system; we can only form an idea of it by observing the different actions. The problem is that while a materialized system can be scrutinized in detail, this is not possible for a computing system, since there could be virtually infinitely many actions, of which only a few can be tested. The results of these actions may be pictures, numbers or sentences in a natural language; the users do not see the whole formal system but only the results of separate actions.

So, when constructing computing systems, or more correctly software systems, we have in principle two languages to consider. First, the programming language, which has to be correctly interpreted by the computer. Second, the pictures, numbers or sentences, fragmented or not, which must be correctly interpreted by the users. The problem in the latter interpretation arises because the system is only partially understood by the users. A good comparison here is with a movie. The movie is an action, performed by the film projector's interpretation of the film. When the movie is over, the "description" (the film) remains while the action itself has vanished. As with a computing system, the projector's is not the only interpretation of the film: the spectators have to make their own interpretation, which is highly culturally and socially dependent. Seeing the whole movie, which is possible, gives the entire understanding; looking at single frames of the film gives incoherence. In a way, computer users look at "single frames": they cannot see the whole "movie" but have to understand the system from detached pictures.

We may illustrate the design process as in figure 5.1. The preconditions are a set of universal constraints in the form of mathematical


and logical postulates. Upon these we have requirements specific to our aim, which impose further restrictions. These restrictions are part of our design in that they single out undesirable constructions. The non-universal requirements are the special decisions that restrict the final system. This stage is intrinsically social, because requirements (needs) do not emerge from nothing but arise in a social context. Accordingly, the designer of a computing system must have a thorough understanding of the user's social world. The ideal situation is when the designer and the user share the same category system (set of social norms), but this is not always the case. Either way, the demand for a deep knowledge of the user's social reality implies that the mere collection of facts about the given problem, together with the user's situation and the context in which the system is to operate, is a necessary but not a sufficient condition: the facts must be organized so that their inter-relationships reflect the user's social reality. This is again a linguistic process.

The next step is to develop a "mental" model, that is, an idea of how the new system should look and behave.8 In order to communicate these ideas we must be able to describe them in a communicative language, and for that we use a description process (DP). The initial stage of the model is expressed in a more or less formalized language. Going from model to description is an iterative process, since the description has to be formulated clearly enough to yield an interpretation (IP) that agrees with the original conception of all participants. When the description captures the full idea of the imagined system, it has to be transferred to a formal description, and this in fact completes the computing system. The actual system, that is, the system that the users comprehend, is dotted in the figure to emphasize its immaterial nature. The actual system is only momentarily comprehended, by the evaluation process (EP), which is also an interpretation. In the same way, the actual system can only momentarily be verified (VP) (or confirmed), by comparing the actual actions with the intended actions in the model.9

The aim of colloquial language is to transfer social meaning. This can also be said of computing systems, because the actual system shows itself to the user as fragments in a colloquial language. Companies have different social and cultural structures, and these dependencies are always reflected in their computer systems. Thus it is necessary to understand the social context in which a computing system is supposed to work.

8 Strictly speaking, this idea is a structure that will become the model once the formal system is constructed.
9 For a more elaborated explanation of the design process we refer to Ekdahl, 2002.

[Figure 5.1: The design process of a computer system, relating Requirements, Model, Specification/Formal Description and Actual System via the description (DP), interpretation (IP), evaluation (EP) and verification (VP) processes.]

By way of an example, in the seventies computational social security systems began to be developed in Sweden. Social systems are municipal matters

but they are strictly ruled by the government, so it would be reasonable to expect all cities to have the same computing system. However, this was not the case: despite a common kernel, the variations between the different cities were noticeable.

When constructing computing systems the main concern is often technical; the process is in focus. Instead we put forward computers as linguistic entities: it is a linguistic activity to design understandable systems. The main questions then become: how can we design computer systems that correctly transfer social meanings, and how can we guarantee that the users have the same interpretation (model) of the system? Too many systems fail because the programmer and the user have different social and cultural backgrounds, leading to quite different interpretations of the system, as illustrated in figure 5.2.

6. Conclusion

We think that the belief that computers can communicate semantics is an anthropocentric view, in fact very similar to the ancient attitude that language not only describes but also mirrors reality, indeed, is its own


reality. In Cratylus, Plato argues that the semantics of a language reflects the system of "natural kinds".

[Figure 5.2: Man-machine communication. A user faces the question "Agree?" while the computer merely runs its program and exchanges signals (data).]

Contrary to this view, language cannot be fragmented into syntax and semantics but has to be considered holistically. This means that semantics can only partially be described in a language itself, which in turn implies that computers will never be able to communicate in the sense that knowledge is transferred. Consequently, computers are not able to communicate but only to signal. We do not think it does any harm to talk about computers as communicating, but it should be clearly understood that this is only a metaphor: computers interchange signals (data), not semantics, and never will. Even man-machine communication, which may be thought of as one-sided communication, should, despite the poverty of a full language, be regarded as a linguistic system, however rudimentary. This in fact makes the design of a computing system the design of a linguistic system, contrary to other engineered products.


We suspect that this understanding is not widely appreciated, and this, we believe, is one of the major reasons for the poor design of many computing (software) systems.

Bibliography

1. Niels Bohr. Atomic Theory and the Description of Nature. Cambridge: Cambridge University Press, 1934.
2. Niels Bohr. Atomic Physics and Human Knowledge. New York: Wiley, 1958.
3. Martin Davis. The Universal Computer: The Road from Leibniz to Turing. New York, London: W. W. Norton & Company, 2000.
4. Michael Denton. "Organism and Machine: The Flawed Analogy". In Are We Spiritual Machines? Ray Kurzweil vs. the Critics of Strong AI, edited by Jay W. Richards. Discovery Institute, 2002.
5. Keith Devlin. Goodbye, Descartes: The End of Logic and the Search for a New Cosmology of the Mind. John Wiley & Sons, Inc., 1997.
6. Robin Dunbar. Grooming, Gossip, and the Evolution of Language. Cambridge, Massachusetts: Harvard University Press, 1996.
7. Bertil Ekdahl. "How Autonomous is an Autonomous Agent?". 5th World Multiconference on Systemics, Cybernetics and Informatics (SCI 2001) and 7th International Conference on Information Systems Analysis and Synthesis (ISAS 2001), Orlando, Florida, USA, July 22-25, 2001.
8. Bertil Ekdahl. "Design as a Linguistic Activity". 6th World Multiconference on Systemics, Cybernetics and Informatics (SCI 2002) and 8th International Conference on Information Systems Analysis and Synthesis (ISAS 2002), Vol. II, pp. 161-166, Orlando, USA, July 14-18, 2002.
9. Peter Gärdenfors. "The Emergence of Meaning". Linguistics and Philosophy 16, pp. 285-309, 1993.
10. Stephen C. Kleene. Introduction to Metamathematics. North-Holland Publishing Company, 1952 (tenth impression 1991).
11. Ray Kurzweil. The Age of Spiritual Machines. Penguin Books, 1999.
12. Lars Löfgren. "Life as an Autolinguistic Phenomenon". In Autopoiesis, edited by Milan Zeleny, pp. 236-249. North-Holland, 1981.
13. Lars Löfgren. "General Complementarity and the Double-Prism Experiment". In Symposium on the Foundations of Modern Physics 1994: 70 Years of Matter Waves, edited by K. V. Laurikainen, C. Montonen, and K. Sunnarborg, Helsinki, Finland, 13-16 June 1994, pp. 155-166. Gif-sur-Yvette, France: Éditions Frontières, 1994.
14. Pattie Maes. "Direct Manipulation vs. Interface Agents". Interactions IV.6, pp. 42-61, November-December 1997.
15. Sverre Sjölander. "On the Evolution of Reality: Some Biological Prerequisites". Journal of Theoretical Biology 187: 595-600, 1997.
16. Sverre Sjölander. Naturens budbärare. Från djursignaler till människospråk. Nya Doxa, 2002.
17. Alfred Tarski. Logic, Semantics, Metamathematics, edited by J. Corcoran. Hackett, 1983 (first edition Oxford, 1956).
18. Hao Wang. A Logical Journey: From Gödel to Philosophy. The MIT Press, 1996.
19. Eugene P. Wigner. "The Unreasonable Effectiveness of Mathematics in the Natural Sciences". Communications on Pure and Applied Mathematics 13: 1-14, 1960.

Authors

Bertil Ekdahl is an associate professor in the Department of Computer Science, Lund University. From 1969 to 2000 he worked in industry, in different positions, on the development of computing systems. His main research interest is foundational questions about systems, particularly anticipatory systems and their relation to formal systems. His address is:

Lund Institute of Technology
Department of Computer Science
Campus Helsingborg
P.O. Box 882, SE-251 08 Helsingborg, Sweden
[email protected]

Lise Jensen is course director of multimedia technology at Lund University; she also holds a post as external lecturer at the IT University, Copenhagen. Lise Jensen is a trained architect, and from 1990 to 1999 she worked in fields as diverse as architecture, virtual reality and multimedia communication. Her main field of interest is visual communication. Her address is:

Lund Institute of Technology
School of Engineering
Campus Helsingborg
P.O. Box 882, SE-251 08 Helsingborg, Sweden
[email protected]

Accounting for User Needs and Motivations in Game Design

Lucy A. Joyner and Jim TerKeurst

1. Abstract

Computer games are developed around the world, and the United Kingdom and Japan are still among the major developers of computer game software. Notwithstanding that the games industry has overtaken the Hollywood box office in terms of revenue, user centred research on computer game playing preferences and cultural variation in playing habits is a largely unexplored field. Preliminary research was carried out to develop our basic understanding of how and why computer games are used within British and Japanese culture. The research explored two questions: do computer games have different meanings and uses in Britain and Japan? If so, are they embedded in personal and social identity? Observational and empirical data from British and Japanese teenagers and adults suggested cultural variation in game playing habits and preferences across the age groups. Data were reviewed in relation to intrinsic, interpersonal and needs motivations, and a tentative model that illustrates and explains cultural variation in computer game playing will be presented. This model, a work in progress, recognises that we exist within a social and political environment comprised of beliefs and values, and that we have needs that must be fulfilled for us to develop. These needs interact with our intrinsic and interpersonal motivations, and the model suggests that some motivations may interact more strongly with certain needs.

Findings have crucial implications for usability practice and game design. By carrying out usability testing of games with targeted players from different cultures, we can modify components of a game, fine-tuning them to optimise the balance of motivations to meet the needs of different players. Successful global game design can become reality. Human factors has a central role to play in this process, highlighting the continuous need for user centred testing in game design. While we can suggest how players' motivations and needs interact, and illustrate how different elements dominate at different stages in the player's life, usability testing is always necessary and crucial to evaluating a game design and delivering a game that meets the requirements of the consumer.


2. Introduction

"If we were always to judge from reality, games would be nonsense. But if games were nonsense what else would there be left to do?"1

Tolstoy may have written these sentences years ago, but they still resonate today, as so many people consider games, fantasy and play to be diametrically opposed to what they consider the more important things in life, namely work and the real world. These critics argue that games are at best a waste of time and at worst a sign of deep-rooted deviance. Unfortunately, while this approach might help people maintain their existing sense of moral superiority, it does nothing to explain the enduring popularity of games throughout oral and written history.

In recent years we have seen an explosion of interest in computer games as home computers have gone from expensive and unique to affordable and commonplace. Computer games are the latest thing in the gaming world, and interesting to investigate as an evolving form of mass entertainment, popular culture and gaming. Computer games are developed around the world, and the games industry has overtaken the Hollywood box office in terms of revenue. As modern lifestyles place ever-increasing demands on our time, and considering all of the entertainment available, there must be some significant reasons why people choose to play computer games. Still, user centred research on computer game playing preferences, and cultural variation in playing habits, is a largely unexplored field. One reason for this dearth of research may be the enormous quantity of games, their varying quality, and the fact that computer games are typically played in our precious and socially crucial free time. How can we evaluate between all of these variables, and explain the reasons for playing, and choices of, computer games relative to the other leisure activities people could engage in?

When observing computer game players, they usually appear both interested in and captivated by the game they are playing. Many players put such value on games that they are willing to part with significant amounts of money to play, whether by purchasing a product or pushing coins into an arcade machine. So how can we understand what motivates gamers to play computer games, why they choose the games they play, and what satisfaction they get from the game playing experience, especially as the rewards gained may be subtle and vary from player to player? Thomas Malone argued that players are intrinsically motivated to play games.2,3 Malone's work is considered seminal to understanding motivation and computer gaming, but it is generally acknowledged that experiments testing subtle and complex theories of motivation in games fail to account for varied motivations and their relationship to the vast array of external rewards available to players. This typically leads to either a direct contradiction between


theories, or ambiguities in the definition of either intrinsic motivation or rewards. On the other hand, while potentially incomplete, theories outlining the necessary features of intrinsically motivating environments need not be viewed as competing with one another. In fact, many features can contribute to the motivational and reward environment, and the optimal balance of them will vary from person to person.4

So, what motivates a person to play a computer game instead of engaging in other activities? Theories of intrinsic and interpersonal motivations can be useful tools for understanding important features for game design and how to make a game fun, but they do not explain why a person is motivated to play a game in the first place.5 Could the answer be skills? Some have argued that the key to a computer game's appeal must involve an aspect of contest, or putting skills to the test.6 Malone has also stated that we should consider the popularity of games over time, as the novelty of a new game can wear off: it will not always be top of the popularity list, although it could remain a steady favourite.7 Conversely, some recent research has suggested that computer games have evolved greatly and we need to take a second look at the design of "modern games". Gamers expect new, cool features in new titles, with games making it easy for players to form and find subgroups within a community, forming a "natural community" with an obvious common interest.8

So, what might be some of the drawbacks to these approaches? Lepper and Malone's theories are useful and highly regarded, but it has been suggested that they give little consideration to an individual's cognitive beliefs, and to how those beliefs and values may influence their interest in a game and their motivation to play it.9 After all, individuals exist within a social, educational, political and economic nexus, and gamers can be seen as generating meaning and identity in the social worlds games open to them.10 Thus, when examining preferences in games and game design, we should consider the gamer's individual needs, as well as the cultural, physical and communal environment in which the player is embedded.

The cultural aspect of games is common knowledge in the games industry, where geographically defined consumer predispositions are typically recognised as genre preferences.11 Each of these genres is associated with culturally based concepts of content and game play, elements of which may or may not be appreciable to other cultures. Viewed globally, it is clear that different cultures may have strong cross-cultural affinities for some genres and their concepts, while others are appreciated for completely different, or even oppositional, reasons. Some genres and concepts simply do not translate at all; at best they may become cult games, but generally they remain unknown. Take, for example, the enormous success in Japan of Tokimeki Memorial and Princess Maker. Neither of these games is popular in Europe or North

34

Users Needs and Motivations in Game Design

America, and many people would consider the genre and its interwoven game concepts deviant. The consequence of all this is that when researchers study only one culture/market, with its own regulated and constrained title releases, they limit themselves to studying playing preferences amongst games that have been preselected as apposite. Instead, by focusing on individuals, their needs, motivations, and relations with others, we can comprehend the dynamics of collective action and possibly discover "those cultural elements that are not necessary components of the games played, but are constructed by the members of the subsociety".12

According to Maslow's Motivation Theory, a person's needs are organised into a hierarchy in which relatively basic needs must be fulfilled before the individual can grow as a person to the stage where they can be self-fulfilled. Maslow recognised that the hierarchy is not rigid and can in fact undergo reversal, for example with a person's need for self-esteem being stronger than their need for love.14 Maslow's theory begins to explain why different people choose and like certain types of games, but it only aids our understanding of how games are played across different cultures when comparative evidence of the existing balance of motivations can be obtained and the similarities, contrasts and contradictions considered.

The authors propose that a relationship exists between needs, interpersonal motivations, game concepts and genres. In this paper, the authors present a model of this interrelationship, and argue that user centred testing during the design process is crucial for understanding cross-cultural game acceptance.

3. A Model Explaining Game Playing Preferences

The model shown in Figure 1 was developed to illustrate the nexus of people's choices of computer games. The model integrates usability testing as an invaluable tool for determining how changes in experience and environment may alter motivations and game playing preferences. The model can also be used to explain and optimise motivation and to ensure that a game's design meets the needs of the target players.

The model has three basic components, which represent the player's lived-in world and its relationship to Entertainment Preferences. The Motivating Needs are the individual or social needs that people want fulfilled. The Interpersonal Motivations are the individual's preferences for fulfilling their Motivating Needs. These two components interrelate, as deficits in one can lead to a desire for fulfilment in the other, and their interaction encourages people to look for entertainment that will help maintain equilibrium. Computer games are one of a broad range of entertainments, and people who play computer games use them to maintain the equilibrium between their Motivating Needs and Interpersonal Motivations. Preferred game concepts and genres are the outcome of this equilibrium-maintaining process.

[Figure 1: Concept Usability. Entertainment Preferences (forms, concepts, genres) sit at the nexus of Motivating Needs (self-actualization, esteem, belonging, safety, physiological) and Interpersonal Motivations (competition, recognition, cooperation).]

Example 1. Interaction of Motivating Needs and Interpersonal Motivations: Belonging

A player with a deficit Belonging Need may wish to fulfil this Need by choosing cooperation over competition, to be recognised as a team player, thus fulfilling the Need to belong. A game concept or genre that fulfils the player's Need would be preferred. For example, a player may choose a game that requires team-based play and a strong sense of cooperation to achieve a game play success or win for the group.

Example 2. Interaction of Motivating Needs and Interpersonal Motivations: Esteem

A player who wants to fulfil the Need for Esteem may choose to play a computer game with a strong sense of competition that enables them to demonstrate their skill and ultimate mastery of the game to others.


Successful completion of the game would increase their esteem, through the recognition of being a talented winner. The player may choose a genre with competitive game concepts.

However, this model is not without its shortcomings. One of the most crucial limitations is that players can only choose from existing genre choices to fulfil existing Needs. Thus, while the relationship between Motivating Needs and Interpersonal Motivations may appear clear and obvious, the concept is nevertheless theoretical and does not explain or clearly demonstrate the actual process of choice and interaction. One way of revealing the process and choices may be found by employing usability in the design process. Employing usability can assist in overcoming the restrictions in game choice, which can typically lead to game testing scenarios where players are unable to see the possibilities in a game concept because they are unable to relate it to anything other than existing products.

Carrying out usability testing during the design process can help researchers understand players' preferred balance of features and game play within a game. Observing the interactions, vocalisations and types of game play occurring between players, while simultaneously testing elements of the game concept or prototype, can indicate how the game concept and content match target users' Needs, enabling the design process to be user centred. By studying players' interactions with a prototype and evaluating what maintains their interest, elements of game play can be manipulated to provide the appropriate challenge, control and fantasy required within the game to maintain players' interest and curiosity. The authors therefore propose a model of Game Usability (see Figure 2). Combining observations and feedback from game play of a prototype, a player's Intrinsic Motivation to play the game can be tested, and features of the prototype refined, to express the preferred "Concept Usability" of players, thereby revealing their gaming preferences. Nevertheless, a causal relationship between Needs and Motivations is not inferred; rather, the authors suggest that there is evidence of a strong relationship between some Motivations in interaction with some Needs.
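As a toy formalisation of this interaction (our construction, not the authors'; the chapter presents no algorithm, and all weights below are invented for illustration), one might score a game concept by how well its mix of Interpersonal Motivations serves a player's dominant Motivating Need:

```python
# Assumed affinities: how strongly each Interpersonal Motivation is taken
# to serve each Motivating Need. These numbers are illustrative only.
NEED_AFFINITY = {
    "belonging": {"cooperation": 1.0, "recognition": 0.4, "competition": 0.1},
    "esteem":    {"competition": 1.0, "recognition": 0.8, "cooperation": 0.2},
}

def concept_score(dominant_need, concept_weights):
    """Score a game concept against a player's dominant Motivating Need."""
    affinity = NEED_AFFINITY[dominant_need]
    return sum(affinity[m] * w for m, w in concept_weights.items())

# Two hypothetical concepts, described by their motivational emphasis.
team_game     = {"cooperation": 0.9, "recognition": 0.5, "competition": 0.2}
fighting_game = {"competition": 0.9, "recognition": 0.6, "cooperation": 0.1}

# Example 1: a deficit Belonging Need favours the cooperative, team-based concept.
assert concept_score("belonging", team_game) > concept_score("belonging", fighting_game)
# Example 2: dominant Esteem Needs favour the competitive concept.
assert concept_score("esteem", fighting_game) > concept_score("esteem", team_game)
```

The point of the sketch is only that the model's two worked examples become testable predictions once needs and motivations are given comparable weightings, which is precisely what usability testing would have to calibrate.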


[Figure 2: Game Usability. Concept Usability and Intrinsic Motivation together reveal Game Preferences.]

The authors hope this model can be used to assist human factors researchers in recognising the interaction between people's Motivations to play games and the Needs that players must fulfil. As an example, usability testing could be a useful tool for identifying whether target users prefer intrinsic or extrinsic fantasy in the games they choose to play. Knowing this could be an invaluable resource for game developers and publishers, and could also provide a way for academics and researchers to consider what the main Motivations are for game playing across cultures, and how these Motivations relate to people's culturally based Needs and their identity within subgroups.

4. The Model in Practice

It has been inferred by some researchers that gaming is a social rather than a serious, achievement-oriented activity, with frequent game players being more extravert and less achievement-oriented than occasional game players.14 We will now use the Game Usability model to consider this suggestion while exploring variation in gaming across ages and cultures.

5. Discussion of the Implications of the Model: The Value of Usability Testing for Understanding Gaming Preferences Across Cultures

The Game Usability model suggests that the balance of Interpersonal Motivations can be grounded within a social and cultural context, and that usability evaluations are critical for establishing the right weighting of features of Intrinsic Motivation within the game design to meet the Needs and social Motivations of the player. In previous research conducted by the authors, usability testing of console games in development provided insight into players' attitudes towards what makes them enjoy a game, and why they want to continue playing it. The prototypes used cannot be named, but aspects of them can be described.

When testing a prototype cartoon game with 8- to 12-year-olds in the UK, initially some children selected a small character that they thought


looked cute and would be fast. However, after using that character once and losing to their opponent, the children asked to change character, as it was perceived to be slow and weak. Typically the children would then choose to play with the character that had beaten them in the previous game, even when the game designers had given both characters identical strengths and speed. Observing pairs of opponents where one player was consistently winning revealed that the losing player characteristically did not think the game was fun and would not want to play it again. By allowing each opponent to play at a different, better-matched difficulty level, both players reported that they were having fun: the weaker opponent was able to compete for longer and have some fun fighting, allowing the stronger player time to demonstrate their skill and gaming moves before declaring victory. These findings have been replicated in subsequent game usability tests done by the authors.

In another example, undergraduates played a prototype multi-player car racing game in groups of eight. The adults were able to articulate that beating a much weaker opponent was not satisfying because of the lack of challenge. Players wanted to demonstrate their skill and expertise by winning within what they perceived as a "fair competition". The authors' game usability research suggests that children and adults in the UK tend to prefer playing games that they excel at and frequently win. However, it is important to evaluate the initial level of challenge present in the game and its optimal increments: "unfair competition" can frustrate both opponents.

Games help children recognise their skills and abilities, and the enjoyment of being good at something. Being good at a game can provide children with an almost cult status amongst peers as they demonstrate their skills to their friends and are revered for them. British adults do report enjoying the social aspect of playing games, but this also relates to having conversations and discussions about the game, and demonstrating one's knowledge of the skills involved in completing levels.

In other research conducted by the authors, findings from interview and questionnaire responses from British and Japanese teenagers indicate that, unlike their Japanese counterparts, UK teenagers at secondary school prefer to play with another person rather than against the game AI. Data showed that UK teenagers were 50% more likely than Japanese teenagers to play at a friend's house. With dominant esteem Needs, UK teens played a wide variety of genres, with 90% of participants playing at least once a week and more than 50% playing at least twice a week. Participants reported enjoying playing alone to get used to a game and practise moves, developing skills before playing with another person. The findings suggest that in the UK, while gaming is a social activity, it is very achievement oriented. Competitive games that allow for the demonstration of skill and mastery are preferred. Games are generally
used to display achievement and raise esteem. One consequence of this is that game playing endures with age, as esteem Needs remain strong. In comparing Japanese and British teenagers aged 14-15 years, we found that Japanese teenagers were 300% more likely to play in very social arcade environments. Observations made in Japanese arcades identified large groups of teenagers socialising together along with parents and their small children. In the arcade environment, parents met and socialised while providing their young children with an opportunity to meet and play games together. Some arcade games were of appropriate height for small children to use, while other games provided multiplayer opportunities for adults and children, such as sitting in a canoe and steering it to play the game. Time in arcades appeared to be spent meeting, socialising and sharing gaming rather than playing alone or competing against others. This sharing behaviour while playing games was also seen with older Japanese players, and the authors suggest that the Need to belong, and be recognised as a part of the group, was so important to Japanese players that it constrained their genre preferences and game selection. For example, in Japan only car racing, role-playing and fighting games were reported as being consistently played by participants, while British players enjoyed playing a wide variety of games. Indeed, it seems that by the time Japanese students reach university, their belonging Needs are being met by other activities. Data from Japanese undergraduates indicated that gaming is not a major activity in their life, with 80% of students playing computer games once a month or less. In addition, undergraduate students preferred to play individually against the game AI while the others shared in the experience by vocalising support and empathy for the player. Playing this way, each player received statistics about their game performance and these statistics were discussed and ranked. 
Accordingly, there was competition, and an overall winner, but it was indirect competition between the players, as each person’s experience with the game AI was different. These findings suggest that in Japan gaming is more of a social activity than an achievement-oriented activity. Games are played in a shared and cooperative way, to be recognised as a member of a group or sub-group with its own distinct idioculture, or as part of a larger community, or to fulfil a Need to belong.15 In Japan, competitive game play commonly takes place against the game AI, ensuring that competition between people is less confrontational. In this case, recognition means being identified as part of a team. Therefore, usability evaluations would be central for designing a game with the right balance of Intrinsic Motivations to meet the belonging Need of the players, enabling them to cooperate within the game and identify as part of a team or group. Findings may also be representative of cultural variations in infrastructure. The following example shows how observations and reports
of game playing might not reflect a usual pattern of behaviour, and illustrates how important it is in Japan to meet the demands of the society. Japanese high school pupils reported that they were playing games less often than they would like to as exams were approaching. Some teenagers had even stopped playing games altogether. In considering this information, the authors found that the education system in Japan, and Japanese society, promote academic excellence and success at school, which is crucial to being accepted by a good university. Teenagers who were interviewed reported that they attended evening classes during the week, and cram school on Saturdays. With little free time, and probably no sibling to play with at home, it is arguable that teenagers choose to play games that do not put them under more pressure to compete to beat their friends but rather enable them to share in a group identity with their peers. Game material also readily crosses media in Japan, so even when they cannot play games they have access to the material through television, comics and cards. As game playing becomes part of a larger group identity, or community, a lack of space and privacy at home will increase the popularity of playing games at an arcade. This type of game playing declines at university because it is no longer useful. For Japanese adults, games may be played occasionally for fun, but they have largely been replaced by adult activities that fulfil the Need to belong to a group. They are not used to fulfil esteem Needs as these needs are met by being associated with a good university, and ultimately a company. In contrast, UK teenagers have more free time to socialise and join clubs, are more likely to have siblings and their own private space in which to entertain others, and also have a sense of community or belonging. In addition, UK teenagers may have less pressure to succeed at school, as it is less of a social norm, or demand, to excel academically. 
This might explain why UK teenagers play games alone to perfect their skills, and then use games competitively to demonstrate skill and mastery in a way that is acceptable to, and revered by, their peers, raising their self-esteem. The authors’ additional findings have implications for bridging some of the common cultural chasms experienced by the games industry, as it is important to have culturally meaningful concepts within games. Even within a broader genre, if a specific concept for a game is too unappealing to potential users, the game content may not be accepted, regardless of the usability testing carried out on it. Usability testing game content can allow some changes to be made to concepts, to enhance their appeal, but the greater the need for concept change, the greater the expense and timescale for delivery. An example of how content can be changed to increase the appeal of a familiar concept comes from comments made during the evaluation of a car racing game. While players liked racing games as a genre, they wanted the idea of the game to be meaningful to them in many diverse ways. Players
reported finding it difficult to engage with the game because level names were given in a way that they were not familiar with. Similarly, they wanted to see environments and scenery that they recognised, and to be able to choose and drive their dream cars, few of which were present in the game. In addition, the music and soundtracks were also areas of content that needed to be culturally grounded and appropriate for the game concept to be appealing. Nonetheless, some games may never have appealing and appropriate cross-cultural content. Games like Princess Maker and Tokimeki Memorial represent examples of when game concepts are so alien that certain markets will either reject, or even condemn, them. Familiarity with culture, social interaction, language and game content are crucial considerations when designing a game. These findings suggest the centrality of incorporating socially and culturally relevant material into game design, and highlight the constrained and understandable nature of the player’s relationship to the game.

6.

Conclusion and Future Directions

This early research suggests that games are used differently to support personal and social needs and motivations within a player’s social and cultural environment. Obviously any sort of research making these claims needs considerable development. For instance, the role of fantasy in games and whether preference for intrinsic or extrinsic fantasy varies across cultures is still uncharted research territory. As suggested by Fine, future research could also extend beyond this, seeing fantasy as constrained by the social expectations of players and of their social world.16 Focusing on the players’ definition of the gaming situation and how they orient themselves to the game could also provide insight as to how game playing captivates the player and promotes identification within the game. Additionally, a focused study of players’ self-reports of the goals that gaming provides may highlight variation in preferences for a fixed goal in a game, whereby the game has a clear culmination, or for events (smaller goals) that emerge within a game that has no clearly defined endpoint. The model supports a relationship between belonging Needs, or esteem Needs, and Interpersonal Motivations, but the relationship between self-actualisation and Motivations to play computer games still needs explication. By furthering research, studying goals and player fantasy within games and targeting those who are motivated to fulfil growth Needs, the games market could innovate, providing material that crosses cultures and allows gamers to fulfil their Needs at every stage of their development.


Notes

1. Tolstoy, 1959, 338.
2. Malone, 1980.
3. Malone & Lepper, 1987.
4. Malone, 1980.
5. Malone & Lepper, 1987.
6. Dempsey et al., 1997.
7. Malone, 1980.
8. Dyck et al., 2003.
9. Tzeng, 2001.
10. Fine, 1983.
11. TerKeurst, 2002.
12. Fine, 1983, 28.
13. Maslow, 1970.
14. McClure & Mears, 1986.
15. Fine, 1983, 136.
16. Ibid.

References

Dempsey, John V., Barbara Lucassen, Linda Haynes and Maryann Casey. An Exploratory Study of Forty Computer Games. College of Education Technical Report No. 97-2. Mobile, Alabama: University of South Alabama, 1997.

Dyck, Jeff, David Pinelle, Barry Brown and Carl Gutwin. “Learning from Games: HCI Design Innovations in Entertainment Software.” Paper presented at Graphics Interface 2003, Halifax, Nova Scotia, June 2003.

Fine, Gary Alan. Shared Fantasy: Role-Playing Games as Social Worlds. Chicago: The University of Chicago Press, 1983.

Malone, Thomas W. What Makes Things Fun to Learn? A Study of Intrinsically Motivating Computer Games. Cognitive and Instructional Sciences Series, CIS-7, SSL-80-11. Palo Alto: Xerox Palo Alto Research Center, 1980.

Malone, Thomas W. and Mark R. Lepper. “A Taxonomy of Intrinsic Motivations for Learning.” In Aptitude, Learning and Instruction, Vol. 3: Conative and Affective Process Analyses, edited by R. Snow and M. Farr, 111-140. Hillsdale, New Jersey: Lawrence Erlbaum, 1987.


Maslow, Abraham. Motivation and Personality. 3rd ed. New York: Harper and Row, 1970.

McClure, Robert F. and F. Gary Mears. “Videogame Playing and Psychopathology.” Psychological Reports 59 (1986): 59-62.

Opie, Iona, and Peter Opie. The Lore and Language of Schoolchildren. London: Oxford University Press, 1959.

TerKeurst, Jim. Games are Like Fruit: Japanese Best Practice in Digital Game Development. Dundee: University of Abertay Press, 2002.

Tzeng, Shyh-Chii. “Optimizing Challenges and Skills in the Design of an Educational Computer Game and Exploring Adolescents’ Gaming Beliefs.” Paper presented at the Association for Educational Communications and Technology, Atlanta, Georgia, November 2001.

Author Affiliation

Lucy Joyner
Scottish Usability Lab at International Centre for Computer Games and Virtual Entertainment (IC CAVE)
University of Abertay Dundee
Dundee, Scotland
[email protected]
www.iccave.org

Dr. Jim TerKeurst
International Centre for Computer Games and Virtual Entertainment (IC CAVE)
University of Abertay Dundee
Dundee, Scotland
[email protected]
www.iccave.org

Spatial Context of Interactivity

Stanislav Roudavski and François Penz

Digital Studios, Cambridge University, Department of Architecture,
1 Bene’t Place, Lensfield Road, Cambridge CB2 1EL, UK
[email protected]
franç[email protected]

1.

Introduction

The discussion in this chapter arises from the premise that dramatic engagement, expressing an author’s intent,1 is desirable as a fundamental capability of compelling and meaningful virtual environments (VEs). With this premise in mind, the spatial context of navigable, real-time, three-dimensional (RT 3D) VEs is examined as a resource for the production and manipulation of meaning and drama. Interactivity, with its support for functional and spatial explorative experimentation, is usually cited as the most significant characteristic and achievement of VEs. This chapter argues that in navigable VEs, space/place is the setting for and part of the action, and that interactivity and meaning cannot be divorced from the spatial experiences that deliver them. The unique ability of computers to support simulations as well as representations positions them on the brink between systematic and situational orders. A VE is only accessible to users’ experience through a presentation layer (e.g. through cinematic mediation): it exists as navigable experiential space/place only in the user’s mind. This chapter proposes that the ‘mental image’2 of this space/place is distinct for each user and can be actively manipulated through the use of mediation devices. An integrated approach to the design of VEs – considering spatiality, interactivity and mediation together – can provide access to an extra set of expressive means, unique to this emerging medium.

A. Chapter Structure

The chapter consists of three sections. The first, Meaning in Context, introduces the concept of situation, discusses how meaning is integral to places and suggests that VEs have an intrinsically representational nature. The second, Access to Meaning, suggests that in navigable VEs meaning is dependent on a structure of involvements and possibilities that are actuated via interactive access. The final section, Meaning in Mind, discusses acquisition of environmental knowledge in the context of VEs and proposes that mental images of the environment can be shaped via cinematic mediation and serve as a reserve of meaning and expression.

1 It is understood that the author’s intent can manifest itself in a multitude of more or less prescriptive ways made apparent to the user or deliberately hidden. E.g. in a situation with multiple potential engagements the user might be coaxed rather than directed.
2 The word ‘image’, as used by Lynch [38], can be misleading here as it might appear to suggest undue emphasis on the visual. Other terms have been used to denote the concept: e.g. ‘cognitive map’, ‘mental model’ or ‘mental collage’.

2.

Meaning in Context

A. Space as Experience

We ‘dwell’ in spatial environments and are present there through our bodies [10][26][39]. The spatial environments of our lives contain information that is available through direct experience and is perceived as a set of affordances: as opportunities and constraints for action [20]. Metaphoric understanding and use of abstract notions also depend on our experience of spatial phenomena and our ability to relate to spatial schemata [30][34]. Negotiating either real or virtual navigable space, people naturally employ their innate capacities formed by evolution in a spatial world and an accumulation of real-world experiences that have shaped their consciousness and behaviour from early childhood; this chapter maintains that perception automatically activates our knowledge of visible reality [2]. We participate in the world at many levels of involvement; spatial context, cognitive processes and behaviour are inseparable and cannot be considered in isolation. Our involvement with context manifests itself not only at the level of individual understanding and behaviour but also at the level of large groups of people so that it is possible to study how differences in spatial configuration are reflected in society. For example, empirical results collected by space-syntax research – which analyses buildings and cities as systems of space rather than as aggregates of physical matter [27][28] – suggest that space is the common determinant of movement patterns and behaviour in populated environments. Space-syntax theory suggests that the nature of spatial structure, even when analyzed as an abstract pattern, has a formative influence on pedestrian and vehicular movement, land-use distribution, social and economic performance, local crime rates and pollution. Analogous experiments have been attempted in VEs [12]. Space can be defined in several ways. Lefebvre [35] distinguishes between intertwined layers of physical, mental, and social space. 
Mental and social spaces are ‘projected’ onto the physical space. This projection takes place when a person perceives objects in the physical space as meaningful entities. The meaning of objects is presented via signs and accessed
via actions 3 that perform as interpretive devices [44]. Social space, an accumulation of shared experiences, consists of conceived, anticipated facts and as such is not coincidental with its collateral ‘real’ space. However, social space is perfectly real in that it guides our behaviour. Post-modern discourse in cultural studies assigns particular significance to layers of meaning and interprets modernity as characterized by simulation, the term that in this context means the process whereby representations of things come to replace the things being represented. If one chooses to look at culture in this way, one comes to realize that not only are the boundaries between space, behaviour and experience blurred, but also that VEs cannot be considered as separate from reality. Rather, they are a part of a complex conglomeration that includes several orders of signification such as signs that are thought to represent basic reality, signs that mask reality, signs that mask absence of reality and signs that are not related to reality – simulacra [5] 4 . Relph [47] suggests a taxonomy that includes primitive, pragmatic, perceptual, existential and cognitive/abstract spaces. Much of the extant discourse about space in VEs is to do with primitive/pragmatic and/or perceptual spaces. These are the spaces of basic individual experiences, relatives of ‘functional circles’ of animals. Humans tend to resort to these fundamental spaces when conscious effort is missing from their dwelling- and place-making activities; when appropriation of space occurs as a response to conditions rather than as a result of a planning process. Perceptual spaces are spaces of immediate action and, as such, carry some low-level meanings, like distances and directions. The attention to these definitions of space has generated much interesting work on, for example, navigation in spatial environments (e.g. [13]). 
However, in order to relate to more complex, higher-level meaning-structures that are an inseparable part of our situated behaviour, it is interesting to turn to more inclusive definitions of space. Space, considered together with its dynamic cultural meaning, can be described as experiential and, potentially, social space or place, the notion which constitutes a fundamental level of our consciousness. It is impossible to imagine being in some uniform empty space, in ‘no-place’. As the novelist Proust wrote [46]: ‘not knowing where I was, I could not even be sure at first who I was’. For the philosopher Bachelard [3], the link between self and place was so important that he proposed a method of

3 All behaviour is necessarily spatial and aggregates into spatial practice expressed as social or cultural habits.
4 VEs are also a part of reality in a very direct sense: they are accessed from a physically located and characteristically arranged place (e.g. from a couch in a bedroom, from an office, a laboratory, in an amusement park or an exhibition). It would be interesting and important to consider how these widely differing settings impact on the way a VE is experienced and understood. However, this discussion will have to be developed elsewhere.


‘topoanalysis’, defined as ‘the systematic psychological study of the sites of our intimate lives’. This kind of space/place comes into prominence when perceptual spaces are interconnected by cultural knowledge where understanding is common to the members of a group that shares experiences, symbols, and signs. Created through practice and appropriation, places afford a shared social setting and determine behaviour. They can be contrasted with ‘non-places’ that have an inflexible functional base and are as such not easily appropriated. For example, typical urban car parks [cf. 1] are hostile to most communicative patterns of social behaviour and can accommodate only a narrow range of activities usually conducted in opposition to the established routine of society (e.g. street crime or youth culture) and therefore forced into the periphery of urban spaces.5 Place-making can be described as a process of arranging objects and spaces with the purpose of creating an environment that supports necessary activities while embodying and conveying the social and cultural conceptions of the actors and their communities [8]. This section draws attention to a number of ways in which settings, spatial experiences, actions and meaning are interconnected in the context of real and virtual environments. For VEs to attain expressive and dramatic goals they should be planned with a holistic approach to the design of spatial layouts and interactive content in order to create experiences that utilize their potential as a medium.

B. Systematic and Situational Orders

This section briefly introduces the concepts of ‘systematic’ and ‘situational’ in the context of the following discussion. Systematic order is the term that expresses relationships between things that are internally consistent; the coherence is due to the rule-based system structure. Geometry and logic are examples of such systems. 
The entities of geometry are defined in terms of how they behave in the system, and all other possible modes of relationship are disregarded. To use Plato’s example [45], within such a system, a circle drawn in the sand is not considered to be a true circle, which can only be imagined in terms of the rules of relationship. Computer code can be considered as an entirely systematic operation. Even ‘chaotic’ and ‘emergent’ computer-simulated phenomena are still based on algorithms that never leave the domain of calculable systems.

5 It is important to be aware of the danger of any qualitative judgments motivated by such rhetorically useful rationalizations. Places are phenomena in transition and human spatial practice is uniquely malleable. The aforementioned urban car parks can be described as ‘peripheral’ only from a particular and static viewpoint: e.g., graffiti and murals are becoming more and more integrated into established art practices; these practices become inseparable from their native urban settings, redefining urban car parks as places and cultural artifacts.


Situational order is the term that expresses the way humans have a world and meaning. Significant parts of reality can be described as systems but in principle reality is of a situational nature. All being is necessarily involvement with people or things and all involvement has a structure of beginning-middle-end, i.e. everything occurs somewhere sometime. Therefore, situation has the character of an event. The nature of situation can be described as the reciprocity between conditions and possibilities or between necessities and freedom. Conditions refer to plausibility, to that which is held in common and, ultimately, to nature as the shared basis of objectivity, while possibilities refer to one’s capacity to understand, interpret, imagine or act. This multi-level structure of involvements is something into which we are born and something that makes meaning possible. The world does not depend upon understanding; rather, one participates in the world (acknowledges its claim) through understanding. In this context, the concept of space understood as a three-dimensional (Euclidean) continuum is misleading as it implies uniformity and simultaneity, thus misrepresenting the complex structure of situational order.

C. Expressive VEs

VEs can support a multitude of activities and applications. While installations such as flight simulators or industrial visualizations aim to construct ideal simulations, most VEs have declared or inherent expressive qualities. This chapter is particularly concerned with VEs that are produced with expressive or dramatic goals as works to be accessed by others6 but also argues that all VEs share characteristics that are concerned with mimesis and metaphor. Our interpretation of the role of experiential space in VEs contradicts the enthusiastic comments of early writers on VR and VEs who were excited about their ‘limitless’ possibilities. 
According to these writers, such possibilities included modified physical laws, multiple dimensions, impossible geometries, ‘liquid architectures’ and unusual methods of transportation [6]. Now, in 2003, it is evident that most of the work in VEs has moved in another direction and is instead focused on manufacturing highly representative environments such as those found in videogames. Game-like VEs, despite their quick conceptual and technical progress, are now dominated by attempts to mimic real environments; these attempts

6 Of course, all environments are expressive of something and potentially dramatic. Different modes of expression (e.g. educational, artistic or theatrical) provide different structures to communicate ideas and can vary greatly. However, for the purposes of this chapter it is important to note the distinction between the environments where dramatic goals are declared at the outset and the ones that focus on simulation (e.g. for pilot training) or social exchanges (e.g. in virtual communities).


are frequently superficial and are made predominantly at the level of visual style. This chapter suggests that navigable VEs ought to recognize their inescapably hybrid nature as both systematic and simulative, and that they should rely on their two unique characteristics: spatiality and interactivity, designed in conjunction. Rich examples for consideration and analysis are provided by the VEs developed for modern computer games. These VEs are set in navigable space and exhibit graphical, functional and expressive sophistication. Impressive visual output and a sense of presence are typically counted among the primary appeals of such environments. However, as an emerging medium, VEs can aspire to tackle provocative, emotional, intellectual, or aesthetic issues, from positions that are informed by current cultural and artistic practice and make use of the progression of ideas in other arts and media. The current situation in games, utilitarian VEs or even interactive arts is still characterized by a naïve approach to the design of spatial context, mediation, sound or narrative content that is conducted in isolation or is based on stereotypical and limiting popular-culture recipes. The sense of ‘presence’ in its established meaning – ‘the perceptual illusion of non-mediation’ [36] – is a powerful effect unique to VEs, but apart from its intrinsic entertainment value it is also a unique expressive instrument. We need to review the expressive vocabulary that VE-based forms have at their disposal, and the suggestion of this chapter is to reconsider the role and structure of such environments as situations that are grounded in reality. Jenkins [29] has suggested that game-interactions are concerned with the competition for space and has argued that games tell stories by organizing spatial features. 
At the same time an opposing view exists that criticizes games for their preoccupation with spatial navigation and spatial puzzles [9] characterized by a large number of nouns (i.e. a multitude of objects) and a small verb-set (i.e. a limited number of available actions). Obsession with spatial reasoning at the expense of interactivity is seen as detrimental to the staging of social interaction and as preventing engagement with wider moral, social and aesthetic issues. This chapter argues that the space in VEs does not need to be (and in many cases cannot be) reduced to a puzzle-like contraption where the user can exercise abstracted spatial reasoning. Even the most austere task- or rule-based VEs cannot be constructed as fully enclosed systems. The embedded design- (or game-) logic does have a systematic nature, but the way this logic is accessed and interpreted engages the user’s knowledge and experience that are external to the systematic confines of VEs. In their very nature, navigable VEs are representational: they always activate mimesis, recognition, re-enactment and interpretation, whatever the style of visual implementation or system-mechanics. A large part of the expressive potential of VEs lies in their promise to become places that are memorable and emotionally rich, places that can exude emotions such as the sense of belonging and safety or, conversely, of adventure and danger. This promise can only be achieved if the user is actively engaged with the expressive dramatic potential of the experiential and social context.

3.

Access to Meaning

A. Dramatic Goals

Unlike real-world environments, expressive VEs do not exist (or not only, in the case of massive multi-player fantasy worlds) to support an existing social practice, but to provide an authored experience that poses a challenge, illuminates a point, or entertains. The intent of the author, incorporated in the design of situational and systematic features of a VE, can be an expression of a particular philosophical, aesthetic, or moral standpoint. In real-world (e.g. architectural or urban) design, legibility [38] or intelligibility [27][28] of an environment is sought as a quality that is positive by default. This is particularly true in the case of authorless complex environments shaped over time by social practice (as in the real world or even the ActiveWorlds [57] cities). In the real world, the spatial configuration of an architectural structure presents logistical and other operational problems, but is, at the same time, thought to serve as a support system that contains the information necessary for the solution [43]. By contrast, most VEs have a (collective) author and are constructed in accordance with the author’s intent. The structure of the environment, the agency it supports and its presentation are constructed in response to this intent. Because the users of VEs are presented with a set of constraints and opportunities that are generated by the author and external to their own habitual spatial practice, the criterion of legibility loses its default positive value – the meaningfulness and readability of VEs have to be measured against and discussed in relation to dramatic goals as declared (or suggested) by the author and incorporated into the mechanics of the VE. This way of looking at places and whole habitats is not unheard of and can be compared to the world-views of many traditional cultures. Numerous civilizations (e.g. 
Pueblo Indians, Chinese, Maya or Ancient Greek) have developed their world-views into structures of conceptual mythical spaces. According to these views, countries and civilizations have their factual and mythical geographies. It is not always easy to tell them apart, or even to say which is more important. The way people act depends on their comprehension of reality, and that comprehension, since it can never be complete, is necessarily imbued with myths [52] [53]. Massive multi-player online role-playing games function, according to the participants, as communities and public spaces. These RPGs succeed in producing both public and place [42]. However, the cosmologies, creation-myths, and other knowledge and logic only have a limited in-game presence in such worlds (e.g. via textual descriptions or person-to-person in-role verbal exchanges) and are typically constructed and maintained externally, for example via associated web pages. This does not need to be the case. In VEs, such cosmologies can be supported by the structure of purposefully designed space and events. This is only one of the possible approaches to generating a coherent spatial dramatic goal structure, but this approach is related to a body of knowledge that has been extensively researched by architects, anthropologists and geographers. As examples, one can mention the carefully arranged symbiosis of the conceptual and the spatial in Chinese gardens or classic Mayan urban constructions. In a related line of argument, Jenkins notes [29] that designers might aim to construct, for example, Expressionist (emotion mapping onto space), Romanticist (assignment of moral values to landscape), Surrealist (psychological symbolization of object) or other non-simulatory VEs allowing access to coherent alternative worlds. This section argues that the way an authored expressive VE is (or should be) constructed, presented and accessed is governed not (or not only) by the need to express and support established, predicted, or desired social and cultural processes and requirements but by the compositional (and author-interpreted) logic of myth, narrative, drama or other emotionally and intellectually significant motivators. It is therefore important to design the VE dramatic goals, spatial structure, system-mechanics and presentation layer in unison, and the understanding of the spatial foundations of such design can be derived from several established fields of knowledge and creative practice such as anthropology, geography, architecture and urban planning.

B. Narrative Potential

According to some commentators (e.g. [11] [19]), the sequence of events as experienced by the user in a VE amounts to a story when looked at in hindsight. 
What the user is given by RT VEs can then be described as 'narrative potential', a term that refers to situated affordances that can be assembled into a meaningful progression through player actions. In an authored VE with expressive aspirations, it is always desirable to maximize the narrative potential in order to provide an agency- and meaning-dense field for interactive exploration and dramatic engagement. To achieve this, it is advantageous to create relationships between elements in ways that suggest complexity, dependencies, and hierarchies, and that generate suspense and mystery, thus constructing dramatically potent environments that are complex yet knowable, purposeful yet unpredictable. This chapter considers some of the contributions spatial design and mediation design can make to this purpose.

Stanislav Roudavski & François Penz


C. Agency

The value of interactive access lies in the agency it provides. Agency consists of intention and perceivable consequence and relies on the coherence and cognoscibility of the VE [11]. Intention is understood in terms of nested goals and plan-making in response to the game state. Agency allows users to engage in sophisticated, dynamic exploration and planning processes that result in the construction of an active mental image of the environment, with all its ingredients and events. This mental image is constructed and referred to in response to, and under the influence of, the overlapping goal- or plan-structures that the user has to construct and continuously reformulate. The ability to make inferences and interpretations in response to the situational circumstances, and thus engage in action that results in perceivable and meaningful consequence, allows users to convert the narrative potential into interconnected sequences of personal and meaningful experiences. A common approach in game design is to use a limited set of objects/actions and to ensure their coherent and consistent usage and meaning. This approach often results in an environment in which the situational meaning of objects, derived from the real world through mimesis, is wastefully disregarded: the VE utilizes them exclusively as abstracted parts of the game mechanics. While this might be the basis of a successful game (e.g. chess in the extreme case, or Tetris [62], or even Quake [60]), such a reductive approach fails to exploit a number of meaning-layers that are either already available in, or can be brought into, the interactive environment. It is one of the principal arguments of this chapter that in a navigable VE it is impossible to construct an ideal, watertight world of perfect internal coherence: the users will always interact with the VE on the basis of their life experiences and bring with them their own interpretations and understanding.
How can a complex multi-layered VE be constructed so that multiple readings (suggested by design but referring to wider, real-world knowledge and culture) are made possible, while the author-intended interdependencies between user actions, world events, and spaces remain apparent or deducible and the principle of perceivable consequence is maintained? The approach that guides our current practical work is to rely on architectural-design strategies for the creation of meaningful places that stage, inform, and provoke exploratory, constructive, and/or imaginative user actions, and to utilize a specifically arranged mediation layer for dramatic presentation.
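The notion of agency developed above, intention coupled with perceivable consequence, can be illustrated with a minimal sketch. This is not drawn from the chapter or from any cited system; all names and behaviours are hypothetical. The point it demonstrates is that every intention directed at the world, supported or not, should surface an observable effect, so the player's mental image stays coherent and re-plannable.

```python
# Minimal, assumption-laden sketch of agency as intention plus
# perceivable consequence. Every acted-upon intention yields visible
# feedback, keeping the world cognoscible even when the action fails.

class World:
    def __init__(self):
        self.state = {"door": "closed", "lamp": "off"}
        self.feedback = []  # perceivable consequences, in order

    def act(self, target, new_value):
        """Apply a player intention and emit a perceivable consequence."""
        if target not in self.state:
            # No silent failure: an unsupported intention still produces
            # an explicit cue rather than leaving the player guessing.
            self.feedback.append(f"nothing happens to the {target}")
            return False
        self.state[target] = new_value
        self.feedback.append(f"the {target} is now {new_value}")
        return True

world = World()
world.act("door", "open")     # supported intention: visible change
world.act("window", "open")   # unsupported intention: explicit cue
print(world.feedback)
```

A reductive design would answer unsupported intentions with silence; the sketch instead treats even a refusal as a perceivable consequence, which is one way to keep the narrative potential convertible into meaningful experience.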
4. Meaning in Mind
A. Mental Image

In our physical world, the mental image of our environment [38][22][15] depends on a number of parameters. (The term 'mental image' is used here as equivalent to the term 'cognitive map' understood as a metaphorical device; see [32] for further references.) We construct and perfect 'acting in places' in accordance with our cultural knowledge (e.g. frames of reference), personal experience and mindset, our familiarity with a given place, and also with the peculiar way in which the process of knowledge-acquisition unravels (cf. different circumstances of travel: progression along a constrained route, pursuit of a guide, flâneur-like exploration or resolute navigation to a particular location). It has been suggested that mental-image construction is a cyclic process: existing images shape the nature of newly constructed images and those, in turn, update the already established ones [49]. Mental images vary greatly depending on the information acquired through spatial travel, image-like representations, and verbal communication. They are never complete, accurate, survey-knowledge-like representations. Mental images always have inbuilt distortions (e.g. [21][54]) and can better be described as mental collages [55]. Research has also shown that different people rely on different sets of outstanding situational features in order to acquire landmarks for navigation [48]. Distinct shape, function, smell, unique events, sentimental memories, or repetitive encounters can all come into play when a situated feature acquires prominence in a traveller's mind. Our environment is not static: we expect change and associate varying 'speeds' of change with different components in an environment (e.g. people transform 'faster' than a table which, in turn, transforms faster than walls). Our expectations are conditioned relative to these speeds. Two different people can both expect a familiar place to be there the next time they come to it, but will associate it with different opportunities and atmosphere if their encounters with the 'fast' layer differ significantly (e.g. if one comes on quiet mornings as opposed to during rowdy nights) (cf. [14]); i.e. the composition of our mental images depends on our 'sampling strategies'. Furthermore, our presumptions about the meaning of an environment extend beyond our practice as individuals; such presumptions are always socially dependent and always refer to collective practice. The structure of the mental image transforms as familiarity with the place grows and the meaningful relationships between components become established. (This transformation can also be seen as an accumulation of layers. We retain limited access to these layers; e.g. we can usually still recall our first impression of a significant place with which we have since become intimately familiar.) Using redundant elements, deduction and prediction, we can very efficiently compensate for change or inconsistencies that routinely occur in the environment. This chapter suggests that the distinct personal character of mental images, their redundancy, flexibility and dynamism can be exploited when affecting the user's abilities to understand, remember, navigate and interact.

B. Mediation Devices

A VE is a dynamic system of algorithms and data sets that is only accessible to experience through a layer of mediation. The digital world exists as navigable experiential space/place only in the user's mind. The processes of learning about and appropriating space are constrained and shaped by mediation devices. The influence of these constraints can be averaged for all users at all times if the set of mediation devices is identical for all. A typical example would be a VE (such as a first-person-shooter game) where players have points-of-view (POVs) that are positioned identically relative to their avatars (e.g. where cameras are aligned with and constrained to the heads of user avatars) and where the visuals (e.g. rendering style, object depiction), sound, lighting and effects are identical for every camera in the world. One reason for such uniformity is that these games are contests of skill and it is only fair to provide identical conditions for all competitors. However, even in these lapidary examples, variations are introduced when players can choose avatars of differing height or switch between first- and third-person POVs. VR research has traditionally striven to minimize the interference of mediation devices, which makes sense for installations such as a flight simulator or its relatives. However, current technologies do not show how the mediation layer can ever become fully transparent. The first-person POV, even when enhanced by immersive stereoscopic displays with head-tracking systems, is never automatically the 'correct', 'natural' or even the 'best' camera choice. In a mediated environment, the choice of camera is an artistic decision that is a function of the expressive goals of the author.
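The claim that the camera is an authored choice rather than a neutral window can be made concrete with a small sketch. The 2D geometry and all names here are illustrative assumptions, not code from any cited system: given one and the same avatar state, two POV policies mediate it into different presentations.

```python
# Hedged sketch: the same world state under two authored POV policies.
# Geometry is a toy 2D model; names are hypothetical.

def first_person(avatar):
    # Camera locked to the avatar's head: position coincides with it.
    return {"pos": avatar["head"], "look": avatar["facing"]}

def third_person(avatar, distance=3.0):
    # Camera pulled back along the avatar's facing direction.
    x, y = avatar["head"]
    dx, dy = avatar["facing"]
    return {"pos": (x - dx * distance, y - dy * distance),
            "look": avatar["facing"]}

avatar = {"head": (10.0, 5.0), "facing": (1.0, 0.0)}

# Identical state, two different mediations of it:
fp = first_person(avatar)
tp = third_person(avatar)
print(fp["pos"])
print(tp["pos"])
```

Nothing in the world state forces either policy; the selection (and the `distance` parameter) is exactly the kind of expressive decision the paragraph above attributes to the author.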
C. Mediation in Cinema

If the generative role of the mediation layer is unavoidable, why not use it for place-making? Cinema demonstrates how this can be done in a multitude of powerful and versatile ways. A number of frame-composition principles, editing strategies and solutions for set design have been developed over the years to deal with the presentation of spatial environments on a two-dimensional screen. Cinema routinely constructs, modifies, interprets and repurposes real and imaginary places to support its narrative and dramatic needs. This chapter suggests that cinematic strategies can usefully inform the design of the mediation layer in VEs. This vast topic requires substantial further work and experimentation. For the purposes of the discussion presented above, this chapter includes one example that demonstrates how cinematic mediation can utilize the affordances of a place to create a dramatic construct that aims to elicit a strong emotional and moral response from its audience.


D. Example 1: Odessa Steps

The example is the famous Odessa Steps sequence from Eisenstein's Potemkin [58]. The events of the sequence take place on the grand Classicist Steps (1837-42), 136 m long and 27 m wide, built to give access to spectacular views towards the sea from the surrounding parkland: a popular place for promenades, concerts, romantic rendezvous and the like. This sequence develops a theme that was of interest to Eisenstein throughout his career [18] and has since been reinterpreted many times elsewhere. It is an inherently spatial idea: an implacable, automatic, machine-like advance of an overwhelming evil force that, once set in motion, is beyond anybody's control. The soldiers on the steps are such a force: a faceless (no close-ups) mass of marching boots that steadily rolls on, unperturbed by the havoc it creates. In order to deliver this idea of a blindly advancing force carried as if by momentum, the sequence exploits the axial nature of the place and transforms it, both visually and symbolically, into a corridor-like trap full of anxiety and danger. The beginning of the sequence shows a fleet of small sailboats crossing the harbour and approaching the rebellious battleship. After that, the Steps are introduced with a series of medium shots and close-ups showing an enthusiastic crowd of townspeople admiring the scene from the top of the Steps. The images of joyful women and children waving to the ship introduce the Steps as a safe public place with a commanding view over the harbour. The intercutting between the shots of the activities around the ship and the townspeople establishes a direction of sight, to which the Steps are aligned, that becomes a symbolic axis along which the opposing political forces press forward towards collision. The unsuspecting people find themselves jammed between two symbolic powers (materially represented) that compete in the tubular space of the Steps.
As the massacre scene unfolds, the spectators, together with the panicking on-screen crowd, are forced to re-evaluate the place in a hectic search for ways to flee: open spaces spell danger; low retaining walls afford tiny pockets of refuge; the top end of the sloping axis is the source of threat and the fenced-in bottom is the only point of escape. This event and its dramatic portrayal reveal unexpected and shocking affordances previously dormant in the lavish structure of the Steps, built for the pleasure of the local bourgeoisie.


Fig. 1. The tight-shot selection throughout the sequence locks the viewer within the confines of the polarized space, in the midst of the action. In reality, the flanks are open for views and movement.

Fig. 2. The first of these shots shows, in compressed form, the Steps interpreted as a symbolically laden (high vs. low, dark vs. light, sculpture ↔ soldiers ↔ people ↔ church) axis of power and danger. The only visual connection to the 'outside' shows the people hopelessly searching for a refuge that this cinematically re-fabricated place does not afford.

E. Mediation in VEs

In non-game RT VEs, related work exists on automated cinematic cameras and automated expressive lighting [16][25][50]. The subfield of games is also becoming more sophisticated [37] in its use of in-game camera work (e.g. in Resident Evil [61]) and proclaims cinematic aspirations in such titles as 007 Nightfire [56], The Two Towers [63] and FIFA 2003 [59][31], in addition to the well-developed use of cut-scenes and 'machinimas' [33]. Our work on the Haven project [40][41], together with observation of creative practice in computer games, suggests a number of benefits that can be achieved if the activities of the user, their spatial context, and the mediation devices are designed together with an intention to support and guide user interest. Narrative theory has suggested [4][7] that suspense, either proairetic (created by anticipating an action's resolution) or hermeneutic (caused by unanswered questions), is the main attractor that drives the reader through a story. In VEs, the context of situated actions is one of the carriers of dramatic tension and one of the generators of suspense. The store of mediation devices includes virtually unbounded camera positioning, shot selection, and shot-sequence editing; lighting, sound design, and special effects. Interactive events, spatial structure, and mediation devices work in unison when integrated with dramatic purposes in mind. For example, it becomes possible to reinforce or diminish the significance of an environmental feature as a landmark by associating it with a suggested socio-historical background, functionality, and unique interactive dramatic encounters, and by providing dedicated lighting and camerawork that idiosyncratically depict form and establish scale and relationships with the other spatial elements and the avatars. The new relationships between environmental features and their impact on mental representation can have a dynamic character. As the user returns to a location to complete an action or relates to a location for navigation, an interpretation can change, creating contrast and contributing to narrative accrual. Functional zones of architecture, thematic content divisions, and distinct moods or styles can be arranged to align in a particular fashion to correspond with dramatic requirements. Mediation devices can establish and animate a distinct emotional atmosphere in a location, delineate its boundaries, and suggest genre and expected behaviour. In a multi-user setting, information and interpretations can be selectively distributed among the users, according to their roles in the VE, their goals, the author's intentions, or users' previous actions. This strategy enables users to communicate, collaborate, and compete within the same coherent experiential field that they recognize as a shared spatial context. At the same time, purposefully arranged permanent or dynamic misalignments of users' mental images can provide rich scope for creating dramatic conflicts, encouraging collaboration, or facilitating mutual learning and social behaviour.

F. Example 2: Resident Evil

As an illustration of the aforementioned ideas, this chapter briefly considers two examples from a game with innovative camerawork and a strong theme. Resident Evil [61] is a role-playing game (RPG) adventure/survival-horror title. The game mechanics are structured as a system of spatial puzzles and resource-management tasks. The game controls the camerawork and attempts to create darkly atmospheric environments consistent with the horror genre.
It must be emphasized that, quite apart from conceptual, artistic or technical problems, the game by design does not attempt to go past stereotypes in its story or character development, and the same is true of the design of the environment. Structured according to an inflexible recipe, the game presupposes an average fixed-profile player who is expected to correspond to a speculative sociological description and to be able to refer to a set stock of cultural conventions. Paradoxically, despite the apparent freedom of navigation and interactive access, the game is constructed as a unimodal prescriptive work that expects the player to fit into a pre-cast model, a fact that makes the game-play prone to accidental misreadings and misinterpretations. Work by Eco [17] and Hall [24] on the encoding/decoding of media discourses can be particularly illuminating in this context. In their view, the work (text) allows alternative readings and its meaning is located somewhere between the producer (author) and the audience (reader, user or player). Cf. Tosca's [51] description of typical adventure games as 'closed' texts that operate with expectations in a genre-constricted space and do not aspire to utilize, subvert or break our interpretive schemata. Reception theory and reader-response theory offer a toolset that appears to be well suited to the analysis of VEs. Such in-depth analysis of 'negotiated' readings and the 'model reader' is, however, outside the scope of this chapter. The way the knowledge of genre conventions and typical game routines influences experiences of the world and produces randomly disruptive results can frequently be observed when inexperienced or first-time players have to come to terms with the way the avatar of the player-controlled protagonist is implemented as an extension of the game-world. For an experienced player it is a given that the avatar and the game-world are designed together, so that if the player encounters an obstacle it is understood that the avatar will have the 'physical' or 'functional' capacities to overcome it. In fact, the puzzles can often be resolved via a process not unlike reverse engineering. For example, the discovery of an unreachable ladder suggests either that the environment can be reconfigured to access the ladder or that the avatar has (or has recently acquired) capabilities beyond the ones the player has been aware of before this moment. Inexperienced players often get baffled by relatively simple problems that are sometimes not even designed as challenges, and it is now common practice to rely on walkthroughs even among more experienced players. On the one hand, novice players might not see the opportunities for agency that a game-world provides, but on the other hand they also tend to expect to be able to actuate events and make inferences that are not supported by the game-world. A common strategy for a horror title is to introduce an archetypal place associated with privacy and safety and gradually reconfigure its image into a source of distressing danger. The introductory sequence of Resident Evil attempts to do exactly this. The sequence begins with a pre-rendered cut-scene.
The player arrives by helicopter in a dark forested area and finds another, crash-landed, helicopter that belonged to the team he/she is there to rescue. The environment appears to be full of danger and the sense of helplessness is strengthened by the fact that the player cannot see his/her way in the dark. The player sees a brightly lit building in the distance and recognizes it as a country mansion. It promises rescue, and when suddenly attacked by dangerous dogs the player speeds towards it. The player enters the building and recognizes it as an old European-style mansion, an exotic and curious artefact in an American (as it is defined by the background story) landscape and a stereotypical setting for many romantic, mysterious or detective narratives. The cut-scene ends and the player is in control of the avatar.

60

Spatial Context of Interactivity

Fig. 3. The shot on the left is set to support navigation in this confined space and to draw attention to the functional details in the environment: the typewriter allows saving a game state, the box holds weapons and ammunition, and the ladder is the exit. These three shots also demonstrate that the game attempts to create environments with different character. However, the distinction between the rooms is usually purely visual (and sometimes supported by such effects as running water with its associated sound).

After entering the building, the player proceeds through a series of rooms (the Main Hall, the Dining Room, the Kenneth Room, etc.) and engages with a number of game-world artefacts that exhibit distinctly game-like behaviour. The player sees several spatially and conditionally triggered cut-scene dialogues with the team members, presented in a wide-screen format with cinematic editing that is distinct from the in-control standard-screen format. The player encounters game objects such as doors that, when caused to open, play an abstract-looking animation of an opening door on a black background and transport the player into a new room; a typewriter that allows saving; and even the first zombie. The player is introduced to the environment and the agency it supports and is explicitly encouraged (in fact, forced by the conditional event-flow) to explore the Main Hall, which is topologically complex and will be revisited many times during the game. From the start, the player has to engage with a number of object-metaphors that have a schematic or purely symbolic connection with their real-world analogues. However, up to a point the environment remains relatively congruent with the genre- and story-driven requirements of the game: the interface-level conventions that the player is asked to accept do not disrupt the suspenseful atmosphere of this imaginary world. Unfortunately, as experience of the game-world accumulates, the discrepancies and the illogical and misleading spatial layouts come to undermine the mysterious atmosphere that the game sets out to build and maintain. This negative impact is reinforced by the use of abstract puzzle-like setups that are familiar to the player from many other games and whose self-contained logic often fails to relate to the place-like qualities of a VE.
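The interface-level conventions just described, doors as masked room transitions and the typewriter as a save point, can be sketched as purely systematic state changes. This is an assumed, hypothetical reconstruction for illustration, not the game's actual implementation; it shows how a situational object (a typewriter) can be bound to an abstract system behaviour (snapshotting mutable state).

```python
# Illustrative sketch (hypothetical names): situational objects bound
# to systematic behaviours, as in Resident Evil's doors and typewriter.

import copy

class Game:
    def __init__(self):
        self.room = "Main Hall"
        self.inventory = []
        self.saves = []

    def use_door(self, destination):
        # In the game the transition is masked by an abstract door
        # animation on a black background; systematically it is just
        # a change of the current-room state.
        self.room = destination

    def use_typewriter(self):
        # The typewriter is a save point: snapshot the mutable state.
        self.saves.append((self.room, copy.copy(self.inventory)))

game = Game()
game.inventory.append("ink ribbon")
game.use_door("Dining Room")
game.use_typewriter()
print(game.saves)
```

The gap the chapter criticizes is visible even here: nothing in the systematic layer depends on the object being a typewriter rather than any other token, so the situational meaning does no work beyond signposting.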
In Resident Evil, one of the first disruptions occurs in the Woman Pouring Water Room: the sculpture is positioned in the middle of the room for no obvious reason; a map is hidden in the pot it holds, its position indicated by glistening light; a large crate blocks the way and can be moved towards the sculpture so that the map can be extracted. From this moment the game-world balance shifts towards the systematic component of the equilibrium, and the player begins solving repetitive trial-and-error puzzles and managing items at the expense of dramatic engagement. The systematic character of interactive involvement and the clichéd game-mechanics become disconnected from, and even contradictory to, the conditions and possibilities provided by the place seen as situation. This type of dis-coordination between systematic and situational orders recurs throughout the game. Imaginatively furnished and lit, the game locations create a mysterious atmosphere but do little to support complex inferences and agency. The game relies heavily on its internal logic and uses a limited, repetitive set of objects, sounds, and camera shots to give signals to the player. The plans of the environments are architecturally incongruous, and even where it is possible to guess the function of a location (e.g. a dining room or a picture gallery), there is no connection to the action-potential open to the player or to the staged events.

Fig. 4. The structure of the Courtyard plan is typical of the game. While the views outside the Aqua Ring suggest open space, the actual topology of the environment is a structure of labyrinthine corridors with little real freedom for exploration. Looking at the Ring in its garden setting, we are intrigued by the circular opening in its centre but also presuppose paths and connections to the rest of the estate that do not exist.

The spatial configuration succeeds in encouraging speculative thought that suggests rich situational structures, but the game never utilizes them for dramatic or narrative purposes and effectively neutralizes their potential with a persistent reliance on abstracted systematic interactions (some of the typical pairings signalling agency are zone ↔ activation, door ↔ passage, hoarse breathing or threatening music ↔ monster, object in frame ↔ container or button). The examples show that some of the suggested techniques are already in use in games. Resident Evil lends itself to analysis and criticism precisely because of its innovative aspirations and achievements. This chapter suggests that it is possible to strengthen the dramatic impact and provide access to a new range of expressive solutions by creating VEs that recognize the holistic nature of places and, accordingly, implement spatial structure, interactivity, and mediation in tight integration.

5. Conclusion

In an attempt to suggest how navigable RT 3D VEs can be made more expressive, meaningful, and dramatic, this chapter discusses the holistic nature of spatial interactive mediated experiences. The first section considers space as an experience and suggests that the meaningfulness of VEs stems from their hybrid nature, a symbiosis of systematic and situational elements. The second section claims that significant components of this contextual meaning are only available via interactive access and depend on the fusion of intent and perceivable consequence: agency. The final section discusses mediation devices in RT 3D as an unavoidable layer that brings VEs into existence in the user's mind. The flexible role of the mental image is considered and some generative opportunities are suggested. Dynamic virtual place, delivered and interpreted through purposefully arranged mediation devices, can be used for contextual placement, meaning-assignment, attention-direction, narrative time manipulations, spatial-relationship definition and development, dramatic user positioning, user characterization, interactive flow interpretation, and alterations of narrative perspective without removal of interactive access. Limited space has not allowed detailed analysis of either the functional environments built for our research or those developed commercially. This remains a task for the future.

6. Acknowledgements

The authors of this chapter are grateful to Peter Carl for his contribution on the concept of situation, as well as to Elliott Dumville, Kai Fierle-Hedrick, Michael Nitsche, Maureen Thomas and others for collaboration, advice and discussion.

Bibliography

1. Augé, Marc. Non-Places: An Introduction to an Anthropology of Supermodernity. London: Verso Books, 2000.
2. Aumont, Jacques. The Image. London: BFI Publishing, 1997.
3. Bachelard, Gaston. The Poetics of Space. New York: The Orion Press, 1994.
4. Barthes, Roland. "Introduction to the Structural Analysis of Narratives." In A Barthes Reader, edited by Susan Sontag. New York: Hill & Wang, 1982.
5. Baudrillard, Jean. Simulations. New York: Semiotext(e), Inc., 1983.
6. Benedikt, Michael, ed. Cyberspace: First Steps. Cambridge: MIT Press, 1991.
7. Brooks, Peter. Reading for the Plot: Design and Intention in Narrative. New York: Vintage/Random House, 1984.
8. Canter, David. The Psychology of Place. New York: St. Martin's Press, 1977.


9. Carless, Simon. "Power Balancing: An Interview with Chris Crawford." Gamasutra, 6 June 2003 (2003).
10. Casey, Edward S. Getting Back into Place. Bloomington; Indianapolis: Indiana University Press, 1993.
11. Church, Doug. "Formal Abstract Design Tools." Game Developer, 1999.
12. Dalton, Ruth C. "Is Spatial Intelligibility Critical to the Design of Large-scale Virtual Environments?" International Journal of Design Computing 4 (2002).
13. Darken, Rudolph P. and Peterson, B. "Spatial Orientation, Wayfinding, and Representation." In Handbook of Virtual Environment Technology, edited by Kay M. Stanney, 2001.
14. Debord, Guy. The Society of the Spectacle. New York: Zone Books, 1994.
15. Downs, Roger M. and Stea, David, eds. Image and Environment: Cognitive Mapping and Spatial Behaviour, 1973.
16. Drucker, Steven M. "Intelligent Camera Control for Graphical Environments." 1994.
17. Eco, Umberto. The Role of the Reader: Explorations in the Semiotics of Texts. Bloomington: Indiana University Press, 1979.
18. Eisenstein, Sergei. Immoral Memories: An Autobiography. Boston: Houghton Mifflin, 1983.
19. Fencott, Clive. "Virtual Storytelling as Narrative Potential: Towards an Ecology of Narrative." In LNCS 2197, edited by O. Balet et al., 90-99. Berlin; Heidelberg: Springer-Verlag, 2001.
20. Gibson, James J. The Ecological Approach to Visual Perception. Hillsdale; London: Erlbaum, 1986.
21. Golledge, Reginald G. "Environmental Cognition." In Handbook of Environmental Psychology, 1, edited by D. Stokols and I. Altman, 1987.
22. Gould, Peter. Mental Maps. Boston; London: Allen & Unwin, 1986.
23. Griptonite Games (developer). The Lord of the Rings: The Two Towers. Electronic Arts (publisher), 2002.
24. Hall, Stuart, ed. Culture, Media, Language. London: Hutchinson, 1980.
25. He, Li-wei, Cohen, Michael F. and Salesin, David H. "The Virtual Cinematographer: A Paradigm for Automatic Real-time Camera Control and Directing." In Proceedings of SIGGRAPH 96, 217-224. New Orleans: ACM SIGGRAPH, 1996.
26. Heidegger, Martin. Being and Time. London: SCM Press, 1962.

64

Spatial Context of Interactivity

Biographies

Stanislav Roudavski is an architect and designer with interests in visual art and digital media. He has more than ten years of combined professional experience in graphic, interior, architectural and urban design, 3D modeling, animation and special effects, as well as research experience in several European countries. He has been involved in post-graduate teaching at the University of Cambridge, MIT and other higher-education institutions. He currently works in multimedia and VE within the practice-based research environment of the Cambridge University Digital Studios and the Cambridge University Moving Image Studio (CUMIS).

François Penz is an architect and the founder-director of CUMIS. He teaches in the Department of Architecture at the University of Cambridge, where he runs an MPhil course in 'Architecture and the Moving Image', part of the Digital Studios. He co-edited with Maureen Thomas 'Cinema & Architecture' (BFI, 1997) and 'Architectures of Illusion' (Intellect, 2003). His main research topic is the history of the relationship between Cinema and Architecture and how this relationship informs the current debate on the use of digital moving images in Architecture. He is a fellow of Darwin College and a founder director of the company ScreenSpace.

Interactive Multimedia = Whatever Intermedia

Julainne Sumich

“Interactive Multimedia = Whatever Intermedia” offers a perspective on how intermedia, as an autonomous affect occurring between any media, might be important to critical issues of research in multimedia. The form its writing takes is similar in its autonomy, analogous to the independence of the work of art. It starts with points of interest, defines a few of these, and asks questions of them, such as: does that look right, does it work? The initial phrases have an affect on each other. They mutate and multiply, and the writing begins to talk back, to create its own image.

When people hear or read the term ‘intermedia’ they often seem uncertain as to what it means. Some suggest that it has ‘something to do with multimedia’. It is the difference and similarity between the concepts of intermedia and multimedia as art practices that motivate me to investigate the parts and processes that bring them into proximity, such as media and mediums, interactivity and multiplicity; what it is that distinguishes one expression from the other; and what critical contribution the process of this investigation might make to research in interactive multimedia. Alternating with these concerns is the development of a theory of affect. 1 How might a consideration of the actuality of affect contribute ideas as to what happens at the intersection of art and technology in interactive multimedia? Do the processes of affect come to characterize virtually real spaces, i.e. spaces that exist but cannot be named or seen? Can we picture the human as a medium through which technology affects other mediums? What happens if we think of technology as a science of insight occurring in the interaction between such media? To begin to assemble what might emerge between these currents of interest, the writing sets out explanations for investigation.

Definitions

Interactive multimedia is a term generally explained as the combined use of several media in an interactive whole. The focus of its definition has often been on education and training in communications systems, with an emphasis on the hardware necessary to empower the production process and on the standardisation of production in software design.


Interactive multimedia can be defined as "the integration of text, audio, graphics, still image and moving pictures into a single, computer-controlled, multimedia product" (McCarthy, 1989, p. 26). Most current definitions describe a powerful computer connected to a variety of other equipment: videodisc players, compact disc players, scanners, music synthesizers, high-resolution monitors, etc. 2

In his book Theoretical Foundations of Multimedia, Robert Tannenbaum emphasises the importance of a multidisciplinary approach to multimedia development, one that interweaves “principles, concepts, and theories” from a wide variety of disciplines in the arts, human and computer sciences, and law to provide a knowledge base for designing and strengthening multimedia productions. 3 Definitions of communication(s) - “Communication (no s): the transmission of information between two minds. Communications: the system used to accomplish the communication. Communication in computer science: the electronic transmission of bits” 4 - and diagrams in Tannenbaum’s online slide displays demonstrate an understanding of interactive multimedia as a circuit of transmission of computer information modelled on the transmission of sensorial data in human communication systems. 5 Is communication a closed circuit? The diagrams variously refer to ‘air’ and ‘noise’ in the transmission of data in communication systems. What is going on? What is missing from the equation? Under the heading Communication as a Process, Tannenbaum comments on David Berlo’s contention “‘that one could prepare a list of the components of the communication process, but that would not capture the essence of the process, its dynamic nature’ – the same could be said for multimedia.” 6 At SIGGRAPH 1999 it was evident that, as well as designing in the electronic components of multimedia, the discipline needed an expanded awareness of the interactive human component.

A cross-disciplinary approach enables students to explore new ideas that inform their understanding of the implications of multimedia production on the individual and society. The evolution of communication design has reinforced the need for an understanding of human behaviour and perception. 7


Intermedia is in correspondence with interactive multimedia in that it does not fit with traditional art disciplines. 8 Its attention is on the conceptual process between media, related to the video, sound, happenings, and installation art practices that emerged ‘beyond the object’ during the Process art 9 and Fluxus 10 movements of the 1960s. A definition of intermedia’s difference from multimedia was outlined at SIGGRAPH 2001.

In intermedia, the compositional process works across the boundaries between media or even fuses media. Thus intermedia implies structures that are shared by or translated from one medium to another: in this respect it is a more specifically defined term than multimedia. While it is sometimes called "synesthetic art," intermedia does not seek to imitate the physiological phenomenon of synesthesia, but approaches it metaphorically. With the advent of digital multimedia and real time interaction and performance with computers, intermedia can now achieve a precision and synchronicity of events that were not possible until the last two decades. Moreover, digital media enable compositional structures to operate at all levels of granularity and with a degree of abstraction that places all media on the same plane. One could argue that digital intermedia is the high-level process that corresponds to the low-level truism: all media is data, a single substance. 11

“A single substance” can be said to define intermedia as a singularity, in the sense that specific to digital intermedia, and consistent with the identity of intermedia practice, is a focus on the interactive potential for affect between media. The parts can combine into a multimedia event, but attention to the autonomy of the components’ interaction affords a consciousness, or affection-image, not of a whole but of the event’s potential, a radically open any-space-whatever.
Space is no longer a particular determined space, it has become any-space-whatever [espace quelconque], to use Pascal Augé’s term… Any-space-whatever is not an abstract universal, in all times, in all places. It is a perfectly singular space, which has merely lost its homogeneity, that is, the principle of its metric relations or the connections of its own parts, so that the linkages can be made in an infinite number of ways. It is a space of virtual conjunction, grasped as the pure locus of the possible. 12

Any-space-whatever [espace quelconque] is a term borrowed from the anthropologist Marc Augé by the philosopher Gilles Deleuze to describe “a virtual space open to the potential of multiple combinations” that affects or qualifies something by the intensity or novelty of its expression. An affection-image is how Deleuze pictures the process of being moved in any way. It is the before and the after of affect. The before is analogous to the stimuli of incoming perceptual data that affect [act on] the autonomic neuro-physical process of bodily sensations. This process of affect is qualified by the any-space-whatever of the body-brain environment, the contextuality of any number of possible combinations; for example, the micro-movements of facial nerves that move muscles to form any particular ‘look’. In film it is a look assembled from how parts are framed by their relation to each other into a movement of complex expression. 13

Defining IM = WhI and the Genesis of an Image

Interactive Multimedia = Whatever Intermedia (IM = WhI). The title is an equation. It is put this way to indicate a situation or problem in which a number of factors need to be considered. The ‘=’ sign stands for a correspondence of conditions between things. The abbreviations IM = WhI are used as a means of giving the equation a character of inquiry, in the sense of I’m in correspondence with why. They occur recursively, applying their function to their own values to generate an infinite series of variables. In the opening pages of E=mc2: A Biography of the World’s Most Famous Equation, David Bodanis breaks apart the equation to reveal the character of the individual figures who constitute the theory of relativity. 14 It is a novel device, for it affects how the stories behind each part start to overlap in the reader’s mind. In a similar but different fashion, does reading the meanings of parts of words, or the origins and functions of words, make it easier to understand the complexity and syntax of the phrases they produce? By reading the following definitions as a vertical-horizontal array do we begin to assemble a code for deciphering IM = WhI?


Inter- prefix: between or among; together, mutually, or reciprocally.
Interactive adj: allowing or relating to continuous two-way transfer of information between a user and the central point of a communication system, such as a computer or television; (of two or more persons, forces, etc) acting upon or in close relation with each other; interacting.
Media n: a plural of medium; the means of communication that reaches large numbers of people, such as television, newspapers, and radio.
Medium n: an intermediate or middle state, degree or condition; mean; an intervening substance or agent for transmitting or producing an effect; a means or agency for communicating or diffusing information, news, etc., to the public. Biology: the substance or surroundings in which an organism naturally lives and grows. Art: the category of a work of art, as determined by its materials and methods of production.
Multi- combining form: many or much; more than one. 15

Engagement of the reader’s attention attempts to affect the act of reading, to move from a literal to a figurative way of seeing, of defining things. It is a device for constructing an image in the tradition of ut pictura, ut poesis (as it is for painting, so also for poetry). 16 The literal translation says that poetry is of equal value to painting. In its figurative sense, pictura is defined as a painting in words; picture; description. The definition suggests that painting is variable, in contrast to poetry, which is equal to both sides of the equation, and might read as it is for painting, so also poetry paints (as well as speaks). Brought into conjunction, the two parts elevate themselves out of their literal character and syntax into the expression of an idea. Thinking this way, does IM = WhI embody a picture of multimedia more than multimedia itself? Can ‘intermedia’ help to evaluate what is happening in and between the uses of media?
Such an expression complicates the examination of inter(active)(multi)media and its means, to include evaluation of “a space of overlappings” 17 and the affects generated between these agencies as a result of their convergence. This complexity is distinguished by two states of potentialities or affects: the affect things have as they are here and now, real connections, and the affect that is virtual, expressed in its own right, arising out of actualities but separate from their events. 18 Convergence of any mediums whatever, the data of people, words or things, gives rise to a perspective of what might be seen as a “surplus of signification” 19 always assembling itself in relation to its surroundings. It is an autonomous self-organizing affect, the constant of its identity being found in its condition of flux. It is comparable to the process of autopoiesis, a neologism introduced by the biologists Humberto Maturana and Francisco Varela “to designate the organization of a minimal living system”, which has come to be recognised as “a view of the relation between an organism and its medium, where its self-constituting and autonomous aspects are put at the center of the stage.” 20 How a system sees itself, how it is affected, how it becomes characterized in the context of its environment, is how it comes to be seen, regarded, in the environment. Such is IM = WhI’s complex system. The equation IM = WhI suggests that Interactive Multimedia can be defined as an integrated open-ended series of events in which a play between media’s component parts produces an affection-image or perspective related to and yet distinct from those properties.

SINE language

As a way of bringing these processes center-stage as a conceptual image, WhI uses metaphorical associations between different kinds of connections. Metaphor foregrounds language’s potential for making sense of the complexity of human experience and our interaction with the world. Figures of speech work by implication to produce and transfer meaning beyond the literality of language’s component parts. Some say that this imaginative use of language works only if it fits with an established notion of objective reality, that “there is a rational structure to reality, independent of the beliefs of any particular people, and correct reason mirrors this rational structure”. 21 Besides its ‘fit’ with traditional theories of meaning, metaphor can be seen to influence meaning in ways that can refit and thereby reconfigure those traditions, on the contention that “the embodied origins of imaginative structures of understanding, such as image schemata and their metaphorical projections” play an intrinsic and indispensable role in the constitution of rationality. 22 WhI capitalizes on the word SINE, which sounds like SIGN, to draw attention to the gaps in language in picturing a reality that cannot be seen. As understood in sine wave, the word indicates a force field of alternating currents and by association lends its character to the acronym SINE (Science Intermedia Network Environment), a trans-disciplinary and interdisciplinary digital research hub at The University of Auckland.


In picturing the SINE interface, WhI’s mode of expression alternates between the transfer of objective information and signs, sites, and sight as metaphors for the interactive processes of SINE website architecture, the interaction of SINE researchers, and neurophysiology and technology as means of investigation into what happens along and between different streams of information.

SINE site 23

The SINE is a virtual space whose purpose is to actively support the development of collaborative and individual projects between the arts and sciences. To make the temporal affect of the SINE’s activity visible, the website designer Jamie Kydd took his cue from the SINE wave logo and coded flows of information that occur behind and at the surface of the site’s interface like a genome - “like an array or a spread of information along a line, a string” that can go off in any direction “… or ‘off the page.’” 24 He wanted to continue the metaphor of the spread of information by arraying these flows in overlapping curves like the sine wave from which they emerge. (above: SINE site Overview)
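As a rough illustration of the layout idea described above (a hypothetical sketch, not Kydd's actual code; the function name, point count and phase offsets are assumptions for the example), several information flows can be generated as phase-shifted copies of the same sine wave, so that they overlap while emerging from one underlying curve:

```python
import math

# Hypothetical sketch: each information flow is sampled along a sine curve,
# offset in phase so the flows spread out of the same underlying wave.

def flow_curve(num_points, phase, amplitude=1.0):
    # Sample one flow along a sine curve with the given phase shift.
    return [amplitude * math.sin(2 * math.pi * t / num_points + phase)
            for t in range(num_points)]

# Three overlapping flows, phase-shifted copies of the same wave.
flows = [flow_curve(100, phase=k * math.pi / 6) for k in range(3)]
```

Plotting such arrays on top of one another would give the overlapping-curve effect the design describes.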

On the Beta Test and Circuit Board links, Jamie’s intelligent design interactively embodies an artificially living intelligence. His interpretations of biological cellular processes acquire new meaning through the computer-generated ‘probes’ of main page information. "The pseudopods in this zone are modelled on amoeba behaviour – amoeba send out 'probes' seeking nutrients and the nucleus moves to a new area." 25 (left: Beta Test Pseudopod probes) “It’s as if passing attention from one thing to another. Something that was decentred (like the Internet) so that each time one is selected for digestion other bits retract.” 26


The cell on the Beta Test page is generated from the main page sine wave. Interaction with the cell membrane sends out probes that engulf particles of material. Once a particle ‘icon’ is selected, the membrane folds inward, forming a membrane around the particle. The newly formed sac breaks away from the outer cell membrane and the ‘solid’ material inside the sac is then ‘digested’ by the user. On selection of a different sac the cell reorganizes itself in space (left: re-orientation of Beta Test Pseudopod probes). The design reflects a study of the body at the cellular level of organization, and of the mechanisms for moving substances across cell membranes that are essential to living systems. 27 It is a powerful image that brings into convergence the software-probing function of beta tests and the life-support system that software is to the computer system, bridging experiences of surface and depth analogous to biological life.

Based on its particular power of synthesis, metaphor can bridge the gaps between experience and thought, between imagination and concept, and between the new and the known. The central moment of this synthetic power is the iconicity of metaphor, which selectively evokes sensory perceptions and integrates them into meaningful constellations. Through this selective process, metaphor makes possible not only the conceptualisation of experience but also the linkage to prior experience. 28

How does interactive multimedia affect us physically and, by its power, affect our thought? It is an encounter between the audio, visual, and sensory faculties, technological processes, and scientific objectivity, not normally in relation to each other, that forms new intermedia concepts. The challenge is how to give substance to the emergent properties of their combination. How is it possible to talk about the reality of an affect without being able to point to it and to say: There, there it is! Now! Do you see? We can try pulling strings.
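The select-and-retract behaviour described above can be sketched as a small state machine (a hypothetical illustration, not the site's actual code; the class, state names and particle labels are assumptions for the example):

```python
# Hypothetical sketch of the Beta Test cell's behaviour: selecting one
# particle 'icon' folds the membrane around it for 'digestion', while all
# other probes retract, after which the cell can reorganize around a new
# selection.

class BetaTestCell:
    def __init__(self, particles):
        # Each particle icon starts outside the membrane, probed but undigested.
        self.particles = {name: "probed" for name in particles}
        self.selected = None

    def select(self, name):
        # Membrane folds inward around the chosen particle (endocytosis
        # metaphor); every other probe retracts, as in the site's design.
        for other in self.particles:
            self.particles[other] = "retracted"
        self.particles[name] = "digesting"
        self.selected = name

cell = BetaTestCell(["robots", "sine-wave", "circuit-board"])
cell.select("robots")
print(cell.particles["robots"])     # digesting
print(cell.particles["sine-wave"])  # retracted
```

Calling `select` again with a different name models the cell reorganizing itself around a new sac.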



Pulling Strings

Pulling strings of information becomes a strategy for recoding the assessment of interactivity, introducing new sets of sensibility at the interface of multimedia practice. The intermedia process of affect acts as if bridging a gap between whatever media and their product. It is as if it takes on the moment when bodily sensations are triggered by external phenomena: an alert affect transferring impulses between synapses in the nervous system. An after-affect of this interaction assembles a direct time-image contingent upon these component media but which simultaneously obtains an image of singularity autonomous from its parts. 29 “The now is not just here; it is the process of this slow arising of combinations which forms the complex assemblage.” 30 It is a gradual consciousness of the autonomic affect informing multimedia. The autonomic affect of WhI is substantiated by its embodiment in the nervous system. Affect is a precognitive phenomenon, a state of visceral anticipation. Receptors (as in the gut, blood vessels, muscles, etc.) conduct information about the state of the body to the peripheral and central nervous systems, where the data is sorted as specific sensations of touch, hearing, vision, position, etc. “When sensory information reaches the cerebral cortex we experience precise localization. It is at this level that memories of previous sensory information are stored and the perception of sensation occurs on the basis of past experience.” 31 WhI can be described as a process of ‘pulling strings’: of stimulating the exertion of personal influence, and of how we are codified with the ability to ‘pull the strings’, to conceptualise from interaction with any data whatever, and to put those concepts to the test. Beta Test pages scroll by pull-down strings to pull sets of words that are already ‘strings’ conducting technical information. They play between the information, and information about the information; they play on control systems.
(above: Pull-down Strings. Vision Sam 2001, Robots project between Electrical & Electronic Engineering and Fine Arts, University of Auckland)


In later tests the strings resemble nerve fibres and synaptic knobs connecting sensory information to memory. (left: Synaptic links to Memory in Robot Theatre Sam 2002) The time it takes to register an assemblage’s “intrinsic and practical relations” 32 and to become aware of their combined resonance can be related to brainwave tests carried out by the neuroscientist Benjamin Libet, which demonstrated a “readiness potential” in neurological processes. Libet’s experiments showed brain activity already occurring a split second prior to the decision to act, and that following this there is a split second between the decision and the completed action. 33 It amounts to a half-second of autonomic and conscious activity where the autonomic is pure sensation; an affect which is then mixed with memory traces.

Brain and skin form a resonating vessel. Stimulation turns inward, is folded into the body, except that there is no inside for it to be in, because the body is radically open, absorbing impulses quicker than they can be perceived, and because the entire vibratory event is unconscious, out of mind. Its anomaly is smoothed over retrospectively to fit conscious requirements of continuity and linear causality. 34

This autonomic half-second is the thesis of the intermedia affect, the WhI of interactive multimedia in its embodied time. Its affect is pure sensation, the deep now. 35 It occurs as an in-flashing, bringing into play the accumulation of sensorial data and neuronal activity, and projecting points of its insight. This form of insight is not understood as a conscious reflection but as something that arises autonomically as an in-forming, 36 of vision [Einblick] as in-flashing [Einblitz], 37 in a shot of time.

The Alert-affect

Coinage of the term alert-affect prepares the way for IM to become embodied with the readiness potential intrinsic to WhI. Modelling itself on Libet’s readiness potential, it works introspectively in the body and retrospectively in memory, depending on the context of its stimulus and the intensity of the affect.


The alert-affect arises through the interaction of any means of communication whatever and is a time of engagement, ready for action. It is material in its substance:

When neurons become active (a state known in neuroscience jargon as “firing”) an electric current is propagated away from the cell body and down the axon. When this current arrives at a synapse, it triggers the release of chemicals known as neurotransmitters. In an excitatory neuron, the cooperative interaction of many other neurons whose synapses are adjacent determines whether or not the next neuron will fire, that is, whether it will produce its own action potential, which will lead to its own neurotransmitter release, and so forth. 38

The word chemical derives from Late Greek khemeia, “the art of transmutation.” 39 These naturally occurring chemical substances, which create a bond between nerve impulses and receptors in the brain, can be considered as the substantial evidence for affect as an intermedia process, and the circuits within which their activity simultaneously arises as the any-spaces-whatever. Neurons are organized into functional units known as neuronal pools or circuits, of different patterns and with specific roles in maintaining equilibrium in the nervous system. These include complex circuits such as the reverberating (oscillatory) circuit, whose associated body responses are thought to include the rate of breathing, coordinated muscular activities, and waking up. (above: a reverberating circuit) 40 In this pattern, the incoming impulse stimulates the first neuron, which stimulates the second, which stimulates the third, and so on. Branches from the second and third neurons synapse [from Greek sunaptein, to join together 41] with the first, however, sending impulses back through the circuit again and again. A central feature of the reverberating circuit is that once fired, the output signal may last from a few seconds to many hours.
The duration depends on the number and arrangement of neurons in the circuit. 42
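As a rough illustration of the reverberating circuit just described (a sketch under assumed parameters, not taken from the cited source), a few lines of Python can model how feedback from later neurons keeps an output signal circulating after a single incoming impulse until it decays below a firing threshold:

```python
# Minimal sketch of a reverberating (oscillatory) circuit: neuron 1 fires
# neuron 2, neuron 2 fires neuron 3, and branches from neurons 2 and 3 feed
# back to neuron 1, so one incoming impulse re-circulates. The decay factor
# and threshold are illustrative assumptions; in real circuits the duration
# depends on the number and arrangement of neurons.

def reverberate(impulse=1.0, decay=0.9, threshold=0.1):
    cycles = 0
    signal = impulse
    while signal > threshold:      # circuit keeps firing above threshold
        signal *= decay            # each loop through 1 -> 2 -> 3 -> 1 weakens it
        cycles += 1
    return cycles                  # how long the output signal persists

print(reverberate())               # prints 22 with these illustrative values
```

A slower decay (stronger feedback) lets the signal reverberate for many more cycles, mirroring the text's point that duration depends on the circuit's arrangement.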


The SINE site’s alert-affect is in its eyeball hang-time. 43 It is a time of attention when interaction takes place, processing the potential for new patterns of thought and action. The SINE logo is an animated sign of the activity of the SINErs who contribute to the alternating currents of events in offline/online SINE research hub discussions. This intermedia alertness is a process of animation, a series of self-systematized codings and events characterized by selection and retraction, working in parallel with the spread-and-retract design of the SINE site. It is a self-organizing system, a form of autopoiesis, similar to the culture emerging between SINE researchers.

SINErs

Can the SINE be described as an autopoietic system? Thinking about this question gives rise to an affection-image, an image of a movement. If such a definition relies on how it appears to observers in the local environment, then the answer is ‘no’. If its definition amounts to more than meets the eye; if it is a living entity constituted by its environment and, in a simultaneous mutual operation, constituting a world of its own, i.e. constituting a perspective of significance, an identity, surplus to what is visible, then ‘yes’, the SINE can be defined as an autopoietic system. It is a growing culture. The SINE grew out of sensing something intentional going on internationally at the science/art interface. Conferences, festivals, and exhibitions in Australia and the United Kingdom - SciArt’99, Brisbane; CaiiA-STAR Consciousness Reframed III, 2000, Wales; Creating Sparks, the British Festival of Science 2000, Royal College of Art; and Force Fields: Phases of the Kinetic, Hayward Gallery, London - were some of the examples. In response, the hub came into being through a collection of interested people in 1999 and developed further in 2001 by sending out probes across the faculties for whatever interest there might be in such a research environment. The purpose behind the SINE combines the University of Auckland’s mission, to foster and disseminate high quality research and creative work, with aspects of CaiiA-STAR’s established philosophy on the integration of art, science, technology, and consciousness research. 44 SINE researchers include faculty staff and postgraduate students from Fine Arts, Architecture, Electrical & Electronic Engineering, Computer Science, Maori Studies, Art History, Immunology, and Psychology. We make a loose-knit virtual environment; our reason for coming together, as one researcher put it, is “the difference in perspective.”

Julainne Sumich


As participants in an intermedia network of different disciplines we are situated in relation to the research hub’s agency. In other words, we are the hub. SINErs constitute the SINE as a cognitive entity with its own perspective, and can be described as both singular and multiple systems of autopoiesis. Recontextualizing the “neuro-logic” of Francisco Varela’s Biology of Intentionality, this identifiable cognitive entity “relates to its environment in relation to the constantly emerging properties of the agent itself and in terms of the role such running redefinition plays in the system’s entire coherence.” 45 As he did in his analysis of minimal cellular life, Varela differentiates between environment and world, going on to develop a theory of the neuro-logic of the cognitive self as a both closed and distributed process of embodied behaviour, realized in operation with, and inseparable from, these different forms of environment.

“First, I have tried to spell out the nature of its identity as a body in motion-and-space through the operational closure of the interneuron network [i.e. as an autonomous system]. This activity is observable as multiple sub-networks, acting in parallel and interwoven in complex bricolages [patchworks of different temporal and generic parts], giving rise again and again to coherent patterns which manifest themselves as behaviours. Secondly, I have tried to clarify how this emergent, parallel, and distributed dynamics is inseparable from the constitution of a world, which is none other than the surplus of meanings and intentions carried by situated behaviour.” 46

In this diagnosis of identity the genetic elements described as interneuron network, emergent, parallel, and distributed dynamics, and world run parallel to Deleuze’s interactive circuit of affect, any-space-whatever, and affection-image in the constitution of a movement-image.
That is, Deleuze’s movement-image is a ‘world’ unto itself as a concept but which remains radically open. SINErs come and go, exchange ideas; the interacting waves of their own in-formation, always different in volume and frequency, produce new waves of thought. They bring probing questions reorientating the SINE’s coordinates and by extension its virtual space, simultaneously retaining past experiences in the now and evolving projections of potential.


“The action of neurons depends on the nearby assembly of neurons they belong to; whatever systems do depends on how assemblies influence other assemblies in an architecture of interconnected assemblies; and finally, whatever each assembly contributes to the function of the system to which it belongs, depends on its place in that system.” 47

Some SINE ideas hold the attention and gel together for a project; then disassemble again - just as the Circuit Board breaks off from its mapping, breaks open into new work - then retracts again. (left: Circuit Board map) The Circuit Board was designed by processing the SINE logo in 3D to form a map, bits of which could be accessed randomly to add information on conferences, exhibitions, visiting interdisciplinary artists, and new works. Interacting with other SINE disciplines is a practice of uncertainty relations between familiar things and whatever differences. Yet to a certain extent aren’t we always in that situation “between knowing and non-knowing”? Never quite knowing what’s coming next? 48 The research for a current collaboration project, Interaction in Embodied Time, is based on a reality-based ‘readiness potential’ simulation of human interaction with images, sounds, and time delays, which provides a sensation of what is usually stimulated in the body autonomically, in advance of cognition. The design’s artificial intelligence is built on the model of subsumption architecture in robotics, structured according to a basic-necessity control system progressively ‘subsumed’ or incorporated into more complex control systems. The third element is novelty processing - where whatever is “new (never experienced before), unexpected (out of context) or unpredictable stimuli” 49 is reticulated or actively networked. Can we say that these dynamics of different stimuli and various combinations in WhI correspond to IM, Interactive Multimedia?
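The layered control described above can be sketched in code. The following is a minimal, hypothetical illustration of priority arbitration in the spirit of subsumption architecture - a basic layer runs by default, and higher layers override (‘subsume’) it when triggered, with novelty processing at the top. The layer names and behaviours are illustrative assumptions, not the project’s actual implementation.

```python
# Minimal sketch of subsumption-style layering (illustrative only):
# behaviours are stacked in layers; when a higher layer is triggered,
# its output subsumes (overrides) the layers beneath it.

class Layer:
    def __init__(self, name, triggered, action):
        self.name = name
        self.triggered = triggered  # stimulus dict -> bool
        self.action = action        # stimulus dict -> response string

def subsume(layers, stimulus):
    """Return (name, response) of the highest triggered layer."""
    for layer in reversed(layers):  # highest-priority layer is last
        if layer.triggered(stimulus):
            return layer.name, layer.action(stimulus)
    return None, None

# Layers ordered from basic necessity upward, echoing the text:
layers = [
    Layer("idle",    lambda s: True,                lambda s: "ambient loop"),
    Layer("respond", lambda s: s.get("presence"),   lambda s: "track visitor"),
    Layer("novelty", lambda s: s.get("unexpected"), lambda s: "orient to new stimulus"),
]

print(subsume(layers, {}))                                      # basic layer only
print(subsume(layers, {"presence": True}))                      # respond subsumes idle
print(subsume(layers, {"presence": True, "unexpected": True}))  # novelty subsumes all
```

The point of the sketch is structural: the lower layers never stop existing; they are simply overridden when something new, unexpected or unpredictable captures the system’s attention.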



The affect of these interactions produces behaviours in common with the synchronised and sometimes mismatched interactions between the autonomic and consciousness systems in human neurology. Mismatched interactions can add grist to a system; they provide a sudden opportunity to turn its affect to advantage. Readiness to take advantage of what might usually be dismissed as a momentary lapse, a miss-take in productivity, relates to the basic temporal interval as the most important component of ‘readymade’ music in the conceptual art of Marcel Duchamp 50 [an early SINEr]. Exemplified in works of ‘canned chance’ and readymade ‘noise’ occurring in the system, its autonomous moment is integral to the intermedia art process, working both synergistically and agonistically against the grain to activate a movement of re-action.

6. Technology

A picture has emerged of the SINE’s autopoietic processes, its technology, simultaneously sympathetic and autonomic to the control systems of different disciplines. (left: a parallel after-discharge circuit) 51

In a parallel process an image schema, an affection-image, arises between technology and human imagination in Interactive Multimedia. WhI asks: how do we define this ‘technology’?

“[F]or techne signifies neither craft nor art, and not at all the technical in our present-day sense; it never means a kind of practical performance. The word techne denotes rather a mode of knowing. To know means to have seen, in the widest sense of seeing, which means to apprehend what is present, as such.” 52

Does technology organize our daily lives? The conceptual process of art is an attempt to enable a sense of truth to emerge from the everyday; something incommensurable, indefinable, invisible, something that reveals itself as an actuality instrumental in watching over 53 an informed ordering of resources. It is a technological process for revealing what is really going on.



In The Question Concerning Technology Martin Heidegger situates modern technology as an enframing, a challenging to mankind. From the ordinary usage of the word “Gestell” [frame] he intends its use as Ge-stell [Enframing]. Heidegger explains taking this liberty in light of Plato’s daring to use the word eidos [which in common usage meant the outward appearance] as “that which in everything and in each particular thing endures as present”, and which is also palpable to the senses yet “what precisely is not and never will be perceivable with physical eyes.” 54 Heidegger similarly uses an externally oriented word to determine an idea.

“Enframing means the gathering together of that setting-upon which sets upon man, i.e., challenges him forth, to reveal the real, in the mode of ordering, as standing-reserve. Enframing means that way of revealing which holds sway in the essence of modern technology and which is itself nothing technological.” 55

In its hyphenation Heidegger intends an animation of Ge-stell, recognising the stellen [to set upon] from which it stems, while at the same time retaining a sense of other forms of the same derivation, “namely, that producing and presenting [Her- und Dar-stellen] which, in the sense of poiesis, lets what presences come forth into unconcealment.” 56 As the translator’s footnote emphasises, Enframing is a calling-forth, a summons. 57 It is both a charging with responsibility and an invitation you cannot refuse to respond ‘technologically’, with knowledge aforethought. Enframing can be thought of as containing a sense of the continuous “sift and select” process in the reticular activating system, the physical basis of consciousness, the brain’s chief watchguard. 58

Heidegger uses words from the same linguistic ‘stem/cell’ world to suggest that the limits of enframing can be productive - that it can affect creativity, be generative - just as the world of the human- or biotechnology interface, while on the one hand being confining, can on the other hand be productive; one stands as the [p]reserve of the other. In other words it is a double movement of a permeable world whereby what enframes is coherently and physically linked to a bringing forth, in much the same way as Varela’s scheme of how the cellular self is in-formed produces an image of intentionality. Varela maintains “that we gain from seeing the continuity between the fundamental level of self and the other regional selves such as the neural and linguistic.” Any living system becomes “in-formed for a self” in its interactive argumentation.



“Such in-formation is never a phantom signification or information bits, waiting to be harvested by a system. It is a presentation, an occasion for coupling, and it is in this entre-deux that signification arises.” 59

Heidegger’s question is whether mankind is to live instrumentally according to the dictates of technology or, through questioning technology, to live poetically. 60 In its instrumental sense technology is thought of as a set of practices conforming to rules and criteria established in relation to the uses to which new technologies have been put. In this instrumental sense technology is formatted or framed according to its predetermined application. A basic example in relation to digital video practice: camera, tape, computer and software are the media and electronics by which a representation of real life can be recorded and made into a movie. In this common sense, what is science and what is art? Can the instrumental uses of technology be called art practice? They can be termed ‘scientific’ in that they involve a technical know-how of the workings of the digital world and of how its parts behave under certain conditions. They are instrumental in determining to what commercial or industrial ends technology can be pragmatically or effectively applied. Then there is the alternative aspect of technology, the affective use, or the “techno-allegorical” designation, 61 one that involves the interaction of the user as also a technological component, who brings substantial know-how of art practice and a specific sensibility to the media and tools irrespective of their predetermined ends. Between the user of any media whatever and technology there is a bringing forth of an indeterminate sensation. This combined affect [poiesis] brings about a revealing of something ‘other’ concealed in the practical aspects of technology. 62 For this aspect there can be no art or science instruction booklet. It emerges out of the mix of technology in the broadest sense of the term. It reveals a “technological unconscious” 63 or affect. The project for Intermedia artists is to combine the instrumental use of technology with a use of its enframing to liberate technology’s essential capacity for poiesis; that is, to create an affective use of technology beyond the functional recording and representation by which technology is commonly understood. As demonstrated in the SINE Beta Tests, living systems can in-form technology, affecting our behaviour. This process is generative and remains open to further mutations. It answers Heidegger’s question in that technology can come to be seen “as an ongoing, generative process or agency that opens rather than enframes, a process that is closer to aesthetic production than to the instrumentality of modern technology.” 64



Waking Life, a digital movie directed by Richard Linklater, combined real-life footage with computer rotoscoping animation, which worked by over-drawing selected frames of the real material, then interpolating (transitioning) between the frames - and in so doing emptying the real of its significance. These multiple intermedia processes affect the cinema-goer’s persistence of vision, whereby the brain makes up for what it doesn’t see “to fit conscious requirements of continuity and linear causality”. The technological novelty of the animation is engaging and disorienting, affecting physical sensations both pleasurable and discomforting, movements reorganizing the body’s internal logic. In his inquiries into the logic of knowledge the philosopher Immanuel Kant derived his definition of affect by combining two of Aristotle’s interpretations of pathos [suffering]: as a certain ‘mode or movement of the body’, and as ‘where pathê are the feelings that are accompanied by pleasure and pain’. 65 Kant’s use of affect, where it is not “simply a movement as such, but a movement which overcomes an obstacle ‘like water that breaks through a dam’”, 66 connects affect with feelings of pleasure and pain, leading Kant to distinguish between two kinds of affect: either ‘sthenic’ - ‘exciting and frequently exhausting’ - or ‘asthenic’ - ‘sedative and relaxing’. 67 The software technology used for Linklater’s movie was the breakthrough affect on the many artists who worked on the film, enabling them collectively to represent the incommensurable in Waking Life. The animation’s visual vibration mirrors oscillations in the storyline, providing its dream scenario with an image of authenticity; yet the need to orient to these in-forming novelties physically relaxes or blocks the brain’s neurological system, momentarily ‘losing the plot’. In this way a real image is produced of the autonomic affect’s visceral embodied life at work in cinema.
On leaving the theatre the after-affect is stunning - in both senses: amazed and confused! These thoughts turn back to the reverberating circuit in the architecture of the brain where “there is a ‘relaxation’ time of back and forth signals until everybody is settled into a coherent activity.” 68 Each affect stirs another in turn, in-flashing with another, with the potential to be en-grammed into long-term memory, 69 depending on the affect’s duration and intensity, wairua, attitude, or spirit. 70 Wairua also means ‘two waters’. Its alternating currents in-form the comic character and complexity embodied in writing the equation IM = WhI. The know-bot is kicking up its heels on the desktop - a sign that writing’s time is up.




7. Sign Out

Interactive Multimedia = Whatever Intermedia attempts to determine the indeterminate value of an intermedia practice. Its concern is not with the surreal, or the limits of what we may know, or how we might imagine things to be; rather, the aim has been to demonstrate a real image of creative potential embodied in the body, grounded in the substance of everyday life, which forces us to think between contested sites and different disciplines - arguing that it is the contest between mediums that provides its juice, its agonistic muscle, and that such practice can inform an evaluation of, and philosophy behind, interactive multimedia. The parallel after-affect of the equation IM = WhI is that its parts can organise themselves into some thing novel. It reveals an open interactive entity, a world rich in potential. (left and below: processing novelty, SINE meeting 4.10.03) Processing novelty in WhI has interlaced alternate fields of text as an image of IM as it moves beyond its temporal stage into any-spaces-whatever.

“The orienting response is an involuntary shift of attention that appears to be a fundamental biological mechanism necessary for survival. Orienting is a rapid response to new (never experienced before), unexpected (out of context) or unpredictable stimuli, which essentially functions as a ‘what-is-it’ detector…The detection of the event precedes orienting and, if it is sufficiently deviant, engenders the involuntary capture of attention, enabling the event to enter consciousness, thus permitting an evaluation of the stimulus. This could lead, if the event is deemed significant, to behavioural action.” 71



Acknowledgments:
Anthony Allan, MFA Sculpture, Fine Arts, The University of Auckland, New Zealand.
Jamie Kydd, BFA Intermedia; designer, SINE arts web site.
Dr Maureen Lander, Senior Lecturer, Maori Studies Department, The University of Auckland; Coordinator of Doctoral Seminar Programme, Fine Arts, The University of Auckland.
Dr Bruce MacDonald, Senior Lecturer, Electrical & Electronic Engineering, The University of Auckland.
Dr Peter Shand, Senior Lecturer, Head of Graduate Studies, Fine Arts, The University of Auckland.
Alex Sumich, PhD candidate, Brain Image Analysis Unit, Biostatistics and Computing, Institute of Psychiatry, London, UK.

Notes

1 ‘Affect’ as understood from the Latin affectus, from afficere, to act upon. Collins English Dictionary: 21st Century Edition (Glasgow, HarperCollins Publishers, 2000), 24.
2 Schroeder, Eileen E. Interactive Multimedia Computer Systems. ERIC Digest ED340388, November 1991 (4 October 2003), http://www.ericfacility.net/ericdigests/ed340388.html; McCarthy, R. (1989, June). Multimedia: What the excitement’s all about. Electronic Learning, 8(3), 26-31. ERIC number EJ 395 537.
3 Tannenbaum, Robert S. Theoretical Foundations of Multimedia (W. H. Freeman and Company, A Computer Science Press Book, 1998), 8 October 1999 (11 October 2003), http://www.uky.edu/~rst/mmbook/preface.html



4 Ibid., Chapter 5, Slide 4 (5 October 2003), http://www.uky.edu/~rst/mmbook/slides.html
5 Ibid., Chapter 5, Slides 28-29 (5 October 2003).

6 Ibid., Chapter 5, Slide 32 (5 October 2003).
7 Mitchell, Bonnie. Defining Interactive Multimedia Design Education: Expanding the Boundaries. 1999, 1 (3 October 2003), http://www.siggraph.org/education/conferences/GVE99/papers/GVE99.B.Mitchell.pdf
8 “Video cannot be defined as a singular field or entity but, rather, has to be defined at the level of what Gene Youngblood, early in video art’s development, called intermedia.” Breder, Hans and Rapaport, Herman. The Luminous Object: Video Art and Theory (Rhode Island School of Design, Visible Language 29.2, 1995), 117-121.
9 “The origins of the process idiom can be traced to the Surrealists and their ‘automatism’, or abandonment of conscious control.” … “Once the Process artist has decided on a systematic method his behaviour becomes automatic, and the results are accepted, without regard to their visual appeal, as if they were natural products.” Walker, John A. Art Since Pop (London, Thames and Hudson, 1975), 34-35.
10 “Fluxus, the loose-knit international community of artists … a non-conformist grouping noted for their Happenings, actions, publishing, and mailing activities.” Walker, 49.
11 Polli, Andrea et al. Erasing Boundaries: Intermedia Art in the Digital Age. SIGGRAPH 2001 (June 2003), http://www.siggraph.org/artdesign/gallery/S01/panels.html
12 Deleuze, Gilles. Cinema 1: The Movement Image (London, The Athlone Press, 1997), 109.
13 Ibid., 87.
14 Bodanis, David. E=mc2: A Biography of the World’s Most Famous Equation (London, Pan Books, 2000).
15 Collins English Dictionary.
16 The adverb-conjunction uses of the word ‘ut’ provide the potential for retaining the syntax in its literal description while constructing at the same time a conceptual image. Cassell’s Latin Dictionary (Cassell & Company, 1949), 601.
17 Deleuze, Gilles. Cinema 2: The Time Image (London, The Athlone Press, 1989), 203.
18 Deleuze, C1, 102. “… as they are actualised in an individuated state of things and in the corresponding real connections (with a particular space-time, hic et nunc, particular characters, particular roles, particular objects) and as they are expressed for themselves, outside spatio-temporal co-ordinates, with their own ideal singularities and their virtual conjunction.”



19 Varela, Francisco J. Autopoiesis and a Biology of Intentionality (Paris, France, CREA, CNRS-Ecole Polytechnique, 1992), 7.
20 Ibid., 5.
21 Johnson, Mark. The Body in The Mind: The Bodily Basis of Meaning, Imagination, and Reason (Chicago and London, The University of Chicago Press, 1987), x.
22 Johnson, xv, xii.
23 http://www.sinearts.auckland.ac.nz, requires Flash software.
24 Interview with Jamie Kydd, April 2003.
25 Kydd.
26 Kydd.
27 “[P]seudopodia engulf solid particles external to the cell. Once the particle is surrounded, the membrane folds inwardly, forming a membrane sac around the particle. This newly formed sac…breaks off from the outer cell membrane, and the solid material inside the vacuole is digested.” Gerard J. Tortora and Nicholas P. Anagnostakos, Principles of Anatomy and Physiology, 4th Edition (New York, Harper & Row Publishers, Harper International Edition, 1984), 56-57.
28 Debatin, Bernhard. Precis of The Rationality of Metaphor: An Analysis Based on the Philosophy of Language and Communication Theory. 11 June 2001 (4 July 2003), http://www.uni-leipzig.de/~debatin/english/Books/Diss.htm, 2.
29 Deleuze, C2, xii. “[W]hat we call temporal structure, or direct time-image, clearly goes beyond the purely empirical succession of time – past-present-future. It is, for example, a coexistence of distinct durations, or of levels of duration; a single event can belong to several levels; the sheets of past coexist in a non-chronological order.”
30 Varela, Francisco. The Deep Now. Interviewed by Arjen Mulder in “Machine Times” (Rotterdam, NAI Publishers/V2_Organisatie, 2000), 012.
31 Tortora and Anagnostakos, 350-351.
32 Varela, TDN, 011.
33 Massumi, Brian. The Autonomy of Affect, in “Deleuze: A Critical Reader”, ed. Paul Patton (Oxford, Blackwell Publishers, 1996), 223; Libet, Benjamin. Neurophysiology of Consciousness: Selected Papers and New Essays (Boston/Basel/Berlin, Birkhäuser, 1993), Readiness-Potentials Preceding Unrestricted “Spontaneous” vs. Pre-Planned Voluntary Acts (1982), 229-242.
34 Massumi, 222.
35 Varela, TDN, 011.
36 Varela, Autopoiesis, 8.



37 Heidegger, Martin. The Question Concerning Technology and Other Essays, translated and with an introduction by William Lovitt (New York, Harper & Row, Publishers, 1977), 45-46.
38 Damasio, Antonio. The Feeling of What Happens: Body, Emotion and the Making of Consciousness (London, UK, William Heinemann, 1999), 330.
39 Collins English Dictionary, 35.
40 Tortora and Anagnostakos, 280-284.
41 Collins English Dictionary, 1552.
42 Tortora and Anagnostakos, 283.
43 Manovich, Lev. The Language of New Media (Cambridge, Massachusetts, The MIT Press, 2001), 161.
44 CAiiA, Centre for Advanced Inquiry in the Interactive Arts, established 1994, University of Wales College Newport, and STAR, Science Technology and Art Research centre, established 1997, University of Plymouth. http://www.caiia-star.net/mission/index.html

45 Varela, Autopoiesis, 11.
46 Ibid., 13.
47 Damasio, 331.
48 Interviewed about his interest in neurology, Deleuze talked of “running a risk”, that he speaks from the border “between knowing and non-knowing”. “To some extent, one is always at the extreme of one’s ignorance, which is exactly where one must settle in, at the extreme of one’s knowledge or one’s ignorance, which is the same thing, in order to have something to say.” Gilles Deleuze’s ABC Primer, with Claire Parnet, directed by Pierre-André Boutang (1996), 3 May 2000 (24 July 2000), http://www.langlab.wayne.edu/CStivale/D-G/ABC1.html
49 Friedman, David et al. The novelty P3: an event-related brain potential (ERP) sign of the brain’s evaluation of novelty, Neuroscience and Behavioural Reviews 25 (2001), 356. http://www.elsevier.com/locate/neubiorev
50 Adcock, Craig. Marcel Duchamp’s Gap Music, in “Wireless Imagination: Sound, Radio, and the Avant-Garde”, ed. Douglas Kahn and Gregory Whitehead (MIT Press, 1992), 116.
51 Tortora and Anagnostakos, 284.
52 Heidegger, Martin. The Origin of the Work of Art, in Poetry, Language, Thought, translation and introduction by Albert Hofstadter (New York, Harper & Row, Publishers, 1979), 59.
53 Heidegger, TQT, 12, n. 12, definition of “Wahrheit” [truth] as understood by Heidegger.
54 Ibid., 20.



55 Ibid.
56 Ibid., 21.
57 Ibid., 19, n. 17.
58 Tortora and Anagnostakos, 357.
59 Varela, Autopoiesis, 8.
60 Heidegger, TQT, 35.
61 Rutsky, R. L. High Techne: Art and Technology from the Machine Aesthetic to the Posthuman (University of Minnesota Press, 1999), 89-90.
62 Heidegger, TQT, 11-12.
63 Rutsky, 21, 27, 132.
64 Rutsky, 149.
65 Caygill, Howard. A Kant Dictionary (Oxford, Blackwell, 1999), 57.
66 Ibid.
67 Ibid., 58.
68 Varela, 10.
69 Tortora and Anagnostakos, 365. “An incoming signal facilitates the synapses in the circuit used for that signal over and over and you recall the thought. Such a neuronal circuit is called an engram.”
70 Ryan, P. M., The Reed Dictionary of Modern Maori (Auckland, Reed Books, 1995). Maori is an official language of New Zealand and is therefore not written in italics. The ‘two waters’ explanation was given in conversation with Maureen Lander, Doctor of Fine Arts, Maori Studies, University of Auckland, 16.10.03.
71 Friedman, 356.

Mixed-mode Communication Courses at a Multicultural Technikon: A Pilot Study Combining Web-based Learning and an Internet Search Project with Face-to-Face Classroom Instruction

Dee Pratt

1. Introduction

1.1 Overview

Vocational education is a key area in any country, but in South Africa, with an unemployment rate of over 40% in spite of increased economic growth, 1 effective delivery of vocational education, particularly in technical and commercial fields, is essential not only for the country’s development, but for its survival. Moreover, further education plays a pivotal role in the process of transformation and redress in post-apartheid South Africa, which, coupled with a rapidly developing economy, will require “lifelong learning” rather than a few years of post-school degree study. 2 Computers are popularly viewed as the solution to all educational problems, 3 but computer technology is only effective when it can be seen to enhance learning approaches catering for specific local needs. This paper describes an attempt to enhance delivery of semester Communication Skills courses at a multicultural technikon, the Durban Institute of Technology, by running them in mixed mode, i.e. partly in conventional face-to-face lectures and tutorials, and partly over the Internet. The Comm. Skills Online project generated an incredibly rich layering of experiences in mixed-mode course delivery for the author, who personally facilitated and assessed the work of over 230 students in three different mode blends and, in addition, supervised the computer laboratory work of over 300 students - many of whom had come to DIT from disadvantaged educational backgrounds and were first-time computer and Internet users. This account will first give an overview of the use of Information and Computer Technology in the South African educational context. It will then focus on the learning approach used in the project, describe how it was translated into mixed mode, discuss the resulting blends of mixed-mode delivery which were used in the project, and give an account of both the enhancements achieved and the problems encountered.
1.2 The academic setting for the project

The Durban Institute of Technology, the result of a recent merger between ML Sultan Technikon and Technikon Natal, has at present over 20,000 students and is spread over seven campuses. The name of the new institution is in keeping with the newly-formulated role of technikons in South Africa, that of “Universities of Technology”. 4 While the use of “Durban” in the institution’s name is ethnically neutral, the student demographics are a volatile and challenging mix: predominantly African (67%), with the rest mainly Indian (Asian, 21%) and students of British/Dutch descent (7%), but also with minority groups of Eritreans, Batswana, Mozambicans, Namibians, Zimbabweans, Chinese and Taiwanese, to name but a few. The large African component of the student population, even when from South Africa, should not be viewed as homogeneous: a multiplicity of indigenous African cultures (and languages) can be found at DIT. Staff demographics also constitute a multicultural blend, with former TN lecturers being predominantly white male and MLS lecturers predominantly Indian male, with remnants of Afrikaner upper management, and with top management positions being reserved for “high flyer” African academics. The diverse student demographics provide academic staff with a constant challenge in delivering academic programmes which are not only accessible to all students, but also professionally relevant, as DIT is primarily a vocational institution. It must be remembered, too, that while the majority of our students come from an educationally disadvantaged background, diploma/degree groups contain a hugely disparate range of academic abilities, which makes it even more difficult to design courses which will be relevant for all students. An institution in the throes of a merger might not seem an ideal setting for the English & Communication Department’s first major venture in mixed-mode delivery. 5 However, the breaking down of boundaries in times of institutional change can also be seen to offer opportunities for innovation and growth. Moreover, it was thought that mixed-mode delivery might provide solutions for some of the problems resulting from the merger, in particular larger student numbers, a reduced staff complement and dwindling teaching resources.
1.3 ICT in the South African educational context

It must be emphasised that ICT (Information and Computer Technology) in the South African educational context has a somewhat different scope and purpose from that implemented in developed countries over the last ten to fifteen years. During this period South Africa has been involved in a transformation and empowerment process politically, economically and educationally. ICT use in education, then, follows a transformation and empowerment model, as in many under-developed countries, with the added advantage that South Africa is in a better position economically than most of sub-Saharan Africa, and has also had the opportunity to learn from the example of earlier innovators. In countries such as the UK and the USA, by contrast, while the introduction of ICT did indeed represent a transformation in education, this was effected through an extension of already plentiful material resources by moving into the information-rich virtual environment of the Internet. It would be misleading, then, to assume that in South Africa the use of ICT in education is merely retracing the path taken by more developed countries, or that it is a journey similarly supported and buffered by a materially-affluent educational infrastructure. Over the period when ICTs were gradually being introduced overseas, education in South Africa was undergoing a radical transformation to redress the inequalities and injustices caused by apartheid. The blueprint for this transformation process is contained in the Green Paper of 1996, the White Paper of 1997 and the CHE Report, 6, 7, 8 with the role of ICT set forth clearly in the subsequent TELI (Technology-Enhanced Learning In South Africa) and NADEOSA documents. 9, 10, 11 Neil Butcher, “guru” of ICT in South Africa, has contributed extensively to the arduous research, policy-making and documentation process involved in the transformation of higher education through technology, much of which is summed up in his recent SAIDE retrospective. 12 Implementation has in fact been extremely well informed by comprehensive preliminary surveys and research, government policies and documents, and joint nationwide initiatives such as TELI and SAIDE (South African Institute for Distance Education). South Africa has not only done its homework in advance, but has done it thoroughly.

1.4 ICT infrastructure in South African educational institutions

In spite of this excellent and extensive preparation, implementation has been uneven. Apartheid education policies have resulted in huge disparities and inequalities in the present educational infrastructure. This has meant that the majority of our educational institutions do not possess the human or material resources typical of developed countries, which provide a firm platform for the “launch into cyberspace”.
Resources such as teachers, classrooms, notes, textbooks and libraries are a given in more developed countries, as are basic amenities such as electricity, running water and toilets: in South Africa, this is by no means the case. The majority of schools in KwaZulu-Natal, which is particularly under-subsidised by Government in spite of having a larger population than any of the other provinces, do not have all of the basic amenities, let alone sufficient teachers, classrooms, notes, textbooks and libraries. Tertiary institutions are far better resourced, in spite of the existence of "historically disadvantaged institutions",13 but the student-lecturer ratio is still generally too high for effective learning, particularly in the languages:14 it is not unusual for one language lecturer to hold a face-to-face "tutorial" session with a group of over a hundred students. Factors such as government cutbacks in funding, non-payment of student fees, and a low pass rate - resulting in huge subsidy loss - mean that even historically-advantaged institutions have insufficient academic staff, learning resources, facilities and, increasingly, basic amenities such as washroom fittings. Regrettably, the very processes - including the proposed mergers - set in place by Government to transform higher education have had the unintentional effect of leaching higher education institutions of their financial reserves and putting a severe strain on their existing resources. The outlay involved in setting up the infrastructure for ICT-mediated educational programmes, including the equipment, facilities, staff induction and support needed to run mixed-mode or distance education, is prohibitively expensive. Once set up, this infrastructure needs to be maintained, and financial sustainability is an overriding problem at all levels of education. However, even where finance, equipment and trained personnel already exist at HE institutions, poor governance of available resources and a "silo" mentality can impede student access: this serves to entrench inequalities, given that only the minority of educationally and economically advantaged students have private access to the Internet. In the recent SANTEC Online Conference, which attracted international participation as well as delegates from throughout South Africa, the following solutions were suggested to the problems of limited student access to computers and the Internet:15

• Longer opening hours at the (computer) learning centre
• Using blended learning approaches
• Scheduled classes where access is guaranteed, even if this contradicts the flexibility of eLearning
• Use of Internet cafes
• Provision of more Internet cafes, even through the government
• Strategic partnerships with government, private sector, donors and learning providers
• Strategic placement of computers
• Strategic preference given to subjects and learners working in pairs on a computer
• Blended learning and off-line learning, e.g. providing content on floppy disk or CD-ROM, to facilitate support for students
• Ubiquity - more widespread distribution of resources
• Free Internet access in public libraries
• Use of ultra-thin client products which can be used simultaneously by two users

1.5 The scope and purpose of ICT enhancements in HE in South Africa

While the general paucity of resources might seem to militate against introducing yet another expensive component into the higher education array, paradoxically it renders the shift to ICT-mediated education all the more essential. It is precisely because material resources are limited that a shift to the information-rich environment of the Internet is necessary. For the majority of students in South Africa, ICT enhancements represent an intervention which fast-tracks them into the Age of Information Technology, virtually skipping over the Industrial Revolution in the process, and with no fall-back position because of the rapidly-diminishing material resource base. Not only do most South Africans lack access to educational resources taken for granted overseas, but they lack access to education itself, owing to factors such as the geographical separation of communities, the loss of breadwinners through AIDS, and widespread poverty. The accelerated need for further adult education, whether because of work-force depredations caused by the AIDS pandemic or for sheer economic survival, means that some form of ICT-mediated distance education, while not necessarily an ideal solution, will be the only viable option for most adult South Africans, particularly those in rural areas.16 Personal empowerment, community upliftment and economic growth will depend heavily on some form of electronic mediation, whether through computers, the Internet, radio, TV or cell phone technology. The role of HE institutions will increasingly be to implement such programmes, whether delivery takes the form of distance education or blended learning.17 Although South Africa has arrived relatively late on the cyber-scene, some points in our favour are the rapid developments made in ICT over the last decade, and the fact that we are able to benefit from the example of implementation in other countries, particularly in less-developed countries whose contexts and needs more closely match our own.
ICT pioneers in South Africa should not, then, be viewed as retracing the path taken ten to fifteen years ago overseas, but as leap-frogging into the future, often on uneven terrain, but in the true spirit of transformation.18

2. Implementing mixed-mode delivery in the project
2.1 Some key issues in mixed-mode learning

Because "information" and "knowledge" are not synonymous, it is important to ensure that the new technology is directed towards enhancing learning and not mere information retrieval, particularly as the predominant learning mode in most South African institutions remains the transmission model, in spite of the introduction of outcomes-based education.19 This is a result of the "old guard" corporate technocrat mentality of the fifties,20 which is still entrenched in South African institutions mainly as a result of apartheid education policies. This mentality is particularly endemic in technically-focused institutions such as DIT which have strong ties with Industry. Indeed, a higher education merger process such as that recently experienced at DIT can be seen as a Government-legislated attempt to transform the "old guard" corporate mentality into an empowerment model.21

The challenge in mixed-mode course delivery, then, is to arrive at a blend of resources and activities which has the potential to enhance learning. My induction into web-based learning in the Pioneers Online 2002 programme had emphasised the importance of a sound pedagogical base for Internet learning:22,23 assumptions about learning need to be made clear right at the outset of designing a course, and any electronic enhancements need to be thought out carefully so that they have the potential to accelerate and facilitate learning rather than adding yet another layer of difficulty. Setting up materials or exercises on the Internet is not in itself necessarily conducive to learning, nor is there any guarantee that students will actually use an online course once it has been set up. There needs to be an intrinsic reason for Internet use: for example, that it facilitates learning, or adds dimension and depth to the learning interaction itself. Moreover, in the case of students who are not computer literate at the outset (and questionnaire sampling indicated that up to 40% of respondents were not), mastery of the skills needed to use the Internet, as well as regular Internet access, needs to be built into the course (less than 10% of South African households possess a computer). Finally, the induction programme emphasised the importance of good course design based on sound educational principles: we were encouraged to be clear about our course outcomes and about how we expected mixed-mode delivery to contribute to the fulfilment of these outcomes.

2.2 Course design

For the mixed-mode course design a course template was used which had already been piloted by the English & Communication Department, derived from an integrated approach to language learning which the author had developed while teaching high school English.
Over the last four years the Department had adapted this approach for teaching Business Communication skills by running scenario-based OBE courses in which students generated their own knowledge in professionally-orientated scenarios, using basic research strategies and their own multicultural resources. Ideally, a scenario - such as the designing of a low-cost housing complex to fit an actual site by Construction Management students - would require students to find out specialist first-hand information, some of which could best be supplied by the disadvantaged students in the group, making them the "experts". In a reversal of traditional academic norms, the best projects were often produced by economically and educationally disadvantaged students, who now comprise the majority of our student population. These students do not necessarily obtain high marks when this approach is used. However, they develop English language skills more readily in the context of an actual situation than in a textbook, which often provides no context at all or, when it does, provides no real incentive (i.e., an actual purpose and audience) for using language.24,25 Moreover, when language tasks are set in professional contexts, the interest and skills students bring to their chosen professions appear to spur them on to better linguistic performances.

In the context of mixed-mode course design, the most salient aspect of the "scenario" approach to language learning lies not in its use of project work, experiential learning, problem-solving in small groups, or even in its use of scenarios or simulations,26 but in the integrated way in which course elements are blended to imitate or model real-life professional functioning. This involves a complex interweaving of tasks in different communication modalities (i.e., speech, writing and nonverbal communication). The professionally-based scenario or project provides the matrix which integrates and coordinates the communicative tasks, such as basic research on a topic with written and oral report-back. Learning is thus effectively "blended", which is why this approach translates so readily into mixed-mode delivery.27 As illustrated in Fig. 1, the learning interaction takes place mainly in small student groups of five, in which intense interpersonal communication takes place: for example, interpreting briefing materials, planning projects, solving problems, making decisions, delegating tasks, and sharing information gathered from a data source (usually printed texts and interviews, when this approach is run offline).

Fig. 1. Profile of integrated project work


Because the main learning interaction is centred on a core of dynamic small group communication, effective group functioning is essential. The student groups are therefore facilitated by the lecturer, so that interpersonal and small group communication skills can be developed: these skills are also reflected in the official course outcomes. Although outwardly task-focused, group work is geared towards developing complex higher order competencies in communication, interpersonal relations and problem-solving (see Fig. 2). Where time constraints allow, opportunities for reflection are built into the course.28 Spady identifies the development of such higher order competencies as crucial to effective adult life-role functioning.29

After the first pilot study in 2000 with Office Management and Technology students, a course template was worked out to co-ordinate and integrate course activities. A typical course template contains a scenario in which teams work on an application for a grant or award, which can easily be tailored to fit each diploma. The group work, formal test and report-back (written and oral) are all woven into the grant scenario. This template was found to work well when tested in subsequent pilot studies. In courses based around professional scenarios we have generally found that student attendance is excellent, there is a high pass rate, and the approach works well with multicultural groups, as student diversity constitutes a resource rather than a problem. Educationally disadvantaged students come into their own, as they often have highly-developed interpersonal and social skills which are an asset in the small group interactions.


Fig. 2. The higher order competencies which are developed in the small group interactions in a scenario approach

The approach also has advantages for lecturers. Assessment is more interesting because of the open-endedness of student output, and work is less exhausting because it is student- and not teacher-driven.30 It is also less stressful to deal in this way with large (i.e., over 100) and therefore often unruly tutorial groups, as the students' often inexhaustible energy tends to be diffused within the small group interactions. Informal peer teaching and mentoring frequently occur in the small group interactions, which shares out much of the teacher's load in redressing inequalities in the students' educational backgrounds. The teacher's role becomes mainly that of facilitator and assessor, apart from occasional input of practical theory, such as how to give a talk or write a report. Materials production is also facilitated: once the course template has been set up and study guides and project sheets have been prepared, scenarios tailored to different diploma groups can be improvised and slotted into a course template in a matter of minutes.

2.3 Translating the course into mixed mode

The same course template which had been developed in "offline" pilot studies was used for mixed-mode delivery. As in our previous pilot studies, the course integrated aspects of professional communication and was outcomes-based. However, this time the course revolved around an Internet search for "Professionally Relevant Internet Sites" (PRINTS), which made it intrinsically web-focused. Unlike previous scenarios, which needed to be tailored to fit specific vocations or professions, the PRINTS project was relevant to all diploma groups, and could be used in subsequent courses without the danger of students copying project work (sites used by previous groups could easily be identified and set off-limits for the next round of students).

As well as having the course activities revolve around an Internet search, there were other electronic enhancements. Course materials and other resources were uploaded on to WebCT (Web Course Tools - an educational administrative program). Three WebCT courses were used: WebCT for Dummies, ditcom and Comm. Skills Online (with the relevant Diploma group indicated, for example, Comm. Skills Online for Survey). WebCT for Dummies is a basic student induction workshop for WebCT, which was designed to be run live but can be browsed online by students; ditcom is the DIT English & Communication Department's online resource base; while the Comm. Skills Online courses contained study guides, project sheets and other materials specific to the PRINTS project. Our intention was to use WebCT for Dummies to introduce students to WebCT, have their course materials set up on Comm. Skills Online, and let students make use of ditcom for general learning resources (for example, course notes, PowerPoint slide shows of lectures, and links to free language games on the Internet). Students were also given hard print copies of course notes and project materials.

2.4 Assumptions made about electronic enhancements

The possible enhancements offered by running the project in mixed-mode delivery are demonstrated graphically in Fig. 3, which shows the potential of mixed-mode delivery to transform the learning interaction. Not only can electronic enhancements provide greater efficiency, in providing more (and more varied) resources and the option of communicating electronically, but they offer the opportunity for better quality teaching and learning as well. Many of the issues we sought to address were a direct result of the merger: for example, high student numbers, duplicating delays and costs, a lack of audio-visual equipment, and difficulties communicating with students after lecture hours because the campus was now spread over a wider area. We were therefore considering electronic enhancements not only for the purpose of curriculum development, but also as a means of maintaining academic quality, which was threatened by severe budget cuts and a chronic lack of staff, resources and facilities. The background to HE education in South Africa sketched earlier will have made it clear that these problems are not specific to our institution but reflect national trends. In this context, our use of ICT enhancements should not be viewed as an attempt to replicate work done previously overseas, but to find solutions for urgent educational problems which our dwindling material resources were clearly not going to solve. Our assumptions about electronic enhancements were as follows:

• The email and discussion facilities on WebCT would facilitate after-lecture communication between lecturers and large numbers of students spread over more than one campus.
• Course materials set up on the Internet would cut down on duplicating time and costs, and would compensate for missing/faulty AV equipment.
• Student work could be posted on the Internet, resulting in better quality work for a "real" audience, as well as opportunities for peer assessment and feedback.
• Student work posted on the Internet detailing professionally relevant Internet sites would provide both models and resources for other students.
• Students who found it difficult to obtain individual lecturer attention because of higher student numbers might benefit from the lecture slide shows, self-tests and revision exercises set up on the Internet.


Fig. 3. Profile of integrated project work in mixed-mode delivery



• Course materials would have more appeal for students when set up on the Internet than in hard copy course notes.
• Second language students who were not confident about their use of English might feel more at ease communicating via email or discussions.
• The enjoyment of browsing the Internet would motivate students to do well at their studies, and would offset any initial nervousness at using unfamiliar technology.
• Students would read more as a result of the project, as most of them appear to prefer "surfing the Net" to reading a book.
• The Internet search process would model a useful set of study skills which would be transferable to other diploma subjects.

With reference to the last two points, we also believed that use of computers and the Internet would enhance students’ academic literacy (including print literacy), but we were not sure exactly how this might operate for our particular students in their specific context.


2.5 The different mode mixes used for course delivery

Three different mode mixes were used for course delivery, owing to constraints such as total diploma group numbers and the availability of staff, computer laboratories and other facilities.

(i) Chemical Engineering and Survey: predominantly online

The course was run in computer laboratories for these two small groups (49 and 17 students respectively), apart from the oral presentations. We were able to book the best general laboratory (i.e., one for use by all students) on our campus, which had just been fitted out with 40 Pentium 4 computers and which had fast Internet access. The laboratory was not equipped for teaching web-based learning, however, nor were the browser settings properly adjusted for WebCT use, in spite of the fact that the specifications had been given to the laboratory Manager. As a result, neither student group explored the full potential of WebCT, and we did not introduce web-based learning with WebCT for Dummies, but merely recommended this course to students for individual browsing. However, materials set up on ditcom were used in lectures on practical theory, which was useful in view of the backlog in duplicating (we are particularly dependent on duplicated materials, not only because of the general lack of resources, but because we can tailor these to our students' needs). No lecture rooms were available for small group discussions, which impacted negatively on group work. Team teaching by two lecturers was used for the Chemical Engineering students, as dividing them into two smaller groups would have meant that one group would have had to use inferior laboratory facilities.

(ii) Electrical Engineering (Light Current): partially online

As this diploma group was large (over 260 students) and general laboratory space was limited, it was not possible for these students to use the online courses set up on WebCT except as an optional enhancement.
The Internet search project (PRINTS) was considered feasible, however, as these students were technically skilled, and some of them were specialising in computers as part of their diploma course (of respondents to the feedback questionnaire, 60% of this group had used computers before, and 40% the Internet). As many of these students did not have private access to computers or the Internet, computer laboratories had to be booked for the Internet search part of the project. There was not enough laboratory space for 260 students, but each member of a team of five could visit the largest laboratory in rotation, which gave all students a chance to carry out the search. The largest laboratory, however, had the oldest computer equipment and the slowest Internet access: it took 20 minutes out of a 40-minute lecture period to turn all the computers on, and even then not all of them actually worked. Extra laboratory times had to be


scheduled so that all students could participate in the Internet search, which meant that the lecturer supervising the laboratory work (the author) had five extra periods a week. This laboratory was not suitable for teaching (or computer use, for that matter), but a number of first-time computer users were given tuition during the laboratory periods. Because of the intrinsic interest of the activity itself, students were able to use computers to carry out Internet searches on their own after a very short briefing, sometimes as little as 15 minutes.

(iii) Information Technology: optionally online

Ironically the IT group, which comprised the most promising candidates for a mixed-mode course featuring an Internet search, had the worst time of it. Their Department initially overlooked the fact that a Communication Skills course formed part of their diploma/degree, so there were not enough English & Communication lecturers left to take them when the request came in at the last minute (they were a group of over 350). I offered to solve this problem by including the IT group in our mixed-mode pilot study, because IT students would presumably be able to cope with the Internet and WebCT without intensive training, and could go online in the IT computer laboratories. It turned out, however, that these students did not have sufficient access to computer laboratories for their own diploma work. Printed course materials did not arrive for two months because of the duplicating backlog, and I eventually had to provide copies of the study guide and project sheet out of my own financial resources. There were insufficient venues, and the AV equipment was often faulty or inaccessible. The students also initially had a dispute with the IT Department, which spilled over into our lectures.

In an attempt to solve some of the problems, the students were divided into smaller groups of just over 100, and the Comm. Skills Online course was adapted slightly. Multiple choice was used for the standardised written test, and formal oral feedback on the project was omitted, although small-group oral communication still took place regularly, as group discussions and teamwork were an integral part of the project work. Because of the limited laboratory access on their campus (computer laboratories were being used as lecturing venues), we even made the Internet search itself optional. However, professional pride prompted most students to complete the Internet search, and some (about 25 teams, or 125 students) even managed to set up web-based presentations, although only hard copy written reports were obligatory. Setting up the websites on WebCT required four or five double periods in the Online Learning Centre teaching laboratory (which is more properly reserved for lecturer induction courses), as even the best-equipped general laboratory did not have the facilities for uploading webs on to WebCT. Some very professional web pages were displayed in the WebCT Presentation area, with a few having to be evaluated from stiffie


disks (3.5-inch diskettes), as they were in non-standard web format. However, the most important aspect of the group work was ultimately not the quality of the work displayed, but the harmonious working together of so many students in multicultural groups, where resources such as technical expertise, software and equipment were shared amicably (there were, of course, exceptions). This is particularly important in the case of the IT students, who initially tended to polarise into cultural groups and to show a marked disinclination to work in multicultural groups.

2.6 Course activities in mixed mode

Students worked on their projects in groups, recorded group discussions in handwritten minutes, carried out the Internet search, and wrote a formal test on business correspondence. They handed in a typed hard copy of a group report on their project, and (except for IT) gave illustrated oral presentations on their project. All students were officially registered on WebCT, so that they had the option of using the three online courses set up for them. Only the Chemical Engineering and Survey students could be shown how to use the WebCT discussion and email facilities, however, and even they did not have the time or facilities to use these as often as we would have preferred. Lecturers facilitated group discussions and gave lectures on practical theory (i.e., how to present a talk, how to write a short report) when needed. Laboratory sessions with diploma groups were supervised, although individual students could use one of the general laboratories for Internet access from 3 p.m. to 4 p.m. Initially we had hoped to have students display their reports as web pages on WebCT, but there was neither the time nor the facilities for them to learn how to do this, so only the IT students managed to display their reports online.
Unlike earlier integrated projects, the PRINTS project did not in itself favour educationally disadvantaged second language learners, and we were concerned that some of them might find the new technology daunting. Although teams are usually randomly selected to ensure a multicultural mix, we hit on the idea of allocating at least one experienced Internet user to each team, which worked well in terms of peer teaching: there was no danger that the one member would do all of the search work, as all students were eager to have their turn at the Internet. The project gave enough scope for academically weak students to cope, as only three sites needed to be found per team, and for above-average students not to get bored, as the requirement to work as a team meant that they could not sit back once their own work was completed.

3. Results

3.1 How feedback on the project was obtained

Feedback on the project was obtained from anecdotal evidence derived from student discussions and talks, an informal student feedback questionnaire, and a staff meeting at the end of the project. The feedback questionnaire was not exhaustive or research-validated, as it was intended only to confirm impressions we had already formed from first-hand experience, or to find out more in areas which were seen as significant. As I had designed and co-ordinated the project and also had extensive first-hand experience of all three delivery modes, I was in a good position to assess both the effectiveness and the potential of the project. I was also painfully aware of any mistakes we had made in planning or execution, as I was constantly involved in trying to make things work in my own teaching situation as participant, and for everyone else as Course Co-ordinator. This led initially to exhaustion, disillusionment, and a strong desire to have the earth open up and swallow me so that I would be spared the embarrassment of dealing with a potential debacle in which I had involved so many other innocent people.

It was only later that I realised that, apart from the teething problems caused by poor computer equipment for the Electrical group (and none at all for IT), the project was perceived by participants and the rest of the Department's staff as well-organised and effective: it was in fact one of the few instructional offerings that had got off to a good start and was running smoothly at the beginning of 2003, as the merger had caused considerable disruption to academic programmes. Staff and student feedback suggested that the students enjoyed the Comm. Skills Online course and found it highly relevant to their eventual professional functioning. Ultimately the project turned out to be a morale booster for the whole Department, as we were seen not only to be coping with change, but also to be trying out something which was both innovative and associated with advanced technology.
3.2 The extent to which electronic enhancements were achieved

Not all of our assumptions about electronic enhancements could be tested, let alone achieved. For example, only the Chemical Engineering and Survey students had the chance to use the WebCT email and discussion facilities to test the following assumptions:

• The email and discussion facilities on WebCT would facilitate after-lecture communication between lecturers and large numbers of students spread over more than one campus.
• Course materials set up on the Internet would cut down on duplicating time and costs, and would compensate for missing/faulty AV equipment.

Both of these groups appeared to enjoy using the WebCT email and discussion facilities, although there was not enough time left in the first semester to exploit their use fully. During lectures on the practical theory of professional communication, the students working in laboratories were given the option


of either following on projected overhead transparencies or using the copies of these set up as PowerPoint slide shows on ditcom. Students appeared to concentrate better when they focused on the lecturer and followed the projected transparencies with their monitors turned off. However, slide shows of lectures on WebCT have potential as a revision option. In one instance when the Chemical Engineering group needed to refer to course materials for revision during a lecture, I noticed that at least half of them turned to the online resources rather than to their hard copy texts. This was an interesting development, suggesting that some students can readily make the transition from printed text to screen text in accessing learning resources. Some of the Chemical Engineering and IT students took the opportunity to download online notes from ditcom during the duplicating backlog. One Chemical Engineering student commented enthusiastically that ditcom "has something for everyone".

Only the IT students were able to test the next two assumptions:

Student work could be posted on the Internet, resulting in better quality work for a “real” audience, as well as opportunities for peer assessment and feedback.



Student work posted on the Internet detailing professionally relevant internet sites would provide both models and resources for other students.

This was because IT was the only student group to upload reports on to WebCT. The excellent work and team spirit generated by displaying work online, where students could gauge their own progress (and better it) by referring to other students' work, convinced me that we needed to find ways to show less technically-skilled students how to do this, for example, by saving reports typed in MsWord as web pages without necessarily using hyperlinks, and uploading these straight on to WebCT. Because semester time is limited, students would then need to complete more of their writing tasks electronically, so that they could become familiar with using MsWord. This will be built into the course for the next round of pilot studies. One outcome of the PRINTS project which was not achieved, the setting up of an online index of professional websites to serve as a resource for other students, could have been attained more easily if students had recorded new sites regularly in the WebCT discussion messages instead of reporting only at the end of the project. To get around the problems caused by duplicating backlogs, the lack of data projectors, and faulty overhead projectors in even the good laboratories, notes showing students how to use the WebCT facilities have been run off well in advance for the next pilot study.

Dee Pratt


The following was another assumption which was not tested out satisfactorily:

• Students who found it difficult to obtain individual lecturer attention because of higher student numbers might benefit from the lecture slide shows, self tests and revision exercises set up on the Internet.

Students with Internet access (at DIT or at home) could do this, but we were not able to give WebCT training to the majority of the students. As we would like to offer this facility to all students who take Communication courses at DIT (over 6,000), I have prepared a User's Guide to ditcom giving information about the Department's resource base, and showing students how to access the resources on ditcom and how to use the WebCT communication facilities. The Online Learning Centre has offered to do blanket registration of all of our students, as we are using WebCT4 and have an "unlimited user" licence. Regular tuition sessions in WebCT during Forum Time will also be offered to interested groups of students who would like to make use of ditcom.

• Course materials would have more appeal for students when set up on the Internet than in hard copy course notes.

Fig. 4 Two Chemical Engineering students engrossed in the Internet

Anything set up on the Internet seems to have more appeal for our students (see Fig. 4). Students were fascinated by the digital snapshots which we took in class and uploaded as background to their course materials, exclaiming, "Look, we are now on the Internet!"

• Second language students who were not confident about their use of English might feel more at ease communicating via email or discussions.

This assumption was borne out by my experiences with the two groups in computer laboratories, in spite of the effort required by ESL writers to compose written messages in English. Some of my ESL students showed extreme disappointment when laboratory time was over without their having completed and posted a discussion message of their own. They were highly motivated not only to compose their own discussion messages, but to have them read by the whole class. ESL students are not usually that eager to have their writing on general display. The fact that the tone of electronic messages was generally casual, and that everyone, including the lecturers, made the occasional typo, seemed to encourage ESL learners to write more. In the offline classroom it is unlikely that Abenicio would have introduced himself so readily or confided in me that he was Mozambican and experienced difficulty in communicating in English (his command of English is actually very good, considering it is probably his third language):

First allow me to introduce,I'm ABENICIO from MOZAMBIQUE, in my country we don't use inglish language so,sometimes is difficulte to communicate.I'm hear to improve and i'll need your help.

Some of our Chinese Foundation English students wanted to improve their English by corresponding with mainstream students via WebCT email. One student had posted the following message on WebCT:

I'm fred,I come from china,glad to be friend with you.

It was the second language students, not the mother-tongue English speakers, who immediately wrote in response, offering encouragement and advice:

Hi! My name is James. I'm also glad to be your friend,now firsly, tell me about CHINA. and i'll also tell you a little bit about South Africa.

hi! I'm Nonto hope u'll be able to pronounce that. I'm a first year chem. Eng. student.i am glad to know that there is someone like you out there and hope to learn more about you and also for you to know about me.

Notice how James tactfully models the correct form of "Glad to be friend with you" for his Chinese visitor. Nomkhosi below is not so strong on tact, although her vaunted use of English is very good, apart from (inevitably) a typo omission:

Hi. How are you doing? It's a pity that your English is not as good as mine. You know what, English is my second language but of course I can speak it very fluently. I know that technologically, you are far better than myself. The secret is here "practice makes perfect". Keep on practising, read a lot of books, and of course write as many English as possible.

Her message was not meant to be patronising, however - it just came out that way. What is interesting is her clever, grammatically correct integration into her text of a fragment of a message I had posted to this group earlier, commenting that the Chinese students were more mature, and very technologically advanced, but that their English was not very fluent. This enterprising strategy of Nomkhosi's suggests that email offers ESL students a chance to model their language use on that of first language speakers in a way that could not be as easily achieved in rapid conversation or the slow exchange of printed texts.

• The enjoyment of browsing the Internet would motivate students to do well at their studies, and would offset any initial nervousness at using unfamiliar technology.

As we had anticipated, use of the Internet, even with antiquated equipment, proved to be irresistible to students. Far from being daunted by the new technology, our educationally disadvantaged second language learners could not be prised away from the Internet: I frequently had to turn off the computers to get students to leave the laboratory when other students were waiting to come in. Students were on the whole excited rather than apprehensive at the prospect of using the Internet, as seen by the following responses:

hi,Mrs.Dee it your communication student Thuthukani. It exciting to surf in webct and it's very easy . As it's my very first time to use the computer I thought it will be difficult but it's not. But I just need one faver can you please give us a chance to enter any website.

HALLO DEE THANK YOU FOR SENDING ME MY VERY FIRST E-MAIL!! LOVE : LONDIWE

hi, mrs.DEE, Malu here. it is realy exciting to use webct, it is formative and at the same time it a lot of fun but personally enjoy the chat room, I never used it before and I like it very much. There are lot of places Ihaven't been into neither the less, but I hope to visit them soon.

Students were impressed with the scope and speed of job-seeking on the Internet (one of the project themes was "employment"). Second language students who had difficulty reading text books on technical subjects were delighted to find simplified versions of "The History of Electronics" (or Surveying, or Chemistry) on the Internet. The Electrical Engineering students in particular were pleased that "English" involved an Internet search instead of the formal teaching of Literature or Grammar. In their oral report-back many of these students commented on the usefulness of the Internet search for their technical diploma subjects: they found "shopping lists" of electronic components they would need for their fourth-semester design projects, and examples of the kinds of technical design reports they would be required to write. Students were allowed to browse generally, provided that their project work was done first, as our students tend to have a limited world view and little general knowledge because they read so little, mainly owing to the lack of libraries in schools. Responses to the feedback questionnaire suggest that they made good use of the opportunity to browse general sites.

• Students would read more as a result of the project, as most of them appear to prefer "surfing the Net" to reading a book.

My impression (borne out by the questionnaire responses) is that students read far more than usual as a result of using the Internet, and in particular, were obliged to develop the browsing, previewing and skimming skills which are so important in processing study materials, simply because of the huge amount of data the Internet search threw up. Students were also obliged to make more decisions and judgments about the materials they found than they would have in an offline project, which would have yielded far more limited data, or in conventional lecturing, where the textbook or lecturer's notes would have absolved them of the necessity for thinking for themselves. Decisions about which sites to use in their project had to be discussed with other team members, which meant that knowledge was negotiated rather than remaining inert. Students were also required to carry out a simple evaluation of the websites they chose, which obliged them to focus on the way in which information was communicated, potentially sensitising them to the effectiveness of their own communication. Student responses to the feedback questionnaire suggested that 78% of respondents read more than usual, and 33% much more than usual. Even if what they read had been educationally worthless, the development of mechanical reading skills and exposure to English at a level students could understand would have been beyond price. What is more important is that discovery of something they enjoy reading can set in train the habit of reading for students. For the majority of our students, reading in English is not a recreational activity but a joyless imperative from parents and teachers. Student responses indicated that at least 30% of reading was professionally related (higher in the case of Electrical Engineering students), which suggests that the reading of electronic texts on the Internet has the potential to enhance not only their diploma studies but also their eventual professional functioning.

• The Internet search process would model a useful set of study skills which would be transferable to other diploma subjects.

It is the integrated project approach itself which models study skills, and not the Internet search per se: within this approach students learn to collaborate in groups to carry out a data search on a professionally related theme, to find and select relevant information, to come to conclusions and to report back orally and in writing, i.e., they learn how to conduct basic research. The Internet enhances this basic research capacity not only in terms of the efficiency and speed of the search process itself, but also in stimulating students mentally with the scope and variety of resources on the Internet, and in requiring higher-order decision-making, data-sorting and organising skills.

3.3 The development of academic literacy

That students were able to carry out their projects, largely unassisted after the initial briefing, and come to conclusions about the data they had recovered on the Internet, in itself meant that they had developed the basics of a high-powered form of academic literacy. As the PRINTS project was geared to professional functioning in their diploma subjects, it was likely that transfer to learning/research in these subjects would be made. Some specifics about the surface manifestations of academic literacy in written and oral expository texts were noted by project staff. Students in the pilot study made better use of graphic materials in both their reports and oral expositions, communicating information clearly by means of good quality graphs, tables and illustrations. This was attributed to exposure to multimedia texts on the Internet. The best web-page reports from the IT student group were not just technically advanced, but much better in terms of structure, logic, and style than is usual with their conventional academic work; they were also laid out and hyperlinked so that they were user-friendly and communicated well, i.e., the visual and spatial aspects of communication were being combined effectively with verbal (print) aspects. As we had found in previous professionally-orientated projects, the students' professional expertise appeared to have enhanced their language work. Many of the reports echoed the more colloquial tone of the Internet, which we did not see as a problem as long as the reports were interesting to read and well communicated, which most of them were. The oral presentations tended to be much better than usual, with more animated "talking about" and less "off-by-heart" learning of speeches. Graphic materials used with talks were better than usual on the whole, and better integrated into the talks (it is very difficult to describe an Internet site without graphic illustration).

Academic literacy is not about reading, writing or oral exposition as discrete skills, however, but about the ways in which these skills are harnessed to the ends of learning. In South African institutions there is a trend towards facilitating the development of academic literacy in mainstream academic programmes rather than by means of remedial-type interventions. In our current situation 60% of our students would require some form of intervention, which is simply not feasible. Linking the development of communication skills with professionally-orientated projects can be a powerful tool for the development of academic literacy, because the approach combines meaning-making at an interpersonal level with the constructing of knowledge on a wider social scale,31 i.e., as interpreted by a group of professionals or technical experts. What society in general accepts as knowledge then becomes infused with the incontrovertibility and passion of the student's first-hand experience, and it is this passion which ultimately transforms learning, and not electronic enhancements per se.

4. Conclusion

In spite of the many problems we experienced, staff and student feedback on the Comm. Skills Online project suggested that course work was enhanced by the use of WebCT and the Internet. However, we concluded that, if use of technology by students becomes an end in itself, as tended to happen with the two groups working entirely in computer laboratories, the learning interaction itself can become compromised: it is not so much an issue of the proportion of time spent in laboratories or classrooms, but an issue of focus. As the learning approach used in this pilot study centres around group work, sufficient emphasis needs to be put on group work, whether face-to-face or online. Running Communication courses in mixed-mode delivery did not realise all of the benefits of the electronic enhancements we had identified, but it achieved them to a degree which made it worthwhile to run further pilot studies, and to keep feeding back insights from the pilot studies into mainstream teaching. Future pilot studies will be run with small groups of students, however, as the DIT infrastructure does not at present support large-scale mixed-mode projects because of the lack of general computer laboratories and Internet access for students. In the next round of pilot studies the Online Learning Centre laboratories will be used to teach students how to use WebCT, so that the facilities and resources set up on it can be exploited fully. Regular electronic communication will be a course requirement: progress reports on projects and records of team meetings will be recorded electronically in the WebCT email or discussions. A better-equipped teaching venue will make it possible to teach students how to collate these electronic texts to compose their reports on a word processor, and how to upload their reports on to the WebCT presentation area. The technical skills students develop in the process will not only facilitate their studies but will improve their employment prospects. Rooms will be booked for small group interactions so that students can have group discussion time away from computers. Our overall strategy for mainstream teaching will be to encourage all Communication students to use the departmental resource base, ditcom, whether they do so from home or from the general DIT computer laboratories. Training sessions on how to use ditcom will be held in the Online Learning Centre for interested groups of students during College Lecture times. Our part-time Communication students in particular could benefit from being registered on ditcom, as they have difficulty communicating with academic staff after hours, and most of our mature students in full-time employment have access to computers and the Internet at the office.

Before concluding, I would like to acknowledge the hard work and support of the project staff: Linda Herbert, the Department's IT specialist, who collaborated with me in devising the PRINTS project, Caryn Barnes, Rob Gutteridge and Naadira Jadwat. A special note of thanks goes to the Computer Laboratory staff, Clement Zikalala (Manager) and Lucky Dlamini (Technician), for doing their best to accommodate our students' needs under trying circumstances.
I would also like to acknowledge the help and support of Mari Peté, Director of the Online Learning Centre, and Charl Fregona, Technical Manager and Courseware Designer: there is so much of you both blended into my mixed-mode courses that I cannot begin to extricate it, let alone acknowledge it. Finally, this project would not have been possible had my HOD, Carol de Kock, not actively supported mixed-mode learning and worked it into the timetable, in spite of the strain on staffing caused by the merger.

In conclusion, there are clearly two "layers" to this project. At face level, our intention was to achieve optimum learning using modern technology: at a deeper level, we wanted to empower our students by opening to them a virtual new world. We do not labour the empowerment aspect of our work with our students, who want to be regarded as modern and capable citizens of the world, and, understandably, resent being typecast as educationally-disadvantaged second language students. For them, the opportunity to "surf the net" means being "in sync" with their worldly-wise and sophisticated hip-hop cousins in the USA and UK - it is "super-cool". This is not to say that they do not realise the significance of their "giant leap" into cyberspace. I cannot put it better than my colleague, Linda Herbert: "You can see their eyes literally light up as they go on to the Internet."

References

1. Daily News. "Census 2001 results out". 9 July 2003: 5.
2. Government of South Africa. "National policies concerning lifelong learning". (18 October 2003).
3. Bennet, F. Computers as tutors: solving the crisis in Education. Sarasota, FL: Faben Inc., 1999.
4. Odendaal, R. "The future of higher education (HE) in South Africa". September 2002. http://www.vanschaiknet.com/HigherEducation.htm (8 July 2003).
5. De Kock, C.M. and N. Gawe. "Who rattled my cage? A study of change management at the Durban Institute of Technology." Paper presented at the NTESU Conference. University of Natal, Durban, South Africa, 9-11 July 2003.
6. Department of Education, Pretoria. Green Paper on Higher Education Transformation, 1996. (19 October 2003).
7. Ministry of Education. Draft white paper on higher education: a programme for higher education transformation. General Notice: Notice 712 of 1997. (8 April 2002).
8. CHE. Towards a New Higher Education Landscape. CHE Report, 30 June 2000 [63]. (12 July 2002).
9. Directorate of Distance Education, Media & Technological Services: DoE. Technology-Enhanced Learning in South Africa: A Strategic Plan. (12 June 2002).
10. NADEOSA. Shape and Size of Higher Education: Submission from NADEOSA - The National Association of Distance Education Organisations of South Africa. (12 July 2002).
11. Butcher, N. "Information and communication technologies in South African higher education". Paper presented at the 2nd NADEOSA Conference. Pretoria, 22 August 2000.
12. Butcher, N. "10 years in educational technology." Open Learning through Distance Education: 10th Anniversary Edition, July 2002.
13. Swank, K., S. Lubbe and L. Heaney. "Introducing NIT to an historically disadvantaged institution in South Africa." http://web.simmons.edu/~chen/nit/NIT'96/96-283-Swank.html (15 October 2003).
14. National Languages Working Committee for Technikons. Technikon RSA, Johannesburg, 8-9 May 2003.
15. Giannini, D. "eLearning in developing environments: creative ways of breaking the access to computers bottleneck." Summary of discussion at First SANTEC Online Conference, 8-10 October 2003. (14 October 2003).
16. Van Brakel, P.A. and J. Chisenga. "Impact of ICT-based distance learning: the African story." (10 October 2003).
17. Fregona, C., M. Harris and J. Kruger. "The barefoot teacher on the telematic highway - serving rural communities in KwaZulu Natal." Paper presented at the 2nd NADEOSA Conference. Pretoria, 22 August 2000. http://www.saide.org.za/nadeosa/conference2000/fregona.htm (12 October 2003).
18. Nkwae, B. "Information and Communications Technologies: can Africa leapfrog the digital divide?" (10 October 2003).
19. Curriculum 2005 Review Committee. A South African curriculum for the twenty-first century. Report of the Review Committee on Curriculum 2005. Pretoria, May 2000. http://www.polity.org.za/html/govdocs/reports/education/curric2005/curric2005.htm (19 October 2003).
20. Fregona, C. "Lecturers are from Venus, Computer Services are from Mars - institutional experiences in bridging the educational technology divide." Paper presented at the Joint Conference of SAARDHE and SAAIR. Bellville, Cape Peninsula, 8-10 July 2002. (10 October 2003).
21. Asmal, K. Press Statement by the Minister of Education, Professor Kader Asmal, MP, on the Transformation and Reconstruction of the Higher Education System. 30 May 2002. http://education.pwv.gov.za/Media/Statements_2002/may02/he.htm (14 October 2003).
22. Peté, M., C. Fregona, T. Allinson and J. Cronje. "Developing a community of online learning practitioners at the Durban Institute of Technology." Paper presented at the Conference on Communication and Information Technology in Tertiary Education (CITTE). University of Natal, Durban, 25-27 September 2002. (11 July 2003).
23. Alley, L.R. and K.E. Jansak. "Ten keys to quality assurance and assessment in online learning." 19 November 2002. (10 December 2002).
24. Littlewood, W. Communicative language teaching. Cambridge: Cambridge University Press, 1981.
25. Widdowson, H.G. Teaching language as communication. Oxford: Oxford University Press, 1978.
26. Heese, M. and D. Adey. "Tutor orientation through simulation: an action research project." SAJHE/SATHO 6(2) (1992): 25-33.
27. Pratt, D.D. "The matrix reloaded: the role of the web-based learning project in integrating and coordinating mixed-mode learning interactions." Paper presented at the 5th Annual Conference on World Wide Web Applications. UDW, Durban, South Africa, 10-12 September 2003.
28. Schön, D. The reflective practitioner: how professionals think in action. New York: Basic Books, 1983.
29. Spady, W. Outcome-based education: critical issues and answers. American Association of School Administrators, 1994.
30. Sionis, C. "Let them do our job! Towards autonomy via peer-teaching and task-based exercises." English Teaching Forum (1990): 5-8.
31. Vanderstraeten, R. and G. Biesta. "Constructivism, Educational research and John Dewey." (11 June 2003).

Construction, Consumption and Creation – the Convergence of Medium and Tool

Anders Kluge
Information Systems Group, Department of Informatics, University of Oslo

Abstract

To use the computer as both medium and tool has become a regular way to utilise it. This confronts users with a complicated situation, particularly in a case where they must handle consumption, construction and creation of multimedia material. The subject of this study is the use of special-purpose software and specially designed and adapted content in project-based learning in a secondary school. During the field trial the pupils are able to handle the complicated situation by applying a tool approach to use. The interaction design works to propel the constructive and creative activity by integrating functions to view, relate, and make material in one application. The focus on visualisation of elements in the software also led the pupils to spend a considerable amount of time on stylistic issues.

Introduction

Computers are increasingly used as media machines. Alan Kay's vision of the computer as the first meta-medium (Kay 1984) can be said to be a reality in the way we can incorporate all audio-visual media-types on the computer and are able to enjoy arbitrary combinations of sound, film, pictures, text and graphics, e.g. to combine a movie with textual information, or illustrate a verbal sound with explaining animations, each media type as a separate stream under user control. The computer also has 'meta-features' in the way several sources of information can be integrated in one media utterance. Information on a CD-ROM can be combined with web material to form one expression. In addition to being a medium conveying material and presenting it under user control from different sources, the computer remains a production and construction tool. We still use it to write, calculate, design, and for numerous other productive and creative tasks.
However, permanent Internet access both in the workplace and at home is becoming a common technical facility in the wealthy segment of the post-industrial world, contributing to a use pattern where we alternate between production and consumption, potentially blurring the boundaries between using the computer as a tool and as a medium.


This development presents challenges for interaction design as well as for individual and collective use situations. Swapping between production and consumption makes for a complex use situation; it may encourage a more fragmented working style and a loss of deep concentration. The design challenge is difficult to assess. Should we accept the use situation and support, stimulate and enable the user to alternate between consumption, construction and creation? Should we do it within an application, in such a way as to facilitate tool and medium qualities in the same application, or should each application have a more well-defined role and lean on the operating system, the graphical user interface and the computer hardware itself to integrate tool and media qualities? Or should we consider frequent shifts between a consuming mode on the one hand and a constructive/creative mode on the other as processes so different that they mutually ruin each other, to the effect that the integration of the two should be considered harmful and be prevented by design? These questions may be important considerations for research in the design and use of multimedia content and tools. In this paper, the study is limited to the use of one software prototype for learning purposes, and the use situation of multimedia consumption, creation and construction in this context. Use of one particular application is studied where media and tool qualities are tightly integrated. This special-purpose application and multimedia content are used in a set of field trials conducted in secondary schools. The issue here is how the application and the facilities in it are utilised by the user, and how the use context and design interplay when pupils are to use the computer as a combined medium to consume information and a tool to construct and create.
This study is limited to one field trial, in which the pupils have the assignment to make a rich media presentation using a special-purpose software named Slime, and to present it to the rest of the class when the project is completed.

Learning as use

To use a medium and to use a tool can be considered opposites, the former an activity of consuming information, i.e. trying to understand and derive meaning from content, the latter a creative activity of construction and production. However, in a learning situation, combining the two is a familiar way for most of us to operate; we take notes during a talk, perhaps trying to put the message from the speaker into our own understanding of the issue, and we may make notes in the margin of a book we are reading and highlight particular points in a text when engaged in learning activities. Software has also been implemented specifically to support these operations (LeeTiernan and Grudin 2001).


Another indication of the advantage of combining creative and consuming activities on a computer when utilised for learning can be drawn from the constructivist approach to learning. This perspective emphasises the user's own activity during learning processes, and the creative and constructive activities of the learner are seen as essential to acquiring knowledge. "[Constructivist learning strategy] gives the learner an opportunity for concrete, contextually meaningful experiences through which they can search for patterns, raise their own questions, and construct their own models, concepts, and strategies" (Fosnot 1996). A combination of tool and medium qualities may offer the 'user-learner' both information by use of the medium qualities, and tool features to support constructive activities.

Use as learning

Learning is a complicated process that this paper does not attempt to explain, but as an activity it carries some general properties that can make results from users engaged in learning more broadly applicable as studies of computer use. Even though we cannot equate use with learning, the processes we engage in while using a computer and while engaging in learning activities have similar characteristics. When we use the computer as a medium we want to get some information; we want to discover, understand, experience, gather and search for information. These activities overlap considerably with learning, or may enter into activities that are part of learning. If we include the use of the computer as a tool - the process of creating a document, a presentation, etc. - together with the media aspects as the habitual use pattern, we are again close to a constructivist view of learning, this time from the perspective of contemporary use patterns, as combining the assimilation of new information with creative and constructive activities. In the emerging information society, learning is a key concept.
As the operative competence we possess as individuals, as organisations or as nations becomes our main asset, continuous learning processes, or lifelong learning, turn out to be a necessity. Several governmental and EU strategy documents also emphasise the need to establish possibilities for lifelong learning (see e.g. Union 2001; Sprogøe 2003). Information and communication technology plays an important role in these policies, envisioned as an efficient and effective instrument for implementing them. The developments sketched above indicate that there are common characteristics between learning and what has come to be a recurring use pattern of digital technology, making it relevant to study computer-based learning in order to illuminate the general question of how the user may handle the blending of tool and medium aspects in computer use. In addition, the development towards an information society based on lifelong learning seems to question the boundary between learning and use, making it relevant to study learning situations in order to say something about use, as well as to study use in order to explore learning.

The LAVA Learning project 1

The general aim of the LAVA Learning project was to explore the use of computer-based rich media environments in schools. The project had three tracks of research and development: (1) pedagogical models, (2) new media content creation and (3) software development. It remained a major concern in the project to treat these issues as mutually dependent. The multi-disciplinary constellation, along with the long-term time scope, made it possible to discuss observations and potential improvements in technology, pedagogy and content in an ongoing process during the course of the project.

Software as a viewer and a production space

As a result of the experiences from the first series of trials, where the pupils had problems understanding and operating a separate viewer and a separate construction tool, it was decided to integrate the two in one application. The special-purpose application named Slime is a tool for integrating arbitrary data types in one multimedia presentation. Audio, video, text, pictures and web pages can be imported into Slime and placed on the 'canvas' of the tool (see figure 1). The elements are organised along a global timeline for the presentation, and are placed in user-defined 'scenes'. The scenes are organised under a number of user-defined headlines. The basic data elements (text, pictures, audio, video, html pages) are placed in a scene. Every basic element has a user-defined time assigned to it, which defines the point in time where the element will appear and how long it will remain on the canvas. The spatial placement is done freely by dragging an element onto the canvas at the particular place the user wants it to appear. The tool has two modes, 'make' and 'search'. The search mode is used to import elements into a presentation; the make mode is used to set the properties of the elements and to view the presentation being made. In make mode a universal timeline for the presentation is available. The current scene is shown as a red area in the timeline (see figure 1), illustrating the relative time-space it occupies in the presentation. The play indicator shown in the timeline can be dragged in order to manoeuvre in the presentation, and works as 'fast-forward' or 'rewind' functionality, where the user can view the presentation at optional speed, depending on how fast the indicator is moved. In addition, the presentation may be played regularly by use of the 'play', 'pause' and 'stop' buttons in this mode. The lower left corner of the tool has two 'cards' or tabs (see figure 2). One is the structure of the presentation, showing the scenes and headlines of the presentation as expandable items, named the scene graph. The headlines and scenes can be moved freely in the structure in a drag-and-drop procedure, thereby altering the sequence in which they appear in the presentation. The other tab is the clipboard, where imported items are placed, and from which they may later be moved onto the canvas by a drag-and-drop procedure. In the tool, the construction of a presentation and the showing of it are combined. Or, more precisely, there is no separation between presentation and construction.

1 LAVA Learning is a large applied research project in a Norwegian context. Over more than three years, 19 participating organisations, on a budget of NOK 24 million, conducted 12 field trials at four different schools. See http://www.nr.no/lava for more information.

Figure 1: The canvas of Slime, with timeline
As a consequence, the presentation may be altered by anyone looking at it.
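The presentation model just described, elements with a start time, a duration and a free spatial position, grouped into scenes along a global timeline, can be sketched in a few lines of code. This is a hypothetical illustration of the data model only, not the actual Slime implementation; all class and attribute names are the author's of this sketch. Because construction and presentation are not separated, "playing" is simply a query over the same structure that editing manipulates.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Element:
    """A basic data element (text, picture, audio, video, html page)."""
    kind: str        # e.g. "video", "text"
    source: str      # file or URL the element was imported from
    start: float     # seconds into the presentation when it appears
    duration: float  # how long it remains on the canvas
    x: int = 0       # free spatial placement on the canvas
    y: int = 0

@dataclass
class Scene:
    name: str
    elements: List[Element] = field(default_factory=list)

    def length(self) -> float:
        """Relative time-space the scene occupies on the global timeline."""
        if not self.elements:
            return 0.0
        return max(e.start + e.duration for e in self.elements)

def visible_at(scene: Scene, t: float) -> List[Element]:
    """Playback as a query: which elements are on the canvas at time t?"""
    return [e for e in scene.elements
            if e.start <= t < e.start + e.duration]
```

Dragging the play indicator then corresponds to repeatedly evaluating `visible_at` at the dragged position, which is why scrubbing and editing can share one structure.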

Figure 2: The canvas with the graph of scenes


Construction, Consumption & Creation

Media Content

The content from the providers in the project was selected to support the general theme "culture and consumption" for the pupils' field trial. The content on the web was organised in two levels. The first level was the opening web page with a collage of pictures. Short audio or video clips linked to the pictures were triggered by clicks in the collage, and links were provided to the complete material of different data types. The audio and video material was also accessible through the Slime application itself. All the provided video, audio and pictures could be imported into Slime and used as 'raw material'. In addition to the specially designed content, the pupils had access to the Internet and the school library. The schools also had digital cameras for still pictures and video available.

The field trial

For the particular field trial reported in this study, the pupils had been preparing on the theme for the last four weeks before the trial began. This gave them a general background in the issue of culture and consumption, but no specific focus was selected. An important part of project-based learning in Norwegian schools is that the pupils are supposed to plan their own project, with problem definitions, milestones along the way, distribution of work within the group, etc. As a consequence, little structure was imposed by the teachers, but there was some:

(1) The pupils were strongly encouraged to spend at least one hour on the content specially prepared for the trial. This was partly to see how the content was utilised and to get an evaluation of it, but also to provide a starting point for the pupils in a technical environment that was unfamiliar to them.

(2) The pupils were supposed to present a problem statement during the first days, which the teachers had to approve before they could proceed further.

(3) The teachers had separate tutoring sessions with each group during the first days, and also when needed later during the work.

In addition, the overall aim was to have a multimedia presentation ready at the end of the project period, when the groups were to show it to the other pupils in the class.

Stimulating activity

The first part of the field trial, where the pupils spent time on the specially prepared content, appeared to be a session where they were bored and uninspired. They did not find the material engaging and seemed unable to relate to most of the video material, pictures, audio and text.


However, this did not imply that they were passive in front of the computer. The pupils were rather constantly active: selecting hyperlinks on the different web sites and pages, selecting video material and pictures from the media server, scrolling text up and down, moving items, selecting text and items, dragging down menus without selecting anything. This 'hyper-interactivity' meant that information remained on the screen for only a very limited time; video was interrupted and text scrolled too fast to be read, and several video frames could be present on the screen simultaneously, occasionally resulting in a cacophony of sound from different sources. After the session devoted to going through the material, all the groups except one experienced severe difficulties in proceeding with the work. They were bewildered as to how to continue. Remarks made in one group are telling in this respect, as one of the members asked the others: "So what do we do now?" and got the answer "I don't know". A couple of minutes later another member of the same group said: [angry] "Now we have to write something, come a little further, maybe we won't have that one [the video they were playing around with] at all!" The pupils started the project by using the web browser to look at content. Gradually they turned to using Slime more as a viewer. Every item that was imported for use or created by the pupils was visible, movable and resizable on the canvas of Slime. Even the sound had a visual representation, as a frame in which the time of the audio was counted. The pupils did a lot of tentative imports into Slime, increasingly using the software as a viewer of material in general. This implied that they had to include the items they wanted to study in their emerging presentation, alongside the items already present there. A frequent pattern of use in every group was to run through the material in the presentation, in whole or in part, as soon as something new was included in it.
They did this either by playing it or by moving through the presentation by dragging the play indicator positioned in the timeline. This gave the pupils a chance to get a visual impression of the material they had found, also in relation to what they had already placed in their presentation. To include an item in the presentation, it has to be placed in time and space by the pupils. Although the placing was temporary and subject to numerous and virtually continuous changes, the continuous decision-making triggered concrete discussions in the groups about the material they imported, discussions that were rare when they looked at the specially prepared material in the initial session using a browser. Although the discussions were dominated by layout and stylistics, they were instrumental in driving the process forward. The tendency to get contrasting visual images in Slime appeared to stimulate comparative problem statements. One group studied a sarcastic video featuring a guy they understood to be a 'redneck'. They found other information about rednecks ('harry' in Norwegian) and discovered that the accounts were conflicting. These different accounts of 'harry', which they visualised when viewing the material in Slime, led them to become interested in why, how and by whom something comes to be described as 'harry'. Another group studied old and new commercials in Slime and noted how different they were in their means of drawing attention to the products, and proceeded to study the style and rhetoric of old and new commercials. Yet another group looked at pictures they had found from different cultures and mounted them simultaneously in Slime, leading to a problem statement on beauty ideals in different cultures.

The Graph of Scenes as a design facilitator

Together with teacher tutoring, particularly guidance towards defining problem statements, the stimulation of contrasts got the groups who experienced difficulties back into productive processes. One group did not experience this phase of bewilderment to the same extent as the others. On the second day of the project, they discussed and developed a scene graph consisting of eight headlines, each with 2-4 named scenes. The group spent considerable time on this work, which was structural and quite general, with headlines such as 'introduction', 'presentation of the group', 'interview with people', 'commercials' and 'conclusion', but also a couple of more specific ones, 'production' and 'development in sales'. The problem statement was not formulated at this stage, but the issue "Coca Cola" was selected. They obviously considered the making of this structure a main achievement; as the structure was put in place, one member of the group said, "OK, now we are finished" with considerable relief. The group also used the scene graph to structure the process, and to support the decisions they made as they developed their presentation.
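The scene graph is essentially an ordered, two-level structure: headlines containing named scenes, with drag-and-drop reordering changing the sequence in which scenes appear. A minimal sketch of that structure follows; the headline names are taken from the study, but the scene names under them and the function API are hypothetical illustrations, not the actual Slime code.

```python
# Ordered two-level scene graph: (headline, [scene names]) pairs.
# Headline names from the study; scene names are hypothetical.
scene_graph = [
    ("introduction", ["welcome"]),
    ("commercials", ["old commercials", "new commercials"]),
    ("conclusion", ["summary"]),
]

def flatten(graph):
    """Presentation order is a depth-first walk of the graph."""
    return [scene for _, scenes in graph for scene in scenes]

def move_scene(graph, name, to_headline, position):
    """Drag-and-drop: remove the named scene and re-insert it under
    the target headline at the given position, altering the sequence
    in which scenes appear in the presentation."""
    for _, scenes in graph:
        if name in scenes:
            scenes.remove(name)
            break
    for title, scenes in graph:
        if title == to_headline:
            scenes.insert(position, name)
            return
```

Because the same list doubles as a checklist, "Now 'history' is finished" style quality assurance amounts to ticking off entries of `flatten(scene_graph)` against the material actually placed in the presentation.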
As one of the pupils in the group announced: "This is great: the only thing we need to do is to go through it [the scene graph] and take each one at a time!" For the remainder of the project this group also referred repeatedly to the names of scenes in the list when they discussed the presentation. When they considered one part of the presentation to be complete, one said: "Now 'history' [the name of a scene] is finished". They used the list both to determine whether certain issues were relevant or not, and to place them in the presentation. When one person in the group asked to get an item into the presentation, another answered: "No, that is 'people in general' [the name of a scene]". For the remainder of the project, very few discussions or decisions were made in this group without reference to the graph. Gradually the list of scenes was also used more actively in the other groups, as a way of doing a kind of 'quality assurance'. They used it as a check, often accompanied by a run-through of what they had in the presentation, to see how well they had covered the particular issues described by the names in the graph. But the group who started their work with the graph remained its most consistent users. As the project continued, and more information was created, imported and staged in Slime, the groups put increasing focus on style, colour, layout, and the rhythm by which the different elements appear. Discussions on content became more difficult to promote. An illustration of this was a group who tested how long a text should remain on the canvas by reading through it. During this test one pupil read out loud "Cheapest is best" from the text, and tried to ask the others whether they really could write that as a general statement. However, the editing of time dominated the focus of the work: he got no reaction from the other group members, and was himself quickly immersed again in timing the reading of the text and getting the right parameters into the system.

Design as Driving Force

The disorientation found after the pupils had browsed through or even examined multimedia material closely has also been reported by others (Laurillard, Stratford et al. 2000). Laurillard et al., looking for the affordances of multimedia for learning, observed that even when the material was highly relevant and engaging, "it appears to have afforded no productive response of any kind" (ibid., p. 8). They also observed an interest in technical and operational issues rather than content in this initial period, such as making the video frame larger, discussions about the quality of the sound, how to manoeuvre in the material, etc. The authors attribute the lack of productive processes to an absence of narrative structure in the material. They suggest a structure to maintain an overall narrative line throughout the whole process, supported by teacher interventions but also achieved by design.
"Our observation of learners using interactive multimedia shows that design features can be built in to act as affordances for these learning activities" (Ibid p15)." A hierarchical goal structure is proposed with subordinate actions to give the pupils a process as a narrative line to relate to. In the study reported here no structure was implemented in the application the pupils used. Some structure was present as guidance from the teachers, particularly the demand to make a problem statement, and this did contribute to bringing them further when they experienced problems after the initial study of content. However, a main driving force for the pupils to overcome the initial limbo was how they combined viewing and studying material with integrating it in the presentation. They placed material in time and space in the presentation, integrated in the process of playing and examining it. This triggered discussions related to design, what might be the right size, and how it looked together with the rest of the material, but

126

Construction, Consumption & Creation

it also frequently transformed into discussions on content. The visual power of contrasting items in the application, as imported material was integrated in the existing material, seemed to make discussions on comparing these items inevitable. This also complies with a constructivist view on learning as it is presented in (Fosnot 1996): "Disequilibrium facilitates learning. [...] Contradictions in particular, need to be illuminated, explored, and discussed." The contrasting ways of thought did also influence the problem statements, which for the most part were of a comparative nature. This study indicates that the pupils can be given stimulation by design to enter a productive phase reasonably fast. This does not come from a narrative structure in the content, neither from particular design support of the different phases a user may go through while collecting, relating, creating and disseminating material identified by Shneiderman (Shneiderman 2000). The pupils are rather stimulated by their own design and combinations of elements. A tool that enables and stimulates them to integrate and make contrasts, in existing content as well as material of their own making, work to propel activity in the groups. Emphasis on design also came with a tendency to put effort into stylistic issues. In the field trial, the pupils devoted a significant amount of time and energy on layout and related questions, at the expense of discussion content matters. Tool characteristic dominates Previous studies have showed that pupils can become passive viewers of material when they are confronted with multimedia content (Plowman, Luckin et al. 1999; Laurillard, Stratford et al. 2000). Plowman et al recommends a balance between narrative guidance from the software and content, and support for the user to make their own narrative construction. In the concepts in this paper it can be translated to mean a balance between tool and medium qualities in the application. 
In the field trial reported here, the pupils continued to use the computer as a tool throughout the whole project period. When they were supposed to study content in the initial session by selecting different content items, they did not allow the material to run to completion after selecting it. The pupils did not have the patience to let even a very short video commercial run from start to end; the same happened with audio; the pictures lived a very short life on the screen; and the text was moved and scrolled, with only parts occasionally read. One interpretation of this behaviour is that the pupils refused to use the computer as a medium. The machine seemed to signal activity to them, and they were not able to sit back and patiently wait for one piece of material to finish before selecting a new one.


When the groups started to use Slime and imported and integrated material into their presentations, they studied the material more closely. This may have a couple of explanations. One is the procedure they had to go through to import the material, which was considerably more laborious than clicking on a link. This implied a threshold for doing it, leading to more conscious decisions to import the material. The procedure also seemed to lead the pupils to consider the imported material more 'their own'. Another explanation is that the material was implicitly related to the other items in the presentation (with the exception of the first item imported, when the presentation was empty), making it nearly inevitable to compare them, which the pupils found stimulating. In this case the pupils favoured using the computer as a tool, not as a medium; they preferred construction to guidance. They showed little interest and engagement during the initial session, where they were required to use the computer as a medium. Real engagement and discussion in the groups appeared only when they were able to subordinate the medium qualities of the software to the tool qualities.

Conclusion

Several writers have foreseen a transformation of the computer from tool to medium (Oren 1990; Laurel 1993; Murray 1997). In this case, with computers used for learning purposes, the media qualities appear subordinated to the tool qualities by the use pattern the pupils apply. In a learning perspective, this active attitude to the applications is in accordance with a constructivist approach to learning. The software used in the reported field trial stimulates the users to contrast material, and demands decisions on how content elements should relate to each other in space and time. This worked as a driving force in the groups' work, as a trigger of activity and an encouragement of discussion.
In addition, the exposure of content and the timing of elements led the pupils to spend considerable resources on stylistic issues. The groups varied in their use of the structure mechanism in the application; however, it was able to work both as a tool to structure the presentation and as a feature to support the process.

References

Fosnot, C. T. (ed.) (1996). Constructivism: Theory, Perspectives, and Practice. New York: Teachers College Press.

Kay, A. (1984). "Computer Software." Scientific American 251.

Laurel, B. (1993). Computers as Theatre. Addison-Wesley.

Laurillard, D., M. Stratford, et al. (2000). "Affordances for Learning in a Non-Linear Narrative Medium." Journal of Interactive Media in Education 2.

LeeTiernan, S. and J. Grudin (2001). Fostering Engagement in Asynchronous Learning Through Collaborative Multimedia Annotations. INTERACT, Tokyo, Japan.

Murray, J. H. (1997). Hamlet on the Holodeck: The Future of Narrative in Cyberspace. New York: Free Press.

Oren, T. (1990). Designing a New Medium. In B. Laurel (ed.), The Art of Human-Computer Interface Design. Addison-Wesley.

Plowman, L., R. Luckin, et al. (1999). Designing Multimedia for Learning: Narrative Guidance and Narrative Construction. CHI, Pittsburgh, USA, ACM.

Shneiderman, B. (2000). "Creating Creativity: User Interfaces for Supporting Innovation." ACM Transactions on Computer-Human Interaction 7(1).

Sprogøe, J. (2003). Comparative Analysis of Lifelong Learning Strategies and Their Implementation in Denmark, Estonia, Finland, Iceland, Latvia, Lithuania, Norway, Sweden. Riga, Latvia: The Danish University of Education.

Council of the European Union (2001). Report from Education Council to the European Council on the Concrete Future Objectives of Education and Training Systems.

Case Study: Multi-disciplinary, Cross-Cultural Community-building in University Multimedia Design Environments

Scott P. Schaffer & Melissa Lee Price

1. Introduction

There has been an explosion of software development products entering the marketplace over the past several years. Software development teams that work together well are thus an enormous competitive advantage for organizations. One challenge for software development teams in a global marketplace is managing and developing individuals from different disciplines (software engineering, graphic design, instructional design/educational technology) and ethnic/cultural backgrounds. Teams often encounter communication, environmental, and philosophical barriers related to completing design tasks that threaten the success of projects. The current study describes the first phase of an ongoing project designed to explore pedagogy, communications, and logistical issues related to the development of a sense of community between multimedia design students in the United Kingdom and the United States. Students with a diverse set of backgrounds and experiences interact with one another while participating in electronic discussion groups with varying levels of structure. While the students are not currently working together on project teams, they have similar goals, use similar processes and encounter similar barriers while completing multimedia design projects.

2. Electronic Discussion Groups

Electronic discussion groups (EDG) are considered a primary means of discussion and communication in most distance learning environments, and they are uniquely different from face-to-face (f2f) discussion groups.1,2 Researchers have, however, identified specific instructional strategies that facilitate active and engaging EDG in both f2f and distance learning environments.3 Electronic discussion boards supplement f2f classes by providing a means to extend classroom discussion through additional topics that may be discussed peer-to-peer as well as student-to-instructor. Such EDGs can also be used to preface classroom discussions and stimulate preliminary interaction between students.


Community-Building in University Multimedia Design

3. Multi-disciplinary teams

Many software products are developed in professional environments by one or more teams, depending on the scope of the project. The productivity of such teams is dependent on a complex array of conditions and factors that have begun to receive attention from researchers.4,5 As part of a project examining how to assess multi-disciplinary teamwork, Fruchter and Emery identified four dimensions useful in measuring the evolution of such teams.6 These dimensions are:

• Islands of knowledge: the student has mastered his or her discipline but has little experience in other disciplines;
• Awareness: the student is aware of other disciplines' goals and constraints;
• Appreciation: the student begins to build a conceptual framework of the other disciplines, and understands enough about them to ask good questions;
• Understanding: the student develops a conceptual understanding of the other disciplines, can negotiate, is proactive in discussions with participants from other disciplines, provides input when requested, and begins to use the language of the other discipline.
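Since the four dimensions form an ordered progression, self-reported stages can be coded as an ordinal scale when comparing groups at the start and end of a project. The coding below is a hypothetical illustration by way of example, not part of Fruchter and Emery's instrument.

```python
from enum import IntEnum

class CrossDisciplinaryStage(IntEnum):
    """Hypothetical ordinal coding of the four dimensions; the order
    mirrors the progression described in the text."""
    ISLANDS_OF_KNOWLEDGE = 1
    AWARENESS = 2
    APPRECIATION = 3
    UNDERSTANDING = 4

def progressed(before: CrossDisciplinaryStage,
               after: CrossDisciplinaryStage) -> bool:
    """Did a student move to a later stage between two measurements?"""
    return after > before
```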

With special relevance to the current project, Kuhne found that instructional designers working on software development projects sought to assimilate software design processes into their own design thinking.7 Similarly, instructional design models increasingly reflect software design concepts such as rapid prototyping and agile methods.8,9

4. Developing a sense of community

According to Westheimer & Kahne, a sense of community is the result of interaction and deliberation by people brought together by similar interests and common goals.10 Graves suggested that it is an environment in which people interact in a cohesive manner, continually reflecting upon the work of the group while respecting the differences individual members bring to the group.11 An online learning environment creates special challenges related to developing community. It is well established that the online delivery system itself does not influence learning; rather, it is the design and pedagogy that matter. Interaction is important to promoting community in the classroom, and to attitudes toward learning.12 Several studies have shown that a sense of community is related to learning.13,14,15,16 Thus, interaction around a common learning goal should positively impact learning effectiveness. Rovai examined feelings of classroom community in both online and f2f classrooms and found that the amount of interaction explained 30% of the variance in classroom community. He concluded that other factors, such as quality of interaction, also influenced community. Rovai further identified several factors essential to creating, facilitating and sustaining a sense of community in an online learning environment.17 The factors are: transactional distance, social presence, social equality, small group activities, group facilitation, teaching style and learning stage, and community size. In this initial phase of the project, the researchers set out to learn how students in their respective learning settings would interact with one another, given that they had similar learning goals and objectives. We were specifically interested in how we could support such interactions, foster the development of a sense of community, and increase learning about design. This action research approach will allow us to continually examine the research design and to focus research questions as the project matures.

5. Method

To learn more about the level of design thinking and community-building within a diverse group of students working on multimedia projects, six groups of students from two different university "classrooms" were studied. Sixteen students in a Computer Assisted Learning course within the Educational Technology program at Purdue University, and twenty-nine students from the Hypermedia course at Staffordshire University, participated. The Purdue course is relatively new (it has been taught once previously) and has very few prerequisites. The class met face-to-face, twice weekly. Since there were few prerequisites, students varied widely in background and prior knowledge related to course content. Graduate students were enrolled from several departments and schools across campus, including Engineering Technology, Hospitality and Tourism Management, Agricultural Economics, Veterinary Medicine, and Education. Only a small number of students were experienced in either instructional design or multimedia design. In fact, this was the first course in both instructional design and multimedia design for a majority of the students. Students were expected to complete a prototype web-based instructional product in one semester. The Hypermedia class (UK term: module) at Staffordshire University is a core class for final year students enrolled on a BA(Hons) in Interactive Multimedia. The Interactive Multimedia major (UK term: course) is a hybrid one, whereby students take half of their classes in the School of Art and Design and half in the School of Computing. Under the UK modular system, students take very narrowly defined classes that are mostly within their subject area. The BA(Hons) degree in Interactive Multimedia is a three-year 'sandwich' degree. Sandwich degrees offer the student an optional opportunity to spend an entire year in full-time work placement between their second and third years.
Thirteen of the twenty-nine students enrolled on Hypermedia had spent the previous year in work experience. The class is taught across the two semesters of the final year. It is a 'double' class, in that it is worth twice the number of normal class credits. It should be noted, however, that the Hypermedia project is not the students' major piece of final year work, as they are all also required to take a 'Route Project' that is a 'triple' class. The Staffordshire students all had significant experience with a variety of software packages used in multimedia design. They had, however, never been exposed to instructional design theory. As the Hypermedia class is taught across two semesters, the first semester is devoted to lectures in educational and instructional design theories, while the second semester focuses on the application of those theories in the design of a specific project. The Staffordshire students were also required to work with a local primary or secondary school teacher and students to ensure that the project met the learning objectives set by England's National Curriculum. As the second semester of the class focuses on the actual creation of an artifact, students have no formal class meetings; instead they have individual fifteen-minute tutorials with the instructor. Due to the large number of students enrolled in the class (twenty-nine), the number of individual tutorials a student received was limited to three over the twelve-week semester. It was hoped that the creation of the collaboration groups would give the students a measure of peer support between individual tutorials. Collaboration groups were balanced across the two classes in terms of previous classroom performance, previous experience with multimedia design, work/life experience, and ethnic background. We therefore expected that there would be no difference in the levels of participation across these groups, since they were balanced according to learner characteristics. A featured type of support was the presence or absence of feedback and participation on the part of the instructor.
Students were assigned to groups with varying degrees of structure and support, ranging from high structure/high involvement to low structure/low involvement. Pre- and post-collaboration questionnaires and transcripts of group collaborations were analyzed in an effort to assess the evolution of cross-disciplinary thinking, to detect patterns of critical thinking and problem solving related to individual projects, and to understand more about the development of community amongst students with different backgrounds and project emphases.

6. Findings
Cross-disciplinary understanding. Fruchter and Emery’s dimensions were used to measure the evolution of the cross-disciplinary teams. Most of the instructional design students reported themselves at the “islands of knowledge” stage at the beginning of the project, while 25% of the graphic design students reported themselves to be at this level. Many instructional design students also reported that they did not feel as if they had mastery

Scott P. Schaffer & Melissa Lee Price

133

of instructional design. By the end of the project, 33% of all learners appeared to be at the “appreciation” stage of cross-disciplinary evolution, as witnessed by the kinds of questions they asked of one another and their acceptance of the other discipline’s perspective. All of these learners were from structured groups. None of the learners in the unstructured groups reached even the “awareness” level, as all appeared to focus primarily on their own discipline and asked few questions of the other discipline that indicated an understanding of its goals and constraints.
High expectations, high participation. The most striking result was that meaningful interaction in the discussion boards seemed to be conditioned by classroom performance and previous design experience. High-performing and experienced students seemed more willing to share projects and provide feedback to others, while lower-performing, less experienced students seemed to shy away from the discussion groups and did not present projects for review. In many cases, interactions were brief one- or two-sentence reviews of posted projects. The amount of structure or instructor interaction did not seem to matter to those students who were quite willing to share work and ideas with others. These students, not surprisingly, rated the discussion board collaboration very favourably and encouraged its continued use.
Low expectations, low participation. While high-performing students were interacting in a meaningful way on the discussion board, what exactly were the other students doing, and why? Some Purdue students actively voiced their fear of interacting with Staffordshire students, who they believed had far superior multimedia design skills. These students were intimidated by the completeness of the Staffordshire student projects and believed they had nothing of value to add.
Sense of community. The development of a sense of community within the learning environment was militated against by several factors: the amount of time students actually had to interact (six weeks); the maturity of the projects (Staffordshire students were culminating two semesters of design and development work, while Purdue students were producing a raw prototype); and cost-benefit considerations (students had large workloads to complete their projects, and many appeared to forego active engagement and participation in favour of meeting minimum participation requirements).

7. Conclusions and Lessons Learned
Findings are discussed relative to their implications for individuals working on diverse multimedia project teams. A basic goal of this action research project is to increase the quality of student projects by encouraging students to share projects, methods and ideas, and to discuss them via a discussion board. Promoting awareness of the processes used


by learners from other disciplines, and learning ways to communicate with others, is another major goal of the project. While high-performing and experienced learners engaged in discussion and mutual interaction, other learners probably felt somewhat marginalized, or at least saw little value in active discussion in this particular forum. This finding is not surprising and somewhat mirrors the real-world design environment, in which employees are often placed on teams in a random manner. More surprising was the degree to which structure and instructor involvement did not matter in groups with both high and low performers and experience levels. High-performing, experienced learners simply appeared to thrive in an environment with relatively little structure, and openly shared ideas and products. Future interactions between learners at these two universities may benefit from some of the following research design modifications:

1. Match learner prerequisite skills and experience with levels of structure and instructor/facilitator involvement.
2. Match discussion topics and focus with current project completion levels.
3. Structure the discussion around a few topics/problems and have fewer discussion groups overall.
4. Consider structural changes to the Purdue course to match the two-semester sequence used by Staffordshire.
5. Consider implementing more specific course prerequisites, such as introductory instructional design and introductory multimedia production, for the Purdue course.

Notes
1 Kuehn, 1994
2 Molinari, 2001
3 MacKinnon, Pelletier & Brown, 2002
4 Yen et al., 1999
5 Walther, 1992
6 Fruchter & Emery, 1999
7 Kuehn, 1993
8 Tripp & Bichelmeyer, 1990
9 Schaffer & Douglas, 2003
10 Westheimer & Kahne, 1993
11 Graves, 1992
12 Russell, 1999
13 diSessa & Minstrell, 1998
14 Rogoff, 1994
15 Brown, Collins & Duguid, 1989
16 Lave & Wenger, 1991
17 Rovai, 2002

References
Brown, J.S., Collins, A. & Duguid, P. (1989). Situated cognition and the culture of learning. Educational Researcher, 18, 32-42.
diSessa, A. & Minstrell, J. (1998). Cultivating conceptual change with benchmark lessons. In J. Greeno & S. Goldman (Eds.), Thinking practices in mathematics and science learning (pp. 155-188). New Jersey: Lawrence Erlbaum.
Fruchter, R. & Emery, K. (1999). Teamwork: Assessing cross-disciplinary learning. In C. Hoadley & J. Roschelle (Eds.), Proceedings of the Computer Support for Collaborative Learning (CSCL) 1999 Conference, Dec. 12-15, Stanford University, Palo Alto, California. Mahwah, NJ: Lawrence Erlbaum Associates.
Graves, L. N. (1992). Cooperative learning communities: Context for a new vision of education and society. Journal of Education, 174(2), 57-79.
Kuehn, S. A. (1994). Computer-mediated communication in instructional settings: A research agenda. Communication Education, 43, 171-183.
Lave, J. & Wenger, E. (1991). Situated learning: Legitimate peripheral participation. Cambridge: Cambridge University Press.
MacKinnon, G., Pelletier, J. & Brown, M. (2002). Coding electronic discussion to promote critical thinking: A cross-curricular teacher education approach. Annual Proceedings of the Society for Information Technology in Teacher Education, 1372-1374.
Molinari, D. (2001). Online group processing: A qualitative study. Annual Proceedings of the Society for Information Technology & Teacher Education, 2895-2900.
Rogoff, B. (1994). Developing understanding of the idea of communities of learners. Mind, Culture, and Activity, 1(4), 209-229.


Rovai, A. (2002). Building sense of community at a distance. International Review of Research in Open and Distance Learning, 3(1).
Russell, T. L. (1999). The no significant difference phenomenon. Chapel Hill, NC: Office of Instructional Telecommunications, North Carolina University. [Online] Retrieved November 24, 2001: http://cuda.teleeducation.nb.ca/nosignificantdifference
Schaffer, S. & Douglas, I. (2003). Performance support for performance analysis. TechTrends, March 2004.
Tripp, S.D. & Bichelmeyer, B. (1990). Rapid prototyping: An alternative instructional design strategy. Educational Technology Research and Development, 38(1), 31-44.
Walther, J. B. (1992). Interpersonal effects in computer-mediated interaction: A relational perspective. Communication Research, 19(1), 52-90.
Westheimer, J. & Kahne, J. (1993). Building school communities: An experience-based model. Phi Delta Kappan, 75(4), 324-328.
Yen, S., Fruchter, R. & Leifer, L. (1999). Capture and analysis of concept generation and development in informal media. ICED 12th International Conference on Engineering Design, Munich, Germany, August 1999.

Is Electronic Community an Addictive Substance? An Ethnographic Offering from the EverQuest Community

Florence Chee and Richard Smith

1. Introduction
Many people argue that computer games are unhealthy, and some claim that they are “addictive.” Such arguments assert that as a result of these games, children have lost interest in school, spouses have lost interest in partners, and employees are coming to work tired and distracted. Is this true? And if so, what makes online games so “addictive”? In this study we report on ethnographic fieldwork from “inside” the EverQuest community, including participant observation and in-depth interviews concerning the experience of becoming a member of the EverQuest community. We test our hypothesis that video game addictions are constructions of a greater need for community. First, we discuss the issues regarding addiction and community as they pertain to this study of EverQuest. Second, we highlight initial findings from our fieldwork data. Third, we suggest possible avenues for further exploration by looking at possible implications of regulating online communities. We have looked at historical perceptions of addiction and the concept of community, and conducted background research on the online game EverQuest. Our fieldwork included participant observation in Norrath, and interviews both with people who treat addiction disorders and with those who partake in the community of EverQuest. The manner in which society constructs addictions like game addiction stands to directly influence policy concerning not only the addictive properties of games, but also the pathologization of behaviour that certain influential groups find unacceptable, or simply different.
We acknowledge that it is a risky undertaking to present the concept of addiction as a social construct; when the word addiction is used, substance dependence, destructive behaviour, and compulsivity often fall under the same umbrella term, which is confusing at best, especially where online games like EverQuest are concerned and where attempts are made to regulate those games in the same manner as narcotics. Quite a few academics have come under fire in the past for similar opinions on addiction: essentially, for having deviant opinions about deviance. They have been accused of being insensitive or ignorant about the pain caused to self and others by engaging excessively in an activity. 1 Our argument should not

1 Examples are Schaler (2000), Peele (1975), and Alexander (2000).


be mistaken for a dismissal of biomedical conceptions of substance dependence and abuse, but rather read as a piece that cautions against applying blanket diagnoses and policies to life conditions that might stem from social rather than exclusively medical causes. We also believe that as a better vocabulary is developed to articulate the differences between physiological, psychological, and psycho-social behaviour, there will be less need for the current usage of “addiction” to describe what could be physical substance dependence, a compulsion for which many reasons may exist, or simply an engaging and entertaining activity. 2

2. Addictionology and Community
We argue that it is not the use of substances that warrants the term addiction, but rather the suggestion of destructive behaviour that causes people to attach the term to an activity. We wish to show why this is problematic, especially in the context of electronic communities. It is necessary to be critical of those seeking to regulate online communities through policies that deem an activity to be addictive. As game manufacturers come under fire for deliberately designing an addictive game, a cautionary light must also be cast upon the other side of the spectrum: those calling for increased pathologization of what could essentially be a compelling community in which one chooses to exist. The word “addiction” was never closely linked to drugs or to disease until the propaganda associated with the mid-nineteenth-century Temperance movement became widespread. 3 The movement spurred an intense medical interest in treating the excessive consumption of alcohol or drugs; hence, the vocabulary has revolved around the treatment of a condition. Popular culture, in the form of self-help books and talk shows, has also been implicated in bringing medical vocabulary to mainstream discourse, promoting the idea that any personal problem manifests itself in some “…form of ‘compulsive’ behaviour—‘addictions’ to food, romance and sex, destructive relationships in which one became “dependent” on an inappropriate partner, even excessive shopping and plastic surgery…”

2 In this case, we mean the term psycho-social to include intangible and ‘soft’ reasons for one’s actions, such as cultural, community, and societal influences. Further elaboration on a similar addiction paradigm can be found at Stanton Peele, The Stanton Peele Addiction Website (2002); available from http://www.peele.net/intro.html.
3 Joseph R. Gusfield, Symbolic Crusade: Status Politics and the American Temperance Movement, 2nd ed. (Chicago: University of Illinois Press, 1986).


were eventually seen as ‘addictions’. 4 Through the media and rhetoric such as that found in the Temperance movement, ‘addiction’ has taken on a different meaning from the original sense of the word, which was the diligent pursuit of an activity. 5 The term is now used to describe very different types of behaviour, from using crack cocaine to playing the game EverQuest (often referred to as ‘EverCrack’ in the popular media). 6 The use of the single word addiction to govern such varied actions has been, and is increasingly, problematic. Addiction is a word made to encompass so many human conditions, including feelings of pleasure and pain, that it should be viewed as sorely inadequate. Unlike addiction, the word community has a positive association. As Bauman writes, “Community, we feel, is always a good thing.” 7 Briefly defined, community is a social structure built around a common interest. The word represents the ideal kind of world we wish to inhabit but cannot realistically attain. Community is often where one can realize a positive sense of self; when alienated from community, one is negatively detached and the sense of self is dislocated and fragmented. Community appears to speak of “a special closeness or bond which unites some persons and differentiates them from others.” 8 According to current definitions, the people sustaining the relationships in the EverQuest environment would certainly qualify as a valid community. The land of Norrath contains a group of people existing in the same locality and following codes of conduct, whether implicit or explicit. As players indicate, they have common interests of well-being, which the game environment facilitates, and they form distinct segments of people. There are occurrences of sharing, participation, and fellowship as they interact with one another in the game.
EverQuest is the site of community: players can be seen duplicating and adapting many of the relationship dynamics of everyday life to the game experience. This is not to say that these relationships are confined to the game, as many players have become familiar with fellow players both on- and offline. EverQuest therefore makes a good field site and case study in a cyber-context, and the relationships found there are much more complex than one might initially realize.

4 Elayne Rapping, ed., U.S. Talk Shows, Feminism, and the Discourse of Addiction, Gender, Politics and Communication (New Jersey: Hampton Press Inc., 2000).
5 Oxford English Dictionary, ed. J.A. Simpson and E.S.C. Weiner (Oxford: Clarendon Press, 1989).
6 “Everquest or Evercrack?” (CBS News, 2002 [cited June 17, 2002]).
7 Zygmunt Bauman, Community: Seeking Safety in an Insecure World (Cambridge: Polity Press, 2001).
8 Joseph R. Gusfield, Community: A Critical Response (New York: Harper & Row, 1975).


People are indeed addicted to this game, but only ‘addicted’ in the sense that people everywhere have activities in which they are actively engaged, fulfilling a need for identity and community. Instead of making ‘addiction’ in the EQ sense a disorder, people should be seen as merely finding a community to which they can belong.

3. What is EverQuest?
Sony Online Entertainment’s EverQuest (EQ) is one of the most popular massively multiplayer online role-playing games (MMORPGs) ever to hit the online game market, with a subscriber base of more than 430,000 gamers. 9 At any one point, around 2,000 players per server are going about their daily business online. Sony currently supports 41 EverQuest servers, each running an independent version of Norrath, the EverQuest world. Thirty-nine of the 41 servers are based in North America; the two latest additions are based in Europe. 10 EverQuest is set up as a persistent environment. This means that events in Norrath happen in real time, and players cannot save the game in order to return to the exact same scenario the next day. Like the real world, events in one place happen whether someone is there or not. The game approximates real idle time through changing weather conditions and roaming creatures, and one can pass the time doing something as simple as watching the sunset. Rare creatures can spawn and be killed by guilds for valuable loot while one is offline. One of the many possible reasons players feel compelled to stay in the game is the fear of missing out on something, or of being left behind. People often spend as much time with EQ as they would a second job (anywhere from 10 to more than 40 hours a week): “Avatars in virtual worlds must work to do anything interesting at all”. 11 There are a number of cases in which people have turned to ‘addictions’ in order to “deal with” being laid off from a job, for instance. This is reflective of the value society places on work as self-realization and identity. Informants in our study also noted that the difference between engaging in active

9 T.L. Taylor, “Multiple Pleasures: Women and Online Gaming,” Convergence: The Journal of Research into New Media Technologies 9, no. 1 (2002): 2.
10 Mark D. Griffiths, Mark N.O. Davies, and Darren Chappell, “Breaking the Stereotype: The Case of Online Gaming,” CyberPsychology & Behavior 6, no. 1 (2003): 82.
11 Edward Castronova, Virtual Worlds: A First-Hand Account of Market and Society on the Cyberian Frontier (Gruter Institute Working Papers on Law, Economics and Evolutionary Biology, 2001 [cited April 15, 2003]); available from http://www.bepress.com/giwp/default/vol2/iss1/art1/current_article.html: 16.


consumption (like gaming) and passive consumption (like watching television) highlighted the feeling of being social and productive, making what they do hardly an addiction. There is also an important social aspect to EverQuest, which in actuality looks like the central reason for its success. It is advantageous to talk to others: this necessity for basic social contact and information exchange lays the foundation for deeper, more meaningful interactions that underpin the apparent sense of community found in EverQuest. Because of the difficulty of excelling in the game without others, multiple forums for discussion and collaboration exist in Norrath and elsewhere, both online and offline. As Taylor (2002) finds, games like EverQuest place the user in many settings containing both on- and offline friends, strangers, and people from across the world, who may be virtually next to you in avatar form. One of countless examples of an online EverQuest community is the web site Allakhazam’s Magical Realm. 12 On this site there are links to online communities discussing EQ, the latest news, player-written storylines, biographies, chat rooms, art created by players, and many more modes of fostering a sense of community identity amongst EQ players. The game is much more than the computer entertaining a player by facilitating a little interaction: it is the nexus of a vast online community, complete with its own modes of human expression in art, culture, conflict, and resolution.

4. Moral panic and game addiction
Many players use the term ‘addiction’ to describe their attachment to EQ, perceiving their time in front of the computer as a source of conflict with their relationships in offline real life (RL). 13 The popular media has often presented EQ as a game so destructive that the moment people start playing it, they begin to sever all ties with the tangible world. Players start to alienate real-life friends and loved ones, leaving their school and work pursuits to spend their lives pursuing almost exactly the same things in Norrath. There are support groups on the Internet dedicated to the ‘victims’ left in EverQuest’s wake, such as EverQuest Widows, Spouses Against EverQuest, and so on. Our interest lies in the reasons players themselves have for this abandonment of real life, versus the reasons others may attribute to this behaviour. Also under examination is the question of responsibility and culpability: how popular and compelling must something be in order to be certified an addiction that warrants a warning label? Must everything

12 Allakhazam, Allakhazam’s Magical Realm: Your Everquest Community ([cited January 15, 2003]); available from http://www.allakhazam.com.
13 Nicholas Yee, “The Norrathian Scrolls: A Study of Everquest (Version 2.5),” 2001.


strongly compelling have such a label? There have been extreme cases of game overuse. One is the story of EverQuest player Shawn Woolley of the United States, which made headline news, game forums, and discussion groups for months and which still raises chilling questions concerning just how far a game (and its policies) can go. 14 In November 2001, the 21-year-old shot himself after playing EverQuest for thirty-six hours straight. His mother, Elizabeth Woolley, blames the developers, Sony Online Entertainment, for intentionally creating an addictive game. She is suing the company in order to get access to her son’s account and to have warning labels placed on allegedly addictive games like EverQuest. It is interesting that the game, and nothing else, was instantly blamed for his suicide. This raises many questions about how people arrive at policies based on very few, isolated cases. What else was going on in his life? Are other players experiencing the same type of hopelessness he seemed to feel? If a product is as apparently exciting and engaging as EverQuest, should companies that make good games be wary of pending lawsuits? In the following pages, we attempt to convey what at least some players think about their experiences in the game.

5. Methodology
The fieldwork for this report spans a six-month period from early December 2002 to late June 2003. In that time, Florence created her own avatar and conducted participant observation in the land of Norrath on the Terris-Thule server. There, she conducted herself with the intention of excelling in the game, the initial motive for almost all game players. By nature, an ethnographic study is rich in descriptive data, and our informants did not disappoint. Here, we provide a relatively brief synopsis of the results of the research carried out thus far. First, we discuss the mechanics of the study in terms of scope and methodology. Second, we highlight key themes that emerged from preliminary findings in the fieldwork data. Lastly, we compare and contrast our initial assumptions with the resulting data set, indicating what avenues should be explored further in the future. In addition to the researcher’s first-hand experience with the game, we use testimony from four key informants: three additional EverQuest players and one psychiatrist who has attempted to treat EverQuest ‘addicts’ in his practice. 15 Though not intended to be representative and generalizable to the greater population of players, the

14 “Death of a Game Addict” (JS Online [cited May 30, 2002]); available from http://www.jsonline.com/news/State/mar02/31536.asp.
15 For these interviews, we obtained informed consent and followed the code of ethical conduct concerning the use of human subjects as prescribed by Simon Fraser University.


data gathered and the ideas presented here do touch upon insights that we doubt could have been collected by methods other than ethnography. The answers to questions in the findings highlight even more intriguing questions to ask in future phases of this research. This paper shares just a few of the many findings, with a discussion of possible implications for online communities and notions of addiction. During the project, Florence created an avatar whose activities and development she oversaw and nurtured. Participant observation involved regularly advancing her wood elf’s experience and level, as well as socializing with those she encountered in the community, forming friendship ties and valuable networks as the study progressed. This experience was integral to the study because it directly affected her ability to understand the stories of other players, to relay the game experience of a relatively new EverQuest player, and as a result to convey possible reasons why games like EverQuest present such an opportunity to participate in an engaging community. The implications of conducting fieldwork on the Internet are numerous and intriguing, worthy of further studies in themselves. For now, however, we will convey some key findings with respect to informant testimony and researcher experience, derived with as much sensitivity to possible ethical and methodological issues as possible. In existing research on video game addiction, there is a notable lack of description addressing the subtleties of game experience and player motive. By acting as participant-observer, Florence was able to bridge the gap between player and researcher, describing her experiences from an intimate point of view while addressing those experiences with an academic analysis informed by anthropological method. She socialized with players online and conducted in-depth interviews with EverQuest players offline.
We were then able to contrast the reported experiences of these players with the data gained from the addiction literature and from medical personnel.


In the game itself, Florence chose her character’s starting attributes and then proceeded to nurture it by gaining experience and practising its skills. Choosing what appealed to her most, she settled upon a Wood Elf of the Druid class. The game allowed her to choose her name, hairstyle and colour, along with facial features and starting points that she could allot to certain attributes such as agility, dexterity, and wisdom. Each avatar has a starting geographic region, determined by its race. EverQuest has many intriguing stories, and stories are continuously circulated as expansion packs come out; as we shall see, the experiences of individual players make for intriguing narratives in themselves. In many respects, Florence’s experiences in the land of Norrath closely matched what people have come to know as “traditional” ethnography: immersing oneself for an extended period of time in a given field site while attempting to build rapport and experience what it is like to live as a local, through daily life and informant guidance. In this case, however, people are anonymous, and everyone can be who they choose to be. She was, in a sense, immediately a resident of Kelethin, instantly assimilating the attributes of being a Wood Elf, including their inherent racial tensions, and at the same time an EQ community member like everyone else. She was not immediately marked as an outside researcher.
Various field notes and reflections were catalogued on a web log, making them accessible to the Internet at large. Florence took part in chat sessions, message boards, and general conversation, a necessary component of game play. This laid the groundwork for general questions put to players in the game, and for formal interviews outside the game, for which we gained ethics clearance and ensured the confidentiality of informant identities to the extent they wished. 17 Our offline player informants, Derek, Edward, and James, were recruited by “snowball sample” in that they were friends of one another and referred to one another in their interviews. 18

17 Ethics requirements were approved according to guidelines for human research subjects as set by the Department of Research Ethics at Simon Fraser University.
18 Pseudonyms have been used for all informants to provide confidentiality.


6. Preliminary findings: some common themes
In this section we report on participant observation fieldwork from “inside” the EverQuest community, as well as interviews with players and a therapist who has treated problem players. To date, we have looked at the past and present discourse of addiction and the concept of community, and conducted background research on the online game EverQuest. We provide player testimony along with some additional unique issues arising from the online game environment that one might not encounter in a tangible community. All three player informants were 26 years old, which also reflects the average age of EverQuest players according to a study conducted by Yee. 19 Firstly, Derek 20 is a college student who “… joined the bandwagon, bought it, played it, and it was pretty incredible.” The longest he has played in one sitting is ten hours. Derek is interested in anything technical. In his spare time he likes snowboarding and playing both console and computer games. He has played anything and everything since the Atari and Commodore 64 in the early 1980s. He gravitates towards real-time strategy games like WarCraft, and he likes role-playing games, though he finds that they take up a lot of time. He currently plays CounterStrike, EverQuest and Allegiance. Derek is habitually an early adopter of many games. He got started on EverQuest after reading about it in magazines and online when it first launched. Originally none of his offline friends played the game, but he says he made friends online. “Some people have different reasons for playing. Some people are in it for being in a guild, some people are in it for the loot, items, and some people are in it to help others.” Secondly, Edward is a young professional working in the tech industry. He travels a lot and is often very busy with work.
He enjoys fantasy and role-playing games and is also a competitive billiards player. Having played games for roughly eighteen years, he now plays EverQuest exclusively when he plays PC games. He got into EverQuest because he knew one of his friends liked the game, so he tried it. The longest he has played in one sitting is eighteen hours. “The gameplay really sucks, but I had to play it. I don’t know why. I’ve given up dates, I’ve ignored my friends… neglected sleep and food and hygiene. I’ve done it all. 18 hours in a row.” Thirdly, James is a part-time student, employed part-time, and seems to be the most dedicated EverQuest player of the three

19 Yee, “The Norrathian Scrolls: A Study of Everquest (Version 2.5).”
20 Pseudonyms have been used for all informants to provide confidentiality.


Is Electronic Community an Addictive Substance?

informants. Though his time in one sitting does not exceed twelve hours, he used to play twelve hours a day and outdo the other two informants in hours amassed per week. He currently plays six hours a day, after he gets back from work or school. “It was such a huge world, and it was the social interaction. You’d have to talk to people all over the world, which I found was pretty amazing.”

What makes games like EverQuest so compelling? The player informants touched upon a number of different points in their interviews, and some very interesting themes emerged that were not previously highlighted in other studies. The development of the player-driven economy as a particular determinant of the governance of Norrath, grouping and obligation to the community, and social status were all compelling factors in a player’s decision to play the game for prolonged periods of time. We describe selected themes from our informants in the following pages.

A. The changeover from computer to player-run economy
When EverQuest (EQ) first started in 1999, the economy was largely controlled by computer-controlled merchants, known as Non-Player Characters (NPCs). Players, who were all relatively low-level at the time, had to rely on NPCs to buy and sell goods. As the game evolved, players increasingly wrested control of the economy from the NPCs and turned it into the current barter system. Derek describes the practical reasons for the change: “There were some high level characters like basically I might be level 20, but there are level 40 characters around and they would be able to kill something and get these fascinating weapons that everybody wanted. For example, the wizards, heavy magic users, had to buy spell ingredients. In order to do that, they needed a lot of money. They didn’t care about weapons and armour, so they used to sell the weapons and armour to other characters to get enough money to buy and create their spells.
There was a huge investment involved in making spells. There were monks and such that decided that they didn’t need weapons, so could sell weapons for a certain amount of money.”

The developers of the game responded and created a common area called the Bazaar, where players meet at scheduled times. Consequently, a freer market came into existence, and with the added freedom, emergent narrative and self-determination, the community continued to flourish. “It became a hub. It became where



everybody would go to buy or sell equipment, and you can meet somebody there, and you can always get something there… at the Bazaar.”

B. Grouping and participation in raids
Another theme was the need to join a player-organized group in order to be as successful as possible in the game. Edward brought up raids, which are missions involving many high-level players (from 30 to as many as 80) assembled to vanquish a large target. He told us of a time when his guild was “camping” (waiting for) a target to emerge. The target was Rage Fire, a triggered spawn that one had to catch at the right time. The target could take anywhere from three to five days to spawn, and Edward’s guild had been camping it for a while. One of the guild members camped Rage Fire for two days straight, in shifts with a friend. One night, Edward’s phone rang at 3:45am. He picked up, and the guild member in charge of the phone list told him that Rage was “up.” “So I get up and boot my computer, log on, and park in the right area, and we killed the creature and went back to bed.”

Playing with a group of peers, and the social element associated with it, was by far what players talked about most as “fun.” When we asked James what raids were like, he described his experiences as, “…it can get really boring, cause you’re just waiting. Then all of a sudden it’s just like—turns from sheer boredom to complete terror.” All three players spoke of a reluctance to leave a good group. Each player Class is capable of contributing its particular talents to a group, such as healing, spell casting, or fighting. Derek recounts his experience playing a Cleric: being the only full healer in the group, he could not log off while the group was still playing, because that would essentially mean destroying the group.
“There’s the aspect that you didn’t want to be a loser and just drop from a group after you get something.” Florence felt this pressure the times she grouped as well; even though it was 3 o’clock in the morning and she had to wake up for work in three hours, her sense of “responsibility” to players she had known for a very short time kept her logged on and contributing. It was not the promise of personal monetary reward that kept her and other players on, but rather a sense of duty. James reported that it is currently this social dependence that keeps him playing. He mostly participates in raids to help other people, as opposed to being in it for personal advancement and loot. While “soloing” is possible with certain classes like the Druid, many players find “grouping” ultimately fulfilling and an essential part of the greater EQ culture, as players can only get so far alone. Players’ complementary strengths allow a group to accomplish the most in a given amount of time. There are many facets that make finding a group attractive, such as the continuous rapport-building amongst guildmates, companionship during raids of 30 or more players, and getting



the most loot and experience by taking down a larger target. James conveyed the excitement he felt in a particular raid he recalls:

“Most people do it with 50 people, but there’s 32 of us. We were doing pretty good and then all of a sudden people started dying. A lot of people. Even I died when [his] health was at 40%. So I’m bound in the Nexus, and I’m naked and all I can read is the main channel. And you know those war movies? They’re yelling and stuff, that’s what it was like. The Druids saved us. [The Dwarf] was at 10%, they were like Nukem! Nukem now! And boom, we won.”

The level of dedication indicated by players is rarely seen for something that people do not care about, and leaves much to be considered when studying the place EverQuest occupies in the lives of individual players. James’ guild, like many, works on a point system: the more raids a player attends, the more points they get. When loot is up for auction, players use their points to bid on items. Hence, many players want to log on and attend as many raids as possible.

C. Status and sense of accomplishment
In their respective interviews, we asked Derek, Edward, and James what they felt kept them and others playing, and the reason that came up each time was the desire to achieve the next level for their characters. Edward insisted that the real fun did not begin until level 60, so one is more inclined to play more to get one’s character there faster, in anticipation of participating in high-level activities such as raids. James’ thoughts when he was offline would be occupied with the cool things he could do once he reached a particular level. The researcher’s experiences were similar, as she too felt compelled by reaching the next level when she played. Achieving status and a sense of accomplishment could be done in a number of ways. As James’ testimony indicates, he feels an immense sense of pride in conducting a raid with only 32 players when most guilds need at least 50.
He also indicates his feeling of being part of something bigger, of creating a historical narrative (as in war movies), his enjoyment of teamwork, and the fulfilment that comes with those elements. “I’m in it because it’s really good. Especially when you take someone down that’s really hard. It makes me feel pretty good to do more with less.” As well, the drive to increase a character’s level exists because it allows the player to do more and see more, thereby lending more status to the character. Having the best weapons and skills helps a player participate in more group events like raids and be seen as a more valuable member. Hence, the more time one spends online nurturing a character, the more prominent that



player becomes as a member of the community. James noted, “… if I get to level 45 I can cast all these cool spells. That was the motivating factor back in the early days… [now it’s] 5 more alternate experience points and I can get a horse.” Equipment is a very important status marker in the game. Edward stated that he “twinked” out his level 1 Druid, meaning that he transferred to his Druid equipment better than a completely new player might have. “I had the best stuff that you could buy for a Druid.” James is five alternate experience points away from obtaining a horse - a highly visible marker of prestige and prowess for high-level characters. One can gain the awe of peers with the best equipment the same way having a large house or an expensive car might, and one can gain the respect of peers by being a high-level character the same way holding a prestigious profession might in the tangible world. Derek summarized what keeps many players from logging off: “…it was the idea that if you’re in a very good group, and you’re gaining a lot of experience points… you enjoy the feeling of achievement.” Griffiths (2003) indicates that players found the difficulty of advancing for casual players to be the least fulfilling aspect of the game. 21 This finding is consistent with our informant reports and observation: there is indeed a significant difference, socially and logistically, in the rewards for players who spend more time online. On the other hand, players reported that their favourite activities were levelling and building up their character, grouping and interacting with other people, and chatting. Derek pointed out that there is very little reward in logging on for a short time. The game takes roughly ten minutes to log onto. His experience coincided with the researcher’s: he found that he needed about three hours to fully get into the game and find it rewarding. This was an interesting, though understandable, coincidence.
There are numerous everyday little ‘errands’ to do while waiting for something “exciting” to happen. It is our assertion that the game provides common goals that are fulfilling for the individual as well as the group, and that this is what facilitates the rich, socially interactive environment that is so attractive to many players. Similarly, Griffiths indicates that one of the most consistent points regarded the significance of the social milieu in determining the level of enjoyment one attained from gameplay.

D. Anonymous therapy
James, in his many hours online, has had the chance to nurture relationships with his fellow players. The chat system became a way for players to obtain anonymous therapy regarding their offline problems. James recalls being almost a “surrogate father” to a ten-year-old who told

21 Griffiths, “Breaking the Stereotype: The Case of Online Gaming,” 85.



James his problems, as well as countless times during idle moments when people asked him for general advice about relationships. “People keep telling me things that they normally wouldn’t tell anyone else. Maybe it’s because of that anonymity…they don’t know me, and I don’t know them, so who will it get back to.” Unlike Edward, James prefers to remain anonymous. Edward has given his contact information to his guild members for use in raids, whereas James keeps in touch by way of forums and separate email accounts. This shows that one’s emotional attachment to the EverQuest community does not require revealing one’s identity in the tangible world - and one might not want to anyway.

E. Therapy for EverQuest addicts
As Alexander notes, “Membership in something seen as destructive is far more endurable than no identity at all.” 22 Our interview with a psychiatrist who specializes in troubled youth was especially enlightening with regard to the implications of calling EverQuest an addiction. In his experience, despite the obsessive-compulsive nature of addiction, those who came to him as “EverQuest addicts” did not respond to the anti-obsessive medications that he administered. “Someone who’s addicted to alcohol doesn’t respond to anti-obsessive medications either.” When we asked about the circumstances by which he met these kids, he stated that it largely seemed to be part of some other concern, such as not doing well in school, lacking (offline) friends, and battles with parents. “One kid was in here with his dad and the dad couldn’t get him to school…Dad had his own addictions and this boy was just massively addicted to EverQuest/EverCrack. That’s how he came in.
Not going to school and not participating…You don’t really see the addiction, you see the fallout of the addiction: school, friends, marks, hassles… I don’t really know if that’s the outcome of the addiction, or whether those pre-exist the addiction, or maybe those are some of the ingredients needed for addiction.” The psychiatrist believed that the main allure of games like EverQuest is the power and control players feel. He made reference to the amount of time players say they have invested in evolving their avatars to their current level of power. While it is possible to think of the allure of the game in this light, as we have noted, the feelings the players have seem

22 Bruce Alexander, “The Globalization of Addiction,” Addiction Research 8 (2000).



to be much more complex. Then again, why wouldn’t someone wish to participate in something that allows the player to engage with others, feel in control, and work hard to achieve power? When we asked players about the reward of levelling up, they did initially report the power. When probed further, however, that power became a multitude of other senses - primarily the sense of accomplishment one might get from doing anything well. Edward bragged about getting his Druid from level 1 to 53 in two months. Derek noted a feeling of accomplishing something greater, particularly in that every member of a group plays an integral role in its success. James feels the biggest sense of accomplishment when he is part of a raid that can eliminate a target with thirty people when it would normally take fifty. There are bragging rights associated with being known as part of one of the most powerful guilds on the server. There is something for everyone in the game, and multiple ways for people to define their own power. People are very inventive in the ways they finance themselves, as James recalls of a Bard who made her living by buying things below market value and selling them above market value. When commerce is involved, this type of outsourcing is possible, so people who do not wish to engage in combat can find strategic ways of avoiding it, as in this case. The Bard made enough money to pay one of the most powerful guilds on the server to help her complete her epic quest and obtain high-status equipment. In this manner, she did not have to do the quest herself, and saved herself time. So far, the psychiatrist and his colleagues have been unable to find an EverQuest ‘antidote’, but we found it thought-provoking that the only successful “treatment” for his EverQuest patients was for the therapist and parents to work together to get their offspring interested in other, non-electronic activities.
Ironically, all of these drug-free solutions centred upon finding alternative communities, such as horseback riding or swimming - although one must ask why these are considered solutions. Are they solutions only because society tends to privilege non-electronic activities like sports, which people can pursue just as compulsively, if not more so? Derek echoed this sentiment when he noted: “Sports are addictive. A person will play hockey for their entire lives. If a person spends the same amount of time on a sport as they do a computer game, we would not label them as being addicts because they’re striving to improve themselves. The problem with a computer game is that they’re trying to improve themselves in a virtual way, something that doesn’t accomplish something in real life. That’s sort of the difference, but you still have the same level of accomplishment. You still feel good about yourself.”



Social roles and relations are continuously being negotiated and played out online. The case is no different in EverQuest, where there are many instances of behaviour reflecting that found in the tangible world, such as social hierarchy, gender experimentation, and the construction of bonds like marriage. James shared an anecdote about a wedding he attended in Norrath. The ceremony took place on one of the most beautiful white bridges of Norrath, located in North Karana. Numerous players attended, and James, who was running late, paid someone 200 platinum pieces to port him there. He described the scene as like what one would find in a medieval movie, with beer, duels, and much jubilation. Apparently the marriage only occurred online, though there are other instances of EverQuest players meeting offline as well. One wonders, if EverQuest is just a simple game, or even an addictive ‘substance,’ why people would bother to do things like get married.

7. Conclusion

Around the world, people are dealing with online game usage by doing things like suing game developers, calling for warning labels on games, implementing curfews to curb game usage, prescribing medications, and sending kids to therapy. 23 On one hand, businesses like game manufacturers and Internet café owners are blamed; at the same time, game players are encouraged to realize they have a problem that could be all their own. The evidence is complex, and often conflicting. Our informants likened their participation in EverQuest to a hobby that lent status and identity. Is it right to label an online community addictive? Are curfews necessary? Are stricter laws working to curb game addiction? Our analysis of game experience and informant testimony suggests that regulation and control of games is ultimately not an appropriate course of action for healing social dysfunction, of which excessive participation in electronic communities is only a symptom. The energies spent searching for stricter rules and better treatments should instead be spent on addressing gaps in the context of people’s lives. Through discussion and exploration of addiction and community through the lens of EverQuest, it is apparent that EverQuest is a valid community, and that labelling the game as addictive is a fundamentally incorrect course of action. The experience of EverQuest does not seem to be an addiction from which one needs to be dissociated, but rather a community with which one chooses to identify. An in-depth discussion of perspectives regarding the historical contexts and social causes of

23 Caroline Gluck, South Korea’s Gaming Addicts (BBC News, 2002 [cited June 30, 2003]); available from http://news.bbc.co.uk/1/hi/world/asia-pacific/2499957.stm.



addiction, such as the basic human need for community, will be addressed in future reports. In this case study of EverQuest, we tested our hypothesis that video game addictions are constructions of a greater need for community. First, we discussed the issues regarding addiction and community as they pertain to this study of EverQuest. Second, we highlighted initial findings from our fieldwork data. Third, we suggested, throughout the paper, possible avenues for further exploration in terms of policy, implications, and the social causes of addiction. Player testimony and participant observation have indicated that the game is compelling, but it should only be considered addictive insomuch as any other social behaviour - like work, school, or hanging out with friends - is. EverQuest should not be labelled an addictive substance, because to do so would oblige us to label everything potentially likeable as addictive. EverQuest fits the criteria for a community, and is a rich, robust, imaginative and almost utopian one - unlike many communities found in the tangible world. It is for this reason we believe EQ is a very attractive community of which to be a part. We were curious to see whether the researcher’s participation in the game would have adverse effects similar to those in other reported worst-case scenarios. We can safely say that, aside from a few nights of staying up late to group or chat, there was no harm done.


Bibliography

Alexander, Bruce. “The Globalization of Addiction.” Addiction Research 8 (2000): 501-26.

Allakhazam. Allakhazam’s Magical Realm: Your Everquest Community [cited January 15, 2003]. Available from http://www.allakhazam.com.

Bauman, Zygmunt. Community: Seeking Safety in an Insecure World. Cambridge: Polity Press, 2001.

Castronova, Edward. Virtual Worlds: A First-Hand Account of Market and Society on the Cyberian Frontier. Gruter Institute Working Papers on Law, Economics and Evolutionary Biology, 2001 [cited April 15, 2003]. Available from http://www.bepress.com/giwp/default/vol2/iss1/art1/current_article.html.

“Death of a Game Addict.” JS Online [cited May 30, 2002]. Available from http://www.jsonline.com/news/State/mar02/31536.asp.

“Everquest or Evercrack?” CBS News, 2002 [cited June 17, 2002].

Gluck, Caroline. South Korea’s Gaming Addicts. BBC News, 2002 [cited June 30, 2003]. Available from http://news.bbc.co.uk/1/hi/world/asia-pacific/2499957.stm.

Griffiths, Mark D., Mark N.O. Davies, and Darren Chappell. “Breaking the Stereotype: The Case of Online Gaming.” CyberPsychology & Behavior 6, no. 1 (2003): 81-91.

Gusfield, Joseph R. Community: A Critical Response. New York: Harper & Row, 1975.

———. Symbolic Crusade: Status Politics and the American Temperance Movement. 2nd ed. Chicago: University of Illinois Press, 1986.

Oxford English Dictionary. Prepared by J.A. Simpson and E.S.C. Weiner. Oxford: Clarendon Press, 1989.

Peele, Stanton. The Stanton Peele Addiction Website, 2002. Available from http://www.peele.net/intro.html.



Rapping, Elayne. “U.S. Talk Shows, Feminism, and the Discourse of Addiction.” In Gender, Politics and Communication, edited by A. Sreberny and Liesbet van Zoonen. New Jersey: Hampton Press, 2000.

Taylor, T.L. “Multiple Pleasures: Women and Online Gaming.” Convergence: The Journal of Research into New Media Technologies 9, no. 1 (2002): 24-26.

Yee, Nicholas. “The Norrathian Scrolls: A Study of Everquest (Version 2.5).” 2001.



Author Affiliation / Notes on Contributors

Florence Chee
Research Assistant, Centre for Policy Research on Science and Technology at Simon Fraser University, and the New Media Innovation Centre
Suite 600-515 West Hastings Street, Vancouver, BC V6B 5K3, CANADA

Florence Chee is a graduate student in the School of Communication at Simon Fraser University, Vancouver, BC, Canada. With a background in Anthropology and Honours Communication, her primary academic pursuits include research on the social implications of technology and online gaming communities, as well as how ethnographic narrative can inform technological design. Florence is part of the research staff actively involved in SFU’s Centre for Policy Research on Science and Technology (CPROST) and the New Media Innovation Centre (NewMIC).

Richard Smith
Associate Professor, Communication, Simon Fraser University
515 West Hastings Street, Vancouver, BC V6B 5K3, CANADA

Richard Smith is an Associate Professor of Communication at Simon Fraser University. He is also Director of the Centre for Policy Research on Science and Technology (CPROST) at SFU’s Harbour Centre campus in downtown Vancouver, and a Research Scientist at the New Media Innovation Centre (NewMIC). Dr. Smith studies the social construction of our relationships with technology. He is currently researching: 1) new tools to help designers and policy makers think about the future in creative and constructive ways, 2) the role of social capital in clusters of high-technology firms in the new media sector, and 3) new applications for information technology in support of scholarly publishing.

When Identity Play Became Hooking Up: Cybersex, Online Dating and the Political Logic of Infection

Jeremy Kaye

You can be whoever you want in cyberspace.
- Early Net dictum

But it is important to remember that virtual community originates in, and must return to, the physical. No refigured virtual body, no matter how beautiful, will slow the death of a cyberpunk with AIDS.
- Allucquère Rosanne Stone 1

1. Pre-Introductory Notes and an In-Flight Movie

In the last few years, with the proliferation of Internet dating/sex sites, one thing has become clear: websites such as match.com, lavalife.com and adultfriendfinder.com, as well as more “controversial” ones such as barebackcity.com, prove that the Internet is no longer just a virtual space in which disembodied subjects “chat” with one another, never really knowing who they are chatting with and never intending to meet their chat-partner in material reality (that is, RL or “real” life to cyber-junkies and online chat-addicts). More and more Internet users are using the web’s unmatched communicativity to meet online with the intention of, what I will call, physically “hooking up.” In fact, the British newspaper The Guardian estimated in 2002 that over one-third of all Internet users are using the web to physically meet, with the possibilities of romance and/or sex.2 What this type of use implies is that a paradigm shift has taken place: a shift that not only has gone largely uncommented upon by theorists of cyberculture studies (also called new media studies), but that also radically alters many of the implications of the existing theoretical discourse that has shaped how cyberspace is used and thought of in our culture. In other words, what I hope to argue in this paper is that this so-called paradigm shift implies more than just an increasing use of the Internet for sex; it also alters many of the ideological implications of the Internet as such. The signs of this shift are endless. The largest online dating site in the U.S. is Match.com, a Microsoft-run website with over eight million customers in the States.
It is similarly not uncommon to pass lavalife.com billboards while driving the freeways here in California, or to read a Rolling Stone article about a particularly “notorious” sexual site - a website that not only tells its users that its purpose is physical sex, but also essentially forbids them to use it



for anything but meeting in person with the intention to have sex.3 In short, the information is out there. It is, at the very least, intuitively understood that users are logging on to the Internet with the intention of RL sex. After all, how else could we have the Match.com “success stories” that greet us when we first log onto that website, with “success” defined as couples who meet on Match.com, get married and have children - i.e., compulsory heterosexuality with procreative purposes? But it is also, again at the very least, symptomatic of the repressive roots still so deep in our cultural ethos that online “Netsex” users seem a bit reticent to talk about the web in this way. And the critics behind the theoretical construction of the Internet as such appear no different at first glance. When sitting down to begin this research endeavour, what I found most striking and problematic was that no theorist seems to talk about this radical shift in chatting culture - from what I term identity play to what I term hooking up. Cybercultural theorists - from Sherry Turkle and Mark Poster to Donna Haraway and Allucquère Rosanne Stone - are wholly invested in paradigmatic questions of the Internet and identity, and the Internet and embodiment; but they seem wholly to overlook the reality that people are using the Internet to have real, that is corporeal, sex. So the question becomes: why aren’t theorists talking about the Internet as a space for sex? And, possibly more important, why aren’t non-academic subjects talking about this cultural phenomenon either?
It is with this question in mind that I found myself recently on a plane over the Atlantic, laptop on lap, not coincidentally about to present a previous version of this paper at a conference in Prague.4 Having reached an intellectual roadblock, I decided to take a break from revising and watch Bringing Down the House, a movie that I was positive would take my mind off my work. I remembered the movie being in theatres a few months prior. The plot, vaguely, is as follows: comedian Steve Martin plays a divorced, workaholic, lonely-of-late lawyer who, by an online accident (which, by no accident, finds its way into this introduction), meets a loud, busty, ex-con African-American woman, played by Queen Latifah. Now, what is most interesting for me about this movie is how the Internet gets coded in two particular scenes (the second of which I will turn to later in the paper, in the discussion of online dating and digital photography). The first scene, occurring early in the film, portrays Martin sitting in his office, chatting with his new online romantic interest, about to arrange a meeting with her in person. But when Martin’s friend and co-worker comes into his office, Martin abruptly closes his computer and, subsequently, is embarrassed to reveal he is going on a date with a woman whom he met on the Internet.



Watching this scene, I thought of a similar narrative played out in an advertisement for Match.com, which, as I stated previously, is the largest dating site in the States. This advertisement, shown primarily in cinemas as a theatrical trailer (approximately a year or two before the release of Bringing Down the House), has interesting similarities with the scene in the film I have just described. In the advertisement, a newlywed couple talks to one set of parents about how they first met. The couple narrates a fictional account of their initial meeting as a romantic, chance encounter near the Eiffel Tower in Paris. At the same time as we the audience are shown “fictional” film footage of their “meeting” in Paris, we learn that, in actuality, the couple met on Match.com. (As the fictional film footage of the couple in Paris is shown, a caption below gives us this information.) In short, these two examples speak to a wider, more generalized cultural unwillingness, perhaps even embarrassment, to talk about virtual space as a vehicle for real meetings/real sex. There seems to be an embattled relationship between cultural ideology (what is considered a “normal” space for meeting partners [bars, clubs, Paris]) and cultural praxis (what might be considered the “abnormal” use of the Internet to meet people). What seems clear is that the Internet has taken on the cultural stigma that was once attached to newspaper personals and other dating services via telephone and television. What also seems clear, however, and what these pop culture representations - and indeed the eight million people on Match.com - seem to be copping to, is precisely what the theoreticians are still unwilling to discuss: the very reality that people are using the Internet to have “real” sex.
Insisting on the virtual as a disembodied space of anonymous identity-play with no real potential for material interaction both reflects and masks certain ideological assumptions. It constrains new media discourse from taking fully into account the political ramifications implied by the increase of physical/sexual interactivity. This paper, in essence, is a kind of polemic that argues not only that an epistemological rupture has occurred in the constitution of cyberspace, but also that the theoretical foundations surrounding cyberspace need revision. And if the discourse is not opened up soon, there is a real risk of losing cyberspace to hegemonic, heterosexist discourse and a correlating culture of stigma and fear.

2. A Proper Introduction - Or, “Who uses the Internet just for cybersex anymore? Everyone I know uses it to get laid!”

Let’s face it. Anyone who’s had cybersex knows what a nuisance it can be. You’re trapped behind a computer screen, typing away to god

knows who, and besides, you’re left to do all the finishing touches yourself. But wait a minute. Cybersex was a 90s thing, wasn’t it? A fad? How many people are still doing it? Aren’t people today using the Internet’s vast networks in order to physically meet the “right” person and, inevitably, have “real” (meaning physical) sexual intercourse? In other words, has the paradigm shifted - from disembodied, anonymous cybersex back to physical sex? I think of everyone I know, and how they have all used the Internet, at one time or another, to physically meet and hook up with both men and women (some use it religiously; others have used it once or twice, and then have gone back to the old-fashioned method of meeting people - bars, clubs, blind dates, et al.). When starting to think about this project, I thought of one particular friend of mine, who had been going online for years, from chatting to role-playing and multi-user domains (or MUDs, as they are commonly called). I’d assumed he was using the Internet in the spirit of Sherry Turkle’s “social laboratory,” with the hope of self-reinvention. Following Turkle, my initial theoretical questions were based on the assumptions and paradigmatic questions surrounding embodiment and identity in cyberspace that have informed so much of the discussion around new media: can you really be whoever you want to be in cyberspace? How are the meanings of this “virtual” identity negotiated physically, psychically and culturally with one’s “real” identity? After talking to my friend at length, my question changed, though not in the way I expected. “Being whoever you wanted to be in cyberspace,” my friend told me, “was the old way of thinking, the text-only way.” He said, “Back then, you could be whoever you wanted to be. Now, you have pictures, you have bio-profiles, you have interests. You have to sell yourself. You can’t exactly be whomever you want anymore.
It’s not like it used to be where you could make up and play several personalities at once. Now you have a picture. With your picture up there, you can’t really be anybody else. Yeah, maybe you can tinker with your personality a little bit, but you can’t change what you look like. And you need to have a picture. Without your picture, no one will talk to you. That’s the first thing anybody looks for.”5 I then went back to my friend and asked him the main reason he went online, and he answered, to hook up. “Cybersex?” I asked. “No. Cybersex is really kind of looked down upon in chat-rooms. Kids’ stuff. Everyone I know uses the Internet to get laid.” At the same time as I was made aware of my friend’s online “activities,” I encountered a front-page article in USA Today, in which the United States Government officially blamed an increase in the AIDS

epidemic on the Internet.6 I thought of how the traditional text-based idea of cybersex (i.e., two anonymous subjects typing at computer terminals, unable to see the other) had been politicized in the early 1990s as an alternative, disembodied, fluid-free, safe “sex” for AIDS sufferers. But now, on the contrary, the Internet was causing AIDS rather than providing haven for its victims - at least according to the government. I paused for a moment. The thought of this terrified me. If we look closely at cybersex as a cultural phenomenon, it contains many of the discursive formulations around identity and the Internet. As I stated previously, most of the discourse around identity and new media had been formulated in the early to middle 1990s - an era that was working from a disembodied, text-based computational paradigm.7 Seeing the possibilities of de-centred subjectivities, de-linked from the corporeal body, theorists in the early 1990s saw cyberspace as a kind of “utopian” space for mobility. This trope of mobility was formulated in two main ways: (1) identity mobility (i.e., you can be whoever you want to be in cyberspace); and (2) mobility in a subversive sense, that is, ideologically-constructed norms can easily be undermined in cyberspace (e.g., there are no closets in cyberspace; or, cybersex is the ultimate safe sex for the 90s). The theoretical manifestations of these de-centred subjectivities have been Donna Haraway’s cyborg, Stone’s multiply-personaed, cross-gendered subject, and Mark Poster’s vision of a “postmodern” and “post-national” subject. My initial theoretical project followed these assumptions, but in the context of my friend’s statements, I had to ask myself a more difficult question: is identity-play the primary reason that subjects are using the Internet?
In other words, how have these paradigms of identity-play and resistance been essentially changed by the addition of specific technologies of location (user-profiles, photos, personal information)? Perhaps the discursive formulations of these mid-90s theorists were in need of revision. In what follows, I will explore what the discourses surrounding online identity construction have yet to take into account - a third, and perhaps most significant, notion of mobility: (3) physical mobility, as the Internet is being used more pervasively as a vehicle for corporeal sex. So, how do we attempt to theorize the change that has taken place in cyberspace, from a “disembodied” space to a vehicle for corporeal “hooking up”? I locate the critical moment of this historical shift in the widespread introduction and use of the digitized photograph in chat user-profiles in the mid-1990s. I will then argue that such a paradigm shift in the way the Internet is perceived and used has ultimately fuelled a dialectical movement towards containment. This dialectical relationship between mobility and containment forms the structural framework for this

analysis. The tropes for this containment that I will explore here are (1) photographic (seen in the digitized photograph as it constrains identity-play, while simultaneously re-“embodying” virtual space); and (2) a media-constructed “moral panic” due to the fear of infection/contamination (as seen in several mainstream-media outlets). With this paradigm shift placed within a historical context, and hooking up configured as mobility, I will discuss this ideologically-motivated fear of infection as nothing but the panoptic gaze (re)entering cyberspace, stigmatizing the “new” virtual space as inherently virus-spreading (formerly virtually spreading within networks, but now actually spreading from networks into material bodies through the ubiquitous figurehead of AIDS).

3. Allucquère Rosanne Stone and the Privileging of the Virtual over the Material in Cyberculture Theory

The privileging of the virtual over the material has long had its accepted place in the theoretical discourse of cyberculture studies. As early as 1995, Sherry Turkle, in her groundbreaking book, Life on the Screen: Identity in the Age of the Internet, says the Internet has become a “significant social laboratory for experimenting with the constructions and reconstructions of self.”8 It is, indeed, difficult to even consider the Internet as anything other than this infinitely disembodied space, in which subjectivities, as Turkle notes, are separated from their corporeal bodies, enabling subjects to become whomever they want online. If one is a man and wants to become a woman (or vice versa), one can do so, easily and naturally, online. If one is shy in social situations and wants to become rather gregarious, one can meet hundreds (perhaps even thousands) of different people in cyber chat-rooms.
In other words, cyber-subjects are free to become different versions of themselves because online, one is no longer empirically bound to one’s corporeal body, and identity-play is the norm. If much of the theoretical discourse surrounding cyberspace follows these central tenets (disembodiment, identity-play, transformation), then the Internet is clearly being constituted as the quintessentially “postmodern” space, in which there is an intense blurring of fiction and “reality,” making it increasingly difficult to tell which is which. The new media theoretician who has written what I consider to be the most thorough, complicated, and complicating critique of the contradictory impulses at work in cyberculture studies as a theoretical movement is Allucquère Rosanne Stone.9 In The War of Desire and Technology at the Close of the Mechanical Age, Stone daringly negotiates the borders between the utopian virtual and the dystopian material (those

contradictory impulses in new media studies). She articulates the history of technology as a “war” between “bodies” and “selves,” paying particular attention to the ways in which computational networks alter the boundaries of the body - both physical and psychical. All of Stone’s ideas are embodied in her figure of the multiply-personaed, transgendered, virtual/material subject, as s/he navigates the endless nets of information. Deconstructing the traditional binary relationship between virtual/material is perhaps the key to Stone’s critique, in that her subject, as well as her project as a whole, seems to rely on this notion of displacing the material body as the privileged “site of political authentication and political action.”10 In other words, Stone’s positing of a multiply-personaed, transgendered subject enables her to theorize cyberspace as a discursive space which constructs subjects that are radically changing the political economy of material reality. The only problem I see in Stone’s truly visionary critique of what she terms the “metaphysics of presence” (the cultural privileging of the material body as the true site of agency) is that her transgendered subject needs cyberspace in order to effect these changes in the political climate. In other words, one needs to enter the virtual space in order to “transform” material space. Here is Stone, herself, on the subject: In cyberspace the transgendered body is the natural body. The nets are spaces of transformation, identity factories in which bodies are meaning machines, and transgender—identity as performance, as play, as wrench in the smooth gears of the social apparatus of vision—is the ground state. … Conversely, I think, obviously, in physical space the transgendered body is the unnatural body.11 The utopian rhetoric, at least in this instance, is clear.
To posit the transgendered body as “natural” in cyberspace (while still unnatural in material reality) is to focus myopically on the radical changes implicit in cyberspace, and to neglect a simple question: how much does Stone’s rhetoric here rely on notions of disavowing the material body and all of the violences, both physical and cultural, that are written onto that body (i.e., physical handicaps, racism, sexism, homophobia, transphobia, to name just a few)? In fact, this is the question that Stone seems to evade throughout her entire argument. She plugs along, continuously iterating that there is no essentialized identity - all identity is performative, and playing with identity is the norm. In order to cement this point, Stone comments that 15% of the 150,000 users of Habitat, one of the first virtual

communities, are engaged in cross-gendered behaviour.12 Mobility, in the sense of changing identities at will with a few keystrokes, is seen here as, if not utopia, then at least a site for radical “political action.” What I always found dissatisfying about these theoretical positionings is that there seems to be a blind, technophilic, utopian rhetoric working here - something akin to: you can be whoever you want in cyberspace; or, any personal stigma or bodily threat you experience in “real life” is thereby negotiated when you enter the self-enclosed “fantasyland” of cyberspace. Before we embrace virtuality as a social laboratory (which seems to imply real-life change, similar to Stone’s notion of multiple cyber-identities disrupting the “social apparatus”), we have to ask ourselves who controls our “real” identities, and whether the virtual systems (as modes for identity construction) are a constructive way to get out of their control, or whether those disembodied systems are simply an escape. And, more importantly, do these performative net identities promote “real” life changes? For instance, if, according to Stone, a transgendered body is “naturalized” online, then how far are we from if not naturalizing, then at least accepting, transgendered bodies in “real” life? In order to answer these questions, let’s return to Stone for a moment. She defines the disembodied anonymity of cyberworlds through what they lack - namely, corporeal bodies and all of the identificatory and possibly marginalizing “real” world cultural encodings that are attached to those bodies. Stone goes on: “codings that have attached themselves to voice quality and physical appearance have been uncoupled from their referents.”13 (As noted before, Stone is writing here in 1995, before the digitized image had become ubiquitous in online identity construction.)
So, in effect, the multiple identities that Stone sees as a point for dynamic political change are inextricably linked to the fact that they are disembodied, anonymous - outside of these cultural “codings,” which constitute identity in “real” life. In other words, Stone sees the material body as a site of politico-ideological containment, and, on the other hand, sees multiple online identities as a way to remain “politically” mobile. But this political mobility, specific to cyberspace, raises a significant question: how long can cyberspace keep hegemonic ideological constructions out, especially if its potential is defined against them? To put so much stock in the online body rather than the material one is to hope against hope that hegemonic culture doesn’t find its way into the world of cyberspace. In my opinion, these are awfully big hopes, perhaps too big. It seems that an ideological invasion is simply a matter of time. Sue-Ellen Case, in The Domain-Matrix: Performing Lesbian at the End of Print Culture, sees Stone as significantly privileging the virtual

over the material here. Case goes even further, saying that cyberspace, “cut loose from the bodily referent, then, operates like a universal(izing) class, practicing the privilege of abandoning the local.”14 If “bodies” in cyberspace are “cut loose from the bodily referent,” and if the material body seems to be where hegemonic norms are written, then it comes as no surprise that not only transgendered bodies, but also various other culturally marginalized bodies, find haven on the Internet. (A point to which we will return in our discussion of AIDS and online sex.) Obviously, as a transgendered body herself (she writes also as Sandy Stone), Stone has a lot of personal investment in her argument. The transgendered body, stigmatized in physical reality, finds haven and/or becomes “naturalized” in cyberspace. It is ironic, however, that the way for this body to become naturalized is precisely contingent on its very disembodiment. Stone defines disembodiment not as an absence of real bodies (though that absence surely is implied), but as the absence of the “cultural signification” of the real body (as it is signified by the voice and/or physical appearance). This raises another question: what will happen when cultural signifiers such as “voice quality” and “physical appearance” - qualities that Stone reads as inherently locatory and constraining - find their way into cyberspace? Furthermore, what will happen when technological innovations such as the microphone and the digitized image are able to locate the voice and physical appearance of cyberspace’s infinite subjectivities? So, if anybody can be anyone s/he wants to be in cyberspace, it is precisely because no one can see or hear who anybody physically is. In other words, it doesn’t matter in cyberspace whether you really are who you claim to be, or whether you are just playing an identity.
So this raises another question with regard to Stone: how much is Stone’s notion of the transgendered body being naturalized in cyberspace bound to the very ability of cyberspace to, in effect, make these bodies culturally invisible, in the sense that no one can physically see or hear their very transgenderedness? In other words, if we can be who we type, if we can type ourselves across genders and ethnicities (what critic Lisa Nakamura, who is not transgendered herself, pejoratively calls “identity tourism”), then isn’t cyberspace universalizing; or rather, isn’t cyberspace making invisible the very marginalized bodies that it is supposed to, according to Stone, create a haven for?15 In other words, isn’t the reason that cross-gendered activity is the norm in cyberspace precisely that no one knows whether the person they’re chatting with is playing transgendered or actually is transgendered?16 Furthermore, if transgenderedness is the natural identity online, why was there such an uproar in the virtual BBS over Julie, the “cross-dressing psychiatrist” that Stone, Turkle, and so many other theorists write about?17 A male psychiatrist virtually “cross-dresses” as an older woman called Julie, and is so beloved by her virtual chat-community that when her/his “real” identity is revealed, there is an overwhelming feeling, among the community-members who had trusted her, of “mourning” and “violation.”18 Stone reads Julie’s story as an opportunity to reiterate her claim that all identity is performative - i.e., “Is it personae all the way down? Say amen, somebody.”19 That may well be the case (and I think queer theorists and feminist theorists would have something to say here), but this still doesn’t account for Julie’s online community-members’ “violent” response. In other words, it may only be acceptable for one to virtually “cross-dress” when there is a concomitant illusion that one isn’t cross-dressing at all. In this way, virtual cross-dressing subjects are only subverting hegemonic constraints (in Julie’s case, changing his/her “real” gender) to the extent that they are fooling, or making themselves invisible to, that hegemony. In effect, in order for Stone’s and Turkle’s theories of virtual identity mobility to function, they must be based on illusion, steeped in narratives of fantasy, theatre, and role-playing disguised as reality. This trope of disbelief is one thing in role-playing environments such as MUDs and MOOs, but quite another in virtual chat-rooms, where there has always been an illusion of reality, as exemplified in the canonical Julie story. So in essence, after Julie’s story makes virtual cross-dressing visible, the paradigmatic question in chat becomes: is this person, with whom I’m chatting, really who s/he says s/he is? Now let’s turn our attention to how this question has been radically modified by new technologies that have, in effect, (re)corporealized virtual space, and paved the way to online “sex.”

4. The Digital Image as Vehicle for “Real” Sex

It is easy to become wrapped up in the excitement about the radical “post-humanist” and “post-embodiment” ways in which the pioneering theorists of the new media have theorized subjectivity. By doing so, however, one overlooks some central things. Most important, as alluded to before, these pioneering theorists were writing in the late 1980s and early to mid 1990s, when the Internet operated radically differently than it does today. Graphically, the Net was black-and-white and text-only. In other words, it was much easier to escape one’s body and play with one’s identity when one didn’t have to upload one’s picture onto the Net. Also, these theorists seem unable to account for the increasing popularity of dating and sex websites in the last few years. I would posit that the main reason for this radical transformation of web-culture (especially chat) has

been the digital image and its ubiquitous integration into a more graphical computational interface (around the mid-1990s). Although no theorist comments upon how digitized imaging technologies changed virtual chat cultures (and cyberspace in general), let’s examine how a net user is likely to construct his/her identity online today: uploading a good picture of yourself, coming up with a great catch-line to make people want to chat with you, listing your interests, affiliations, et al. (not to mention those pesky behind-the-scenes things such as providing authenticated email addresses, phone numbers, credit card information, etc.). This version of identity construction is particular to Internet dating sites, but if we look at generic chat sites, such as America Online and MSN Network, this kind of identity formation isn’t relegated to only dating sites; it has pretty much become the norm. So what exactly does uploading a picture of yourself into your chat user-profile signify, in terms of identity construction? We intuitively know that Stone’s and Turkle’s paradigm of multiple online identities doesn’t quite work anymore. If chat’s paradigmatic question once was - Are you really who you say you are? (or as cyber-critic Daniel Tsang so convincingly puts it, “all the posted information should be taken with a grain of salt”) - then it seems that the paradigm has shifted due to the digitized image, and this question isn’t as necessary as it once was.20 The digital photograph becomes a locating device, a specific technology of location, an anchor that serves to bind the virtual actor to his/her “real” (or, at least, supposedly so) body. The task at hand is to reconfigure this question in order to reflect this paradigm shift. Clearly an epistemological shift in chat has taken place.
Cybersex is no longer purely a text-based mode of communication between two “anonymous” people, in which, as cyber-critic Gareth Branwyn says, “these encounters rarely carry over into face-to-face meetings.”21 On the contrary, as I’ve stated previously, it is estimated that over one-third of all Internet users are using some kind of dating service in order to enact face-to-face contact. And the way you form your user-profile clearly matters, or, as it is so succinctly put by a dating-site user: “Pictures and profiles are turning into art forms.”22 Now, let’s look at how the image has been configured as a figure of constraint by several key theorists. If disembodied space and multiple identities signify mobility, a world untied to bodily referents, then obviously the image signifies constraint, and some kind of real-world referent. Slavoj Zizek, in The Plague of Fantasies, calls the photograph the “medium of immobilization.”23 To Zizek, invisibility connotes blindness and mobility, and visibility connotes immobility. So, if we apply these formulations to identity on the web, when the web was solely text-based, there was

extreme mobility and a correlating metaphorical blindness (i.e., Are you really who you say you are?). Now, as identity construction is more and more based upon a digitized image, identity is made “visible” and therefore immobile. In other words, older notions of “identity tourism” and “multiple identities” are no longer apt. A critic like Daniel Tsang has accounted for the introduction of the digital image into the user-profile, but he still maintains these older notions of identity-play: “Thus, with a keystroke, one can change one’s biographical particulars, e.g., ethnicity, age, domestic partnership status, class, or even sexual orientation.”24 I would have to disagree with Tsang here. If I have a picture of myself in my user-profile, and I common-sensically look like a 24-year-old Caucasian male with dark hair, then I guess I can put that I am a different age or ethnicity, but only insomuch as it could be believable. I can’t very well type that I’m a woman, as there would be a clear indication (in the digital image in my user-profile) that I am not. One might object that one can manipulate digitized photography (in Photoshop or other similar software), or that one can upload any picture onto the web in the place of one’s own. So, some would say, there is still no way of knowing who is who on the net. The issue here basically becomes trust. As Peter Lunenfeld argues in his article, “Digital Photography: The Dubitative Image,” as photography becomes more and more mutable, “the public is forced to trust in the source of the image.”25 In essence, the question becomes: do I believe that the image and user-profile correspond to the “real” person? At the same time, however, one doesn’t have the same “healthy paranoia” online (as Tsang puts it) as one used to have when it was purely text-based (i.e., Is this person real?). This is because there’s something about the photograph that people believe as truth.
As Roland Barthes states in Camera Lucida, “From a phenomenological viewpoint, in the Photograph, the power of authentication exceeds the power of representation.”26 In other words, a person’s photograph, much more than actually being able to represent reality, makes the viewer believe in its reality. To illustrate this point more fully, let us return to the later paradigmatic scene from Bringing Down the House, the film with which I introduced this argument. If we remember, the plot revolves around Steve Martin’s online courtship of a woman whom he believes he knows, simply because he’s seen her picture online. He expects the white woman he sees online, but is shocked, to say the least, when the African-American Queen Latifah knocks at his door. Latifah enters to Martin’s dismay, and repeatedly tells Martin that she is the woman (username: “lawyergirl”) with whom Martin has been chatting for so long. Martin denies this fact, arguing that this is not the woman of whom he has a picture (the picture that Martin has saved to his desktop, quite unlike the woman at his

door, is of an older, blonde white woman). Martin repeatedly avows that Latifah is not the woman he knows from hours of online chatting. Then the trickiest part comes. Latifah proves that she is Martin’s “lawyergirl,” revealing a blurred image of her likeness in the upper-right-hand corner of Martin’s photograph, the picture in which the older blonde woman is centered. “There! I told you that was me in the picture!” This is a true Barthean moment. Again, the “power” of the photograph authenticates not so much Latifah’s identity, but rather Martin’s belief in the phantom identity of his white-woman date. In other words, Martin (and millions of other online daters), in effect, says: I can see her, so she must be a real person! It is not so much that the photograph constrains Latifah’s identity (in fact, her identity remains quite mobile), but rather it is Martin’s trust in the photograph of the white woman in the center that immobilizes her identity. Or, as Zizek says, “it is only immobility that provides a firm visible existence.”27 When users see one firm identity and it is “proved” with a photo, trust usually follows. Also, the stakes of the so-called virtual game have changed as well. As I have stated, the stakes are no longer identity play; the stakes now have become using the Internet to hook up physically (whether dating or sexual). So, in essence, why not trust the image? You’re going to find out eventually whether s/he’s real or not anyway, when you meet the person! So if the stakes have changed from identity play to hooking up, then this entire shift can be based, in large part, on the digitized photograph. Even if cyber-daters still want to “get to know” each other before physically meeting, isn’t the first thing they see before chatting with anybody this person’s photograph in the user-profile?
Isn’t this why nearly all of these dating/sexual websites (from lavalife.com to match.com, to barebackcity.com, ad infinitum) are set up so that the first thing we encounter is a photographic “slide-show” of potential dates or sexual partners? Isn’t physical appearance (or at least how it is signified in a photograph) the main criterion that matters, as we continuously click forward to the next “pic” until we find somebody we like? Essentially, the paradigm is no longer - Is this person who s/he says? - but has now become - I would hook up with that person!28 If we go back to Stone, we see that her account is not prepared to deal with the user’s digitized image as constraining his/her online identity. Sue-Ellen Case thoughtfully summarizes Stone’s argument here: “Multiplicity is confounded simply by the practice of referencing a body.”29 And what is the photograph if not (a) confounding to multiplicity (in Zizek’s notion of the photograph as immobilizing), and (b) “referencing a body”? Stone says that what a disembodied cyberspace uncouples from its subject referent is the “voice quality” and the “physical

image.” Thanks to new technologies of location, the physical image seems to be accounted for via the digitized image, and the voice quality can be accounted for via microphones. So, the corporeal body, at long last, re-enters cyberspace, not only in the figure of the image, but more significantly, as the Internet is being used so heavily today as a vehicle for physical meetings/sex. So, now the question becomes: how are the public and the theorists constituting this “new” virtual space?

5. Taking the Cyber out of Sex: The Politics of Cybersex

Virtual chat has always been linked to notions of cybersex. Not long after the first asynchronous BBS boards were built in the late 1970s, the idea of using them for some type of cybersex experience took shape. From the earliest instances of Minitel (the first “chat” system in France, “devolving” into sex-chat), to narratives of cyber-prostitution on MUDs such as LambdaMOO and Habitat, it seems apt to say that from the beginning, one of the main uses of Internet chat (both asynchronous and synchronous) has been for some kind of cybersex. So then, how has cybersex been culturally encoded?30 What’s interesting, when we look at older notions of “cybersex” (that is, strictly “writing sex,” in the words of Sue-Ellen Case), is to see how cybersex as a cultural phenomenon has been politicized. AIDS is the pervasive trope here.31 In fact, much of the original appeal of text-based cybersex, at least according to critics, was the fact that it was safe, disease-free sex, with no exchange of body-fluids. Numerous critics comment in this fashion. Critic Branwyn argues in the early 1990s that cybersex itself as a concept was invented as the “ultimate safe sex for the 1990s, with no exchange of bodily fluids, no loud smoke-filled clubs, and no morning after.”32 The metaphors of AIDS and infection are not subtle here. Another critic who talks about cybersex in almost redemptive terms is Nina Wakeford. In her article, “Cyberqueer,” Wakeford cites J.D. Dishman’s formulation that the “origin for the desire for a sexual use of cyberspaces [i]s connected to the impact of HIV and AIDS.”33 Dishman goes on: Certainly, with respect to infectious disease, there can be no safer interactive sex than having it with someone with whom contact exists only through electronic pulses. Cybersex is safe sex.34 Furthermore, Wakeford also cites an early 90s advertisement in The Advocate: “‘There are no closets in cyberspace.’”35

Jeremy Kaye

171

However erroneous the idea that cybersex originated in order to provide AIDS sufferers with a sexual outlet, these formulations probably reflect the ideological moment of the mid-1990s more than the actual reality of cybersex. AIDS, of course, is the ultimate metaphor for constraint in our culture, as it signifies death, moral depravity, and panic. In this way, cybersex is how AIDS sufferers can become mobile again. In other words, how much do these paradigms of cybersex reflect the further ideological marginalization of AIDS sufferers? In effect, dominant ideological discourse wants to relegate their sexual practices to virtual space, so that hegemonic culture doesn't have to worry about the threat of contamination. In her treatise on disease, AIDS and Its Metaphors, Susan Sontag calls this type of marginalizing rhetoric the "language of political paranoia, with its characteristic distrust of a pluralistic world."36 So, the ideological formulation of fluid-free cybersex as fomenting mobility among AIDS sufferers (and of course its implied references to queer and other marginalized subcultures), in reality, does nothing more than reinscribe the further stigmatization of the always already culturally stigmatized. What the old paradigm of anonymous cybersex bespoke was a visionary escape from the body, an embracing of identity-mobility and plurality, in which culturally marginalized groups could find haven. As mobility (formulated first as identity-play and hacking, and later as physical hook-up) is seen by hegemonic culture as subversive, this subversive mobility has been culturally coded as infection, in an attempt at ideological containment. Fast-forward a few years. The Internet has changed. Cybersex is old-fashioned; real sex is now the norm. So, in effect, how does dominant ideology manage to constrain this mass mobilization of sex?
The same way it did ten years before: by formulating cybersex as a way for the infected to have "sex" without contaminating "us." Except this time the re-encoding of hegemonic norms is not so subtle. On the contrary, it is quite explicit. (1) A front-page USA Today article from February 2003 reads: "Resurgence in HIV infection rates feared." The article reports that the government blames a "marginal" increase in the number of AIDS cases on the "use of the Internet to meet potential sex partners." It then cites a study of nearly 3,000 men who surf chat rooms on gay Web sites, which found that 84% met sex partners online and that 64% of them had high-risk sex. An official for the Centers for Disease Control cites "fading memories of the early epidemic, illicit drug use and treatment optimism," along with the Internet, as reasons for the increase.37 (What's interesting to note here is that this is a front-page USA Today article,


When Identity Play Became Hooking Up

articulating a "marginal" (in its own words) increase in the AIDS population.) (2) A controversial Rolling Stone article from the same month preys on cultural stereotypes like "fading memories" and promiscuous gay men. In "Bug Chasers: The men who long to be HIV+," Gregory A. Freeman "exposes" another subculture: the "gay" men who are using the Internet to have unprotected sex with as many other men as possible, in order to spread AIDS among their own population.38 Freeman specifically discusses Barebackcity.com, a website that has garnered much media attention of late. Barebackcity.com is a gay website that forbids any use of its pages for anything other than "quick," "anonymous" sexual encounters. Freeman takes activities like these as a possible site for cultural mobility: "It's empowering. They're no longer victims waiting to be infected."39 The rhetoric here, Freeman argues, is essentially that this population is already demonized by the broader culture as disease-ridden anyway, so they may as well have the disease to go along with the stereotype.40 There's no doubt that websites like these are problematic, but to talk about them only in a kind of overblown-sensationalist-media-frenzy is not the answer. It only reinscribes notions of panic and fear of contamination onto these always already contaminated sub-cultures. What's interesting to note here is that in early 2003, barebackcity.com garnered this media attention, and by late in the year the website could not be found on the Internet. In effect, websites like these are becoming virtual versions of "bath-houses": blamed for the epidemic, and quickly shut down one by one by the powers that be, in an ideologico-political campaign that basically says: if we just take these websites away, the problems and stigmatization will go away too.
(3) Frontiers One Magazine, a gay publication, responded to the Rolling Stone article in "Chasing the Bug Chasers." It doesn't manage much better than the Rolling Stone piece, as it basically demonizes "these men" who use the Internet for these subversive activities, while tapping into the cultural logic of fear and contamination that notions of the virus already instantiate.41 (4) "The Internet is one big closet." A gay comic said this recently in a performance at the Riverside Municipal Auditorium.42 He first introduced the idea, jokingly, that if you're gay, you should be able to "get laid" on the Net within a few minutes. After the initial joking, he became very serious and said: "There are real Infectors out there, and they are spreading the virus over the Internet. The Internet is one big closet."43


6. Conclusions

Taken together, these instances are forming a profitable ideological moment, precisely because they are citing old stereotypes of queer populations culturally coded as bug-chasers and infectors. Essentially, this is a return to the mid-1980s moral panic over AIDS. These mainstream media sources are using this new Internet space to (re)assert dominant ideological culture's inherent distrust and fear of the kind of sexual mobility that the Internet has paved the way for. The conflation is complete. Network viruses are becoming "real" viruses, and the hegemonic cultural logic of infection, contamination, and fear is ruling the recent discourse over cyberspace. But what isn't as culturally visible is the fact that nobody is talking about the "heterosexual" use of the Internet for sex. As I have stated several times before, one-third of the people on the Net are using it to have sex (both heterosexual and gay populations). But why is it that the group that is most visible is also the culturally marginalized (e.g., this is typified in the wide publicity that the gay website Barebackcity.com has received recently)? Andrew Calcutt, in White Noise: An A-Z of the Contradictions in Cyberculture, exemplifies these reified cultural fears and hegemonic norms best: "For all its immunity to physical penetration, the online world offers no protection against ideological intercourse."44 It just seems unfortunate that the ideological intercourse now invading cyberspace consists of the same old notions of fear and risk. In these instances, at least, cyberspace is becoming more and more global in Case's sense of a "universalized space," which seems to be creating a further balkanization of oppressed groups.
Case locates social change in "bodies rather than in that universalized space."45 But how can virtual space, which has already been culturally coded as erasing bodily particularities in favour of such universalizing notions as the global or post-national subject, suddenly be inscribed with a true acceptance, and not just a disembodied, utopian one? When are theorists going to account for the changes inherent in this new virtual space, a space that has become a vehicle for physical sex? If they don't comment soon, they risk losing the "new" cyberspace to an overblown, sensationalist media. I have one hope, too. Instead of putting all of our hope for identity mobility, social change, and cultural subversion and resistance into utopian notions of a disembodied cyberspace, I will echo Case here and argue for making "real" life (i.e., material reality) the dynamic site for multiplicity and change. Finally, let's return to Stone, who closes The War of Desire and Technology by saying:


As we stand together at the close of the mechanical age, in the ruins of a system of visual knowledge whose cultural purpose was to ground and authorize sovereign subjectivity, that such vampires do exist is for me the challenge and the promise of virtual systems.46

As these subcultures, these vampire cultures, become more and more culturally visible (contrary to the invisibility supported by the old paradigm of cyberspace), they are being stigmatized as "unnatural" by the very "system of visual knowledge" that this "war of technology and desire" was supposed to leave in ruins. ("Visual" is important here as it signifies not only the visuality introduced by digitized photography, but also ideologically panoptic media coverage.) In the end, these historical shifts must be accounted for before the "social apparatus of vision," as Stone calls it, reasserts its domination and claims cyberspace as its own; before the "wrench" in the social apparatus begins simply to add more fuel to the ideological fire.

Notes

1. Allucquère Rosanne Stone, The War of Desire and Technology at the Close of the Mechanical Age (Cambridge, Massachusetts: MIT Press, 1995), 113.
2. "Love at First Site," The Guardian, 30 June 2002, (10 March 2003).
3. The "notorious" website I am referring to is a gay website, barebackcity.com, which during early 2003 came under a firestorm of controversy. The website promoted promiscuous, unprotected sex, and some blamed it (and websites like it) for a "supposed" spread in the AIDS virus. We will look at this website in more detail later in the paper. What's particularly interesting is that sometime later that year barebackcity.com was taken off the web, perhaps due to the media firestorm.
4. The conference was titled Interactive Convergence: Research in Multimedia, and was held 7-9 August 2003 at Anglo-American College in Prague, Czech Republic.
5. An audience member during the presentation of this paper in August 2003 in Prague, in fact, echoed these sentiments, saying that the very thing she disliked most about online communities "nowadays" was the fact that the first thing people ask for is your picture.
6. Steve Sternberg, "Resurgence in HIV infection rates feared," USA Today, 12 February 2003, sec. A, p. 1+.
7. Haraway's groundbreaking article, "The Cyborg Manifesto," was first published in 1983; Stone's book came out in 1995, as did Sue-Ellen Case's and Sherry Turkle's. A majority of the articles in The Cybercultures Reader (several key articles from this volume are cited here) were written in the middle 1990s. Mark Poster's What's the Matter with the Internet was published in 1999.
8. Sherry Turkle, Life on the Screen: Identity in the Age of the Internet (New York: Touchstone, 1995), 180.
9. I choose Stone's book as the object of my critique, not only because I feel that she is representative of many of the theoretical trends in new media studies, but also because I feel that she is quite critical of many of these trends as well.
10. Stone, 91.
11. Ibid., 180-81.
12. Ibid., 92.
13. Ibid., 181.
14. Sue-Ellen Case, The Domain-Matrix: Performing Lesbian at the End of Print Culture (Indianapolis: Indiana UP, 1996), 120.
15. Lisa Nakamura, "Race In/For Cyberspace: Identity tourism and racial passing on the Internet," in The Cybercultures Reader, eds. David Bell and Barbara M. Kennedy (London: Routledge, 2000), 712.
16. I realize this is an essentialist argument, but so it goes.
17. Indeed, the Julie narrative is a staple in cyberculture theoretical discourse. For more, see Stone, 65-83.
18. Stone, 78-9.
19. Ibid., 81.
20. Daniel Tsang, "Notes on Queer 'N' Asian Virtual Sex," in The Cybercultures Reader, eds. David Bell and Barbara M. Kennedy (London: Routledge, 2000), 433.
21. Gareth Branwyn, "COMPU-SEX: erotica for cybernauts," in The Cybercultures Reader, eds. David Bell and Barbara M. Kennedy (London: Routledge, 2000), 397.
22. "Love at First Site."
23. Slavoj Zizek, The Plague of Fantasies (London: Verso, 1997), 87.
24. Tsang, 433.
25. Peter Lunenfeld, "Digital Photography: The Dubitative Image," in Snap to Grid: A User's Guide to Digital Arts, Media, and Cultures, ed. Peter Lunenfeld (Cambridge: MIT, 2000), 61.
26. Roland Barthes, Camera Lucida: Reflections on Photography (New York: Hill and Wang, 1981), 89.
27. Zizek, 87.
28. A minor explanation I feel is necessary. When presenting this section at a conference, a peer raised a valid point that she was always surprised how different people look in person, as opposed to their online picture. Another peer retorted that the issue becomes "trust." I.e., do you trust your online chatting-partner? This is precisely my point here. It essentially doesn't even matter if the user and his/her picture are the same person; the mere fact that there is a picture that can be seen online alleviates the fear of meeting an unknown person in the flesh. Consider the scene described above from Bringing Down the House (2002), in which the protagonist is excited to meet his online "girlfriend" because he "knows" what she looks like (at least according to the photo). Compare this scene with something like You've Got Mail (1998), in which pictures are not exchanged, so even though they are destined to be together, both main characters feel tremendous anxiety before their first meeting.
29. Case, 119.
30. Many theorists discuss these examples of Minitel and LambdaMOO. The examples here come specifically from Stone, 99-121.
31. Many scholars make the link between AIDS and cybersex. In fact, some scholars posit that computer viruses were originally modeled after the AIDS virus. For example, see Andrew Ross, "Hacking Away at the Counter-Culture." It's interesting to note that both the computer virus and the AIDS virus first gained wide cultural notice in the early 1980s.
32. Branwyn, 398.
33. Nina Wakeford, "Cyberqueer," in The Cybercultures Reader, eds. David Bell and Barbara M. Kennedy (London: Routledge, 2000), 410.
34. J.D. Dishman, "Digital divas: defining queer space on the information superhighway" (paper given at Queer Frontiers: The Fifth Annual National Lesbian, Gay and Bisexual Graduate Student Conference, 1995), 23-6, quoted in Nina Wakeford, "Cyberqueer," in The Cybercultures Reader, eds. David Bell and Barbara M. Kennedy (London: Routledge, 2000), 410-11.
35. Wakeford, 411.
36. Susan Sontag, AIDS and Its Metaphors (New York: Farrar, Straus and Giroux, 1988), 18.
37. Sternberg, sec. A, p. 1+.
38. Gregory A. Freeman, "Bug Chasers: The men who long to be HIV+," Rolling Stone, 6 February 2003. Found online at
39. Ibid.
40. Ibid.
41. John Caldwell, "Chasing the Bug Chasers," Frontiers One Magazine, 28 February 2003, 14-5, 38.
42. Bruce Daniels, Stand-Up Comedy Performance at Riverside Municipal Auditorium, 21 February 2003.
43. Ibid. It's interesting to compare this statement to the ad in The Advocate cited earlier: "There are no closets in cyberspace." The logic here has clearly been turned on its head.
44. Andrew Calcutt, White Noise: An A-Z of the Contradictions in Cyberculture (New York: St. Martin's, 1999), 111.
45. Case, 120.
46. Stone, 183.

References

Barthes, Roland. Camera Lucida: Reflections on Photography. Translated by Richard Howard. New York: Hill and Wang, 1981.

Branwyn, Gareth. "COMPU-SEX: erotica for cybernauts." In The Cybercultures Reader, edited by David Bell and Barbara M. Kennedy, 396-402. London: Routledge, 2000.

Calcutt, Andrew. White Noise: An A-Z of the Contradictions in Cyberculture. New York: St. Martin's, 1999.

Caldwell, John. "Chasing the Bug Chasers." Frontiers One Magazine, 28 February 2003, 14-15, 38.

Case, Sue-Ellen. The Domain-Matrix: Performing Lesbian at the End of Print Culture. Indianapolis: Indiana University Press, 1996.

Daniels, Bruce. Stand-Up Comedy Performance. Riverside Municipal Auditorium, 21 February 2003.

Freeman, Gregory A. "Bug Chasers: The men who long to be HIV+." Rolling Stone, 6 February 2003. (10 March 2003).

"Love at First Site." The Guardian, 30 June 2002. (10 March 2003).

Lunenfeld, Peter. "Digital Photography: The Dubitative Image." In Snap to Grid: A User's Guide to Digital Arts, Media, and Cultures, edited by Peter Lunenfeld. Cambridge, Massachusetts: MIT Press, 2000.

Nakamura, Lisa. "Race In/For Cyberspace: Identity tourism and racial passing on the Internet." In The Cybercultures Reader, edited by David Bell and Barbara M. Kennedy, 712-720. London: Routledge, 2000.

Sontag, Susan. AIDS and Its Metaphors. New York: Farrar, Straus and Giroux, 1988.

Stone, Allucquère Rosanne. The War of Desire and Technology at the Close of the Mechanical Age. Cambridge, Massachusetts: MIT Press, 1995.

Tsang, Daniel. "Notes on Queer 'N' Asian Virtual Sex." In The Cybercultures Reader, edited by David Bell and Barbara M. Kennedy, 432-38. London: Routledge, 2000.

Turkle, Sherry. Life on the Screen: Identity in the Age of the Internet. New York: Touchstone, 1995.

Wakeford, Nina. "Cyberqueer." In The Cybercultures Reader, edited by David Bell and Barbara M. Kennedy, 403-15. London: Routledge, 2000.

Zizek, Slavoj. The Plague of Fantasies. London: Verso, 1997.

Jeremy Kaye is a graduate student at the University of California, Riverside. His research interests include issues of performance, with an emphasis in cyberculture and new media theory.

Advising Student Software Development Teams for Maximum Performance

Randy S. Weinberg and Jennifer A. Tomal

Abstract: Developers of multimedia software systems typically work in teams composed of people with diverse backgrounds and skills. Creating successful software systems requires people who know how to work effectively on teams and who understand software development processes and methodologies. To meet the need for individuals trained in software development methodologies and teamwork, Information Systems programs are adding team-based project courses to the curriculum. With little training or experience in teamwork, student teams often lack the insights needed to effect good teamwork systematically. Our research indicates the need for effective faculty mentoring, or advising, of student software development teams. Student team members expect their faculty advisors to be skilled in technology and process. Since teaching team-based project courses may be an unfamiliar activity for many faculty, we report on the coaching needs most frequently mentioned by student team members during our interviews, and on general guidelines for effective coaching based on research in team facilitation.

Introduction

Software development organizations need people who are skilled in many dimensions, including software development methodologies, project management and teamwork. Development of successful systems is a complex activity requiring the skills of a variety of people working effectively together toward common goals. It is typically the task of teams, requiring effective communication, coordination, planning and skilful project management. Yet, in practice, developers often work in loosely coordinated teams and create products that do not meet stakeholder expectations. Indeed, numerous studies have documented the high costs and failure rates of large software development projects. One well-known report reveals that only 26% of such software projects were completed on time, within budget, and with expected functionality [1]. Most of these failures are attributed not to technical factors, but to organizational, people and teamwork factors. Having studied over 500 software development projects, DeMarco and Lister report that, for the overwhelming majority of failed projects, "there was not a single technological issue to explain the failure." [2]


Research in organizational dynamics, project management and software development consistently demonstrates the need for effective peopleware. Synthesizing research from the last 15 to 20 years, McConnell states "… we now know with certainty that peopleware issues have more impact on software productivity and software quality than any other issue." [3] While individual ability and motivation certainly remain key factors, teamwork issues are now recognized as critical to project success. In various studies cited by McConnell, the productivity of software teams shows differences of 3, 4 and 5 to 1 [3]. There is little doubt that effective teamwork leads to lower product costs, higher product quality and better outcomes for project stakeholders. However, in general, software development teams and their managers do not pay enough attention to their own inner workings. Team members, project managers and project sponsors do not, in general, recognize the central importance of effective teamwork to overall project success. Intentional reflection on teamwork and team practice is not a common activity in many software development organizations, and rewards and recognition for effective and efficient teamwork are rare. Instead, the easier approach of measuring and rewarding individual performance or relative technical contributions is the norm. Too often, with minimal or no training in the art of teamwork, system developers must learn or deduce, through first-hand experience, how to make their work teams effective and efficient. Accepting McConnell's claim that most software practitioners have never worked on a high quality project in a high quality development organization [4], it is reasonable to surmise that most developers do not learn the right lessons about teamwork.
In fact, they may actually adopt or accept counterproductive and unhelpful habits as "best practices." While effective teamwork is a prerequisite to successful project completion, there are many barriers to achieving it. Good teamwork is not an obvious accomplishment; many problems can arise that seem resistant to easy solutions. Most of these problems are human and cultural in origin, rather than issues of technology. Yet, despite the increasing recognition of its role, all too many software developers still have not learned the art of teamwork, nor developed an appreciation for its central importance. Through our work with the Information Systems faculty and over 50 student software development teams at Carnegie Mellon University, we have found that effective teamwork increases team members' commitment to the project, their sense of accomplishment and their overall satisfaction with the project experience. We have identified key factors influencing student team success and developed guidelines for effective faculty coaching of student development teams. With improved coaching we hope to see


improved teamwork, student satisfaction, project outcomes, and the overall course experience for students, instructors and project clients.

Team-Based Coursework

Responding to the need for skilled developers, academic programs in Information Systems, Multimedia Design and Computer Science are increasingly offering team-based project courses to their students. Faculty are recognizing that a key part of students' education in project-based courses should be to experience effective teamwork and to appreciate how their project results are influenced by their teams' practices. These team-based courses are intended to expose students to the challenges and issues, both technical and organizational, that they will face in the marketplace. In these courses, students learn and practice the principles of project management, teamwork and communication, in addition to the design and programming skills they seek to sharpen. They learn, first-hand, the fundamental importance of effective teamwork to the overall success of any project. They live with the realities of ambiguous requirements statements, tight deadlines, constraints on resources, and working with diverse teammates and project stakeholders, all while being expected to produce high quality work products. Student developers, as novices, have much to learn about creating complex software systems. The first important lesson is that there is much more involved in a team-based software development project than simply creating technical designs and writing computer code. First-time team members who are competent as programmers or designers usually do not realize how the nature of the work changes when they work with others, or that the overall project depends critically on effective teamwork. For inexperienced student developers, interdependencies and teamwork problems can become intractable and easily spoil project outcomes.
While it is possible, and desirable, to provide some teamwork training, many students in project courses are simply assigned to project teams and expected to perform their roles. Without appropriate direction and facilitated introspection, students may or may not learn productive teamwork skills by "doing," and may not be particularly insightful or thoughtful about what they've learned. At Carnegie Mellon University, undergraduate students majoring in Information Systems participate in two team-based software development project courses - one during the junior year and the second during the senior year. Our goal is that every IS student will have a "good" to "excellent" experience working on a software project team while in school - ideally, taking the discipline, process, useful practices and team citizenship lessons with them into the workplace. As juniors, IS students first complete a semester-long, team-based project course in which they learn the basics of team citizenship, practice


disciplined software process, and are held responsible for high quality work products and overall performance. To prepare for the vagaries of working with a real client on a real project, students are immersed in a substantial, self-proposed project of interest to the team. These teams are challenged to create more complex systems than any they have completed individually. They create complex software systems spanning a wide range of applications, such as multimedia applications, online voting systems, events and entertainment calendars, electronic auctions, electronic photo albums and a variety of online information and decision support systems for non-profit and charitable organizations. They quickly learn that they have to rely on each other to plan, design, manage and implement these projects on time. They must find ways to make decisions, resolve conflicts, reach agreements, and deal with problems and project risks. They must manage their time and collective work products effectively, produce high quality deliverables, communicate successfully among themselves, and communicate effectively with the course instructors and other interested stakeholders and end users. Students mature considerably in this first course, developing a first-hand appreciation for the importance of teamwork, process, communications and attention to quality. The senior level project course involves working with a "live" project client in the community, usually a non-profit or charitable organization. In this course, student teams are expected to propose, design and implement an appropriate solution to their client's need for better information or management support. In spite of having received some basic training in teamwork and development process, however, unsuspecting students often encounter more difficulties than anticipated.
Working with a range of personalities, opinions, capabilities and team members' quirks creates interesting challenges and opportunities for all team members. In short, students must learn to work together as a team; the ambitious course projects simply cannot be completed without good teamwork and effective process.

The Challenge of Teaching Teamwork

Teaching team-based project courses is an interesting challenge for faculty members. These courses typically have multiple objectives for students:
• To develop a right-sized software system to meet an organizational need for information or decision support,
• To learn the principles of good project management and software development process, and
• To learn the basics of effective teamwork and working with people on a real problem.


Faculty find that teaching team-based project courses is quite different from teaching traditional classroom courses. While the experience can be highly rewarding, it can also be labour intensive for faculty, with uncertain outcomes for students and project stakeholders. Student expectations of the course, their teammates and the faculty are uncertain. As a result, substantial faculty engagement, in novel ways, is critical to ultimate outcomes. Attentiveness to student teamwork and process can make the difference between highly successful outcomes and those less so. Most college faculty do not have extensive experience with team-based courses, yet they are expected to advise teams on the variety of domain and teamwork issues that arise through this rigorous process. In addition to being the disciplinary expert, the advisor must guide the teams in their teamwork. They must explain methods and expectations as well as provide the guidance to help teams meet stated course and project expectations successfully. The faculty advisor serves as a key resource person who helps teams improve their effectiveness, processes, decision making, problem solving and project management. The advisor guides and helps the team monitor its internal workings, quality of work products and progress; identify and solve problems or avoid potential ones; and generally helps the team achieve its goals. Hilburn and Humphrey [5] emphasize the need for faculty teaching such courses to define, communicate and enforce project and process expectations, so that students know clearly what is expected in terms of work products, documentation, quality and deliverables. An important part of the faculty's role is to assign grades. The faculty needs to track and assess team and individual progress, process and performance.
Evaluation of student work in project courses typically includes weighting several factors: meeting expectations; demonstrating learning and mastery of teamwork and process; the quality of team meetings, reports and the project itself; client grades and feedback; and peer evaluations. Structuring and organizing team-based project courses for undergraduate students also poses interesting and difficult problems for faculty. Given the constraints of the academic calendar, projects must generally be completed within one semester. While one semester may often not be sufficient to develop a full-featured, stable solution, keeping teams together for longer periods of time is usually not possible. Assigned or selected projects must be interesting and sufficiently challenging or demanding to maintain student interest and motivation throughout the semester. Project scope must be carefully considered and realistic. Fundamentally, students must learn that executing large projects requires more than computer programming. Students in our project courses follow a well-defined, phased development life cycle model that paces and

184

Advising Student Software Development Teams

____________________________________________________________ guides them throughout the semester. The model emphasizes planning before development, project estimation and tracking, project metrics, clearly defined milestones and deliverables, product quality, user centred design, testing and documentation, individual and team accountability. Ultimately, most students in these courses come to realize the value in the required process activity and project documentation. Students learn that attention to process, teamwork as well as programming and coding are key ingredients to overall outcomes. They learn that articulating project requirements involves discussions with stakeholders and eliciting information from multiple sources and clearly documenting findings. Finally, students must learn that effective teamwork is the foundation for overall project success; it is a necessary condition to complete the project and meet course expectations. Students assigned to work on software teams also generally have limited experience in team-based work. Common problems occur and can be anticipated when students work on teams: coordinating busy schedules, finding common meeting times, finding productive workspace, conducting effective meetings, coordinating work products, applying effective project management and holding each other accountable for individual and team performance. Working through these issues is important to teams expected to deliver large, working projects during a single semester. All of these areas need attention, and appropriate practices are not usually obvious to students new to teamwork and rigorous process expectations. Even simple communication among team members can be complicated by unproductive practices and inattentiveness to requests for information and participation. To promote satisfying outcomes for team members as well as project stakeholders, emphasis on teamwork and process should be essential components of project courses. 
In these courses, process and teamwork are as important as the product. As Schwarz [6] says, "The processes and structures used to carry out the work maintain or enhance the capability of members to work together on subsequent group tasks. The group experience, on balance, satisfies rather than frustrates the personal needs of group members." The instructor's challenge is to bring out the best in teams and in the individuals within them. Merely allowing, or encouraging, student software development teams to work on their own and expecting superior results is unrealistic. Rarely is talent, technology or access to resources the cause of a team's inability to reach its potential. Student teams do need mentoring, advising and appropriate intervention when things don't go well. Process and teamwork don't necessarily come naturally to students. Supervised experiences can help students derive the most benefit from their project courses. Good mentoring helps turn teams' creative endeavours into successful projects.
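The multi-factor evaluation scheme described earlier - weighting milestone reports, project quality, client feedback, peer evaluations and teamwork process - amounts to a weighted average. As a loose illustration only: the factor names, weights and scores below are hypothetical, since the chapter does not publish its actual weighting.

```python
# Illustrative sketch of a multi-factor course grade. The factors echo those
# listed in the text; the weights and scores are invented for illustration.

WEIGHTS = {
    "milestone_reports": 0.25,   # scores on project milestone reports
    "project_quality": 0.30,     # quality of the delivered system
    "client_feedback": 0.20,     # client grades and feedback
    "peer_evaluations": 0.15,    # peer evaluations within the team
    "teamwork_process": 0.10,    # demonstrated teamwork and process
}

def course_grade(scores):
    """Weighted average of factor scores, each on a 0-100 scale."""
    assert abs(sum(WEIGHTS.values()) - 1.0) < 1e-9, "weights must sum to 1"
    return sum(WEIGHTS[factor] * scores[factor] for factor in WEIGHTS)

example_scores = {
    "milestone_reports": 85,
    "project_quality": 90,
    "client_feedback": 78,
    "peer_evaluations": 92,
    "teamwork_process": 88,
}
print(round(course_grade(example_scores), 1))
```

Publishing such weights to students up front is consistent with the recommendation, later in the chapter, that grading criteria be known before work is graded for credit.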

Randy S Weinberg & Jennifer A Tomal


Team coaching is best done on a regular basis. In our project courses, teams have weekly meetings with their faculty advisors. The faculty member conveys the notion that these are not passive sessions. Students are required to prepare for these meetings with full status reports, agendas, lists of questions, problems and concerns, and a presentation of current work products. During these meetings, faculty advisors help teams assess their progress and teamwork issues. Active probing can also reveal issues needing further attention or remediation. Team conflicts, lack of communication and coordination, and lapses in leadership, team citizenship and performance can surface. A challenge for the faculty is knowing when and how to intervene, how to keep the teams cohesive, and how to keep them moving forward. Finally, one-on-one sessions with individual team members are always available at the request of either the students or the faculty advisors.

Team Success Factors

To identify factors important for student team success, and related advising strategies, representatives of over fifty junior- and senior-level student Information Systems project teams were interviewed at various points throughout their semester-length, team-based projects during the 1999-2003 academic years. Focus groups of students were also organized at various points to provide additional input. Thirty-two interviews with the faculty teaching the courses were conducted, primarily during the first year of the study. These interviews provided useful information about student team dynamics and feedback on the courses. Student interviews were correlated with measures of team performance: scores on project milestone reports, expert panel review of final projects, and course grades. This information has helped us improve the experience for student team members.
Transcripts of the interviews were imported into a text analysis software package and analyzed for patterns and themes. Interestingly, the findings were consistent from year to year and from cohort to cohort. Five main factors influencing team success emerged from these interviews: leadership, attitude, cohesion, use of external resources, and communication [7]. Secondary, less influential factors emerged as well: early identification and clarification of project vision and scope, information and requirements gathering capabilities, and the ability to manage client relations effectively. The teams that ultimately achieve the best results integrate these factors into their work in positive ways that enhance performance; less effective teams benefit less from them. Recognizing this, faculty advisors look for these key points throughout the project and help teams develop in these areas.
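The correlation step of the study - relating factors coded from interviews to measures of team performance - can be sketched in miniature. The ratings and scores below are entirely invented, and the original analysis was done with a dedicated text analysis package; this only shows the shape of the computation.

```python
# Hypothetical illustration of correlating one coded interview factor
# (e.g. cohesion, rated 1-5 per team) with final project scores.
# All data here is invented; the chapter does not publish raw numbers.

def pearson_r(xs, ys):
    """Pearson correlation coefficient of two equal-length samples."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var_x = sum((x - mean_x) ** 2 for x in xs)
    var_y = sum((y - mean_y) ** 2 for y in ys)
    return cov / (var_x * var_y) ** 0.5

cohesion_ratings = [2, 3, 3, 4, 4, 5, 5, 1, 2, 4]        # one rating per team
project_scores = [68, 74, 71, 82, 85, 91, 88, 60, 65, 80]

r = pearson_r(cohesion_ratings, project_scores)
print(f"r = {r:.2f}")  # strongly positive for this invented data
```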


1. Leadership

Positive and effective leadership is the single most important success factor on the student project teams we studied. Good leadership can help create a sense of vision and team enthusiasm. Effective leaders facilitate individual and team performance - they bring out the best in their teams and motivate them to higher performance. Effective team leaders come to understand each individual's potential and coordinate efforts for maximum effect. They keep the project on track and help the team articulate its shared expectations. They push their team-mates' thinking, confidence and performance, and they raise team members' expectations of each other. They direct and motivate the team's efforts through example and persuasion; typically, therefore, the best students make the most effective project managers and team leaders.

Team leaders accept responsibility for various aspects of the project or its management. Leaders may be formally appointed - as Project Manager, Quality Assurance Manager or Design Manager - or may emerge informally in other roles, such as technical, design or documentation leaders. These key team members either manage a portion of the project or volunteer for a particular role and take leadership of some aspect of the project. They coordinate team-mates' efforts to get things done on time while meeting performance expectations. Team leaders ensure that everyone knows exactly what is expected, when it is expected and how it all fits into the overall project plan. Effective team leaders share information fully, so that everyone is aware of all progress and problems on the project. They manage so that everyone on the team has the opportunity to contribute to the overall effort in meaningful and personally appropriate ways.
They promote shared ownership of work products and shared accountability. On the most effective teams, every team member accepts leadership responsibility for some component of the team's work - such as documentation, building or research - and thereby contributes in positive, meaningful ways to the project's outcomes. Team leaders on the most successful teams learn how to delegate responsibility effectively. Even the most technically skilled leaders must learn to delegate difficult tasks and hold everyone accountable, transferring responsibility and accountability for work products to the other team members. The most effective team leaders are willing to deal with team problems and problem individuals, sometimes with help from the faculty advisors.


2. Attitude

Attitude is a key factor in team performance; it plays a major role in the ability of effective teams to get things done. Talent and experience alone are simply not enough to create good team outcomes. A positive attitude can raise the potential of a team; a negative attitude can drain it. A team's positive attitude goes beyond mere enthusiasm or the desire to do a good job: such teams find solutions to problems, get things done and make good use of resources. Team members with good attitudes motivate others. Teams with a strong “can do” attitude who accept responsibility for their performance typically do the best job on their projects and report the most effective teamwork and the highest satisfaction with the project experience.

Attitudes can be infectious. A good, positive attitude among team members inspires others; a poor, negative attitude saps the team's morale. Negative attitudes, especially coming from team leaders, wreak havoc on team morale and performance. We now know that even one person can diminish any team's enthusiasm and outlook, and literally spoil a project. Maxwell writes about the "law of the bad apple" and states it plainly: "rotten attitudes ruin a team" [8]. McConnell [3] likewise writes that the single most common complaint of software developers about their managers was the latter's inability to deal with the destructive influence of a "problem person." Naturally, students bring their attitudes, work habits and "baggage" with them to the project teams. Meeting (or failing to meet) team members' expectations and one's individual commitments is a clear demonstration of a team member's attitude. Students need to learn, or acknowledge, that their behaviours and attitudes have consequences.
A good attitude, coupled with responsible behaviour, breathes hope and enthusiasm into the team, rather than the gloom that a bad attitude leaves.

3. Cohesion

In his study of thirty-one software projects, ranging in duration from six to fourteen months and in size from four to eight developers, Lakhanpal [9] reported "that group cohesiveness contributed more to productivity than project members' individual capabilities or experience did." This finding also applies to the student teams we studied. On highly cohesive teams, all members feel fully engaged and share in project ownership. They have an articulated or implied vision and set of values that helps them achieve their goals. They have a strong sense of identity, and they understand that a shared perception of joint ownership is a key indicator of project success. Cohesion enhances a team's ability to work together, to identify major problems, to discuss them and reach a common understanding about the problems and the needed solutions, and to keep enthusiasm high.

Teams with good cohesion seem to enjoy their work; they work together well and focus on the tasks at hand. They trust each other, work hard to solve problems and share in all key aspects of the project. On highly cohesive teams, team-mates feel that they are part of something special. A common, shared sense of success and performance helps cohesive teams keep on track throughout their projects. Team members hold each other accountable for team performance and the on-time delivery of quality work products. They are proud of their work and of each other. They look forward to showing off their accomplishments to peers, clients, faculty and other stakeholders. They are confident and know they are capable of doing high quality work. They bond, often socially, and feel a sense of unity. Respect and trust underlie their interactions with each other and with project stakeholders. The combination of a “can do” attitude with a high level of cohesion makes these teams more effective than the others.

Cohesive groups also typically have high enthusiasm for their projects. Members of cohesive teams support each other and develop an important sense of shared responsibility and ownership of the project. They see their efforts as supporting the team rather than advancing their own individual goals. They work through problems more readily than teams with less cohesion. In our study, cohesive teams reap a "synergy bonus" [10]. Early team successes contribute to a sense of achievement and often increase the team's confidence and energy. Members of cohesive teams are motivated to contribute their fair share of the team's work consistently. Teams lacking cohesion, or a shared vision and values, often experience the opposite effect - early setbacks discourage team members, lower morale and decrease the team's energy and confidence.
While some teams can naturally recover from setbacks, many cannot. Effective intervention or coaching after a setback can help the team rally and recover lost energy and enthusiasm. Team members often appreciate help in understanding the causes of a setback, and assistance in crafting realistic solutions is usually welcomed.

4. Communication

Good communication is fundamental to team performance and outcomes. "Communication increases commitment and connection; they in turn fuel action." [8] Communication among team members, and between team members, faculty and project clients, influences teams' effectiveness - for better or worse. High performance teams understand the importance of good communication; without it, teams cannot work together and the results can be disastrous. Effective teams pay attention to interpersonal communication - they learn how to conduct effective meetings, share work products, and keep each other up to date on progress and potential project risks. They talk and listen to one another. They know how to volunteer information, ask questions, ask for assistance, and provide advice, feedback and criticism. Communication has many dimensions: maintenance of regular, routine communication; conflict management; joint problem solving and decision making; and the management of meetings. Effective teams know the importance of providing everyone with current, accurate, timely information and deliverables. They remain fully aware of overall project status and of potential or emerging performance problems. They practice (or invent, when necessary) methods to promote communication so that members have access to the information they need. They typically use face-to-face conversation, email, cellular telephones, instant messaging, online group spaces, discussion boards and project management tools to facilitate communication.

5. Use of External Resources

Closely related to good communication is the ability to make use of external resources. The most successful teams turn to external resources for help when they realize that they have a problem they cannot readily solve by themselves. They make good use of their project clients, faculty advisors and other human resources, and conduct independent research to find solutions to their problems. The most successful teams are quick to implement suggested solutions and, if a problem is not resolved, to request additional assistance. Teams also benefit when they are honest about project status and potential problems. In software projects with firm deadlines, any potential problem or slippage poses a serious risk of underperformance. Seeking early help with difficult issues is an effective risk management strategy.
Teams who view their advisors as sources of help and information can reduce the likelihood that small problems turn into large problems later in the project.

Student Feedback

A clear pattern of student needs emerged from our interviews with student teams, follow-up course evaluations and various survey instruments. Meeting these needs can help make the project experience more manageable and clear. The issues the students identified matter to them, and successful team advisors take them into account. These factors help us direct our advising efforts to the areas that students find most important and relevant.


1. Clearly communicate expectations and instructions

Providing clear, unambiguous instructions about course goals, expectations and grading criteria gives students a realistic picture of what they need to do. This information should include details and expectations about course policies, life-cycle processes, teamwork, documentation, assessment and grading criteria, peer evaluations, and the level of expected instructor involvement. The roles of the instructor and the team members should be described. Ambiguous or ill-defined goals lead to ambiguous, unsatisfactory outcomes. Ground rules for appropriate and inappropriate behaviours, reporting requirements and accountability, as well as individual and team performance measures, are necessary.

2. Comprehensive, frequent feedback

High quality, consistent and frequent feedback is essential. It reassures students when they are on track and alerts them when they are not. Students appreciate knowing that their work products and processes are reviewed by their faculty. They appreciate timely feedback so that they can make appropriate adjustments in process or seek additional help if needed. Observing how teams respond to feedback and criticism of their work products and process can also be very helpful to faculty advisors actively looking for ways to promote higher performance.

3. Latitude, but within limits; intervention when needed

Students want to be treated as competent professionals and encouraged to solve their own problems. Student teams like to solve problems themselves; they usually enjoy the challenge. Teams need to develop independent thinking and action; they must take responsibility for their actions and understand the consequences of their decisions. Allowing team members to make mistakes, and helping them learn from the consequences, is important.
There can be many levels of complexity and ambiguity in project work; appropriate intervention and coaching can help unravel issues without obvious solutions. The students on the teams we studied are technically skilled and want the chance to act as software professionals, but they also understand that their inexperience may contribute to team problems and process inefficiencies. Teams do expect their faculty advisors to guide them through unfamiliar or difficult issues and to teach them how to accomplish goals for themselves.

4. Accessible, engaged and helpful faculty

Teams expect their advisors to be involved and to be expert in process and technology. Even with a "can do" attitude, students may have a variety of problems at all phases of the project that require attention and help. Faculty coaches should listen carefully to team issues and be patient with teams that have problem members or vague feelings of unease. Faculty should be available to help teams and individuals articulate their issues and formulate questions. Students who do not get help when needed soon stop asking, sometimes concluding that their faculty coach either does not care or lacks the skills needed to help them.

5. Technical resources

Even the best student programmers and designers still have technical questions. We clearly inform our students that the faculty will not solve their technical problems, but will help guide them towards workable solutions. Nonetheless, having good, solid technical resources available is important to help teams solve their problems and avoid being stuck too long on technical issues.

6. Consistent, fair assessment and grading

Students appreciate knowing how their work will be evaluated before it is actually graded for credit. Grading consistent with the stated course objectives is important. Check sheets, published to students, facilitate consistency and objectivity. Examples of work at various levels, together with the feedback given, are helpful.

Discussion and Conclusions

In team-based software development courses, there are multiple objectives and expectations for performance. Students must deliver a high quality system that meets or exceeds the needs and expectations of its users, key stakeholders and instructors, within the constraints of time and available resources. Equally important, team members must learn the fundamentals of practical project management, software process, teamwork and communication. They must learn to work with each other in productive and supportive ways to achieve these outcomes. Effective team dynamics is a key success factor in student software teams' ultimate outcomes. It is a necessary (but not always sufficient) condition for successful project completion.
To promote effective outcomes for student software developers and their project stakeholders, we continue to investigate differences among teams. How is it that teams seemingly with the same potential for achievement or effectiveness realize such different outcomes? We now recognize that team issues such as leadership, attitude, cohesion, use of external resources and communication make the difference between the most and least successful teams. Pinpointing these key areas, and identifying positive intervention strategies, helps us focus our interactions with student teams.

Our students look forward to working in teams. They understand that "real" software is produced by teams working in organizations. They come to understand the need for effective teamwork, process, and attention to quality and detail. Students learn that their success depends on teamwork factors as well as on mastery of technology. They appreciate that experiencing teamwork is an important and legitimate educational endeavour. They generally enjoy their team experiences and rate their team-based project courses very highly.

References

1. Standish Group, Inc. CHAOS: A Recipe for Success. Research report, 1999; see www.standishgroup.com.
2. DeMarco, Tom and Lister, Timothy. Peopleware, 2nd ed. New York: Dorset House Publishing, 1999.
3. McConnell, Steve. Rapid Development. Redmond, WA: Microsoft Press, 1996.
4. McConnell, Steve. "The Business of Software Improvement." IEEE Software 20 (July-Aug. 2002): 5-7.
5. Hilburn, T. B. and Humphrey, W. S. "Teaching Teamwork." IEEE Software 19 (Sept.-Oct. 2002): 72-77.
6. Schwarz, Roger M. The Skilled Facilitator: Practical Wisdom for Developing Effective Groups. San Francisco, CA: Jossey-Bass Publishers, 1994.
7. Tomal, Jennifer A. "A Factors Approach for Studying Success on Student Software Development Teams." Doctoral dissertation, University of Pittsburgh, Pittsburgh, PA, 2001.
8. Maxwell, John C. The 17 Indisputable Laws of Teamwork. Nashville, TN: Thomas Nelson Inc., 2001.
9. Lakhanpal, B. "Understanding the Factors Influencing the Performance of Software Development Groups: An Exploratory Group-Level Analysis." Information and Software Technology 35(8) (1993): 468-473.
10. Sundstrom, E., De Meuse, K. P. and Futrell, D. "Work Teams: Applications and Effectiveness." American Psychologist 45(2) (1990): 120-133.


Randy S. Weinberg ([email protected]) teaches Information Systems at Carnegie Mellon University, Pittsburgh, Pennsylvania, USA. Jennifer A. Tomal ([email protected]) is an assistant professor teaching Emerging Technologies at Slippery Rock University, Slippery Rock, Pennsylvania, USA.

Use of Interactive Multimedia to Improve the Lecturing Experience

Clive Chandler

1. Introduction

Multimedia computer systems combine sound, image and text in an interactive style to provide more effective communication. To utilise multimedia fully, lecturers need the flexibility to arrange and organise its various components in order to incorporate them into their presentations. Unfortunately, most multimedia material comes pre-packaged or is limited to what a presentation software package provides, which means there is no way of adapting or modifying the multimedia to suit a particular presentation style. As a result, the take-up of multimedia in the lecturing environment has been slow at best.

There has been a large body of work on how to utilise multimedia within the field of education, much of it focused on the self-directed learning approach (1, 2, 3). Similarly, a large amount of effort has been spent on how humans learn and on the cognitive abilities, needs and goals of the higher education student. This work culminated in a description of seven key principles for developing good practice with students in the USA (4). In the 1990s a body of work highlighted the application of these principles to the generic pedagogy of student learning. At the same time, within the Human Computer Interaction arena, theories of how humans learn and interact with any kind of interface were being refined and developed. This field combined cognitive science and psychology with multimedia design, graphics design and information architecture. As early as 1993 researchers were proclaiming that the advent of multimedia would herald a new paradigm in higher education pedagogy; unfortunately, that paradigm shift resulted only in the use of the World Wide Web both as a delivery tool and as a means of fostering student interaction. None of these approaches seems to have addressed a fundamental flaw in the current drive for a suitable higher education pedagogy.
Most universities in the UK - and, according to Thomas (5), in the USA - still rely on the lecturing approach: an expert (or gifted orator) stands in front of a class of students and attempts to disseminate knowledge over a period varying from 30 minutes to 2 hours. Yet the body of pedagogical research argues that this particular route offers limited appeal and "success" from the students' point of view. In addition, most lecturers would agree that their use of multimedia within these lecture periods is at best limited to animations (usually via Microsoft PowerPoint), video clips and sound bites. As such, the whole experience is somewhat passive. This paper attempts to explore the various fields, and to draw out from those experienced in these genres the fundamental goals and pedagogy, in order to improve the situation and provide a vehicle for informed discussion.

2. Pedagogical Overview

According to St Edward's University Centre for Teaching Excellence (6), a lecture is an extended presentation in which the instructor presents information in an organized and logically sequenced way. It typically results in long periods of uninterrupted teacher-centred expository discourse that relegates students to the role of passive spectators in the college classroom. They argue that lecturing is the most common form of presenting information in colleges and universities, and maintain that the pedagogy for this approach is based upon theories of the structure and organisation of knowledge, the psychology of meaningful verbal learning, and ideas from cognitive psychology associated with the representation and acquisition of knowledge. They maintain, however, that this approach is only valid for presenting material that is unavailable elsewhere or too complex, rather than for disseminating knowledge to a large number of students. This view is supported by researchers such as Thomas (5), who argues that teaching practices in 1998 across the USA often revolved around the lecturing process. He goes on to cite references which argue that this approach to the learning process within the university is flawed and relies upon too much teacher-centred delivery (7, 8, 9, 10). Roper (11) argues that there is little or no student-student interaction, and that teacher-student interaction is often brief and impersonal. Hooks in 1994 (9) maintains that in a traditional classroom students learn as isolated, independent individuals, concluding that "current teaching practices must change". Since 1997, researchers have argued that the use of the internet would produce the required paradigm shift in student learning (12), and researchers such as Baker (13) accordingly argue for a pedagogy of computer-mediated learning.
She quotes from a work by Gillespie that computer technology has rarely produced significant instructional changes in higher education and that we “first use new technology in old ways” (14). The drive at that stage was for an online learning approach claiming advantages such as efficient use of both instructor and students time; global access and interactivity; promotion of reflective thought with use of online discussion groups; timely feedback between instructors and learners etc. (13) However Baker goes on to quote some of the obvious

Clive Chandler

197

disadvantages at that time i.e. fear of change; lack of understanding of instructional technology etc. Bogdanov in 1999 (15) discusses active learning tools which promote active learning such as email, listservs, forums, chat rooms, online videoconferencing and simulated learning environments. In 2002 Ehrmann (16) argues that the drive for new technology may well be based on the wrong premise i.e. “Many advocates of technology want to improve current teaching. But too often they fail to ask whether traditional education is teaching the right content”. He goes on to quote a well know paradigm “Any undergraduate can tell you that grades are the key to interpreting the mysteries of higher education – Faculty give you high grades when you learn what they value” He goes on to argue that we are now focussing so much on learning outcomes and grades that we miss the fact that medium isn’t the message. He concludes by describing the “Flashlight” project which aims to evaluate the strategy used in higher education and offers three main goals for “what matters” Not the technology per se but how it is used Not so much what happens in the moments when the student is using the technology, but more how those uses promote larger improvements in the fabric of the student’s education Not so much what we can discover about the average truth for education at all institutions but more what we can learn about our own degree programs and our own students. 3.

Internet Innovations It may well be that the so-called traditional approach to the learning process still needs altering. Certainly in the authors experience in higher education for the last six years there have been no real innovative leaps. We as educators have taken the advent of the internet in our stride and many of us utilise the benefits of online resources for our students. We provide extra information via our own web site or a reading list which includes web site materials. We provide online tutorial, schedule, email and registration possibilities. Yet as such the internet itself has not been the mainstay of the students learning experience. Chong & Saukauchi (17) describe an approach to creating and sharing web notes, whilst Owen in 1996 discusses how to integrate the Web into undergraduate education (18). Gordin et al describes a learning environment approach for k-12 students to build communities for learning (19) . More interesting is the article by Ebersole (20) discussing the uses and gratifications of the web amongst students. All of these approaches serve to illustrate the directions in internet technology for learning. Indeed there are many other references which could be cited and the paper could go on to discuss the current virtues of the online self-directed learning approach, however all of this research, interesting and worthwhile though it may be,


Use of Interactive Media to Improve Lecturing

seems to leave our humble educator in somewhat of a dilemma, with a difficult question to answer: is lecturing dead?

4. Lecturing as Entertainment

According to Tannenbaum (21):

"People have been communicating with each other for hundreds of thousands, perhaps millions, of years. Whenever possible, the initiator of the communication has employed whatever additional methods were available to enhance the communication and make it as effective as possible. So, for example, modern storytellers, as perhaps ancient ones did, use their hands to illustrate the action and create sound effects to emphasize or portray more realistic scenes. As further examples, stage plays and their derivatives, such as opera and movies, normally include costumes and scenery to enhance and further the communication."

So-called modern presentations can mix two or more media and can be considered multimedia presentations. Typical examples include a poetry reading with music, or a theatre play with photographic images as part of the script or backdrop (e.g. Oh What a Lovely War). Whichever multimedia event one thinks of, the goal remains to elicit a reaction from the audience, to engage them in some fashion, in order to achieve a vehicle for communication. The question, then, is whether we have lost the art of performance in our lecturing. As mentioned earlier in this paper, pedagogical researchers argue that we have become more teacher-centred in our delivery, and as such we could argue that we have moved away from performance-related delivery of information. Tannenbaum argues that this analogy between performance and teaching has validity; however, although multimedia is interactive, those interactions are primitive and cannot match those achieved by an actor with an audience. He concludes that developers will not be able to achieve the actor's control of the audience in multimedia programs until more sophisticated artificial intelligence is achieved. Given the premise that the lecture is akin to a performance, we can consider tools which would enhance the performance.
Ideally we would prefer to engage our audience in a meaningful interaction; however, given the nature of the lecturing process, this is not always easy to achieve. Unless our experts become gifted orators, we cannot hope to match the gifted actor in terms of audience engagement. With the advent of multimedia authoring systems and the ability to manipulate multimedia objects, perhaps our lecturing could be further enhanced by employing a multimedia strategy to elicit a meaningful engagement with our students.

5. Multimedia

From its earliest forms, educators have attempted to enhance the lecturing process by utilising multimedia. In most cases the drive was to produce a CD-ROM of the actual lectures themselves, which is of course another solution promoting self-directed learning. Over the years, further research has incorporated multimedia elements within the lecture process (23, 24, 25). Bates (26) puts forward a view of multimedia education which argues that we should cater for all learning styles, and that the lecture could well be either outmoded or delivered via the Internet on demand. This view is supported indirectly by the work of Blank (27), who discusses the adaptation of multimedia for different learners. Some of these approaches, whilst holding merit, still fall short of the performance paradigm and instead lean towards the self-directed learner approach already mentioned. The poor lecturer then either has to "put up" with the current approach, or else move to a web-based (distance?) learning environment or to self-directed learning, whereby the lectures may be recorded on video with appropriate notes for the student to learn at their own pace. Practical experience with today's university undergraduates tends to highlight the flaws in these approaches. It is almost impossible to substitute for the lecture environment (even in a virtual lecture theatre): the student-to-student and student-to-lecturer interaction would be at best simulated, and in most cases missing. It is the author's belief that the absence of the lecturing approach is in fact a detrimental step in the arsenal of tools for the undergraduate learner. What is more interesting and relevant, in my opinion, is the advent of technologies and aids which enhance the performance elements within the lecturing genre. One such technology is agents.

6. Agents

With the advent of Microsoft Agent technology (28), a new generation of multimedia interfaces and web sites has been designed. Agent technology is an attempt to produce an animated character which can be scripted to interact with its audience or learner. The figure below illustrates the standard animated characters, available as a free download from Microsoft:


Figure 1: Microsoft Agent characters (Merlin, Genie, Peedy, Robby)

In addition to these standard characters, a number of royalty-free websites offer characters for particular situations (29), and there are royalty-free scripting programs, such as MASH (30), which generate JavaScript for use in web technology. Presentation tools such as VoxProxy (31) or XtraAgent (32) enable agent programming within PowerPoint and Macromedia Director. The use of these characters within interface design and web presentations has been the subject of a number of research papers. According to Andre (33), animated characters (agents) have been used to imitate the skills of human presenters in knowledge-based systems. She argues that they are of great benefit in the self-directed learning arena because they can convey conversational signals, such as waiting for a turn or awaiting feedback; similarly, they can engender confidence through facial expressions and body movements. She goes on to cite empirical studies showing that users experience presentations given by animated characters as more lively and engaging (34, 35). It is precisely this approach which may well lead to the development of agent-assisted lectures.
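To illustrate the scripting model involved, the following is a minimal sketch in the spirit of the JavaScript that tools such as MASH generate. It does not use the actual Microsoft Agent ActiveX API (which ran only in Internet Explorer); the AgentScript class and the action names are hypothetical stand-ins that simply record what a real character would be asked to do.

```javascript
// Hypothetical sketch of a scripted agent sequence (not the real MS Agent API).
// Each step names a character, an action, and an optional line of speech.
class AgentScript {
  constructor() {
    this.steps = [];
    this.log = [];
  }
  add(character, action, text) {
    this.steps.push({ character, action, text });
    return this; // allow chaining, as scripting tools typically do
  }
  run() {
    for (const { character, action, text } of this.steps) {
      this.log.push(text ? `${character} ${action}: "${text}"` : `${character} ${action}`);
    }
    return this.log;
  }
}

// A short opening sequence for a lecture, in the style described above.
const intro = new AgentScript()
  .add("Merlin", "Show")
  .add("Merlin", "Play Greet")
  .add("Merlin", "Speak", "Welcome to today's lecture on multimedia interfaces.")
  .add("Peedy", "Show")
  .add("Peedy", "Speak", "And I will be demonstrating the examples.");

console.log(intro.run().join("\n"));
```

The real scripting tools work along similar lines: a linear list of show/play/speak commands per character, which is what makes them accessible to an "experienced novice" lecturer.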


Much of the work of Andre and other researchers, such as Rist (36), Bates (37) and Blumberg (38), is directed at the self-directed arena, in an attempt to endow the characters with emotional behaviour and distinct, predefined personalities. Another interesting study is that carried out by Rickenberg and Reeves (39), in which they analysed students' reactions to an animated assistant known as Einstein. The character was an agent built with Microsoft Agent technology (28), capable of various animated expressions, as illustrated in the figure below.

Figure 2: Einstein character

The students reacted to the character's expressions during a web-based learning exercise, which constituted at least a basic engagement. Andre (33) maintains that such a single-character approach in presentations yields little engagement, and in her research investigates the use of multiple character interactions, illustrated by various applications including the "inhabited marketplace", in which the interacting characters convey information about specific cars.


Figure 3: Inhabited marketplace

Other examples of agent technology include the personal tutors Herman the Bug (34) and Steve (40). Another adaptation of this approach is the utilisation of agents under human control via a handheld PDA (41). These approaches are designed to engage the user or student in the application and to develop a good user experience. What seems to be missing is any attempt to apply these innovations to the lecturing process, although, interestingly, at the end of Andre's work she does argue that it would be interesting to combine the use of agents with human performers or presenters.

7. Agent Assisted Lecture Process

Given the above discussion of the use and apparent acceptance of agents within interface design and web information, it seemed prudent to investigate their use in a simple lecturing setting. Advanced Multimedia for the Web and CD is a final-year module taught to an average of 60 students via a tutorial and a one-hour lecture per week. The lecture theatre used has full multimedia projection equipment controlled via a PC. In order to investigate the possible enhancement of the lecturing process, Microsoft Agent characters were used in a PowerPoint presentation, facilitated by the VoxProxy software. The figure below illustrates the VoxProxy interface:

Figure 4: VoxProxy software for use in Microsoft PowerPoint

The initial exposure took place during a lecture on the use of advanced sound, animation and agents in multimedia interfaces. This was deemed appropriate, as the topic itself was the use of these animated characters, and it therefore required no seriously in-depth programming of agent responses. Taking on board the work carried out by Andre et al, rather than utilising a passive agent interaction, the initial interaction was


between three of the available agents, in a general discussion about the agent technology itself. In addition, elements such as character-to-character conflict, interruption and human-to-character interaction were included. The agents were controlled in several ways:
- by mouse clicks (useful for human-to-character interaction);
- by programming based on specific character phrases or appearances;
- by VoxProxy's built-in animations and emotional reactions.
The use of these methods enabled study of the differences between agent-to-agent interaction, in which we as the audience were passive, and human-to-character interaction, in which, by careful staging, we appeared to be an interactive audience. The students' reactions were noted by observation and by questioning, both after the lecture and at the end of the module as a whole. In addition to the lecture interaction, the students were exposed to the use of these agents during two interactive tutorial sessions, one with MASH and the other using XtraAgent with Macromedia Director. Again, their responses were noted both by observation and by one-to-one questioning. Finally, the students were exposed to the use of agents within the area of design, by utilising the agents in a CD-ROM interface (semester 1 assignment) and in the same interface modified for web delivery (semester 2 assignment). These experiences and observations were designed to indicate how easy it would be to incorporate such agents in presentations and demonstrations from the experienced-novice viewpoint. This is essential because, were the lecturing process shown to be improved by the use of such multimedia, it would be necessary to ascertain how difficult it would be for faculty staff (experienced novices) to utilise the approach.
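The second of these control methods, triggering one character's behaviour from a phrase spoken by another, can be sketched as a simple event-driven script. The following is a hypothetical, simplified model (it is not VoxProxy's or Microsoft Agent's actual scripting interface): each reaction is keyed on a phrase, which is enough to stage the kind of scripted interruption used in the lecture.

```javascript
// Hypothetical phrase-triggered dialogue model (not a real VoxProxy/MS Agent API).
// A reaction fires when a spoken line contains its trigger phrase,
// which is enough to script a character-to-character interruption.
const transcript = [];
const reactions = [];

function onPhrase(trigger, character, reply) {
  reactions.push({ trigger, character, reply });
}

function speak(character, line) {
  transcript.push(`${character}: ${line}`);
  for (const r of reactions) {
    if (line.includes(r.trigger)) {
      transcript.push(`${r.character} (interrupting): ${r.reply}`);
    }
  }
}

// Merlin's mention of "parrots" cues Peedy to interrupt on schedule.
onPhrase("parrots", "Peedy", "Did somebody mention parrots?");

speak("Merlin", "Agent characters include wizards, robots and parrots.");
speak("Genie", "And each has its own animation set.");

console.log(transcript.join("\n"));
```

Because the interruption is keyed to a phrase rather than to a fixed time, the lecturer can ad-lib around the script and still have the "spontaneous" exchange fire at the right moment.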

8. Discussion

Initial usage of multimedia objects within the lecture programme at Staffordshire was met with interest, but also some boredom. Students liked the idea of video clips, humorous animations and the like, and equipment demonstrations, especially in the mobile environment, were also popular. The results of enhancing these approaches with Microsoft Agent technology were interesting: although at the outset one might be dubious about the use of these agents in a passive-watcher environment, the lecture proved to be a great success, in both the passive and the scripted interactive formats. Students found the lecture more enjoyable than a "normal" lecture, and felt that they were less "talked at" and more "involved" than in other lecturing styles. This would seem to bear out other researchers' work on user satisfaction. The difficulty, it would seem, is how much interaction between characters is beneficial in the


transmission of the information. To properly ascertain the enhancement gained by such an approach, it would be necessary to carry out a true comparison, with a parallel group of students exposed to a non-agent presentation and both groups completing a post-questionnaire aiming to test their learning as well as their "enjoyment" of, and involvement in, the process. Similarly, the fact that students were exposed to authoring packages which could create animated characters seemed to develop more enthusiasm for their application in the final interface. The MASH program proved the most popular and successful. XtraAgent, utilised in Macromedia Director, was found to require more programming knowledge, and as such was deemed by the students to have a very steep learning curve (consistent with the Macromedia Director application itself). It was disappointing, however, to view the students' final works, in both the CD-ROM-based and the web-based prototypes. The use of the agent technology was crude, to put it mildly, and mainly consisted of a single character welcoming the user to the site or application. In one case, however, the student had utilised the agent to introduce the application's tools, consistent with the uses of agent technology discussed by other researchers. On the whole, there would appear to be much mileage in the use of these characters, in an interactive way, to enhance the lecturing process. The issue, however, is how much of a performance the lecture should be, and how to gain some interaction with the students. Improved command-and-control technologies have now made possible speech recognition from multiple voices within an auditorium. There is then the possibility of allowing students to interact with the agents directly, within either a PowerPoint presentation or a multimedia application produced for the purpose of lecturing.
To this end, Macromedia has recently announced the introduction of a tool called Breeze®, designed to integrate PowerPoint slides within a Macromedia Director project. This opens a wealth of possibilities for the multimedia designer to add content to, and enhance the delivery of, lecture material. It also raises another issue: lecturing staff. Academics will claim they are already overworked, and the additional burden of multimedia presentation production is considerable; indeed, many would argue that they would not bother. As such there is a need, if this approach finally proves successful, for a simple authoring environment, or templates of interaction, which could be as simple to use as PowerPoint or other delivery applications. The goal is still to achieve the engagement of audience participation, but to make it both interesting and "fun": in other words, we are still searching for a suitable "edutainment" approach in education. TV shows such as Who Wants to Be a Millionaire now employ handsets to deliver so-called audience participation; perhaps there is a


possibility of utilising such interfaces within the lecturing system, again not to replace our expert but to enhance the delivery of the material. We are still in search of the "right" approach, but this particular enhancement may well feature in the final solution.

9. Conclusions

- According to many researchers, traditional lectures are too teacher-centred.
- Multimedia can assist in engaging students in self-directed learning applications.
- The use of agent technology has proven successful in self-directed learning processes.
- Initial enhancement of the lecture approach using Microsoft Agent has proved successful.
- Experienced novices found scripting tools much more useful than a programming approach.
- There is still a lack of knowledge about the use of this technology to enhance presentations.
- More work is essential to provide case studies for multimedia enhancements.

10. Further Work

During the next semester, further work on the utilisation of the Microsoft Agent approach will be carried out in a series of modules, from level 1 to the final year for undergraduates, and in specific Masters-level modules. It will also be necessary to develop a more rigorous metric to analyse the effect on learning and the "enjoyment" factor. Further enhancements still need to be investigated, such as the use of command-and-control voice software. It is also hoped that a more strictly scripted lecturing "performance" can be developed and investigated. Similarly, other faculty members will be encouraged to try out the software packages, although it is envisaged that, as with other multimedia technologies, take-up will require a gradual groundswell.

Notes

1. E. Adams et al, "Interactive multimedia pedagogies", ACM SIGCSE Bulletin, Proceedings of the 1st Conference on Integrating Technology into Computer Science Education, Vol 28, Issue SI, June 1996.
2. T. Arndt, S. K. Chang, A. Guercio and P. Maresca, "Education and Training: an XML-based approach to multimedia software engineering for distance learning", Proceedings of the 14th International Conference on Software Engineering and Knowledge Engineering, July 2002.
3. K. Coninx, B. Daems, F. Van Reeth and E. Flerackers, "Design and realization of an interactive multimedia server in education", ACM SIGCSE Bulletin, Proceedings of the 2nd Conference on Integrating Technology into Computer Science Education, Vol 29, Issue 3, June 1997.
4. Chickering & Gamson, "Seven Principles for Good Practice in Undergraduate Education", AAHE Bulletin, 1987.
5. M. Thomas, "Aligning Student Development Theories with College Classroom Pedagogy and the Implications for Student Affairs Professionals", ED587, 1998, http://members.aol.com/MattT10574/ED587paper.html (accessed 7/7/2003).
6. "The Positive and Negative Uses of Lecturing", St Edwards University CTE, Austin, Texas, 2002.
7. C. R. Christensen et al, "Education for Judgement: The Artistry of Discussion Leadership", Boston: Harvard Business School Press, 1991.
8. K. A. Bruffee, "Collaborative Learning: Higher Education, Interdependence, and the Authority of Knowledge", Baltimore: Johns Hopkins University Press, 1993.
9. B. Hooks, "Teaching to Transgress: Education as the Practice of Freedom", New York: Routledge, 1994.
10. P. Love & A. G. Love, "Enhancing Student Learning: Intellectual, Social, and Emotional Integration", ASHE-ERIC Higher Education Report Series, Vol 24-4, 1996-1998.
11. L. Roper, "Teaching and Training", in Student Services: A Handbook for the Profession, San Francisco: Jossey-Bass, 1996.
12. D. Beynon et al, "Experience with developing multimedia courseware for the World Wide Web", Int. J. Human-Computer Studies, V47, 197-218, 1997.
13. J. Baker, "Pedagogical Innovation Via Instructional Technology", Proc. Syllabus '99, 1999.
14. F. Gillespie, "Instructional Design for the New Technologies", in K. E. Gillespie (ed), The Impact of Technology on Faculty Development, Life and Work, v76, 45, 1998.
15. D. Bogdanov, "Information & communication technologies impact on academic curricula", Educational Technology and Society, v2, 1999.
16. S. C. Ehrmann, "Improving the Outcomes of Higher Education: Learning From Past Mistakes", Viewpoint column, EDUCAUSE Review, 2002 (also at http://www.tltgroup.org/Visions/Improving_Outcomes.html, accessed 7/7/2003).
17. N. S. T. Chong & M. Sakauchi, "Creating and sharing web notes via a standard browser", Proc. SAC 2001, 2001.
18. G. S. Owen, "Integrating World Wide Web Technology Into Undergraduate Education", Integrating Technology into CSE, Barcelona, 1996.
19. D. N. Gordin et al, "Using the World Wide Web to Build Learning Communities in K-12", Journal of Computer-Mediated Communication, V2, Is 3 (also at http://www.ascusc.org/jcmc/vol2/issue3/index.html, accessed 6/7/2003).
20. S. Ebersole, "Uses and Gratifications of the Web Amongst Students", Journal of Computer-Mediated Communication, V6, 2000.
21. R. S. Tannenbaum, "Multimedia Developers Can Learn From the History of Human Communication", Ubiquity, http://www.ac.org/ubiquity/Multimedia_Developers.htm (accessed 6/7/2003).
22. R. Gonzalez et al, "Academic directions of multimedia education", Communications of the ACM, Vol 43, Issue 1, Jan 2000.
23. T. Arndt, S. K. Chang, A. Guercio and P. Maresca, "Education and Training: an XML-based approach to multimedia software engineering for distance learning", Proceedings of the 14th International Conference on Software Engineering and Knowledge Engineering, July 2002.
24. D. Hardaway & R. P. Will, "Digital Multimedia Offers Keys to Educational Reform", Comm. of the ACM, V40, No 4, 1997.
25. P. K. McKinley et al, "Moving Industry-Guided Multimedia Technology Into the Classroom", SIGCSE '99, New Orleans, 1999.
26. T. Bates, "Teaching, Learning and the Impact of Multimedia Technologies", EDUCAUSE Review, 2000.
27. G. D. Blank et al, "Adapting Multimedia for Diverse Student Learning Styles", JCSC, V18, Is 3, Feb 2003.
28. Microsoft Agent Software Development Kit, Microsoft Press, Redmond, Washington, 1999.
29. http://www.msagentring.org, 11 May 2003 (accessed 3 June 2003).
30. http://www.bellcraft.com/mash, 11 May 2003 (accessed 5 July 2003).
31. http://www.voxproxy.com, 2003 (accessed 6 July 2003).
32. http://www.directxtras.com, 7 Nov 2003 (accessed 5 July 2003).
33. E. Andre & T. Rist, "Presenting through Performing: On the Use of Multiple Lifelike Characters in Knowledge-Based Presentation Systems", Proceedings of the IUI 2000 Conference, New Orleans, 2000.
34. J. Lester et al, "Animated Pedagogical Agents and Problem-Solving Effectiveness: A Large-Scale Empirical Evaluation", Artificial Intelligence in Education, IOS Press: Amsterdam, 23-30, 1999.
35. S. van Mulken et al, "The Persona Effect: How Substantial Is It?", Proc. HCI '98, Sheffield, England, 53-66, 1998.
36. T. Rist et al, "A Flexible Platform for Building Applications with Life-Like Characters", Proc. IUI '03, Miami, Florida, Jan 2003.
37. J. Bates, "The Role of Emotion in Believable Agents", Comm. of the ACM, Vol 37, No 7, pp 122-125, 1994.
38. B. Blumberg & T. Galyean, "Multi-Level Direction of Autonomous Creatures for Real-Time Virtual Environments", Proc. SIGGRAPH '95, 1995.
39. R. Rickenberg & B. Reeves, "The Effects of Animated Characters on Anxiety, Task Performance, and Evaluations of User Interfaces", Proc. CHI 2000, The Hague, The Netherlands, 2000.
40. J. Rickel & L. Johnson, "Animated Agents for Procedural Training in Virtual Reality: Perception, Cognition, and Motor Control", Applied Artificial Intelligence, 13:343-382, 1999.
41. Y. Sumi & K. Mase, Proc. Agents '01, Montreal, Canada, June 2001.

Dr Clive Chandler is a senior lecturer at Staffordshire University and Chair of the Interface research group.

Learning with Interactive Media: Characteristics of its Impact in Three Different Environments
Corine Fitzpatrick and Michael Mucciardi

1. Introduction

The tools we use to learn change the ways in which we learn. Writing brought new ways of processing, organizing, storing and retrieving knowledge. Newly developing technologies, in particular multimedia, are likewise changing the way we learn. The variety of sources of information alone is staggering and, in combination with the multiple ways in which that information can be presented, enables learning opportunities to expand into many different environments. This chapter explores three different environments in which multimedia was an integral component of learning. The research was funded through a technology grant from the United States Department of Education to a New York metropolitan-area Graduate School of Education, to infuse technology into teaching and learning at the collegiate and K-12 teacher levels. The design of Project TITAN (Transforming Instruction through Technology and Networking) reflects the use of multimedia technologies and networking as tools for teaching and learning. Consortium partners in the grant included Manhattan College; Community School District Ten, which serves approximately 47,000 students in a section of the Bronx borough of New York City; the Archdiocese of New York, which serves 37,000 students in the Bronx; Apple Computer; and a corporate videoconferencing partner. The group was committed to spearheading a long-term effort designed to ensure that the availability of technology-proficient educators becomes a reality in the area's inner-city schools, where the "digital divide" is most prominent. A mandate of the grant was to gain perspective on the effect of infusing multimedia technologies into teaching and learning. Specifically, the perceptions of college faculty, K-12 professional educators and graduate students about the use of multimedia, and its effect on their professional skills, were obtained in a variety of activities.
This chapter addresses three specific developments in the grant that became the focus of those research efforts: 1) multimedia-based curriculum design through collaborative teams, meeting in real time and as virtual communities and involving educators from schools as well as faculty and students from the college, to reflect the infusion of technology as a tool for teaching and learning; 2) delivery of distance education using


videoconferencing; and 3) development of the use of one particular medium (video analysis) in performance assessment.

2. Study 1: Collaborative Curriculum Teams

A. Introduction
Over a three-year period, courses in both the undergraduate and graduate programs in a School of Education, and selected courses in the School of Science, were redesigned by teams of educators (college faculty and K-12 teachers) to infuse technology into the curriculum. The design work was carried out in three phases for each group of courses: a design (DES) phase in the first year, implementation (IMP) in the second, and refinement (REF) of the developed curriculum in the third. Table 1 illustrates the staggered schedule for the undergraduate course revisions.

Courses                                                Yr 1   Yr 2   Yr 3   Yr 4
Principles and Practices of Education;
Child and Adolescent Development;
Principles and Practices of Reading in the
Content Area; Management of Behaviour and
Learning; Computers and Their Uses;
Differential Equations                                 DES    IMP    REF

Curriculum and Pedagogy in the Elementary
Classroom; Integrated Learning PreK-3;
Integrated Learning 4-6; Curriculum and Methods
of Teaching Mathematics Gr. 7-12; Curriculum and
Methods of Teaching Science Gr. 7-12                          DES    IMP    REF

Introduction to Higher Geometry; General Biology;
Math for Elementary School Teachers;
Topics in Science I

Table 1: Schedule of Undergraduate Curriculum Redesign Effort

The teams were also designed to provide collaborative opportunities for K-12 educators to learn technologies and infuse them into their curriculum. Graduate and/or undergraduate students were assigned to each team, and a technical consultant joined each group. The overriding goal was one shared by this community of educators: how can we best move forward on technology integration, from exposure to emerging technologies to infusion by the professor, the teacher, and the K-12 student? Shared learning collaborations are critical. Inner-city urban K-12 students are often the last group to gain exposure to advanced technologies, while college professors are often the first to see the newest technologies. How can we reduce that exposure gap? The team approach


enabled a unique collaboration between a group of K-12 urban school educators and faculty from a College of Education that, it was hoped, would result in a "trickle-down effect" of infusing technology into college classrooms and, most importantly, into the inner-city K-16 classroom. Research supports this collaborative learning approach. Successful college-school partnerships include the nurturing of a learning community, a true collaboration both within and between partners, and accountability for teaching and learning. 1 This learning community is one in which teaching includes knowledge-based practice, collegial interaction and an inquiry-based orientation. Collaboration is critical to the development of these partnerships. 2 The College of Education faculty are not the experts, and the school faculty are not the learners; rather, each member of the learning community brings certain strengths to the relationship. Fitzpatrick & Mucciardi, in collaborations with inner-city schools, have noted that teachers often bring more practical experience and understanding of how to implement technology into curriculum than do professors. 3 Furthermore, they found that both groups learned more with on-site technology support. Technology is increasingly being linked to constructivist learning. Education leaders who have been involved in revising frameworks and standards for learning have based their suggestions on the constructivist model. 4 Pepi and Scheurman 5 suggest that advocates of computer technology believe computers and telecommunications are the primary, if not the exclusive, tools for implementing constructivist teaching methodologies. Yet projects that address the disparities in access to technology for minority and disadvantaged students are few. 6 They found substantial gains in reading and math when inner-city students were paired in groups with college students and faculty using technology.
Gardner, in an article on the quality of educational research, speaks to the problem of making educational research meaningful for, and available to, teachers. 7 The same might well be said of technology infusion: we need to make it meaningful and available for learning in comfortable, interesting environments. Cantwell & Britain note that teachers in their project devoted to technology infusion learned best in project-based assignments, in groups with technology specialists, and when they had to relate the learning directly to their curriculum. 8 Technological collaborations between universities and K-12 schools have received little or no notice in the literature. No one doubts that technology has the power to reshape both education and learning institutions by providing teachers and professors with other models of teaching, learning, and assessment. Moreover, technology standards (e.g., NETS) have become a crucial part of educational standards and therefore of accountability.


Cognitive psychologists have turned overwhelmingly toward a constructivist view of learning. 9 Constructivist learning is concerned with understandings achieved through relevant, hands-on, engaging activities. The focus of this particular research study was to ascertain whether learning technology through a variety of multimedia, including asynchronous and synchronous modes, in a collaborative setting enabled participants to develop better attitudes towards, and skills in, technology. The curriculum teams focused on learning multimedia technologies for infusion into courses. Experimental teams utilized real-time, synchronous (e.g., virtual classroom, instant messenger) and asynchronous environments (e.g., discussion threads) in learning and in reviewing research, while control teams learned the technologies in real time only and engaged in no online discussion of research. Thus, some teams could learn technologies together in person, asynchronously online through a course management system, and synchronously through virtual classrooms and instant messenger; some teams (the experimental group) also engaged in the review and discussion of research on multimedia and its implications for their work. A third group of participants were not members of teams and received no technology instruction; findings of the survey from that group are not discussed here.

B. Participants
Participants included college faculty and K-12 educators only.
Twelve teams were included in the experimental group and 13 teams in the control group, assignment being based on use of more multimedia (e.g., use of more course management features, instant messenger, discussion threads, and interactive projects involving more than one technology).

C. Instruments
Participants' perceptions of their attitudes toward technology and their skills in technology were assessed using two measures designed for the grant. The first measure, on attitudes/dispositions, included 25 questions in a Likert scale format with a total possible score ranging from 0 to 125. The second measure, on perceived skills, similarly designed, consisted of 33 questions, with a total possible score ranging from 0 to 165.
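The reported score ranges follow directly from the item counts: 25 and 33 items, each implicitly scored 0 to 5, give maxima of 125 and 165. A minimal sketch of this scoring logic (the per-item 0-5 scale is inferred from the reported ranges; the function name and validation are illustrative, not from the instruments themselves):

```python
def total_score(responses, max_per_item=5):
    """Sum per-item Likert ratings, validating each against the scale."""
    for r in responses:
        if not 0 <= r <= max_per_item:
            raise ValueError(f"rating {r} outside 0-{max_per_item} scale")
    return sum(responses)

# A 25-item attitude instrument scored 0-5 per item spans 0 to 125;
# the 33-item skills instrument spans 0 to 165.
attitude_max = total_score([5] * 25)   # 125
skill_max = total_score([5] * 33)      # 165
```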

Corine Fitzpatrick and Michael Mucciardi


D. Procedures
Participants were handed the instruments at the conclusion of their second year on their teams.

E. Results
Means and standard deviations are shown in Table 2.

Scale/Group        N     M       S.D.
Attitude  Exp.     12    116.1   3.8
          Cont.    13    109.9   8.0
Skill     Exp.     12    151.4   5.7
          Cont.    13    143.3   9.5

Table 2: Means and Standard Deviations

T-tests indicated significant differences by group on each measure (Attitude: t(23) = 2.384, p < .05; Skill: t(23) = 2.562, p < .05). The data suggest that the faculty and K-12 educators on teams that engaged in a variety of multimedia modalities in learning, and collaborated on their learning, had better attitudes towards the role of technology in learning and teaching and better perceptions of their skills.

F. Discussion
Professional educators engaged in collaborative experiences to infuse technology into their work and to learn how to use it to communicate with their students. The teams provided a unique opportunity for college faculty to work with K-12 educators in developing ways to enrich their curriculum. They learned about various forms of multimedia and shared their ideas for incorporating different media into their work. These professionals thought, dialogued, and created. Yet there were ultimately differences in attitude and skill between the two sets of groups, with the experimental groups perceiving themselves as having a better attitude toward technology and better skills. The question is what in those experiences might have mattered. What were the differences that might best suggest which characteristics are important to enhancing this kind of learning environment? The experimental groups had a more multimedia-rich experience that included a connection to their group through the course management system, through review of research articles and others' responses to those articles on a discussion thread, through interactive media such as instant messenger, and through collaborative development and critique of projects. While the control groups had as many tasks to complete, and their teams were project based, they primarily focused on learning technologies and


did not have the immersion in the various media that, in one sense, provided extra cohesion to the collaboration. Additionally, activities in the experimental groups that were done outside the actual class yet were connected to it (e.g., analysis of articles, critiques of others' responses, and analysis of others' projects online) involved more cognitive and metacognitive processing and thus provided more of a constructivist approach to the learning. This research suggests that collaborative experiences in developing skill in using technology in educational settings, bolstered by immersion in a variety of media through cognitive/metacognitive tasks, are associated with positive perceptions of attitude and skill.
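The t-statistics reported in the Results above can be recovered from the summary values in Table 2. A minimal sketch of the pooled-variance (Student's) independent-samples t computed from group sizes, means, and standard deviations; the small discrepancy for the attitude scale (computed ≈ 2.44 vs. reported 2.384) plausibly reflects rounding in the published summary values:

```python
import math

def pooled_t(n1, m1, s1, n2, m2, s2):
    """Student's t for two independent groups, from summary statistics."""
    # Pooled variance weights each group's variance by its degrees of freedom.
    sp2 = ((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2)
    se = math.sqrt(sp2 * (1 / n1 + 1 / n2))
    return (m1 - m2) / se

# Attitude scale: experimental (n=12) vs. control (n=13), Table 2 values.
t_attitude = pooled_t(12, 116.1, 3.8, 13, 109.9, 8.0)   # ~2.44 (reported 2.384)
# Skill scale (reported 2.562).
t_skill = pooled_t(12, 151.4, 5.7, 13, 143.3, 9.5)
```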

3. Study 2: A Model for the Development of a Similar Distance Experience for Real-Time Learning with Multimedia through Videoconferencing

A. Introduction
This study examined the perceived effect of using multiple technologies on participants' attitudes in a course. A course in a graduate education program was delivered both in real time and through videoconferencing. All students were exposed in real time to media besides videoconferencing, such as archived music and video, PowerPoint, and content delivered asynchronously via email and a course management system. Multiple camera angles were used to obtain maximum visual stimulation of classroom events, and wireless microphones were used to minimize audio feedback while adding a control mechanism for variables such as "out of control" discussions, which would normally not be a problem for a local in-class student but mere chaos for the distance education student.

Videoconferencing and multimedia have for quite some time now been delivered in conjunction with each other to create a flexible learning environment that can be used in a variety of ways. 12 Since the only initial means of easily doing so was by using once cutting-edge technologies such as ISDN, very few academic organisations had the resources available to carry out such endeavours without the assistance or financial support of large telecommunications companies. Technology has since advanced in such a way as to open the doors of videoconferencing to more people while still allowing at least the same variety and flexibility in the way videoconferencing can be used to enhance learning environments. Although certain difficulties still exist in using videoconferencing as a medium for two-way transfer of knowledge (e.g., video/audio lag; the cumbersome use of microphones), such obstacles are merely an inconvenience and can be overcome to learn effectively. 13 It has been shown that although one modality of learning might be preferred


over another, for example face to face versus videoconferencing, just as much can be learned by someone receiving knowledge over videoconferencing as by someone receiving it face to face from an instructor. 14 One of the issues noticed in delivering instruction over a two-way, television-like venue such as videoconferencing is the participants' attention span. Similar methods have been employed in the delivery of videoconferencing, such as using the functionality of pan/tilt/zoom cameras; these devices have been shown to be particularly effective in retaining television viewers' attention while increasing visual processing. 15 Very few studies involving videoconferencing in a real classroom setting have taken place. 16 The model described in this study is unique in using a robust videoconference design to compensate for the shortcomings of distance versus face-to-face learning. Moreover, the study idiosyncratically captures the videoconferencing environment in a real setting. The purpose of the study was to gain some perspective on participants' perceptions of how positively or negatively videoconferencing and the use of multimedia impacted their learning.

B. Participants
Two groups participated. The first group consisted of students (N = 19) who participated in the course in real time. The second group (N = 5) participated through videoconferencing. In addition, the first group had one experience in which they were the distance group, and vice versa. Students were matriculated in a graduate education program.

C. Instruments
A survey was devised by the researchers, based on prior research and experience in the development of the videoconferencing component of the grant. One of the researchers also oversaw the use of videoconferencing in the course taken by the participants.
The survey was given to the students at the end of the course; it included 26 questions in Likert scale format with responses ranging from "Never agree" (1) to "Always agree" (5). There was also a section on the survey asking participants for comments. One of the researchers made documented observations throughout the course, as he was present during every class, and regularly captured the reflections of the professor in the course.

D. Procedure
Throughout the course, the technical consultant (one of the researchers) documented personal observations. After the conclusion of the class, a survey was developed and placed on the online course


management system. Students were contacted through email and asked to respond to the survey.

E. Results
The means and standard deviations of some of the results from the survey are shown below in Table 3. Scores could range from 1 to 5, with 5 being most positive.

Number   Question                                                          Mean   Std Dev
Q1       Use of the Internet both during and after class helped me         3.2    0.8
         form a more concrete relationship with those in class locally.
Q3       Blackboard assisted me in strengthening my knowledge of the       3.7    1.0
         topics being discussed during class.
Q4       Blackboard allowed me to become more familiar with my             3.8    1.1
         colleagues than if I were to see them only during class.
Q6       I thought I did have to participate even though the professor     4.0    1.4
         was not paying attention to me over videoconferencing.
Q15      Microphones allowed for a free-flowing discussion.                1.8    1.0
Q22      Being able to see multiple people speaking allowed me to          3.2    0.6
         keep track of the discussion taking place.
Q23      I think it is better when the camera moves (pan/tilt/zoom         3.9    0.9
         to track speakers).
Q25      I prefer when the camera moves in order to obtain a constant      3.8    0.9
         larger image of the professor.
Q26      I like it when the camera moves in order to obtain images of      3.9    1.2
         various students at different times.
Comm.    Sum of 24 questions related to quality of communication.          77.3   9.1
Support  Two questions related to quality of support.                      9.0    2.3

Table 3: The Means and Standard Deviations of Results
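The two composite rows in Table 3 are sums of item ratings, so their possible ranges follow from the item counts: 24 communication-quality items rated 1-5 give a range of 24-120 (sample mean 77.3, roughly the "mostly agree" region), and 2 support items give a range of 2-10 (sample mean 9.0, near the maximum). A minimal sketch of this range arithmetic (the function is illustrative; only the item counts and 1-5 scale come from the chapter):

```python
def composite_range(n_items, lo=1, hi=5):
    """Possible score range for a composite that sums n Likert items."""
    return n_items * lo, n_items * hi

# Comm.: 24 items; sample mean 77.3 sits in the upper-middle of 24-120.
comm_lo, comm_hi = composite_range(24)
# Support: 2 items; sample mean 9.0 is near the top of 2-10.
supp_lo, supp_hi = composite_range(2)
```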

The personal observations of the technical consultant supported the survey results, which indicated that the use of multimedia and videoconferencing, as well as an asynchronous online course


management system, enabled learning experiences to be enhanced (Q1: M = 3.2, s.d. = 0.8; Q3: M = 3.7, s.d. = 1.0; Q4: M = 3.8, s.d. = 1.1). This extra mode of content delivery allowed the professor to add additional media sources and forms of communication with and amongst the class. A setup was used that allowed a video output of the presentation computer to be sent into the videoconferencing unit. This was used to visually cue the distance students to the fact that the professor/presenter had transitioned PowerPoint slides and, with the aid of the online course management system, allowed them to view the presentation on their local computers in much higher resolution than was possible using videoconferencing alone. The use of microphones, although allowing the distance class participants to more easily follow local classroom discussions, was found to take away from the natural flow of discussions that would normally take place had microphones not been used (Q15: M = 1.8, s.d. = 1.0). The researcher and the professor both noted an increased attention to the lecture on one particular day, when a power failure to the wireless access point disabled any in-class Internet access. This event caused the professor to step back and reflect upon the use of the Internet and multimedia in the class, and raised the question of the effectiveness of multimedia when the instructor does not deliver it in a controlled fashion. The particular manner in which the wireless microphones were used effectively controlled another variable: allowing all relevant conversations to be captured for the distance audience. Although students were inclined mostly to agree on the importance of the way the video content was dynamically delivered with multiple camera angles (Q23: M = 3.9, s.d. = 0.9; Q25: M = 3.8, s.d. = 0.9; Q26: M = 3.9, s.d. = 1.2), students mostly agreed that participation was still important regardless of the amount of attention given to them by the instructor through videoconferencing (Q6: M = 4.0, s.d. = 1.4). A series of questions were grouped together to reflect perception of overall communication quality (Comm.), and another series were grouped together to reflect perception of quality of technical support (Support). Participants mostly agreed that the communication quality was good and very strongly agreed that the support was good. A regression with communication quality as the dependent variable and age, gender, and support as independent variables was significant (F = 12.415, p < .01). Post hoc tests revealed that support was the primary predictor (t = 5.4, p