Information Technology Applications: Strategies, Issues and Challenges

Chief Editor Dr. S.S. Bhakar Director Prestige Institute of Management Gwalior

Editors

Prof. Krishan Kant Yadav

Ms. Anamika Shrivastava

Assistant Professor Department of Computer Application Prestige Institute of Management, Gwalior

Deputy Librarian Prestige Institute of Management, Gwalior

Prestige Institute of Management Gwalior (M.P.) 474020 (India)

Copyright Prestige Institute of Management, Gwalior

All rights reserved. No part of this publication may be reproduced or transmitted, in any form or by any means, without permission. Any person who commits an unauthorised act in relation to this publication may be liable to criminal prosecution and civil claims for damages.

First Published, 2015 ISBN:

Printed in India: PRESTIGE INSTITUTE OF MANAGEMENT, GWALIOR Airport Road, Opp. Deendayal Nagar, Gwalior (M.P.) 474020 Phone: 0751-4097000, 2470724 E-mail [email protected] Website: http://www.prestigegwl.org

Preface
__________________________________
Information Technology Applications: Strategies, Issues and Challenges
Organizations have long used IT as a support system in developing and implementing organizational strategies, but have rarely based those strategies on information technology itself. The fast-paced developments in the area of information technology have made IT an important component of organizational strategy. The last decade has seen unprecedented success for organizations whose strategies revolve around information technology. Over this period, business organizations have been formulating overall organizational strategy around four fundamental domains of strategic choice: business strategy, information technology strategy, organizational infrastructure and processes, and information technology infrastructure and processes, each with its own underlying dimensions. Information technology can act as the basis for designing organizational strategy. In doing so, organizations can use IT for both fundamental characteristics of strategy: the interrelationship between external and internal components, and the integration between business and functional domains. An IT-based business strategy is capable of transforming the way business is conducted. Comparing the way business is conducted today with the way it was conducted a decade ago provides sufficient evidence of the capability of IT to transform how business is carried out. The effective and efficient utilization of information technology requires the alignment of IT strategies with business strategies, something that was not done successfully in the past with traditional approaches. New methods and approaches are now available.
The strategic alignment framework applies the Strategic Alignment Model to reflect the view that business success depends on the linkage of business strategy, information technology strategy, organizational infrastructure and processes, and IT infrastructure and processes. Managers are continually confronted with new and ever-changing competitive pressures from deregulation, globalization, ubiquitous connectivity and the convergence of industries; the ability to respond rapidly to those challenges depends on having a sophisticated and facile organizational and technical infrastructure, and a degree of information-technology flexibility that traditional approaches cannot provide. Increasingly, even at global companies known for their competitive and technical savvy, the gap between emerging strategic needs and the installed infrastructure is widening. With every industry currently undergoing transformation, most companies strive to develop new approaches to competition and value creation. However, the capacity to change embedded in the technical and social infrastructures lags the desired strategic direction. Agreement on urgency is lacking, as are systems that support collaboration.

IT infrastructure must energize the internal organization, engage customers in dialogue and foster collaboration among all parties. IT is not merely a support function needed to improve efficiency. That thinking locked companies into rigid enterprise-application software, only to have them realize that their business practices and processes did not fit a vendor-designed package. In conforming to the requirements of packaged software, IT organizations limited the ability of managers to adapt to changing competitive conditions. Information infrastructure has to be able to accommodate changes quickly and at low cost. Operating an enterprise with widely varying capacity for change is analogous to driving an automobile with each wheel spinning at a different number of revolutions per minute. Spending on information infrastructure is significant, about 2% to 8% of revenues. Unfortunately, such investments are not often viewed as strategic, and the old efficiency-oriented return-on-investment yardstick is used. The time has come for new measures to gauge the success of investments in basic infrastructure and applications. The disconnection between efficiency and flexibility cannot be bridged without an active understanding of the technical and organizational impediments, and without tying together the mind-sets and skill sets of line managers and IT specialists. The scorecard gives managers a framework to understand the locus of efficiency and innovation in their information-infrastructure portfolio.

This book is based on research papers selected from those presented at the first IT Applications Seminar organized by Prestige Institute of Management, Gwalior on February 20, 2015. The book brings together research contributions from several academics and industry professionals on information technology applications, and provides an understanding of the strategies, issues and challenges involved.

The book has been divided into three sections. The first section includes research papers on recent developments in the area of Information Technology. This section includes fourteen chapters: Domain Specific Content Based Satellite Image Retrieval; Mathematical Model to Represent the Role of Hypothalamus in Hunger Regulation; Generations of the World Wide Web: From Web 1.0 to Web 4.0; On Assessment of IT Infrastructure Functionality: A Study Based on View of MCD Officials; E-Signatures and Its Network Security Issues; Global and Distributed Software Engineering; Systematic Applications of Cloud Services in Inquiring Concert; Self Motivated Route Optimization for AODV in Mobile Adhoc Network; Issues and Challenges of E-Commerce in Contemporary World; A First Data Mining Model for Predicting Customer Profitability; Review on Feature Extraction Based on Diagonal Direction for Handwritten Recognition System Using Neural Network; Performance Evolutions of Proactive, Reactive & Hybrid Routing Protocols in ACMANET; Privacy and Challenges: Data Storage Security Issues in Cloud Computing; and Advanced Encryption Standard (AES) Cryptography to Secure Web Transactions in E-Commerce: A Review. The second section of the book contains eighteen chapters on IT applications in library management, facilitating institutions in the dissemination and recording of knowledge. They are: Cloud Computing: A New Buzz in 21st Century Library Services; Digital

Object Identifier: An Overview; Role of Institutional Repositories & OSS for E-Resource Management; Development of Knowledge Through E-Learning; E-Learning Environment: The New Concept of Academic Libraries; Applicability and Availability of Open Source Software and Cloud Computing in Libraries; Gwalior College Library Network (Gwaclibnet): An

Initial Step for the Proposed Maclibnet; Concept of Cloud Computing in Libraries; Impact of Information Technology in Library Services: An Overview; Exploring Smart Digital Differences in Libraries; Cloud Computing Technology in Libraries; Importance of Distance Learning and Role of LIS Professionals; Web Services: Advantage for Libraries and Information Seekers; Social Media and Digital Librarianship; Paramount Practices Adopted in Academic Library and Its Challenges; Virtual Library: A Global Symbol of the Information Access Paradigm; Blog: A Marketing Tool for Library Services; and Knowledge Management and Libraries of Technological Era. The third section contains nine chapters on the application of IT in management: A Study of Globalization and Its Impact on Management Education in India; Knowledge Management and Its Utilizations; Managing Knowledge and Its Implications; Customer Perception Towards the Cash on Delivery; Application of Total Quality Management in Libraries; Import Export and Exchange Rate: Evidence from OIC; A Study on Attitude of Bank Customers about Credit Card; Organizational Culture: A Study of Internet Sites; and Optimization and Nature-Inspired Algorithms: A Review. The book will provide greater insights into the application of IT in business management and library management. The research papers included in the book are a useful source for scholars and researchers looking for base material in the area of IT applications in business management and library management.

Contents
__________________________________
Preface
List of Contributors

SECTION I: INFORMATION TECHNOLOGY

1. Domain Specific Content Based Satellite Image Retrieval - Ms. Chandani Joshi, Prof. G.N. Purohit, Dr. Saurabh Mukherjee (2-7)
2. Mathematical Model to Represent the Role of Hypothalamus in Hunger Regulation - Ms. Divya, Dr. Saurabh Mukherjee (8-15)
3. Generations of the World Wide Web: From Web 1.0 to Web 4.0 - Krishan Kant Yadav (16-23)
4. On Assessment of IT Infrastructure Functionality: A Study Based on View of MCD Officials - Vilender Kumar, Dr. Sanjay Kumar Gupta (24-35)
5. E-Signatures and Its Network Security Issues - Dr. Dharmesh Kumar, Dr. Dharmendra Badal (36-45)
6. Global and Distributed Software Engineering - S.S. Khadri (46-52)
7. Systematic Applications of Cloud Services in Inquiring Concert - P. Sudha (53-61)
8. Self Motivated Route Optimization for AODV in Mobile Adhoc Network - Vishnu Mishra (62-69)
9. Issues and Challenges of E-Commerce in Contemporary World - Krishan Kant Yadav, Mahendra Singh, Dharmendra Singh (70-78)
10. A First Data Mining Model for Predicting Customer Profitability - Dr. Madhur Srivastava, Dr. Dharmendra Badal (80-92)
11. Review on Feature Extraction Based on Diagonal Direction for Handwritten Recognition System Using Neural Network - Samta Jain Goyal (93-97)
12. Performance Evolutions of Proactive, Reactive & Hybrid Routing Protocols in ACMANET - Satyendra Soni, Margi Patel, Ashish Saxena (98-104)
13. Privacy and Challenges: Data Storage Security Issues in Cloud Computing - Rakesh Prasad Sarang (105-112)
14. Advanced Encryption Standard (AES) Cryptography to Secure Web Transactions in E-Commerce: A Review - Ram Kumar Paliwal, Asheesh Kumar (113-129)

SECTION II: IT APPLICATIONS FOR LIBRARIES

15. Cloud Computing: A New Buzz in 21st Century Library Services - Dr. Pawan Kumar Sharma, Prof. Hemant Sharma, Satya Prakash Pandey (131-139)
16. Digital Object Identifier: An Overview - Shraddha Shahane, Manjula Chauhan (140-145)
17. Role of Institutional Repositories & OSS for E-Resource Management - Aslam Ansari (146-154)
18. Development of Knowledge Through E-Learning - Raghvendra Tripathi, Chanchal Gyanchandani, Anamika Shrivastava (155-161)
19. E-Learning Environment: The New Concept of Academic Libraries - Nidhi S Tiwari, Manoj Tiwari, Dr. Ramnivas Sharma (162-169)
20. Applicability and Availability of Open Source Software and Cloud Computing in Libraries - Dr. Sarita Verma, Mrs. Rashmi Sikarwar (170-180)
21. Gwalior College Library Network (Gwaclibnet): An Initial Step for the Proposed Maclibnet - Dr. Anil K. Sharma (181-186)
22. Concept of Cloud Computing in Libraries - Dr. M. Anandamurugan (187-193)
23. Impact of Information Technology in Library Services: An Overview - Archana Yadav (194-199)
24. Exploring Smart Digital Differences in Libraries - Bankapur (V M), Ramesh Patil, Sanjivkumar Bakanetti (200-203)
25. Cloud Computing Technology in Libraries - Neha Kanojia, Mrs. Deepika Raj, Mrs. Avinash Kaur (204-212)
26. Importance of Distance Learning and Role of LIS Professionals - Anamika Shrivastava, Renu Saxena, Varsha Sahu (213-220)
27. Web Services: Advantage for Libraries and Information Seekers - Anamika Shrivastava, Omvati Sharma & Jyoti Mittal (221-229)
28. Social Media and Digital Librarianship - Dr. Anil Kumar Dhiman, Ranjendra Kumar Bharti (230-237)
29. Paramount Practices Adopted in Academic Library and Its Challenges - Sarita Bhargava, Alka Chaturvedi (240-244)
30. Virtual Library: A Global Symbol of the Information Access Paradigm - Dr. Rakesh Shrivastava (245-254)
31. Blog: A Marketing Tool for Library Services - Anamika Shrivastava, Krishan Kant Yadav, Sumit Rajput (255-261)
32. Knowledge Management and Libraries of Technological Era - Shraddha Shahane, Manjula Chauhan (262-272)

SECTION III: IT APPLICATIONS IN MANAGEMENT

33. A Study of Globalization and Its Impact on Management Education in India - Sourabh Jain (274-281)
34. Knowledge Management and Its Utilizations - Mohd Ateek (282-286)
35. Managing Knowledge and Its Implications - Meenakshi Shrivastava (287-293)
36. Customer Perception Towards the Cash on Delivery - Tarika Singh, Dr. Seema Mehta, Brajendra Singh Sengar, Sunil Upadhayay, Manish Dubey (294-300)
37. Application of Total Quality Management in Libraries - Navin Bhargava, Sunder Lal, Sanjay Kumar Soni (301-306)
38. Import Export and Exchange Rate: Evidence from OIC - Dr. Navita Nathani, Amit Jain, Vikas Shrivastava, Jaspreet Kour (307-320)
39. A Study on Attitude of Bank Customers about Credit Card - Dr. Nandan Velankar, Stayam Dubey, Ankit Sharma (321-327)
40. Organizational Culture: A Study of Internet Sites - Dr. Garima Mathur, Dr. Richa Banerjee, Sonali Srivastava (328-334)
41. Optimization and Nature-Inspired Algorithms: A Review - Prof. Vani Agrawal, Prateeksha Kulshretha (335-341)

List of Contributors

Alka Chaturvedi - Asstt. Professor, Prestige Institute of Management, Gwalior (238)
Amit Jain - Alumni, Prestige Institute of Management, Gwalior (307)
Anamika Shrivastava - Dy. Librarian, Prestige Institute of Management, Gwalior (213)
Ankit Sharma - Alumni, Prestige (321)
Archna Yadav - Ex-Student, BU Jhansi (194)
Asheesh Kumar - Student, MCA, Indore (113)
Ashish Saxena - Asstt. Prof., KKITM, Gwalior (98)
Aslam Ansari - Asstt. Librarian, Integral University, Lucknow (146)
Avinash Kaur - Ex-Student, DLISc., BBAU, Lucknow (U.P.) (204)
Bankapur (V M) - Bundelkhand University, Jhansi (U.P.) (199)
Brajendra Singh Sengar - Alumni, Prestige Institute of Management, Gwalior (294)
Chanchal Gyanchandani - Librarian & Information Asstt., National Library, Kolkata (155)
Chandani Joshi - Research Scholar, Department of Computer Science, AIM & ACT (02)
Deepika Raj - MPhil Scholar, DLISc., BBAU, Lucknow (U.P.) (204)
Dharmendra Singh - Student, BCA, Prestige Institute of Management, Gwalior (70)
Divya - Research Scholar, Department of Computer Science, AIM & ACT (08)
Dr. Anil K. Sharma - Librarian, Lakshmibai National Institute of Physical Education, Gwalior (181)
Dr. Anil Kumar Dhiman - Information Scientist, Gurukul Kangri University, Haridwar (230)
Dr. Dharmesh Kumar - Research Scholar, BU Jhansi (36)
Dr. Dharmendra Badal - Sr. Lecturer, Deptt. of Mathematical Sciences and Computer Application, Bundelkhand University, Jhansi (36)
Dr. Garima Mathur - Associate Professor, Prestige Institute of Management, Gwalior (328)
Dr. Hemant Sharma - Dean, Faculty of Arts & Chairman, SOS in Library & Information Science, Jiwaji University, Gwalior (131)
Dr. M. Anandamurugan - Deputy Librarian, Banaras Hindu University, Varanasi (187)
Dr. Nandan Velankar - Asstt. Professor, Prestige Institute of Management, Gwalior (321)
Dr. Navita Nathani - Associate Professor, Prestige Institute of Management, Gwalior (307)
Dr. Pawan Kumar Sharma - Librarian, Madhav Institute of Technology & Science, Gwalior (131)
Dr. Rakesh Shrivastava - HOD, Library and Information Science, VRG Girls P.G. College, Morar, Gwalior (245)
Dr. Ramnivas Sharma - I/C Central Library, Rajmata Vijayaraje Scindia Krishi Vishwa Vidyalaya, Gwalior (162)
Dr. Richa Banerjee - Asstt. Professor, Prestige Institute of Management, Gwalior (328)
Dr. Sanjay Gupta - SOS in Computer Science & Applications, Jiwaji University, Gwalior (24)
Dr. Sarita Verma - Asstt. Professor, Department of Library and Information Science, MLB Govt. College of Excellence, Gwalior (170)
Dr. Saurabh Mukherjee - Associate Professor, Department of Computer Science, AIM & ACT, Banasthali University (02)
Dr. Seema Mehta - Associate Professor, IHMR, Jaipur (294)
Dr. Tarika Singh - Associate Professor, Prestige Institute of Management, Gwalior (294)
G.N. Purohit - Dean, AIM & ACT, Banasthali, Jaipur (02)
Jaspreet Kour - Research Scholar, Jiwaji University (307)
Jyoti Mittal - Asstt. Librarian, Bosten College, Gwalior (221)
Krishan Kant Yadav - Asstt. Professor, Deptt. of Computer Application, Prestige Institute of Management, Gwalior (16)
Madhur Shrivastava - Asst. Professor, Department of Mathematics and Computer Application, Bundelkhand University, Jhansi (80)
Mahendra Singh - Faculty, Sri Datiya (70)
Manish Dubey - Visiting Faculty, Prestige Institute of Management, Gwalior (294)
Manjula Chauhan - Asstt. Professor, Dept. of Library and Information Science, Choudhary Charan Singh P.G. College, Etawah (140)
Manoj Tiwari - Librarian, K.D. Dental College & Hospital, Mathura (U.P.) (162)
Margi Patel - Asstt. Prof., IIST, Indore (98)
Meenakshi Shrivastava - Asstt. Professor, Vikrant Group of Management, Gwalior (287)
Mohd Ateek - Semi Professional Assistant, Library, Faculty of Dentistry, Jamia Millia Islamia University, New Delhi (282)
Naveen Bhargava - Librarian, Shree Krishna Institute of Technology & Management, Gwalior (301)
Neha Kanojia - Ex-Student, DLISc., BBAU, Lucknow (U.P.) (204)
Nidhi S Tiwari - HOD/Librarian, Rajiv Academy for Technology and Management, Mathura (U.P.) (162)
Omvati Sharma - Asstt. Librarian, Bosten College, Gwalior (221)
P. Sudha - Faculty, Department of Computer Science, Thiruvalluvar University Constituent College of Arts and Science, Tittakudi, Tamilnadu, India (53)
Pratiksha Kulshresth - Student, Prestige Institute of Management, Gwalior (235)
Raghvendra Tripathi - Librarian, Govt. M.J.S.P.G. College, Bhind (155)
Rakesh Prasad Sarang - SOS in Computer Science and Application, Gwalior (105)
Ram Kumar Paliwal - Asstt. Professor, Prestige Institute of Management, Gwalior (113)
Ramesh Patil - Research Scholar, Bundelkhand University, Jhansi (200)
Ranjendra Kumar Bharti - Research Scholar, S.V. University, Gajaraula (U.P.) (230)
Rashmi Sikarwar - Research Scholar, Jiwaji University, Gwalior (170)
Renu Saxena - Librarian, Bosten College, Gwalior (213)
S.S. Khadri - Research Scholar, Bundelkhand University, Jhansi (46)
Samta Jain Goyal - Dean, Department of Computer Science, Amity University, Gwalior (93)
Sanjay Kumar Soni - Library Assistant, ABV-IIITM, Gwalior (301)
Sanjivkumar Bakanetti - Research Scholar, Bundelkhand University, Jhansi (200)
Sarita Bhargava - Asstt. Librarian, Prestige Institute of Management, Gwalior (238)
Satya Prakash Pandey - Professional Assistant, Central Library, Mahatma Gandhi Kashi Vidyapeeth, Varanasi (131)
Satyendra Soni - Student, M.Tech, Indore (98)
Sonali Srivastava - Alumni, Prestige Institute of Management, Gwalior (328)
Sourabh Jain - Asstt. Registrar and Asstt. Professor, Gyan Ganga College of Technology, Jabalpur (274)
Shraddha Shahane - Asstt. Librarian, M.P. High Court, Bench Indore (140)
Stayam Dubey - Alumni, Prestige Institute (321)
Sumit Rajput - Student, BCA, Prestige Institute of Management, Gwalior (255)
Sunder Lal - Librarian, New Delhi Institute of Management, New Delhi (301)
Sunil Upadhyay - Alumni, Prestige Institute of Management, Gwalior (294)
Vani Agarwal - Asstt. Professor, Prestige Institute of Management (235)
Vankapur VM - Associate Professor, DLISc., RCUB (200)
Varsha Sahu - Asstt. Librarian, MITS, Gwalior (213)
Vikas Shrivastava - Alumni, Prestige Institute of Management, Gwalior (307)
Vilender Kumar - Associate Professor, Computer Science, IITM, New Delhi (24)
Vishnu Mishra - Asstt. Professor, BVM College, Gwalior (62)

Section I: Information Technology


DOMAIN SPECIFIC CONTENT BASED SATELLITE IMAGE RETRIEVAL
Ms. Chandani Joshi, Prof. G.N. Purohit, Dr. Saurabh Mukherjee

Abstract
The development of multimedia technology and satellite imagery, together with the huge volume of data available on the internet, has raised the need for applications that store, access and retrieve images efficiently from large databases. Content Based Image Retrieval (CBIR) provides a solution to this problem. It is the process of retrieving and displaying query images from an image database on the basis of their feature content. The contents are color, textures, shapes or any other information that can be derived from the images themselves. The failure of traditional Text Based Image Retrieval (TBIR) has raised the importance of CBIR: every user has a different perception of an image, and the use of different keywords makes TBIR inefficient for image retrieval. With the help of CBIR and the ERDAS Imagine tool, we have classified domain specific satellite images into different features such as water body, agriculture, dense forest, open forest, settlement, fallow land, gullied land, scrub land and mining, and are able to provide statistical results of the changes that occurred in the previous years.
Keywords: TBIR, CBIR, ERDAS

INTRODUCTION
Image retrieval is concerned with techniques for storing and retrieving images both efficiently and effectively. Work on image retrieval started in the 1970s (Danish M. et al., 2013). Content Based Image Retrieval (CBIR) is a technique to retrieve images on the basis of visual content such as color, shapes and textures (Saad M., 2008). Retrieval on color alone can fail because different objects may share similar colors; texture or shape, or both, should therefore be combined with color to retrieve similar results. Color, texture and shape are still low-level features, and they should be used along with high-level features like text annotation for optimized results (Mallick A. et al., 2014). The content of an image is analyzed by extracting descriptors, histograms, colors, shapes, textures, etc. The performance or accuracy of the retrieved images can be calculated by methods such as precision and recall, or by the Length of String to Recover All Relevant Images, i.e. LSRR (Pal M. et al., 2006; Kekre et al., 2010). The problem faced with low-level feature extraction was the semantic gap, due to the difference between low-level features and high-level user semantics. It was difficult to convey the user's need for an image completely to a CBIR system, so the retrieved images would be less effective and efficient. Through extensive research, Content Based Image Retrieval came into existence in 1992 and

since then many systems have been developed for Content Based Image Retrieval for use in commercial fields. Hafiane et al. [2005] retrieved satellite images on the basis of region, using the Motif Co-occurrence Matrix (MCM) in conjunction with spatial relationships. Maheshwary et al. [2009] used LISS III+ multispectral satellite images with 23.5 m resolution; with features like color and texture, four semantic categories (mountain, vegetation, water bodies and residential area) were used for the retrieval of similar images from the database. Ning et al. [2006] developed a web-based application through which a user can find images via a raised query; a domain-dependent concept for image retrieval was used. In another research paper, Mamatha et al. [2011] used the color feature as the similarity measure for image retrieval; a low-resolution satellite image of a rural area was taken for the experiment. 2. OBJECTIVE The objective of this paper is to extract domain specific satellite images and their different features, such as water body, agriculture, dense forest, open forest, settlement, fallow land, gullied land, scrub land and mining, and to provide statistical results of the changes that occurred in the previous years. 3. MOTIVATION Global warming and deforestation in the nation have led to many dangerous effects. The changes occurring in nature due to these factors have motivated us to carry out this research and to find out the changes that occurred in an area over the past decades. Through this prediction we could take major steps, like afforestation and reforestation, to maintain the ecological balance. The study was applied to Alwar Tehsil and can further be applied to other parts of the nation. 4. METHOD Step-1: Unsupervised classification is applied on the 10 classes of the Landsat images of the last four decades of the Alwar region.
Step-2: The results are recorded and the raster attributes are set for color and names. Step-3: The process is iterated to get the desired classified image. Step-4: Statistical analysis of the images is performed to predict the changes that occurred in the previous years. Step-5: The correlation coefficient is applied to the defined classes; for some classes a positive correlation was obtained and for others a negative correlation. 5. RESULTS AND DISCUSSION


Figure 1(a), (b), (c), (d): classified images of the Alwar Tehsil region consisting of the 10 classes. Fig. 1(e): the legend used in the images.
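The classified maps above were produced by iterative unsupervised classification in ERDAS Imagine (Steps 1-3 of the method). A minimal sketch of the same idea, clustering pixel feature vectors with k-means, is shown below; the two-band reflectance values are made up for illustration and this is not the ERDAS workflow itself.

```python
def kmeans(pixels, k, iters=20):
    """Minimal k-means: cluster pixel feature vectors into k classes.
    Centres are initialised deterministically with evenly spaced picks."""
    def dist2(p, c):
        return sum((a - b) ** 2 for a, b in zip(p, c))

    step = max(1, len(pixels) // k)
    centers = [pixels[i * step] for i in range(k)]
    for _ in range(iters):
        # assign each pixel to its nearest centre
        clusters = [[] for _ in range(k)]
        for p in pixels:
            clusters[min(range(k), key=lambda i: dist2(p, centers[i]))].append(p)
        # recompute each centre as the mean of its cluster
        for i, cl in enumerate(clusters):
            if cl:
                centers[i] = tuple(sum(v) / len(cl) for v in zip(*cl))
    labels = [min(range(k), key=lambda i: dist2(p, centers[i])) for p in pixels]
    return centers, labels

# Hypothetical 2-band reflectance values for two spectrally distinct covers
water  = [(0.05 + 0.01 * i, 0.10 + 0.01 * i) for i in range(10)]
forest = [(0.60 + 0.01 * i, 0.40 + 0.01 * i) for i in range(10)]
centers, labels = kmeans(water + forest, k=2)
print(labels)  # the two spectral groups receive two distinct class labels
```

In the actual study each cluster would then be named and coloured via the raster attributes (Step 2), and the process iterated until the desired classified image is obtained.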


Landuse/Landcover of the Alwar Tehsil region (area in sq. km)

S.No.  Class          1976       1990       2000       2010
1      Settlement     9.1224     13.0851    26.6823    35.9028
2      Dense Forest   478.8288   304.5861   304.3998   304.2927
3      Open Forest    38.5524    203.4531   201.0042   200.7882
4      Waterbody      20.448     5.2722     5.4981     6.6951
5      Agriculture    298.0008   327.9762   281.9025   436.4712
6      Fallow Land    381.0168   371.9682   405.6354   239.7591
7      Gullied Land   4.4532     3.1995     2.8179     2.8188
8      Scrub Land     8.7048     5.274      4.1148     2.79
9      Mining         0.00       0.00       0.00       0.0504
10     Industry       0.00       0.5607     3.3201     5.8068
       Total          1239.13    1235.38    1235.38    1235.38

Table 1(a), (b), (c), (d): landuse/landcover data of the Alwar Tehsil region. Statistically it
shows the differences across the respective years.
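Step 5 of the method applies the correlation coefficient to each class across the four dates. With the class areas from Table 1 this can be sketched as follows (a minimal illustration of the statistic, not the authors' ERDAS workflow; only four of the ten classes are shown):

```python
# Class areas (sq. km) from Table 1 for 1976, 1990, 2000 and 2010
areas = {
    "Settlement":   [9.1224, 13.0851, 26.6823, 35.9028],
    "Dense Forest": [478.8288, 304.5861, 304.3998, 304.2927],
    "Waterbody":    [20.448, 5.2722, 5.4981, 6.6951],
    "Agriculture":  [298.0008, 327.9762, 281.9025, 436.4712],
}
years = [1976, 1990, 2000, 2010]

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

# Positive r: the class area grows over time; negative r: it shrinks
for cls, a in areas.items():
    print(f"{cls}: r = {pearson(years, a):+.2f}")
```

Settlement correlates positively with time (steady growth), while dense forest and water body correlate negatively, matching the positive and negative correlations reported in Step 5.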

Fig. 2a: Graph representing the landuse/landcover data of the respective years.

Fig. 2b: Graph representing the landuse/landcover data of the respective years.
6. CONCLUSIONS
The study shows the changes that occurred over the past four decades. Changes were found in all the classes considered, but the largest change is seen in the dense forest class between 1976 and 2010, as shown in Fig. 1. A major reduction in the water body class can also be seen.
7. FUTURE WORK
The classified satellite images will be stored in databases, and on the basis of content based image retrieval the similar images will be extracted using MATLAB.
8. REFERENCES

1. Digital Signal Processing.
2. Mallick A., Kapgate D., Vaidya, IJCAT (International Journal of Computing and Technology), Volume 1.
3. International Journal of Advanced Research in Computer Engineering and Technology (IJARCET), Volume 2.
4.
5.
6. International Journal of Engineering Science and Technology (IJEST).
7. Hafiane A., Chaudhuri S., Seetharaman G., Zavidovique B. (2005), Region-based CBIR in GIS with local space filling curves to spatial representation.
8. Maheshwary P., Srivastava N., retrieval based on Color Moment and Gray Level Co-occurrence, International Journal of Computer Science Issues, Volume 3.
9. Mamatha Y.N., Ananth A.G., Semantic-Based Image Retrieval in Remote Sensing: Feature Extraction from Rural Satellite Imagery Using Color, International Journal of Software Engineering & Applications (IJSEA), Volume 2.
10. IJCAT (International Journal of Computing and Technology), Volume 1.
11. International Journal of Science, Engineering and Technology Research (IJSETR), Volume 3.
12. Survey: Content Based Image Retrieval Based on Color, Texture, Shape & Neuro Fuzzy, Journal of Engineering Research and Applications, Volume 3.

MATHEMATICAL MODEL TO REPRESENT THE ROLE OF HYPOTHALAMUS IN HUNGER REGULATION
Ms. Divya, Dr. Saurabh Mukherjee

Abstract
The hypothalamus plays a very important role in a variety of biological functions of the human body. It works as the regulator of many homeostatic mechanisms, controlling hunger regulation, energy balance and body temperature. The hunger system of the human body is stimulated by a number of hormones secreted by internal organs: Ghrelin, Cholecystokinin (CCK), Neuropeptide Y (NPY), Peptide YY (PYY) and Amylin. Each hormone has its own functional effect on hunger regulation. Ghrelin is a strong stimulator of hunger and of growth hormone release. CCK is involved in generating the satiety signals carried through the vagus nerve. NPY and PYY regulate food intake, and after a meal the Amylin level increases and decreases food intake. To simulate the process of hunger regulation we use differential equations that include functions simulating each step of hunger regulation, from the internal organs to the hypothalamus and central nervous system. With the help of a mathematical module function we simulate the secretion of hormones from the internal organs. We use the wavelet transform to simulate the satiety signals travelling via the vagus nerve, and fractals to represent the repeated pattern at every scale. As the satiety signals travel through the vagus nerve, the hypothalamic receptors generate signals which are further transferred to the central nervous system (CNS). To compute and validate the model we will use various model validation techniques, together with certainty factors, entropy, the correlation coefficient and energy.
Keywords: Hypothalamus, Hunger Regulation, Central Nervous System, Wavelet Transforms, Fractals, Entropy, Differential Equation.

INTRODUCTION
The hypothalamus is an almond-shaped gland of the human body (diencephalon). It is located under (hypo) the thalamus and above the mesencephalon (midbrain).
It is the major integrating link between the endocrine system and the limbic system. This complex endocrine gland is the master of the pituitary gland (Matthias, Christian, 2009) [5]. The hypothalamus receives sensory and hormonal signals from the thalamus, the limbic system and the internal organs of the human body [6]. It acts as the main control centre for many homeostatic mechanisms such as hunger regulation, water balance, energy balance and body temperature. Here our main concentration is on analyzing the hunger regulation process in the human body through the behavior of the hypothalamus. The hypothalamus contains a number of nuclei that play an important role in hunger control: the lateral hypothalamus (hunger center), the ventromedial hypothalamic nucleus (satiety center) and the hypothalamic arcuate nucleus (controller of food intake) are the main participants in hunger regulation [3]. The discovery of the hypothalamus begins in the second century A.D. and continues with new findings

about the anatomical division of human brain with their growth. Ramon Y. Cajal discovers an important fact that there is connection between Hypothalamus and the pituitary gland [5]. After that a lots of physiological aspects are discovered regarding the hypothalamus and human body like vascular system(1930), connection between hypothalamus and the other Multiple neuronal messengers Hunger Regulation process can be divided into the three sub parts: Recognizing the feeling of Hunger, Regulating the Hunger and Identifying the feeling of fullness. Many Hormones participate in Hunger regulation: Ghrelin, Neuropeptide Y (NPY), Peptide YY (PYY), Cholecystokinin (CCK) [9] and Amylin [4] [8] [12]. Hypothalamus processes variety of hormones. It gets the hormonal signals from internal organs like stomach, pancreas and others. When the feelingof hunger is generated, the Hormones are secreted from the internal organs. Ghrelin (Hunger simulator) is 28-Amino acid peptide. It has an important role in secretion of the growth hormone [10]. It is very important hormone of endocrine system. Ghrelin is secreted from the cells of stomach and Intestine. The Functional effect of Ghrelin is accepted by its receptor at hypothalamus region. That receptor is Growth Hormone Secretagogue Receptor Hypothalamus (LH), Para-Ventricular nucleus (PVN) and Ventromedial nucleus (VMN) contain GHSR [11] [15]. Ghrelin receptors receive hormonal signals from stomach by bloodstreams through Vegal Nerve [12]. After meal, the hormonal levels of Ghrelin get decreased [14] [15]. Neuropeptide Y (NPY) is another hormone that also participates in Hunger regulation. This powerful peptide is 36 amino acid increases the food intake. It acts on at least two receptors (G protein coupled receptors) Y1 and Y5. It is found in Para-ventricular Nucleus (PVN), Arcuate Nucleus and Ventromedial Nucleus (VMN) and the autonomic nervous system. If the secretion of Neuropeptide Y decreases then food intake decrease [7] [13]. 
Peptide YY, also known as the gut hormone, is a 36-amino-acid peptide belonging to the same family as pancreatic peptide and NPY. Two forms of PYY occur internally: PYY(1-36) and PYY(3-36). PYY triggers the four subtypes of Neuropeptide Y receptors, i.e. Y1, Y2, Y3 and Y5, which are present in the arcuate nucleus. The L-cells of the alimentary canal produce PYY, which increases the effect of Neuropeptide Y [4]. Amylin (37 amino acids) is also a peptide hormone participating in the hunger regulation process; it decreases the consumption of food [12]. Thus we can say that Amylin, secreted together with the hormone insulin, also plays an important role in long-term changes in body fat [12]. Cholecystokinin (CCK) also participates in the complex process of hunger regulation. It is the controller of short-term feeding behavior. The I-cells of the small intestine secrete the CCK hormone. This peptide also plays an important role in generating the satiety signals which are further transferred to the hypothalamus through the vagus nerve [9]. Thus all the hormones participating in the hunger regulation process have their own role in and impact on this process, which is explained by our mathematical model.

2. OBJECTIVE The objective of our paper is to develop a model which represents the mapping between the hormonal signals generated by the internal organs (stomach, pancreas, intestine and kidney) of the human body and the spikes generated and regulated for the feeling of hunger at the hypothalamus. 3. MOTIVATION Since the original studies of the hypothalamus and hunger regulation, much research has been done on the biological aspects, but very few researchers have represented their work using a mathematical model. One work represents clinical applications of the hypothalamus and pituitary gland in mathematical terms [1]; in another mathematical model, the hypothalamus, pituitary gland and adrenal axis are modeled together with the hippocampus mechanism [2] [3]. Our model simulates the role of the hypothalamus in hunger regulation with the hunger-regulating hormones Ghrelin (appetite stimulator), Neuropeptide Y, PYY and CCK, whereas the secretion of Amylin inhibits the feeding behavior. 4. Method We can divide the simulation process of hunger regulation into five different steps (Figure 1), as follows:

Figure 1: The hunger regulation process, explained in five different steps.


Step 1: Secretion of hormones from the internal organs of the human body (stomach, kidney, pancreas, etc.), represented by the function G(h), where h represents a hormone.
Step 2: Hormones dissolve in the bloodstream and the hormonal signals travel through the vagus nerve to the hypothalamic region. H(s) is the set of hormonal signals. For this step we use the Daubechies wavelet function D4(h) with fractals f(t).
Step 3: The functional receptors of each hormone generate spikes in response to the hormonal signals (received via the vagus nerve) at the hypothalamic regions. R(s) is the set of signals generated by the receptors. In this step we use the scaling function Sc(s) with an entropy measure function Em(s) for each paired set (H(s), R(s)).
Step 4: The signals (spikes) generated by the receptors, R(s), at the hypothalamic region are transferred to the nervous system (limbic system).
Step 5: The feeling of hunger is generated and the feeding process starts. After the meal, the process returns to a stable level.
The whole process of hunger regulation is explained in Figure 1.
5. Model

The mathematical model which represents the role of the hypothalamus in the hunger regulation process includes a number of functions representing the steps and parts of the method discussed above. G(h) is the binary function which represents hormone secretion by the internal organs, where h = {g, n, p, c, a}: g stands for Ghrelin, n for Neuropeptide Y, p for PYY, c for Cholecystokinin (CCK) and a for Amylin. The function G(h) is represented as follows:

G(h) = 1 for h = {g, n, p, c, a} (the hormone is secreted);
G(h) = 0 otherwise (secretion is inhibited).

H(s) is the set of hormonal signals generated from the internal organs of the human body that travel via the vagus nerve to the hypothalamic region; it is represented by the wavelet transform function D4(h) (Daubechies wavelet). The fractal function f(h) (a constant) is also connected with the wavelet transform, as a fractal is a repeated pattern displayed at every scale. The hypothalamus has functional receptors for each hormone. The receptors generate signals in response to each hormonal signal that reaches the hypothalamus through the vagus nerve. R(s) is the set of signals generated by the receptors; the ranges of effect of the individual hormonal signals are Rg, Ra, Rn, Rp and Rc. The scaling function Sc(s) generates the hypothalamic signal. Here we also use the entropy measure function Em(s), as we are dealing with physiological data [16]. The generated signals pass to the limbic system to produce the feeling of hunger. We categorize the signals generated at the hypothalamus as power signals because we consider their average power. The differential equation representing the role of the hypothalamus can then be structured in terms of dH/dt, the change in the processing of the hypothalamus H with respect to time t, and the correlation between the hormonal signals H(s) (via the vagus nerve) and the hypothalamic receptor signals R(s). Since the hypothalamic signals depend upon the hormonal signals, we investigate this dependency using the coefficient of correlation r and the coefficient of determination r2; the coefficient of determination gives the percentage of the data close to the line of best fit.
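As an illustration only (separate from the paper's MATLAB implementation), the coefficient of correlation r and the coefficient of determination r2 for a pair of sampled signals can be computed as below; the signal values are made-up placeholders, not data from the study:

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length signals."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical sampled amplitudes: hormonal signals H(s) and receptor signals R(s)
H_s = [0.2, 0.5, 0.9, 1.4, 1.8, 2.1]
R_s = [0.1, 0.4, 1.0, 1.3, 1.9, 2.2]

r = pearson_r(H_s, R_s)
r2 = r ** 2  # coefficient of determination
print(r, r2)
```

A positive r close to 1 here mirrors the model's expectation that receptor signals rise with hormonal signals; r2 then reports how much of the variation in R(s) is explained by H(s).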

Figure 2: The process of simulation of the hypothalamic hunger regulation model, where the binary function G(h) has the value 1 for the secretion of a hormone and 0 for inhibition of its secretion. Here h stands for hormone, and the value of h can be g (Ghrelin), a (Amylin), n (Neuropeptide Y), p (PYY) or c (Cholecystokinin). Each block represents the functions and steps we use to calculate the result and analyze the behavior of our mathematical model.

6. Results To observe the behavior of the developed mathematical model we use MATLAB functions such as the coefficient of correlation (corrcoef(H)) and entropy. We used mathematical functions as the scaling function Sc(s). Using a MATLAB script we analyze our mathematical model by giving values to the hormonal signals H(s). We used the 2-D wavelet transform D4 to represent our wavelet transformation. Since we use random numbers as the input hormonal signal, the correlation between the hormonal signals H(s) and the receptor signals R(s) is positive, which implies that the receptor signals increase with an increase in the hormonal signals. Figure 3 contains the graphs which depict the behavior of the signals according to the D4 wavelet function and the Haar wavelet.

Figure 3: The original signal, the hormonal signals H(s) with the Haar wavelet and the 2-D discrete wavelet (D4), and the Haar and D4 wavelet coefficients with the hypothalamic signals R(s), represented graphically.
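For illustration, a single-level Haar decomposition (one of the wavelets used in Figure 3) and an entropy measure in the spirit of Em(s) can be sketched in Python on a synthetic random signal. This is an assumed stand-in for the original MATLAB analysis, not the authors' code; the D4 case would use the longer Daubechies filters:

```python
import math
import random

def haar_dwt(signal):
    """One level of the Haar discrete wavelet transform:
    returns (approximation, detail) coefficients."""
    s = math.sqrt(2.0)
    approx = [(signal[i] + signal[i + 1]) / s for i in range(0, len(signal) - 1, 2)]
    detail = [(signal[i] - signal[i + 1]) / s for i in range(0, len(signal) - 1, 2)]
    return approx, detail

def shannon_entropy(signal, bins=8):
    """Entropy measure: Shannon entropy of a histogram of the signal values."""
    lo, hi = min(signal), max(signal)
    width = (hi - lo) / bins or 1.0  # avoid zero width for constant signals
    counts = [0] * bins
    for v in signal:
        counts[min(int((v - lo) / width), bins - 1)] += 1
    n = len(signal)
    return -sum(c / n * math.log2(c / n) for c in counts if c)

random.seed(0)
H_s = [random.random() for _ in range(64)]  # synthetic hormonal signal H(s)
approx, detail = haar_dwt(H_s)             # wavelet-domain representation
print(len(approx), len(detail))            # 32 32
print(shannon_entropy(H_s))                # entropy of the raw signal
```

Because the Haar transform is orthonormal, the signal's energy (sum of squares) is preserved across the approximation and detail coefficients, which is why the text can treat the signals as power signals in either domain.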


7. Conclusion This model explains the physiological behavior of the hypothalamus, a very important gland of the human body, in hunger regulation. To understand the whole process we also use some important factors such as entropy (uncertainty) and fractals; these factors help relate our study to real-life scenarios. 8. Future Work After analyzing this mathematical model with statistical functions, our next step is to validate the model against clinical observations, using fMRI and USG images in MATLAB. 9. References
1. Simon L. Goede (Systems Research, Oterlekerweg 4, The Netherlands), Melvin Khee-Shing Leow (Singapore Institute for Clinical Sciences, Brenner Centre for Molecular Medicine, and National University of Singapore, Singapore), Jan W.A. Smit (Department of General Internal Medicine, Radboud University Nijmegen Medical Centre, The Netherlands), Johannes W. Dietrich (Laboratory XU44, Medical Hospital I, Bergmannsheil University Hospitals, Germany): minimal mathematical model of the hypothalamus-pituitary …, Biosciences 249: 1-7, 2014, Elsevier.
2. Matthias Conrad (Department of Mathematics and Computer Science, Emory University, Atlanta, USA), Christian Hubold, Achim Peters (Medical Clinic 1, University of Lubeck, Germany) and Bernd Fischer: …-Pituitary-Adrenal system: Homeostasis by interacting posi…, …-162, 2009.
3. …-671 (2000); Simon L. Goede, Melvin Khee-Shing Leow, Jan W.A. Smit, Johannes W. Dietrich: … Pituitary-Thyroid Feedback Control: Implications of Mathematical Modeling and Consequences for Thyrotropin (TSH) and Free Thyroxin (FT4) …, …-1287, 2014, Springer.
4. Batterham RL, Bloom SR (Imperial College Faculty of Medicine, Hammersmith Campus, London): …, Sciences 994: 162-168, 2003.
5. …-Pituitary-Thyroid Axis: An Historical and …, printed in U.S.A. (2000).
6. …-786 (2012), Elsevier Inc.
7. … B 361, 1159-1185, 2006, Biological Science.
8. M.D. Klok, S. Jakobsdottir and M.L. Drent (Department of Endocrinology, VU University Medical …): …, Obesity Reviews 8, 21-34, 2007.
9. Chung Owyang, Andrea Heldsinger (Department of Internal Medicine, University of Michigan, USA): …, 17, No. 4, 338-348, October 2011.
10. …-hormone- …, …-660, 1999, PubMed.
11. Masayasu Kojima, Hiroshi Hosoda, Hisayuki Matsuo, Kenji Kangawa: ….
12. Katherine J. Pulman, W. Mark Fry, G. Trevor Cottrell, and Alastair V. Ferguson (Department of …): …roscience, 26(7): 2022-2030, 2006, Society for Neuroscience.
13. P.C. Konturek (Department of Medicine, Erlangen-Nuremberg University, Erlangen), M. Czesnikiewicz-Guzik (Institute of Stomatology, Cracow, Poland), W. Bielanski and S.J. Konturek: … HELICOBACTER PYLORI INFECTION IN NEURO- …, Journal of Physiology and Pharmacology, 57, Supp. 5, 67-81, 2006.
14. Paul J. Wellman, P. Shane Clifford, Juan A. Rodriguez and Samuel Hughes (Behavioral Neuroscience Program, Department of Psychology, Texas A&M University, College Station, TX, USA), Carla Di Francesco, Sergio Melotto, Michela Tessari and Mauro Corsi (GlaxoSmithKline Medicine Research Centre, Italy), Angelo Bifone and Alessandro Gozzi (Center for Nanotechnology Innovation @NEST, Istituto Italiano …): … pharmacological fMRI and intracranial self-stimulatio…, Society for the Study of Addiction.
15. Masayasu Kojima (Molecular Genetics, Institute of Life Science, Kurume University, Kurume, Fukuoka): …, …ciences 3(2): 92-95, 2010, JMS.
16. Ranjit A. Thuraisingham, Georg A. Gottwald (School of Mathematics and Statistics, University of Sydney): …, Science Direct (Online).

GENERATIONS OF THE WORLD WIDE WEB: FROM WEB 1.0 TO WEB 4.0 Krishan Kant Yadav Abstract The WWW is increasingly used for application-to-application communication, and the Web is entering a new phase of evolution. There has been much debate recently about what to call this new phase: the web of documents has morphed into a web of data. The semantic web embraces three stages of internet growth: the first stage, Web 1.0, as a web of information connections, and Web 2.0 as a web of people connections. Some would prefer not to name the new phase at all, while others suggest continuing to call it Web 2.0; however, this new phase of evolution has quite a different focus from what Web 2.0 has come to mean. John Markoff of the New York Times has suggested naming this third generation of the Web, Web 3.0, a web of knowledge connections. Web 4.0, a web of intelligence connections, is described as the fourth generation of the web. Keywords: WWW, Web 1.0, Web 2.0, Web 3.0, Web 4.0, Intelligence, Semantic web. Introduction The Internet has become a part of our everyday life. Web services are not new and usually take the form of an Application Programming Interface. The Internet has gone from being used plainly to surf the web to being used for security purposes by the United States military for communications. There are millions of applications on the internet, and every day we explore new applications and upgrades in the world of Web 2.0. When we talk about applications, it was not so long ago that Web 1.0 was launched, and it quickly transformed into Web 2.0. Many people are still trying to get accustomed to Web 2.0, and tech geeks are already beginning to think about Web 3.0. Web 2.0 is well established, and sites like YouTube, Flickr, and Digg have turned the internet from a static source of information into a huge, interactive digital platform. The inaugural Web, sometimes referred to as Web 1.0, was the version of the Web in existence between 1991 and 2003.
This was essentially a 'read-only' Web, somewhere we could go to access information on a kind of 'look but don't touch' basis. From 2004 onwards came the evolution of the 'read-write' Web, or Web 2.0, which, by contrast to the static nature of its predecessor, was all about interaction and collaboration. In a wave of development of blogs and social media, users were now controlling the content of the Web rather than merely observing it. The logical progression of this should therefore be the 'read-write-execute' Web, a version of the Web in which users can create and execute their own tools and software to manipulate and extract information, rather than using other people's software and websites. However, though this may indeed be one aspect of Web 3.0, use of the term seems at present to focus on the concept of enhancing the 'intelligence' of the underlying architecture of the Internet: the idea that information will be organized and identified in a way that makes searches more effective because the platform 'understands' and makes connections between pieces of data. Related works


The web was created in 1989 by Sir Tim Berners-Lee, working at CERN (the European Organization for Nuclear Research) in Geneva. A web service is a software system designed to support computer-to-computer interaction over the internet. There has been little specific research on the web generations since the web's advent; Fuchs, Christian, et al. (2010) outline three qualities of the web based on an analytical distinction.
Web 1.0
Web 1.0 was introduced as a tool for thought: a small number of writers created web pages for a large number of readers. Web 1.0 is a system of interlinked, hypertext documents accessed via the Internet.
Web 2.0
The term Web 2.0 emerged in a conference brainstorming session. It is known as the people-centric web, participative web, wisdom web and read-write web. This web is a platform where users can move beyond many of the constraints they were subject to in Web 1.0. One of the outstanding features of Web 2.0 is its support for collaboration and for gathering collective intelligence, rather than merely serving information as Web 1.0 did (San Murugesan, 2007).
Audio | Chats | E-commerce | Games
BlogPod | Collaboration | E-learning | Multimedia
Bookmarking | Community | Filesharing | Lists
Blogging | Communication | E-Mail | Knowledge
Table 1: Types of web 2.0
Web 3.0
Web 3.0 is a term whose definition is not yet settled, as several experts have given it several meanings that do not match each other, but it is often referred to as the Semantic Web. Tim Berners-Lee, the inventor of the World Wide Web, coined the term Semantic Web. The concept of Web 3.0 first reached the public in 2001, when a story co-authored by Berners-Lee appeared in Scientific American describing it as a place where machines can read Web pages much as humans read them, e.g. web-connected bathroom mirrors that can read the news coming through on the web. The expression Web 3.0 is, of course, a logical progression from the term Web 2.0. Following the pattern of Web 2.0, various spoken forms are possible, such as 'Web three point oh' and 'Web three (point) zero'. Web 3.0 is defined as the creation of high-quality content and services produced by gifted individuals using Web 2.0 technology as an enabling platform. The data will come from the users and the web will adjust to meet the needs of the user.


Fig 1: Web 3.0 Adoption Cycle
There are several definitions of the web, but usually Web 3.0 is described as a term coined with different meanings to describe the evolution of web usage and interaction along several separate paths. A precise definition of Web 3.0 is difficult to pin down, but most descriptions agree that a fundamental characteristic of it is the ability to make connections and infer meaning; essentially, the Web is going to become more 'intelligent'. This has led to the coining of expressions such as the semantic Web, or the intelligent Web, in reference to Web 3.0.
Web 3.0 Expanded Definitions. We propose expanding the above definition of Web 3.0 to be a bit more inclusive. There are actually several major technology trends that are about to reach a new level of maturity at the same time. The simultaneous maturity of these trends is mutually reinforcing, and collectively they will drive the third-generation Web. The current web is a web of documents, in some ways like a global file system, whose most important problems include the following: the web of documents was designed for human consumption, its primary objects are documents, and its links run between documents; the semantics of content and links are implicit, and the degree of structure between objects is low. Christian Bizer, Tom Heath & Tim Berners-Lee (2009).
Web 2.0 | Web 3.0
Read/Write Web | Portable Personal Web
Communities | Individuals
Sharing Contents | Consolidating Dynamic Content
Blogs | Life Stream
AJAX | RDF
Wikipedia, Google | DBpedia, iGoogle
Tagging | User Engagement
Table 2: Comparison of web 2.0 and 3.0


Fig 2: Web 1.0, Web 2.0 and Web 3.0
Tim Berners-Lee proposed a layered architecture for the semantic web that is often represented using a diagram, with many variations since; a typical representation of this diagram is given by Jane Greenberg, Stuart Sutton & D. Grant Campbell (2003).

Fig 3: Semantic Web Layered Architecture
The layers of the semantic web architecture are briefly described as follows:
Unicode and URI: Unicode is used to represent any character uniquely, whatever language the character is written in, and Uniform Resource Identifiers (URIs) are unique identifiers for resources of all types (Haytham Al-Feel, M.A. Koutb & Hoda Suoror, 2009).

Extensible Markup Language: XML supports mixing different elements from various vocabularies to perform a specific function. XML Schema ensures that received information matches the sent information when two applications at this level exchange information with each other (Haytham Al-Feel, M.A. Koutb & Hoda Suoror, 2009).
Resource Description Framework: RDF is a simple data model that uses URIs to identify web-based resources and describes relationships between the resources in terms of named properties and values.
RDF Schema: provides a predefined, basic type system for RDF models. It describes classes and properties of the resources in the basic RDF model, and provides a simple reasoning framework to infer types of resources.
Ontology: The ontology layer describes properties and the relations between properties and different classes. An ontology can be defined as a collection of terms used to describe a specific domain, with the ability of inference.
Logic and Proof: This layer sits on top of the ontology structure to make new inferences through an automatic reasoning system. Agents are able to deduce whether particular resources satisfy their requirements by using such reasoning systems (Oktie Hassanzadeh, 2008).
Trust: The last layer of the stack addresses trust, in order to provide an assurance of the quality of the information on the web and a degree of confidence in the resource providing this information.
Ubiquitous Connectivity: broadband adoption; mobile Internet access; mobile devices.
Network Computing: software-as-a-service business models; web services interoperability (such as Amazon S3).
Open Technologies: open APIs and protocols; open data formats; open-source software platforms; open data (Creative Commons, Open Data License, etc.).
Open Identity

Open identity (OpenID); open reputation; portable identity and personal data (for example, the ability to port your user account and search history from one service to another).
The Intelligent Web: Semantic Web technologies (RDF, OWL, semantic application platforms, and statement-based data stores such as associative databases); distributed databases (wide-area distributed database interoperability enabled by Semantic Web technologies); intelligent applications (natural language processing, machine learning, machine reasoning, autonomous agents).
Advantages of Web 3.0
A Web where the context of content is defined by data: a web capable of reading and understanding context. When the web can understand the content, it can better satisfy the requests of users. Web 3.0 is a new version of the World Wide Web which is still evolving; it allows data to come from the user end and is considered by experts to be the semantic web. Web 3.0 has dynamic effects on design. It can generate greater excitement in people and can persuade its audience to make a purchase. It provides great visuals and outstanding colour-tone visibility; buttons, corners and visuals affect the theme and feel of a web page, which in turn influences purchasing decisions.
Easy access: The most important benefit of Web 3.0 is the move towards being able to access data from anywhere, mainly driven by the heavy usage of smartphones and cloud applications. Web 3.0 provides techniques to access global data easily from anywhere; you don't need a computer system to access this data, as you can also view files, images, videos, or any other information on your smartphone. Nowadays almost all business web design is created keeping this feature in mind. Web 3.0 browsers will learn likes and dislikes and function as a trusted advisor, mentor and personal assistant, less like a search engine.
Information is categorized and presented in a visually improved manner that enhances interaction, analysis, intuition and search functions. Web 3.0 is about openness: by opening application programming interfaces (APIs), protocols, data formats, open-source software platforms and open data, you open up possibilities for creating new tools. Openness can result in identity theft; Web 3.0 attempts to remedy this through open identity, OpenID, open reputation, and the ability to roam with portable identity and personal data. By opening up access to information, Web 3.0 applications can run on any device, computer or mobile phone. The applications can be fast and customizable. Unlike Web 2.0, where programs such as Facebook and MySpace exist in separate silos, Web 3.0 allows users to roam freely from database to database, program to program.
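As a toy illustration of the semantic-web idea running through this section — data stored as machine-readable statements rather than documents — the sketch below models RDF-style subject-predicate-object triples with a tiny pattern-matching query. All names here (the `ex:` identifiers, the `query` helper) are invented for the example and are not part of any real RDF library:

```python
# A toy, in-memory "triple store": facts as (subject, predicate, object),
# the core data model of RDF on which the semantic web is built.
triples = [
    ("ex:TimBL", "ex:invented", "ex:WWW"),
    ("ex:WWW",   "ex:hasLayer", "ex:RDF"),
    ("ex:RDF",   "ex:usedBy",   "ex:Web3.0"),
]

def query(s=None, p=None, o=None):
    """Match triples against a pattern; None acts as a wildcard."""
    return [t for t in triples
            if (s is None or t[0] == s)
            and (p is None or t[1] == p)
            and (o is None or t[2] == o)]

# "What did ex:TimBL invent?"
print(query(s="ex:TimBL", p="ex:invented"))  # [('ex:TimBL', 'ex:invented', 'ex:WWW')]
```

Because every fact shares one uniform shape, applications can traverse and combine data from different sources without custom parsing — the property that lets Web 3.0 users "roam freely from database to database".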


Web 3.0 promotes the incorporation of some highly advanced features in websites, such as flash, animation, intensive graphics, etc. It may even provide its own task manager, telling how much memory, CPU and network usage a website needs. The Ever-Present Web 3.0: not so much a prediction of what the Web 3.0 future holds as the catalyst that will bring it about, the ever-present Web 3.0 has to do with the increasing popularity of mobile Internet devices and the merger of entertainment systems and the Web.
Web 4.0
Web 4.0 is still an underground idea in progress and there is no exact definition of how it will be. Web 4.0 is also known as the symbiotic web. The dream behind the symbiotic web is interaction between humans and machines in symbiosis. It will be possible to build more powerful interfaces, such as mind-controlled interfaces, using Web 4.0. Web 4.0 will be the read-write-execution-concurrency web. It achieves a critical mass of participation in online networks that deliver transparency, governance, distribution, participation and collaboration into key communities such as industry, political, social and other communities. Web 4.0 is still in the development stage; although there is no exact idea about Web 4.0 and its technologies, it is obvious that the web is moving toward using Artificial Intelligence (A.I.) to become an intelligent web. This will parallel the human brain and implies a massive web of highly intelligent interactions (Dan Farber, 2007).
Fig 4: Generations of web 1.0 to web 4.0
Drawbacks

Fig 4: Generations of web 1.0 to web 4.0 Drawbacks 22

Less advanced computers won't be able to handle it. Web 1.0 websites will seem that much more obsolete. Technology is not entirely ready for it yet. A lot of money has been spent by the government on research for it. It is very complicated.
Conclusion
This paper provided an overview of the evolution of the web: Web 1.0 through Web 4.0 were discussed, and the characteristics of the generations were introduced and compared. Web 3.0 will not be part of our lives immediately, and users will have to rely on Web 2.0 for some more time. This is mainly because the semantic web technology is in its infancy and will need a considerable amount of time to mature. The success of Web 3.0 is not assured; however, it seems likely to make the internet a better, even life-altering, experience for its consumers. It took over 10 years to make the transition from the original web to Web 2.0, and the tech world should witness Web 3.0 sometime around 2015. Web 3.0 will be more connected, open, and intelligent, with semantic Web technologies, distributed databases, natural language processing, machine learning, machine reasoning, and autonomous agents. Future work will focus on deeper and much broader research on the semantic web and its implications.
References

1. Christian Bizer, Tom Heath & Tim Berners-Lee: Linked Data - The Story So Far, Journal of Semantic Web and Information Systems (2009).
2. …/semantic-web-30-to-the-webos-40/4999/
3. Fuchs, Christian, Wolfgang Hofkirchner, Matthias Schafranek, Celina Raffl, Marisol Sandoval, et al.: …co-operation. Towards an understanding of web 1.0, 2.0, … (2010).
4. Haytham Al-Feel, M.A. Koutb & Hoda Suoror: …, Semantic Web 2009, ISSN 2070-3740.
5. Bulletin of the American Society for Information Science and Technology, Volume 29, Issue 4, pages 16-18.
6. The Semantic Web: web 3.0. Available at http://blog.iia.ie/2007/the-semantic-web3.0/ (accessed on 12/08/14).
7. http://www.cs.toronto.edu/~oktie/slides/web-of-data-intro.pdf
8. Tim Berners-Lee, …, 17th International World Wide Web Conference.
9. http://www.unicode.org/
10. http://lifeboat.com/ex/web.3.0
11. http://www.innovins.com/advantages_of_web_3.php
12. http://www.javajazzup.com/issue3/page61.shtml
13. http://www.macmillandictionary.com/buzzword/entries/web3.html
14. http://www.marcuscake.com/economic-development/intenet-evolution/
15. http://www.technology-digital.com/web20/what-will-web-30-be-all-about

ON ASSESSMENT OF IT INFRASTRUCTURE FUNCTIONALITY: A STUDY BASED ON VIEW OF MCD OFFICIALS Vilender Kumar Dr. Sanjay Kumar Gupta

Abstract MCD is a big service provider organization in Delhi and is becoming more and more dependent on data for timely inferences and decision making to provide citizen-oriented services. Processing the very voluminous data in MCD requires not only substantial IT infrastructure but also a lot of time, so processing large volumes of data using new technologies is necessary. In this work, an effort has been made to understand the IT infrastructure and its functionality by obtaining feedback from MCD officials who actually use the software applications. The objective is to determine whether the infrastructures used by different MCD departments work like an ERP system. Keywords: IT infrastructure, ERP, Software Application, Database Introduction Expanding markets and rising customer expectations are some of the big challenges faced by organizations. Umble et al. [15] described intense business rivalry that increases the pressure on many business organizations to lower the cost of their entire services, shorten throughput times, drastically reduce inventories, expand product and service choices, provide more reliable delivery dates and better customer service, improve quality, and efficiently coordinate global demand, supply and production. The Municipal Corporation of Delhi (MCD) is a large service provider organization and is always interested in bringing people closer through higher-quality services and sales volume. For that it uses an Employee Information System for managing the payroll of its employees, a Hospital Information System for managing its hospitals across Delhi, software for an Engineering Information System, a Solid Waste Transportation Management System, a Property Tax Information System, a Financial Management System, a computerized system for the Gazipur Abattoir, and many more [6].
These MCD applications each have their own software system to meet their requirements and are working nicely, but integration of such systems poses problems in data flow and its manipulation [7,8,10]. This has resulted in a fragmentation of information, as all of the information is stored separately on different systems in business units, sometimes spread across the organization. However, uses of modern technologies are increasing and are common in our daily, professional and business lives. Therefore, it is the demand of the time for MCD to make services speedy and response times quick, and to bring significant benefits to employees, customers and, most importantly, the management, by making all the required information available in real time and hence allowing quick decision making [5]. Thus, the IT infrastructure of MCD connects it with citizens, businesses and other public agencies, as well as with its internal business processes and employees [13]. There are many advantages of an ERP-enabled system: it improves business performance, reduces business cycle time, increases business ability, supports inventory management and order fulfillment, supports business growth requirements, is available in multiple languages and currencies, provides a flexible, integrated, real-time decision support system for the various departments within an organization, eliminates data redundancy and inflexibility to change with future technologies, and supports client-server/open-system technology environments [1,6,7,8,9,13]. In this work an effort has been made to understand the perception of MCD officials by obtaining feedback from those who actually use the software applications. The objective is to know whether the IT infrastructures of MCD departments work like an ERP system and, moreover, to explore further possibilities to make MCD services equivalent to an ERP software implementation. The outline of this paper is as follows. We begin with the introduction of the problem of the study of MCD in section 1. Section 2 contains a brief review of earlier work related to IT infrastructures and their functionality aspects used in organizations to implement ERP. The present and related IT infrastructures which are used by different MCD departments are analyzed, interpreted and illustrated in section 4. Section 5 presents statistical test results and significant findings for the stated hypotheses, assessing whether the present departments work like an ERP system so as to make MCD a premier service providing organization. In Section 6, we give our concluding remarks. 2.
2.0 Review of Literature

In the past, researchers have done a great deal of work in the area of ERP implementation. Gupta, Kumar and Chhabra [6] highlighted issues involved with MCD functionality and ERP software, and identified the basic motivational requirements and concerns of an ERP implementation for MCD to enable improved services. Gupta and Kumar [7] investigated the present information system of MCD for information sharing; their results show that the present MCD information system does not follow a single, integrated database. Gupta and Kumar [8] examined the information sharing capabilities of MCD and concluded that there is a lack of information sharing among its various departments, preventing the flow of real-time information needed to offer quality online services to citizens. Gupta and Kumar [9] considered critical success factors (CSF) and designed an SDM model of an ERP implementation for MCD; this acts as a reference model to help avoid previous mistakes and to minimize the ERP failure risks associated with MCD. Gupta and Kumar [10] focused on the integrated database management system in MCD and analyzed the benefits of such a system. Batini et al. [2] described database schema integration as the activity of integrating the schemas of existing or proposed databases into a global, unified schema; the objective of their research was to provide a unifying framework for the schema integration problem by analyzing the strengths and weaknesses of individual methodologies, together with general guidelines for future improvements and extensions of database integration. Reddy et al. [14] stressed the importance of integrating local databases with global databases, which requires transforming existing local databases to the global level through a four-layered procedure emphasizing total schema integration and virtual integration of local databases. Gable et al. [4] aimed to identify the limitations of current ERP knowledge management practices and to introduce best-practice methods for ERP implementation. Schoenefeld and Vering (2000) summarized the experiences, benefits and advantages of integrating ERP with a standalone CSCW system, intending to guide the reader towards easy data sharing across the world. Kumar et al. [11] found that each adopting organization has a distinct set of objectives and challenges for a systems project like an ERP implementation, yet also found many similarities in motivations, concerns and strategies across organizations. Luo and Strong [12] presented a useful way for managers to identify feasible customization options for ERP implementation in their organizations. Dillibabu et al. [3] designed and developed an ERP model with the potential to manage large amounts of patient information, improving healthcare services for underprivileged sections of Indian rural society.
3.0 Experiences with MCD Site Visit

During the site visit, interactions with the concerned MCD officials brought out their past experiences: the Radio Frequency Identification Device (RFID) based handling needs to be made more efficient for dealing with the huge number of on-line customer requests received in a day, and some of the hardware requires up-gradation. The employees are clued-up about the advantages of system integration and the benefits of ERP, and adequate manpower is available to manage IT. Under high load (more than about 2,000 request calls), server software such as the RMIS, the File Tracking System (FTS) and the other applications mentioned in Section 1.0 gets slow and sometimes hangs. Looking at these issues, there is now an urgent need for technological advancement according to established best standards.

4.0 Hypothesis

A hypothesis is a precise, testable forecast about what you expect to happen in your study; researchers may explore a number of different factors to determine whether the prediction is right or wrong. In order to test the perception of MCD officials about the current MCD IT-related infrastructure, hypothesis H01 is derived. The perceptions of MCD officials who practically use the software applications were collected through a questionnaire, under the assumption that they are neither ERP experts nor deeply knowledgeable about ERP.


H01: The present MCD software, IT-related infrastructure and its functionality work like an ERP software.

5.0 Analysis and Interpretation

Table 1.0 below represents the frequency distribution of responses of MCD officials related to the statement "Present MCD quality control mechanism to improve services" (V12). One of the main objectives of maintaining databases is to increase efficiency and to provide better-quality services to citizens. In this work an effort has been made to understand the perception of MCD officials about this statement. The frequency distribution of responses is shown in Table 1.0 and Figure 1.0.

Table 1.0: Frequency distribution of the responses

Responses            Frequency Percent   Cumulative Percent
Strongly Disagree         17.1%                17.1%
Disagree                  34.3%                51.4%
Somewhat Disagree         14.3%                65.7%
Neutral                    8.6%                74.3%
Somewhat agree             5.7%                80.0%
Agree                      5.7%                85.7%
Strongly Agree            14.3%               100.0%

Figure 1.0: Chart of the distribution in Table 1.0
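The percentages in Table 1.0 correspond to a sample of 35 respondents. The short sketch below assumes the raw counts shown (inferred from the reported percentages, since the paper publishes only percentages) and reproduces the percent and cumulative percent columns:

```python
# Inferred raw counts for Table 1.0 (n = 35); the counts themselves are an
# assumption reverse-engineered from the reported percentages.
counts = [
    ("Strongly Disagree", 6),
    ("Disagree", 12),
    ("Somewhat Disagree", 5),
    ("Neutral", 3),
    ("Somewhat agree", 2),
    ("Agree", 2),
    ("Strongly Agree", 5),
]

n = sum(c for _, c in counts)                  # total respondents
cumulative = 0
rows = []
for label, c in counts:
    cumulative += c
    pct = round(100 * c / n, 1)                # percent column
    cum_pct = round(100 * cumulative / n, 1)   # cumulative percent column
    rows.append((label, pct, cum_pct))
    print(f"{label:18s} {pct:5.1f}% {cum_pct:6.1f}%")
```

Each recomputed row matches Table 1.0, which confirms the sample size of 35 implied by the percentages.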

The result indicates that the majority of respondents, around sixty-six per cent, disagree with the statement, while around twenty-five per cent of MCD officials agree with it. It was observed during the research that, being aware of information technology but not recognizing the worth of an ERP system, MCD is satisfied with the existing state of its quality control mechanism. Table 1.1 and Figure 1.1 below represent the frequency distribution of responses of MCD officials related to the statement "MCD information systems have online service monitoring platform" (V13).

Table 1.1: Frequency distribution of the responses

Responses            Frequency Percent   Cumulative Percent
Strongly Disagree          8.6%                 8.6%
Disagree                  28.6%                37.1%
Somewhat Disagree         17.1%                54.3%
Neutral                   11.4%                65.7%
Somewhat agree            11.4%                77.1%
Agree                     17.1%                94.3%
Strongly Agree             5.7%               100.0%

Figure 1.1: Chart of the distribution in Table 1.1

With an online service monitoring platform, Delhi citizens could check their service status online, provided the databases of the different MCD departments are available online; this can improve the service quality of MCD. Hence, an online service monitoring system of MCD must be available 24x7 for all citizens. In this work, MCD officials were asked to give their perception of the statement "MCD information systems have online service monitoring platform". The result indicates that fifty-four per cent of MCD officials disagree with the statement, indicating that MCD does not have an online service monitoring platform. However, thirty-four per cent of MCD officials agree, because a few MCD services, such as hospital management, school management and date-of-birth certificate management, do have an online service monitoring platform. Table 1.2 and Figure 1.2 below represent the frequency distribution of responses of MCD officials related to the statement "MCD information & IT infrastructures support inter-department communications" (V14). It was observed during the MCD site visit that each department has its own database system containing only the records of its own citizen-service data. Such isolated database management systems cannot exchange citizen details across different departments within MCD because their databases are not integrated.

Table 1.2: Frequency distribution of the responses

Responses            Frequency Percent   Cumulative Percent
Strongly Disagree         28.6%                28.6%
Disagree                  14.3%                42.9%
Somewhat Disagree         20.0%                62.9%
Neutral                    8.6%                71.4%
Somewhat agree            11.4%                82.9%
Agree                      5.7%                88.6%
Strongly Agree            11.4%               100.0%

Figure 1.2: Chart of the distribution in Table 1.2

The research shows that more than sixty-two per cent of the MCD officials disagree with the statement, while around twenty-eight per cent agree, because they think that moving files manually to obtain citizen information is still effective inter-departmental communication within MCD. Table 1.3 below represents the frequency distribution of responses of MCD officials related to the statement "MCD all IT hardware, software and IT infrastructure is regularly upgraded" (V15). Business organizations regularly upgrade their IT hardware, software and related communication infrastructure to remain competitive in their respective business domains.

Table 1.3: Frequency distribution of the responses

Responses            Frequency Percent   Cumulative Percent
Strongly Disagree         11.4%                11.4%
Disagree                  28.6%                40.0%
Somewhat Disagree         14.3%                54.3%
Neutral                    8.6%                62.9%
Somewhat agree            28.6%                91.4%
Agree                      5.7%                97.1%
Strongly Agree             2.9%               100.0%

Figure 1.3: Chart of the distribution in Table 1.3

MCD also realizes the importance of such resources but has somehow not been able to implement the related IT infrastructure according to citizen requirements. During the MCD site visit it was observed that many senior MCD officers are not at ease with the workings of the available computer systems. More than fifty-four per cent of the MCD officials disagree with the statement "MCD all IT hardware, software and IT infrastructure is regularly upgraded", while more than thirty-seven per cent agree with it, because they are not fully aware of the latest innovations in the field of information technology that support service industries. Table 1.4 below represents the frequency distribution of responses of MCD officials related to the statement "Present MCD software is built upon best business practice as described by CMM level" (V16).

Table 1.4: Frequency distribution of the responses

Responses            Frequency Percent   Cumulative Percent
Strongly Disagree         34.3%                34.3%
Disagree                  20.0%                54.3%
Somewhat Disagree         20.0%                74.3%
Neutral                    5.7%                80.0%
Somewhat agree            11.4%                91.4%
Agree                      5.7%                97.1%
Strongly Agree             2.9%               100.0%

Figure 1.4: Chart of the distribution in Table 1.4

Certain quality standards need to be maintained while developing any new software product. Many IT software development organizations, such as SAP, Oracle, TCS and Infosys, comply with standards that certify the quality of their software products according to their software development processes. The Capability Maturity Model (CMM) level is one such software quality standard. At the time of the research study it was observed that some of the software products used by MCD are not built according to CMM standards. More than fifty-one per cent of MCD officials disagree with the statement, while more than forty-two per cent were found to agree with it, because they are not aware of the CMM standard of software development. Table 1.5 below represents the frequency distribution of responses of MCD officials related to the statement "Present software capable to do MCD stock management" (V17). Stock management is a highly specialized job that matches forecast demand with supply, keeping a business organization active in the business world. Since MCD is divided into three zones, it has many citizen service centres in Delhi, and checking stock availability for many of its services, such as blood bank management, school admission management and MCD inventory management, requires a dynamic stock information management system. More than fifty-four per cent of MCD officials disagree with the statement "Present software capable to do MCD stock management", while thirty-six per cent of the MCD officials agree with it, because they feel that the present MCD software system can handle stock management, either through the file-based system or through the software management system.

Table 1.5: Frequency distribution of the responses

Responses            Frequency Percent   Cumulative Percent
Strongly Disagree         42.9%                42.9%
Disagree                  25.7%                68.6%
Somewhat Disagree          8.6%                77.1%
Neutral                    8.6%                85.7%
Somewhat agree             2.9%                88.6%
Agree                      2.9%                91.4%
Strongly Agree             8.6%               100.0%

Figure 1.5: Chart of the distribution in Table 1.5


Table 1.6 below represents the frequency distribution of responses of MCD officials related to the statement "Software programming languages used in MCD are compatible in a web-enabled environment" (V18). Many software languages, such as Java and XML, are web-enabled and easily compatible with a web-enabled environment.

Table 1.6: Frequency distribution of the responses

Responses            Frequency Percent   Cumulative Percent
Strongly Disagree         17.1%                17.1%
Disagree                  14.3%                31.4%
Somewhat Disagree         17.1%                48.6%
Neutral                   17.1%                65.7%
Somewhat agree             8.6%                74.3%
Agree                      5.7%                80.0%
Strongly Agree            20.0%               100.0%

Figure 1.6: Chart of the distribution in Table 1.6

Software built with web-enabled languages can easily be accessed over the internet, and the growing number of internet users across India has opened many new business opportunities in this sector. In this research study, an effort was made to find out the perception of MCD officials about the statement "Software programming languages used in MCD are compatible in a web-enabled environment". More than forty-eight per cent of MCD officials disagreed with the statement, while around thirty-four per cent agreed, because many MCD services are accessible online through the internet. Table 1.7 below represents the frequency distribution of responses of MCD officials related to the statement "All the MCD citizen databases can be accessed online" (V19). Many database software products, such as Oracle and Sybase, are web-enabled and compatible with web-enabled programming languages like Java and XML; these too open many new business opportunities in this sector.

Table 1.7: Frequency distribution of the responses

Responses            Frequency Percent   Cumulative Percent
Strongly Disagree         25.7%                25.7%
Disagree                   8.6%                34.3%
Somewhat Disagree         25.7%                60.0%
Neutral                    5.7%                65.7%
Somewhat agree            17.1%                82.9%
Agree                     14.3%                97.1%
Strongly Agree             2.9%               100.0%

Figure 1.7: Chart of the distribution in Table 1.7

In this research study, as shown in Table and Figure 1.7, more than sixty per cent of MCD officials were found to disagree with the statement, while around thirty-four per cent agreed, because many MCD software systems already provide online service facilities such as date-of-birth certificates and booking of community halls. Table 1.8 below represents the frequency distribution of responses of MCD officials related to the statement "MCD software system works like an ERP software" (V20). Enterprise Resource Planning (ERP) software integrates the database systems of different departments into a single integrated system, allowing different departments to share dynamic information in real time.

Table 1.8: Frequency distribution of the responses

Responses            Frequency Percent   Cumulative Percent
Strongly Disagree         40.0%                40.0%
Disagree                  17.1%                57.1%
Somewhat Disagree         20.0%                77.1%
Neutral                    5.7%                82.9%
Somewhat agree             8.6%                91.4%
Agree                      2.9%                94.3%
Strongly Agree             5.7%               100.0%

Figure 1.8: Chart of the distribution in Table 1.8

ERP systems bring business transparency, improve human capital management, control inventory management and improve work efficiency within any organization, and many private, public and government organizations already enjoy the success of ERP implementations in their business domains. An effort was made to find out the perception of MCD officials about the statement "MCD software system works like an ERP software". More than seventy-seven per cent of the MCD officials disagree with the statement, while around seventeen per cent of the MCD officials agree with it, because they thought that the present MCD database is integrated.

6.0 Result and Discussion

To test the perception of MCD officials on whether the present MCD software and IT-related infrastructure and its functionality work like ERP software, a one-sample T-test is applied; the hypothesis tested in the research study is H01 as stated above. Enterprise Resource Planning (ERP) combines all the database requirements of an organization in a single, integrated software package running on a single database, so that the various departments can more easily share information and communicate with each other on a real-time basis.

Table 1.9: One-sample T-test for issues related with present MCD software and IT-related infrastructure compared with ERP software

Statement                                                                  Mean    Std. Dev.  T-Statistic  P-Value
Present MCD quality control mechanism to improve services (V12)            3.2571  2.04857      -2.145      .039
MCD information systems have online service monitoring platform (V13)      3.6286  1.83248      -1.199      .239
MCD information & IT infrastructures support inter-department
communications (V14)                                                       3.2286  2.05921      -2.216      .033
MCD all IT hardware, software and IT infrastructure is regularly
upgraded (V15)                                                             3.4286  1.68533      -2.006      .053
Present MCD software built upon best business practice as described
by CMM level (V16)                                                         2.6857  1.74510      -4.456      .000
Present software capable to do MCD stock management (V17)                  2.4571  1.89958      -4.805      .000
Software programming languages used in MCD are compatible in a
web-enabled environment (V18)                                              3.8286  2.12112      -0.478      .636
All the MCD citizen databases can be accessed online (V19)                 3.4571  1.99031      -1.614      .116
MCD software system works like an ERP software (V20)                       2.5714  1.80336      -4.687      .000

The results shown in Table 1.9 indicate that all probability values, except those for statement two (MCD information systems have online service monitoring platform), statement seven (software programming languages used in MCD are compatible in a web-enabled environment) and statement eight (all the MCD citizen databases can be accessed online), are less than the five per cent level of significance. Hence, with a ninety-five per cent confidence level, it can be concluded that for statement one (present MCD quality control mechanism to improve services), statement three (MCD information & IT infrastructures support inter-department communications), statement four (MCD all IT hardware, software and IT infrastructure is regularly upgraded), statement five (present MCD software built upon best business practice as described by CMM level), statement six (present software capable to do MCD stock management) and statement nine (MCD software system works like an ERP software), the MCD officials significantly disagree, because they know that the present MCD database management system is not working like an ERP software; hypothesis H01 is therefore rejected. Due to the lack of an integrated online service monitoring platform, many MCD services suffer from substandard service quality, and the citizen-related IT infrastructures used by the different MCD departments are not built to the CMM software development standard.
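The one-sample T-test values in Table 1.9 can be reproduced from the frequency data. The sketch below assumes the raw Likert counts for statement V12 inferred from Table 1.0 (the paper reports only percentages), scores the responses 1 to 7, and tests them against the neutral value 4 using only the Python standard library:

```python
import math
import statistics

# Likert scores 1..7 with inferred counts for V12 (n = 35); the counts are an
# assumption reverse-engineered from the percentages reported in Table 1.0.
counts = [6, 12, 5, 3, 2, 2, 5]
data = [score for score, c in enumerate(counts, start=1) for _ in range(c)]

n = len(data)
mean = statistics.mean(data)             # sample mean
sd = statistics.stdev(data)              # sample standard deviation
mu0 = 4                                  # neutral midpoint of the 7-point scale
t = (mean - mu0) / (sd / math.sqrt(n))   # one-sample t statistic, df = n - 1

print(f"n={n}  mean={mean:.4f}  sd={sd:.5f}  t={t:.3f}")
```

With df = 34, a statistic of t = -2.145 corresponds to a two-tailed p-value of about .039, matching the first row of Table 1.9.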

Figure 1.9: Chart of the T-test results in Table 1.9

In the case of statement two (MCD information systems have online service monitoring platform), statement seven (software programming languages used in MCD are compatible in a web-enabled environment) and statement eight (all the MCD citizen databases can be accessed online), the officials were found to neither disagree nor agree, i.e. they are neutral with respect to these statements.

7.0 Conclusion

MCD is largely a core service organization delivering various municipal services, but in the modern competitive world only good services are recognized and appreciated by customers. MCD therefore requires a drastic transition from legacy systems to the latest software technology, such as ERP, to deliver maximum benefits to its citizens. The feedback of MCD officials has been compiled, and the results show that the present MCD software and its functionality are not working like an ERP software. Thus, a better IT infrastructure is required to integrate whole workflows seamlessly, providing speedier and more proficient services to citizens, employees and administrators.

References

1. Abdelghaffar H and Abdel Azim R H (2010), Significant factors influencing ERP implementation in large organizations: Evidence from Egypt, European, Mediterranean & Middle Eastern Conference on Information Systems 2010, April 12-13, 2010, Abu Dhabi, UAE.
2. Batini C, Lenzerini M and Navathe S B (1986), A Comparative Analysis of Methodologies for Database Schema Integration, ACM Computing Surveys, Vol. 18, No. 4, December 1986.
3. Dillibabu R, Ramya R and Krithika S (2006), Development of an ERP Model for Indian Rural Health Care Industry: A Case Study, Proceedings of the 7th Asia Pacific Industrial Engineering and Management Systems Conference 2006, 17-20 December 2006, Bangkok.
4. Gable Guy G, Scott Judy E and Davenport Tom D (1998), Cooperative ERP Life-cycle Knowledge Management, Proceedings of the Ninth Australasian Conference on Information Systems, 29 September - 2 October 1998, Sydney, Australia, pp. 227-240.
5. Gefen D and Ragowsky A (2005), A Multi-Level Approach to Measuring the Benefits of an ERP System in Manufacturing Firms, Information Systems Management, 22(1), pp. 18-25.
6. Gupta Sanjay Kumar, Kumar Vilender and Chhabra Susheel (2012), A Window View on Prospects of ERP Implementation in Municipal Corporation of Delhi, International Journal of Emerging Technology and Advanced Engineering (www.ijetae.com), Vol. 2, Issue 9, September 2012, pp. 252-260.
7. Gupta Sanjay Kumar and Kumar Vilender (2014), A Study on MCD Officials' Perception Towards Present Information Sharing Potential in MCD, Delhi, International Journal of Emerging Technology and Advanced Engineering (IJETAE), Vol. 4, Issue 6, June 2014, pp. 730-735.
8. Gupta Sanjay Kumar and Kumar Vilender (2014), Database Integration and Quality Services of MCD: A Study Based on MCD Officials' Perception, International Journal of Recent Development in Engineering and Technology (www.ijrdet.com), Vol. 2, Issue 6, June 2014, pp. 27-30.
9. Gupta Sanjay Kumar and Kumar Vilender (2014), On Design of SDM Model of an ERP Implementation for MCD, International Journal of Recent Development in Engineering and Technology (www.ijrdet.com), Vol. 3, Issue 4, October 2014, pp. 54-61.
10. Gupta Sanjay Kumar and Kumar Vilender, Some observations on integrated database system of MCD, communicated.
11. Kumar V, Maheshwari B and Kumar U (2003), An investigation of critical management issues in ERP implementation: empirical evidence from Canadian organizations, Technovation, available at: www.elsevier.com/locate/technovation.
12. Luo and Strong (2004), A Framework for Evaluating ERP Implementation Choices, IEEE Transactions on Engineering Management, Vol. 51, No. 3, August 2004.
13. Mission Mode Project Name, Available.
14. Reddy M P, Prasad B E, Gupta A and Reddy M G (1994), Methodology for Integration of Heterogeneous Databases, IEEE Transactions on Knowledge and Data Engineering, Vol. 6, No. 6, December 1994.
15. Umble Elisabeth J, Haft Ronald R and Umble M Michael (2003), Enterprise resource planning: Implementation procedures and critical success factors, European Journal of Operational Research, 146 (2003), pp. 241-257, available at: Sciencedirect.com.


E-Signatures and Their Network Security Issues

Dr. Dharmesh Kumar, Dr. Dharmendra Badal

Abstract: Rapid development in the field of communication technologies has increased the need for online security and authentication of information exchange. e-Signatures are very useful both for stored data and for data that flows across the World Wide Web. This paper introduces types of e-Signatures, such as the Directed Signature, Special Signature, Group Signature, Blind Signature and Digital Signature schemes, and also presents the associated network security issues.

Keywords: Directed Signatures, Special Signature, Group Signature, Blind Signature, Digital Signature.

1. Introduction

The electronic transfer of information from one user to another may take the form of documents, data files or e-mail messages. The shift from paper-based to electronic transfer of information has raised many anxieties, especially about the authenticity, integrity, confidentiality and non-repudiation of electronic documents, and development in the field of e-transformation has increased the need for online security and authentication. Many technologies are being developed to provide online authentication. The major concern in e-transformation is the need to replace the hand-written signature with an electronic signature that can provide sender authentication, integrity and non-repudiation, i.e. confirming the identity of the sender, ensuring that the message has not been tampered with, and ensuring the sending and receiving of the message by the parties claiming to have sent and received it. Traditional electronic transfer of information, on its own, does not fulfil these basic requirements. The integration of e-Signatures may easily provide a solution to this problem. Diffie and Hellman were the first to introduce the concept of digital signatures.

2. Network Security Issues

As we know, there is no single security policy for all networks and information systems; however, a few common issues concern most organizations and companies that manage information. We discuss some of these issues here.

Authentication: authentication mechanisms help establish proof of identity. The authentication process ensures that the source of an electronic message or document is correctly identified; it is a mechanism by which the sender's identity is expressed and the receiver of a transaction or message can be confident of the identity of the sender. Authentication verifies the identity of a user or a service using certain encrypted information passed from the sender to the receiver. For example, assume that a person A wants to send an envelope containing a check worth $100 to another person B. B would like to be assured that the check has indeed come from A, and not from someone else posing as A (in which case it could be a fake check). This is authentication.

Integrity: integrity of a message means that the received message is exactly the same as the message transmitted by the sender; in other words, a message that has not been altered in any way, intentionally or unintentionally, during transmission is said to have maintained its integrity. In the same example, A and B will further want to make sure that no one can tamper with the contents of the check (i.e. amount, date, signature, name of the payee). This is integrity. [4]

Non-repudiation: non-repudiation prevents either the sender or the receiver from denying a transmitted message. When a message is sent and the sender tries to deny it, the receiver can prove that the alleged sender did in fact send the message; similarly, when a message is received and the recipient denies its receipt, the sender can prove that the message was in fact received by the alleged receiver. Non-repudiation services are concerned with three types of issues: proof of origin, proof of receipt and proof of content. In the same example, what happens tomorrow if B deposits the check in his account and A then denies having written it? B can rely on A's signature to prevent A from refuting this claim and to settle the dispute. This is non-repudiation. [4]

Confidentiality: confidentiality means that only the sender and the receiver, and no other party, may know the contents of the message. In the same example, A will want to ensure that no one except B gets the envelope and that, even if someone else gets it, he does not come to know the details of the check. This is confidentiality.

Access control: the ability to limit and control access to systems and data, restricting it to authorized users, is the objective of access control. To achieve access control, each entity trying to gain access must first be identified or authenticated, and is permitted entry only if its details match the access criteria pre-designed for the individual or system. The mechanisms used are logins, passwords and firewalls.
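A minimal sketch of how a shared-key message authentication code delivers the authentication and integrity properties described above (the key and messages below are illustrative values, and an HMAC is only one mechanism among several):

```python
import hmac
import hashlib

# Shared secret between sender A and receiver B (illustrative value).
key = b"shared-secret-key"

def tag(message: bytes) -> bytes:
    """Sender computes an HMAC-SHA256 tag that travels with the message."""
    return hmac.new(key, message, hashlib.sha256).digest()

def verify(message: bytes, received_tag: bytes) -> bool:
    """Receiver recomputes the tag; compare_digest avoids timing leaks."""
    return hmac.compare_digest(tag(message), received_tag)

msg = b"Pay B the sum of $100"
t = tag(msg)

print(verify(msg, t))                        # unaltered message verifies
print(verify(b"Pay B the sum of $900", t))   # tampered message fails
```

Because sender and receiver share the key, an HMAC alone cannot provide non-repudiation: either party could have produced the tag. Digital signatures, discussed next, close exactly that gap.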

3. What is a Signature?

To avoid forgery and ensure the confidentiality of a message, it has been common practice for centuries for the sender to put his identification mark, i.e. his signature, on the letter and then seal it in an envelope before handing it over to a deliverer. In the electronic era, physical signatures are not workable, and digital signatures are the cryptographic answer to the problems of information security and authenticity.
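A minimal textbook-RSA sketch of digital signing and verification (the key values and messages below are illustrative toy choices, far too small for real security):

```python
import hashlib

# Toy RSA key (textbook values): n = p*q, e public, d private,
# with e*d = 1 mod (p-1)(q-1). Illustrative only.
p, q = 61, 53
n, e, d = p * q, 17, 2753

def digest(message: bytes) -> int:
    """Hash the message and reduce into Z_n (acceptable only at toy sizes)."""
    return int.from_bytes(hashlib.sha256(message).digest(), "big") % n

def sign(message: bytes) -> int:
    """Only the holder of the private exponent d can produce this value."""
    return pow(digest(message), d, n)

def verify(message: bytes, sig: int) -> bool:
    """Anyone can check the signature with the public key (n, e)."""
    return pow(sig, e, n) == digest(message)

msg = b"I agree to the terms"
sig = sign(msg)
print(verify(msg, sig))   # the genuine message verifies
```

Only the holder of d can produce sig, while anyone holding (n, e) can verify it; altering the message changes the digest, so verification will, except with negligible probability, fail.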

4. Types of e-Signatures

A) Directed Signature Schemes: There are many situations in which the signed message is sensitive to the signature receiver; signatures on medical records, tax information and most personal business transactions are such situations. Such signatures are called directed signatures. In a directed signature scheme, the signature receiver has full control over the signature verification process: nobody can check the validity of a signature without his cooperation. Moreover, a directed signature scheme must satisfy the following properties:

1. Correctness: properly formed (for signer A and confirmer B) (A,B)-directed, A-converted and B-converted signatures must be accepted by the verification algorithms;

2. Unforgeability: it is computationally infeasible, without knowledge of the signer's private key, to produce a directed signature that is accepted by the verification algorithms or by the confirming protocols;

3. Completeness and soundness: the verification protocols are complete and sound, where completeness means that valid (invalid) signatures can always be proven valid (invalid), and soundness means that no valid (invalid) signature can be proven invalid (valid);

4. Invisibility: given a message m and a purported (A,B)-directed signature on m, it is computationally infeasible, without knowledge of the confirmer's private key, to ascertain that it is a valid (A,B)-directed signature of m;

5. Non-transferability: a user participating in an execution of the confirming/denying protocols does not obtain information that could be used to convince a third party of the validity/invalidity of a signature.

B) Special Signature Schemes: Since Diffie and Hellman introduced the concept of digital signatures, many signature schemes have been proposed in the cryptographic literature. These schemes can be categorized as either conventional digital signature schemes (e.g., RSA, DSA) or special signature schemes, depending on their security features. In a conventional signature scheme (the original model defined by Diffie and Hellman), we generally assume the following: anyone who knows the public key of the signer can verify the correctness of the signature at any time, without any consent or input from the signer (digital signature schemes with this property are called self-authenticating signature schemes), and the security of the scheme (i.e., hardness of forgery, non-repudiation) is based on certain complexity-theoretic assumptions. In some situations it may be better to relax some of these assumptions and/or add certain special security features. For example, when Alice asks Bob to sign a certain message, she

may not want him to know the contents of the message. In the past decade, a variety of special signature schemes have been developed to fit the security needs of different applications.

C) Group Signature Schemes: A group signature scheme is a method for allowing a member of a group to anonymously sign a message on behalf of the group. David Chaum and Eugene van Heyst first introduced the concept in 1991. For example, an employee of a large company could use a group signature scheme where it is sufficient for a verifier to know that an employee signed a message, but not which particular employee signed it. Another application is keycard access to restricted areas, where it is inappropriate to track individual employees' movements but necessary to restrict access to employees in the group. Essential to a group signature scheme is a group manager, who is in charge of adding group members and has the ability to reveal the original signer in the event of disputes. In some systems, the responsibilities of adding members and revoking signature anonymity are separated and given to a membership manager and a revocation manager respectively. Many schemes have been proposed; however, all should follow these basic requirements:

* Soundness and completeness: valid signatures by group members always verify correctly, and invalid signatures always fail verification.
* Unforgeability: only members of the group can create valid group signatures.
* Signer ambiguity: given a message and its signature, the identity of the individual signer cannot be determined without the revocation manager's secret key.
* Unlinkability: given two messages and their signatures, we cannot tell whether the signatures were from the same signer or not.
* No framing: even if all other group members (and the managers) collude, they cannot forge a signature for a non-participating group member.
* Unforgeable tracing verification: the revocation manager cannot falsely accuse a signer of creating a signature he did not create.

Hence we can define a group signature scheme as follows.

Definition: A group signature scheme is a digital signature scheme consisting of the following procedures:

Setup: on input of a security parameter k, a probabilistic algorithm outputs the initial group public key Y and the secret key S of the group manager.

Join: a protocol between the group manager and a user that results in the user becoming a group member holding a membership certificate and a membership secret.


Sign: A probabilistic algorithm that, on input of the group public key, a membership certificate, a membership secret and a message m, outputs the group signature Sig of m.

Verify: An algorithm that takes as input the group public key Y, the signature Sig and the message m, and outputs 1 or 0.

Open: A deterministic algorithm that takes as input the message m, the signature Sig and the group manager's secret key S, and outputs the identity of the signer.

A secure group signature scheme must satisfy the following properties:

Correctness: Signatures produced by a group member using Sign must be accepted by Verify.

Unforgeability: Only group members can sign messages on behalf of the group.

Anonymity: Given a valid signature, it is computationally hard for anyone except the group manager to identify the signer.

Unlinkability: Deciding whether two different valid signatures were computed by the same group member is computationally hard for anyone except the group manager.

Traceability: The group manager is always able to open a valid signature and identify the signer.

Exculpability: Neither the group manager nor a group member can sign messages on behalf of other group members. In particular, the group manager, even colluding with some group members, cannot misattribute a valid group signature to frame a certain member; i.e., a member should not be held responsible for a valid signature that he did not produce.

Coalition-resistance: A colluding subset of group members (even the whole group) cannot produce a valid signature that the group manager cannot open.

Efficiency: The efficiency of a group signature scheme is judged by the size of the group public key, the length of the group signatures, and the efficiency of the scheme's algorithms and protocols.

D) Blind Signature Schemes: Blind signature schemes, first introduced by Chaum, allow a person to get a message signed by another party without revealing any information about the message to that party.
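Before turning to blind signatures, the five group-signature procedures defined above can be summarized as a protocol interface sketch. This is only a structural mock with hypothetical names, not a cryptographic implementation: a real scheme (e.g. the Camenisch-Michels RSA-variant scheme cited in the references) hides the signer's identity from everyone except the manager, while this toy version stores it in the clear merely to show the shape of the API.

```python
# Structural sketch of the group-signature procedures (Setup/Join/Sign/
# Verify/Open). All names are hypothetical; no real cryptography here.
from dataclasses import dataclass, field

@dataclass
class GroupManager:
    group_public_key: str = "Y"      # output of Setup
    manager_secret: str = "S"        # output of Setup
    members: dict = field(default_factory=dict)  # id -> membership secret

    def join(self, user_id: str) -> str:
        # Join: the user becomes a member and obtains a membership secret.
        secret = f"cert-for-{user_id}"
        self.members[user_id] = secret
        return secret

    def open(self, message: str, signature: str) -> str:
        # Open: only the manager (holding S) can identify the signer.
        return signature.split("|")[0]

def sign(user_id: str, membership_secret: str, message: str) -> str:
    # Sign: placeholder; a real scheme would hide user_id (anonymity)
    # while still letting Open recover it (traceability).
    return f"{user_id}|sig({membership_secret},{message})"

def verify(group_public_key: str, message: str, signature: str) -> bool:
    # Verify: checks a signature against the group public key only.
    return "|sig(" in signature

gm = GroupManager()
cert = gm.join("alice")
sig = sign("alice", cert, "hello")
assert verify(gm.group_public_key, "hello", sig)
assert gm.open("hello", sig) == "alice"
```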
Using RSA signatures, Chaum demonstrated the implementation of this concept as follows. Suppose Alice has a message m that she wishes to have signed by Bob, and she does not want Bob to learn anything about m. Let (n, e) be Bob's public key and (n, d) be his private key. Alice generates a random value r such that gcd(r, n) = 1 and sends x = (r^e * m) mod n to Bob. The value x is "blinded" by the random value r; hence Bob can derive no useful information from it. Bob returns the signed value t = x^d mod n to Alice. Since

x^d ≡ (r^e * m)^d ≡ r * m^d (mod n), Alice can obtain the true signature s of m by computing s = r^(-1) * t mod n. Now Alice's message has a signature she could not have obtained on her own. This signature scheme is secure provided that factoring and root extraction remain difficult. However, regardless of the status of these problems, the signature scheme is unconditionally "blind", since r is random: the random r does not allow the signer to learn about the message even if the signer can solve the underlying hard problems. There are potential problems if Alice can give an arbitrary message to be signed, since this effectively enables her to mount a chosen-message attack; ways of thwarting this kind of attack have been described in the literature. Blind signatures have numerous uses including time stamping, anonymous access control, and digital cash. Thus it is not surprising that there are now numerous variations on the blind signature theme, and further work on blind signatures has been carried out in recent years.

Uses: Blind signature schemes see a great deal of use in applications where sender privacy is important. For example, the integrity of some electronic voting systems may require that an electoral authority certify each ballot before it can be accepted for counting; this allows the authority to check the credentials of the voter to ensure that they are allowed to vote and are not submitting more than one ballot. Simultaneously, it is important that this authority not learn the voters' selections: the authority will not see the contents of any ballot it signs, and will be unable to link the blinded ballots it signs back to the un-blinded ballots it receives for counting. Signature blinding also offers some protection against side-channel attacks, and is used by many web servers implementing the Secure Sockets Layer (SSL) to prevent timing attacks.

Blind Signature Schemes: Blind signature schemes exist for many public-key signing protocols. Some examples are provided below. In each example, the message to be signed is contained in the value m.
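Chaum's walkthrough above can be checked end to end with toy numbers. The following is a minimal sketch using insecure textbook parameters (a real deployment needs large keys and a padding scheme); the specific values of p, q, e, m and r are illustrative choices, not from the source.

```python
# Toy demonstration of Chaum's RSA blind signature protocol.
# WARNING: textbook parameters, insecure; for illustration only.
import math

# Bob's toy RSA key: n = p*q, with e*d ≡ 1 (mod lcm(p-1, q-1)).
p, q = 61, 53
n = p * q                         # 3233
e = 17
d = pow(e, -1, math.lcm(p - 1, q - 1))

m = 42                            # Alice's message, already reduced mod n
r = 99                            # Alice's random blinding factor
assert math.gcd(r, n) == 1

x = (pow(r, e, n) * m) % n        # Alice blinds: x = (r^e * m) mod n
t = pow(x, d, n)                  # Bob signs blindly: t = x^d mod n
s = (pow(r, -1, n) * t) % n       # Alice unblinds: s = r^(-1) * t mod n

# s is a valid ordinary RSA signature on m: s = m^d and s^e ≡ m (mod n)
assert s == pow(m, d, n)
assert pow(s, e, n) == m
```

Note that Bob only ever sees x and t, neither of which reveals m, yet the unblinded s verifies exactly like a signature Bob had produced on m directly.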
Here m is considered to be some legitimate input to the signature function.

Blind RSA Signatures: One of the simplest blind signature schemes is based on RSA signing. A traditional RSA signature is computed by exponentiating the message m with the secret exponent d, all mod a public modulus N. The blind version adds a random value r such that gcd(r, N) = 1; r is exponentiated with the public exponent e (mod N), and the value r^e is used as a blinding factor. The signing authority receives the product of the message and blinding factor, (m * r^e) mod N, which obscures the message. The blinded signature s' is then calculated as:


s' ≡ (m * r^e)^d (mod N)

The author of the message can then remove the blinding factor to reveal s, the valid RSA signature of m:

s ≡ s' * r^(-1) (mod N)

E) Digital Signature Schemes: A digital signature is a code attached to an electronic document that uniquely identifies the sender (i.e. provides authentication), provides for message integrity and allows for non-repudiation. The Internet Security Glossary of the Internet Engineering Task Force (IETF) defines a digital signature as a value computed with a cryptographic algorithm and appended to a data object in such a way that any recipient of the data can use the signature to verify the data's origin and integrity and protect against forgery. A digital signature is created by transforming (encrypting) the message digest produced by a hashing algorithm, using the key pair (private and public key) of a Public Key Infrastructure (PKI). 5.

Working of Digital Signature:

The electronic message is converted into a condensed version, called a message digest or digital fingerprint of the message. The sender applies its private key to encrypt the message digest, creating the digital signature. The plaintext message and the digital signature are transmitted together to the intended receiver. The recipient receives the two parts, the plaintext message and the digital signature, and applies the sender's public key to the signature to recover the message digest. The same hash function as applied by the sender is then applied to the received message for the signature verification process, and the two digests are compared. If both digests match, the recipient is assured of authenticity: the originator sent the message (as the signature could only have been generated with the originator's private key) and the


message was not modified. If even one bit of the original message was changed, the digest generated from the received message would cause the signature verification process to fail.

6. Why Digital Signatures Are Used

There are three common reasons for applying a digital signature to communications:

Authenticity: Public-key cryptosystems allow anybody to send a message using the public key. A signature allows the recipient of a message to be confident that the sender is indeed who s/he claims to be. Of course, the recipient cannot be 100% sure of this - the recipient can only be confident - since the cryptosystem may have been broken. The importance of authenticity is especially obvious in a financial context. For example, suppose a bank sends instructions from its branch offices to the central office in the form (a, b), where a is the account number and b is the amount to be credited to the account. A devious customer may deposit £100, observe the resulting transmission and repeatedly retransmit (a, b). This is known as a replay attack.

Integrity: Both parties will always wish to be confident that a message has not been altered during transmission. Encryption makes it difficult for a third party to read a message, but that third party may still be able to alter it in a useful way. A popular example to illustrate this is the homomorphism attack: consider the same bank as above, which sends instructions from its branch offices to the central office in the form (a, b). A devious customer may deposit £100, intercept the resulting transmission and then transmit (a, b³) to become an instant millionaire.

Non-repudiation: In a cryptographic context, the word repudiation refers to the act of denying association with a message (i.e. claiming it was sent by a third party).
The recipient of a message may insist that the sender attach a signature in order to prevent any later repudiation, since the recipient may show the message to a third party to prove its origin.

7. Applications of Different Signature Schemes

Processes and applications that need electronic transfer of information over a network and require strong authentication of both the sender and the contents of the message, or non-repudiation, require signatures to be integrated. Electronic mail systems, electronic funds transfer systems, automated forms processing, electronic processing of contract and tender documents, and other B2G, B2B and B2C applications are some of the examples.
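The hash-then-sign workflow described in Section 5 (condense the message to a digest, encrypt the digest with the private key, verify with the public key) can be sketched with a toy RSA key. The parameters and message strings are illustrative assumptions; real systems use large keys and a padding scheme such as RSASSA-PSS rather than raw textbook RSA over a digest.

```python
# Sketch of the hash-then-sign digital signature workflow.
# WARNING: toy textbook RSA, insecure; for illustration only.
import hashlib
import math

# Toy RSA key pair.
p, q = 10007, 10009
n = p * q
e = 65537
d = pow(e, -1, math.lcm(p - 1, q - 1))

def digest(message: bytes) -> int:
    # Condense the message to a fixed-size fingerprint, reduced mod n.
    return int.from_bytes(hashlib.sha256(message).digest(), "big") % n

def sign(message: bytes) -> int:
    # Sender "encrypts" the digest with the private key.
    return pow(digest(message), d, n)

def verify(message: bytes, signature: int) -> bool:
    # Recipient recovers the digest with the public key and compares it
    # with a freshly computed digest of the received message.
    return pow(signature, e, n) == digest(message)

msg = b"credit account 1234 with 100"
sig = sign(msg)
assert verify(msg, sig)                                   # authentic
assert not verify(b"credit account 1234 with 999", sig)   # tampered
```

The final assertion illustrates the integrity property discussed above: altering even part of the message changes the digest, so verification with the original signature fails.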


The exchange of commercial and regulatory documents over the Internet is a common feature of e-commerce and e-business all over the world. In such cases, signature schemes are required to ensure the security of the electronic transfer of documents. To ensure the validity and acceptance of signature schemes in a court of law (in the case of digital signatures), a legal framework and legislation need to be in place, and more and more countries are working in this regard. Countries such as the US, China, the UK, Japan, Russia, Germany and France, as well as the EU, already have laws and legislation in place to give legal sanctity to electronic transactions and signature schemes. With legal and technological development, and growing market demand for secure transactions on the Internet, more and more applications based on digital signatures, group signatures and other signature technologies will come into existence.

8. Conclusion

In this paper we first explained the network security issues, then defined what a signature scheme is and described the types of signatures (digital signature, directed signature, group signature and ring signature), and finally explained the applications of the different types of signature schemes.

References
1. Diffie W., Hellman M., New Directions in Cryptography, IEEE Trans. Inform. Theory, Vol. IT-22, pp. 644-654, 1976.
2. Gelbord B., Signing Your 011001010: The Problem of Digital Signatures, Communications of the ACM, Vol. 43, No. 12: 27-28, December 2000.
3. Kuechler W., Grupe F., Digital Signatures: A Business View, Information Systems Management, Vol. 20, No. 1: 19-28, Winter 2003.
4. -08
5. Utah Department of Commerce, Division of Corporations and Commercial Code, Utah Digital Signature Act, 1994, http://www.le.state.ut.us.
6. Preneel B., Analysis and Design of Cryptographic Hash Functions, Ph.D. Thesis, Katholieke Universiteit Leuven, 1993.
7. Boyar J., Chaum D., Damgard I. and Pedersen T. (1991), Convertible Undeniable Signatures.
Advances in Cryptology - Crypto '90, LNCS 537, pp. 180-205.
8. Chaum D. (1995), Designated Confirmer Signatures. Advances in Cryptology - Crypto '94, LNCS 950, pp. 86-91.
9. Sunder Lal and Manoj Kumar (2003), A Directed Signature Scheme and its Applications. Proceedings of the NCIS, pp. 124-133.
10. F. Laguillaumie, P. Paillier and D. Vergnaud, Universally Convertible Directed Signatures, http://hal.inria.fr/docs/00/05/92/30/PDF/ucds.pdf
11. 91 (1995)
12. RSA Laboratories' Frequently Asked Questions About Today's Cryptography, Version 4.1, 2000, http://www.rsasecurity.com/rsalabs/node.asp?id=2338
13. -based Group Signature Scheme, http://eprint.iacr.org/2003/116.pdf
14. M. Stadler, J.M. Piveteau and J. Camenisch, Fair Blind Signatures, Advances in Cryptology - Eurocrypt '95, Springer-Verlag (1995), 209-219.
15. Internet Engineering Task Force (IETF), Internet Security Glossary, http://www.ietf.org/rfc/rfc2828.txt
16. http://en.wikipedia.org/wiki/Blind_signature
17. D. Chaum and E. van Heyst, Group Signatures, Advances in Cryptology - Eurocrypt '91, Springer-Verlag (1991), 257-265.


18. American Bar Association, Digital Signature Guidelines Tutorial, Section of Science and Technology, Information Security Committee, 2002, http://www.abanet.org/scitech/ec/isc/dsg-tutorial.html
19. J. Camenisch and M. Michels, A Group Signature Scheme Based on an RSA-Variant, 1998, http://www.brics.dk/RS/98/27/BRICS-RS-98-27.pdf
20. D. Chaum and E. van Heyst (1991), Group Signatures, Advances in Cryptology - Eurocrypt '91, Volume 547 of Lecture Notes in Computer Science, 257-265.
21. D. Chaum, Blind Signatures for Untraceable Payments, Advances in Cryptology - Crypto '82, Springer-Verlag (1983), 199-203.
22. D. Chaum, Security Without Identification: Transaction Systems to Make Big Brother Obsolete, Communications of the ACM 28 (10) (1985), 1030-1044.
23. D. Chaum, A. Fiat and M. Naor, Untraceable Electronic Cash, Advances in Cryptology - Crypto '88, Springer-Verlag (1988), 319-327.
24. M. Franklin and M. Yung, Blind Weak Signature and its Applications: Putting Non-Cryptographic Secure Computation to Work, Advances in Cryptology - Eurocrypt '94, Springer-Verlag (1994), 67-76.
25. http://www.rsasecurity.com/rsalabs/node.asp?id=2342
26. L. Chen and T.P. Pedersen, New Group Signature Schemes, Advances in Cryptology - Eurocrypt '94, Springer-Verlag (1994), 171-181.
27. L. Chen and T.P. Pedersen, On the Efficiency of Group Signatures: Providing Information-Theoretic Anonymity, Advances in Cryptology - Eurocrypt '95, Springer-Verlag (1995), 39-49.
28. Shiralkar P., Vijayaraman B.S., Impact of Digital Signature Technology on E-business, Proceedings of the National Meeting of the Decision Sciences Institute, San Diego, CA, November 2002.
29. managements special issues on IT, pp. 66-71.


GLOBAL AND DISTRIBUTED SOFTWARE ENGINEERING
S.S. Khadri

Abstract
Global software development is no longer a novelty but an everyday reality. Even so, it remains poorly explored: a lack of awareness of the factors inherent in the nature of globally distributed software projects makes practitioners struggle and invent new approaches to survive, which underlines the need to support risk management activities. This paper describes a Knowledge Base and a Risk Barometer developed to support practitioners who lack experience in global projects. The particularities of globally distributed projects and their effect on project performance are formalized in a reusable framework for managing uncertainty. The described tools provide input for risk identification and help to evaluate risks based on experience from former projects.

Introduction
Globally distributed software development achieves division of labor by dispersing software development tasks among several remote development centers. This mode of software development has become a popular business model for software organizations. There are several compelling business reasons supporting the adoption of distributed software development: 1) the ability to extend work beyond the regular office hours of a single site; 2) software development costs at offshore centers, such as in India, that are as much as four times lower; 3) the significantly improved capabilities of the workforce in remote centers located in emerging economies in recent years; and 4) advances in information and communication technology that have facilitated easier collaboration between remote workforces. At the same time, several challenges in distributed software development have been reported. Structured and disciplined software engineering processes have often been advocated as a key remedy for addressing these challenges.
In this paper, we report findings from our field study on the effectiveness of deploying structured software engineering processes and stringent quality management practices in globally distributed software development. The main contribution of this paper is in developing empirical models of distributed software project performance and verifying them using data collected from large-scale, real-world projects.

Modeling Globally Distributed Software Development
A model that captures the individual effects of the factors that influence global software development project performance is necessary. The modeling framework for this study is developed based on the economic view of software development. Researchers using this framework treat software development as a production process, and model software performance indicators, such as productivity and quality, as a function of personnel-related factors and software-methodology-related factors. Prior studies using the economic view of software development have predominantly focused on collocated software development scenarios. In this study we extend this framework to address distributed software

development. Also, prior software engineering research has not extensively focused on the right mix of quality practices in the different stages of product development to improve the net outcome of a project. To address this, we study the effects of prevention-, appraisal- and failure-based quality activities on distributed project performance.

Figure 1: Research Model

Figure 2: Categorization of individual quality management practices
Prevention: programming training, business domain training, process training
Appraisal: requirement, specification and design reviews; code inspection; status reviews
Failure: unit testing, module testing, integration testing, system testing, error tracking and correction
Configuration

2.1 Research Model
Figure 1 gives a pictorial overview of our research model. On the left side of the model are the factors affecting software development, namely work dispersion and Quality Management Approaches (QMA). On the right side of the model are the project performance indicators. To model the other factors affecting both project performance and software development, and to understand the development process in more detail, we introduce a number of control variables. We explain work dispersion, quality management approaches, project performance, and control variables in more detail in the next few subsections.

2.1.1 Work Dispersion
Work dispersion captures how distributed the software development process is. We measure work dispersion between development centers using a variable similar to the Herfindahl-Hirschman Index. This index is a well-tested and widely

used measure that quantifies how diversified a large corporation or a particular industry is. Since there are only two development centers in our data set, the work dispersion measure is defined as:

Work dispersion = 100² − (% effort at first development center)² − (% effort at second development center)²

A value of zero for our work dispersion measure indicates that the software project is completely co-located, and an increasing value represents increasing levels of work dispersion.

2.1.2 Quality Management Approaches
A key component of the model is determining how to categorize software quality management. Instead of creating our own categories, we use the well-studied and accepted categorization used in manufacturing quality research. This categorization has three components: prevention-based, appraisal-based, and failure-based QMAs. Figure 2 provides a detailed breakdown of the elements of each approach. Prevention-based quality management practices in software development involve activities aimed at preventing defects before they occur.

2.1.3 Project Performance
Similar to past software engineering economics studies, we use two different performance indicators to determine the quality of the software product. They are:
1) Development productivity: Development productivity is defined as the ratio of software code size in function points to the total development effort in person-hours. The advantage of function-point measurement for code size is that it incorporates measures of customer-perceived software functionality as well as complexity [29]. Function points have also been shown to be a reliable output measure in the context of commercial systems. The development effort includes effort incurred in all stages of development until the customer signs off the project.
2) Conformance quality: Our quality measure captures the number of unique problems reported by customers during the acceptance tests and production trials before the project sign-off.
It is calculated as follows:

Conformance quality = 1 / (1 + number of unique problems reported)

This reciprocal formulation represents quality as a function that decreases as the number of defects increases.

2.1.4 Control Variables
We introduced a number of control variables into the model. These variables serve two purposes: a) to provide a deeper understanding of the distributed software development process,

and b) to allow us to create empirical models that can be computed. We use five variables: two that primarily affect productivity, two that primarily affect quality, and one that affects both.

2.1.4.1 Productivity Variables
We used two control variables that primarily affect productivity: team size and reuse.
Team size: Team size is the headcount of the number of persons involved in the project. Team size is expected to be a good surrogate for the coordination difficulties that could occur within the software project team. An increased team size poses difficulties in both administrative and expert coordination.
Reuse: Reuse in this study is measured as the percentage of lines of code that have been utilized from the generic code libraries maintained centrally in the knowledge database at our research site. All reused modules and objects in the projects we studied were maintained with unique tags for readability and hence could be easily identified in applications. Reuse enables developers to use standardized routines that have been stored in organization-wide repositories and libraries to accomplish certain functionality.

2.1.4.2 Quality Variables
We used two control variables that primarily affect quality: code size and upfront investment.
Code size: Code size is measured as function points over the entire project code base. Code size is a widely recognized control variable for software quality models, as software size captures both the magnitude of the project and much of the complexity involved in developing the application.
Upfront investment: Upfront investment is measured as the percentage of total effort spent during the requirements and design stages of the life cycle. Higher levels of investment in activities done before commencing the actual coding of the system, such as requirements analysis and high-level design, have been shown to positively influence system quality.
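The two project measures defined earlier in this section, HHI-style work dispersion across two development centers and reciprocal conformance quality, are simple to compute. The sketch below assumes the reciprocal form 1/(1 + defects) implied by the text's description of the quality measure; function and parameter names are illustrative.

```python
# Sketch of the two project measures defined above (formulas as
# reconstructed from the text; names are illustrative).

def work_dispersion(effort_center1_pct: float, effort_center2_pct: float) -> float:
    # HHI-style measure: 0 for fully co-located work, growing as effort
    # is split more evenly across the two development centers.
    return 100**2 - effort_center1_pct**2 - effort_center2_pct**2

def conformance_quality(defects: int) -> float:
    # Reciprocal form: quality decreases as the number of unique
    # customer-reported problems increases.
    return 1.0 / (1 + defects)

assert work_dispersion(100, 0) == 0       # completely co-located project
assert work_dispersion(50, 50) == 5000    # maximal two-site dispersion
assert conformance_quality(0) == 1.0      # defect-free project
```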
2.1.4.3 Common Variable
We use one control variable, design rework, that affects both productivity and quality, as it gives deeper insight into the effect of dispersion on overall project performance.
Design rework: As stated earlier, we measure code size in terms of function points. Hence, we define design rework as the effort, in person-hours, spent per function point to implement the new design. Agreeing on a common, non-volatile design early in the project life cycle is likely to be very important in a distributed environment. We hypothesize that changing the basic design framework often results in cascading changes and rework in individual components that affect project performance. Hence, we also account for design rework when determining the effect of dispersion on project performance.

2.1 GSE benefits and risks
GSE has a number of potential benefits, including shortening time-to-market cycles by using time-zone differences and improving the ability to respond quickly to local customer needs. Globally distributed software engineering also allows organizations to benefit from

access to a larger qualified resource pool, with the promise of reduced development costs. Another potentially positive impact of globally distributed engineering is innovation, as the mix of developers with different cultural backgrounds may trigger new ideas (Ebert & De Neve, 2001; Herbsleb & Moitra, 2001; Damian et al., 2004).

2.2 Global software engineering modes
There is no commonly accepted definition for GSE or its collaboration modes. Instead, a multitude of terms are used to mean some mode of GSE. This section explains a few of these modes, chosen to give different viewpoints on GSE. First, modes differentiated by the type of agreement are presented. Next, other categorizations presented in the literature are discussed, including equity and non-equity collaborations; upstream, horizontal and downstream collaboration; dyadic alliances, alliance constellations and alliance networks; and explorative and exploitative collaboration.

2.3 General GSE challenges
GSE challenges are discussed in many publications from various perspectives. Silva et al. (2010) carried out a systematic literature review of distributed development challenges, best practices, models and tools. They analysed 54 papers and found that the top five challenges appeared in 120 pieces of evidence (45%) out of a total of 266 for all 30 identified challenges. These five challenges were effective communication, cultural differences, coordination, time-zone differences and trust. Similar to these findings, a commonly referenced classification of the challenges caused by globally distributed development is (Carmel, 1999; Carmel & Tjia, 2005):
* Communication breakdown (loss of communication richness)
* Coordination breakdown
* Control breakdown (geographical dispersion)
* Culture clash (cultural differences)
These challenges affect all aspects of product development, and different authors have studied these aspects in more detail, either from particular process viewpoints or from the challenge viewpoint.

3. GSE challenges
The previous section focused on GSE challenges at a general level, as discussed in the GSE literature. Based on the empirical work carried out during the Merlin and Prisma projects, several concrete challenges have been identified that make GSE less productive in companies in practice. Industrial companies express the challenges differently than the theory does; thus the GSE framework presented here (Sections 3 and 4) is needed to help companies find relevant solutions to the challenges they are facing.


3.1 Industrial expressions of challenges
The most critical points in global software engineering, based on the industrial inventory and as expressed by the Merlin and Prisma partners in workshops, were contracting and requirements definition, project planning and tracking, architecture analysis and design, and integration.
Contracting and requirements definition: The more detailed the prepared specification of the work is, the better (within a reasonable degree of effort). If all collaboration partners have the same view and shared understanding of what is to be done, and if that is documented well, fewer conflicts will occur. Several challenges and problem expressions relating to this point were reported.

References
1. bowls?: -34, 2006.
2. -18, 1991.
3. R. -450, 1998.
4. ted computer-aided software -401, 1991.
5. on software engineering, vol. 15, pp. 1199-1205, 1989.
6. -240, 2000.
7. B. Boehm, Software Engineering Economics. Upper Saddle River, NJ: Prentice Hall, 1981.
8. F. Cairncross, The Death of Distance: How the Communications Revolution Will Change Our Lives. Boston, MA: Harvard Business School Press, 1997.
9. E. Carmel, Global Software Teams: Collaborating Across Borders and Time Zones. Upper Saddle River, NJ: Prentice Hall, 1999.
10. Quarterly Executive, vol. 1, pp. 65-76, 2002.
11. Organization Science, vol. 12, pp. 346-371, 2001.
12. P. B. Crosby, Quality is Free: The Art of Making Quality Certain. New York: McGraw-Hill, 1979.
13. -78, 2000.
14. B. Curtis, W. Hefley, Pittsburgh CMU/SEI-2001-MM-01, 2001.
15. R. Davidson and J. G. MacKinnon, Estimation and Inference in Econometrics. New York: Oxford University Press, 1993.
16. W. E. Deming, Quality, Productivity and Competitive Position. Cambridge, MA: MIT Center for Advanced Engineering Study, 1982.
17. vol. 46, pp. 1554-1568, 2000.
18. -444, 1988.
19. J. Fox, Applied Regression, Linear Models, and Related Methods. Thousand Oaks, California: Sage, 1997.
20. software processes and communication in
21. vol. 45, pp. 193-200, 2002.


SYSTEMATIC APPLICATIONS OF CLOUD SERVICES IN INQUIRING CONCERT
P. Sudha

Abstract: The early intent of the cloud was to deal with problems involving vast amounts of data. At present, its horizon spreads across both systematic and technological applications related to science and scientists. The promise of the cloud is to meet significant performance demands with an alternative infrastructure. This paper appraises the utility of the cloud computing trend in the service of scientists and professionals. A simulated infrastructure model is erected for analysis with regard to hardware and budget control, and numerous user job schedules are used for quantification. Data acquired through simulation show that performance and cost results can be extrapolated to large-scale problems and cluster infrastructures.

Index Terms: cloud computing, computing cluster, multi-cloud infrastructure, computing process

Introduction
Supercomputers today are used mainly by the military, government intelligence agencies, universities and research labs, and large companies to tackle enormously complex calculations for such tasks as simulating nuclear explosions, predicting climate change, designing airplanes, and analyzing which proteins in the body are likely to bind with potential new drugs. Cloud computing aims to apply that kind of power, measured in tens of trillions of computations per second, to problems like analyzing risk in financial portfolios, delivering personalized medical information, and even powering immersive computer games, in a way that users can tap through the Web. It does that by networking large groups of servers that often use low-cost consumer PC technology, with specialized connections to spread data-processing chores across them. By contrast, the newest and most powerful desktop PCs process only about 3 billion computations a second. Let's say you're an executive at a large corporation.
Your particular responsibilities include making sure that all of your employees have the hardware and software they need to do their jobs. Buying computers for everyone isn't enough - you also have to purchase software or software licenses to give employees the tools they require.


Fig 1: Cloud computing architecture

Whenever you have a new hire, you have to buy more software or make sure your current software license allows another user. It's so stressful that you find it difficult to go to sleep on your huge pile of money every night. Instead of installing a suite of software on each computer, you'd only have to load one application. That application would allow workers to log into a Web-based service which hosts all the programs the user needs for his or her job. Remote machines owned by another company would run everything from e-mail to word processing to complex data analysis programs. It's called cloud computing, and it could change the entire computer industry.

In a cloud computing system, there's a significant workload shift. Local computers no longer have to do all the heavy lifting when it comes to running applications; the network of computers that makes up the cloud handles them instead. Hardware and software demands on the user's side decrease. The only thing the user's computer needs to be able to run is the cloud computing system's interface software, which can be as simple as a Web browser; the cloud's network takes care of the rest. Instead of running an e-mail program on your computer, you log in to a Web e-mail account remotely. The software and storage for your account don't exist on your computer - they're on the service's computer cloud.

Cloud Technologies

Four Selected Clouds: Amazon EC2, GoGrid, ElasticHosts, and Mosso
We identify three categories of cloud computing services: Infrastructure-as-a-Service (IaaS), that is, raw infrastructure and associated middleware; Platform-as-a-Service (PaaS), that is, APIs for developing applications on an abstract platform; and Software-as-a-Service (SaaS), that is, support for running software services remotely. Many clouds already exist, but not all provide virtualization, or even computing services.
The scientific community has not yet started to adopt PaaS or SaaS solutions, mainly to avoid porting legacy applications and for lack of the needed scientific computing services, respectively. Thus, in this study we focus only on IaaS providers. We also focus only on public clouds, that is, clouds that are not restricted within an enterprise; such clouds can be used by our target audience, scientists. Based on our recent survey of cloud computing providers, we have selected four IaaS clouds for this work. The reason for this selection is threefold. First, not all the

clouds on the market are still accepting clients; FlexiScale, for example, puts new customers on a waiting list for over two weeks due to system overload. Second, not all the clouds on the market are large enough to accommodate requests for even 16 or 32 co-allocated resources. Third, our selection already covers a wide range of quantitative and qualitative cloud characteristics, as summarized in Table 1 and in our cloud survey, respectively. In the following we describe Amazon EC2; the other three, GoGrid (GG), ElasticHosts (EH), and Mosso, are IaaS clouds with provisioning, billing, and availability and performance guarantees similar to Amazon

EC2. EC2 is elastic in the sense that it enables the user to extend or shrink its infrastructure by launching or terminating new virtual machine instances.
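This extend-or-shrink behaviour can be pictured with a toy model. Note that this is not the EC2 API; the class and method names are invented purely to illustrate elastic capacity:

```python
import itertools

class ElasticCloud:
    """Toy model of an elastic IaaS infrastructure: capacity grows or
    shrinks as virtual machine instances are launched or terminated."""

    def __init__(self):
        self._ids = itertools.count(1)
        self.instances = {}  # instance id -> instance type

    def launch(self, instance_type):
        """Launch a new VM instance, extending the infrastructure."""
        iid = next(self._ids)
        self.instances[iid] = instance_type
        return iid

    def terminate(self, iid):
        """Terminate a running instance, shrinking the infrastructure."""
        del self.instances[iid]

    def size(self):
        return len(self.instances)

cloud = ElasticCloud()
a = cloud.launch("m1.small")
b = cloud.launch("c1.medium")
cloud.terminate(a)  # scale back down when the capacity is no longer needed
```

The key property the text describes is that capacity is acquired and released on demand, rather than provisioned up front.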

Fig 2: Types of cloud services

The user can use any of the instance types currently on offer; the characteristics and cost of the five instance types available in June 2009 are summarized in Table 1. An ECU is the equivalent CPU power of a 1.0-1.2 GHz 2007 Opteron or Xeon processor. The theoretical peak performance can be computed for different instances from the ECU definition: a 1.1 GHz 2007 Opteron can perform 4 flops per cycle at full pipeline, which means that at peak performance one ECU equals 4.4 gigaflops per second (GFLOPS). To create an infrastructure from EC2 resources, the user specifies the instance type and the VM image; the user can specify any VM image previously registered with Amazon. Once the VM image has been deployed on a physical machine (the resource status is running), the instance is booted; at the end of the boot process the resource status becomes installed. The installed resource can be used as a regular computing node immediately after the booting process has finished, via an ssh connection. A maximum of 20 instances can be used concurrently by regular users by default; an application can be made to increase this limit, but the process involves an Amazon representative. Amazon EC2 abides by a Service Level Agreement (SLA) in which the user is compensated if the resources are not available for acquisition at least 99.95 percent of the time. The security of the Amazon services has been investigated elsewhere.

MTC PRESENCE IN SCIENTIFIC COMPUTING WORKLOADS

An important assumption of this work is that the existing scientific workloads already include Many Task Computing

users, that is, users that employ loosely coupled applications comprising many tasks to achieve their scientific goals. In this section, we verify this assumption through a detailed investigation of workload traces taken from real scientific computing environments.

Cloud Deployment Models

The selection of a cloud deployment model depends on the levels of security and control required. A Private cloud infrastructure is operated solely for a single organization, with the purpose of securing services and infrastructure on a private network. This deployment model offers the greatest level of security and control, but it requires the operating organization to purchase and maintain the hardware and software infrastructure, which reduces the cost-saving benefits of investing in a cloud infrastructure. Rackspace, Eucalyptus, and VMware are example providers of private cloud solutions. A Community cloud infrastructure is shared by several organizations and supports a specific community that has shared concerns. It may be established where organizations have similar requirements and seek to share cloud infrastructure. An example of a community cloud is Google's Gov Cloud. Public clouds provide services and infrastructure over the Internet to the general public or a large industry group, and are owned by an organization selling cloud services.
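The trade-off just described (control and security versus cost and shared efficiency) can be sketched as a simple selection rule. The decision order and the coarse criteria below are invented for illustration; real deployment decisions weigh many more factors:

```python
def choose_deployment(needs_full_control,
                      shares_requirements_with_partners,
                      mixes_sensitive_and_public_workloads):
    """Pick a cloud deployment model from coarse requirements.

    Mirrors the text: hybrid when sensitive and non-sensitive workloads
    must be split across clouds, private for maximum control, community
    when several organizations share concerns, public otherwise.
    """
    if mixes_sensitive_and_public_workloads:
        return "hybrid"
    if needs_full_control:
        return "private"
    if shares_requirements_with_partners:
        return "community"
    return "public"

# A firm keeping some assets in-house while using a public provider
# for the rest lands on a hybrid deployment:
print(choose_deployment(False, False, True))
```

The function is just a decision table; its value is in making explicit that each deployment model answers a different security/control requirement.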

Fig 3: Multicloud deployment architecture

Major public cloud providers are Google and Amazon. These clouds offer the greatest level of efficiency in shared resources; however, they are also more vulnerable than private clouds. A Hybrid cloud infrastructure, as the name suggests, is a composition of private, public, and/or community clouds, possibly through multiple providers. The reasoning behind a hybrid cloud infrastructure is increased security, better management, or failover. For some organizations it may not be feasible to place assets in a public cloud, so many opt for the value of combining different cloud deployment models. The drawback of a hybrid cloud, however, is the requirement of managing multiple different security platforms and communication protocols.

Performance Evaluation


Performance evaluation of clouds and virtualized environments. There has been a recent spur of research activity in assessing the performance of virtualized resources, in cloud computing environments and in general. In contrast to this body of previous work, ours is different in scope: we perform extensive measurements using general purpose and high-performance computing benchmarks to compare several clouds, and we compare clouds with other environments based on real long-term scientific computing traces. Our study is also much broader in size: we perform an evaluation using over 25 individual benchmarks on over 10 cloud instance types, which is an order of magnitude larger than previous work (though size does not simply add to quality). Performance studies using general purpose benchmarks have shown that the overhead incurred by virtualization can be below 5 percent for computation and below 15 percent for networking. Similarly, the performance loss due to virtualization for parallel I/O and web server I/O has been shown to be below 30 and 10 percent, respectively. In contrast to these, our work shows that virtualized resources obtained from public clouds can have a much lower performance than the theoretical peak. Recently, much interest in the use of virtualization has been shown by the HPC community, spurred by two seminal studies that find virtualization overhead to be negligible for compute-intensive HPC kernels and applications such as the NAS NPB benchmarks; other studies have investigated virtualization performance for specific HPC application domains or for mixtures of Web and HPC workloads running on virtualized (shared) resources [67].
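The overhead figures quoted above can be turned into rough effective-performance estimates. The percentages below are the upper bounds from the cited studies; actual overhead varies by hypervisor and workload, so this is an order-of-magnitude sketch only:

```python
# Rough effective performance under virtualization, using the
# upper-bound overheads quoted in the text: 5% for computation,
# 15% for networking, 30% for parallel I/O, 10% for web server I/O.

OVERHEAD = {
    "computation": 0.05,
    "networking": 0.15,
    "parallel I/O": 0.30,
    "web server I/O": 0.10,
}

def effective(native_perf, workload):
    """Native performance scaled down by the virtualization overhead."""
    return native_perf * (1.0 - OVERHEAD[workload])

# One ECU is defined earlier in the text as 4.4 GFLOPS at peak; with a
# 5% compute overhead the virtualized peak would be about 4.18 GFLOPS.
print(round(effective(4.4, "computation"), 2))
```

The point of the calculation is that even the worst of these published overheads leaves most of the native performance intact, which is why the much larger gaps measured on public clouds are attributed to factors beyond virtualization itself.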

Table 1: Performance of various cloud service providers

Our work differs significantly from these previous approaches in target (clouds as black boxes versus owned and controllable infrastructure) and in size. For clouds, the study of the performance and cost of executing a scientific workflow, Montage, in clouds investigates cost-performance trade-offs between clouds and grids, but uses a single application on a single cloud, and the application itself is remote from the mainstream HPC scientific community. Also close to our work is the seminal study of Amazon S3, which also includes

a performance evaluation of file transfers between Amazon EC2 and S3. Our work complements this study by analyzing the performance of Amazon EC2, the other major Amazon cloud service; we also test more clouds and use scientific workloads. Several small-scale performance studies of Amazon EC2 have been conducted recently: the study of Amazon EC2 performance using the NPB benchmark suite or selected HPC benchmarks, the early comparative study of Eucalyptus and EC2 performance, the study of file transfer performance between Amazon EC2 and S3, etc. An early comparative study of the DawningCloud and several operational models extends the comparison method employed for Eucalyptus, but uses job emulation instead of job execution. Our performance evaluation results extend and complement these previous findings, and give more insights into the performance of EC2 and other clouds. Other (early) performance evaluation. Much work has been put into the evaluation of novel supercomputers and nontraditional systems for scientific computing. We share much of the methodology used in previous work; we see this as an advantage in that our results are readily comparable with existing results. The two main differences between this body of previous work and ours are that we focus on a different platform (that is, clouds) and that we target a broader scientific computing community (e.g., also users of grids and small clusters). Other cloud work. Recent work has considered running mixtures of MTC with other workloads in cloud-like environments. For this direction of research, our findings can be seen as further motivation and as a source of realistic setup parameters.


Fig 4: Performance of various cloud service providers plotted

The analysis of workloads is important for understanding how systems are used. In addition, workload models are needed as input for the evaluation and comparison of new system designs. This is especially important in costly large-scale parallel systems. Luckily, workload data are available in the form of accounting logs. Using such logs from three different sites, we analyze and model the job-level workloads with an emphasis on those aspects that are universal to all sites. As many distributions turn out to span a large range, we typically first apply a logarithmic transformation to the data, and then fit it to a novel hyper-Gamma distribution or one of its special cases. This is a generalization of distributions proposed previously, and leads to good goodness-of-fit scores. The parameters for the distribution are found using the iterative EM algorithm. The results of the analysis have been codified in a modeling program that creates a synthetic workload based on the results of the analysis.

Conclusion

With the emergence of cloud computing as the paradigm in which scientific computing is done exclusively on resources leased only when needed from big data centers, e-scientists are faced with a new platform option. However, the initial target of the cloud computing paradigm does not match the characteristics of scientific computing workloads. Thus, in this paper we seek to answer an important research question: is the performance of clouds sufficient for scientific computing? To this end, we perform a comprehensive performance evaluation of a large computing cloud that is already in production. Our main finding is that the performance and the reliability of the tested cloud are low.
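The workload-modeling recipe described above (log-transform the data, then fit a Gamma-family distribution) can be sketched in simplified form. The paper fits a hyper-Gamma distribution with the EM algorithm; the sketch below fits a single Gamma by the method of moments instead, which is enough to show the log-transform step. The sample runtimes are invented:

```python
import math

def fit_gamma_moments(samples):
    """Fit Gamma(shape k, scale theta) by the method of moments:
    k = mean^2 / variance, theta = variance / mean."""
    n = len(samples)
    mean = sum(samples) / n
    var = sum((x - mean) ** 2 for x in samples) / n
    return mean * mean / var, var / mean

# Job runtimes in seconds, spanning several orders of magnitude
# (typical of parallel-workload accounting logs).
runtimes = [12, 45, 130, 400, 1100, 3600, 9000, 30000]

# Step 1: logarithmic transformation to compress the large range.
log_runtimes = [math.log(x) for x in runtimes]

# Step 2: fit the transformed data.
k, theta = fit_gamma_moments(log_runtimes)
print(round(k, 2), round(theta, 2))
```

A synthetic workload generator would then draw log-runtimes from the fitted distribution and exponentiate them, reversing the transformation.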


Table 2: Workload distribution

Thus, this cloud is insufficient for scientific computing at large, though it still appeals to scientists who need resources immediately and temporarily. Motivated by this finding, we have analyzed how to improve current clouds for scientific computing, and identified two research directions, each of which holds good potential for improving cloud performance in this setting; we plan to test the providers' services to see if they can hold their claims. We will extend this work with additional analysis of the other services offered by Amazon: storage (S3), database (SimpleDB), queue service (SQS), Private Cloud, and their interconnection. We will also extend the performance evaluation results by running similar experiments on other IaaS providers and clouds, and also on other real large-scale platforms, such as grids and commodity clusters. In the long term, we intend to explore the two new research topics that we have raised in our assessment of needed cloud improvements.


SELF MOTIVATED ROUTE OPTIMIZATION FOR AODV IN MOBILE ADHOC NETWORK

Vishnu Mishra

Abstract

Mobile Adhoc Network is a collection of wireless mobile nodes forming a temporary network without the aid of any established infrastructure. It has been a big challenge to develop routing protocols that can meet different application needs and optimize routing paths according to topology changes in mobile ad hoc networks. In this paper, we discuss a new routing technique for AODV, called the self healing routing technique. Using this technique, all the neighboring nodes monitor the route and try to optimize it if and when a better local sub-path is available. Thus this technique enhances the packet delivery rate with respect to node speed and number of nodes. This technique adopts different schemes to obtain and maintain local topology information on data traffic demand. The aim of this work is the study of various routing techniques for MANET nodes to obtain the route through which data is to be transferred to the destination with maximum throughput. In this routing technique we have adopted the self healing routing methodology for the ad hoc on demand routing protocol with shortest path, using Glomosim. The popular protocols DSDV, DSR, AODV and TORA are configured in the MANET.

Keywords: MANET, DSDV, AODV, DSR, TORA, Reconfigurable Architecture, FPGA (Field Programmable Gate Array)

Introduction

Mobile Adhoc Network (MANET) is a collection of independent mobile nodes that can communicate with each other via radio waves. The mobile nodes that are in radio range of each other can communicate directly, whereas others need the aid of intermediate nodes to route their packets. Ad hoc networks are networks that are not (necessarily) connected to any static (i.e. wired) infrastructure.
An ad-hoc network is a LAN or other small network, especially one with wireless connections, in which some of the network devices are part of the network only for the duration of a communications session or, in the case of mobile or portable devices, while in close proximity to the rest of the network.
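Multi-hop forwarding, where nodes outside each other's radio range communicate through intermediate relays, can be sketched as a breadth-first route search over the current topology. The topology below is invented for illustration; real MANET protocols such as AODV discover routes with distributed request/reply flooding rather than a centralized search:

```python
from collections import deque

def find_route(links, src, dst):
    """Breadth-first search for a minimum-hop route from src to dst.
    `links` maps each node to the neighbours within its radio range."""
    parent = {src: None}
    queue = deque([src])
    while queue:
        node = queue.popleft()
        if node == dst:
            route = []
            while node is not None:   # walk back along parent pointers
                route.append(node)
                node = parent[node]
            return route[::-1]
        for nxt in links.get(node, ()):
            if nxt not in parent:
                parent[nxt] = node
                queue.append(nxt)
    return None  # destination unreachable

# A and D are out of each other's range; B and C act as relays.
links = {"A": ["B"], "B": ["A", "C"], "C": ["B", "D"], "D": ["C"]}
print(find_route(links, "A", "D"))
```

The returned hop sequence is exactly the relaying behaviour the text describes: a message is handed from neighbour to neighbour until it reaches the destination.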


Fig 1: Mobile wireless ad hoc network

In Fig. 1, all nodes are not in direct connection with each other but can use other nodes as relays in order to transmit to a destination. The figure also illustrates another important property of the shown ad hoc network: the inaccessibility of servers or centralized administration. A Mobile Ad Hoc Network (MANET) is an autonomous system of mobile routers (and associated hosts) connected by wireless links, the union of which forms an arbitrary graph. The routers are free to move randomly and organize themselves arbitrarily; thus, the network's wireless topology may change rapidly and unpredictably. A MANET is a distributed system with no base stations. The network comprises a set of mobile devices, each equipped with a radio transceiver. The devices use radio transmissions to communicate directly with other devices that are located within a small neighbouring region. To communicate with another device that is not found in the neighbouring region, a device sends its message with the expectation that it will be forwarded through the network and eventually arrive at the intended destination. It has been a big challenge to develop routing protocols that can meet different application needs and optimize routing paths according to topology changes in mobile ad hoc networks. We have developed two self-adaptive on-demand geographic routing schemes. The local topology is updated in a timely manner according to network dynamics and traffic demands. Our route optimization scheme adapts the routing path according to both topology changes and actual data traffic requirements. Each node can determine and adjust the protocol parameter values independently according to different network environments and data traffic demands.


We propose two self-adaptive on-demand geographic routing protocols. To summarize, our contributions in this work include: (1) analyzing the effect of outdated position information on the performance of geographic routing; (2) introducing route optimization schemes (to our best knowledge, this is the first geographic routing scheme that adapts the path to changes of network topology and traffic demand); (3) designing an efficient position distribution mechanism that can adapt its behavior under different dynamics and according to the routing requirements, to reduce the control overhead and provide more accurate and updated position information for efficient routing; and (4) adapting parameter settings in both protocols according to network environments and data traffic demands. In this section, we motivate and describe the role of SHORT in enhancing performance and power conservation of MANETs.

Shortest Path Based Routing

Paths generated by on-demand ad hoc routing protocols can deviate far from the shortest path. Because of the mobility of the nodes in ad hoc networks, the shape of routing paths may change significantly while the connectivity is intact. Most of the previously proposed on-demand routing schemes do not initiate a new path discovery process until there is a link failure (a node fails or moves out of range). The changes in shape can be exploited to derive better routing paths if we can avoid any significant overheads (at least avoid extra path discovery processes).

Problem Description

Consider a routing path from a source node A to a destination node I, as shown in Figure 1(a). This initial path is determined through the path discovery process, in which the distance between the source and destination is the shortest in terms of the number of hops, or very close to it. A packet takes eight hops while getting routed from A to I.
During the course of time, the mobility of the nodes may make the shape of the routing path similar to the one shown in Figure 1(b) while retaining the connectivity. In this new shape, J is in the transmission range of A, and E is in the transmission range of J. Similarly, H is in the transmission range of F. However, because of the usage of route caches and the validity of the existing routing information, the routing table entries are not updated. Although functionally adequate, using the routing paths of Figure 1(b), a packet still takes eight hops to reach from node A to node I. Ideally, the shortest path from A to I needs only five hops, as shown in Figure 1(c).
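The short-cut idea above can be sketched as a greedy scan over an existing route: from each hop, jump to the farthest downstream route node that is now within transmission range. The route and link set below are invented for the sketch; in the paper's figure the short-cut also passes through an off-route neighbour (J), which the real SHORT scheme finds distributedly via neighbour monitoring rather than a centralized scan:

```python
def optimize_route(route, links):
    """Greedy short-cut removal: from each node, jump to the farthest
    downstream node on the route that is directly reachable.
    `links` is a set of frozensets, one per bidirectional radio link."""
    optimized = [route[0]]
    i = 0
    while i < len(route) - 1:
        nxt = i + 1  # the existing next hop is always reachable
        for j in range(len(route) - 1, i + 1, -1):
            if frozenset((route[i], route[j])) in links:
                nxt = j  # short-cut found: skip the intermediate hops
                break
        optimized.append(route[nxt])
        i = nxt
    return optimized

# An eight-hop route A..I, plus two new links created by mobility.
route = list("ABCDEFGHI")
links = {frozenset(p) for p in zip(route, route[1:])}
links |= {frozenset("AD"), frozenset("FH")}
print(optimize_route(route, links))
```

With the two new links, the eight-hop route collapses to five hops without any fresh path discovery, which is exactly the saving the problem description is after.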


Figure 1: An example of the changes in routing paths

The goal of this paper is to identify such situations and to self-heal and optimize the paths dynamically by modifying the entries of the routing tables. The primary goal of the solution approach is to discover short-cut routing paths as and when feasible.

Literature Survey

The performance comparison of various protocols in the literature reveals the following.

Among the aspects considered were protocol overhead and behaviour in low BW (Bandwidth) and power-constrained networks.


The optimal routing strategy depends on the underlying network topology, its rate of change, and the traffic pattern, as well as on the environment and network conditions. The above performance comparison of protocols was the result of experiments and simulations conducted using software at Huazhong University of Science and Technology, Wuhan, China. There is no best ad hoc routing protocol suitable for all circumstances. Hence, some new routing strategy, such as adaptable routing, has to be applied to a MANET node for higher and consistent throughput. In fact, what we need is a single protocol that gives the best features of all in a suite of reactive and proactive protocols. Therefore, three routing techniques are proposed. The first technique, DRR (Dynamic Reconfigurable Routing), is to study and implement the change-over of routing protocols from DSR to AODV and vice versa using Glomosim. In the second routing technique we have adopted a self healing routing methodology to find the shortest path for the ad hoc on demand routing protocol. The third proposed routing technique is an adaptable routing strategy for a MANET node with a reconfigurable FPGA. The proposed reconfigurable node will be based on effective selection of any of the four protocols, whichever is most suitable for the existing network condition. The selection is done by another algorithm called ATC.

Figure 1: Node speed vs Throughput for AODV and DSR

Implementation of DRR

The first proposed technique is simulated using the Global Mobile Simulator (Glomosim). The DRR method monitors the throughput continuously. When the

throughput value plunges due to a sudden change in mobile speed or due to an increase in the number of nodes in a real time scenario, Glomosim is modified to adapt to a different protocol to maintain the throughput. With the dynamic adaptation of DSR or AODV, DRR maintains a throughput level of 3600 bytes per second, in which 256 bytes correspond to a packet. In the performance graph shown in Figure 1, between node speed and throughput, DRR achieves the maximum using DSR up to 4 m/s and using AODV from 4 m/s to 20 m/s. In Figure 2, between number of nodes and throughput, DRR achieves the maximum using AODV up to 200 nodes and using DSR from 200 to 700 nodes. Therefore, by manipulating the configure.sys file in Glomosim, we are able to achieve maximum throughput for all speeds up to 20 m/s. Likewise, performance is studied for throughput versus terrain size, pause time, etc.
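The reported switch-over points can be captured as small selection rules. The thresholds (4 m/s and 200 nodes) come from the results above; how DRR combines the two criteria when they disagree is not specified in the text, so the code keeps them as separate rules rather than guessing a tie-break:

```python
# Switch-over points taken from the DRR simulation results in the text:
# DSR gives the best throughput up to 4 m/s, AODV from 4 to 20 m/s;
# AODV is best up to 200 nodes, DSR from 200 to 700 nodes.

def select_by_speed(node_speed_ms):
    """Protocol choice when node speed is the dominant factor."""
    return "DSR" if node_speed_ms <= 4 else "AODV"

def select_by_node_count(num_nodes):
    """Protocol choice when network size is the dominant factor."""
    return "AODV" if num_nodes <= 200 else "DSR"

# A fast-moving, mid-sized network: the two criteria happen to agree.
print(select_by_speed(12), select_by_node_count(150))
```

In the simulator, this kind of rule is what triggers the configure.sys change that swaps the active protocol.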

Figure 2: Number of Nodes vs Throughput for AODV and DSR

Self Healing Routing Technique

The second proposed routing technique is called the self healing routing technique for the ad hoc on demand routing protocol. It is based on an algorithm in which, whenever a shorter path subsequently becomes available for a route already obtained using a routing protocol, the new route is adopted so that further transmission of packets increases throughput. The AODV protocol is used with this approach. Paths generated by on-demand ad hoc routing protocols can deviate far from the shortest path. Because of the mobility of the nodes in ad hoc networks, the shape of routing paths may change significantly while the connectivity is intact. Most of the previously proposed on-demand routing schemes do not initiate a new path discovery process until there is a link failure (a node fails or moves out of range). This leads to a reduction in throughput and increases end-to-end delay. The proposed self healing routing technique monitors the network for topology changes and gradually starts self healing a link before it gets broken. Since different types of routing protocols work in different manners, we need different algorithms for short-cut identification and reaction. The Path-aware self healing distance vector (Pa-sh-dv) algorithm works with AODV. The self healing technique is simulated using

Glomosim, and the results are obtained for packet delivery rate versus node speed between 0 m/s and 35 m/s. Both AODV and AODV-SH are simulated. The results show an increase in delivery rate for both the single source single destination and the multi source multi destination categories.

Figure 3: Delivery rate vs Node speed in m/s

In this figure, at lower speeds the delivery rate may be the same for both protocols, but once the speed picks up, the delivery rate for plain AODV goes down whereas it is maintained for AODV-SH. In Figure 4, the same effect is witnessed in the graph for multi source multi destination (MSMD).

Conclusion and Future Work

In this paper, we proposed a framework of self-healing and optimizing routing techniques for mobile ad hoc networks. Using this technique, all the neighboring nodes monitor the route and try to optimize it if and when a better local sub-path is available. Thus this technique enhances the packet delivery rate with respect to node speed and number of nodes. It adopts different schemes to obtain and maintain local topology information on data traffic demand. The aim of this work is the study of various routing techniques for MANET nodes to obtain the route through which data is to be transferred to the destination with maximum throughput. We have adopted the self healing routing methodology for the ad hoc on demand routing protocol with shortest path, using Glomosim. SHORT improves routing optimality by monitoring routing paths continuously and gradually redirecting the path towards a currently more optimal one. The basic idea is to let neighboring nodes of a routing path, together with the on-route nodes, monitor the route, so that up-to-date information about relative local topology and link quality is exploited. When a more optimal sub-path occurs, and is estimated to be stable, it is utilized to redirect the route. We have analyzed and evaluated various routing techniques. We can use other metrics in self-optimizing operations, such as QoS requirements. This will be the theme of our future work. In


addition, we plan to evaluate the effectiveness of SHORT for optimizing multicast tree/mesh constructions and maintenance.


ISSUES AND CHALLENGES OF E-COMMERCE IN CONTEMPORARY WORLD

Krishan Kant Yadav
Mahendra Singh
Dharmendra Singh

Abstract

The number of Internet users around the world has been growing steadily, and this growth has provided the impetus and the opportunities for global and regional e-commerce. However, different characteristics of the local environment, both infrastructural and socio-economic, have created a significant level of variation in the acceptance and growth of e-commerce in different regions of the world. Electronic commerce is a process of doing business through computer networks. In this growing age of globalization, online businesses are trying hard to please customers and generate more and more profits. For developing countries like India, e-commerce offers great opportunities both for customers and businesses. The emergence of the Internet as a general communication channel has opened the opportunity for e-commerce (EC) to expand worldwide. Electronic commerce utilizes information and communication technologies to carry out market transactions among two or more parties, usually businesses and consumers. This paper considers how e-commerce helps to expand the business unit, how it reduces the cost of products and services, how it yields social cost benefits, and how it helps governments to provide health care services and serve society. E-commerce has revolutionized business, changing the shape of competition with the internet. Computer communication networks create an e-commerce marketplace for consumers and businesses. India is showing tremendous growth in the field of e-commerce. The main objective of this paper is to explain the role of e-commerce in the contemporary world.

Keywords: E-commerce, Computer networks, Social cost benefits, Infrastructural and socio-economic, Information and communication technologies, Competition with internet.

1. Introduction

E-Commerce is briefly known as the sharing of business information, maintaining business relationships, and conducting business transactions by using computers interconnected by a telecommunication network. It draws on technologies such as mobile commerce, electronic funds transfer, supply chain management, Internet marketing, online transaction processing, electronic data interchange (EDI), inventory management systems, and automated data collection systems. Electronic commerce or e-commerce refers to a wide range of online business activities for products and services; it is a form of business transaction in which the parties interact electronically rather than by physical exchanges or direct physical contact. According to Hitesh Khurana, et al (2011), it is believed that the low cost of personal computers, a growing installed

base for Internet use, and an increasingly competitive Internet Service Provider (ISP) market will help fuel e-commerce growth. In the wake of the globalization and liberalization of the Indian economy, there has been a sweeping transformation in almost all spheres of trade, industry and commerce. In this scenario, organizations have to face new challenges, threats and opportunities in terms of technology, quality, fierce competition, customer relations, human resource development, hedging of financial risk, and so on. In this growing age of globalization, online businesses are trying hard to please customers and generate profits. In the emerging global economy, e-commerce and e-business have increasingly become a necessary component of business strategy and a strong catalyst for economic development. The integration of information and communications technology (ICT) in business has revolutionized relationships within organizations and those between and among organizations and individuals.

1.1 E-commerce

Electronic commerce is the paperless exchange of business information using electronic data interchange (EDI), e-mail, electronic bulletin boards, fax transmissions, and electronic funds transfer. It refers to Internet shopping, online stock and bond transactions, and business-to-business transactions. The concept of e-commerce is all about using the Internet to do business better and faster. E-commerce can occur within and between three basic participant groups: businesses, government, and individuals.

Figure 1: E-Commerce Participant Groups

1.2 E-Commerce vs. E-Business



E-commerce refers to online transactions: buying and selling goods and/or services over the Internet. E-business covers online transactions, but also extends to all Internet-based interactions with business partners, suppliers and customers, such as selling direct to consumers, manufacturers and suppliers; monitoring and exchanging information; auctioning surplus inventory; and collaborative product design. These online interactions are aimed at improving or transforming business processes and efficiency. Although the terms e-commerce and e-business are often used interchangeably, they are distinct concepts. In e-commerce, information and communications technology (ICT) is used in inter-business or inter-organizational transactions (transactions between and among firms/organizations) and in business-to-consumer transactions (transactions between firms and individuals). In e-business, on the other hand, ICT is used to enhance one's business. It includes any process that a business organization (either a for-profit, governmental or non-profit entity) conducts over a computer-mediated network (Bhaskar Bharat, 2009). E-commerce and e-business both address these processes, as well as a technology infrastructure of databases, application servers, security tools, systems management and legacy systems. And both involve the creation of new value chains between a company and its customers and suppliers, as well as within the company itself.

1.3 Role of E-Commerce in the Contemporary World

Many transactions and events depend upon the e-activities carried out through e-commerce, reducing time and energy and, in short, reducing cost. There are many applications of e-commerce; some of these are as follows:
1.3.1 Electronic Auction
1.3.2 Electronic Banking
1.3.3 Education and Learning
1.3.4 Marketing
1.3.5 Supply Chain Management
1.3.6 Electronic Trading


Figure 2: Role of E-Commerce

1.3.1 Electronic Auction
Auctions have been a well-established market mechanism for trading items at a market-negotiated price, based upon demand and supply. Traditional auctions had limited participation: only the people who turned up at the place of auction could bid. Today, the same auction mechanism can be implemented using electronic commerce technologies, allowing anyone connected to the Internet to bid. Auctions have been utilized as a useful economic mechanism, and online trading communities pioneered person-to-person online trading.
1.3.2

Electronic Banking
The increase in penetration of personal computers in the home segment has led to the emergence of several financial management software packages such as Quicken, Microsoft Money, and Peachtree. Packages such as Quicken permit users to organize, interpret and manage personal finances. Using Quicken, users record and categorize all financial transactions on a PC. The user can later use the software to balance the checkbook, summarize credit card purchases, and track stocks and other investments. ICICI Bank, Citibank, HDFC Bank, and IndusInd Bank have been offering Internet banking services for the past few years.
1.3.3 Education and Learning
The Internet has lately been used as a delivery vehicle for training and learning as well. Web technology provides a uniform delivery mechanism for textual, multimedia and

animated content. The market research group IDC defines e-learning as the concept of delivering training over the Internet to the desktop. E-learning has already taken powerful root and is emerging most predominantly in the information technology universe, presumably because IT professionals are more comfortable working with the new technology. Training and continuing education in the field of information technology has evolved from what was once defined by the necessity of spending hours outside an office in a classroom, or hours in front of a computer reviewing flat, computer-based training (CBT) presentations, to a flexible anytime-anywhere convenience mode.
1.3.4 Marketing
Traditional marketing practices have relied upon one-way communication due to the nature of the media. Surveys to steer the direction of a company and to gauge consumer preferences, inclinations and barriers took time to collect, process, and publish. Internet and electronic commerce technologies have been utilized to mitigate some of these problems. Internet-enabled marketing is not a substitute for traditional marketing, but has emerged as a good augmenting mechanism. With the interactivity offered by the Internet, marketing communication need not be a one-way mode anymore. The Internet can serve as a medium by itself for delivering communication, including advertisements. Several new models have already emerged and have given rise to a multibillion-dollar Internet advertising industry.
1.3.5 Supply Chain Management
The inter-organizational business process that chains the manufacturer, logistics companies, distributors, suppliers, retailers and customers together to facilitate order generation, execution and fulfillment has evolved over the past quarter of a century. In addition to product quality, customers deal with businesses depending upon their ability to execute the handling and delivery reliably and promptly. Supply Chain Management deals with three issues:
1.
Coordinating all the order processing activities that originate at the customer level, such as the process of order generation, order acceptance, entry into the order processing system, prioritization, production, and material forecast;
2. Material-related activities such as scheduling, production, distribution, fulfillment and delivery; and
3. Financial activities such as invoicing, billing, fund transfer, and accounting.
1.3.6 Electronic Trading
Electronic trading, in short, is a mechanism that utilizes the power of electronics and communication media, such as the Internet, to bring together geographically dispersed buyers and sellers on a virtual common trading platform. The common platform offers aggregated information to all participants in a fair manner. The platform facilitates access to aggregate information, order booking, and fulfillment.

In the context of the stock market, e-trading means buying and selling equity online through electronic means. Buyers and sellers registered for electronic trading, rather than relying on phone conversations to track and collect information followed by faxed orders to buy or sell, can use a do-it-yourself paradigm. Investors can access their accounts with the broker by logging on to the network (Bhaskar Bharat, 2009).
2.

E-Commerce Issues

In the above sections of this review paper we have discussed a number of points that show the tremendous growth and opportunity of e-business and e-commerce, but have not yet discussed their issues. Emerging e-business companies are facing a number of issues nowadays. These are:
2.1 Legal Issues
2.2 Technical Issues
2.3 Ethical Issues

Figure 3: E-commerce Issues
2.1 Legal Issues
There is a chance of crime over the Internet when buyers and sellers do not know each other and cannot even see each other. Fraud committed over the Internet is an immense legal issue.
Fraud on the Internet: E-commerce fraud popped up with the rapid increase in the popularity of websites. It is a hot issue for both cyber and click-and-mortar merchants. The swindlers are active mainly in the area of stocks, where small investors are lured by stock promoters with the promise of false profits. Auctions are also conducive to fraud, by both sellers and buyers. The availability of e-mails and pop-up ads has paved the way for financial criminals to gain access to many people.
Copyright: Copyright laws protect intellectual property in its various forms, which cannot be used freely. It is very difficult to protect intellectual property in e-commerce. For example, if you buy software you have the right to use it but not the right to distribute it; the distribution rights remain with the copyright holder. Copying content from a website also violates copyright laws.
Domain Names: The competition over domain names is another legal issue. Internet addresses are

known as domain names, and they appear in levels. A top-level name is qburst.com or microsoft.com; a second-level name (subdomain) would be blog.qburst.com. Top-level domain names are assigned by a central non-profit organization, which also checks for conflicts or possible infringement of trademarks. Problems arise when several companies with similar names compete over the same domain name. The problem of domain names was alleviated somewhat in 2001 after several upper-level names were added alongside .com. Another issue to look out for is cybersquatting, which refers to the practice of registering domain names with the desire of selling them at higher prices. Security features such as authentication, non-repudiation and escrow services can protect the sellers in e-commerce. One needs to be careful while doing e-commerce activities. The need to educate the public about the ethical and legal issues related to e-commerce is highly important from both a buyer and a seller perspective.
2.2 Technical Issues
Privacy is a serious issue in electronic commerce, no matter what source one consults. Privacy (the control over one's personal data) and security (protection of data against access by unauthorized others) are two critical issues for e-commerce sites and consumers alike (Mark S. Ackerman and Donald T. Davis, 2014). Even the world's largest sites, such as Amazon and eBay, have fallen victim to concerted hacking attacks. A site operator should ensure that information is never stored in an unprotected directory accessible from the web.
2.3 Ethical Issues
Online commerce has been growing rapidly since liberalization and globalization came into existence, and business operations of every size have websites for the sale of their goods and services. With the anonymity of the Internet, however, it is very difficult for a buyer to really know and trust the seller. It is not difficult for computer hackers to access vital personal information online.
When customers use credit cards, they transmit card numbers, expiration dates and security codes over the Internet, where they are vulnerable to theft. A debit card number and expiration date can provide direct access to a bank account, although the unauthorized user in most cases would also need a PIN. To avoid these issues, e-commerce sites need to use up-to-date security software to encrypt personal information. Online sellers can ship damaged or counterfeit goods to customers, or fail to ship any goods at all. They may refuse returns or fail to give credit to the customer who in good faith returns the purchase. They may fail to protect goods in shipment and refuse to take any responsibility when the goods are damaged. Unresponsiveness is another common complaint in the e-commerce world; websites may offer a customer help line but never answer it, or divert the customer to the wrong number.
3.

E-Commerce: Challenges

India is one of the most attractive emerging markets for e-commerce, but it is far from being a bed of roses. A few of the challenges that e-commerce businesses face are given below.

3.1 Payment Mode
3.2 Failure Rate of Payment Gateways
3.3 Internet Penetration is Low
3.4 Postal Addresses are not Standardized
3.5 Logistics Problems
3.1 Payment Mode
Low credit card penetration and low trust in online transactions have led to cash on delivery being the preferred payment option in India and worldwide. Unlike electronic payments, manual cash collection is painstaking, unsafe, and expensive.
3.2 Failure Rate of Payment Gateways
As if the preference for cash on delivery were not bad enough, Indian payment gateways have an unusually high failure rate by global standards. E-commerce companies using Indian payment gateways are losing out on business, as several customers do not re-attempt failed payments; it is therefore important for e-commerce companies to reduce payment gateway failure rates as much as possible.
3.3 Internet Access
Internet access is a challenging issue for emerging e-commerce companies because the quality of connectivity is poor in several regions and can disappear anytime and anywhere, although the day is not far when connectivity issues will no longer feature in a list of challenges to e-commerce (Ajeet Khurana, 2013).
These are common challenges that every e-commerce company is facing; non-standardized postal addresses and logistics problems are also very common. If postal addresses are not declared correctly, logistics problems automatically multiply.
4.

E-Commerce Applications: Issues and Prospects
Various applications of e-commerce are continually affecting trends and prospects for business over the Internet, including e-banking, e-tailing and online publishing/online retailing. A more developed and mature e-banking environment plays an important role in e-commerce by encouraging a shift from traditional modes of payment (i.e., cash, checks or any form of paper-based legal tender) to electronic alternatives (such as e-payment systems), thereby closing the e-commerce loop.
Benefits of E-Commerce

Expanded Geographical Reach
Expanded Customer Base
Increased Visibility through Search Engine Marketing
Provide Customers Valuable Information about Your Business
Available 24/7/365 - Never Closed
Build Customer Loyalty
Reduction of Marketing and Advertising Costs
Collection of Customer Data
5.

Conclusion
Even though there have been earlier studies that have tried to understand and address issues related to e-commerce, very few have focused on the impact of culture and non-infrastructure-related issues. We were partially able to address this deficiency by conducting primary research on the development and acceptance of e-commerce in a developing country that has very unique cultural characteristics. Our findings show that, even though a developing country's government may make the necessary investments in infrastructure, cultural and other non-infrastructure issues continue to shape the acceptance of e-commerce. E-commerce is continuously progressing and is becoming more and more important to business as technology continues to advance, and it is something that should be taken advantage of and implemented. It offers consumers all-time availability and support, quicker delivery of products, virtual auctions, and substantial discounts; it also benefits society by reducing the cost of products and sales prices, and it helps governments deliver public services such as health care and education. We therefore conclude that electronic commerce is playing a vital role in the development of the nation.

6. References

1. Ajeet Khurana, "8 Challenges for E-commerce in India", December 23, 2013. Accessed from http://blogs.pb.com/ecommerce/2013/12/23/8-challenges-ecommerce-india/ on October 19, 2014.
2. Bhasker, Bharat (2009), Electronic Commerce: Framework, Technologies and Applications (3rd Edition), Tata McGraw Hill Education Private Limited, New Delhi, India.
3. E-business versus E-commerce. Accessed from http://www.austrade.gov.au/e-business-versus-ecommerce/default.aspx on October 18, 2014.
4. E-commerce. Accessed from http://en.wikipedia.org/wiki/E-commerce on October 18, 2014.
5. Hitesh Khurana, Manoj Kr. Goel, Hardeep Singh and Leena Bhutani, "E-Commerce: Role of E-Commerce in …", IJBMR, Vol. 1 (7), 2011, 454-461, ISSN No. 2231-248X.
6. Introduction. Accessed from http://link.springer.com/chapter/10.1007/978-1-4614-4142-7_4#page-1 on October 18, 2014.
7. Legal Issues. Accessed from http://blog.qburst.com/2011/03/e-commerce-ethical-and-legal-issues/ on October 19, 2014.
8. Mark S. Ackerman and Donald T. Davis. Accessed from http://web.eecs.umich.edu/~ackerm/pub/03e05/EC-privacy.ackerman.pdf on October 19, 2014.


9. …Commerce)", Episteme: An Online Interdisciplinary, Multidisciplinary & Multi-Cultural Journal, Bharat College of Commerce, Badlapur, MMR, India, September 2013, Volume 2, Issue 2.
10. …E-commerce Technology (Eastern Economy Edition), PHI Learning Private Limited, New Delhi, India.
11. The Definition of E-Commerce. Accessed from http://www.businesstown.com/internet/ecomm-definition.asp on October 18, 2014.
12. The Ethical Problems in E-Business. Accessed from http://smallbusiness.chron.com/ethical-problems-ebusiness-62037.html on October 19, 2014.
13. www.libguides.radford.edu/content.php?pid=49958&sid=558025.


A FIRST DATA MINING MODEL FOR PREDICTING CUSTOMER PROFITABILITY
Dr. Madhur Srivstava
Dr. Dharmendra Badal
Abstract
The purpose of this paper is to walk you through a complete, real-life scenario for using data mining to predict customer profitability. Imagine you are a commercial organization, continually receiving requests for goods or services from new customers. You want to predict how much business a new customer will bring you in the longer term. This might help you determine which customers deserve special consideration. To do this you will use your history of new customer orders and how the customers developed in the longer term. In this walkthrough, you will use a classic data mining method to analyze real-life data.
Sample Data for Data Mining
Data mining was once a very expensive technology that was used by small elite teams in the head offices of a few very large companies. Now that data mining algorithms have become a commodity, anyone can employ them at a very low cost. For example, data mining tools have been included in SQL Server since 1998, and new tools, such as the Data Mining Add-Ins for Excel, can be downloaded for free. In data mining, you really need real data to discover interesting relationships. So for this paper we will use some real data that I have, although it is not customer data. The data is the HTTP web logs from my Internet site. The structure of the Internet log data is similar to what a commercial organization might have for new customer requests. Customer requests would typically have attributes such as the customer name and demographics, product and channel, whereas the HTTP requests have such attributes as the client, the location from which the request was issued, agent, resource requested, etc. While a commercial organization might want to predict profitability, we will predict HTTP response times.
Predicting Customer Profitability
Essentially our challenge is that we have a collection of attributes, some discrete, some continuous, and we want to predict a continuous variable (a number). A typical customer profitability model would use customer demographics, channel, product requested, volume, etc., to predict future profitability. Our model will use geographic location, client agent, resource, etc., to predict response time. There are thousands of ways that you could use a very similar structure to predict a continuous number for a different real-world application. These applications include predicting delivery time, project time, project revenue, employee tenure, lease duration, residual value, etc. Therefore, once you have gone through this exercise, it should be very easy for you to build your own customer profitability model or other similar model using SQL Server.


We will build a data mining model using the following attributes to predict response time:
City
Country
Client operating system
Client agent (browser and version)
HTTP status
HTTP operation
Referring server
Target resource
Target resource type (e.g., .htm, .jpg, .zip)
Bytes out
In a customer profitability prediction model, the attributes would be different, but in both cases they are a mix of discrete variables (such as country) and continuous variables (such as bytes out and ordered volume). In any data mining exercise, one of the first tasks is to identify the input variables and the output (predicted) variable(s). Note that in some data mining tasks there is no desire to predict a variable; for example, some clustering exercises do not make predictions, they just cluster.
Designing the Data Mining Model in SQL Server 2008
Building the Data Mining Model
1. Start the SQL Server Business Intelligence Development Studio from Start/Programs/SQL Server 2008.
2. From the File menu, click New/Project and select Analysis Services Project.
3. Add a Data Source that points to the SQL Server database that you have downloaded and restored.
4. Add a Data Source View, and add to it all the tables and views from the SQL database.
5. Right-click Mining Structures and then click New Mining Structure.
6. Accept the default data mining technique of Decision Trees. Later you can add other models, based on different algorithms, and apply them to the same data.


7. Select InternetLog as the case table. The key of InternetLog is the RID column.
8. Select all other columns as input columns, except ResponseTime and LogTime. Since we do not yet know which ones are useful, we can include them all as candidate columns.
9. Check the Predict column for ResponseTime. The diagram below shows the dialog boxes provided by BIDS for choosing columns and setting the content types for each column.
10. Click Next and accept the default training size of 30 percent. If your model is taking too long to process, or if you are using a server without much memory or a slow CPU, you will want to change the model settings to reduce processing time. One way to reduce processing time is to increase the holdback percent, meaning you reduce the amount of data that is used for training. In a large data set, you can increase the size of the holdback to as high as 90 percent. Another way to speed up processing is to reduce the number of input variables. For example, bytes-in and city are probably of little use, so you can remove them from your model.
11. Accept the default data mining structure name, but add DT to the end of the model name. This is to help you remember that this model is using a decision trees algorithm.
12. Click Finish and process the data mining model.
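The wizard steps above amount to fitting a regression tree: mixed discrete and continuous inputs, one continuous predicted column, and a 30 percent holdback for testing. As a rough illustration of the same idea outside SQL Server, the following pure-Python sketch grows a one-split "tree" on synthetic log records; every name and number in it is invented for illustration and is not from the paper's dataset.

```python
# Illustrative sketch only: a one-split regression "tree" on synthetic records,
# mimicking what the wizard configures (continuous target ResponseTime,
# 30 percent of the data held back for testing). Not SQL Server code.
import random

random.seed(42)

# Synthetic cases: (bytes_out, response_time). The real model also uses
# discrete attributes (city, country, client agent, HTTP status, ...).
cases = [(b, 200 + 0.1 * b + random.gauss(0, 20)) for b in range(0, 5000, 50)]

# Step 10 analogue: hold back 30 percent of the data for testing.
random.shuffle(cases)
cut = int(len(cases) * 0.7)
train, holdout = cases[:cut], cases[cut:]

def sse(ys):
    """Sum of squared errors of ys around their mean."""
    m = sum(ys) / len(ys)
    return sum((y - m) ** 2 for y in ys)

# Tree induction, reduced to its core: pick the single split on bytes_out
# that minimizes the combined squared error of the two leaves.
best = None
for threshold in sorted({b for b, _ in train}):
    left = [r for b, r in train if b < threshold]
    right = [r for b, r in train if b >= threshold]
    if left and right:
        cost = sse(left) + sse(right)
        if best is None or cost < best[0]:
            best = (cost, threshold, sum(left) / len(left), sum(right) / len(right))

_, threshold, left_mean, right_mean = best

def predict(bytes_out):
    """Leaf prediction: the mean response time on each side of the split."""
    return left_mean if bytes_out < threshold else right_mean

print(threshold, round(predict(100), 1), round(predict(4900), 1))
```

A real decision tree repeats this split recursively per node; SQL Server's trees additionally fit per-leaf regressions on the continuous inputs, as the node caption shown later in the walkthrough illustrates.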

Figure 1
Viewing the Data Mining Model
The data mining model has been processed and we can now view what the model has found.


Because our model is a decision tree structure, we can view the model in its tree view. There is one tree for every discrete outcome; because this model has one continuous outcome (ResponseTime), there is only one tree. Note that your decision tree may look a little different from this picture, since SQL Server randomly selects which records to hold back from training for validating the model.

Figure 2
Step 1. Click on the Mining Model Viewer tab and select the nested Decision Tree tab. You will see the first five levels of the decision tree structure. Each node in the tree is represented in the viewer as a box with a decision heading such as ResourceType=gif. If you focus on any one of the nodes, you can see the outcome at that node and read information about it. For example, in one of the nodes I can read the following:
Criteria for this node: Bytes Out < 1821932 and OK not = 'NOK' and ResourceType not = '.gif' and Resource = '/IndicatorImage.aspx'
Support for this node: Existing Cases: 4122. This tells you how many records in the training data satisfy the criteria above.
The prediction of our output variable, ResponseTime: notice how the prediction is not just a number. This is because we had continuous input columns (BytesIn and BytesOut). Essentially, the decision tree is using these continuous input variables as regressors to assist with the prediction. It predicts the ResponseTime at this particular node to be 474.474 + 0.085*(Bytes In - 1,017.390) - 0.190*(Bytes Out - 556.796). It is logical to me that it would use BytesIn and BytesOut in this way: the larger the number of bytes transferred, the longer the response time, given that other attributes are the same. Note that the units of ResponseTime are milliseconds, the same as the source data.
ResponseTime = 474.474 + 0.085*(Bytes In - 1,017.390) - 0.190*(Bytes Out - 556.796)
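To make the node's arithmetic concrete, the reported leaf formula can be evaluated directly. The coefficients below are copied from the node caption above and are specific to this one trained model; a different training holdout would produce different numbers.

```python
def leaf_response_time(bytes_in, bytes_out):
    """Per-leaf regression reported at the decision-tree node above.
    Returns a predicted ResponseTime in milliseconds."""
    return 474.474 + 0.085 * (bytes_in - 1017.390) - 0.190 * (bytes_out - 556.796)

# At the leaf's mean traffic the prediction is just the intercept:
print(round(leaf_response_time(1017.390, 556.796), 3))  # 474.474

# More bytes in => longer predicted response time, as the text argues:
print(leaf_response_time(2000.0, 556.796) > leaf_response_time(1017.390, 556.796))  # True
```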

Step 2. Now click on the Dependency Network nested tab. This tab provides an extremely simplified view of the relative importance of the input variables in predicting our output variable. That is, the graph indicates how useful each of the variables is in predicting response time. I say it is extremely simplified because the attributes do not work alone, but they work in combinations. However, the diagram can be very useful: for example, I find this type of graph particularly useful in a basket analysis model, where it will highlight the dependencies between products.

Figure 3
Step 3. Click on Response Time. All of the input variables will change color to indicate how strongly they help predict response time.
Step 4. Now drag the slider bar on the left downward. Notice how some of the input variables are shaded out. These are the less important variables.
Step 5. Keep sliding the bar downward until you isolate the most important input variable. In my model it happens to be ResourceType. That is logical; for example, the response time will vary largely depending on whether the HTTP resource is a .jpg, .aspx, or .zip. SQL Server makes it very easy to try other algorithms for this model. I suggest that you try creating some additional models after you have completed this paper. To add a new model to existing data, simply click on the Create a related mining model button in the Mining

Models tab. Note that not every algorithm is suitable for this particular model; for example, the time series algorithm is not an appropriate choice for these input and/or output variables. If you get an error, you can choose to ignore continuous input variables, or you might choose to discretize the continuous variables. SQL Server can help you do this.
Determining Model Accuracy
Microsoft SQL Server 2008 comes with tools to help you determine the accuracy of your preceding analysis.
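The accuracy check described next boils down to computing the correlation coefficient between actual and predicted response times on held-back data. A minimal pure-Python sketch of that computation follows; the numbers are made up for illustration and are not the paper's data.

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical actual vs. predicted response times (ms) for six test rows.
actual    = [120.0, 340.0, 95.0, 480.0, 210.0, 600.0]
predicted = [150.0, 310.0, 110.0, 450.0, 260.0, 550.0]

r = pearson_r(actual, predicted)
print(round(r, 3))  # 0.993 -- close to 1, so the model tracks the actual values well
```

A perfect predictor gives r = 1 (all scatter-plot dots on the diagonal); anything above 0 beats random guessing, matching the guidance in the walkthrough.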

Figure 4
In this response time model, we are predicting a continuous number. So how do we know whether the model does a good job of predicting that number? A good way to validate and compare models is to determine the correlation coefficient between the actual response time and the predicted response time. The higher the correlation coefficient, the stronger the correlation, and therefore the better the model. When testing the validity of your model it is important to test it with data that it has not seen before. When you create a model, by default SQL Server will hold back 30 percent of the data for testing, though you can change this amount by setting the HoldOutMaxPercent property. To generate an

accuracy chart, SQL Server will use this held-back data, or you can use your own dataset. For this walkthrough, the sample database contains a table called InternetLogNew, which contains new HTTP records that the model has not used for training. InternetLogNew is a view over log records that the model has not been trained on, and the view will randomly select 50 records from this table. Therefore, each time you refresh the scatter chart, a new 50 records will be selected, so the results might be slightly different depending on which records are chosen.
1. Click on the Mining Accuracy Chart tab. In the Input Selection nested tab you are able to select the test data you wish to use. By default, it will use the data held back, but for this walkthrough, select another table, InternetLogNew.
2. Click on the Lift Chart and select Scatter Plot. SQL Server will plot predicted values against actual values for your test data against the model. If the data mining model were a perfect predictor, all the dots would be along the diagonal. If the data mining model is no good as a predictor, the dots will tend to be scattered randomly across the chart. Generally you will find that the dots will be scattered, with a tendency to group along the diagonal. However, do not expect to always see the data tightly clustered along the diagonal. Data mining models can be helpful even though their correlation does not look very strong. Any correlation above 0 means that the model can help you make predictions more accurately than random guessing.
Querying the Data Mining Model
This is an area where I believe Microsoft SQL Server has taken data mining to a new level. SQL Server makes it very easy to dynamically query the data using numbers entered at the client, or data coming from a relational database. The following procedure describes how to create queries to get predictions or get more details about the patterns in the model.
1. Click on the Mining Model Prediction tab.
2.
Click on the Select Case Table button and select InternetLogNew.
3. In the lower grid, click on the first row below Source. In the drop-down combo list select the InternetLogNew table. In the Field column leave it at RID.
4. Repeat the above step for every column in the InternetLogNew table. Although it can be tedious, I suggest that you use the wizard this first time to build the basic query. After you learn how to edit the data mining query, you can bypass the UI and type the column names directly if you want.
5. Add one more row. This time, for Source, select Prediction Function and in the Field column select Predict. There are actually two Predict functions, one for scalar columns and one for tables.


6. Using your mouse, drag Response Time from the data mining model (not the SQL table) on the top left of your screen to the Criteria/Argument column. The wizard will fill out the field.

Figure 5
7. Click on the Switch to query result view button at the top left of the screen. You will now see the data from the new Internet records and the associated data mining prediction. The query behind this view is not complex: click on the top left button and select query view, and you will see the DMX query that the wizard has generated for you. The DMX language has been designed to be as close to SQL as possible, and I think you will find DMX a relatively easy language to learn. In my opinion it is much simpler to learn than MDX, which is the OLAP cube query language.

Note that the OpenQuery requests data from the relational database using the view, InternetLogNew. This view is a query that gets the latest Internet log records (or, in our demonstration, randomly selects log records that are not in the training data). The T-SQL statement for the view is provided here. You can see that it will get the top 50 records and that it is ordered by a random key, so that a different 50 records are retrieved on each request. In your production implementation, you would not order by a random key, but rather would select the last 50, or the top n records ordered by their predicted profitability, etc.
Transact-SQL:
CREATE view [dbo].[InternetLogNew] as
SELECT top 50
 l.rid
,l.LogTime
,l.ResponseTime
,l.BytesIn
,l.BytesOut
,g.Country
,g.State
,g.City
,t.resource
,t.resourcetype
,ca.clientagnt
,ca.os
,case when l.ResponseTime>60000 and BytesOut=0 then 'NOK' else hc.OK end OK
,hc.http_status
,o.operation
,r.referringServer
,ch.ClientHost


FROM InternetLogTableNew l
inner join ClientHosts ch on ch.clienthostid=l.clienthostid
inner join IPCountry ipc on ipc.IPCountryID=ch.IPCountryID
inner join Geographies g on g.GeographyID=ipc.GeographyID
inner join Targets t on t.Targetid=l.targetid
inner join ClientAgents ca on ca.clientagentid=l.clientagentid
inner join HTTP_codes hc on hc.httpstatuscode=l.Status
inner join Operations o on o.operationid=l.operationid
inner join Referrers r on r.ReferrerID=l.ReferrerID
ORDER BY RAND(convert(float,bytesin)*DATEPART(ms,GETDATE())/1000.0)
+RAND(convert(float,bytesout)*DATEPART(ms,GETDATE())/1000.0)
This particular query makes predictions for testing; however, another way customers are using data mining models is to make predictions in bulk. The predictions are then stored back in the data warehouse or in an OLAP cube for aggregated analysis. For example, you might predict summarized revenue by region, salesperson, or product group.
Creating Reports
Sometimes data mining models are created only to perform an analysis and are then discarded. However, increasingly, organizations are building data mining models, like the one above, that become part of the everyday applications available to a wide number of staff and/or customers. Because SQL Server has generated the query for you, it is a very easy task to put the query into a Reporting Services report, and you then have a data mining prediction report. The use of data mining reports is transforming the way that data mining is employed, taking it from the exclusive preserve of elite head office teams to the whole organization. For example, I created the following report based on the query above, and then created a scatter chart by using the Reporting Services charting component.


Figure 6
To create a Reporting Services report that uses the data mining model to make predictions, follow these instructions. This will add a Reporting Services project to your data mining solution. I like the way BIDS (Business Intelligence Development Studio) can keep these projects in the same solution. It is not uncommon to have a BIDS solution with three projects: Integration Services (for data extraction and load), Analysis Services (for cubes and/or data mining models), and Reporting Services (for reports).
1. In your Visual Studio project, right-click on the solution and click Add New Project. Select Report Server Project, provide a name and click OK.


2. Right-click on Shared Data Sources, and set up a new data source that points to the Analysis Services instance where your data mining model is deployed.
3. Right-click on the Reports folder and select Add New Report.
4. Click Next.
5. Click Query Builder.
6. Click the right-most button, Design Mode.
7. Now paste the data mining query the wizard created for you earlier into the query pane.
8. Click OK.
9. Click Next in the Design the Query pane.
10. Select Tabular as the Report Type.
11. Add all the Available Fields to the Details section of the report by highlighting the fields and clicking on the Details button.
12. Select your favorite Table Style, or leave the default, and click Next. You can change it later.
13. Enter a name for your report, such as Response Time Prediction.
14. Click Finish.
15. You are now able to preview your report in the Preview pane. You can change the formatting, edit the DMX query, add a scatter plot chart, etc., and deploy the report to your Reporting Services server.
16. Notice how fast the data mining report is at making 50 predictions. Generally, you will find that data mining predictions are very fast; it is just the model processing that can take some time.
Deploying and Updating the Data Mining Solution
When you create your production data mining model, you may want to schedule a regular update of the model. This is necessary because the factors that predict a profitable customer will change over time, and you want the data mining model to adapt to those changes. So, for example, you might have the mining model reprocess at the end of each week or month. During reprocessing, the original data mining model remains available for online querying.


Conclusion

In this short time we have created a data mining model, validated the model, and used the model to create a user-friendly report. When you embed data mining in a report, it also becomes accessible to non-technical or customer-facing employees who know little about data mining but can benefit from the predictions. This white paper was intended to help you get started with data mining. Data mining is a wide and exciting area of business intelligence, and many organizations have yet to appreciate how they can profit from its use. I anticipate the application of data mining to grow exponentially over the next few years. By and large, the application of data mining is constrained only by our imagination.

Bibliography

1. http://richardlees.com.au/ (real-time data mining demonstrations)
2. http://richardlees.blogspot.com/ (one of many data mining bloggers)
3. http://msdn.microsoft.com/en-us/sqlserver/cc511476.aspx (Microsoft Technical Resources for Data Mining)
4. http://www.wiley.com/WileyCDA/WileyTitle/productCd-0470277742.html (good text about SQL Server Data Mining)
5. http://www.microsoft.com/sqlserver/2008/en/us/data-mining.aspx (Microsoft data mining marketing page)
6. http://social.msdn.microsoft.com/Forums/en-US/sqldatamining/threads (SQL Server data mining forum)
7. http://www.sqlserverdatamining.com/


REVIEW ON FEATURE EXTRACTION BASED ON DIAGONAL DIRECTION FOR HANDWRITTEN RECOGNITION SYSTEM USING NEURAL NETWORK

Samta Jain Goyal

Abstract

An off-line handwritten character recognition system using a multilayer feed-forward neural network is described in this paper. A new feature extraction method based on diagonal directions is introduced for extracting the features of handwritten characters. The proposed recognition system performs quite well, yielding higher levels of recognition accuracy compared to systems employing the conventional horizontal and vertical methods of feature extraction. This system will be suitable for converting handwritten documents into structural text form and for recognizing handwritten names.

Keywords: Handwritten character recognition, image processing, feature extraction, feed-forward neural networks.

1. Introduction

Handwriting recognition has been one of the most fascinating and challenging research areas in the field of image processing and pattern recognition in recent years [1] [2]. It contributes immensely to the advancement of automation and can improve the interface between man and machine in numerous applications. Several research works have focused on new techniques and methods that would reduce the processing time while providing higher recognition accuracy [3]. In general, handwriting recognition is classified into two types: off-line and on-line handwriting recognition methods. In off-line recognition, the writing is usually captured optically by a scanner and the completed writing is available as an image. In the on-line system, however, the two-dimensional coordinates of successive points are represented as a function of time, and the order of strokes made by the writer is also available. The on-line methods have been shown to be superior to their off-line counterparts in recognizing handwritten characters due to the temporal information available with the former [4] [5].
However, in off-line systems, neural networks have been successfully used to yield comparably high recognition accuracy levels. Several applications including mail sorting, bank processing, document reading and postal address recognition require off-line handwriting recognition systems. As a result, off-line handwriting recognition continues to be an active area of research towards exploring newer techniques that would improve recognition accuracy [6] [7]. The first important step in any handwritten recognition system is pre-processing, followed by segmentation and feature extraction. Pre-processing includes the steps that are required to shape the input image into a form suitable for segmentation [8]. In segmentation, the input image is segmented into individual characters and then each character is resized into m x n pixels for training the network.
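As an illustration of the binarization performed during pre-processing, a global threshold can be chosen automatically; the sketch below assumes Otsu's method (the paper does not state which global thresholding technique it uses):

```python
import numpy as np

def otsu_threshold(gray):
    """Pick a global threshold by maximizing the between-class
    variance (Otsu's method). `gray` is a 2-D uint8 array."""
    hist = np.bincount(gray.ravel(), minlength=256).astype(float)
    prob = hist / hist.sum()
    best_t, best_var = 0, -1.0
    for t in range(1, 256):
        w0, w1 = prob[:t].sum(), prob[t:].sum()   # class probabilities
        if w0 == 0 or w1 == 0:
            continue
        mu0 = (np.arange(t) * prob[:t]).sum() / w0        # class means
        mu1 = (np.arange(t, 256) * prob[t:]).sum() / w1
        var = w0 * w1 * (mu0 - mu1) ** 2                  # between-class variance
        if var > best_var:
            best_t, best_var = t, var
    return best_t

def binarize(gray):
    """Convert a gray-scale image to a 0/1 binary image."""
    return (gray >= otsu_threshold(gray)).astype(np.uint8)
```

Edge detection, dilation and hole filling would then be applied to the binary image before segmentation.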


The selection of an appropriate feature extraction method is probably the single most important factor in achieving high recognition performance. Several methods of feature extraction for character recognition have been reported in the literature [9]. The widely used feature extraction methods are template matching, deformable templates, unitary image transforms, graph description, projection histograms, contour profiles, zoning, geometric moment invariants, Zernike moments, spline curve approximation, Fourier descriptors, gradient features and Gabor features. An artificial neural network is used as the backend for performing classification and recognition tasks. In off-line recognition systems, neural networks have emerged as fast and reliable tools for classification towards achieving high recognition accuracy [10]. Classification techniques have been applied to handwritten character recognition since the 1990s. These methods include statistical methods based on the Bayes decision rule, Artificial Neural Networks (ANNs), kernel methods including Support Vector Machines (SVM), and multiple classifier combination [11], [12]. U. Pal et al. have proposed a modified quadratic classifier based scheme to recognize the off-line handwritten numerals of six popular Indian scripts [7]. A multilayer perceptron has been used for recognizing handwritten English characters [13]. The features are extracted from boundary tracing and their Fourier descriptors. A character is identified by analyzing its shape and comparing the features that distinguish each character. Also, an analysis has been carried out to determine the number of hidden layer nodes needed to achieve high performance of a back propagation network. A recognition accuracy of 94% has been reported for handwritten English characters with less training time.
Dinesh et al. [14] have used horizontal/vertical strokes and end points as the potential features for recognition and reported a recognition accuracy of 90.50% for handwritten Kannada numerals. However, this method uses a thinning process which results in the loss of features. U. Pal et al. [15] have proposed zoning and directional chain code features, considered a feature vector of length 100 for handwritten numeral recognition, and reported a high level of recognition accuracy. However, the feature extraction process is complex and time consuming. In this paper, a diagonal feature extraction scheme for recognizing off-line handwritten characters is proposed. In the feature extraction process, each resized individual character of size 90 x 60 pixels is further divided into 54 equal zones, each of size 10 x 10 pixels. The features are extracted from the pixels of each zone by moving along their diagonals. This procedure is repeated for all the zones, leading to the extraction of 54 features for each character. These extracted features are used to train a feed-forward back propagation neural network employed for performing classification and recognition tasks. Extensive simulation studies show that the recognition system using diagonal features provides good recognition accuracy while requiring less time for training. The paper is organized as follows. In section II, the proposed recognition system is presented. The feature extraction procedure adopted in the system is detailed in section

III. Section IV describes classification and recognition using a feed-forward back propagation neural network. Section V presents the experimental results and comparative analysis. In section VI, the proposed recognition system in a Graphical User Interface is presented and, finally, the paper is concluded in section VII.

2. The Proposed Recognition System

The proposed recognition system is described in this section. A typical handwriting recognition system consists of pre-processing, segmentation, feature extraction, classification and recognition, and post-processing stages.

2.1. Image Acquisition

In image acquisition, the recognition system acquires a scanned image as an input image. The image should have a specific format such as JPEG, BMP, etc. This image is acquired through a scanner, digital camera or any other suitable digital input device.

2.2. Pre-processing

Pre-processing is a series of operations performed on the scanned input image. It essentially enhances the image, rendering it suitable for segmentation. Several tasks are performed on the image in the pre-processing stage. Binarization converts a gray-scale image into a binary image using a global thresholding technique. Detection of edges in the binarized image using the Sobel technique, dilating the image and filling the holes present in it are the operations performed in the last two stages to produce the pre-processed image suitable for segmentation [16].

2.3. Segmentation

In the segmentation stage, an image of a sequence of characters is decomposed into sub-images of individual characters. In the proposed system, the pre-processed input image is segmented into isolated characters by assigning a number to each character using a labeling process. This labeling provides information about the number of characters in the image. Each individual character is uniformly resized into 90 x 60 pixels for the classification and recognition stage.

3.
Proposed Feature Extraction Method

In this stage, the features of the characters that are crucial for classifying them at the recognition stage are extracted. This is an important stage, as its effective functioning improves the recognition rate and reduces misclassification [17]. A diagonal feature extraction scheme for recognizing off-line handwritten characters is proposed in this work.

4. Classification and Recognition

The classification stage is the decision-making part of a recognition system and it uses the features extracted in the previous stage. A feed-forward back propagation neural network having two hidden layers is used to perform the classification. The hidden

layers use the log-sigmoid activation function, and the output layer is a competitive layer, as one of the characters is to be identified. The feature vector is denoted as X, where X = (f1, f2, ..., fd), f denotes the features and d is the number of zones into which each character is divided. The number of input neurons is determined by the length of the feature vector d. The total number of characters n determines the number of neurons in the output layer. The number of neurons in the hidden layers is obtained by trial and error. The most compact network is chosen and presented. The network training parameters are:

- Input nodes:
- Hidden nodes:
- Output nodes:
- Training algorithm:
- Performance function:
- Training goal achieved:
- Training epochs:
- Training momentum constant:

5. Discussion About An Idea

The recognition system will be implemented using Matlab 7.1. The scanned image is taken as the dataset/input and a feed-forward architecture is used. The structure of the neural network includes an input layer, two hidden layers and an output layer. The gradient descent back propagation method with momentum, an adaptive learning rate and log-sigmoid transfer functions is used for neural network training. A recognition system using two different feature lengths is built. The number of input nodes is chosen based on the number of features. After training the network, the recognition system will be tested using several unknown datasets and then the results will be compared. Two approaches with three different ways of feature extraction are used for character recognition in the proposed system. The three different ways of feature extraction are the horizontal direction, the vertical direction and the diagonal direction. In the first approach, the feature vector size is chosen without row-wise and column-wise features. The criteria for choosing the type of feature extraction are: (i) the speed of convergence, i.e. the number of epochs required to achieve the training goal, and (ii) training stability.
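The zone-wise diagonal feature extraction described earlier (a 90 x 60 character divided into 54 zones of 10 x 10 pixels, one feature per zone) can be sketched in Python. This is a hedged illustration: the text does not specify how the diagonal values are combined, so averaging the 19 diagonal means per zone is an assumption.

```python
import numpy as np

def diagonal_features(char_img):
    """Extract one feature per 10x10 zone of a 90x60 binary
    character image by averaging along each zone's diagonals."""
    assert char_img.shape == (90, 60)
    features = []
    for r in range(0, 90, 10):
        for c in range(0, 60, 10):
            zone = char_img[r:r + 10, c:c + 10]
            # a 10x10 zone has 19 diagonals (offsets -9..9)
            diag_means = [np.diag(zone, k).mean() for k in range(-9, 10)]
            features.append(float(np.mean(diag_means)))
    return np.array(features)     # 9 x 6 = 54 features

img = np.zeros((90, 60))
img[10:50, 20:40] = 1             # toy "character" stroke
f = diagonal_features(img)
print(f.shape)                    # (54,)
```

The resulting 54-element vector would then be fed to the input layer of the neural network.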
However, the most important parameter of interest is the accuracy of the recognition system. The reviewed results show that diagonal feature extraction yields good recognition accuracy compared to the other types of feature extraction.

6. Conclusion

A simple off-line handwritten character recognition system using a new feature extraction method, namely diagonal-based feature extraction, is proposed. This approach is chosen to build the neural network recognition system. To compare the recognition efficiency of the proposed method of feature extraction, the neural network recognition system is also trained using the vertical and horizontal feature extraction methods. The diagonal method of feature extraction is verified using a number of test images. The proposed off-line handwritten character recognition system, with its better-quality recognition rates, will be eminently suitable for several applications including postal/parcel address recognition, bank processing, document reading and conversion of any handwritten document into structural text form.

References

1.

Proc. of IEEE, vol. 80, pp. 1029-1058, July 1992.
2. International Journal of Pattern Recognition and Artificial Intelligence, Vol. 5(1-2), pp. 1-24, 1991.
3. V.K. Govindan and A.P. Shivapr, Pattern Recognition, vol. 23, no. 7, pp. 671-683, 1990.
4. On-line and off-line handwritten character recognition: A, IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 22, no. 1, pp. 63-84, 2000.
5. N. Arica and F. Yarman, IEEE Transactions on Systems, Man, and Cybernetics, Part C: Applications and Reviews, 2001, 31(2), pp. 216-233.
6. IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 31, no. 3, pp. 444-457, 2009.
7. Ninth International Conference on Document Analysis and Recognition, ICDAR 07, Vol. 2, pp. 749-753, 2007.
8. R.G. Casey and E. Lecolinet, IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. 18, No. 7, July 1996, pp. 690-706.
9. A S, Pattern Recognition, vol. 29, no. 4, pp. 641-662, 1996.
10. IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. 18, No. 7, July 1996, pp. 690-706.
11. Int. Workshop on Neural Networks and Learning in Document Analysis and Recognition, Seoul, 2005.
12. Bortolozzi, A. S. Brito, Luiz S., Document Analysis, Umapada Pal, Swapan K. Parui, Bidyut B. Chaudhuri, pp. 1-30.
13. International Journal of Computer Science & Communication, vol. 1, No. 2, July-December 2010, pp. 141-144.
14. Recognition using structural feature and K-, 2007, pp. 125-129.
15. N. Sharma, U. Pal, F. Kimura, "Recognition of Handwritten Kannada Numerals", 9th International Conference on Information Technology (ICIT'06), ICIT, pp. 133-136.
16. Rafael C. Gonzalez, Richard E. Woods and Steven L. Eddins, Digital Image Processing using MATLAB, Pearson Education, Dorling Kindersley, South Asia, 2004.
17. Handwritten numeral recognition of four popular south, Journal of Theoretical and Applied Information Technology, JATIT, vol. 4, no. 12, pp. 1171-1181, 2008.


PERFORMANCE EVALUATION OF PROACTIVE, REACTIVE & HYBRID ROUTING PROTOCOLS IN ACMANET

Satyendra Soni, Margi Patel, Ashish Saxena

Abstract

Ad-hoc wireless networks, due to their quick and economically less demanding deployment, find applications in several areas. Ad-hoc networks can be very useful in establishing communication among various groups running various applications in an area where setting up a fixed infrastructure may not be possible. An ad-hoc wireless network consists of a set of mobile nodes connected by wireless links. The topology of these networks changes randomly. A variety of routing protocols for ad-hoc wireless networks has been proposed in the past. This work analyses the performance of the AODV, DSR, GRP and OLSR protocols on an application based cluster network, where the whole network is divided into various clusters based on the applications running on them. Every cluster has a different application running on it. This analysis concludes that OLSR outperforms AODV, GRP and DSR in such an environment. The performance metrics used for this comparison are throughput, network load, delay and retransmission attempts.

Keywords: Application Cluster, AODV, DSR, GRP, OLSR, OPNET

1. Introduction

Ad-hoc networks have gained popularity in recent years because of their dynamic nature and low cost of deployment. In an ad-hoc network, every node acts as a host as well as a router. Every node in the network runs an application on it, e.g. FTP, Email, HTTP, etc. Communication among nodes creates a mesh of applications. This work analyzes the performance of various routing protocols (AODV, GRP, DSR and OLSR) on a network where various applications are running, each in a different cluster. Various clusters based on the applications running on them have been identified and the performance of these protocols has been studied.
This work concludes that OLSR, even though it has a network load comparable to AODV, GRP and DSR and more retransmission attempts than AODV, GRP and DSR, increases the network throughput.

2. Routing Protocols

Dynamic Source Routing (DSR)

DSR was developed at Carnegie Mellon University. It is a simple and efficient reactive routing protocol which is specially designed for multi-hop ad hoc networks of mobile nodes. Nodes can easily join or leave the network without any notification. A network using DSR does not require existing network infrastructure or administration. The node desiring to transmit a packet defines the route for the packet, because the protocol is based on source routing. DSR works for ad-hoc networks of approximately 200 nodes. Each node

participating in the ad-hoc network should forward packets and discard erroneous (corrupted) packets. DSR has two mechanisms: route discovery and route maintenance.

Route discovery

The source starts a route discovery when sending a data packet to a destination for which it has no routing information. To set up a route, the source floods RREQ messages with a distinctive request ID. When the destination, or a node which has route information for the destination, receives this request message, it transmits a RREP message back to the source with the route information.

Route maintenance

The main improvement of DSR over LAN routing is in route maintenance and monitoring in the presence of mobility. DSR relies on the acknowledgments of data packets sent to adjacent nodes to monitor the validity of existing routes. This monitoring is achieved by passively listening for communication of the adjacent node to the next hop, or by setting a bit in a packet to ask for an explicit acknowledgment. A RERR packet is sent to the original sender to trigger a new route discovery stage when a node fails to receive an acknowledgment. Nodes receiving a RERR message remove any route entry (from their route cache) which uses the broken link.

Ad Hoc On-demand Distance Vector (AODV)

The Ad hoc On-Demand Distance Vector (AODV) protocol is a routing protocol designed for ad hoc mobile networks, supporting both uni-cast and multicast routing. AODV establishes routes between different nodes as needed by source nodes. AODV maintains these routes and also forms trees which connect the members of a multicast group. The trees are composed of the group members and the nodes needed to connect them. In an ad hoc network, when two nodes want to make a connection with each other, AODV enables multi-hop routes between the nodes. AODV is loop free.
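The on-demand route discovery shared by DSR and AODV (flooding an RREQ and returning an RREP along the reverse path) can be illustrated with a toy Python simulation. This is a hypothetical sketch on a made-up topology; real protocols also handle request IDs, sequence numbers, caching and timeouts.

```python
from collections import deque

def discover_route(graph, src, dst):
    """Flood an RREQ breadth-first from `src`; the first copy to reach
    `dst` triggers an RREP back along the reverse path.
    `graph` maps each node to the set of its wireless neighbours."""
    parent = {src: None}            # reverse pointers left by the RREQ flood
    frontier = deque([src])
    while frontier:
        node = frontier.popleft()
        if node == dst:             # destination answers with an RREP
            route, hop = [], dst
            while hop is not None:  # walk the reverse pointers back to src
                route.append(hop)
                hop = parent[hop]
            return list(reversed(route))
        for nbr in graph[node]:
            if nbr not in parent:   # each node rebroadcasts a request once
                parent[nbr] = node
                frontier.append(nbr)
    return None                     # no route: a RERR would be reported

topo = {"A": {"B"}, "B": {"A", "C"}, "C": {"B", "D"}, "D": {"C"}}
print(discover_route(topo, "A", "D"))   # ['A', 'B', 'C', 'D']
```

In DSR the discovered route would travel in the packet header (source routing), whereas AODV stores only the next hop at each node.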
DSN (Destination Sequence Numbers) are used by AODV to avoid counting to infinity, and this is one of the most important features of this algorithm. In a network, the requesting nodes send the DSN with other routing information from the source to the destination. AODV also has the feature of selecting an alternative route based on the sequence number [14]. Three message types are defined by AODV: Route Errors (RERRs), Route Requests (RREQs) and Route Replies (RREPs). These three messages are used for discovering and maintaining routes in the network, using UDP packets from source to destination. A node uses its IP address as the source address in the IP header of a message when it requests a route, and 255.255.255.255 for broadcast. The number of hops a particular routing message travels in the ad hoc network is determined by the TTL (Time-To-Live). An RREQ is broadcast by the requesting node when a route needs to be created to a destination. A new route is determined when the message is received by a next-hop node with a route to the destination, or by the destination itself [16]. Routes of the RREQ from the originator to all the stations that receive the message are cached in those stations. A

RERR message is delivered or generated when there is a failure in the link. The message carries all the important information about the node which is not reachable because of the failure. The IP address of the node is also mentioned in the message as the next hop for the destination. AODV is table based: all the information about the routes in the network is stored in a routing table. The routing table has the following entries: DSN, flag, next hop, IP address, state, hop count, the list of precursors, lifetime and network interface.

Gathering Based Routing Protocol (GRP)

GRP is a hybrid routing algorithm based on a combination of reactive and proactive routing, used in MANETs to improve scalability. GRP can be used in highly dynamic mobile ad hoc networks. It is an adaptive routing protocol used in multi-hop networks. It makes scaled routes between source and destination, and a Directed Acyclic Graph (DAG) rooted in the destination node is used to build them. A data packet flows from upstream to downstream according to the height difference between nodes. GRP has the capacity for many nodes to send packets to a given destination. It also guarantees that all routes are loop free and provides the best features of reactive and proactive routing.

OLSR

OLSR [10] is a modular proactive hop-by-hop routing protocol, consisting of an always-required core and a set of auxiliary functions. It is a proactive approach, so it continuously tries to find routes to all possible destinations in the network. Proactive and link-state behavior could increase congestion in the network due to the routing traffic generated. However, due to its proactive basis, it has the advantage of having routes immediately available whenever they are required. In order to reduce the amount of routing traffic generated by the protocol, and thus optimize the algorithm to meet the requirements of a mobile WLAN, OLSR introduces Multipoint Relays (MPR).
An MPR set is a set of selected nodes which forward messages during the flooding process. Only nodes selected as MPR members can forward routing and control traffic. Using this technique, the traffic generated in the flooding process is highly reduced, making it a sort of selective flooding. A node selects its MPR members from among its neighbors located at one-hop distance from it. A node which selects another node as an MPR member is also called an MPR Selector of that node. Following these guidelines, neighbors of a given node not included in its MPR set receive and process control messages, but do not forward them. The MPR set covers all nodes located two hops from the node. Obviously, the smaller the MPR set, the lower the control traffic generated in the network. In order to establish a communication process between nodes running a protocol instance, OLSR uses a unique packet, in which more than one message can be encapsulated. OLSR packets can carry three different message types, each one for a specific purpose: HELLO messages, which perform the task of link sensing, neighbor detection and MPR signaling; TC (Topology Control) messages, which advertise link states; and MID (Multiple Interface Declaration) messages, which perform the multiple interface declaration on a node. Once all the information has


been acquired through the message exchange, OLSR calculates the route table for each node.

3. OPNET Modeler

OPNET Modeler [4] [5] is a commercial network simulation environment for network modeling and simulation. It allows users to design and study communication networks, devices, protocols and applications with flexibility and scalability. It simulates the network graphically and gives the graphical structure of actual networks and network components. Users can design the network model visually. The modeler uses an object-oriented modeling approach: nodes and protocols are modeled as classes with inheritance and specialization. The development language is C. It provides a variety of toolboxes to design, simulate and analyze a network topology and routing protocols on the basis of various network parameters. The MANET toolbox has been used in this work to simulate the network. The components used for designing the network are the MANET station (mobile), the application configuration, which decides the type of application running in the network, and the profile configuration for configuring the type of profile on the network. In the profile configuration, the start time and stop time of the application can be set, and the pause time between the nodes is set. The mobility configuration decides the mobility model of every node, which is selected as random waypoint for this simulation. The attributes of the workstations set the routing protocol used for the simulation.

4. Simulation Environment

This scenario has been modeled using the OPNET [4] [5] modeler. The protocols used for this study are AODV, GRP, DSR and OLSR. The figure shows a sample network formed with 30 nodes, whose behavior has to be analyzed as the nodes move in the network with respect to time, to determine the effective features of each protocol. The objective of this work is to analyze the performance of these protocols in a network which has various application based clusters.
The parameters used for this comparison are throughput, network load, delay, and retransmission attempts. When analyzing mobile networks, in order to evaluate the performance of a generic ad-hoc networking scenario, modeling the movement of the set of nodes forming a MANET is essential. The Random Waypoint model [6] of mobility has been studied and selected to be used in all simulations presented in this document. Using the Random Waypoint model, nodes keep moving until they arrive at a random destination. Once there, they stay still for a period of time, called the pause interval. Once the pause interval has passed, a new movement is calculated, with a random direction and speed.
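The Random Waypoint behaviour just described can be sketched in Python. All parameters here (the 1000 m x 1000 m area, the speed range and the pause length) are illustrative assumptions, not the paper's exact settings, and OPNET's implementation differs in detail.

```python
import math
import random

def random_waypoint(steps, area=(1000.0, 1000.0),
                    speed_range=(1.0, 10.0), pause=5, seed=42):
    """Generate a position trace for one node: pick a random
    destination and speed, move towards it step by step, pause
    on arrival, then repeat."""
    rng = random.Random(seed)
    x, y = rng.uniform(0, area[0]), rng.uniform(0, area[1])
    dest, speed, wait, track = (x, y), 0.0, 0, []
    for _ in range(steps):
        if wait > 0:                       # pausing at a waypoint
            wait -= 1
        elif (x, y) == dest:               # pick the next waypoint
            dest = (rng.uniform(0, area[0]), rng.uniform(0, area[1]))
            speed = rng.uniform(*speed_range)
            wait = pause                   # pause before moving again
        else:                              # move towards the waypoint
            dx, dy = dest[0] - x, dest[1] - y
            dist = math.hypot(dx, dy)
            if dist <= speed:
                x, y = dest                # arrived this step
            else:
                x, y = x + speed * dx / dist, y + speed * dy / dist
        track.append((x, y))
    return track
```

Running one instance per node yields the kind of mobility trace the simulator feeds to the routing protocols.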


(MANET Scenario)

5. Simulation Parameters

Parameter                Value
Scenario size            1000 m x 1000 m
Simulation time          180 s
Number of nodes          30
Number of clusters       3
Number of applications   3
Cluster size             10
Data rate                1 Mbps
Mobility model           Random waypoint
Routing protocols        AODV, DSR, OLSR, GRP
MAC type                 802.11

6. Simulation Results and Analysis

The figure depicts the network delay for the various routing protocols for the given simulation scenario. At first glance, the results demonstrate that the proactive routing protocol introduces a lower delay in the network, as routes are available before they are demanded, although continuously searching for routes to all possible destinations increases the routing traffic. The reactive protocols discover routes only when they are needed, which increases the network delay in packet transmission.

(Network Delay in FTP)

The figure depicts the throughput for AODV, DSR, GRP and OLSR. The results also demonstrate that the number of retransmission attempts in the case of OLSR is higher than for AODV, GRP and DSR. We can also say that, for the given network scenario, the number of retransmission attempts for proactive routing is higher than that of reactive and hybrid routing.

(Throughput in FTP)

7. Conclusion

In this work, the analysis of four ad-hoc routing protocols, i.e. AODV, GRP, DSR and OLSR, has been done on the basis of delay, network load and throughput. From the obtained simulation results it can be concluded that the performance of OLSR is better than that of AODV, GRP and DSR in such an application based clustered network.

References

1. PERKINS, C. E.; BELDING-ROYER, E.; DAS, S. R. (2003). Ad Hoc on-demand distance vector routing. RFC 3561.
2. JOHNSON, D. B.; MALTZ, D. A.; BROCH, J. (1999). DSR: The dynamic source routing protocol for multi-hop wireless Ad Hoc networks.
3. PARK, V. D.; CORSON, M. S. (1997). A highly adaptive distributed routing algorithm for mobile wireless networks. Wireless Networks, Volume 1, Issue 1.
4. OPNET TECHNOLOGIES, INC. (2005, 25 September). OPNET: Making networks and applications perform. Bethesda (USA): OPNET Technologies, Inc. http://www.opnet.com
5. CAVIN, D.; SASSON, Y.; SCHIPER, A. (2002). On the accuracy of MANET simulators. Proceedings of the second ACM international workshop on Principles of mobile computing.
6. OPNET University Program: CAMP, T.; BOLENG, J.; DAVIES, V. (2002). A survey of mobility models for Ad Hoc network research. Colorado School of Mines. Colorado (USA).
7. Computing, Kulwer, 1996, pp. 152-81.
8. S.R. Chaudhry, A. Al-Khwildi, C.Y. Aldelou, H. Al-, Multi on Demand Routing in Wireless, Wireless and Mobile Computing, Networking and Communications, Vol. 3, Aug. 2005, pp. 9.
9. Eighth IEEE Symposium on Computers and Communication, Vol. 1, 2003, pp. 203-208.
10. CLAUSEN, T.; JACQUET, P. (2003). Optimized link state routing protocol. RFC 3626.
11.



PRIVACY AND CHALLENGES: DATA STORAGE SECURITY ISSUES IN CLOUD COMPUTING

Rakesh Prasad Sarang

Abstract

In the last decade, several surveys have pointed out that concern over privacy and services has risen and is increasingly becoming a central point for IT industries. Almost every day, IT systems, network security and sensitive data are being compromised, which pushes the industry to use more advanced techniques to protect its information. Privacy is a very essential issue for cloud computing. Hence there is a need for effective privacy and security measures that protect data, information and technology resources adequately. This survey focuses on cloud computing, its architecture and services, and identifies cloud computing privacy issues. In this paper, we present the service models needed to deal with the privacy and security problems in cloud computing. We elaborate on the privacy and service models considered in cloud computing resources and the proper protection of their data. There are multifarious security and privacy issues that need to be understood. The paper includes some privacy issues and techniques that show the motivation for the adoption of cloud computing.

Keywords: Cloud computing, service models, privacy and challenges.

Introduction

Cloud computing is emerging as a computing platform for the next generation of computing technology. Cloud computing is a technology provided as a service, with which users can access all database resources and software through the internet from their own systems; the services are provided by another company and accessed using a browser over the internet [1]. In this paper, the privacy and service issues of cloud computing are reviewed. Cloud computing techniques have several applications and components in different areas, including the implementation of security applications. However, it is mandatory that the service models be built in and enabled for the desired objectives.
Cloud computing is a system where the resources of a data center are shared using virtualization technology, which also provides elastic, on-demand and instant services to its customers, and associates customer usage as in grid computing [2]. The cloud computing technique is also advantageous for control of the data center's information security policy, the information security infrastructure and the security of third parties. Cloud computing aids virtualization and grid technology analysis for security activities. Virtualization and grid technology analysis are techniques which are especially useful in diverse fields such as IT technologies, service provider applications, information security, control of network segments, sharing resources, etc.


In this paper we present the privacy and service issues in the cloud computing environment, investigate the challenges in cloud computing, and discuss its uses. The paper primarily aims to highlight the major privacy and service models in existing cloud computing environments and to analyze ways of eliminating potential security risks. We provide an overview of the most important privacy considerations for cloud providers. In traditional IT environments, clients connect to multiple servers located on the company site and must connect to each server separately. In cloud computing, clients connect to the cloud, which contains all of the applications and infrastructure and appears as a single entity. Cloud computing allows resources to be reconfigured dynamically to accommodate changes in load, giving more efficient use of the resources [3]. We present a technique for privacy in cloud computing and develop service models that combine different techniques to prevent threats and secure data in the cloud. Security is one of the major barriers to cloud adoption: sensitive applications and data are moved into cloud data centers and run on virtual machines. This poses many tangible and intangible security challenges, such as accessibility vulnerabilities, virtualization vulnerabilities, and web application vulnerabilities. These challenges all relate to the cloud server having physical control of the data and managing its documentation [4]. This paper is organized as follows: Section 1 introduces privacy and security issues in cloud computing. Section 2 presents the architecture of cloud computing. Section 3 presents the service models. Section 4 presents privacy and security in cloud computing. Section 5 presents cloud challenges. Finally, Section 6 presents conclusions.
Architecture Of Cloud Computing Cloud computing is commonly described in terms of five major architectural modules and their interactions. The system pools the computing power available on servers to balance the overall workload. A front-end interface, such as a gateway, allows a user to select a service from a grid. The request is passed to the system management, which finds the correct resources and then calls the provisioning services, which allocate resources in the cloud. The provisioning service may deploy the requested control server or software application through on-demand authorization.


Fig. 1 General Cloud Computing Architecture
Client: an entity that relies on the cloud for data computation; clients include both individuals and organizations. Cloud Service Provider (CSP): an entity that owns and operates cloud computing systems and has significant storage resources to maintain clients' data. The storage service provider, which occasionally experiences complex failures, may decide to hide data errors from the clients for its own benefit. More seriously, to save funds and storage space the service provider might discard data files that belong to an ordinary client but are rarely accessed. Cloud Storage Server (CSS): an entity, managed by the cloud service provider (CSP), that stores clients' data files. Outsourced Data Files: once a data file is outsourced to cloud storage, users have no direct control over it. This raises the problem of ensuring the reliability of the data in cloud storage. In cloud data storage, a client stores his data through the CSP into a set of cloud servers, which run in a simultaneous, cooperative and distributed manner. Thereafter, the client's application interacts with the cloud storage servers through the CSP to access or retrieve his data files [5]. Cloud Computing Service Models A. Software as a Service (SaaS) SaaS is a model of software deployment in which an application is hosted as a service, offering computing resources in the style of an Application Service Provider (ASP). Applications are hosted and delivered online via a web browser, offering traditional desktop functionality. SaaS consists of software running on the cloud infrastructure; the provider supplies the storage the consumer uses, including the bandwidth required for that storage. The client accesses the application on demand through a browser acting as a thin client over the internet. Examples of SaaS are Google Docs and Salesforce.com [6]. B. Platform as a Service (PaaS) PaaS is another application model, in which the provider supplies all the resources required to build applications (develop, test and deploy). It also includes the operating system and the services required for particular applications. PaaS providers offer a predefined combination of OS, development tools, Integrated Development Environment (IDE) and application servers, such as the LAMP platform (Linux, Apache, MySQL and PHP). These platforms include data security, backup, recovery, application hosting and scalable architecture. Examples of PaaS are Microsoft Azure, Google App Engine and Force.com [7]. C. Infrastructure as a Service (IaaS) IaaS is a service that provides access to hardware resources for executing services, such as CPU processing, memory, data storage and network connectivity. The vendor may share the hardware among multiple customers, referred to as multiple tenants, using virtualization software. IaaS allows customers to run operating systems and software applications of their choice. Fig. 2 shows the basic cloud architecture with the common deployment and service models and the related elements of cloud computing: the three cloud service models can be deployed on top of the four deployment models.

Fig. 2 Cloud Computing deployment Service Models
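Since clients have no direct control over outsourced files, a minimal client-side integrity check is one way to detect the data errors a faulty or dishonest provider might hide. The sketch below (Python, standard library only; the file contents and the honest/faulty responses are illustrative, not taken from the paper) keeps a local SHA-256 digest before outsourcing and re-checks it on retrieval:

```python
import hashlib

def digest(data: bytes) -> str:
    # SHA-256 fingerprint kept by the client before outsourcing the file
    return hashlib.sha256(data).hexdigest()

def verify(data: bytes, stored_digest: str) -> bool:
    # Re-hash what the cloud returned and compare with the local fingerprint
    return digest(data) == stored_digest

# Client-side flow (illustrative):
original = b"quarterly-report-contents"
fingerprint = digest(original)        # kept locally; the file goes to the CSP

retrieved_ok = original               # what an honest CSS returns
retrieved_bad = b"tampered-contents"  # what a faulty or dishonest CSS returns
```

A real scheme would also handle large files in chunks and protect the stored digest itself, but the principle is the same: the verifier, not the provider, holds the reference fingerprint.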

Privacy issues in cloud This section looks at the main privacy and security issues relevant to cloud computing as they relate to outsourcing portions of the organizational computing environment. It points out areas of concern with public clouds that require special attention, and outlines the data protection measures necessary for security and privacy in cloud computing.


There are multiple security issues to address in protecting data and information. Here we compare the private and public cloud scenarios. A public cloud hosts a number of virtual machines, virtual machine monitors and supporting middleware. Because a public cloud is a shared multi-tenant environment, security risks intensify and diversify as the number of users grows. It is therefore necessary to identify the attack surfaces exposed to security attacks and the mechanisms that ensure successful client-side and server-side protection. Given the security issues in a public cloud, adopting a private cloud solution is more secure, with the option to move to a public cloud later if needed. Hence, security is essential at different levels in order to manage and properly implement cloud computing, such as: server access security, internet access security, database access security, data privacy security and program access security. Some security concerns are listed and discussed below. A. Network Level Security Networks come in several types - shared and non-shared, public or private, small-area or large-area - and each has a number of security threats to deal with. To ensure network security, the following points should be considered: privacy and integrity in the network, proper access control, and protection against external third-party threats [8, 9]. B. Application Level Security Application level security refers to the use of software and hardware resources to protect applications so that attackers are not able to gain control over them and make changes to their format. Outdated network-level security policies allow only authorized users to access a specific IP address;
with recent technological advancements, higher-level security policies with high performance have become available [10]. C. Information Level Security An information security policy is approved by management, published and communicated as appropriate to all employees. It states the management's commitment and sets out the organizational approach to managing information security: the information security infrastructure, security of third-party access, virtualization and grid technologies, identity and access management, the secure development lifecycle, and secure activities in cloud computing [11]. D. Client-Side Protection For a successful defense against attacks, both client-side and web-side infrastructures must be protected; the former tends to be overlooked because emphasis is typically placed on the latter. For many cloud computing services the web browser acts as a key element, and the various plug-ins and extensions available for browsers are notorious for their security problems.

E. Server-Side Protection In IaaS clouds, virtual servers and applications, much like their non-virtualized counterparts, need to be secured. VM images should be deployed following organizational policies and procedures for hardening the operating system and the applications they launch, with proper care taken to adjust for the virtualized environments in which the images run [12]. Cloud computing challenges A. Data Recovery and Availability Cloud computing should be complete in all its aspects. It should support data recovery and availability when needed, in the following forms:
-Appropriate clustering and fail-over
-Data replication
-System monitoring (transaction monitoring, log monitoring and others)
-Maintenance (runtime governance)
-Disaster recovery
-Capacity and performance management
In the absence of any one of the above features, the results can be disastrous. B. Regulatory and Compliance Restrictions Government regulations in European countries do not allow customers' personal information and other sensitive information to be physically located outside the state or country. To meet such requirements, cloud providers need to set up a data center or storage site exclusively within the country. This challenge touches on cloud platforms, onsite versus cloud deployment, cloud security, moving to the cloud, and so on [13]. C. Data Protection Data protection is the core security issue in cloud computing. An enterprise may hesitate to rely on a vendor's assurance of business data security, because consumer confidence is hard to win in a competitive market. Often the actual storage location is not disclosed, so firewall-based security options are used to protect sensitive information in the cloud; service providers are responsible for maintaining data security, and enterprises have to rely on them. D. IT Challenges Cloud adoption needs protection and the strong support of IT planning and deployment. Internal and external IT leaders need to help identify the business drivers and success factors, which can be managed with the proper IT resources and execution. Some of the greatest challenges are: cloud strategy, IT resources, business continuity and recovery, data management, and offsite backup solutions. E. Security and Privacy The public cloud raises not only privacy concerns but also security concerns. Information security is a main issue because current cloud offerings are essentially public, exposing the system to more attacks; this creates additional challenges in making cloud computing environments secure for IT industries. Security and privacy affect the entire cloud computing stack, so standard security measures and the various privacy challenges must be addressed with specific steps in the cloud [14, 15]. Conclusion Cloud computing is a recent development that provides easy access to high-performance computing resources and storage infrastructure through web services. This paper discussed the privacy and security aspects of system behavior and the challenges in cloud computing, including the service models, covering both the theoretical foundation and practical cloud computing applications. We also identified challenges and opportunities in cloud computing and addressed the issues that can arise during the deployment of cloud service models. Cloud computing can provide resources to IT industries, help reduce computing costs within organizations, and open new business opportunities for service-oriented models; this is particularly useful for organizational sustainability. The paper can also serve as the basis for deeper research on the secure deployment of cloud computing. References

1. Rao, Srinivasa, and V. Nageswara Rao, (JATIT) Journal of Theoretical and Applied Information Technology, 71-76, 2009.
2. Khorshed, Tanzim, A. B. M. Shawkat Ali, and Saleh A. Wasimi, "... Threat Remediation ...", Future Generation Computer Systems, 833-851, 2012.
3. Ruiter, Joep, and Martijn Warnier, 1-16.
4. Sun, D., Chang, G., Sun, L., and Wang, X., Elsevier Procedia Engineering, 2852-2856, 2011.
5. Jaikar, S. P., and Nimbalkar, M. V., IOSR Journal of Computer Engineering (IOSRJCE), Vol. 1, Issue 6, 43-49, July-Aug. 2012.
6. Sultan, Nabil, (IJIM) International Journal of Information Management, 30, 109-116, 2010.
7. Chappell, D. A., Chappell and Associates, San Francisco, CA, Aug. 2008.
8. Bhadauria, Rohit, Chaki, Rituparna, Chaki, Nabendu, and Sanyal, Sugata, Electronics and Communications.
9. Jens, F., and Jones, M. T., Cloud Computing with Linux, Sept. 2008.
10. Scalable Security Solutions, Check Point Open Performance Architecture, Quad Core Intel Xeon Processors, Intel Corporation, 2008. http://download.intel.com/netcomms/technologies/security
11. Kumar, Pardeep, Sehgal, Vivek, Chauhan, Durg Singh, Gupta, P. K., and Diwakar, Manoj, (IJCSI) International Journal of Computer Science Issues, Vol. 8, 412-421, May 2011.
12. Jansen, Wayne A., 44th Hawaii International Conference on System Sciences, 1-10, 2011.
13. Torry Harris, Cloud Computing - An Overview.
14. Buyya, Rajkumar, James Broberg, and Andrzej Goscinski, Cloud Computing: Principles and Paradigms.
15. Leavitt, N., IEEE Computer, 15-20, Jan. 2009.


Abstract E-commerce security is a part of the information security framework, applied specifically to the components that affect e-commerce, including computer security, data security and the wider realms of the information security framework. Encryption technology is vitally important for supporting secure e-commerce on the Internet, and is a key technology for securing electronic transactions. In this paper we discuss the Advanced Encryption Standard (AES) as a means of securing fast web transactions - lowering the cost of operation, increasing the speed of transactions, and offering easy global reach to customers - to overcome security issues in e-commerce. 1. INTRODUCTION The Internet is a global network that contains many networks; in other words, the Internet is a network of networks. Connecting a business to the Internet implies a global reach: a company can reach anyone who has access to the Internet, such as customers, suppliers, online banks, mediators, etc. As electronic commerce grows exponentially, the number of transactions and participants using e-commerce applications has rapidly increased. Since all interactions among participants occur in an open network, there is a high risk of sensitive information being leaked to unauthorized users. However, the cryptographic techniques used to secure e-commerce transactions usually carry significant computational time overheads, and complex interactions among participants can push network bandwidth usage beyond manageable limits. People need to be sure that their Internet communication is kept confidential. When customers shop online, they need to be sure that the vendors are authentic. When customers send transaction requests to their banks, they want to be certain that the integrity of the message is preserved.
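The integrity requirement just mentioned can be illustrated with a keyed hash. The sketch below (Python standard library; the key and transaction message are hypothetical examples, not from the paper) shows how a bank could detect any modification of a transaction request in transit:

```python
import hmac
import hashlib

def sign(key: bytes, message: bytes) -> bytes:
    # HMAC-SHA256 tag: proves both integrity and knowledge of the shared key
    return hmac.new(key, message, hashlib.sha256).digest()

def check(key: bytes, message: bytes, tag: bytes) -> bool:
    # compare_digest avoids leaking information through timing differences
    return hmac.compare_digest(sign(key, message), tag)

key = b"shared-bank-key"                  # hypothetical shared secret
request = b"transfer 100 to account 42"
tag = sign(key, request)                  # sent along with the request
```

If an attacker alters even one byte of the request, the recomputed tag no longer matches and the bank rejects the message.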
Secure Electronic Transaction is a comprehensive security protocol which uses cryptography to provide security services for the transaction. The modern encryption/decryption process is considered a combination of three types of algorithms: A. Symmetric-key algorithms, such as the Data Encryption Standard (DES), the Advanced Encryption Standard (AES), Ron's Code (RCn), and Triple DES. B. Asymmetric-key algorithms, such as Rivest-Shamir-Adleman (RSA), Elliptic Curve (EC), and Diffie-Hellman (DH). C. Hashing: the integrity of data is ensured by hashing algorithms. 2. Problem: In modern e-commerce the most important considerations are security, speed, space and time. Many ciphers are in use today, chosen according to requirements, but most are complex and costly to compute. The Advanced Encryption Standard (AES) satisfies all of these requirements: security, speed, space and time. 3. Introduction to the Advanced Encryption Standard (AES): The Advanced Encryption Standard (AES) is a specification for the encryption of electronic data established by the U.S. National Institute of Standards and Technology (NIST) in 2001. AES is based on the Rijndael cipher developed by two Belgian cryptographers, Joan Daemen and Vincent Rijmen, who submitted a proposal to NIST during the AES selection process. Rijndael is a family of ciphers with different key and block sizes. AES is a symmetric block cipher. It operates on 128-bit blocks of data, and the algorithm can encrypt and decrypt blocks using secret keys. The key size can be 128, 192 or 256 bits; the actual key size depends on the desired security level. The algorithm consists of 10 rounds for a 128-bit key (when the key has 192 bits, 12 rounds are used, and when the key has 256 bits, 14 rounds are used). Each round has a round key, derived from the original key.
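The key-length/round-count relationship described above can be captured in a small table (a sketch; the subkey counts follow from "rounds + 1", as discussed later in the key schedule section):

```python
# AES parameters: key length in bytes -> (number of rounds, number of 128-bit subkeys)
AES_PARAMS = {
    16: (10, 11),  # AES-128
    24: (12, 13),  # AES-192
    32: (14, 15),  # AES-256
}

def aes_rounds(key: bytes) -> int:
    # The number of rounds is fixed by the key size alone
    rounds, _subkeys = AES_PARAMS[len(key)]
    return rounds
```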

3.1 History In 1997 NIST called for proposals for a new Advanced Encryption Standard (AES). Unlike the DES development, the selection of the algorithm for AES was an open process administered by NIST. In three subsequent AES evaluation rounds, NIST and the international scientific community discussed the advantages and disadvantages of the submitted ciphers and narrowed down the number of potential candidates. In 2001, NIST declared the block cipher Rijndael as the new AES and published it as a final standard (FIPS PUB 197). Rijndael was designed by two young Belgian cryptographers.

Within the call for proposals, the following requirements for all AES candidate submissions were mandatory:
-block cipher with 128-bit block size
-three key lengths must be supported: 128, 192 and 256 bits
-security relative to other submitted algorithms
-efficiency in software and hardware
The invitation to submit suitable algorithms and the subsequent evaluation of the successor of DES was a public process. A compact chronology of the AES selection process is given here:
-The need for a new block cipher was announced on January 2, 1997, by NIST.
-A formal call for AES was announced on September 12, 1997.
-Fifteen candidate algorithms were submitted by researchers from several countries by August 20, 1998.

On August 9, 1999, five finalist algorithms were announced:
-Mars, by IBM Corporation
-RC6, by RSA Laboratories
-Rijndael, by Joan Daemen and Vincent Rijmen
-Serpent, by Ross Anderson, Eli Biham and Lars Knudsen
-Twofish, by Bruce Schneier, John Kelsey, Doug Whiting, David Wagner, Chris Hall and Niels Ferguson.

On October 2, 2000, NIST announced that it had chosen Rijndael as the AES. On November 26, 2001, AES was formally approved as a US federal standard. (This account of AES follows Understanding Cryptography by C. Paar and J. Pelzl, Springer-Verlag.)

It is expected that AES will be the dominant symmetric-key algorithm for many commercial applications for the next few decades. It is also remarkable that in 2003 the US National Security Agency (NSA) announced that it allows AES to encrypt classified documents up to the level SECRET for all key lengths, and up to the TOP SECRET level for key lengths of either 192 or 256 bits. Prior to that date, only non-public algorithms had been used for the encryption of classified documents. 3.2 Overview of the AES Algorithm The AES cipher is almost identical to the block cipher Rijndael. However, the AES standard only calls for a block size of 128 bits. Hence, only Rijndael with a block length of 128 bits is known as the AES algorithm.

Fig. 1
There are four basic steps, called layers, that are used to form the rounds:

a) The Byte Substitution Transformation (S-Box): this non-linear layer provides resistance to differential and linear cryptanalysis attacks. It provides confusion. b) The ShiftRows Transformation (SR): this linear mixing step causes diffusion of the bits over multiple rounds. It provides diffusion. c) The MixColumn Transformation (MC): this layer has a purpose similar to ShiftRows. d) Key Addition Layer: a 128-bit round key, or subkey, which has been derived from the main key in the key schedule, is XORed to the state. Note: the last round does not have a MixColumn layer.

3.3 Some Mathematics: A Brief Introduction to Galois Fields: In AES, Galois field arithmetic is used in most layers, especially in the S-Box and the MixColumn layer. Hence, for a deeper understanding of the internals of AES, we provide an introduction to Galois fields as needed for this purpose before continuing with the algorithm. A background on Galois fields is not required for a basic understanding of AES.

Figure 2 Encryption/decryption process model

Extension Fields GF(2^m) In AES the finite field contains 256 elements and is denoted GF(2^8). This field was chosen because each of the field elements can be represented by one byte. For the S-Box and MixColumn transforms, AES treats every byte of the internal data path as an element of the field GF(2^8) and manipulates the data by performing arithmetic in this finite field.
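Arithmetic in GF(2^8) can be sketched directly from this description: addition is bitwise XOR, and multiplication is polynomial multiplication reduced modulo the AES polynomial x^8 + x^4 + x^3 + x + 1. The shift-and-add sketch below can be checked against the worked example in FIPS-197, which gives {57} · {83} = {c1}:

```python
AES_POLY = 0x11B  # x^8 + x^4 + x^3 + x + 1, the AES reduction polynomial

def gf_mul(a: int, b: int) -> int:
    """Multiply two elements of GF(2^8) as used by the AES MixColumn layer."""
    result = 0
    for _ in range(8):
        if b & 1:
            result ^= a        # addition in GF(2^8) is XOR
        b >>= 1
        a <<= 1                # multiply a by x
        if a & 0x100:
            a ^= AES_POLY      # reduce modulo the AES polynomial
    return result
```

Multiplication by 01 leaves the operand unchanged, and multiplication by 02 is a single shift-and-reduce step, which is why the MixColumn constants were chosen as 01, 02 and 03.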

3.4 Internal Structure of AES: In the following, we examine the internal structure of AES. Figure 4 shows the graph of a single AES round. The 16-byte input A0, ..., A15 is fed bytewise into the S-Box. The 16-byte output B0, ..., B15 is permuted byte-wise in the ShiftRows layer and mixed by the MixColumn transformation c(x). Finally, the 128-bit subkey ki is XORed with the intermediate result. We note that AES is a byte-oriented cipher.

fig. 4 Internal Structure of AES


3.4.1 Byte Substitution Layer As shown in Fig. 4, the first layer in each round is the Byte Substitution layer. The Byte Substitution layer can be viewed as a row of 16 parallel S-Boxes, each with 8 input and output bits. Note that all 16 S-Boxes are identical, unlike DES, where eight different S-Boxes are used. In this layer, each state byte Ai is replaced, i.e., substituted, by another byte Bi:

S(Ai) = Bi. The S-Box is the only nonlinear element of AES, i.e., it holds that ByteSub(A) + ByteSub(B) ≠ ByteSub(A + B) for two states A and B. The S-Box substitution is a bijective mapping, i.e., each of the 2^8 = 256 possible input elements is mapped one-to-one to one output element. This allows us to uniquely reverse the S-Box, which is needed for decryption. In software implementations the S-Box is usually realized as a 256-by-8-bit lookup table with fixed entries, as given in Table 2. Table 2. AES S-Box: substitution values in hexadecimal notation for input byte (xy) ShiftRows Sublayer The ShiftRows transformation cyclically shifts the second row of the state matrix by three bytes to the right, the third row by two bytes to the right and the fourth row by one byte to the right. The first row is not changed by the ShiftRows transformation. The purpose of the ShiftRows transformation is to increase the diffusion properties of AES. If the input of the ShiftRows sublayer is given as a state matrix

B = (B0, B1, ..., B15), arranged column-wise as:

( B0  B4  B8  B12 )
( B1  B5  B9  B13 )
( B2  B6  B10 B14 )
( B3  B7  B11 B15 )

The output is the new state:

( B0  B4  B8  B12 )
( B5  B9  B13 B1  )
( B10 B14 B2  B6  )
( B15 B3  B7  B11 )
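With the 16-byte state stored column by column (byte B[4c + r] sits in row r, column c), ShiftRows can be sketched as a left rotation of row r by r positions, which is equivalent to the right rotations described in the text:

```python
def shift_rows(state):
    """ShiftRows on a 16-byte AES state stored column-major:
    state[4*c + r] holds row r, column c. Row r is rotated left by r
    positions (the same permutation as the right rotations in the text)."""
    out = [0] * 16
    for c in range(4):
        for r in range(4):
            out[4 * c + r] = state[4 * ((c + r) % 4) + r]
    return out
```

Applying it to the byte indices 0..15 reproduces the permutation shown in the state matrices above.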

MixColumn Sublayer The MixColumn step is a linear transformation which mixes each column of the state matrix. Since every input byte influences four output bytes, the MixColumn operation is the major diffusion element in AES. The combination of the ShiftRows and MixColumn layers makes it possible that after only three rounds every byte of the state matrix depends on all 16 plaintext bytes. In the following, we denote the 16-byte input state by B and the 16-byte output state by C:

MixColumn(B) = C, where B is the state after the ShiftRows operation as given in the expression above. Now, each 4-byte column is considered as a vector and multiplied by a fixed 4×4 matrix. The matrix contains constant entries. Multiplication and addition of the coefficients is done in GF(2^8). As an example, we show how the first four output bytes are computed:

( C0 )   ( 02 03 01 01 ) ( B0  )
( C1 ) = ( 01 02 03 01 ) ( B5  )
( C2 )   ( 01 01 02 03 ) ( B10 )
( C3 )   ( 03 01 01 02 ) ( B15 )
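The vector-matrix product for one column can be sketched with GF(2^8) multiplication as introduced earlier. The sketch takes one 4-byte column (already picked out of the ShiftRows output) and returns the mixed column; the input db 13 53 45 mapping to 8e 4d a1 bc is a well-known MixColumns test vector:

```python
def gf_mul(a: int, b: int) -> int:
    # GF(2^8) multiplication with the AES polynomial x^8 + x^4 + x^3 + x + 1
    result = 0
    for _ in range(8):
        if b & 1:
            result ^= a
        b >>= 1
        a <<= 1
        if a & 0x100:
            a ^= 0x11B
    return result

MIX_MATRIX = [
    [0x02, 0x03, 0x01, 0x01],
    [0x01, 0x02, 0x03, 0x01],
    [0x01, 0x01, 0x02, 0x03],
    [0x03, 0x01, 0x01, 0x02],
]

def mix_column(col):
    # C_i = sum over j of MIX_MATRIX[i][j] * B_j, with XOR as the sum
    return [
        gf_mul(row[0], col[0]) ^ gf_mul(row[1], col[1])
        ^ gf_mul(row[2], col[2]) ^ gf_mul(row[3], col[3])
        for row in MIX_MATRIX
    ]
```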

The second column of output bytes (C4, C5, C6, C7) is computed by multiplying the four input bytes (B4, B9, B14, B3) by the same constant matrix, and so on. Figure 4 shows which input bytes are used in each of the four MixColumn operations. We now discuss the details of the vector-matrix multiplication which forms the MixColumn operation. We recall that each state byte Ci and Bi is an 8-bit value representing an element from GF(2^8). All arithmetic involving the coefficients is done in this Galois field. For the constants in the matrix a hexadecimal notation is used: '01' refers to the GF(2^8) polynomial with the coefficients (00000001), i.e., it is the element 1 of the Galois field; '02' refers to the polynomial with the bit vector (00000010), i.e., to the polynomial x; and '03' refers to the polynomial with the bit vector (00000011), i.e., the Galois field element x + 1. The additions in the vector-matrix multiplication are GF(2^8) additions, that is, simple bitwise XORs of the respective bytes. For the multiplication of the constants, we have to realize multiplications with the constants 01, 02 and 03. These are quite efficient, and in fact the three constants were chosen such that software implementation is easy. Multiplication by 01 is multiplication by the identity and does not involve any explicit operation. Multiplication by 02 and 03 can be done through table look-ups in two 256-by-8 tables. As an alternative, multiplication by 02 can also be implemented as a multiplication by x, which is a left shift by one bit, followed by a modular reduction with P(x) = x^8 + x^4 + x^3 + x + 1. Similarly, multiplication by 03, which represents the polynomial (x + 1), can be implemented by a left shift by one bit and addition of the original value, followed by a modular reduction with P(x). Key Addition Layer The two inputs to the Key Addition layer are the current 16-byte state matrix and a subkey which also consists of 16 bytes (128 bits). The two inputs are combined through a bitwise XOR operation. Note that the XOR operation is equal to addition in the Galois field GF(2). 3.4.4 Key Schedule The key schedule takes the original input key (of length 128, 192 or 256 bits) and derives the subkeys used in AES. Note that an XOR addition of a subkey is used both at the input and output of AES. This process is sometimes referred to as key whitening. The number of subkeys is equal to the number of rounds plus one, due to the key needed for key whitening in the first key addition layer, cf. Fig. 6. Thus, for a key length of 128 bits, the number of rounds is nr = 10, and there are 11 subkeys, each of 128 bits. AES with a 192-bit key requires 13 subkeys of length 128 bits, and AES with a 256-bit key has 15 subkeys. The AES subkeys are computed recursively, i.e., in order to derive subkey ki, subkey ki-1 must be known, etc. The AES key schedule is word-oriented, where 1 word = 32 bits. Subkeys are stored in a key expansion array W that consists of words. There are different key schedules for the three different AES key sizes of 128, 192 and 256 bits, which are all fairly similar. We introduce the three key schedules in the following. Key Schedule for 128-Bit Key AES


All subkeys are stored in a key expansion array with the elements W[0], ..., W[43]. The subkeys are computed as depicted in Fig. 6. The elements K0, ..., K15 denote the bytes of the original AES key. First, we note that the first subkey k0 is the original AES key, i.e., the key is copied into the first four elements of the key array W. The other array elements are


Fig. 6. AES key schedule for 128-bit key size


computed as follows. As can be seen in the figure, the leftmost word of a subkey, W[4i], where i = 1, ..., 10, is computed as:

W[4i] = W[4(i-1)] + g(W[4i-1]).

Here g() is a nonlinear function with a four-byte input and output. The remaining three words of a subkey are computed recursively as:

W[4i+j] = W[4i+j-1] + W[4(i-1)+j], where i = 1, ..., 10 and j = 1, 2, 3.

The function g() rotates its four input bytes, performs a bytewise S-Box substitution, and adds a round coefficient RC to it. The round coefficient is an element of the Galois field GF(2^8), i.e., an 8-bit value. It is only added to the leftmost byte in the function g(). The round coefficients vary from round to round according to the following rule: RC[1] = x^0 = (00000001)2, RC[2] = x^1 = (00000010)2, RC[3] = x^2 = (00000100)2,

..., RC[10] = x^9 = (00110110)2. The function g() has two purposes. First, it adds nonlinearity to the key schedule. Second, it removes symmetry in AES. Both properties are necessary to thwart certain block cipher attacks. Key Schedule for 192-Bit Key AES AES with a 192-bit key has 12 rounds and, thus, 13 subkeys of 128 bits each. The subkeys require 52 words, which are stored in the array elements W[0], ..., W[51]. The computation of the array elements is quite similar to the 128-bit key case and is shown in Fig. 6. There are eight iterations of the key schedule. (Note that these key schedule iterations do not correspond to the 12 AES rounds.) Each iteration computes six new words of the subkey array W. The subkey for the first AES round is formed by the array elements (W[0], W[1], W[2], W[3]), the second subkey by the elements (W[4], W[5], W[6], W[7]), and so on. Eight round coefficients RC[i] are needed within the function g(). They are computed as in the 128-bit case and range from RC[1] to RC[8]. Key Schedule for 256-Bit Key AES AES with a 256-bit key needs 15 subkeys. The subkeys are stored in the 60 words W[0], ..., W[59]. The computation of the array elements is quite similar to the 128-bit key case. The key schedule has seven iterations, where each iteration computes eight words for the subkeys. (Again, note that these key schedule iterations do not correspond to the 14 AES rounds.) The subkey for the first AES round is formed by the array elements (W[0], W[1], W[2], W[3]), the second subkey by the elements (W[4], W[5], W[6], W[7]), and so on. There are seven round coefficients RC[1], ..., RC[7] needed within the function g(), which are computed as in the 128-bit case. This key schedule also has a function h() with 4-byte input and output. The function applies the S-Box to


all four input bytes. In general, when implementing any of the key schedules, two different approaches exist: the subkeys can be precomputed once and stored, or they can be derived on the fly during encryption.
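The 128-bit key schedule described above can be sketched end to end. In this sketch the S-Box is generated algorithmically (multiplicative inverse in GF(2^8) followed by the affine map) rather than taken from the lookup table, and the result can be checked against the well-known FIPS-197 expansion of the key 2b7e1516 28aed2a6 abf71588 09cf4f3c, whose first derived word W[4] is a0fafe17:

```python
def gf_mul(a: int, b: int) -> int:
    # GF(2^8) multiplication, reduced by the AES polynomial 0x11B
    result = 0
    for _ in range(8):
        if b & 1:
            result ^= a
        b >>= 1
        a <<= 1
        if a & 0x100:
            a ^= 0x11B
    return result

def make_sbox():
    # S-Box byte = affine map applied to the multiplicative inverse in GF(2^8)
    sbox = []
    for a in range(256):
        inv = 0 if a == 0 else next(b for b in range(1, 256) if gf_mul(a, b) == 1)
        res = 0
        for i in range(8):
            bit = ((inv >> i) ^ (inv >> ((i + 4) % 8)) ^ (inv >> ((i + 5) % 8))
                   ^ (inv >> ((i + 6) % 8)) ^ (inv >> ((i + 7) % 8)) ^ (0x63 >> i)) & 1
            res |= bit << i
        sbox.append(res)
    return sbox

SBOX = make_sbox()

def g(word, rc):
    # rotate left by one byte, S-Box each byte, XOR RC into the leftmost byte
    rotated = word[1:] + word[:1]
    subbed = [SBOX[x] for x in rotated]
    subbed[0] ^= rc
    return subbed

def expand_key_128(key: bytes):
    # Returns the 44 words W[0..43], i.e. the 11 round keys of AES-128
    W = [list(key[4 * i: 4 * i + 4]) for i in range(4)]
    rc = 1
    for i in range(4, 44):
        if i % 4 == 0:
            t = g(W[i - 1], rc)
            rc = gf_mul(rc, 2)  # next RC[i] = x * RC[i-1] in GF(2^8)
            W.append([x ^ y for x, y in zip(W[i - 4], t)])
        else:
            W.append([x ^ y for x, y in zip(W[i - 4], W[i - 1])])
    return W
```

In practice the subkeys would be precomputed once and cached, since the same key is reused for many blocks.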

3.5 Decryption: Because AES is not based on a Feistel network, all layers must actually be inverted, i.e., the Byte Substitution layer becomes the Inv Byte Substitution layer, the ShiftRows layer becomes the Inv ShiftRows layer, and the MixColumn layer becomes the Inv MixColumn layer. However, as we will see, it turns out that the inverse layer operations are fairly similar to the layer operations used for encryption. In addition, the order of the subkeys is reversed, i.e., we need a reversed key schedule. A block diagram of the decryption function is shown in Fig. 2. Since the last encryption round does not perform the MixColumn operation, the first decryption round also does not contain the corresponding inverse layer. All other decryption rounds, however, contain all AES layers. In the following, we discuss the inverse layers of the general AES decryption round. Since the XOR operation is its own inverse, the key addition layer in the decryption mode is the same as in the encryption mode: it consists of a row of plain XOR gates.
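That the inverse MixColumn matrix really undoes MixColumn can be checked numerically with the GF(2^8) arithmetic from the encryption section. A sketch, using the standard AES forward and inverse MixColumn constant matrices:

```python
def gf_mul(a: int, b: int) -> int:
    # GF(2^8) multiplication with the AES polynomial x^8 + x^4 + x^3 + x + 1
    result = 0
    for _ in range(8):
        if b & 1:
            result ^= a
        b >>= 1
        a <<= 1
        if a & 0x100:
            a ^= 0x11B
    return result

MIX = [[0x02, 0x03, 0x01, 0x01],
       [0x01, 0x02, 0x03, 0x01],
       [0x01, 0x01, 0x02, 0x03],
       [0x03, 0x01, 0x01, 0x02]]

INV_MIX = [[0x0E, 0x0B, 0x0D, 0x09],
           [0x09, 0x0E, 0x0B, 0x0D],
           [0x0D, 0x09, 0x0E, 0x0B],
           [0x0B, 0x0D, 0x09, 0x0E]]

def mul_matrix(matrix, col):
    # multiply a constant 4x4 matrix by a 4-byte column in GF(2^8)
    return [gf_mul(row[0], col[0]) ^ gf_mul(row[1], col[1])
            ^ gf_mul(row[2], col[2]) ^ gf_mul(row[3], col[3])
            for row in matrix]
```

Applying INV_MIX after MIX returns any column unchanged, which is exactly the inversion property the decryption rounds rely on.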


Inverse MixColumn Sublayer After the addition of the subkey, the inverse MixColumn step is applied to the state (again, the exception is the first decryption round). In order to reverse the MixColumn operation, the inverse of its matrix must be used. The input is a 4-byte column of the State C which is multiplied by the inverse 4×4 matrix. The matrix contains constant entries. Multiplication and addition of the coefficients is done in GF(2^8).
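The column transformation just described can be sketched as follows. The matrix constants are the standard first rows of the MixColumn matrix and its inverse; the helper names (`gmul`, `mul_column`) are ours, and the test column is the well-known (DB, 13, 53, 45) example:

```python
# Multiply two bytes in GF(2^8) with the AES reduction polynomial 0x11B.
def gmul(a, b):
    p = 0
    for _ in range(8):
        if b & 1:
            p ^= a
        a = (a << 1) ^ (0x11B if a & 0x80 else 0)   # multiply a by x, reduce
        b >>= 1
    return p

MIX     = [0x02, 0x03, 0x01, 0x01]   # first row of the MixColumn matrix
INV_MIX = [0x0E, 0x0B, 0x0D, 0x09]   # first row of the inverse matrix

def mul_column(col, row0):
    # Both matrices are circulant: row r is row 0 rotated right by r positions.
    out = []
    for r in range(4):
        b = 0
        for c in range(4):
            b ^= gmul(row0[(c - r) % 4], col[c])   # additions are bitwise XOR
        out.append(b)
    return out

col = [0xDB, 0x13, 0x53, 0x45]
mixed = mul_column(col, MIX)               # MixColumn of one state column
print(mul_column(mixed, INV_MIX) == col)   # True: InvMixColumn undoes it
```

The round trip confirms that the inverse matrix reverses the encryption-side MixColumn, column by column.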

The second column of output bytes (B4,B5,B6,B7) is computed by multiplying the four input bytes (C4,C5,C6,C7) by the same constant matrix, and so on. Each value Bi and Ci is an element from GF(2^8). Also, the constants are elements from GF(2^8). The notation for the constants is hexadecimal and is the same as was used for the MixColumn layer, for example: 0B = (0B)hex = (00001011)_2 = x^3 + x + 1. Additions in the vector-matrix multiplication are bitwise XORs. Inverse ShiftRows Sublayer In order to reverse the ShiftRows operation of the encryption algorithm, we must shift the rows of the state matrix in the opposite direction. The first row is not changed by the inverse ShiftRows transformation. If the input of the ShiftRows sublayer is given as a state matrix B = (B0,B1, . . . ,B15)

The inverse ShiftRows sublayer yields the output:
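Representing the state as four rows of four bytes, the shift and its inverse can be sketched as follows (a generic illustration; the byte-to-row mapping of the actual AES state is column-wise):

```python
# ShiftRows rotates row r of the state left by r byte positions;
# the inverse rotates each row right by the same amount.
def shift_rows(state):
    return [row[r:] + row[:r] for r, row in enumerate(state)]

def inv_shift_rows(state):
    return [row[-r:] + row[:-r] if r else row[:] for r, row in enumerate(state)]

state = [[r * 4 + c for c in range(4)] for r in range(4)]
print(inv_shift_rows(shift_rows(state)) == state)  # True
```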


Inverse Byte Substitution Layer The inverse S-Box is used when decrypting a ciphertext. Since the AES S-Box is bijective, i.e., a one-to-one mapping, it is possible to construct an inverse S-Box such that:

Ai = S^(-1)(Bi) = S^(-1)(S(Ai)), where Ai and Bi are elements of the state matrix. The entries of the inverse S-Box are given in Table 3. For readers who are interested in the details of how the entries of the inverse S-Box are constructed, we provide a derivation. However, for a functional understanding of AES, the remainder of this section can be skipped. In order to reverse the S-Box substitution, we first have to compute the inverse of the affine transformation. For this, each input byte Bi is considered an element of GF(2^8). The inverse affine transformation on each byte Bi is defined by:

Table 3. Inverse AES S-Box: Substitution values in hexadecimal notation for input byte (xy)


where (b7, . . . ,b0) is the bitwise vector representation of Bi(x), and (b'7, . . . ,b'0) the result after the inverse affine transformation. In the second step of the inverse S-Box operation, the Galois field inverse has to be reversed. For this, note that Ai = (Ai^(-1))^(-1). This means that the inverse operation is reversed by computing the inverse again. In our notation we thus have to compute

Ai = (B'i)^(-1) in GF(2^8),

with the fixed reduction polynomial P(x) = x^8 + x^4 + x^3 + x + 1. Again, the zero element is mapped to itself. The vector Ai = (a7, . . . ,a0) (representing the field element a7x^7 + · · · + a1x + a0) is the result of the substitution:
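The two steps just described (affine map and inversion in GF(2^8)) can be checked by building the forward S-Box from its definition and then inverting the lookup table; a sketch, using a brute-force field inverse for clarity (helper names are ours):

```python
def gmul(a, b):                      # GF(2^8) multiplication, polynomial 0x11B
    p = 0
    for _ in range(8):
        if b & 1:
            p ^= a
        a = (a << 1) ^ (0x11B if a & 0x80 else 0)
        b >>= 1
    return p

def ginv(a):                         # field inverse; 0 maps to itself by convention
    return 0 if a == 0 else next(x for x in range(1, 256) if gmul(a, x) == 1)

def affine(b):                       # the AES affine transformation on one byte
    rot = lambda x, n: ((x << n) | (x >> (8 - n))) & 0xFF
    return b ^ rot(b, 1) ^ rot(b, 2) ^ rot(b, 3) ^ rot(b, 4) ^ 0x63

SBOX = [affine(ginv(a)) for a in range(256)]
INV_SBOX = [0] * 256
for a, s in enumerate(SBOX):
    INV_SBOX[s] = a                  # so that S^(-1)(S(Ai)) = Ai

print(hex(SBOX[0x00]), hex(INV_SBOX[0x63]))  # 0x63 0x0
```

The table inversion is exactly the bijectivity argument from the text: because S is one-to-one, every output byte appears once, so the reverse lookup is well defined.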

Ai = S^(-1)(Bi).
4. Advantages of Using AES (the Advanced Encryption Standard)
4.1 Comparison of Existing Algorithms In this section, we compare the existing symmetric algorithms on the basis of different parameters, as shown in the table, which include block size, key length, security, and speed.

Speed Analysis


4.2 ANALYSIS With any cryptographic system using a 128-bit key, the total number of key combinations is 2^128. The time required to check all possible combinations at a rate of 50 billion keys/second is approximately 2 × 10^20 years; thus AES is very strong and efficient for use in e-commerce. 5. CONCLUSION Satisfying security requirements is one of the most important goals for e-commerce security designers. In this paper we give a protocol design for securing e-commerce transactions using a hybrid encryption technique. This hybrid encryption method will surely increase the performance of cryptographic algorithms, and the protocol ensures confidentiality, integrity and authentication. The encryption technology discussed in this paper is the key technology for making online transactions over the Internet secure. Of course, no one can guarantee 100% security. Fraud exists in current commerce systems: cash can be counterfeited, checks altered, credit card numbers stolen. Yet these systems are still successful because the benefits and conveniences outweigh the losses. Similarly, fraud will still exist in e-commerce even with good encryption technology protecting electronic transactions, but good encryption can at least reduce fraud significantly.
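The arithmetic behind this estimate, under the stated assumption of 50 billion keys per second (the exact figure also depends on whether the full keyspace must be searched):

```python
# Exhaustive key search time for a 128-bit key at an assumed 5 * 10^10 keys/s.
SECONDS_PER_YEAR = 365.25 * 24 * 3600
keys = 2 ** 128
rate = 50e9                      # 50 billion keys per second (assumption from text)
years = keys / rate / SECONDS_PER_YEAR
print(f"{years:.1e} years")      # on the order of 10^20 years
```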

6. References:
1. A Review on Ecommerce Security: Mrs. Sunita S. Padmannavar, International Journal of Engineering Research and Applications (IJERA), ISSN: 2248-9622, www.ijera.com.
2. The Study of E-Commerce Security Issues and Solutions: Niranjanamurthy M. (Assistant Professor, Dept. of MCA, MSRIT, Bangalore, India), Dr. Dharmendra Chahar (HOD, Dept. of CS & IT, Seth G. B. Podar College, Nawalgarh (Jhunjhunu) - 333042, India), ISSN (Print): 2319-5940, ISSN (Online): 2278-1021.
3. Securing Electronic Transactions to Support E-Commerce: Mohammad Nabil Almunawar, Faculty of Business, Economics & Policy Studies, Universiti Brunei Darussalam, e-mail: [email protected].
4. The Advanced Encryption Standard (AES): Cryptography by C. Paar and J. Pelzl, Copyright Springer-Verlag; Affiliated Professor at the University of Massachusetts at Amherst, USA, [email protected].
5. Efficiency of Modern Encryption Algorithms in Cloud Computing: Omer K. Jasim (Anbar, Iraq), Safia Abbas, El-Sayed M. El-Horbaty and Abdel-Badeeh M. Salem (Faculty of Computer and Information Sciences, Ain Shams University, Cairo, Egypt), International Journal of Emerging Trends & Technology in Computer Science (IJETTCS), www.ijettcs.org, Volume 2, Issue 6, November-December 2013, ISSN 2278-6856.
6. Advanced Encryption Standard (AES), http://en.wikipedia.org/wiki/Advanced_Encryption_Standard (8:01 pm, 30/01/2015).
7. Analysis of Security Algorithms in Cloud Computing: Randeep Kaur (Student, Masters of Technology), Supriya Kinger (Assistant Professor), Shri Guru Granth Sahib World University, Fatehgarh Sahib. International Journal of Application or Innovation in Engineering & Management (IJAIEM), www.ijaiem.org, Volume 3, Issue 3, March 2014, ISSN 2319-4847.
8. Modeling User Perceptions of E-Commerce Security Using Partial Least Square (Mohanad Halaweh), University of Dubai, ISSN 1042-1319.
9. Cryptography Based E-Commerce Security (Shazia Yasin, Khalid Haseeb, Rashid Jalal Qureshi), ISSN (Online): 1694-0814.
10. A Survey of Cryptographic Algorithms for Cloud Computing (Rashmi Nigoti, Manoj Jhuria, Dr. Shailendra Singh), ISSN (Print): 2279-0047, ISSN (Online): 2279-0055.
11. Security Enhancement in Secure Electronic Transaction Protocol (SETP), Satyanshu Srivastava, Rakesh Bharti, Department of Computer Science & Engineering, United Institute of Technology, Allahabad 211006, India. ISSN: 2277-3754.
12. Securing Electronic Transactions to Support E-Commerce (Mohammad Nabil Almunawar), Faculty of Business, Economics & Policy Studies, Universiti Brunei Darussalam.
13. Hybrid Model for Securing E-Commerce Transaction (Abdul Monem S. Rahma, Rabah N. Farhan, Hussam J. Mohammad), ISSN: 2231-1963.
14. http://radio-weblogs.com/0105910/2003/10/26.html (11:34, 15/02/2015).


Section II IT Applications for Libraries


CLOUD COMPUTING: A NEW BUZZ IN 21ST CENTURY LIBRARY SERVICES Dr. Pawan Kumar Sharma, Prof. Hemant Sharma, Satya Prakash Pandey Abstract Cloud computing is generally related to the abstraction of information technology software and services from the hardware they run on. The National Institute of Standards and Technology (NIST) expands this definition by examining specific characteristics (e.g. self-service, resource pooling, and elasticity), service models (e.g. software, platform, or infrastructure focus), and deployment models (e.g. public, private). This article gives detailed information about cloud computing: its definitions, cloud applications in different library systems, and some live examples of libraries using cloud computing at the national and international level. Keywords: Cloud Libraries, Cloud Computing, Automated library, Digital library, Virtual library. Introduction Library and information centers have always adopted the latest technologies available globally, transforming and uploading their data into the new systems, i.e. the automated library, digital library, virtual library and hybrid library. With the advent of cloud computing the process has become much easier: data can be transferred simply from local servers to cloud servers and users can access it globally via the World Wide Web. Mitchell (2011) says that one of the first big shifts in how libraries manage resources was the move from print-journal purchasing models to database-subscription and e-journal purchasing models. Libraries found this approach helped them scale their resources and provide better service just by thinking a bit differently about how they provided journals. Likewise, current cloud-computing initiatives allow technologists to rethink how we handle our computing resources. Shifting to cloud solutions gives libraries an opportunity to save time and resources and to re-allocate resources to improve service.
Cloud Computing Means Chellappa gave the first academic definition of the term Cloud Computing in 1997. The term came into popularity in 2007, when Kevin Kelly opined that "eventually we'll have the intercloud, the cloud of clouds". The National Institute of Standards and Technology (NIST) defines cloud computing as "a model for enabling ubiquitous, convenient, on-demand network access to a shared pool of configurable computing resources (e.g., networks, servers, storage, applications, and services) that can be rapidly provisioned and released with minimal management effort or service provider interaction" (Mell & Grance, 2011).

Genesis of Cloud Computing The term Cloud Computing derives from the use of a cloud image to represent the Internet in network diagrams. A 1996 Compaq business plan (George Favaloro) is the first document known to use the term, and it gained wider currency in
2006, when then Google CEO Eric Schmidt introduced the term to an industry conference. Why Clouds in Libraries With the emergence of the World Wide Web, the service and information provision of libraries has been restructured, recognizing that future libraries will shift focus from building huge collections to networked services, with greater emphasis on referral services (Lancaster, 1997). This prediction has come true, and libraries have extended their links and connections to wide networks and full-content databases through the web (Cohn et al., 2002). The issues of storage, network security, operating system upgrades, hardware costs and all of the various activities associated with maintaining a local computing infrastructure are pushed out to the service provider. Also, the cost of purchasing individual or bulk licenses for software products is mitigated because the application is centrally installed and utilized. Now it is time for libraries to adopt cloud computing to reduce the amounts spent on infrastructure so that they can focus efforts in other areas that were previously cost prohibitive (Fox, 2009). Romer (2012) described the features of cloud computing, its usefulness in information delivery services and how it can be used in a professional environment, noting that software delivered over the internet relies on technologies such as virtualization, programming techniques such as multi-tenancy, and load balancing for scalability. Liu et al. (2013) studied the use of cloud computing in a university employment information library by examining the relationship between companies and graduates in order to build a cloud of labour market information, namely a cloud computing database. Yuvraj (2013) in his survey explored librarians' inquisitiveness about the adoption of cloud computing in the libraries of Indian Central Universities.
Moreover, he studied the tools and techniques of cloud computing used in their daily library services and found that librarians are heavily reliant on cloud computing tools; the majority of them are using various devices to improve the quality of library and information services, though they are somewhat worried about the security prospects of such systems.

Singh and Veralakshmi (2012) pointed out that the cloud computing model could help libraries and information centers maintain proper administration and control over data assimilation, storage and dissemination, providing utmost customer satisfaction. Cloud application in different library systems 1. Automated Library In an automated library all the records, i.e. acquisition, cataloguing, classification, OPAC, etc., are entered through a library management software platform and the database is stored in a backend server which is hosted locally and available through the local area network/intranet. With the concept of cloud computing this database can be put on cloud servers and accessed globally using a web browser. 2. Digital Library In a digital library most of the data is in digital formats, i.e. DVD, CD-ROM or electronically created records, stored on one central server hosted locally and available through both intranet and internet. Using cloud computing, the data can be hosted on public cloud servers and the library services provided at a low marginal cost. 3. Virtual Library A virtual library consists of information material from different open resources. The information is organized in a virtual space by using computer networks. For hosting a virtual library website, one must register a domain name and the dedicated space required for the website. With the help of cloud computing, the website and its related files can be hosted on public cloud servers. Some live examples of libraries using cloud computing 1. OCLC Online Computer Library Center has been functioning as a cloud computing vendor. It provides cataloguing tools over the internet and allows member institutions/organisations to draw on its centralized data store.
This centralized database allows for the sharing of catalog records between libraries and greatly reduces the time spent cataloguing incoming material (OCLC, 2011). WorldCat is another example of cloud computing architecture, drawing on the union catalog infrastructure OCLC has built up over the years. 2. Library Thing One of the sites that combines aspects of social networking and cloud computing is Library Thing, founded by Tim Spalding. Library Thing offers services just like a social networking site, authorizing people to contribute information and suggestions about

books and allows them to interconnect globally to share interests. The site also offers web services for libraries; after paying a nominal fee, a library can draw on the vast database of recommendations and users available in Library Thing (https://www.librarything.com/). 3. Reed Elsevier Reed Elsevier is a service provider for scientific information, working with hospitals to provide point-in-time information to medical technicians as they need it, capitalizing on the cloud computing model. There is the possibility to place monographic and article content or even technical manuals so that technicians and other medical personnel can get assistance exactly when they need it. This utilizes the cloud computing model in that computers and other devices used in the medical profession can be tied into the data and applications provided by Elsevier from anywhere (http://www.reedelsevier.com/Pages/Home.aspx). 4. Amazon and Google Amazon has been developing a large web services architecture for years and now offers hosting services for data, priced at gigabyte-month and CPU-hour rates; we basically pay for what we actually use (http://www.amazon.in). Google has been working for years on the dissemination of information, hosting applications within its server farms on massive and highly redundant storage systems (https://cloud.google.com/why-google/). IBM is also showing interest in this field. 5. Kindle and MobileMe With a wireless connection, one can purchase and read a rapidly growing list of books and periodicals on the Kindle from any location; even the largest texts can be downloaded in seconds. Apple's MobileMe service provides distributed calendaring and messaging irrespective of the device being used.
Modifications made via one device are instantly reflected on all devices. For example, with the library acting as gatekeeper, an institution could provide mobile access to, say, a list of articles for its students simply by selecting them and giving the students a code which would bring up the list of articles from a vendor's cloud. The same cloud model works for preprint archives, data archives and digital object repositories. 6. SeerSuite SeerSuite was developed as a result of extensive research and development, with the goal of enabling efficient dissemination of scientific information and literature. SeerSuite refers to a

collection of open source tools that provide the underlying application software for creating academic search engines and digital libraries such as CiteSeerX (Urgaonkar, Teregowda & Giles, 2010). 7. DuraCloud DuraCloud is an open source technology project for preserving and archiving digital content. In 2009, the Library of Congress National Digital Information Infrastructure and Preservation Program (NDIIPP) and DuraSpace announced a joint pilot program to test the use of cloud technologies to enable perpetual access to digital content with DuraCloud (Library of Congress, 2009). The pilot program entered a second phase in 2010. Several open source releases of the DuraCloud software led to the public launch of the managed service on November 7, 2011 (DuraSpace, 2011). 8. Polaris Library Systems Polaris is one of the cloud-based library automation systems available in the market. The company also provides a standard acquisition and processing system. With a Polaris ILS Client License, a library can integrate various PC and print management systems at no extra cost. The system uses a number of well-known standards such as MARC 21 for bibliographic data, XML, Z39.50 for information retrieval, and Unicode (http://www.polarislibrary.com/). 9. Ex Libris Ex Libris is a well-known cloud service provider based in the USA. It provides cloud solutions for libraries with all the software and hardware support needed to serve users. Ex Libris is available for all types of libraries and also for consortia. It is built on various standards and offers features such as Unicode compatibility, flexibility, data migration and customization. Cloud Computing application in Indian Libraries A few attempts have been made by some Government autonomous bodies and Government organizations to provide service- and application-based cloud computing models for Indian libraries. 1.
Meghraj On 4th February 2014, the Government of India launched a GI Cloud initiative called Meghraj to accelerate the delivery of e-services in the country while optimising the government's information and communications technology (ICT) spending. Meghraj is simply a name coined for the purpose (Megh = Cloud, Raj = Rule, i.e. the rule of cloud computing). According to the government, the National Cloud will ensure optimum utilisation of infrastructure and speed up the development and deployment of eGov applications in the country (Press Information Bureau, 2014). 2. University of Mysore

The library of the University of Mysore (UoM) at Manasagangotri has been providing quality services to users. With its On-line Public Access Catalogue (OPAC) hosted at the URL http://mopac.mysore-univ.org, the UoM has become one of the early universities in the country to adopt mobile technology for library services. Arguably this is the first university library in India to use a cloud hosting facility (from Amazon) for the automation-related activities of all its constituent libraries situated in different cities, creating a union catalogue accessible to all through cloud hosting. 3. CVRS service of INFLIBNET INFLIBNET started Collaborative Virtual Reference Services (CVRS) for the academic community in colleges and universities. Users in colleges and universities (or even the general public) are free to post their questions on the CVRS website. The questions on CVRS are grouped into a number of subject categories, and a user is required to choose a subject category before posting a question. Answers to the questions are made available on the CVRS website by volunteering librarians. Libraries may specify the fields of specialization in which they would like to answer questions. 4. Shodh Gangotri The Shodhganga@INFLIBNET Centre provides a platform for research students to deposit their Ph.D. theses and make them available to the entire scholarly community in open access. The repository has the ability to capture, index, store, disseminate and preserve ETDs submitted by researchers.

http://shodhgangotri.inflibnet.ac.in/ 5. Open Journal Access System (OJAS)


This service of INFLIBNET provides a common application and infrastructure for hosting open access journals published by Indian academic institutions. The platform enables libraries to access, preserve and digitize content without maintaining servers at the local level.

http://www.inflibnet.ac.in/ojs/index.php/index/index 6. Union Catalogue of INFLIBNET/DELNET IndCat: the Online Union Catalogue of Indian Universities is a unified online library catalogue of books, theses and journals available in major university libraries in India. The union database contains bibliographic description, location and holdings information for books, journals and theses in all subject areas available in more than 157 university libraries across the country. A web-based interface is designed to provide easy access to the merged catalogues. IndCat is a major source of bibliographic information that can be used for inter-library loan, collection development as well as for copy cataloguing and retro-conversion of bibliographic records. IndCat consists of three components, available in open access to users and librarians:

Database    No. of Records    Universities
Books       1,28,36,579       160
Theses      2,65,351          309
Serials     33,184            213
http://indcat.inflibnet.ac.in/ DELNET maintains an online union catalogue of the books available with its member libraries. The union catalogue is continuously updated and growing in size. It can be searched by author, title, subject, conference, series, etc., and Boolean operators can also be used. It contains about 1,81,51,784 bibliographic records at present. Inter-library loan requests for books are placed online; requests are also received through e-mail.

Database                                  Records
Union Catalogue of Books (CCF)            1,81,51,784
Union List of Current Periodicals         36,940
Union Catalogue of Periodicals            20,235
Database of Periodical Articles           9,22,042
CD-ROM Database                           22,234
Union List of Video Recordings            6,000
Union List of Sound Recordings            1,025
Database of Theses and Dissertations      70,293
Database of E-books                       1,613

http://delnet.nic.in/database-statistics.htm 7. Digital Library of India The primary long-term objective is to capture all books in digital format, a task that could take hundreds of years and may never be completed. Thus, as a first step, it was planned to demonstrate feasibility by digitizing one million books (less than 1% of all books in all languages ever published) by 2005. It is believed that such a project has the potential to change how education is conducted in much of the world. A secondary objective of this project is to provide a test bed that will support researchers working on improved scanning techniques, improved optical character recognition, and improved indexing. The corpus this project creates will be one to three orders of magnitude larger than any existing free resource. Under the National Mission on Education through ICT (NMEICT) of the Ministry of Human Resource Development (MHRD), Government of India, the Library Automation & Resource Sharing Network project will use Koha, a free and open source integrated library system, as a tool to create a union catalogue on the Sakshat server at the IGNOU campus. 8. Knimbus Knimbus was founded in November 2010 and has seen impressive growth year after year; it is now used in over 1,200 institutions. Knimbus does not license content itself but helps users discover licensed content as well as openly accessible information, providing users with links to content that resides on the publishers' web platforms. The free model of Knimbus lets subscribers get started without paying, while the premium version offers institutes the ability to customize the platform with wider content access and library tools. Conclusion With the invention of cloud computing technology the process has become much easier and data can be transferred simply from local servers to cloud servers.
Cloud servers are located at a remote location, so there is no need to provide dedicated space or local servers.

Installing and maintaining high-end firewall software is also avoided, which effectively minimizes maintenance costs. A few attempts have been made by some Government autonomous bodies and Government organizations to provide service- and application-based cloud computing models for Indian libraries. The data stored on cloud servers is accessible through the internet from anywhere by using a web browser. Library services such as acquisition, technical processing, OPAC maintenance, circulation, serials control, news alerts, etc. are accessible over the internet, and librarians can upload, download and modify information as per the privileges provided by their institution.
References
1. Bhuvan Urgaonkar, Pradeep Teregowda & Lee Giles, C., 2010. Cloud computing: A digital library perspective. Accessed Dec 18, 2014. http://clgiles.ist.psu.edu/pubs/ICCC2010-cloud.pdf
2. CiteSeer. Accessed December 19, 2015. http://en.wikipedia.org/wiki/CiteSeer
3. Cohn, J.M., et al., 2002. Planning for Integrated Systems and Technologies: A How-to-Do-it Manual for Librarians. Facet Publishers, London.
4. DuraSpace, 2011. Accessed December 31, 2014. http://duraspace.org/duraspace-launchesopen-sourcecloud-service
5. Mitchell, E.T. Cloud computing and your library. Accessed December 20, 2014. http://web.tech.lib
6. Ex Libris: The Bridge to Knowledge. Accessed December 05, 2014. http://www.exlibris.co.il
7. Fox, Robert, 2009. Digital libraries: the systems analysis perspective, library in the clouds. OCLC Systems & Services: International Digital Library Perspectives, 25(3): 156-161.
8. Knimbus. Accessed December 18, 2014. http://www.knimbus.com/user/auth.do
9. Lancaster, F.W., 1997. Artificial intelligence and expert system technologies: prospects. In Raitt, D.I. (Ed.), Libraries for the New Millennium: Implications for Managers. Library Association, London, 19-38.
10. Library of Congress, 2009. Accessed November 18, 2014. http://www.loc.gov/today/pr/2009/09140.html
11. Liu, C., Zhao, X.M., & Liu, Y. (2013). Building of cloud computing in university employment information library. Journal of Convergence Information Technology, 8(6), 434-441.
12. Meghraj, 2014. Accessed November 18, 2014. http://deity.gov.in/content/gi-cloud-initiative-meghraj
13. Mell, P., & Grance, T. (2011). The NIST Definition of Cloud Computing: Recommendations of the National Institute of Standards and Technology. Maryland.
14. Moore, Geoffrey: Core Content and the Cloud. Accessed February 18, 2014. http://www.youtube.com/watch?v=0swJCYLH2C
15. National Institute of Standards and Technology (NIST). Accessed December 20, 2014. http://www.nist.gov/itl/cloud/
16. OCLC (2011). Accessed December 18, 2014. http://www.oclc.org/content/dam/research/publications/library/2011/2011-01.pdf?urlm=162949
17. Polaris Library Systems. Accessed December 21, 2014. http://www.gisinfosystems.
18. Press Information Bureau, Govt. of India, 2014. http://pib.nic.in/newsite/erelease.aspx?relid=102979
19. Singh, S.P., & Veralakshmi, R.S.R. (2012). Cloud computing: A promising economic model for library and information centers. DESIDOC Journal of Library and Information Technology, 32(6), 526-532.
20. University of Mysore WebOPAC hosted on the cloud. Accessed December 26, 2014. http://libcat.mysoreuniv.org
21. Yuvaraj, M. (2013). Cloud Computing Applications in Indian Central University libraries: A study of librarians' use. Library Philosophy and Practice. Accessed January 18, 2014. http://digitalcommons.unl.edu/cgi/viewcontent.cgi?article=2397&context=libphilprac


DIGITAL OBJECT IDENTIFIER: AN OVERVIEW Shraddha Shahane, Manjula Chauhan Abstract The Digital Object Identifier (DOI) is a concept that helps meet the needs of end-users using technology in a digital environment. DOIs can point to documents, images, sounds, video clips, parts of works, gateways, works under development, evolving works and invoice screens. Virtually anything that a URL might point to can now be handled by the DOI system. DOIs are designed for use in any digital network, not just the World Wide Web, which is only one recent aspect of the evolution of digital networks and the use of digital objects within them. Even where DOI is not used in the true sense, information collected, stored and retrieved using digital media, be it CD-ROMs, tape drives or web-based applications, is based on certain identifiers, which may be numbers generated by the databases themselves or entered manually. Therefore, one can easily identify a news item hosted under various categories for either modification or effective retrieval. This paper explores the development, scope, organizational authority and structure of DOI for managing, storing, retrieving, indexing and classifying information, and also sees a need for DOI as a tool to reach different target audiences and end users. Keywords: Information management, digital media, digital object identifier, digital rights management. Introduction The Digital Object Identifier (DOI) is an identification system for intellectual property in the digital environment: a unique alphanumeric string assigned by a registration agency to identify content and provide a persistent link to its location on the Internet. The publisher assigns a DOI when an article is published and made available electronically. The International DOI Foundation (IDF), a non-profit organization based in Washington, DC, and Geneva, is the governance body of the DOI system.
The DOI system provides persistent, interoperable identification of content on digital networks, using a federation of registries following a common specification (Paskin, N., 2013, p. 1586-1592). It is governed by the International DOI Foundation (abbreviated to IDF). The preferred usage, to avoid ambiguity, is with a qualifier referring to the specific component meant, and DOI names are made actionable in computer-sensible form through assignment, resolution, referent description and administration (Paskin, N., 2013, p. 1586-1592). In other words, the Digital Object Identifier (DOI) is an Internet-based global naming and resolution system that provides for the precise identification, retrieval, and trading of digital items in the form of articles, books, images, bibliographies, supporting data, videos, charts, tables, audio, and other electronic files.
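By way of illustration, a DOI name consists of a prefix beginning with the directory indicator "10." and a suffix chosen by the registrant, separated by a slash (e.g. "10.1000/182"); a minimal parser for this syntax (the function name is ours):

```python
# Split a DOI name such as "10.1000/182" into its prefix and suffix.
def parse_doi(doi):
    prefix, _, suffix = doi.partition("/")
    if not prefix.startswith("10.") or not suffix:
        raise ValueError("not a valid DOI name")
    return prefix, suffix

print(parse_doi("10.1000/182"))  # ('10.1000', '182')
```

The prefix identifies the registrant; the suffix is an opaque string, which is why a DOI can persist even when the object's location changes.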

Review of Literature
Paskin (2010) explored in his study that the DOI system provides identifiers which are persistent, unique, resolvable and interoperable, and thus useful for management of content on digital networks in automated and controlled ways. He found that assignment of a DOI name requires the registrant to record metadata describing the object to which the DOI name is being assigned. The metadata describes the object to the degree that is necessary to distinguish it as a separate entity within the DOI system. Tamizhchelvan (2012) found that DOI is an effective way to organize a media library, in managing information, storing, retrieving, indexing, classification and overall management of news flow in a media setup, and also sees a need for DOI as a tool to reach different target audiences and end users. From a media library point of view, with so much news flowing in a media organization, one needs to be both a subject and a systems expert to identify, organize, store and also retrieve information. Many attempts have been made to organize the newspapers over the years, from bound volumes to the recent full-text newspaper databases, to identify the news or information. Simons (2013) observed that the DOI system provides a framework for persistent identification, managing intellectual content, managing metadata, facilitating commerce and linking customers with content. DOIs are an implementation of the Handle System for persistent identifiers and seamlessly transport the user from one interface to another without requiring specific software. Information about a digital object may change over time, including where to find it and who owns it, but its DOI will not change.
The Objectives of Digital Object Identifier
Clifford (1997) discussed the role of libraries in the NISO standards-development process of the DOI, as library representatives are on the NISO committees. Recently the Coalition for Networked Information has been asked "to help increase understanding of the DOI's objectives and roles, particularly as they relate to library services, and to help to suggest ways in which the DOI might be made more useful" (p.56-62). The stated objectives include user-friendly identification of content; well-organized content in electronic format that benefits archival storage and easy retrieval; and improved business processes and decision support.

Development of the DOI System


Development of the DOI system began in 1996 when content creators and technologists jointly recognized that information and entertainment objects could not be commercially distributed on the Internet unless there was a common system of unique identification for those objects. The DOI system originated in a joint initiative of the following three trade associations in the publishing industry: 1. International Publishers Association 2. International Association of Scientific, Technical and Medical Publishers 3. Association of American Publishers. Although originating in text publishing, the DOI was conceived as a generic framework for managing identification of content over digital networks, recognizing the trend towards digital convergence and multimedia availability. The system was announced at the Frankfurt Book Fair in 1997. The International DOI® Foundation (IDF) was created to develop and manage the DOI system, also in 1997. The Corporation for National Research Initiatives (CNRI) worked with the IDF as a technical partner, and developed the Handle System as the digital network component of the DOI system. CNRI remains a technical partner of the IDF. In 2000, CrossRef, the first DOI Registration Agency, used the DOI system for citation linking of electronic articles, and NISO standardized the syntax of the DOI. The DOI system was approved as an ISO standard in 2010.
Organizational Authority of the DOI System
The International DOI Foundation (IDF) safeguards all intellectual property rights relating to the DOI system, manages common operational features, and supports the development and promotion of the DOI system. The IDF is controlled by the elected Board members of the Foundation and an appointed Managing Agent, who is responsible for coordinating and planning its activities. Membership is open to all organizations with an interest in electronic publishing and related enabling technologies.
The IDF has appointed several Registration Agencies, which provide the following services to DOI registrants:
1. Allocate DOI prefixes;
2. Register DOI names;
3. Provide the necessary infrastructure to allow registrants to declare and maintain metadata and state data;
4. Promote the widespread adoption of the DOI system;
5. Engage in marketing, training, development, etc. for their chosen community.
The International DOI Foundation maintains a list of current Registration Agencies. The IDF holds annual open meetings on DOI and related issues. Registration Agencies generally charge a fee to assign a new DOI name; part of these fees is used to support the IDF. The DOI system is operated by the IDF on a not-for-profit, cost-recovery basis. Registration Agencies must comply with the policies and technical standards established by the IDF, but are free to develop their own business models for running their businesses.
Scope of DOI
Digital material poses some classic dilemmas for publishers, as it is simultaneously both remarkably constant and amazingly protean. It can be copied effortlessly and quickly an indefinite number of times with absolute fidelity, and just as easily be cut and pasted and otherwise modified. Even more alarmingly, from a rightsholder's perspective, digital material can be freely distributed, with or without authorization (Davidson & Douglas, 1998, p.23-29). The DOI system is an abstract framework which does not specify a particular context of its application, but is designed with the aim of working over the Internet. A DOI name is permanently assigned to an object, to provide a persistent link to current information about that object, including where it, or information about it, can be found. The principal focus of assignment is content-related entities; that term is not precisely defined but is exemplified by text documents; data sets; sound carriers; books; photographs; serials; audio, video, and audiovisual recordings; software; abstract works; artwork, etc., and related entities in their management, for example, licenses or parties. A DOI name is not intended as a replacement for other identifier schemes, such as ISBN, ISSN, ISAN, ISRC, etc.
Parts of the DOI System
All DOI names begin with 10. The DOI system has three main components:
1. The identifier;
2. A directory system; and
3. A database.
1. The Identifier: This part is made up of two components: the first element is the prefix and the second element is the suffix, separated by a slash.
A. The Prefix: A unique number of four or more digits assigned to organizations or publishers by a registration agency. There may be multiple registration agencies serving separate geographical regions or each intellectual property sector (such as text publishing, photographs, music, software, etc.); however, at this stage there is only one registration agency and Directory Manager. Prefixes all begin with 10 to designate the DOI directory manager, followed by a number designating the publisher who will be depositing the individual DOIs, which ensures that a publisher can designate its own DOIs without fear of creating duplicate numbers. Publishers may choose to request a prefix for each imprint or product line, or may use a single prefix.
B. The Suffix: This is assigned by the publisher and was designed to be flexible with publisher identification standards. The suffix can be as simple as a sequential number or a publisher's own internal numbering system.
2. The Directory: The DOI system uses a central directory. When a user clicks on a DOI, a message is sent to the central directory, where the current web address associated with that DOI is looked up. This location is sent back to the user's Internet browser with a special message telling the system to "go to this particular Internet address." In a split second the user sees a "response screen", a web page, on which the publisher offers the reader either the content itself or, if not, further information about the object and how to obtain it. When the object is moved to a new server, or the copyright holder sells the product line to another company, one change is recorded in the directory and all subsequent readers will be sent to the new site.
3. The Database: Information about the object is maintained by the publisher. The DOI system also collects some minimum level of associated metadata to enable provision of automated, efficient services (e.g. look-up of DOIs from bibliographic data, citation linking). Thus information about the identified object (metadata) might be distributed over several databases. It might include the actual content, or information on where and how to obtain the content, or other related data.
From these database systems is generated the information that the user has access to in response to a DOI query, forming the third component of the DOI system.
Example: 10.1000/289
Prefix: 10.1000    Suffix: 289
Where: 10 denotes the DOI registry, 1000 denotes the registrant, and 289 denotes the item ID (a single object).
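As a minimal sketch of the structure and resolution logic described above, the following Python fragment splits a DOI name into its parts and models the central directory as a plain dictionary. The directory contents and the publisher URLs are invented for illustration; real resolution goes through the DOI directory (commonly via the doi.org proxy).

```python
def split_doi(doi: str):
    """Split a DOI name into prefix and suffix at the first slash.
    Every DOI prefix begins with the directory code '10.'."""
    prefix, _, suffix = doi.partition("/")
    if not prefix.startswith("10.") or not suffix:
        raise ValueError("not a valid DOI name: " + doi)
    registrant = prefix[len("10."):]        # '1000' in 10.1000/289
    return prefix, registrant, suffix

# Toy stand-in for the central directory: DOI name -> current location.
# The example hosts are hypothetical.
directory = {"10.1000/289": "https://publisher-a.example/item/289"}

def resolve(doi: str) -> str:
    """Return the location currently registered for a DOI."""
    split_doi(doi)                          # validate the name first
    return directory[doi]

# When the object moves, one change in the directory redirects all
# subsequent readers to the new site, while the DOI itself is unchanged:
directory["10.1000/289"] = "https://publisher-b.example/item/289"
assert resolve("10.1000/289") == "https://publisher-b.example/item/289"
```

Note that only the prefix is guaranteed slash-free, so splitting at the first slash is the safe way to separate the two elements.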

Applications of the DOI System
The DOI system includes the following major applications:
1. Persistent citations in scholarly materials (journal articles, eBooks, books, etc.) through CrossRef, a consortium of around 3,000 publishers;
2. Research datasets through DataCite, a consortium of leading research libraries, technical information providers, and scientific data centers;
3. Permanent global identifiers for commercial video content through the Entertainment ID Registry, commonly known as EIDR;
4. European Union official publications through the EU Publications Office.
Conclusion
Publishing on the Internet requires new tools for managing content. Where traditional printed texts such as books and journals provided a title page or a cover for specific identifying information, digital content needs its own form of unique identifier. In the fast-changing world of electronic publishing, there is the added problem that ownership of information changes and the location of electronic files changes frequently over the life of a work. This is important both for internal management of content within a publishing house and for dissemination on electronic networks. The network environment creates an expectation among users that resources can be linked and that these links should be stable. The DOI system provides a way to identify related materials and to link the reader or user of content to them. DOI is a registered trademark of the International DOI Foundation, Inc., and the system avoids ambiguity of information in the virtual world.
References

1. Clifford, L. (1997). Identifiers and their role in networked information applications. ARL: A Bimonthly Newsletter of Research Library Issues and Actions, 194, p.56-62. Retrieved 20/1/2015 from http://www.arl.org/newsltr/194/identifier.html
2. Davidson, L.A. & Douglas, K. (1998). Digital Object Identifiers: promise and problems for scholarly publishing. Journal of Electronic Publishing, 4(2), p.23-29. Retrieved 4/12/2014 from http://dx.doi.org/10.3998/3336451.0004.203
3. Paskin, N. (2013). Digital Object Identifier System. Encyclopedia of Library and Information Sciences, p.1586-1592. Retrieved 15/1/2015 from www.doi.org
4. Simons, N., Searle, S. & Lee, S. (2013). Digital Object Identifiers (DOIs): introduction and management guide, p.1-8. Retrieved 16/12/2014 from www.ands.org.au/cite-data/griffith_doi_guidelines-4.pdf
5. Tamizhchelvan, M., Ganesh, A.C. & Swaminathan, S. Digital Object Identifier in effective media library management: an Indian perspective, p.140-49. Retrieved 20/12/2014 from www.researchgate.net
6. olabout.wiley.com/WileyCDA/Section/id-406097.html
7. www.doi.org
8. wikipedia.org/wiki
9. www.apastyle.org
10. www.doi.org/doi-handbook-introduction.html


ROLE OF INSTITUTIONAL REPOSITORIES & OSS FOR E-RESOURCE MANAGEMENT Aslam Ansari Abstract An IR is a collection of digital research documents such as articles, book chapters, conference papers, and data sets. E-prints are the digital texts of peer-reviewed research articles, before and after refereeing. Before refereeing and publication, the draft is called a "pre-print"; the refereed, accepted, final draft is called a "post-print". The term e-prints includes both pre- and post-prints. With the increasing use of information and communication technologies (ICTs) and the availability of open source software, most institutions maintain such a repository or archive to collect, preserve, and make accessible the entire intellectual output created by the scholarly communities of those institutions. For the creation of an IR in the context of e-resource management, issues such as content management, user interface, user administration, system administration and interoperability need to be resolved. This paper describes the basic features of IRs and ERM and explains the relationship between digital library software and IRs. The paper also provides different parameters for the selection and evaluation of digital library software for creating IRs. Four digital library software packages were selected for comparative study on the basis of their popularity and sustainability among users. The comparative study will help in the selection of software for the creation of IRs, covering accessibility of assets, access control, metadata, search and browsing, collection support and their relationships. The paper also describes the essential elements of IRs and their policies, such as Digital Rights Management, contributors, metadata creation, migration of data and user support. Keywords: Institutional repositories, Digital assets, Digital Library, E-Library, Digital Library Software, D-Space, Greenstone, Fedora, EPrints, Metadata, OAI-PMH, E-Resource,
E-Resource Management (ERM), Open Source Software (OSS).
Introduction: Institutional Repositories (IRs) are online archives for storing, preserving and disseminating the digital assets (the intellectual output) of institutions, with open access to them. Lynch (2003) defines an institutional repository as a set of services that a university offers to the members of its community for the management and dissemination of digital materials created by the institution and its community members; it is most essentially an organizational commitment to the stewardship of these digital materials, including long-term preservation where appropriate, as well as organization and access or distribution. Swan (2013) finds the main advantages and benefits of institutional repositories to be:
To collect and curate digital assets;
To manage and measure research and teaching activities;
To encourage and enable interdisciplinary approaches to research;
To open up the outputs of the institution to a worldwide audience;
To maximize the visibility and impact of these outputs as a result;
To showcase the institution to interested constituencies: prospective staff, prospective students and other stakeholders;
To provide a workspace for work-in-progress, and for collaborative or large-scale projects;
To facilitate the development and sharing of digital teaching materials and aids;
To support student endeavours, providing access to theses and dissertations and a location for the development of e-portfolios.
E-Resource Management: ERM (electronic resource management) emerged in the early 2000s to manage licensed electronic resources, especially internet-based resources such as electronic journals, databases, and electronic books. The development of ERM became necessary as it became clear that traditional library catalogues and integrated library systems were not designed to handle metadata for resources as mutable as many online resources are. Feather (2007) submits the basic features of ERM as:
Supporting acquisition and management of licensed e-resources;
May be integrated into other library system modules or may be a standalone system;
May have a public interface, either separate or integrated into the OPAC;
Providing descriptions of resources at the package (database) level and relating package contents (e.g. e-journals) to the package record;
Encoding and perhaps publicly displaying licensed rights such as e-reserves, course packs, and interlibrary loan;
Tracking electronic resources from point of order through licensing and final access;
Providing information about the data providers, consortial arrangements and access platforms;
Providing contact information for all content providers;
Logging problems with resources and providers;
Providing customizable e-mail alerting systems (e.g. notices to managers when actions are expected or required);
Linking license documents to resource records;
Supporting retrieval of SUSHI (Standardized Usage Statistics Harvesting Initiative) usage statistics.
Managing e-resources through OSS has greatly evolved during the last few years; these management systems provide an appropriate framework both for the production and the management of e-resources in an IR system. OSS digital library software provides extensible features to the administrator and full rights to the software under the GPL, and provides access to different types of information sources in a variety of formats. A digital library may contain simple metadata or catalogues of information resources, such as OPACs, or may contain the full text of documents, images, audio, and video materials.
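Several of the ERM features listed above (package-level description, licensed rights, provider contacts, problem logging) can be sketched as a simple record structure. The class and field names below are illustrative assumptions, not the schema of any real ERM system.

```python
from dataclasses import dataclass, field

@dataclass
class EResource:
    """A hypothetical ERM record for one licensed e-resource package."""
    title: str                     # e.g. an e-journal package
    provider: str                  # content provider / access platform
    provider_contact: str          # contact information for the provider
    licensed_rights: dict = field(default_factory=dict)  # e.g. ILL, course packs
    problem_log: list = field(default_factory=list)

    def log_problem(self, note: str) -> None:
        """Record an access or provider problem against the resource."""
        self.problem_log.append(note)

# Illustrative usage; all values are invented.
pkg = EResource(
    title="Example Journals Package",
    provider="Example Platform",
    provider_contact="support@example.org",
    licensed_rights={"interlibrary_loan": True, "course_packs": False},
)
pkg.log_problem("2014-12-01: access outage reported to provider")
```

A real ERM system would add licensing dates, order tracking and SUSHI statistics retrieval on top of such a record.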
A digital library may be defined as a searchable collection of digital objects, or as a set of creation, organization and retrieval services; digital libraries bring together collections, services and people in support of the full life cycle of creation, dissemination, use, and preservation of data, information and knowledge. Digital libraries are organizations that provide the resources, including the specialized staff, to select, structure, offer intellectual access to, interpret, distribute, preserve the integrity of, and ensure the persistence over time of collections of digital works so that they are readily and economically available for use by a defined community or set of communities. An electronic library is a library in which collections are stored in electronic media formats and accessible via computers; the electronic content may be stored locally, or accessed remotely via computer networks. Gladney et al. (1994) provide the most comprehensive definition of a digital library: an assemblage of digital computing, storage and communications machinery together with the content and software needed to reproduce, emulate, and extend the services provided by conventional libraries based on paper and other material means of collecting, cataloguing, finding and disseminating information. A full-service digital library must accomplish all the essential services of traditional libraries and also exploit the well-known advantages of digital storage, searching and communication. Content management, user interface, user administration, system administration, interoperability, etc. must all be addressed when providing access to a variety of information resources through institutional repositories: users may choose one or more resources or collections with just one query, which is passed on to the various resources or collections by the digital library interface, and results are brought back after the search is carried out. A much better approach now available is a cross-database searching facility through one place (window), without requiring the user to know which specific collection or database to search. Choudhary (2010) describes a conceptual design of a digital library for an institutional repository to provide access to different types of information sources in a variety of formats: the user does not need to search resources one by one, which may be a better approach from their perspective; they formulate one search query and get results from all the different resources.
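The cross-database searching idea above can be sketched as one query fanned out to every collection, with the results gathered in one place by the digital library interface. The collection back-ends here are placeholder functions, not a real API.

```python
# Placeholder search back-ends for two collections; a real system
# would query an OPAC, a full-text index, a repository, and so on.
def search_catalogue(query: str) -> list:
    return ["catalogue record matching '%s'" % query]

def search_fulltext(query: str) -> list:
    return ["full-text document matching '%s'" % query]

COLLECTIONS = {"catalogue": search_catalogue, "fulltext": search_fulltext}

def federated_search(query: str) -> dict:
    """Pass one query to every collection and merge the results,
    so the user need not know which collection to search."""
    return {name: search(query) for name, search in COLLECTIONS.items()}

hits = federated_search("institutional repository")
```

The user formulates the query once; adding a new collection means registering one more search function, with no change to the user-facing interface.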
In the process of developing an IR, the main issue relates to the selection of software. Pyrounakis and Nikolaidou (2009) suggest the following characteristics of digital library software to consider when choosing:

Object model; Collection and relation support; Metadata and digital content storage; Search and browse; Object management; User interface; Access control; Multiple language support; Interoperability features; and Level of customization.
Architecture of Software for E-Resource Management: Clements and Northrop (1996) find that users can be assumed to have some acquaintance with common terms and concepts of software, but not with its technical aspects, although architecture may denote a certain architectural style, such as client-server, of a particular system. Architectural issues include gross organization and global control structure; protocols for communication, synchronization, and data access; assignment of functionality to design elements; physical distribution; composition of design elements; scaling and performance; and selection among design alternatives.

Clements and Northrop (1996) find three main reasons for developing a software architecture: mutual communication; early design decisions; and a transferable abstraction of a system. If a project has not achieved a system architecture, including its rationale, the project should not proceed to full-scale system development; specifying the architecture as a deliverable enables its use throughout development and maintenance. The following specific software packages are available for developing IRs to manage electronic resources (DOAR, 2014 (annexure)):
D-Space: D-Space is open source digital library software, a joint project of HP (Hewlett-Packard) Labs and the MIT (Massachusetts Institute of Technology) Libraries in Cambridge, released in 2002. It is freely available to research institutions as an open source system that can be customized and extended. Some characteristics are given below: It has a service model for an open digital archive with perennial access;

Provides a platform for framing an institutional repository, searchable collections and web retrieval. It also helps to make institution-based scholarly material available in digital formats. It is written on the Java platform. Besides these characteristics, some specific features are:
BSD open source licence
Uses a qualified metadata standard (Dublin Core)
Web-based user interface
Available on multiple platforms, such as Unix, Windows and Linux (some versions)
Field-based indexing (metadata)
Full-text and hierarchical browsing
Supports UTF Unicode for multilingual documents and interfaces
Complies with OAI-PMH (Open Archives Initiative Protocol for Metadata Harvesting)
Service model for open access
Level of customization.
Greenstone: Greenstone is an OSS suite for building and distributing digital library collections, promoting digital library objectives. It is produced by the New Zealand Digital Library (NZDL) project at the University of Waikato, headed by Dr. Ian H. Witten, and sponsored and distributed in cooperation with UNESCO and the Human Info NGO (Belgium) under the terms of the GNU Public License. Its code is written in the Java, Perl and C++ programming languages. It provides a way to build up, maintain and distribute digital collections.

Simply, the main features of GSDL are:
Full-text, fielded and flexible web searching
Creates access structures automatically
Plugin architecture for processing documents
Customization of collection presentation
Collections may contain text, graphics, audio and video
Multi-language support for the user interface
Collections support multiple formats
Accessible via web browsing
Supports multi-gigabyte collections

Uses advanced compression techniques
Collections can be published either on the Internet or on CD-ROM
Administrative login capabilities.
Fedora: Flexible Extensible Digital Object Repository Architecture (FEDORA) is a repository management system and digital asset management (DAM) architecture. It began as a DARPA- and NSF (National Science Foundation)-funded Digital Library Research Group project in 1997, with an initial release under the GNU Public License in May 2003. According to the project, the key features of Fedora are listed below:
XML storage - Digital objects are stored as XML-encoded files that conform to an extension of the Metadata Encoding and Transmission Standard (METS) schema.
Parameterized disseminators - Behaviors defined for an object support user-supplied options that are handled at dissemination time. Queries are supported for both full text and field-specific searches across metadata fields.
OAI Metadata Harvesting - The OAI Protocol for Metadata Harvesting is a standard for sharing metadata across repositories. Every Fedora digital object has a primary Dublin Core record that conforms to the schema. This metadata is accessible using the OAI Protocol for Metadata Harvesting, v2.0.
Batch Utility - The Fedora repository system includes a Batch Utility as part of its administrative tools.
Access Control and Authentication - Although advanced access control and authentication are not scheduled until Phase II of the project, a simple form of access control has been added in Phase I to provide access restrictions based on IP address. IP range restriction is supported in both the Management and Access APIs (Application Programming Interfaces). In addition, the Management API is protected by HTTP Basic Authentication.
Default Disseminator - The Default Disseminator is a built-in internal disseminator on every object that provides a system-defined behavior mechanism for disseminating the basic contents of an object.
Searching - Selected system metadata fields are indexed along with the primary Dublin Core record for each object, and the Fedora repository system provides a search interface. It is notable, however, that over the years of the project there have been no significant changes in the listed features.
EPrints: EPrints is also free and open source software, developed by the School of Electronics and Computer Science (University of Southampton); the project started in 1999 and the software was released in 2000 under the GPL license, with a Microsoft Windows version later released on 17 May 2010 (ROAR Software Version Listing). The name refers to preprints and postprints of research journal articles (eprints = preprint + postprint): the software enables a community to deposit their preprints, postprints and other scholarly publications using a web interface, and organizes these publications for easy retrieval (Tramboo et al., 2012). It is among the most widely used and most functional of the available OA IR software. It is a command-line and web-based application on the LAMP architecture, but written in Perl rather than PHP, and

runs under Linux, Mac and Solaris OS (Sponsler & Van de Velde, 2001). EPrints uses the MySQL, Oracle, PostgreSQL and Cloud databases, and the Apache web server. Tramboo et al. (2012) describe its specific features: Boolean and nested searching; accessibility via web browser; OAI-PMH support (the Open Archives Initiative protocol allows sites to programmatically retrieve or 'harvest' metadata from several sources and to offer services using that metadata, such as indexing or linking services; such services give e-prints servers the potential for a global network of cross-searchable research information, by allowing the contents of servers around the world to be searched simultaneously using the OAI protocol); support for UTF-8 and UTF-16 (Unicode, for multiple languages); multiple file formats; statistics for administrators; preview of items; customization; and administrative functions. It supports metadata formats such as Dublin Core and METS, and end-user deposit.

Comparative analytical structure of the model (grading: 1 = lowest, 5 = highest):

Characteristics                          D-Space   Greenstone   Fedora   EPrints
Object model                                4          3           5        2
Collection support and relations            4          5           4        1
Metadata and digital content storage        4          3           5        3
Search and browse                           4          4           3        4
Object management                           4          2           2        4
User interface                              4          4           2        4
Access control                              5          2           4        2
Multiple language support                   3          4           3        4
Interoperability features                   5          4           5        5
Level of customization                      3          4           5        3
Total                                      40         35          38       32
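All four packages above list OAI-PMH support. As a concrete sketch, a harvester fetches a URL such as http://repository.example/oai?verb=ListRecords&metadataPrefix=oai_dc (the host is hypothetical; the verb and metadataPrefix are standard OAI-PMH parameters) and extracts Dublin Core fields from the XML reply. Here the reply is an inline sample with invented record content, parsed with Python's standard library.

```python
import xml.etree.ElementTree as ET

# A fragment shaped like an OAI-PMH ListRecords response carrying
# unqualified Dublin Core; the record content is made up.
SAMPLE = """<OAI-PMH xmlns="http://www.openarchives.org/OAI/2.0/">
  <ListRecords>
    <record>
      <metadata>
        <oai_dc:dc xmlns:oai_dc="http://www.openarchives.org/OAI/2.0/oai_dc/"
                   xmlns:dc="http://purl.org/dc/elements/1.1/">
          <dc:title>Sample deposit</dc:title>
          <dc:creator>A. Author</dc:creator>
        </oai_dc:dc>
      </metadata>
    </record>
  </ListRecords>
</OAI-PMH>"""

NS = {"oai": "http://www.openarchives.org/OAI/2.0/",
      "dc": "http://purl.org/dc/elements/1.1/"}

def harvest_titles(xml_text: str) -> list:
    """Pull dc:title values out of each harvested record."""
    root = ET.fromstring(xml_text)
    return [t.text for t in root.findall(".//dc:title", NS)]

titles = harvest_titles(SAMPLE)   # -> ['Sample deposit']
```

This is the mechanism behind the cross-searchable network described for EPrints: any service can harvest such records from many repositories and index or link them centrally.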

Pyrounakis and Nikolaidou (2009), on the other hand, compare digital library software at the level of these features/characteristics, in order to understand how effective and efficient each is for building institutional repository systems for e-resource management, whether for organizations or for individuals; software selection may then depend upon the grading, where 1 is the lowest score and 5 the highest. Besides that, a library should focus on the needs of users, make resources more easily accessible, and have an ERM; Sadeh and Ellingsen (2005) suggest knowing the different processes of the life cycle of e-resources.


Conclusion: While ERM issues are being resolved by many commercial systems, interaction with e-resources would still be based on library products such as link servers and metadata search (OPAC). The use studies cited by Sharma (2009) indicate that the practical use of e-resources is not up to the worth of the investments made in acquiring these resources; infrastructure and training programmes should also be revised as per requirements. It is observed that the availability of e-resources is almost sufficient for all the existing disciplines, but the infrastructure to use these resources is not adequate, and OSS packages are available to create IRs for ERM.
Annexure
Courtesy: OpenDOAR (2014) (http://www.opendoar.org/find.php?format=charts)

References:
1. ROAR Software Version Listing (2014). http://roar.eprints.org/view/software/
2. International Workshop on Architecture for Software Systems, Seattle, Washington.
3. Chowdhury, G.G. (2010). Introduction to modern information retrieval. 3rd ed. London: Facet Publishing. p.457.
4. Clements, Paul & Northrop, Linda (1996). Software Architecture: An Executive Overview (CMU/SEI-96-TR-003). Retrieved June 22, 2014, from the Software Engineering Institute, Carnegie Mellon University website: http://resources.sei.cmu.edu/library/asset-view.cfm?AssetID=12509
5. Electronic resource management. (2014, September 1). In Wikipedia, the Free Encyclopedia. Retrieved 14:52, December 14, 2014, from http://en.wikipedia.org/w/index.php?title=Electronic_resource_management&oldid=623757900
6. Feather, Celeste (2007). Columbus, OH: OhioNET.
7. Gladney, H.H.; Fox, E.A.; Ahmed, Z.; Asany, R.; Belkin, N.J. & Zemankova, M. (1994). Digital library: gross structure and requirements: report from a March 1994 workshop. http://www.csdl.tamu.edu/DL94/paper/fox.html
8. http://en.wikipedia.org/wiki/Digital_library
9. http://en.wikipedia.org/wiki/Fedora_Commons
10. http://shodhganga.inflibnet.ac.in/bitstream/10603/3731/17/17_chapter%208.pdf (Fedora digital library software evaluation)
11. http://www.dspace.org
12. http://www.fedora-commons.org
13. http://www.greenstone.org

14. http://www.opendoar.org/find.php?format=charts (data received till Sept. 4, 2014)
15. http://www.wikipedia.org
16. Jose, Sanjo (2007). Adoption of open source digital library software packages: a survey. Ahmedabad: CALIBER. Available at http://eprints.rclis.org/8976/1/Sanjojose.pdf
17. Library Solutions, edited by Prasad, H.N.; Tripathi, Aditya & Mishra, Rajani. New Delhi: EssEss Publications, p.71-85.
18. Lynch, C.A. (2003). Institutional repositories: essential infrastructure for scholarship in the digital age. ARL Bimonthly Report, No. 226.
19. Journal of Library Science, 6, p.51-69.
20. Pyrounakis, George & Nikolaidou, Mara (2009). Comparing open source digital library software. In Handbook of Research on Digital Libraries: Design, Development, and Impact, p.51-60.
21. Digital library: a comparative study of GSDL and D-Space.
22. Sadeh, Tamar & Ellingsen, Mark (2005). Electronic resource management systems: the need and the realisation. New Library World, 106(5/6), p.208-18.
23. Pandey, S.K. Sharma (1993). Library computerization: theory and practice. New Delhi: EssEss Publications. p.91.
24. Sharma, C. (2009). Use of e-resources at Guru Gobind Singh Indraprastha University. Electronic Journal of Academic and Special Librarianship, 10(1). Available at http://southernlibrarianship.icaap.org/content/v10n01/sharma_c01.html
25. Handbook of Software Engineering and Knowledge Engineering, Vol. 1. River Edge, NJ: World Scientific Publishing Company.
26. Shaw, M. & Garlan, D. (1996). Software architecture: perspectives on an emerging discipline (Vol. 1, p.12). Englewood Cliffs: Prentice Hall.
27. Sponsler, Ed & Van de Velde, Eric F. (2001). Eprints.org software: a review. SPARC E-News, August-September 2001. http://resolver.caltech.edu/CaltechLIB:2001.004
28. D-Lib Magazine, 9(4), April 2003.
29. Swan, Alma (2013). Open Access institutional repositories: a briefing paper. Open Scholarship. Retrieved Sept. 2013 from http://www.openscholarship.org/upload/docs/application/pdf/200901open_access_institutional_repositories.pdf
30. Tramboo, Sankar, et al. (2012). International Journal of Computer Applications, 59(16).
31. CLIR Issues, 4 (July/August). http://www.clir.org/pubs/issues/issues04.html

DEVELOPMENT OF KNOWLEDGE THROUGH E-LEARNING

Raghvendra Tripathi, Chanchal Gyanchandani, Anamika Shrivastava

Abstract: E-learning means learning through electronic sources or media. The present generation takes more interest in e-learning sources than in traditional sources, which has led to the establishment of e-libraries in institutions across the world. An e-library provides e-learning facilities to its clients through modern techniques and applications, and it has completely changed users' learning habits. This paper highlights e-learning habits among different groups such as children, students and adults. It also explains e-learning, its objectives and functions, and e-learning habits among students, including students with disabilities. The study also examines the effect of e-learning on distance education students. It finds that the e-learning approach is efficient and improves students' achievement and their attitudes toward this new, systematic way of learning based on computers and multimedia tools.

Keywords: E-Library; E-Learning; Distance Education; Higher Education; Information Technology.

Introduction

E-learning, or electronic learning, is a non-conventional education method in which regular physical attendance and face-to-face contact with the instructor are not required; learning can be done from anywhere, at any time, at the student's convenience and at a place suitable to him or her. E-learning is supported by ICT such as the Internet, CD-ROMs or a standalone computer. It is an online teaching method built on interactive presentations, videos, chat, online lectures, notes, quizzes, tests and so on. In the era of information explosion, an enormous amount of information is produced daily, which has led to a constant state of information overload for all of us. Reading habits are also changing in this electronic age.
Reading habits are shifting from printed information resources to electronic resources. In the 21st century, people rarely have time to carry or consult physical material to collect information, and e-reading also allows large numbers of students to access education: the constraints of the face-to-face learning experience, such as the size of rooms and buildings and the student/teacher ratio, no longer apply. In developed countries, as well as in the urban areas of developing countries, most people have shifted towards e-reading. This is especially common among students of schools, colleges and universities, whose first preference when collecting reading material is e-material obtained through the Internet. E-reading has also affected the growth of research. Earlier, collecting articles or journals was a difficult task and students depended entirely on traditional libraries, but now most research articles are available online and easily accessible through the Internet. Collecting articles or journals in

their printed formats is also expensive. Carrying a hundred printed books is impractical, but thousands of e-books can easily be carried in one's pocket on a mobile device. In educational institutions, teachers too are using electronic resources and tools to conduct classes and to provide e-reading material to students for further reference.

Literature Review

Biswas, Vhokto (2013) discussed the impact of e-learning in accelerating the education system, the development of e-learning, and how it gears up the entire economic development of a nation; he considered the new form of e-learning environment and its effect on learning culture. Information technology is one of the revolutionary forces that has had a major impact on human lives. He also observed that e-learning can help increase student engagement, motivation and attendance, key requisites for learning. Sen, Saswati (2009) focused on academic libraries and their rapidly developing use of information and communication technology, and on how academic libraries influence the changes to teaching and learning that result from an e-education environment. She also points out how modern academic libraries provide technology-based information anywhere, anytime, besides providing resources for innovative and lifelong learning. Mestri, M. & Goudar, P. K. (2002) noted that a number of studies on e-learning technology, its applications, content development services and impact have been done by researchers in India and abroad; they studied the various areas of library and information services in which e-learning can be applied, because e-learning offers many new approaches to old ways of sharing knowledge such as traditional, classroom-based education and training. Mirjana Radovic-Markovic (2010) tried to measure the role of online learning and how far it is accepted among students and entrepreneurs in Serbia.
She concludes that, in order to change the existing prejudice, it is necessary to show the general public all the advantages of online education, so that both future students and their prospective employers can form a realistic picture of this modern and flexible form of education. Internet education will soon become a dominant form of education worldwide and is expected to reach its peak within a few years.

Main Components of E-Learning

The content prepared for e-learning aims at delivering a course in an interesting manner with the help of all possible media support such as text, animation, simulation, graphics, etc. As such, the task of designing e-learning material is highly specialized: it requires domain knowledge, expertise in the different software tools that can enhance the content with multimedia, and expertise in instructional technology. The process of content creation therefore requires a marriage of two kinds of expertise, i.e. domain knowledge/subject-matter experts and web/instructional designers or multimedia script writers. A combined team of such personnel can make the content livelier and also present it in a logical sequence. The logical sequencing and delivery of content is as important as the content itself, since the objective is to assemble content that is easy to comprehend and

easier to remember. The content should not be a mere copy of a book or of lecture notes delivered in a class. The content is typically organised into four quadrants, described below.

First Quadrant: The first quadrant defines the structure of the course along with its textual content. It comprises the basic description of a module: prerequisites, introduction, objectives, keywords, summary and textual content.

Second Quadrant: The second quadrant comprises the multimedia enrichment of the content, which may include audio or video clips, animations, simulations, etc.

Third Quadrant: The third quadrant provides links to external resources available on the web as well as supporting material, for example: Did you know?, points to ponder, glossary, FAQs, links to Wikipedia, other websites, blogs, discussion forums, etc.

Fourth Quadrant: The fourth quadrant includes the self-assessment material. Assessment and evaluation questions may take different formats: multiple-choice questions, true/false statements, sequencing, match-the-columns, problems, quizzes, etc.

What is E-Learning

E-learning, or electronic learning, is reading content through an electronic device such as a computer screen, an electronic book reader or some other device; it means learning through electronic sources or media. E-learning includes numerous types of media that deliver text, audio, images, animation and streaming video, and includes technology applications and processes such as audio or video tape, satellite TV, CD-ROM and computer-based learning, as well as local intranet/extranet and web-based learning. The present generation takes more interest in e-learning sources than in traditional sources, and e-learning is now an influential part of higher education: it is comfortable, flexible, accessible and continually up to date within the wider structure of higher education. Electronic learning has been considered both as an alternative to the face-to-face teaching method and as a complement to it, often referred to as hybrid or blended learning.
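The four-quadrant model described above is essentially a content structure, and can be sketched as a small data model with a grader for the fourth quadrant's self-assessment. All field names, the sample module and the grading function below are illustrative assumptions, not part of any standard:

```python
def grade_mcq(questions, answers):
    """Score the fourth-quadrant self-assessment: one point per correct choice."""
    return sum(1 for q, a in zip(questions, answers) if q["correct"] == a)

# A hypothetical module laid out along the four quadrants.
module = {
    "quadrant_1": {  # structure and textual content
        "title": "Introduction to E-Learning",
        "objectives": ["define e-learning", "list delivery media"],
        "text": "E-learning means learning through electronic media...",
    },
    "quadrant_2": {  # multimedia enrichment
        "video_clips": ["intro.mp4"],
        "animations": ["timeline.gif"],
    },
    "quadrant_3": {  # external links and supporting material
        "links": ["https://en.wikipedia.org/wiki/E-learning"],
        "faqs": ["What is blended learning?"],
    },
    "quadrant_4": {  # self-assessment
        "mcq": [
            {"question": "E-learning requires physical attendance.",
             "choices": ["True", "False"], "correct": "False"},
        ],
    },
}

score = grade_mcq(module["quadrant_4"]["mcq"], ["False"])
print(score)  # one correct answer scores 1
```

A real learning management system would add metadata such as prerequisites and keywords per quadrant, but the separation of concerns is the same.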
Many researchers have defined the term e-learning in different ways. Some important definitions follow. According to Clark Aldrich (2004), e-learning is "a broad combination of processes, content, and infrastructure to use computers and networks to scale and/or improve one or more significant parts of a learning value chain, including management and delivery". Originally aimed at lowering management costs while increasing the accessibility and measurability of employee learning, e-learning is increasingly being extended to advanced learning techniques such as simulation and communities of practice, and to customers and vendors as well. According to Aptitude Media (2006), "e-learning is learning that involves the acquisition, generation and transfer of knowledge using information and communication technology (ICT)". Laurillard (2006) defined e-learning as the "use of any of the new technologies or applications in the service of learning or learner support", and argues that e-learning can make a significant difference in the process of learning and teaching for all the stakeholders, including students, tutors and academic

institutions. E-learning can improve the process of learning by measuring how quickly learners acquire a particular skill with reasonable ease and by retaining their interest. This set of complex new technologies will make an impact on the cultural, intellectual, social and practical experience of learning.

Functions of E-Learning

The functions of e-learning are as follows:
- Transformational faculty development must be coupled to institutional change.
- Transformation to learner-centred communities, which can be achieved with learner-centred technology.
- Course management systems will be a critical enabling force driving institutional change.
- Faculty development transitioning to learner-centred technology.

Importance of E-Learning

Several points clarify the importance of e-learning in the present situation:
- To equip e-learners with the information skills needed to exploit information.
- To ensure that access to high-quality information is integrated into course provision.
- To provide appropriate advice and assistance to e-learners in information searching.
- To address the related communication and costing issues.
- Easy to access from any place, available around the clock, 24x7x365.

Need for E-Learning

The impact of e-learning has been felt at all educational levels and, according to Manjunath, B. (2006), e-learning is becoming an influential force in higher education today. He points out some of the reasons why e-learning matters in the present scenario and in higher education:
- Learning is self-paced and gives students a chance to speed up or slow down as necessary.
- It is designed around the learner.
- It accommodates multiple learning styles, using a variety of delivery methods geared to different learners, and is more effective for certain learners.
- Geographical barriers are eliminated, opening up broader education options.
- Learning is self-directed, allowing students to choose content appropriate to their differing interests, needs and skill levels.
- It enhances computer and Internet skills.

E-Learning Habits

In the digital era, e-reading is a very common activity: reading content on a computer screen, an electronic book reader or some other electronic device. Thanks to modern technology, most people nowadays prefer e-learning to printed material. In modern

society, people can get all the materials they wish from the Internet, whenever they wish; not only reading materials but also multimedia materials are available online, so users are turning away from the traditional way. Through e-learning, one can get updates on a particular topic at a much faster pace, which helps to improve one's knowledge. The way people read has changed with time, and it is a continual process. Earlier, when the choice was between newspapers, magazines and books, people did not have many options; nowadays computers and the Internet have changed the way people read. Computer screens, electronic book readers and other devices are used for electronic reading, and social networks such as Facebook, Twitter and Google help people reach material widely. New technology makes e-learning ever more attractive, and day by day user habits are shifting from traditional book reading to e-learning.

Role of Libraries in Promoting E-Learning

E-learning offers libraries a powerful medium for reaching faculty and students directly as they engage in teaching, learning, research and outreach. In recent years libraries have promoted e-reading habits at the public and school levels, acquiring e-books and software that are helpful to students and teachers. A good number of public libraries have already started using e-books in their effort to promote a better reading environment, keep up with client demands and address the specific need for an e-reading component in providing digital access to professional and academic articles and full-text e-books. Though many school libraries were slow at first to respond to the mounting need for an e-reading component in their programmes, more school libraries are now field-testing services through which students and teachers can access e-books.
Some are even adding e-readers to their collections and allowing students to borrow the e-reader together with the books loaded on it. Libraries nowadays provide information through different databases, subject gateways and material made available through subscription; they provide access to online subscription-based material, e.g. Emerald Insight, Elsevier, Book Za, etc., which is very helpful to researchers as well as teachers. It is clear that libraries strongly promote e-learning to their users, because nowadays users do not have time to spend in the library to use its materials.

Advantages of E-Learning

Electronic reading has several advantages. When a reader needs up-to-date information, obtaining it through printed materials takes longer; electronic and web media help readers save time as well as get up-to-date information. The advantages of electronic books stem from the specialized features inherent in almost any kind of electronic document. Some advantages of e-learning are given below:
- Content can easily be shared with others in a few seconds.
- E-resources are very easy to collect and can be accessed at any time worldwide.
- Various resources are easy to access.
- Learning material is easy and quick to review, update and edit.
- It saves time for users.

Barriers to E-Learning

There are also some barriers to e-learning:
- It requires a high-speed Internet connection, which is not available everywhere.
- E-learning requires a technology infrastructure that may not be available in some countries.
- Some students may get lost or confused in e-learning activities.
- Health problems such as body pain, eye strain and headache.
- Without a computer and the Internet, e-learning stops entirely.
- Systems may be hacked.
- Sometimes particular software or plug-ins are required on the device; otherwise the documents cannot be opened.
- On-screen reading can be difficult: font size, colour, multiple columns, etc.

Impact of E-Learning on the Educational System

ICT has transformed our lives and reshaped the nature of everyday activities; in contemporary times a society has emerged that is called the "information society" or knowledge society, a society in which all activities are directly or indirectly driven by information. E-learning has emerged as a blessing to the educational system at all levels. It is an evolution of the chalk-and-talk method that transforms and enhances the participative nature of education. E-learning is a promising tool for expanding and widening access to education: because it relaxes space and time constraints, ICT now allows people to participate in education with far more flexibility than the traditional face-to-face model. People living in rural areas, non-mobile students and even foreign students can now participate more easily. The following points highlight the impact of e-learning on the educational system:
- E-learning provides a faster way of disseminating research to the masses.
- Through e-learning, students have a more positive learning experience because it is more dynamic.
- The concept of e-learning has consolidated the idea of lifelong learning; complete lifelong education is admittedly difficult to attain, but e-learning is a huge step towards it.
- Libraries, as hubs and providers of information, feel relieved due to the timely delivery of information to their clientele.
- E-learning has definitely overcome the language-border factor.

Conclusion

E-learning, done right, delivers. Through the integration of various media, especially multimedia, e-learning has made the process of learning fun, interesting, involving and more understandable. E-learning has evolved day by day and is spreading its roots into the education sector, providing innumerable benefits. In the era of information technology, students can easily get their course material via the Internet. E-learning comprises

the new form of learning environment and its effect on learning culture. E-learning needs new and emerging, simple and scalable technologies; it can provide an alternative teaching and learning solution, with the potential to simultaneously reach thousands of learners in schools and communities around the world. The study shows that the e-learning approach is efficient and improves students' achievement and attitudes toward this way of learning, using new technology based on computers and multimedia tools.

References:


1. Biswas, Vhokto. "E-Learning and its Impact on Education: A Conceptual Study." ASA University Review 7.1 (2013): 1-17.
2. Sen, Saswati (2009). International Conference on Academic Libraries, Delhi, pp. 176-179. Available at http://crl.du.ac.in/ical09/papers/index_files/ical29_46_135_1_LE.pdf
3. Mestri, M. & Goudar, P. K. (2002). E-learning and its application in library and information services. University News, Association of Indian Universities, 40(7), 13-18.
4. Radovic-Markovic, Mirjana (2010). E-learning in comparison to traditional forms of learning, pp. 289-298. Available at http://ideas.repec.org/a/pet/annals/v10y2010i2p289-298.html (accessed 18/04/2014)
5. Sharma, C.K., Gupta, Sushma and Kumar, Anil. E-Library. New Delhi: Shree Publication & Distributors, 2010.
6. Hall, Chris, Van Den Berg, Nicola and Adamson, Kemi. "What is e-learning?" A Guide to E-learning (2007): 1-8.
7. Gilroy, K. (2001). Collaborative: the right approach.
8. El-Arif, Arman Abdallah and Taha. "The Effect of e-learning Approach on Students' Achievement in Biomedical Instrumentation Course at Palestine Polytechnic University." Communications of the IBIMA 9 (2009): 1-6.
9. http://www.ignouflexilean.ac.in


E-LEARNING ENVIRONMENT: THE NEW CONCEPT OF ACADEMIC LIBRARIES

Nidhi S Tiwari, Manoj Tiwari, Dr. Ramnivas Sharma

Abstract: As information technologies change day to day and grow at a tremendous speed, the knowledge society is becoming more complex, competitive and dependent on technological change and the information explosion. The need for e-information services is also growing and becoming essential. Web-based e-learning and teaching environments have influenced every facet of library and information services in academic libraries, providing new opportunities and challenges for library professionals to participate in the knowledge-based society, including electronic and multimedia publishing, Internet-based information services, global networking, web-based digital resources, etc. Librarians are charged with selecting and organizing resources, instructing patrons on how to locate and use them, and preserving information regardless of format or technology. The information revolution and the knowledge available on the Web have created new challenges to these traditional professional ethics. The emerging challenges of acquiring and providing access to electronic knowledge resources require librarians to change their role from traditional librarian to information scientist, learning and applying new skills to understand evolving technologies so as to manage and provide quality online information services to the knowledge society. The vision of the future academic library professional must therefore be to create a world-class networked global library and information centre that provides timely, web-based, quality information services to users in the e-learning environment. This study also discusses the future vision and need for change in academic libraries, the trends and challenges before library professionals in the e-learning environment, and the changing roles of the academic library professional.
Keywords: Information Technology, Academic Library Professional, Web Technology, Technology Challenges, E-Resources, E-Learning Environment.

Introduction: The information atmosphere around the world is changing every minute and growing at a tremendous speed due to the emergence of web-based Information and Communication Technologies (ICT), the globalization of networks and the Internet. Ensuring and organizing access to educational materials in the electronic environment is therefore an important factor in meeting realistic demands for the development and advancement of education. The information revolution and the pervasive belief that everything is available on the Web have created new challenges to traditional library professional ethics. Acquiring and providing access to electronic knowledge resources require library professionals to change their role from traditional librarian to information scientist by learning and applying new skills to understand the evolving technologies and to manage and provide quality online information services to the patrons of the knowledge society. Since

almost all educational institutions, organizations, universities and academic associations have created their own websites with digital repositories on the Internet, the global networked environment has paved the way for e-literacy. Web-based e-learning and teaching environments have influenced every facet of library and information services in academic libraries, providing new opportunities and challenges for library professionals.

Objectives: The objectives of this study are:
1. To analyze and explore the changing vision and roles of future academic library professionals in meeting the changes and challenges of the e-learning environment (the primary objective).
2. To document the various changes and challenges facing the academic library professional in the e-learning environment.
3. To define and explain the concepts of e-literacy and the digital learning environment in academic institutions, which are changing the role of the library professional.
4. To discuss the various skills library professionals need to meet users' present online and digital needs.

Vision of the Future Academic Library Professional: Technology will continue to change, and libraries and librarians have to use the changing technology to provide the best access and service to their patrons. Electronic information challenges the library community at its very foundation, moving it away from the traditional paper-and-print format to an ethereal world of circuits and connectivity. The library is no longer defined simply as a building or a physical repository that houses information.
The essential future vision of the academic library professional, to achieve the necessary information transformation and to meet users' digital information needs, should concentrate on the following:
- The vision of the future academic library professional must be to create a world-class networked global library and information centre that provides timely, web-based, quality information services to users in the e-learning environment.
- Librarians must turn the library environment into a pathway to high-quality information in a variety of electronic media and information sources.
- Library professionals must assert their evolving roles in more proactive ways, both within their academic institutions and in the context of increasingly competitive markets for information dissemination and retrieval.
- The vision for 21st-century librarians must embrace electronic teaching and learning, both to guide and to beckon the library profession as education leaders. They should shape the library programme and serve as a tool for library media specialists to shape students' learning in academic institutions.

Review of Earlier Literature on the Changing Roles of Librarians in the E-Learning Environment: The concept of a digital library and its uses for faculty at the university,

and the changing role of librarians in creating and managing digital libraries, are described by Joseph Janes, Assistant Professor at the University of Washington Information School. He also presented a case study of the Internet Public Library, developed between 1994 and 1995 by the then School of Information and Library Studies at the University of Michigan, which illustrated how a digital library can support education [9]. Christine Dugdale, in her presentation on an electronic library system offering access to electronic reservation systems and a current awareness service, has shown how short-loan collections can provide access to a great quantity and range of material for a larger population of learners [6]. Bonk (2004) reviews the trends in online e-literacy programmes in colleges and universities both in the United States and around the world, describing teachers' desire to empower the learner and the power of future developments such as simulations and virtual-world technology in education [4]. Karen Jurasek says that libraries must uphold professional standards and a commitment to service [11], and describes the library, along with its services, resources and technology, as both a physical and a virtual space for the 21st century. He also concludes that the academic library professional must develop a virtual electronic learning system to enhance virtual learning environments (VLEs); the aim of his project was to integrate open library resources and closed learning environments. Since a virtual learning environment contains links to resources, both licensed and free, overlaps with electronic reserve systems, and has a dynamic linking potential with the library, librarians should be involved in creating and maintaining VLEs as resource managers in this new environment of web-based courses [17].
Kasperek, Johnson, Fotta and Craig found that students' comfort level with the library in general improves once they have the opportunity to get to know librarians and library responsibilities, which also improved subject-specialty liaison work [13]. Dewey likewise promotes embedding academic librarians into as many campus venues as possible as a form of collaboration, and Gamble argues for the recognized presence of academic librarians on university governance committees, faculty unions, clubs and student activities as legitimate modes of university service that ought to be valued and rewarded by library administration.

E-Literacy/Virtual Learning Environments in Academic Institutions and the Digital Future of Academic Libraries: E-learning is a means of becoming literate involving new mechanisms for communication, such as computer networks, multimedia, content portals, search engines, electronic libraries, distance learning and web-enabled classrooms. Different web-based applications such as e-mail, real-time conferencing, webcams, etc. are used as important tools in the process of e-learning. Technological innovations have brought tremendous changes to the whole education process and have led to a paradigm shift from a teacher-based to a learner-based education system. Developments on the electronic networking frontier have changed the whole dimension of the education system. The Internet, another cost-effective means of reaching learners at a distance, is gaining ground throughout the world. It is acting as a catalyst for change in the education

process. It has taken education beyond the classroom and lecture hall into a new era of networked and collaborative learning. Since the aim of the e-learning environment in education is to enable student-based educational projects, in which students directly experience different cultural contexts and access a variety of digital information sources via a range of appropriate information and communication technology, future academic library professionals should change their role by developing new standards and skills to meet users' future digital information needs. Today almost all academic institutions, universities and college libraries have been automated with library software and connected to the Internet, with intranet and extranet facilities, through which they provide access to relevant e-journals and e-books over proxy-server-based networks. The future of academic library services may change accordingly to fulfil the needs of patrons in the e-learning environment. Libraries have outstanding potential as the "third place", after home and work, for learning, inspiration and entertainment. It is therefore essential to change the environment, structure and interiors of academic libraries according to users' digital information needs; the future library should not have collection storage as its main function. Library professionals must enable e-learning opportunities for users at a global level, giving access to a variety of digital information sources via a range of appropriate World Wide Web technology. E-learning is a catch-all term that covers a wide range of instructional material that can be delivered on a CD-ROM or DVD, over a local area network (LAN), or on the Internet. It includes Computer-Based Training (CBT), Web-Based Training (WBT), Electronic Performance Support Systems (EPSS), distance or online learning and online tutorials. The major advantage to students is its easy access [14].
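The proxy-server-based access to e-journals mentioned above typically works by rewriting a licensed resource's URL so that the request passes through an authenticating library proxy. The proxy host and URL pattern below are hypothetical, shown only to illustrate the idea, not the interface of any particular proxy product:

```python
from urllib.parse import quote

# Assumed proxy endpoint of a campus library (illustrative, not a real service).
PROXY_BASE = "https://proxy.library.example.edu/login?url="

def proxied(url):
    """Return the proxy-prefixed form of a licensed resource URL.

    The target URL is percent-encoded so it survives as a single
    query-string value; the proxy authenticates the user and relays
    the request to the publisher.
    """
    return PROXY_BASE + quote(url, safe="")

print(proxied("https://www.emeraldinsight.com/journal/jd"))
```

Off-campus users follow the rewritten link, sign in once at the proxy, and then browse the publisher's site as if they were on the campus network.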
So, providing access to online e-journals and e-books through networks will enhance users' self-directed learning.

Trends and Challenges before the Future Academic Library Professional in the E-Learning Environment: The first and foremost challenge for library professionals in meeting users' future academic needs in the e-learning environment is to provide electronic access to all relevant information and to integrate it across networks worldwide. The second challenge is to create new physical library premises with computer network facilities, abandoning the old concept of the library as a storehouse; the third challenge is to develop new standards and skills for the library profession so as to meet user needs proactively. In this e-learning and e-publishing environment, electronic reference services and other support services, with appropriate expertise and digital repositories, are becoming a must. The most pressing and pervasive issues and challenges that library and information science professionals face in the present digital era in providing digital information services to the knowledge society are:
i) New generation of learners
ii) Copyright
iii) Privacy/confidentiality
iv) Online/virtual crimes and security

165

v) Technology challenges vi) Manpower vii) Collection of digital e-resources viii) Organizational Structure ix) Preservation / archiving of digital e-resources x) Lack of clarity in vision The New Generation of Learners They are coming to higher education with aptitude, knowledge and expectations that have been shaped by the use of the Internet, digital media, and portable communication technologies. Students often begin their search for information with Google or similar commercial or social search engines. The academic library professional must develop a virtual electronic learning an increasingly diverse group of users. Copyright An important issue that the present day library professionals are facing in providing electronic/digital information service is the large scale of piracy of software and plagiarism. The cost and timeliness in retrieving the information are also considered. When negotiating access with a publisher, the librarian must agree to certain restrictions on photocopying or distribution of electronic materials. Despite copyright notices and efforts to educate employees and users about intellectual property rights, electronic publications can be easily forwarded to people outside the licensed user group. The library is responsible for maintaining the awareness of all users about copyright issues. Privacy/confidentiality Maintaining privacy and confidentiality is another problem in accessing online information. To control pirating of software, copying or downloading all the contents of any e-resource at a time, right to obtain information and right to withhold or ban the access is essential and so there is a delicate challenge between privacy and rights to information. Now a day almost all the users are having their own e-mail accounts and they are often sending and receiving important information and even secret programmes and databases through e-mail itself and storing them for future usage. So maintaining privacy from e-mails is a great issue. 
Protecting one network from another to maintain confidentiality of information is another problem in securing databases on the Internet and intranets.
Online/Virtual Crimes and Security
Presently, web/cyber crimes have become a common threat on the Internet. To overcome this issue, compulsory virus-proofing procedures should be adopted while downloading e-information from any other system. To secure systems from viruses, databases can be protected by hacker-proof procedures. Separate login and password systems should be compulsorily adopted on network systems. In the LAN environment, the real danger is the gradual erosion of individual liberties through the automation, integration, and interconnection of many small, separate record-keeping systems, each of which alone may seem innocuous and wholly justifiable. To overcome the above database security problems and issues, it is essential to install database security software or firewall technology, such as Norton Anti-virus software and IBM e-network Firewall technology, to protect the databases.
Technology Challenges
The library profession's code of ethics clearly states that everyone should have access to information. The recent explosion of information available on the Internet presents challenges to the traditional American Library Association (ALA) code of ethics that is taught in library school. Librarians make ethical decisions every day on the basis of the culture of their organizations. Some organizations limit access to particular levels of employees by requiring a username and password; others may institute behind-the-scenes filtering software or restrictive policies for providing access to the entire Internet. Because these steps challenge the very essence of librarianship, access policies will help to clarify who has access to the Internet, under what conditions, for what purposes, and with what restrictions. Policies should consider how to integrate the new technology and how its use reflects the objectives and values of the library.
Organizational Structure
Technology has broken down the rigid hierarchical structure of organizations, which is another important issue changing the roles of the librarian in the knowledge society. Far from emulating the organization of conventional libraries, the organization and structure of digital libraries, and the division of labour within them, are open to considerable experimentation. For example, as publishers and professional societies disseminate works electronically, they are testing how far their investments should incorporate the full range of library functions, and digital libraries license content from publishers and professional societies that manage their own repositories.
Collection of e-resources
Collecting materials and making them available to all current and future users is another core value of librarianship. The challenge for the librarian is to contribute to establishing realistic collection-development policies covering acquisition of, and provision of access to, electronic resources for users now and in the future. With the increase in electronic resources, librarians and libraries are no longer just collecting and caring for print materials. Unlike a print book or a journal, electronic resources cannot be considered a permanent addition to a collection. Payment for a product covered by a license is payment to use the information product for a period of time that is usually specified in a contract. This payment is not for the outright purchase of the product or for ownership of all the rights to that product. A digitized collection means that libraries share the use of the collections with other institutions, not only locally but also globally. It is the publisher who dictates how much access will be provided, which issues will be available, and how much that access will cost.
Changing Roles of the Future Academic Library Professional:
The changing role of the library professional implies a set of updated skills needed for facing the challenges created by the latest web technologies in the e-learning environment. The emphasis will shift from technical skills in the library to communication, facilitation, training and management skills. Although technology presents the librarian with ethical challenges, the librarian must be ready for the role of information professional in the connected, networked world, and must acquire skills that can contribute to success in these new roles.
Leadership Role
One primary role of librarians is to provide leadership and expertise in the design, development, and ethical management of knowledge-based information systems in order to meet the information needs and obligations of the patron or academic institution. In the future, as now, we can expect the virtual library to be the organization that identifies, selects, negotiates for, and provides access to an incredible range of information resources on our behalf. At present, many virtual libraries have been created and are managed by various institutions and organizations for e-learning and teaching professionals. Hence library professionals should enrich their management skills to play a leadership role in the digital future, organizing, managing and disseminating e-literacy to users.
Proactive Information Professional Role
The modern trend is for the role of the librarian to move from that of a passive intermediary responsible for guiding patrons to appropriate information resources, towards a much more proactive professional role which includes analyzing and repackaging information, managing content information systems and institutional digital repository management systems.
Role of Information Scientists in Digital Libraries and E-Literacy
Librarians have to change their role in the e-learning environment by participating in e-learning experiments and becoming involved in university e-learning centres. They should invest in procuring e-learning tools and software and should develop their e-learning and ICT skills. Hans Roes addressed changes in education in general, and then focused on strategic opportunities in education for libraries. The opportunities for libraries, he mentioned, included: developing digital libraries as natural complements to digital learning environments to support educators with respect to the selection of adequate resources for a given course;

managing and indexing digital student portfolios and integrating them with other information resources offered by the library; teaching information literacy to educate future knowledge workers, in traditional ways or via Internet-based instruction modules; collaborating as part of multidisciplinary teams of experts to design courses; and providing a learning center to serve as a physical learning environment suitable for more active learning styles.
Conclusion:
The remarkable growth of the Internet has brought a significant revolution to all areas of science and technology. Rather than merely a tool for searching and retrieving information, the Internet has become the king of all media, by which we can access virtual information and build a virtual library to provide timely, quality service to users. Librarians of this digital era are in a position to change their role as information scientists/gatekeepers and to meet the challenges of the Internet, the World Wide Web and online access in the knowledge society. So they must enrich their knowledge with special skills in the latest IT developments, to browse, access and retrieve particular information across global networks, and to organize and manage information by building digital libraries, through which they can provide quality e-information service to the knowledge society. Library staff must be capable of working effectively in partnership with faculty members to enhance the strength of teaching and research. To be certain, there are many staff members of this kind in academic libraries today. In this as in other respects, part of the skill set library staff must develop is the ability to educate faculty members, helping them to understand the power and applicability of e-resources.
References

1. Abbott, Andrew (1998). Professionalism and the Future of Librarianship. Library Trends, 46(3), 430-445.
2. Allen, Michael W. (2003). Guide to e-Learning. John Wiley.
3. Arant, Wendi and Benefiel, Candace R. (2003). The Image and Role of the Librarian. Haworth Press.
4. Bonk, C. (2004). The Perfect E-Storm: Emerging Technology, Enormous Learner Demand, Enhanced Pedagogy, and Erased Budgets. Observatory on Borderless Higher Education, Reports, June.
5. Dewey, Barbara (2004). The Embedded Librarian: Strategic Campus Collaborations. Resource Sharing & Information Networks, 17(1/2), 5-17.
6. Dugdale, Christine (1999). The Role of Electronic Reserves in Serving and Shaping New Teaching and Learning Environments in UK Universities. Journal of Information Science, 25(3), 183-192.
7. Gamble, Lynne E. (1989). University Service: New Implications for Academic Librarians. The Journal of Academic Librarianship, 14(6), 344-347.
8. Horton, William (2000). Designing Web-Based Training. John Wiley.
9. Janes, Joseph (2001). Digital Libraries as Learning Tools. Tilburg: Ticer B.V., p.3.1-3.6.
10. Joint, N. (2005). Strategic Approaches to Digital Libraries and Virtual Learning Environments (VLEs). Library Review, 54(1), 5-9.
11. Jurasek, Karen (2008). Trends and Challenges before the Future Academic Library Professional that Will Shape the Future of Academic Libraries.
12. Kasperek, Sheila, et al. (2006). Do a Little Dance: The Impact on Students when Librarians Get Involved in Extracurricular Activities. The Journal of Academic Librarianship, 32(6).
13. Kinnie, Jim (2002). Making a Case for the Tenure Banjo. American Libraries, 33(10), 58.
14. Kurtus, Ron (2004). What is E-Learning? (accessed on 20.06.2009).
15. Levy, Philippa and Roberts, Sue (2005). Developing the New Learning Environment: The Changing Role of the Academic Librarian. Facet Publishing, 256p.
16. Lipka, Sara (2004). The Secret Lives of Academics. Chronicle of Higher Education, 51(8), A6.
17. MacColl, John (2001). Project ANGEL: An Open Virtual Learning Environment with Sophisticated.


APPLICABILITY AND AVAILABILITY OF OPEN SOURCE SOFTWARE AND CLOUD COMPUTING IN LIBRARIES Dr. Sarita Verma Mrs. Rashmi Sikarwar Abstract New technological developments have brought major changes to every field. Cloud computing is a new trend that is still evolving across the information technology industry, business and academia. Everything is available on the cloud; we can access information, data, files, software, websites etc. from anywhere in the world, at any time. Cloud computing data centers deploy input/output and networking solutions that scale to support any level of performance. Open Source Software is a unique concept in which everybody is free to change the code of the software and upgrade its utility. Open source programs are typically created as a collaborative effort in which programmers offer the user flexibility of use and share the changes within the community. Library professionals should be aware of the advantages of open source software and should be involved in its development. They should have basic knowledge about its selection, installation and maintenance. This paper explores the actual meaning of cloud computing and OSS and also discusses the applicability of OSS with cloud computing in libraries. The use of OSS and cloud computing in libraries, and how both actually work, is illustrated in this communication. Keywords: Open Source Software, Cloud computing, ICT, Future technology, Library system Introduction Open source is a development method for software that harnesses the power of distributed peer review and transparency of process. The promise of open source is better quality, higher reliability, more flexibility, lower cost, and an end to predatory vendor lock-in. Open Source Software (OSS) is a recent phenomenon that has the potential to revolutionize the software industry. It has already gained a strong foothold in the server software segment, with a leading market share worldwide in some software categories.
This development culture includes hundreds of thousands of distributed programmers voluntarily producing, sharing, and supporting their software with no monetary compensation for their efforts. Hasan (2009) describes open source programs as a collaborative effort in which programmers offer the user flexibility of use and share the changes within the community. Cloud computing, for its part, has evolved from earlier concepts such as parallel computing, grid computing and distributed databases. Cloud is a term used as a metaphor for wide area networks (like the Internet) or any such large networked environment (Kotwani & Kalyani, 2013, p.4). It came partly from the cloud-like symbol used to represent the complexities of the networks in schematic diagrams. The basic principle of cloud computing is distributing tasks across large numbers of distributed computers rather than relying on local computers or remote servers. Each cloud provider has its own set of pricing, billing, flexibility, support and other important parameters in its model of computing the service.
Review of Related Literature
Padhy and Mahapatra (2012) explored the application of cloud computing in academic libraries in Orissa. Information and communication technologies have become a global phenomenon; they have changed our daily lives and all forms of endeavour within business and governance with the help of different technologies. Cloud computing is a new concept that is still evolving across the information technology industry and academia. The basic principle of cloud computing entails the reduction of in-house data centers and the delegation of a portion or all of the information technology infrastructure capability to a third party. In this study the researchers discuss the problems faced with digital libraries and development efforts to overcome those problems. Ghosh (2012) studied how cloud computing helps to extend library services for better sustainability in the academic field. It enables computer software and hardware resources to be accessed over the Internet without the need to have any detailed or specific knowledge of the infrastructure used to deliver the resources. Cloud computing is a set of technologies that allows computing applications and data to be exposed as a set of services from a pool of underlying resources. Scacchi (2007) studied an initial set of findings from an empirical study of social processes, technical system configurations, organizational contexts, and interrelationships that give rise to open software. The focus is directed at understanding the requirements for open software development efforts, and how the development of these requirements differs from those traditional to software engineering and requirements engineering.
Four open software development communities are described, examined, and compared to help discover what these differences may be. Eight kinds of software informalisms are found to play a critical role in the elicitation, analysis, specification, validation, and management of requirements for developing open software systems. Hasan (2009) explored the future of OSS with respect to the important characteristics of OSS for individual requirements. He described the importance of various OSS presently used worldwide, with special reference to popular OSS used in library and information science in India; merits and demerits of the OSS environment were also highlighted.
Objectives of Open Source Software and Cloud Computing in Libraries
They promote creative developments in libraries. Those who cannot afford proprietary software can download open source programs free from the Internet, and the money saved can be used to purchase other needed materials.

With their help, librarians can easily modify software to suit patrons' needs as well as their own.
Open Source Software (OSS)
Open Source Software (OSS) came into existence with the development of ICTs. The term implies that users can modify the software to make it work according to their needs. OSS also helps in taking care of severe budget cuts, increased demand for services, lack of adequate staffing, etc. Open source software is written and supported by programmers, many coming from the hacker culture. OSS is just a software development methodology, according to free software advocates like Richard M. Stallman and the Free Software Foundation. OSS has become an important phenomenon in the computer science world: there are thousands of OSS projects and millions of users of OSS systems, such as Linux, Apache or Python. It is often characterized as a fundamentally new way to develop software that poses a serious challenge to the commercial software business dominating most software markets today. It supports global cultural understanding. Programmers demonstrate creative thinking, construct knowledge, and develop innovative products and processes using OSS technology. Open source has become a key enabler for cloud computing by providing both cheap inputs (as in free) as well as rich capabilities to providers of cloud services. Educational institutions, faculties and other institutions develop software for educational and research purposes. Part of this software ends up in the group of proprietary software, while some of it, depending on its licensing parameters, belongs to the group of open source software.
Cloud Computing
In the age of Information and Communication Technology, libraries are using various types of technologies to aid the services they provide. Every day, new technological advances affect the way information is handled in libraries and information centers.
Cloud computing refers to computing resources (hardware and software) that are delivered as a service over a network, and it assists software professionals in organizing the so-called 'anarchy' of the Internet (Das & Mandal, 2013, p.394). Computing technology, communication technology, mobile technology and mass storage technology are some of the fields of continuous development that reshape the way that libraries access, retrieve, store, manipulate and disseminate information to users. The academic library has been from its inception an integral part of institutions of higher learning, rather than an appendix or adjunct. Libraries are facing many challenges in the profession due to applications of information technology. One of the latest technology trends in library science is the use of cloud computing for various purposes and for achieving economy in library functions. Cloud computing is a completely new information technology, known as the third revolution in IT after the PC and the Internet. Ghosh (2012) describes cloud computing as a set of technologies that allows computing applications and data to be exposed as a set of services from a pool of underlying resources in a facile manner. Cloud computing entrusts services with a user's data, software and computation over a network. It is a style of computing which essentially caters to computing needs with dynamism, abstraction and resource sharing.

Fig. 1. The Cloud Abstraction
The open source project is difficult to analyze as one abstract social phenomenon. In addition, it is difficult to define what is part of a project and what is not. Fortunately, we can consider and analyze open source projects thanks to their presence on the Internet and their public communications. Our libraries should focus on core competencies and should not worry about security, operating systems (OS), software platforms, updates, patches, etc.
Resource Sharing
Resource sharing is the most important benefit of cloud computing. It helps cloud providers attain optimum utilization of resources. The services provided by cloud computing can be broadly categorized into three stacks:
1. SaaS (Software as a Service)
2. PaaS (Platform as a Service)
3. IaaS (Infrastructure as a Service)
Cloud computing also incorporates Web 2.0 and other recent technology trends which share the common theme of reliance on the Internet for satisfying the computing needs of users.
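One rough way to see the difference between the three stacks is as a split of management responsibility between provider and consumer. The sketch below is an illustrative simplification, not drawn from any specific vendor; the layer names and the exact split are assumptions chosen for the example:

```python
# Simplified view of which layers of the computing stack the provider
# manages under each service model (illustrative assumption, not a standard).
STACK = ["networking", "storage", "servers", "virtualization",
         "os", "runtime", "application", "data"]

PROVIDER_MANAGES = {
    "IaaS": {"networking", "storage", "servers", "virtualization"},
    "PaaS": {"networking", "storage", "servers", "virtualization",
             "os", "runtime"},
    "SaaS": set(STACK),  # the vendor runs everything
}

def user_manages(model: str) -> set:
    """Layers left to the consumer under the given service model."""
    return set(STACK) - PROVIDER_MANAGES[model]
```

Under this toy model, an IaaS consumer still manages the operating system and everything above it, a PaaS consumer manages only the application and its data, and a SaaS consumer manages nothing below the application they use.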


Software as a Service (SaaS) - Software as a Service, or SaaS, is probably the most common type of cloud service development. With SaaS, a single application is delivered to many users from the vendor's servers. Customers do not buy the software; rather, they pay for using it. Users access the application via an API accessible over the web. Platform as a Service (PaaS) - In this model, the cloud provider supplies predefined building blocks and tools on which applications are assembled. It is a bit like creating an application using Legos; building the app is made easier by use of these predefined blocks of code, even if the resulting app is somewhat constrained by the type of code blocks available. Infrastructure as a Service (IaaS for short) is the foundation of cloud computing. Rather than purchasing or leasing space in an expensive datacenter, along with the labor, real estate, and utilities required to maintain and deploy computer servers, cloud networks and storage, cloud buyers rent space in a virtual data centre from an IaaS provider and access it via the Internet. This type of cloud computing provides raw resources that customers consume, including (but not limited to) CPU cores, RAM, hard disk or storage space, and data transfer. Example IaaS providers include ProfitBricks, Amazon EC2, and the Rackspace Cloud.
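Because IaaS is metered, a consumer's bill is simply consumed resources multiplied by unit prices. The following sketch illustrates this pay-per-use idea; the resource names and rates are entirely made up for the example and do not reflect any real provider's pricing:

```python
# Hypothetical unit prices (illustrative only, not any vendor's real rates).
RATES = {
    "cpu_core_hours": 0.04,     # price per core-hour
    "ram_gb_hours": 0.005,      # price per GB-hour of RAM
    "storage_gb_months": 0.10,  # price per GB-month of storage
}

def monthly_cost(usage: dict) -> float:
    """Sum metered usage (same keys as RATES) times the per-unit rate."""
    return round(sum(RATES[k] * v for k, v in usage.items()), 2)
```

For instance, a small virtual server with 2 cores and 4 GB of RAM running for a 720-hour month, plus 50 GB of storage, would be billed as `monthly_cost({"cpu_core_hours": 1440, "ram_gb_hours": 2880, "storage_gb_months": 50})` under these assumed rates. The point is the billing model, not the numbers: stop consuming and the cost stops.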

Fig. 2 Services provided by the Cloud

Dynamism

Libraries have a very dynamic environment today, with computing requirements fluctuating with demand. Demand, on the other hand, is unpredictable and inconsistent, leading to uncertainty about computing requirements.
Applicability Of Open Source Software And Cloud Computing In Our Libraries
Open source is the great enabler of cloud computing. From Google and Yahoo! to Amazon and eBay, the precursors of cloud computing utilized the freely available, freely modifiable nature of open source to build highly customized systems on a never-before-seen scale to power their web-based applications. It is the success of these massive-scale systems that has led to the rise of cloud computing, which is a generalization of the same techniques and technologies used by Google and others, enabling developers the world over to tap into a model of computing that would otherwise not be affordable (or even available). Because open source is fundamental to cloud computing, it is not surprising that the dominant software stacks used in cloud environments are also open source. Over the past few years, the open-source community, in all its different colours and hues, has demanded that the technology world pay attention to software freedom and its tendency to lower cost, improve interoperability, and more. This freedom is accomplished through open-source licenses like the GNU General Public License. Open source licenses are licenses that comply with the Open Source Definition; in brief, they allow software to be freely used, modified, and shared. To be approved by the Open Source Initiative (also known as the OSI), a license must go through the Open Source Initiative's license review process. The digital library, as we all know, is famous for its academic and technical influences, and IT technology has been the driving force of library development. What's more, librarians can keep using new technology to develop digital libraries and optimize library services.
By collecting large quantities of information and resources stored in personal computers, mobile phones and other equipment, cloud computing is capable of integrating them and putting them on the public cloud to serve users. Open source has taught us to expect openness by default. Kotwani and Kalyani (2013) discuss the synergies between Open Source Software (OSS) and cloud computing. The cloud makes a great platform on which OSS business models can be built, ranging from powering the cloud to offering services on it, and OSS has made an indelible imprint on cloud computing. It gave it life by providing the raw material upon which many private and public clouds are built. It gave it a conscience by setting the industry's default principle to openness. OSS can power the cloud infrastructure just as it has been powering on-premise infrastructure, letting cloud vendors minimize the total cost of ownership (TCO). A less discussed benefit of OSS for the cloud is the use of core algorithms such as MapReduce and Google Protocol Buffers, which are central to parallel computing and lightweight data exchange. There are hundreds of other open (source) standards and algorithms that are a perfect fit for powering the cloud.
Why Use Open Source Model For Librarians/Libraries?
Tristan (2011) explored in his research that integrated library systems (ILS) are multifunction, adaptable software applications that allow libraries to manage, catalogue and circulate their materials to patrons. In choosing ILS software, libraries must base their decision not only on the performance and efficiency of the system, but also on its fundamental flexibility to readily adapt to the future demands and needs of their patrons. When users can read, redistribute, and modify the source code for a piece of software, the software evolves. People improve it, people adapt it and people fix bugs. And this can happen at a speed that, if one is used to the slow pace of conventional software development, seems astonishing. OSS also helps in taking care of severe budget cuts, increased demand for services, lack of adequate staffing, etc. Hasan (2009) notes that OSS gives libraries the capability to create the software that we have always wanted - standards compliant, interoperable, extensible and scalable software that does what we want it to do. It helps customers find information quickly and conveniently, no matter where that information resides. Libraries choose open source because it gives them the freedom to use, change or distribute the software the way they want.
Advantages Of Open Source Software And Cloud Computing
Some advantages of open source software are:
The source code is available to users and they have the right to modify it. This allows improvements to the software without having to invest large sums of money in research and development.
The modified and improved source codes can be freely redistributed.
The software can be used in any way and for any legal purpose. There is no unilateral restriction on how the software may be used.
Open source solutions generally require no licensing fees. The logical extension is no maintenance fees. The only expenditures are for media, documentation, and support, if required.
Obtain the software once and install it as many times and in as many locations as you need.
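The MapReduce algorithm mentioned earlier as a core parallel-computing building block can be sketched in miniature. The toy, single-machine version below only illustrates the map, shuffle and reduce phases; real cloud systems distribute these same phases across many computers:

```python
# Miniature MapReduce word count (single-machine illustration of the pattern).
from collections import defaultdict

def map_phase(doc: str):
    """Map: emit a (key, value) pair for every word in a document."""
    for word in doc.lower().split():
        yield (word, 1)

def mapreduce(docs):
    """Shuffle groups emitted pairs by key, then reduce folds each group."""
    groups = defaultdict(list)
    for doc in docs:                       # in a cluster, docs are partitioned
        for key, value in map_phase(doc):  # across many mapper machines
            groups[key].append(value)      # shuffle: group values by key
    return {key: sum(vals) for key, vals in groups.items()}  # reduce
```

For example, `mapreduce(["open source", "open cloud"])` yields word counts `{"open": 2, "source": 1, "cloud": 1}`. The appeal for cloud providers is that map and reduce are independent per key, so the work parallelizes naturally.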

In general, Linux and open source solutions are elegantly compact and portable, and as a result require less hardware power to accomplish the same tasks as on conventional servers (Windows, Solaris) or workstations.
Potential Pitfalls
Cloud with open source software considerably reduces vendor lock-in and interoperability issues, but such risks are not eliminated. Some of the open source platforms need to embrace open standards fully, and not all of them will do so. Open source based clouds that are community driven suffer problems due to lack of solid vendor support. Some kind of organization, society or association should be formed to take care of the management and progress of such community-driven projects. Security concerns regarding data and content are a constant threat to clouds, whether they are open source or proprietary. Movement of open source based clouds to proprietary clouds is always possible.
Some Examples Of OSS And Cloud For Libraries

Eucalyptus
Eucalyptus is free and open-source computer software for building Amazon Web Services (AWS)-compatible private and hybrid cloud computing environments, marketed by the company Eucalyptus Systems. Eucalyptus enables pooling compute, storage, and network resources that can be dynamically scaled up or down as application workloads change.

Nimbus
Nimbus Platform is an integrated set of tools that deliver the power and versatility of infrastructure clouds to scientific users. Nimbus Platform allows you to combine Nimbus, OpenStack, Amazon, and other clouds.

Abiquo
Abiquo provides an enterprise software solution that enables organizations to leverage their existing virtualized infrastructure to drive a dramatically better quality of service to all key stakeholders, including IT operations, application and development teams, and CFO/compliance groups.

Deltacloud
Deltacloud is an application programming interface (API) developed by Red Hat and the Apache Software Foundation that abstracts the differences between cloud computing implementations and protects applications from incompatibilities.
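The abstraction idea behind Deltacloud can be sketched with a pair of hypothetical driver classes. These classes and their string formats are invented for illustration and are not the real Deltacloud API; the point is that application code programs against one neutral interface while each driver hides a provider's peculiarities:

```python
# Illustrative provider-abstraction sketch (hypothetical drivers, not Deltacloud).
from abc import ABC, abstractmethod

class CloudDriver(ABC):
    """Neutral interface every provider-specific driver must implement."""
    @abstractmethod
    def start_instance(self, image: str) -> str: ...

class FooCloudDriver(CloudDriver):
    """Pretend provider A, with its own native response format."""
    def start_instance(self, image: str) -> str:
        return f"foo:{image}:running"

class BarCloudDriver(CloudDriver):
    """Pretend provider B, with a different native format."""
    def start_instance(self, image: str) -> str:
        return f"bar-{image}-UP"

def launch(driver: CloudDriver, image: str) -> str:
    """Application code talks only to the neutral interface."""
    return driver.start_instance(image)
```

Swapping `FooCloudDriver` for `BarCloudDriver` changes nothing in `launch`; that is the incompatibility-shielding the text describes.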

Puppet
Puppet is an open source configuration management tool designed by Puppet Labs. Puppet is IT automation software that helps system administrators manage infrastructure throughout its lifecycle, from provisioning and configuration to orchestration and reporting. Using Puppet, you can easily automate repetitive tasks, quickly deploy critical applications, and proactively manage change, scaling from tens of servers to thousands, on-premise or in the cloud.
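The declarative, idempotent style popularized by tools like Puppet can be illustrated with a small sketch. This is an illustration of the idea only, not Puppet itself: the tool compares desired state with current state and applies only the missing changes, so re-running it is always safe:

```python
# Illustrative convergence loop (the idea behind declarative config tools).
def converge(current: dict, desired: dict) -> list:
    """Bring `current` in line with `desired` in place; return the keys
    that actually had to change. A second run returns an empty list."""
    changes = []
    for key, value in desired.items():
        if current.get(key) != value:  # only touch what has drifted
            current[key] = value
            changes.append(key)
    return changes
```

Running `converge` twice with the same desired state changes nothing the second time; that idempotence is what makes it safe to apply the same manifest to tens or thousands of servers on every run.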

Funambol
Funambol is powered by open source and provides personal cloud solutions. It is the leading mobile cloud sync solution. The Funambol open source project consists of a free and open source sync server that provides PIM (address book and calendar) synchronization and device management for wireless devices, leveraging standard protocols such as SyncML. The open source project is also a development platform for mobile applications: it provides C++ and Java client APIs and server-side Java APIs, and facilitates the development, deployment and management of mobile projects.

Drupal
Drupal is open source content management software which is currently behind millions of websites and applications around the globe. An amazing feature of Drupal is that it is built, used and supported by an active and diverse community of people around the world, making it truly an epitome of open source.

Joyent


Joyent offers IaaS and PaaS for large enterprises. Joyent Inc is a software and services company based in San Francisco, California. The company specializes in application virtualization and cloud computing.

Zoho Zoho supports businesses with applications that help increase sales, support customers and make the business more productive, including applications for customer relationship management.

Future of Libraries with OSS and Cloud
When libraries, educational institutions and organizations go on the cloud, they might run advertisements on their pages to earn revenue, use a photo cloud to store users' pictures, and keep only login data on their own servers. In this way they can earn some profit from advertisements and might one day share and distribute their data with their user-based cloud. The future is bright and dynamic; there is considerable scope when OSS meets the cloud. In India, a recent news article highlighted that most upcoming websites, businesses and innovations are the brain-children of students and working professionals based in small towns. Small cities are becoming centres of innovation and experimentation. This is good news: it shows that the Internet and technology are finally working towards bringing equality. We hope that OSS with cloud will bring huge benefits in terms of services, which will in turn encourage more library professionals to opt for entrepreneurship instead of traditional jobs in the academic field.

Conclusion
The effectiveness of cloud computing has already been demonstrated for some applications; more work should be done on identifying new classes of novel applications that can only be realized using cloud computing technology. With proper instrumentation of potential applications and the underlying cloud infrastructure, it should be possible to quantitatively evaluate how well these application classes perform in a cloud environment. Along the same lines, experimental software engineering research should be conducted to measure how easily new cloud-based applications can be constructed relative to non-cloud applications that perform similar functions.
This research should also compare the dependability of similar cloud and non-cloud based applications running in production environments. Application-focused research will help organizations make well-informed business decisions on where to apply cloud technology, and will give cloud technology developers guidance on what kinds of improvements to make. Open source software has emerged as one such potential area, offering companies, institutions and organizations cost cutting with the added advantage of flexibility. Cloud deployments also save money and make computing services less of a burden for professionals; together they offer academic institutions, libraries and organizations of all sizes a practical way to cut costs, run better and improve their prospects of survival. OSS has much potential for libraries and information centers, and there are numerous projects, including Koha, NewGenLib, Greenstone, DSpace, Ganesha, etc., that demonstrate its viability in this context. It gives library staff an option to be actively involved in development projects, and this involvement can take many forms, such as reporting bugs, suggesting enhancements and testing new versions.

References:

1. Das, A.K. & Mandal, S. (2013). Development of cloud computing in integrated library management and retrieval system. International Journal of Library and Information Science, 5(10), 394-400. Retrieved from http://www.academicjournals.org/IJLIS
2. Deb, S. (2006). TERI integrated digital library initiative. Electronic Library, 24(3), 366-379. Retrieved from www.emeraldinsight.com
3. Edwards, K. (2005). An economic perspective on software licenses: Open source, maintainers and user-developers. Telematics and Informatics, 22(1-2), 111-133.
4. Ghosh, S. (2012). How worthy is cloud computing for libraries. 8th Convention PLANNER, 419-422. Retrieved from www.ir.inflibnet.ac.in/handle/1944/1694
5. Hansen, M. et al. (2002). The open source approach: Opportunities and limitations with respect to security and privacy. Computers and Security, 21(5), 461-471.
6. Hasan, N. (2009). Issues and challenges in open source environment with special reference to India. 266-271. Retrieved from www.
7. Kotwani, G. & Kalyani, P. (2013). Applicability of open source software (OSS) with cloud computing. International Journal of Inventive Engineering and Science (IJIES), 1(10), 4-10.
8. Lochhaas, S. & Melissa. (2010). Open source software libraries. Retrieved from Graduate College SLIS B Sides 17.
9. Padhy, S.K. & Mahapatra, R.K. (2012). Cloud computing: Academic library in Orissa. VSRD-TNTJ, 3(3), 124-130. Retrieved from www.vsrdjournals.com
10. Randhawa, S. (2008). Open source software and libraries. 369-377.
11. Roets, M. et al. (2007). Open source: Towards successful systems development projects in developing countries. Proceedings of the 9th International Conference on Social Implications of Computers in Developing Countries, São Paulo, Brazil.
12. Scacchi, W. (2007). Free/open source software development: Recent research results and emerging opportunity. Retrieved from http://www.ics.uci.edu/~wscacchi
13. Scacchi, W. (2012). Understanding the requirements for developing open source software systems. Retrieved from http://www.ics.uci.edu/~wscacchi
14. Tristan, M. (2011). How to choose a free and open source integrated library system. 27(1), 57-78. Retrieved from www.emeraldinsight.com/1065-075X.htm
15. http://www.opensource.org/
16. http://www.unesco.org/
17. http://slis.uiowa.edu/~slochhaas/osslibraries/
18. http://hrushikeshzadgaonkar.wordpress.com


GWALIOR COLLEGE LIBRARY NETWORK (GWACLIBNET): AN INITIAL STEP FOR THE PROPOSED MACLIBNET

Dr. Anil K. Sharma

Abstract
The most important objective of networking libraries is to maximize the access and availability of resources at minimum cost. No library, information or resource center in the world can hope to become self-sufficient in resources to meet the multidimensional needs of its users. The rapid explosion of knowledge, the changing pattern and growth of publications, the price escalation of information resources and the ever increasing expectations of users have compelled library and information professionals to think of networking and resource sharing. In this light, the paper intends to provide an overall idea of library networking and its importance, with special reference to the proposed MACLIBNET. For reference, the author also mentions the different library networks in India. The paper highlights the various issues and challenges to be encountered at the time of establishing the proposed library network. The author proposes to build a full-fledged MACLIBNET in a phased manner, with a network among the college libraries in Gwalior, M.P. as the initial step of the first phase. The paper also discusses infrastructure requirements such as hardware and software, network connectivity and network providers, the ISO-10160 and 10161 ILL protocols, a union catalogue, holding lists, a web based inter-library photocopy request system, etc. The author has also surveyed selected college libraries regarding their existing hardware, software and manpower, and interprets their compatibility with the proposed library network. In the author's view, the initiative of the Government of M.P. towards building digital libraries in the colleges of M.P. may be a boon for the establishment of GWACLIBNET. Finally, some suggestions are given for the success of GWACLIBNET.
Keywords: Library network, College library network, Union catalogue, GWACLIBNET, MACLIBNET

1. Introduction
No library, information or resource center in the world can hope to become self-sufficient in resources to meet the multidimensional needs of its users. The rapid explosion of knowledge, the changing pattern and growth of publications, the price escalation of information resources and the ever increasing expectations of users have compelled library and information professionals to think of networking and resource sharing. The most important objective of networking libraries is to maximize the access and availability of resources at minimum cost through collaborative acquisition, avoidance of duplication, a consortia approach to costly e-books, e-journals and databases, and maximum utilization of local resources. As there is no library network in the M.P. region, it is expected that the proposed GWACLIBNET, and ultimately MACLIBNET, will enable library

professionals to provide better services, and that the user community will get required information in a timely manner from library and information centers irrespective of physical location, facilitating the free flow of knowledge in this region. The immediate beneficiaries of this library network will be the institutions of higher education that are planning and implementing different development activities. They will be more cooperative in knowledge and resource sharing and will develop the capacity to tackle impediments to their development programs by following the approaches of others who have succeeded.

2. Library and Information Networks in India
India is a developing country working towards uplifting the socio-economic conditions of its citizens by forming different telecommunication, information and library networks throughout the country. Library networks in India have developed in various directions: country-wide networks, metropolitan area networks, networks for sectoral facilities, etc. Some of the library networks in India are as follows:

Network Name                               Acronym          Year
Calcutta Library Network                   CALIBNET         1986
Developing Library Network                 DELNET           1988
Scientific and Industrial Network          SIRNET           1990
Information and Library Network            INFLIBNET        1991
Bombay Library Network                     BONET            1992
Madras Library Network                     MALIBNET         1993
Ahmadabad Library Network                  ADINET           1994
Mysore Library Network                     MYLIBNET         1995
Bangalore Library Network                  BALNET           1995
Education and Research Network             ERNET            1998
Management Library Network                 MANLIBNET        1998
Pune Library Network                       PUNENET          1999
REC/NIT Library Network                    RECNET/NITNET    2005

3. Aims and Objectives of the GWACLIBNET
- To share the resources available in the participating libraries, and thus to evolve a network of all the college libraries of the region.
- To build a union catalogue, a union list, a database of serial articles, a CD-ROM database, a union list of audio/video materials, a database of theses and dissertations, etc. of the college libraries, and to provide better access to them.
- To encourage every college to build an institutional repository, and to provide better access to it through GWACLIBNET.
- To build a database of indigenous knowledge resources and provide access to it.
- To establish gateways or portals providing online access to the various databases available through national and international networks.
- To promote collaborative acquisition and collaborative cataloguing, and to avoid duplication in acquisition as much as possible.
- To promote resource sharing and document delivery service.


- To extend to members the facility of interacting through remote login with the databases available in electronic form.
- To develop skilled manpower for handling e-resources in a network environment.
- To promote consensus building among academic and development stakeholders and to set priorities on which research initiatives should focus.
- To establish a regional library network called MACLIBNET.

4. Issues and Challenges
In order to establish GWACLIBNET and achieve the above aims and objectives, some issues and challenges have to be addressed:
- Who will take the initiative in bringing all the concerned college libraries onto one platform?
- What will be the policies and guidelines for such a network?
- How far will the existing hardware and software be compatible with the proposed GWACLIBNET?
- From where will the financial requirements be met?
- What will be the architecture of the network?
- Who will be the network provider?
- Are the existing library staff motivated towards the use of ICT in libraries, and do they have the necessary skills?
- How will the manpower and their skills be developed for operating and handling ICT applications in a network environment?

5. Hardware and Software Requirements
The minimum hardware requirement is a dual processor 500 MHz server-class machine with 1 GB RAM and 120 GB of hard disk space; a 10/100 Network Interface Card (NIC); a database backup solution; a UPS with at least two hours of battery backup; a CD/DVD-ROM drive; and a network compatible scanner, printer, etc. The minimum software requirement is Windows 2008 Server or Windows 2000 Service Pack 2 with security updates; Internet Explorer 7.0 or a higher version; Terminal Service Pack 2; Internet Information Server; Microsoft SQL Server 7 or higher; library application software; MS Office; and Visual Studio packages (Suresh, 2011, p. 72).

6. Feasibility Study of GWACLIBNET
It has long been the aspiration of library professionals in the M.P. region that a library network be established here. However, no such bold initiative has so far been taken. Somewhere, someone has to start the work, and it is proposed to start with the college libraries, because college libraries have some advantages in this regard. A survey was conducted among the college libraries in Gwalior city (21 libraries) to ascertain the status of automation, the availability of hardware and software, network connectivity, etc. and to assess the feasibility of GWACLIBNET.

Table 6.1 : Status of Automation of College Libraries in Gwalior City

S. No.  Particulars                            No. of Libraries and %age
1       College libraries started automation   12 (57.1%)
2       Cataloguing completed 80%-100%          5 (23.8%)
3       Cataloguing completed 60%-80%           1 (4.8%)
4       Cataloguing completed 40%-60%           2 (9.5%)
5       Cataloguing completed 20%-30%           2 (9.5%)
6       Cataloguing completed below 30%         2 (9.5%)
7       Automation under process                9 (42.9%)

Table 6.1 reveals that, of the 21 college libraries, 12 (57%) have already started automation work, almost 38% have completed at least 50% of their cataloguing, and about 50% have automated their cataloguing and circulation work. Table 6.3 shows that 38% of the college libraries have their own server and use Windows server software. They have a sufficient number of clients to continue the automation work, and most of the libraries use SOUL 2.0 for automation.

Table 6.2 : Area of Automation of College Libraries in Gwalior City

S. No.  Particulars         No. of Libraries and %age
1       Cataloguing         12 (57.1%)
2       Circulation         10 (47.6%)
3       Serial Control       3 (14.3%)
4       Acquisition         Nil
5       Report Generation    5 (23.8%)
6       Budgeting           Nil

Table 6.3 : Hardware and Software Available in College Libraries in Gwalior City

S. No.  Particulars                             No. of Libraries and %age
1       Server with Intel Xeon processor         8 (38.1%)
2       Computer clients: more than 5            3 (14.3%)
3       Computer clients: 3 to 5                 5 (23.8%)
4       Computer clients: below 3                4 (23.8%)
5       Access to OPAC                          12 (57.1%)
6       Server software: Windows 2008            4 (19.0%)
7       Server software: Windows 2007            1 (4.8%)
8       Server software: Windows 2003            2 (9.5%)
9       Server software: Windows 2000            1 (4.8%)
10      Application software: SOUL 2.0          12 (57.1%)

Table 6.4 reveals that the libraries that have stepped forward for automation have power backup facilities. Table 6.5 shows that almost 48% of the libraries have network connectivity, and all of them have access to a library network, i.e. INFLIBNET (N-LIST).

Table 6.4 : Power Backup Facility in College Libraries in Gwalior

S. No.  Particulars                                 No. of Libraries and %age
1       Generator (common to the college campus)    10 (47.6%)
2       UPS                                         12 (57.1%)

Table 6.5 : Network Connectivity Available in College Libraries in Gwalior

S. No.  Particulars                                   No. of Libraries and %age
1       BSNL leased line                               1 (4.8%)
2       BSNL broadband                                 9 (42.8%)
3       V-SAT                                         Nil
4       Access to library network: INFLIBNET N-LIST   10 (47.6%)

From the above discussion it can be inferred that:
- A major portion of the college libraries have already started the automation work.
- Most of the libraries have completed at least 50% of the retrospective data conversion.
- The college librarians have the skills to work in an automated environment.
- The college librarians in M.P. have a common platform (i.e. GCLA) to discuss all matters relating to library networking.
- The college libraries have the minimum infrastructure needed to participate in the library network; only minor upgrades may be required.
- All the college libraries use the same application software for library management, i.e. SOUL.
- Almost 50% of the college libraries have network connectivity and also have access to a library network, i.e. INFLIBNET.
- Almost all the librarians are interested in participating in the proposed GWACLIBNET.
- The college libraries in M.P. are under the same umbrella, i.e. the Department of Higher Education, Government of M.P., and the University Grants Commission, Government of India; therefore there should be no obstacle to reaching a consensus to participate in the proposed GWACLIBNET.

7. Proposed Plan for GWACLIBNET
The member libraries will be linked with the GWACLIBNET host system through telephone connectivity or V-SAT links. The member libraries will have their own LAN setups and will be routed through six zonal centers. A database center (host machine) will be developed for creating the union catalogue, records of journals, etc., and will also provide work space for data handling, system software, communication software, and the application software used by the member libraries. This database will be regularly updated and accessible online 24x7 to the users of the member libraries, who will log in through their respective zonal centers. The LAN switch interfaces all the terminal points of the zonal center and will be configured depending upon the nodes required for the purpose.
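The union catalogue that the database center would host can be sketched in Python. The merge below is illustrative only; the ISBN key, library codes and record shape are assumptions for the example, not part of the proposal.

```python
# Toy union-catalogue build: merge member-library records keyed on a
# common identifier (ISBN, as an assumption), accumulating holdings
# so users can see which member libraries hold a given title.

def build_union_catalogue(member_catalogues: dict) -> dict:
    """member_catalogues maps library-code -> {isbn: title}.
    Returns isbn -> {'title': ..., 'held_by': [library codes]}."""
    union = {}
    for library, records in sorted(member_catalogues.items()):
        for isbn, title in records.items():
            entry = union.setdefault(isbn, {"title": title, "held_by": []})
            entry["held_by"].append(library)
    return union

# Hypothetical member records from two Gwalior college libraries:
members = {
    "GWL01": {"9780132350884": "Clean Code"},
    "GWL02": {"9780132350884": "Clean Code",
              "9780262033848": "Introduction to Algorithms"},
}
union = build_union_catalogue(members)
```

A title held by both libraries appears once in the union catalogue, with both library codes in its holdings list, which is exactly the deduplication a union catalogue is meant to provide.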
The router will preferably be provided by the V-SAT service provider. The RAS (Remote Access Server) will have a minimum of 10 ports, to allow a minimum of 10 simultaneous logins. Later, the capacity of the RAS can be increased in line with traffic.

8. Suggestions


- MPCLA (Madhya Pradesh College Library Association), and particularly the Gwalior GCLA (Gwalior College Library Association), may take the initiative for the establishment of GWACLIBNET.
- The national agencies and organizations related to higher education in the M.P. region, such as the UGC, UGC-INFLIBNET (under the PLANNER programme), the Ministry of Education and the Government of M.P., may be approached for financial aid.
- A core body may be formed to look after all aspects of GWACLIBNET.
- NIC may be approached for technical help as a consultant agency.
- BSNL may be approached as network provider.
- A technical committee may be formed with experts from the library profession, NIC, BSNL, etc. to sort out and formulate uniform guidelines on techniques, procedures and methods, network topology, network protocols, etc., in order to facilitate the pooling, sharing and exchange of resources.
- The college librarians and college authorities should consider the technical aspects of library networking when implementing the proposed digital library project in the colleges of M.P.
- A thorough study should be made of the guidelines, protocols, etc. of the existing library networks in India, such as DELNET, INFLIBNET, MALIBNET, BONET, etc.

Conclusion
In the present age, knowledge is considered power. Redefining and reengineering library and information systems is thus the need of the modern era, and the most important change in this regard is access to knowledge resources through library networks and/or library consortia. Hence it is proposed to establish a network of college libraries, GWACLIBNET. The success of the proposed library network will depend on proper planning and appropriate decisions taken by the authorities. At the same time, much effort must be made by the leading library professionals of this region towards the motivation and development of well trained professionals who can work in the automated collaborative environment. The M.P. College Librarians Association (MPGCLA), particularly the Gwalior zone, should take the initiative with the help of the other zones.

References :
1. Indian academic library consortia (IALC): A proposal for electronic resource sharing (n.d.). Retrieved from http://eprints.rclis.org/bitstream/10760/8156/1/CRIMEA_2002.pdf
2. Rajput, P.S., & Naidu, G.H. (2008). Library in a networking environment. Library Herald, 46(2), 138-149.
3. Singh, S.P. (2008). Library networking. New Delhi: Omega Publication. (pp. 179-84)
4. Sinha, M.K. (2011). Design & development of regional knowledge network for North Eastern Region (RKNNER): A proposal. In K.C. Satpathy & R. Ramachandran (Eds.), Networking of library and information centres in digital era: Problems and prospects (pp. 19-35). Jaipur: National Library, Kolkata.
5. Sinha, Manoj Kumar, & Satpathy, K.C. (2004). Library automation and networking for managing library and information services. Indian Journal of Information Library & Society, 17(4), 118-130.
6. Suresh, Lata. (2011). Networking of libraries in Rajasthan: Vision for future. In K.C. Satpathy & R. Ramachandran (Eds.), Networking of library and information centres in digital era: Problems and prospects (pp. 67-74). Jaipur: National Library, Kolkata.


CONCEPT OF CLOUD COMPUTING IN LIBRARIES

Dr. M. Anandamurugan

Abstract
Cloud computing technology has come up as a boon for libraries. Besides the cloud initiatives undertaken by the IT giants, there are a sizable number of initiatives relevant to libraries, undertaken by organizations and business houses that are in the business of integrated library software, digital libraries, federated search, website hosting, library automation, etc. The paper presents an overview of cloud computing and its possible applications that can be clubbed with library services in a web based environment. This study may be helpful in identifying and generating cloud based services for libraries.
Key words: Cloud computing, Public cloud, Private cloud, Hybrid cloud

Introduction
Cloud computing is not a new technology that suddenly appeared on the web; it is a new form of computing. It facilitates sharing resources and services over the Internet rather than keeping these services and resources on local servers, nodes or personal devices. The combination of servers, networks, connections, applications and resources is defined as the 'cloud'. Cloud computing acts as a resource pooling technology for accessing practically unlimited computing services and resources on demand, and can be compared with pay-as-you-use or utility models such as those used for mobile services and electricity consumption. According to Wikipedia, the concept of cloud computing emerged in the 1960s, when John McCarthy opined that computation may someday be organized as a public utility. Chellappa gave the first academic definition of the term cloud computing in 1997; the term came into popularity in this context in 2007, when Kevin Kelly opined that eventually we will have the inter-cloud, the cloud of clouds.
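The pay-as-you-use comparison with electricity billing can be made concrete with a small arithmetic sketch; all rates and hours below are hypothetical figures chosen for illustration.

```python
# Utility-model arithmetic: like metered electricity, pay-per-use
# billing charges only for the hours actually consumed, instead of
# a fixed monthly cost for an always-on server.

def pay_per_use_cost(hours_used: float, rate_per_hour: float) -> float:
    """Metered cost: hours actually consumed times the hourly rate."""
    return hours_used * rate_per_hour

# A library service needed only 200 compute-hours this month, at an
# assumed rate of Rs. 6 per hour:
monthly_cost = pay_per_use_cost(200, 6.0)
```

At 200 hours the metered bill is Rs. 1200; a library whose workload is bursty pays only for those hours, which is the economic appeal the utility model rests on.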
NIST provides a very good definition: 'cloud computing is a model for enabling convenient, on-demand network access to a shared pool of configurable computing resources (e.g., networks, servers, storage, applications, and services) that can be rapidly provisioned and released with minimal management effort or service provider interaction.' Buyya defined it thus: 'Cloud computing is a parallel and distributed computing system consisting of a collection of inter-connected and virtualized computers that are dynamically provisioned and presented as one or more unified computing resources based on Service Level Agreements (SLA) established through negotiation between the service provider and consumers.' The common characteristics of cloud computing noticed in the above definitions are:
- Pay per use (no ongoing commitment, utility prices)
- Elastic capacity and the illusion of infinite resources

- Self-service interface
- Resources that are abstracted or virtualized

Review of Literature
As a major application model in the era of the Internet, cloud computing has been a significant research topic of the scientific and industrial communities since 2007 (Qi and Gani, 2012). Cloud computing has generated a great deal of interest and competition in the industry, and it was recognized as one of the top 10 technologies of 2010 (Tripathi and Mishra, 2011; Sharma, 2012). It is the next generation in computation. Perhaps clouds can save the world; possibly people can have everything they need on the cloud. It is the next natural step in the evolution of on-demand information technology services and products: a style of computing in which IT users access technology-enabled services from the Internet (i.e., the cloud) without knowledge of, expertise with, or control over the technology infrastructure that supports them (Mirzaei, 2008). According to the National Institute of Standards and Technology (NIST) definition (Mell and Grance, 2011), cloud computing is a model for enabling ubiquitous, convenient, on-demand network access to a shared pool of configurable computing resources (e.g., networks, servers, storage, applications, and services) that can be rapidly provisioned and released with minimal management effort or service provider interaction. Cloud computing is an Internet based service delivery model which provides Internet based services, computing and storage for users in all markets, including finance, health care and government (Sharma, 2012). Virtualization technologies promise great opportunities for reducing energy and hardware costs through server consolidation. Moreover, virtualization can optimize resource sharing among applications hosted in different virtual machines to better meet their resource needs.
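The server-consolidation idea behind those savings can be illustrated with a toy first-fit packing heuristic in Python; this is a sketch only, with invented CPU-share figures, and real hypervisor schedulers also weigh memory, I/O and migration costs.

```python
# First-fit consolidation sketch: pack VM resource demands (here a
# single CPU-share number per VM) onto as few physical hosts as
# possible, so idle hardware can be switched off.

def first_fit(vm_demands, host_capacity):
    """Assign each VM to the first host with room left.
    Returns a list of hosts, each a list of the VM demands placed on it."""
    hosts = []
    for demand in vm_demands:
        for host in hosts:
            if sum(host) + demand <= host_capacity:
                host.append(demand)
                break
        else:
            # No existing host has room: power on a new one.
            hosts.append([demand])
    return hosts

# Six lightly loaded VMs consolidate onto two hosts instead of six:
packing = first_fit([0.4, 0.3, 0.5, 0.2, 0.3, 0.2], host_capacity=1.0)
```

Here six separate lightly used machines collapse onto two well utilized hosts, which is the energy and hardware saving the passage describes.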
As a result, more and more computing can be conducted in shared resource pools that act as private and public clouds (Sahu and Tiwari, 2012).

Definitions
Cloud computing is typically defined as a type of computing that relies on sharing computing resources rather than having local servers or personal devices handle applications. In cloud computing, the word cloud (also phrased as "the cloud") is used as a metaphor for "the Internet," so the phrase cloud computing means "a type of Internet-based computing," where different services, such as servers, storage and applications, are delivered to an organization's computers and devices through the Internet. Cloud computing is comparable to grid computing, a type of computing in which the unused processing cycles of all computers in a network are harnessed to solve problems too intensive for any stand-alone machine.

How Cloud Computing Works

188

The goal of cloud computing is to apply traditional supercomputing, or high-performance computing power, normally used by military and research facilities, to consumer-oriented applications: performing tens of trillions of computations per second for tasks such as managing financial portfolios, delivering personalized information, providing data storage or powering large, immersive computer games.

Cloud Computing Functioning
In cloud computing, users simply focus on the services they would like to use, and need not worry about how the software that provides these services is implemented.

A cloud computing foundation comprises data centers (servers, storage and networking), the business/library applications and middleware, virtualization software and, of course, operating systems. The entire workload shifts to the cloud, i.e. local computers are no longer burdened with running hundreds of applications. All that users need on their side is interface software, such as a simple web browser.

189

Cloud Types
To use cloud computing effectively, it should be understood that not everything on the Internet is cloud computing. The main deployment models are:
1. Public Cloud
2. Private Cloud
3. Hybrid Cloud
4. Community Cloud

(a) Public Cloud
A public cloud is an external cloud provided by a service provider; it refers to resources (hardware, software, applications) that the provider offers over the Internet, e.g. email. It is the basic and oldest pay-as-you-go model: one pays only for the services one consumes. The benefits are many:
1. No need to manage the underlying IT infrastructure;
2. No security patches or updates to apply;
3. No software updates.
The service provider looks after all of these.

(b) Private Cloud
A private cloud enables an enterprise to deliver services to its users faster and more effectively. It provides better control over the entire processing chain, which helps reduce costs, improves response times and provides greater flexibility. VMware, Microsoft, IBM and Sun provide private cloud offerings; open source implementations include Eucalyptus and Ubuntu Enterprise Cloud.

190

(c) Hybrid Cloud
A hybrid cloud is a cloud computing environment that consists of internal and external providers, i.e. a mix of private and public clouds: secure and critical applications are hosted on the private cloud, while less critical applications are hosted on the public cloud. This combination behaves as a single entity, with portability for data and applications. Examples include cloud bursting with Amazon Web Services' VPC (Virtual Private Cloud), combining a private cloud with Amazon Web Services, and Microsoft Windows Azure AppFabric.
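The cloud-bursting pattern mentioned above can be sketched as a greedy placement function in Python; the workload names and capacity units below are hypothetical, chosen only to illustrate the overflow behaviour.

```python
# Cloud-bursting sketch: keep workloads on the private cloud until its
# capacity is reached, then overflow ("burst") the remainder to a
# public cloud.

def place_workloads(workloads, private_capacity):
    """Greedy placement: private cloud first, public cloud as overflow.
    'workloads' is a list of (name, load) pairs."""
    placement = {"private": [], "public": []}
    used = 0
    for name, load in workloads:
        if used + load <= private_capacity:
            placement["private"].append(name)
            used += load
        else:
            placement["public"].append(name)
    return placement

# Hypothetical library workloads against a private capacity of 100 units:
jobs = [("opac", 30), ("digital-library", 50), ("batch-indexing", 40)]
placement = place_workloads(jobs, private_capacity=100)
```

The first two jobs fit on the private cloud; the batch-indexing job would exceed its capacity, so it bursts to the public cloud, which is exactly the hybrid behaviour the section describes.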

(d) Community Cloud
A community cloud is implemented when a set of organizations has similar requirements and shares a common context; it is made available to that select set of organizations. For example, central and state governments may share one, or the National Library of India (Kolkata) might set up a community cloud for all the public libraries that share common goals and requirements. It is a sort of private cloud, but one that goes beyond a single organization.

Possible Effects on Libraries

- Cost saving
- Flexibility and innovation
- Cloud OPAC and cloud ILS
- Private, hybrid and community clouds
- No need to build and manage your own data center

Cloud Computing in Indian Libraries
In India, cloud computing in libraries is in the development phase. Libraries are trying to provide users with cloud based services, but in a real sense they have not been fully successful, owing to the lack of good service providers and of technical skill among LIS professionals in managing libraries with advanced technology. Some services, however, such as digital libraries, web documentation and Web 2.0 technologies, are running successfully. Good examples of successful cloud computing for libraries include DuraCloud, OCLC services and Google based cloud services. Nowadays many commercial as well as open source (OSS) vendors are clubbing cloud computing technology into their services and products. Cloud computing technology is thus not yet fully accepted in Indian libraries, but they are trying to develop themselves in this area.

Conclusion
Cloud computing is the future of computing and an important aspect of IT (Information Technology). The cloud market will definitely occupy the whole IT world. In the field of library services, cloud computing will definitely be a key player, as financial constraints are common in Indian libraries. This study has presented cloud computing concepts and the implications of cloud based applications for libraries, in order to enhance their services in a more efficient manner. No doubt libraries are at present moving towards cloud computing technology and taking advantage of cloud based services, especially in building digital libraries, social networking and information communication, with manifold flexibilities; but some issues related to security, privacy, trustworthiness and legal matters are still not fully resolved.
It is therefore time for libraries to think seriously before clubbing library services with cloud based technologies, so that they provide reliable and rapid services to their users.

References

Kroski, E. (2009). Library cloud Atlas. A guide to cloud computing and storage stacking the Tech. Library Journal, Retrieved April 11, 2011, from http://www. Library Journal.com /article/CA 6695772 Peter, Chris. (2010). what is Cloud Computing and How will it Affect Libraries. Retrieved April 23, 2011, from http://www.Techsoupforlibraries.org/blog/what-is-cloud-computing-and-how-will-it-affect-libraries. Chawla, M. (2010). Demystifying Cloud Computing. Retrieved April 23, 2011, from http://www.pcquest.com/content/search/showarticle1.asp?arid=122142. Cloud computing. Retrieved January 23, 2013, from http://en.wikipedia.org/wiki/ Cloud_computingMell, P. & Grance, T. (2009). Effectively and securely using the cloud computing paradigm. Retrieved January 24, 2013, http://csrc.nist.gov/organizations/fissea/2009-conference/presentations/fissea09-pmell-day3_cloudcomputing.pdf

192

5.

6.

Buyya, R., Yeo, C.S., Venugopal , S., Broberg, J. & Brandic, I. (2009). Cloud computing and emerging IT platforms: Vision, hype, and reality for delivering computing asthe 5th utility. Future Generation Computer Systems, 25, 599-616. Khan, S., Khan, S., & Galibeen, S. (2011). Cloud computing an emerging technology: Changing ways of libraries collaboration. International Research: Journal of Library and Information Science, 1(2).

193

IMPACT OF INFORMATION TECHNOLOGY IN LIBRARY SERVICES: AN OVERVIEW
Archana Yadav
Abstract
Information has always been a prime factor in the development of society and is often regarded as a vital national resource. The growth of information, and the dependency on it, paved the way for the information society and subsequently the knowledge society. Information has become an important part of our lives and should be available when needed. Information services are generated using new tools and techniques to connect the right users to the right information. Information now plays a vital role in human development, much as air is essential for the survival of all living organisms on earth. The velocity of change brought about by new information technologies has a key effect on the way people live, work, and play worldwide. This paper discusses the fast development of Information Technology and its impact on library services. Today libraries are equipped to deliver new Information Technology based services. Information Technology enabled services fulfill the information needs of the users at the right time, in the right place, to the right person.
Key Words: Information, Knowledge society, Velocity of change, IT enabled services, Library services, Survival, Information Technology.
Introduction
Information is the key factor in any kind of research and development. The information itself, and the way it is accessed, have undergone changes owing to developments in information and communication technology. It is a vital ingredient for the socioeconomic and cultural development of any nation or individual. According to Kemp, information is a basic need of man, ranking after air, water and food, and its importance cannot be overstressed. Quick and easy access to all required information is of supreme importance, especially for libraries.
Uwaifo, Stephen Osahon (2006) observes that information technology applications and techniques are being used by libraries for information processing, storage, communication, dissemination of information, automation, etc. Further, the origin of the Internet and the development of the World Wide Web revolutionized information communication technology. Recognizing the advantages of applying information technology, libraries must provide these facilities to their user community (Dabas, C., 2008). Chin defines information as the imaginative works of mind which are communicated formally and/or informally in any format; technology refers to scientific and industrial methods and their practical use in industry. This explains why some

refer to it simply as practical science. Langley and Shain define information technology as the handling of numerical information by a micro-electronic-based combination of computing and telecommunications. Where does this information that is so vital to human life come from? An in-depth study of how information is generated would be a difficult task, but it can safely be concluded that research is one of the better known areas where information takes root. Most of what we know today is a result of research. The work of experts in the fields of science, technology, social science and the humanities continues to give birth to information that is beneficial to the whole society. The government, understanding the major role that R&D plays, also continues to pour funds into these fields, as a result of which more and more information is generated, so much so that the world is being bombarded with information.
Information Technologies
Technology began to transform libraries in the 1950s with microfilm and in the mid-1960s with the Xerox machine. Computerized databases were developed in the 1970s and offered more information and better ways to search and obtain it. Information technology is a general term for the various technologies involved in the processing and transmission of information. It has also been defined as the application of computers and technologies to the acquisition, organization, storage, retrieval and dissemination of information. The British Department of Industry defines Information Technology as the acquisition, processing, storage and dissemination of vocal, pictorial, textual and numerical information by a microelectronics-based combination of computing and telecommunications. Kumar, PSG (2003) pointed out that a wealth of management issues must be addressed:

analysis of library specifications and requirements; a delineation of library goals and objectives; management commitment and support; and ongoing user education. She listed some questions which must be addressed in any consideration of the application of IT: What are its key capabilities, advantages, and limitations in terms of library goals? What are effective planning strategies and techniques? What are the technological requirements and alternatives? These questions are further explained thus: Capabilities: The key capabilities have to do with weighing the advantages against the limitations to determine the final selection. Planning: In planning for IT implementation, the following should be determined.

Components Of Information Technology
Technological change is becoming a driving force in our society. Information technology is a generic term used for a group of technologies. The following are the major components of information technology most relevant to the modern library and information system:
1. Data
2. Information
3. Computers
4. Networking
5. Hardware and Software
6. Mass storage

Application of Information Technology in the Library
The library is the main information centre, and it can make good use of fast-developing IT in library activities, operations and other library services for the collection, processing, storage, retrieval and dissemination of recorded information. Fast developing information technologies have reached almost every area of application, including libraries. In libraries, they are of good use in the following environments:
a) Library Management: Library management includes the following activities, which will certainly be geared up by the use of these fast IT developments: classification, cataloguing, indexing, database creation and database indexing.
b) Library Automation: Library automation is the concept of reducing human intervention in library services so that any user can receive the desired information with maximum comfort and at the lowest cost. Major areas of automation can be classified into two: organization of all library databases, and all housekeeping operations of the library.
c) Library Networking: Library networking means a group of libraries and information centres interconnected in some common pattern or design for information exchange and communication, with a view to improving efficiency.
d) Audio-Video Technology: This includes photography, microfilms, microfiches, audio and video tapes, printing, optical disks etc.
e) Technical Communication: Technical communication consists of technical writing, editing, publishing, DTP systems etc. (Dabas, C., 2008)

Impact of Information Technology in the Library
IT has a wide-ranging impact on library and information work. Information activities have undergone rapid transformations from conventional methods, consequent upon the introduction of new technologies.
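As a toy illustration of the library-management activities listed above (classification, cataloguing, indexing and database creation), the sketch below builds a tiny in-memory catalogue with a subject index. The `SimpleCatalogue` class, its record fields and the sample titles are invented for illustration and do not correspond to any real library system:

```python
# Minimal sketch of automated cataloguing and subject indexing.
# The record structure and class name are illustrative assumptions only.

class SimpleCatalogue:
    def __init__(self):
        self.records = []        # the catalogue "database"
        self.subject_index = {}  # subject -> list of record positions

    def add_record(self, title, author, subjects):
        """Catalogue a new item and index it under each of its subjects."""
        position = len(self.records)
        self.records.append({"title": title, "author": author, "subjects": subjects})
        for subject in subjects:
            self.subject_index.setdefault(subject.lower(), []).append(position)

    def search_by_subject(self, subject):
        """Return the titles indexed under the given subject (case-insensitive)."""
        positions = self.subject_index.get(subject.lower(), [])
        return [self.records[p]["title"] for p in positions]

catalogue = SimpleCatalogue()
catalogue.add_record("Cloud Computing Basics", "A. Author", ["Cloud Computing", "IT"])
catalogue.add_record("Library Automation", "B. Author", ["IT", "Libraries"])
print(catalogue.search_by_subject("IT"))
```

A real automated library system would of course persist records in a database and support author, title and series indexes in the same way.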

Digital libraries
The digital library service environment is defined as a networked online information space in which users can discover, locate, acquire access to, and increasingly use information, with no distinction as to information format. The identity of a digital library is the way the library discloses, provides access to, and supports the use of its increasingly virtual collection. Managing, administering, monitoring and ensuring fair use of its collection are part of the mix, as well as keeping up with new technologies to support education and cultural engagement, so that the library can evolve and sustain itself.
Virtual reference
While virtual reference is a great convenience for library patrons, its implementation introduces several challenges for traditional libraries. Most individual institutions don't have the staffing levels necessary to monitor an IM service regularly enough for the service to be attractive to users. For real-time virtual reference, many libraries are part of chat cooperatives or consortia, some of which are able to offer reference services 24 hours a day.
Libraries in a cyber world
Rapidly deteriorating materials create an urgent need to focus on preservation, particularly in research libraries. Preservation decisions often fall into two major categories: selection and medium. Problems of selection are best answered by a team of both librarians and scholars, who employ criteria that are three-fold: collection-, subject-, and usage-based. Problems of medium, on the other hand, are more subjective. Microfilm is quite stable and durable, but access is somewhat limited. Digitization, while alleviating access problems, poses concerns about cost, instability, and hardware and software change.
Cost of adoption
Libraries have been transformed and modernized by the application of information technology. Users no longer have to go to libraries; they have the opportunity to retrieve information via the Internet.
Because of all the new technology being introduced to libraries, library administrators are forced to break down the budget in order to make wise decisions about long-term benefits for the library. If a library administrator follows a cost structure model, he or she will have more success determining the direct and indirect costs.
Online Public Access Catalogues (OPAC)
The OPAC is the computerized version of the traditional library catalogue; it is a finding tool used to search for information sources by author, subject, title, and series title. In a study sponsored by the Council on Library Resources in the United States (1982), the researchers found that OPACs received strong acceptance from library patrons and staff; the predominant search approach used was the subject approach, and the respondents


confirmed the ease of use of OPAC interfaces and the self-explanatory displays of bibliographic information.
RFID
Radio frequency identification (RFID) is a term used for technologies that use radio waves to identify individual items automatically. The most common approach is to store a serial number identifying the item, and related information, on a microchip attached to an antenna. RFID is used much like bar codes (Mehrjerdi, Yahia Zare, 2011).
Electronic services and e-resources
Many libraries are being persuaded to move towards digital e-resources, which are found to be less expensive and easier to access. This is especially helpful to distant learners who have limited time to visit libraries, giving them Internet access to commonly available electronic resources, mainly CD-ROMs, OPACs, e-journals, e-books, ETDs and the Internet, which are replacing print media.
Advantages of information technology include:
- Easy integration of different library activities.
- Collaboration and creation of library networks.
- Avoiding repetition of effort within a library.
- Increasing the range of services offered.
- Saving the time of the users.
- Increased efficiency.
- Speedy and easy access to information.
- Improved quality of library services.
- Enhanced knowledge and experience.
- Integration within the organization.
- Improved status of the library.
- Improved communication facilities.
- Greater stability.
- Helping to attract users.
- Remote access for users.
- Round the clock access for users.
- Access to unlimited information from different sources.
- More up to date information.
Conclusion
Utilization of information technology in present-day libraries makes it possible to obtain the right information at the right time, in the right place, and at the right price. Information technology plays an increasing role in the development of library services as an active response to the challenges of information service provision.
Information technology has broken down worldwide restrictions; new tools and methods help us provide better services to our societies.

References
1. Information technologies: Their impact on Library Services. In: Technology in modern era: Libraries and Librarians in New Millennium. New Delhi: Commonwealth, 1999, pp. 65-72.
2. Publications, 2002, pp. 1-2.
3. IT applications for TQM and Library marketing, p. 74.
4. Kannappan Kumar festschrift: Library and Information Profession in India, Vol. 1, Part 2. Delhi: B.R. Publishing Corporation, 2004, pp. 612-617.
5. Information Technology: Basic Concepts, p. 17.
6. Proceedings, 26, 1974, p. 87.
7. 1, 2011.
8. Annals of Library and Information Studies, Vol. 53, March 2006, pp. 15-17.
9. December 2007, pp. 190-194.
10. Ess Ess Publications, 2008, p. 21.
11. Soper, M. Association, Chicago, 1990, p. 2.
12. Delhi: Kaveri Books, 2004, p. 130.
13. p. 42.
14. News, No. 2, 2006, pp. 17-21.
15. http://en.wikipedia.org/wiki/Online_public_access_catalog
16. http://en.wikipedia.org/wiki/Union_catalog
17. http://lis.sagepub.com/content/26/1/23.short
18. http://en.wikipedia.org/wiki/Image_scanner

EXPLORING SMART DIGITAL DIFFERENCES IN LIBRARIES
Bankapur (V M), Ramesh Patil, Sanjivkumar Bakanetti
Abstract
The article deals with the challenge for libraries of adopting technology in a smart way and thereby changing their dimensions. It stresses the need to study advances seen elsewhere, through the case studies and analysis enumerated here, so that they may help library and information professionals and librarians at large.
Introduction
Technology has changed with the times, and necessities are added with innovations in the way man needs them. The constant challenge with technology is adaptability: how the system has to adjust to change. This is happening everywhere and in every sphere of knowledge. It seems technology has invaded us and snatched our souls; going by this proverb, we all have to accept that there is no alternative to technology. We should use it for the better and build a new systems approach. A system may be anything which has a complex set of elements. Library systems are one such entity, and they have served human needs and the thirst for knowledge for ages. Today the challenge is to see how libraries must perform and change to improve their services according to the preferences of their users and remain relevant. Libraries still play a vital role in making information comfortable to use in any format. Smart gizmos (devices) are also being added to library systems, and libraries can provide information in abundance where it is really needed. Let us look into how these libraries are changing and what technologies have been used in the library world. Today library products are becoming more helpful, providing users with feedback and opportunities to adjust searches with a variety of searching tools.
Libraries should reap the benefits of technology and use it extensively, exploring smart digital products and services, differentiating themselves and creating value for libraries.
Harvesting with technology
Today it is the world of information; more precisely, we give prominence to digital or electronic information. The whole world has changed because of the form information takes (which we call born digital). Hence the very essence of this change has made libraries look in a different direction, or be smart enough to go digital. Let us work on the issues of the changes happening with relevance to libraries.
1. Information forms: Electronic information, or e-information, has lately been invading libraries. All resources are available in digital forms, from books to journals and all other related sources. The challenge for libraries is how to store and structure these information sources.

- Storage: Huge hardware storage capacities are needed, which enable larger participation of library groups in shared systems that manage collections and enable access for their communities.
- Structuring e-resources: Metadata structures have become very important for all libraries and may be expressed in different markup or programming languages; for example, MARC and Dublin Core may be expressed in plain text, HTML, XML and RDF. With many metadata schemes emerging today, structuring e-content is a new digital skill to be developed.
- Access to information: When automation is the preference of the library, access tools are widely used for any information search on the web and in apps. Smart widget applications navigate users to remote sites and match records to a query. Many such library searching techniques have been developed.
- Content development: Publishing systems generate information about collections from metadata, the scholarly literature and a variety of in-house databases.

2. Hardware
- Server for the library: Hardware-based solutions are typically implemented using RAID (a redundant array of independent disks), which uses an intelligent drive controller and a redundant array of disk drives to help protect against data loss in case of media failure and to improve the performance of read and write operations. A disk array is an effective disk-storage solution for computers that are running SQL Server.
- Scanners: CCD wedge scanners or laser scanners are used for borrowing purposes.
- Barcodes and barcode printers; QR codes have now emerged as smart phones spread.
- Next generation access readers: Wi-Fi gadgets, Kindle readers, Fire tablets etc.
- Zinio services: the world's largest newsstand, from Rutherford Library, accessible through the library card, tablets and smart phones. Many new services can be designed to quell the thrust of digital differences and presences.
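To make the metadata structuring mentioned under "Structuring e-resources" concrete, here is a minimal sketch that serializes a Dublin Core record as XML using only the Python standard library. The Dublin Core element namespace is the real one; the sample field values are invented:

```python
# Build a simple Dublin Core record in XML using the standard library.
# The namespace URI is the official Dublin Core element set; the values are made up.
import xml.etree.ElementTree as ET

DC_NS = "http://purl.org/dc/elements/1.1/"
ET.register_namespace("dc", DC_NS)  # serialize with the conventional "dc:" prefix

record = ET.Element("record")
for element, value in [("title", "Cloud Computing in Libraries"),
                       ("creator", "Example Author"),
                       ("subject", "Library Science"),
                       ("format", "text/html")]:
    child = ET.SubElement(record, f"{{{DC_NS}}}{element}")
    child.text = value

xml_string = ET.tostring(record, encoding="unicode")
print(xml_string)
```

The same record could equally be expressed in plain text, HTML meta tags or RDF, which is the point the bullet above makes: the metadata scheme and the markup carrying it are separate choices.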

3. Software: Software has changed the working of libraries, supported also by open sources, smartly disseminating information. Fig. 1 shows such new products, which are invading libraries and changing them into a digital storehouse of information and knowledge. Each software application blends with the needs of the users, offering many useful remedies for libraries.



1. Trinity Factor: E-information, or content, is one important commodity which takes on a different form with technology, in ways not evident in earlier formats. With the development of technology it has become clear and obvious that libraries are leaping forward, as these trends are expected to shape the future generation of libraries. Perceiving the importance of technology use, particularly applications, is becoming a reality. Dr S. R. Ranganathan always stressed the trinity factors, for a library is not just a place for collection building but a place for connecting people, place and platforms:
BOOKS / INFORMATION (information forms): Digital information is the next-generation content for the library. Design varieties of digital content and accessibility.
STAFF: The library staff role is changing: designing services, communicating through social networks, popularizing the art of promoting services on social networks and building a group of users.
USERS: Users always play a vital role in accepting social network technology, or SNS (collation, collaboration and connections).
Conclusion
Exploring the use of technology and the variations of its use and applications has changed with the times, and necessities are added with innovations in the way man needs them.

Smart gizmos (devices) are also being added to library systems, and libraries can give information in abundance where it is really needed. Libraries are changing, and the technologies used are customized to the needs of the users. The three important factors referred to by Dr S. R. Ranganathan demand adaptability, as technology is most opted for by the users. It is hoped that emerging digital technologies have lessened the digital differences, and the survival of libraries is being redefined.

References
1. Meredith G. Farkas (2007). Social Software in Libraries: Building Collaboration, Communication, and Community Online. Information Today.
2. Oldenburg, Ray (1991). The Great Good Place. New York: Marlowe & Company.
3. Knowe House, Mumbai.
4. Casey, M.E. and Savastinuk, L.C. Library 2.0: Service for the next generation library. Library Journal.
5. Bailin, Kylie (2011). Changes in academic library space: A case study at the University of New South Wales. Australian Academic & Research Libraries.
6. www.rutherfordlibrary.org
7. /explore-materials/zinio-digital-magazines-for-your-computer-smart-phone-or tablet (referred on 24th Dec 2014)


CLOUD COMPUTING TECHNOLOGY IN LIBRARIES
Neha Kanojia, Mrs. Deepika Raj, Mrs. Avinash Kaur
Abstract
Cloud computing is a modern technique for any business or organization. The present paper describes the concept of cloud computing, its models, advantages, characteristics, security issues, and its application in library and information science. Users who rely on Google services are already engaged in cloud computing, whether they realize it or not. The paper deals with the intellectual challenges around cloud computing, the debate revolving around the "cloud" element of cloud computing, cloud models, and the implications of cloud computing in libraries.
Keywords: Cloud computing, Models, Reliability, Security, Libraries, Alliance of cloud computing with libraries, Library cloud initiatives.
Introduction
Cloud computing is a mega change of techno-salvation that has relieved IT of its traditional obligations and empowered end users with on-demand utility computing. The raft of the cloud has sailed past the consciousness of the people and has become part of their virtual living strategy. Users who have had the experience of using Web 2.0 services like Wikipedia, Blogger and Flickr have already experienced cloud computing, maybe unknowingly. Cloud computing is envisaged as the new computing paradigm of the upcoming decade, virtualizing hardware and software resources. Since cloud computing has emanated with its own set of standards, nomenclatures and practices, it requires clear understanding for its acceptance in libraries and the realization of cloud libraries in the future. Reports used in the literature may be verbal, but in the vast majority of cases reports are written documents. The types of scholarship may be empirical, theoretical, critical/analytic, or methodological in nature. Second, a literature review seeks to describe, summarize, evaluate, clarify and/or integrate the content of primary reports.
Definition: A literature review can be defined as a description of literature related to a certain topic within a certain field. This description includes an overview of the following: the main theories and hypotheses; scholarly opinions about the topic; the names of some reputable researchers in the field, etc. A literature review can be organized in different ways. It may be either a descriptive piece of writing or a critical assessment of literature. Yet, keep in mind that even a descriptive

literature review should not be a mere list of the scholarly works; it requires your comments and opinions. (Ref no. 6)
Data confidentiality issues
Cloud computing allows users to store their information on remote servers, which means content such as user data, videos, tax preparation charts etc. can be stored with a single cloud provider or with multiple cloud providers. When users store their data on such servers, data confidentiality is a requirement. Storing data on remote servers also raises privacy and confidentiality issues for individuals, businesses, government agencies, etc. Some of these issues are mentioned below:
1. The privacy of personal information, and the confidentiality of business and government information, have significant implications in cloud computing.
2. The terms of service and privacy policy established by the cloud provider are key factors that affect users significantly.
3. Privacy and confidentiality rights, obligations and status may change when a user discloses information to a cloud provider, depending on the information type and the category of cloud computing user.
4. The legal status of protections for personal or business information may be greatly affected by disclosure and remote storage.
5. The location of information may have considerable effects on privacy and confidentiality protections, and on the privacy obligations of those who process or store the information.
6. Information in the cloud may have multiple legal locations at the same time, with differing legal consequences.
7. A cloud provider can examine user records for criminal activity and other matters according to law.
8. Determining the status of information in the cloud, along with the privacy and confidentiality protections available to users, can be difficult due to legal uncertainties.
In addition, to maintain confidentiality, users should understand the data and its classification, be aware of which data is stored in the cloud, and know what levels of accessibility govern each piece of data. (Ref. 8, 9)
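One practical consequence of the points above is that a library should know the sensitivity class of each piece of data before placing it in the cloud. The sketch below illustrates the idea of a classification-driven storage policy; the class names and the policy itself are illustrative assumptions, not a standard:

```python
# Decide whether a data item may be stored with a public cloud provider,
# based on a simple, illustrative sensitivity classification.

POLICY = {
    "public": True,        # e.g. catalogue records, open bibliographies
    "internal": True,      # e.g. circulation statistics, with a trusted provider
    "confidential": False, # e.g. patron personal information stays on-premises
}

def may_store_in_cloud(item):
    """Apply the policy; unknown or missing classes default to the safest answer."""
    classification = item.get("classification", "confidential")
    return POLICY.get(classification, False)

items = [
    {"name": "catalogue record", "classification": "public"},
    {"name": "patron address list", "classification": "confidential"},
]
print([(i["name"], may_store_in_cloud(i)) for i in items])
```

The defensive default (treat unclassified data as confidential) mirrors point 8: when the legal status of the data is uncertain, the safer choice is to keep it out of the cloud.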
Cloud Computing
Cloud computing is a most important and beneficial technology for the new era. Cloud computing comes into libraries with the help of IT and ICT applications. The term means that computers deliver infrastructure, applications and business processes, with the computing power of personal collaboration, and it describes all types of IT


technology. Cloud computing is a set of hardware, software, shared resources, networks and information which are available on demand to computers and other devices.
Characteristics Of Cloud Computing
A. Dispatch mechanism
Cloud based services are painless IT solutions where the user hardly has to bother installing or updating the application or software being delivered. Moreover, these services are primarily accessible on demand, based on a one-to-many delivery model.
B. Services supervision
Cloud based services are supervised from one or more central locations instead of at each user site, so users do not need to do anything, which removes the obligation to download patches and upgrades frequently.
C. Metered services
Cloud resource usage must be measured by integrating telemetry as part of the service offering. This should allow service providers to change billing plans from time to time without changing the metering software. It allows users to access and use resources by paying a monetary value for the duration of their usage.
D. Self-healing
Cloud computing offers uninterrupted services to clients by maintaining online backups of the data. Among the appealing features of cloud computing are its inherent resilience to failures and the availability of cloud services on demand.
Types of Cloud Computing
Cloud-based applications provide a range of solutions to a very large number of users. To help analyze and describe cloud-based systems, many people refer to a cloud solution in terms of its deployment model and its service model.
Cloud Deployment Models
A. Community: The cloud computing technology is shared by two or more organizations, companies, schools or universities.
B. Hybrid: A cloud computing technology that consists of two or more private, public, or community cloud servers.


C. Private: Owned by a specific entity and normally used only by that entity or one of its customers. The underlying technology may reside on- or off-site. A private cloud offers increased security at a greater cost.
D. Public: Available for use by the general public. It may be owned by a large organization or company offering cloud services. A public cloud is usually the least expensive solution. (Ref. 1)
Cloud Service Models
A cloud can interact with a client in a variety of ways, through capabilities called services. Across the web, some major types, or models, of services have emerged, described below:
A. Cloud Software as a Service (SaaS): Software as a service (SaaS) is a solution model in which users use a web browser to access software that resides, along with the programs and user data, in the cloud. Organizations that use SaaS solutions eliminate the need for in-house applications, administrative support for the applications, and data storage.
B. Cloud Platform as a Service (PaaS): The platform as a service (PaaS) model provides a collection of hardware and software resources that developers can use to build and deploy applications within the cloud. Using PaaS, developers eliminate the need to buy and maintain hardware, as well as the need to install and manage operating system and database software.
C. Cloud Infrastructure as a Service (IaaS): The infrastructure as a service (IaaS) model makes all of the computing hardware resources available; the customers, in turn, are responsible for installing and managing the systems, which they can normally do over the Internet.
D. Cloud Identity as a Service (IDaaS): Identity management is difficult, time consuming, and expensive. Over the past few years, companies have begun to emerge that provide identity as a service (IDaaS), or cloud-based ID management. (Ref. no. 2)
Reliability and Security in Cloud Computing
These are the two major issues in cloud storage.
To ensure reliability, cloud storage providers use redundancy techniques. As security is of prime concern, cloud storage providers use encryption, authentication and authorization techniques, or a combination of these.
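The redundancy technique mentioned above can be sketched as simple replication: every object is written to several independent stores, and a read succeeds as long as any one replica survives. The in-memory dictionaries below stand in for real storage servers, so this is only an illustration of the principle, not a real provider's implementation:

```python
# Sketch of redundancy through replication: write to all replicas,
# read from the first replica that still holds the object.

stores = [{}, {}, {}]   # three independent (simulated) storage servers

def put(key, value):
    """Write the object to every replica."""
    for store in stores:
        store[key] = value

def get(key):
    """Return the object from the first surviving replica."""
    for store in stores:
        if key in store:
            return store[key]
    raise KeyError(key)

put("report.pdf", b"contents")
stores[0].clear()        # simulate the failure of one server
print(get("report.pdf")) # still retrievable from a surviving replica
```

Real providers refine this idea with erasure coding and quorum reads to reduce the storage overhead, but the fault-tolerance argument is the same: data is lost only if every replica fails.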


A. Encryption: Encryption systems can perform various operations on the encrypted data without knowing the private key (that is, without decryption); the client is the only holder of the secret key.
B. Authentication: A valid user ID and password are required to authenticate cloud users.
C. Authorization: Defines the list of people who are eligible to access the information stored in the cloud. Multiple levels of authorization can also be defined.
What Cloud Computing Means For Libraries
Cloud computing has played a role in transforming libraries and information technology organizations. This begins with a history of technology adoption in libraries and concludes with an exploration of current trends in cloud computing and how these trends are beginning to transform libraries.
Alliance of Cloud Computing With Libraries
Most institutions and companies are going to move to the cloud, which will eliminate the dependency on desktops. There is intellectual contestation over varied interpretations of the implications of cloud computing in libraries. The web has also expanded the scope of services provided by librarians. Cloud computing and web collaboration are two major concepts that underlie new and innovative developments in library automation. Cloud services allow for more optimal resource utilization, easier access, and more effective cost reduction. The growing Internet usage among library users, plus the time users spend on the Internet, has made it imperative for libraries to offer their services online. Today's information consumers have more alternative and attractive ways of finding information than the traditional library, through Internet tools and services such as web search engines and e-resources, which require library services to be studied and redesigned. Cloud-based new generations of ILS allow many libraries to share useful data, for instance the sharing of full-text journal titles from electronic databases.
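The authentication and authorization mechanisms described under Reliability and Security above can be sketched with the Python standard library: authentication with a salted password hash, authorization with a per-resource access list. A real cloud service would use a vetted identity system; the user table, resource names and iteration count here are invented for illustration:

```python
# Sketch of authentication (salted password hash, PBKDF2) and
# authorization (per-resource access list) for a cloud storage service.
import hashlib
import hmac
import os

users = {}  # username -> (salt, password_hash)
acl = {}    # resource -> set of authorized usernames

def register(username, password):
    """Store a salted hash of the password, never the password itself."""
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    users[username] = (salt, digest)

def authenticate(username, password):
    """Recompute the hash and compare in constant time."""
    if username not in users:
        return False
    salt, stored = users[username]
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return hmac.compare_digest(candidate, stored)

def authorized(username, resource):
    """Authorization is a separate check, applied after authentication."""
    return username in acl.get(resource, set())

register("alice", "s3cret")
acl["budget.xlsx"] = {"alice"}
print(authenticate("alice", "s3cret"), authorized("alice", "budget.xlsx"))
```

The separation matters: authentication establishes who the user is, while authorization decides what that user may touch, and a cloud provider must get both right before encryption of the stored data even comes into play.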
Many libraries invest in IT infrastructure for various online as well as subscription-based services. With these successes, libraries are motivated to use subscription-based IT infrastructure in the Cloud. The adoption of Cloud computing by many organizations, including OCLC, OhioLINK, SirsiDynix, and the Library of Congress, suggests that this mode of computing will have a significant impact on the configuration, the economics, and perhaps the personnel requirements of library computing. Libraries are well suited to Cloud computing given their service-oriented mission and their need to find appropriate solutions using limited resources. There are many reasons why Cloud computing is becoming common. Technologically, we use Cloud computing because we can and it is convenient. Economically, it is cost effective and pocket friendly.


Libraries may be governed by policies and regulations that dictate how they can use Cloud-based solutions. On the other hand, new cataloguing rules will replace AACR2 and be implemented in the future, and the new Cloud-based environment is going to change the practice of traditional reprographic services offered in libraries.
Cloud Computing Initiatives In Libraries:
Cloud computing has been recognised as a legitimate area of research and application. Its implications are scattered across a wide range of disciplines. Some of the notable initiatives exclusively in libraries are listed below:
A. OCLC's Web scale: OCLC's Web scale is an exemplary Cloud computing solution for libraries. OCLC has long offered libraries global accessibility to its ready-made database of catalogues stored in the Cloud.
B. Ex Libris Cloud: Ex Libris is a leading provider of library automation solutions for print, electronic and digital documents that caters to the library's needs of resource description, management and distribution. Some of the products of Ex Libris offered in the Cloud are:
a) Primo discovery and delivery
b) An integrated library system
c) SFX scholarly linking
d) Metalib meta searching
e) Verde e-resource management
f) Digital asset management tools
g) Alma

C. OSS Labs: OSS Labs exploits Amazon's Elastic Cloud computing platform to provide hosting and maintenance services for Koha (an open source integrated library system) and DSpace (open access scholarly and published repository software) to subscribing libraries.
D. DuraCloud: An open source platform that offers on-demand storage and services in the Cloud to library users and staff.
E. Shared Academic Knowledge Base plus (KB+): A shared service that offers its facilities over the Cloud, with libraries or users accessing it through their desktop PCs or devices having a Web browser across the Internet.

F. 3M Library System: The 3M Cloud Library offers a user-friendly, flexible e-book lending system. Users can browse, read and borrow the titles of their interest. Through the latest mobile technology it offers users the facility to explore and borrow e-books from anywhere at any time.
Our Information Technology Environment
The information technology (IT) landscape of libraries has changed dramatically over the past 20 years. In the 20 years since the advent of the web, libraries have almost ubiquitously adopted internet access and have become key players in the provision of internet services to their communities. During this time libraries have redirected services both to support in-house internet use and to serve the needs of users via digital means.
IT Services And Cloud Computing In Libraries:
The rapid pace of change in libraries is affecting IT service needs in both the library and the parent organization. It is common for decisions about IT platform adoption and administration to be the purview of IT departments that primarily serve the parent organization rather than the library. This can mean that important IT decisions are made at an organizational rather than departmental level, including selection of server hardware, operating systems, and software stacks, all of which influence software selection in other areas of the organization. This situation has the potential to limit the solutions libraries can adopt for new digital services. At the same time, libraries are adopting systems that include large-scale data management components and are turning to a wider range of information service providers. Systems such as electronic resource management systems and recommender services are key examples of data-rich services being implemented as new Data as a Service (DaaS) solutions.
How Cloud Computing Responds To Current IT Services:
The idea of cloud computing subsumes a number of previous concepts, including grid computing, data centres, and web services.
While each of these types of computing maintains distinct definitions and service approaches, cloud computing providers are beginning to demonstrate how a cloud-centric approach differentiates itself from traditional approaches. Related to these ideas are two concepts that are often used together in cloud computing: a web service, a software system designed to support interoperable machine-to-machine interaction over a network, and an API, which can be thought of as a set of standardized methods that enables access to a software program or system.
Overview of Cloud Computing and Virtualization:
Cloud computing and virtualization are complementary technologies which, although technically very different, are often seen as similar approaches to solving a common problem. A decision to either virtualize an IT service or deploy it in a cloud environment may come down to issues of service scale, organizational factors, or institutional capacity rather than to issues of the technologies themselves. The technical definitions of virtualization and cloud computing are very different from the operational definitions of outsourcing and services hosting. Cloud-based approaches such as software as a service (SaaS) include SaaS-style services hosting as provided by library IT vendors. These approaches enable the creation of an IT service infrastructure that more closely matches the diverse suite of providers and platforms that is typical in libraries.
Advantages:
Easy and faster deployment of computing
Users pay only for what they use
Low monthly payments
Offers the latest services and updated software
Dynamic scalability
Simplified maintenance
Diverse platform support
Unlimited storage capacity
Data backup and restoration support
High computing power
Location and device independence
Disadvantages:
Security
Performance
Integration with in-house IT
Lack of Cloud regulations
Dependence on internet connectivity
Loss of control in the Cloud
Trust in the Cloud service provider
Conclusion:
Cloud computing has generated a great deal of buzz in technology circles in general and within the library community in particular, for good reason. Cloud computing is all about virtualized web-based services providing a painless and economic computing solution. The advantages of cloud computing include flexibility, ease of use, cost savings on hardware, and possible time savings for staff, which would allow technology staff to concentrate on tasks more closely related to the library mission than the maintenance of servers.
References:

1. Corrado, Edward M. & Moulaison, Heather Lea (2013). Getting Started with Cloud Computing. New Delhi: Ess Ess Publications. pp. 2-5. In-text references: (Corrado & Moulaison, 2013)
2. Jamsa, Kris (2013). Cloud Computing: SaaS, PaaS, IaaS, Virtualization, Business Models, Mobile, Security, and More. Burlington: Jones & Bartlett Learning. pp. 1-5. In-text references: (Jamsa, 2013)
3. Mitchell, Erik T. (2013). Cloud-Based Services for Your Library. New Delhi: Ess Ess Publications. pp. 3-8.


In-text references: (Mitchell, 2013)
4. Yuvraj, Mayank (Jan-Jun 2013). Seeding the Ideas of Cloud Computing in Libraries. Asia Pacific Journal of Library and Information Science, 3(1), pp. 85-92. In-text references: (Yuvraj, Jan-Jun 2013)
5. http://ieeexplore.ieee.org/xpl/login.jsp
6. http://www.sjsu.edu/faculty/weinstein.agrawal/urbp298_phI_litreview_hodge.df
7. http://www.deakin.edu.au/library/findout/research/litrev.php
8. Wilson, Piers (2011). Positive perspectives on cloud security. Information Security Technical Report.
9. Jaeger, P.T., Lin, J., & Grimes, J.M. (2008). Cloud computing and information policy: Computing in a policy cloud? Journal of Information Technology & Politics, 5(3), 269-283.


IMPORTANCE OF DISTANCE LEARNING AND ROLE OF LIS PROFESSIONALS
Anamika Shrivastava, Renu Saxena, Varsha Sahu
Abstract
Distance learning is a way of learning remotely without being in regular face-to-face contact with a teacher in the classroom. The majority of distance education today takes place using the Internet, now readily accessible for the vast majority of students whether in their own homes or at facilities such as local libraries. These electronic means are used to distribute the learning material, keep students in touch with teachers, and provide access to communication between students. This paper discusses distance learning, its importance, and the role of LIS professionals in fulfilling distance learners' demands.
Key words: Distance Learning, Importance of Distance Learning, Role of LIS Professionals, Skills for LIS Professionals
Introduction
Distance learning is a type of education, typically college-level, where students work on their own at home or at the office and communicate with faculty and other students via email, electronic forums, video conferencing, chat rooms, bulletin boards, instant messaging and other forms of computer-based communication. Most distance learning programs include a computer-based training (CBT) system and communications tools to produce a virtual classroom. Because the Internet and World Wide Web are accessible from virtually all computer platforms, they serve as the foundation for many distance learning systems. Distance learning, sometimes called e-learning, is a formalized teaching and learning system specifically designed to be carried out remotely by using electronic communication. Because distance learning is less expensive to support and is not constrained by geographic considerations, it offers opportunities in situations where traditional education has difficulty operating.
Students with scheduling or distance problems can benefit, as can employees, because distance education can be more flexible in terms of time and can be delivered virtually anywhere.
Literature Review
Chutima Sacchanand (2002) described the changing distance education environment, the characteristics of distance students in higher education, and their problems in using library resources and services. Distance education has been moving very fast from correspondence education to online education or web-based delivery of education. Education librarians have much more critical roles to play in supporting the distance education system in the new learning environment.

Hermosa, N.N. and Anday, A.G. (2008) found that online library support includes access to library resources and services, the interaction of the library with its users and other information providers, and the ways these library resources and services can be delivered to online learners. Library service is indeed an essential component of a quality online learning system. As access to Internet-based courses grows, an increasing number of e-learners are dispersed around the globe, often in parts of the world where physical access to the collections of large academic and research libraries is impossible.
Judy Block (2010) focused on how the digital divide affects distance education. Block concluded that the solution to the issue of digital inclusion is one of working together to create open education and bridge the technological divide. Library administrators should provide programming to patrons to develop their skill levels, such as workshops on searching the Internet. Administrators must do everything in their power to bridge the digital divide, and can also address the skills divide by instituting educational programs intended to build competency in searching the Internet.
Oladokun, O. (2014) believed that the distance mode of educational delivery has become a very trendy method of popularizing education, with distance learners found everywhere and anywhere, in metropolitan and non-metropolitan areas as well as any other environment. For the purposes of this review, the information environment is structured into the information needs, information sources and information-seeking behavior of distance learners. Under the umbrella of the information environment of distance learners, the review was extended to examine the various studies that have been carried out on the information needs and information (seeking) behavior of distance learners.
Distance education is a method of study pursued by students who are physically separated from their tutors and institution of instruction for the greater part of their study (Watson, 1992). The distinguishing characteristic of distance education compared with other forms of education is the physical separation that exists between the students, their tutors and the institution of instruction.
Distance Learning
In the past decade, distance education has become an increasingly popular way for colleges to provide access to their programs and for students to learn about topics and get degrees they might not otherwise be able to pursue. Instructors from grade school to college are using the potential of distance learning to teach students from all around the globe and allow them to work collaboratively on projects, degree-focused content and educational enrichment. Distance learning can use other technological formats as well, including television, DVDs, teleconferencing, and printable material, but the immediacy and functionality of Web learning has made it a first choice for many distance learners. Online programs often take advantage of a number of emerging technologies to make keeping in touch and effectively communicating ideas easier and more efficient than ever before, and students may find themselves using interactive videos, e-mail, and discussion boards to complete their lessons. Some key points of distance learning:

These materials are produced by the university, college or learning provider and are either sent directly to the student or, more usually today, accessed via the internet. Tutorial support is provided via a virtual learning environment, telephone, email or other electronic means. There may be occasional face-to-face encounters with tutors and attendance at week-long summer schools.
Importance of Distance Learning
Distance learning, also known as online education, is a viable option for many individuals of all ages who desire to get an education. It holds a number of advantages over a traditional learning environment. Due to the absence of a traditional teacher, as a student you learn to motivate yourself once the learning environment comes under your control. The development of this streak of self-motivation is an important aspect of your growth, and the self-directed nature of distance learning in turn helps cultivate it.
Flexibility of choice
If you are a part of the traditional learning system, you need to follow the set schedule of a given course or curriculum. But with the availability of distance education, you can come back to the learning process even after being cut off from the regular learning scheme. It gives you a greater flexibility to opt for a course of study even after long spells of separation. An online education provides the opportunity to study more subjects and reach out to programs that are not available in the immediate area. Distance learning is much more flexible than traditional styles of classroom education. Students who need to take other classes or work can do class work whenever they have a free moment instead of being tied to a fixed timetable, making it possible for students, parents and professionals to take classes whenever they fit into their schedule. This is beneficial over classroom education that requires students to schedule work and childcare around the class time.
Better accessibility
In case you are separated from mainstream education because of distance, time or other relevant reasons, you can fall back upon distance education on the merit of its accessibility. If you go for the online learning method, you only need a computer with an internet connection. Similarly, if you choose a correspondence course as one of the tools of distance education, you need to ensure connection by means of postal delivery. This aspect of accessibility helps you to continue education despite being professionally employed.
Networking
Students who enroll in classes with online education obtain a wider range of networking opportunities. Instead of being limited to networking in the local area, distance learning enables students to make connections with a more diverse range of people.
Saves time & money

You need not travel across to a new region or country to avail yourself of the benefits of a course; you can access it by means of the online method of distance learning. This not only saves time but also cuts down on financial expense. Moreover, most courses offered as part of the distance learning method are cheaper than their regular counterparts. Online classes typically cost less than an education in a classroom environment. There are fewer space limitations and materials required for each student, and the savings are passed on from the educational institution to each student. A huge advantage of getting an online education is that students who do not wish to drive or to spend money on public transportation every single day will likely choose it over the traditional classroom.
Selection of Professors
Distance learning enables students to learn from some of the most prestigious professors and guest speakers in each field.
No classroom sitting
Sitting in the classroom is not the best way for every student to learn. A student may learn better at his own pace and in a different format than traditional schooling options offer.
You can earn as well as learn
For those employed and in need of professional advancement, distance education is particularly beneficial. For the advancement of your qualifications, you need not give up your job. The adaptability of distance education will help accommodate your work commitments: since distance learning can usually be completed on your own schedule, it is much easier to complete distance learning courses while working than more traditional educational programs. Keeping your job gives you more income, experience and stability while completing your degree, giving you less to worry about and more time to focus on your studies.
Role of LIS Professionals
Library services in support of college, university, or other post-secondary courses and programs are offered away from a main campus, or in the absence of a traditional campus, and regardless of where credit is given.
Courses thus supported may be taught in traditional or nontraditional formats or media, may or may not require physical facilities, and may or may not involve live interaction of teachers and students. The phrase is inclusive of services to courses in all post-secondary programs designated as: extension, extended, off-campus, extended campus, distance, distributed, open, flexible, franchising, virtual, synchronous, or asynchronous. The delivery of library and information services to those who learn at a distance is undisputedly the most pressing challenge that distance librarians encounter. Distance librarianship demands that libraries and librarians recognize that their role has transformed from being custodial in orientation to being cutting edge in nature, particularly with respect to the delivery of information services. Distance education and librarianship demand that students are placed at the centre of the educational paradigm.


The library must maintain a current strategic plan and vision for serving distance learners. Strategic planning is an iterative process that includes evaluation, updating, and refinement. Formal planning procedures and methods must be used. These planning methods require input from a broad spectrum of stakeholders, including distance learners. The library must likewise include distance learning library services in its mission statement and goals, which serve as a framework for all its activities. The mission and goals should be compatible and consistent with those developed by the originating institution. These methods help the institution prepare for the future by clearly defining a vision and mission, by setting goals and objectives, and by implementing specific strategies or courses of action designed to help meet those ends. The originating institution is responsible for ensuring that the distance learning community has access to library materials equivalent to those provided in traditional settings. Thus, the institution must provide or secure convenient, direct access to library materials in appropriate formats that are of sufficient quality, depth, number, scope, and currency to: meet student needs in fulfilling course assignments; enrich the academic programs; meet teaching and research needs; support curricular needs; facilitate the acquisition of lifelong learning skills; accommodate students with varying levels of technological access (i.e. low bandwidth); and accommodate other informational needs of the distance learning community as appropriate. When more than one institution is involved in the provision of a distance learning program, each is responsible for the provision of library materials to the students enrolled in its courses, unless an equitable agreement for otherwise providing these materials has been made.
The library develops a written statement of immediate and long-range goals and objectives for distance learning library services, which addresses defined needs and outlines the methods by which progress can be measured.
Skills Required for Learners
Personal skills. In the online environment, and particularly as a distance education student, it is beneficial to be an active learner who takes responsibility for their own learning, motivation and self-discipline.
Literacy skills. Studying online is dependent on strong reading and writing skills. Much subject content will be delivered by readings, and a lot of your communication will be in written form. If you know these are not your areas of strength, seek some support and try to develop these skills.
Study skills. Studying online requires many of the same skills as traditional face-to-face study. Things like time management, motivation, being clear about expectations and exam preparation remain important aspects of study.

General computer and Internet skills. You will need at least a basic level of proficiency in computer use to successfully study online. Necessary skills include word processing, file management, saving and printing. Being able to go to specific URLs, book-marking, and saving and printing web pages will all be important skills. More advanced skills such as web searching and website evaluation would also benefit most students. You will also take part in online forum discussions with other students and your lecturers, and forming an online study group can help (2002).
Skills for LIS Professionals
Comfort in the online medium: Librarians need to do so much online these days, well beyond basic searching. They need to be able to use search engines and use them well. They need to be able to find quality online resources. They need to help patrons set up e-mail and teach basic Internet skills. They need to be able to troubleshoot problems users are having accessing online library resources, and to offer reference services online via e-mail and synchronous chat. More important than knowing specific tools are general Internet and search skills.
Vision to translate traditional library services into the online medium: With the growth of distance learning and the fact that so many patrons access the library from home, librarians must ask how to provide equivalent services to people who only access the library online. Librarians need to know how to capitalize on the technologies out there (HTML, blogs, wikis, screencasting, IM, etc.) to provide these services online to their patrons.
Internet knowledge: how to search the web; what the internet is vs. what the world wide web is; good searching habits; knowledge of spyware and how it can disable a computer; how to use various browsers including IE, Firefox, Mozilla, Opera, Netscape and others.
Software knowledge: Microsoft Office products and other alternatives; anti-virus software; personal firewall software; ftp; telnet; HTML editors; basic ability to understand your operating system (OS); knowledge of what OS you have on your computer and how to figure out what OS others have; ability to test and learn new software.
Networking knowledge: what a network is; what you need to put a computer on a network (network interface card and data cable); wireless networks and how to connect to wireless on PCs with various operating systems; how to determine whether internet connectivity problems are network problems, computer problems or web site failures; what an IP address is; and some knowledge of the following concepts: DNS (internal & external), NAT (network address translation), VPN (virtual private network), and what a proxy server is and the basics of how it works.
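Several of the networking concepts listed above (IP addressing and NAT) can be explored directly from Python's standard library. The sketch below, offered purely as an illustration, uses the `ipaddress` module to distinguish the private (RFC 1918) address ranges that sit behind NAT from publicly routable addresses; the sample addresses are arbitrary.

```python
import ipaddress

# NAT relies on private address ranges (e.g. 10.0.0.0/8, 192.168.0.0/16)
# that are not routable on the public Internet.
addresses = ["192.168.1.10", "10.0.0.5", "8.8.8.8"]
for addr in addresses:
    ip = ipaddress.ip_address(addr)
    kind = "private (behind NAT)" if ip.is_private else "public"
    print(addr, "->", kind)
# 192.168.1.10 -> private (behind NAT)
# 10.0.0.5 -> private (behind NAT)
# 8.8.8.8 -> public
```

A librarian troubleshooting connectivity could use a check like this to tell whether a workstation has been assigned a private LAN address or a public one.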

Access for Achievement of Superior Academic Skills: Access to appropriate library services and resources is essential for the attainment of superior academic skills in post-secondary education, regardless of where students, faculty, staff, and programs are located. Members of the distance learning community, including those with disabilities, must therefore be provided effective and appropriate library services and resources, which may differ from, but must be equivalent to, those provided for students and faculty in traditional campus settings.
Strategic Planning: The library administrator must maintain a current strategic plan and vision for serving distance learners. Strategic planning is an iterative process that includes evaluation, updating, and refinement. Formal planning procedures and methods must be used. These planning methods require input from a broad spectrum of the originating institution's community. The library must likewise include distance learning library services in its mission statement and goals, which serve as a framework for all its activities. The mission and goals should be compatible and consistent with those developed by the originating institution. These methods help the institution prepare for the future by clearly defining a vision and mission, by setting goals and objectives, and by implementing specific strategies or courses of action designed to help meet those ends. And finally, distance education has also added to the pool of opportunities for librarians to become creators of information, either through the development of new materials or when they repackage information to suit particular needs. The repackaging of information, or the value-adding services that librarians contribute to information used in distance learning systems, adds to the teaching role that librarians bring to this system of education. The teaching skills that librarians develop to assist distance learners are often applied to traditional information services.
The transferability of these skills represents a new paradigm in traditional library and information services.
Conclusion
With distance learning courses, students can complete their course work from just about anywhere, working when and where it is more convenient for them without having to squeeze scheduled classes into an already busy life. Distance learning is also a great tool to help reach students who are in geographically remote areas and may not have readily available access to educational facilities, or who want to explore opportunities not offered by their local schools. The delivery of library and information services to distance learners has introduced a number of new professional paradigms in the field of librarianship. Prices for online courses are generally cheaper than their on-campus counterparts, and not having to worry about commuting, moving or getting meal plans on campus are additional benefits of learning from home. Distance learning may not be the ideal option for everyone but should be considered when looking at options for education. Librarians, distance educators


and administrators must therefore adopt new strategies to ensure that quality library and information services are available to those who learn at a distance.
References:
1. Sacchanand, Chutima (2002). Librarians' key role. 68th IFLA Council and General Conference. Available at http://eprints.rclis.org/6992/1/113-098e.pdf
2. Hermosa, N.N. and Anday, A.G. (2008). Distance Learning and Digital Libraries: The UP Open University Experience. Journal of Philippine Librarianship, 28(1), 90-105. Available at http://paarl.wikispaces.com/file/view/Anday.pdf
3. Block, Judy (2010). Online Journal of Distance Learning Administration, Volume XIII, Number I, Spring. Available at http://www.westga.edu/~distance/ojdla/spring131/block131.html
4. Oladokun, O. (2014). The Information Environment of Distance Learners: A Literature Review. Creative Education, 5, 303-317.
5. Watson, E.F. (1992). Library services to distance learners: a report. Vancouver: Commonwealth of Learning.
6. http://www.col.org/forum/pcfpapers/watson.pdf
7. http://www.prokerala.com/education/distance-education-advantages-disadvantages.php
8. https://scruffynerf.wordpress.com/2006/07/19/technical-skills-the-librarian/
9. http://www.csu.edu.au/distance-education/what-is-distance-education/online-learning
10. http://www.thecompleteuniversityguide.co.uk/distance-learning/what-is-distance-learning/
11. http://whatis.techtarget.com/definition/distance-learning-e-learning
12. http://www.distancelearningnet.com/what-is-distance-learning/
13. http://www.webopedia.com/TERM/D/distance_learning.html
14. http://www.njvu.org/top-10-advantages-and-benefits-of-distance-learning/
15. http://www.ala.org/acrl/standards/guidelinesdistancelearning
16. http://www.distancelearningnet.com/advantages-and-disadvantages-of-distance-learning/
17. http://meredith.wolfwater.com/wordpress/2006/07/17/skills-for-the-21st-century-librarian/


WEB SERVICES: ADVANTAGE FOR LIBRARIES AND INFORMATION SEEKERS
Anamika Shrivastava, Omvati Sharma & Jyoti Mittal
Abstract
Efficient communication is the need of the day. The web is increasingly an important resource in many aspects of life such as education, employment, government, commerce, healthcare and more. The internet has already had a major impact on how people find and access information, and now the rising popularity of e-books is helping transform users' reading habits. In this changing landscape, libraries are trying to adjust their services to these new realities while still serving the needs of patrons who rely on more traditional resources. A Web Service is a method of communication between two electronic devices over a network. It is a software function provided at a network address over the web, with the service always on, as in the concept of utility computing. This paper discusses web services, especially for libraries and their users, their resources and their advantages.
Key Words: Different Resources for Web Based Library Services, Advantage of using Web Services, Web Services for Libraries and Information Seekers
Introduction
The term "Web Services" can be confusing. It is, unfortunately, often used in many different ways. Compounding this confusion is the term "services", which has a different meaning than the term Web Services. The term Web Services refers to the technologies that allow for making connections. Web services (also known as application services) are services (usually including some combination of programming and data, but possibly including human resources as well) that are made available from a business's Web server for Web users or other Web-connected programs. Providers of Web services are generally known as application service providers. Web-based library services allow users to submit their queries to the library at any time from any place in the world; they are also known as Web Based Services, Digital Library Services, or Internet Library Services (2005).
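The request/response pattern behind a web service, where a client encodes a query as an XML message and the service replies with another XML message, can be sketched without any network code. The example below is a hypothetical book-lookup exchange: the element names, functions and catalogue data are invented for illustration, using Python's standard `xml.etree.ElementTree`.

```python
import xml.etree.ElementTree as ET

# Hypothetical library book-lookup service; element names are illustrative.
def build_request(isbn: str) -> str:
    """Client side: encode the query as an XML message."""
    req = ET.Element("lookupRequest")
    ET.SubElement(req, "isbn").text = isbn
    return ET.tostring(req, encoding="unicode")

def handle_request(xml_text: str) -> str:
    """Service side: parse the request and return an XML response."""
    catalogue = {"9780131103627": "The C Programming Language"}  # toy data
    isbn = ET.fromstring(xml_text).findtext("isbn")
    resp = ET.Element("lookupResponse")
    ET.SubElement(resp, "title").text = catalogue.get(isbn, "not found")
    return ET.tostring(resp, encoding="unicode")

response = handle_request(build_request("9780131103627"))
print(ET.fromstring(response).findtext("title"))  # The C Programming Language
```

In a real deployment the XML request would travel over HTTP (for instance, inside a SOAP envelope) rather than as a direct function call, but the encode/decode pattern is the same.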
From traditional online services to today, four generations of information retrieval tools have assisted users in searching the World Wide Web. Web services are powered by XML and three other core technologies: WSDL (Web Services Description Language), SOAP (Simple Object Access Protocol) and UDDI (Universal Description, Discovery and Integration). Before building a web service, its developers create its definition in the form of a WSDL document that describes the service's location on the web and the functionality the service provides. Information about the service may then be entered in a UDDI registry, which allows web service consumers to search for and locate the services they need. This step is optional but is beneficial when a company wants its web services to be discovered by internal and/or external service consumers. Based on information in the UDDI registry, the web services client developer uses instructions in the WSDL to construct SOAP messages for exchanging data with the service over HTTP.

Different Definitions
A web service is any piece of software that makes itself available over the internet and uses a standardized XML messaging system. XML is used to encode all communications to a web service. For example, a client invokes a web service by sending an XML message, then waits for a corresponding XML response. Because all communication is in XML, web services are not tied to any one operating system or programming language: Java can talk with Perl, and Windows applications can talk with UNIX applications.
Web services are self-contained, modular, distributed, dynamic applications that can be described, published, located or invoked over the network to create products, processes and supply chains. These applications can be local, distributed or web-based. Web services are built on top of open standards such as TCP/IP, HTTP, Java, HTML and XML.
Web services are XML-based information exchange systems that use the internet for direct application-to-application interaction. These systems can include programs, objects, messages or documents.
A web service is a collection of open protocols and standards used for exchanging data between applications or systems. Software applications written in various programming languages and running on various platforms can use web services to exchange data over computer networks like the internet, in a manner similar to inter-process communication on a single computer. This interoperability (e.g., between Java and Python, or Windows and Linux applications) is due to the use of open standards.

Literature Review
S.K. Pathak, Ashrushikta Misra and Gitanjalee Sahoo (2008) observe that in the age of the internet, which provides many facilities to its users, the library or information centre must ensure that its users are served as quickly and effectively as possible, using web based services as the most appropriate tools. Web services can empower libraries, offering more control and simpler system customization and integration, but these and other advantages depend on web services being standardized.
Raj Kumar Bhardwaj and Parmjeet K. Walia (2012) found that the basic function of the college library is to provide study material to its users in the shortest possible time. To serve the information requirements of students and teachers in a web based environment, the role of library and information professionals has changed altogether: their role is not just as custodians of books but to teach students how to use the existing resources, frequently organizing workshops. Professionals must have the competence to create web pages and build up institutional repositories, and must change the way they manage documents with the latest tools and technologies.

Bhatnagar, Anjana (2005) focused on what web based services are, why they are necessary and why they are so popular among users, along with their advantages and disadvantages. She concludes that web-based library services will become more widespread and sophisticated as the web becomes commonplace throughout the world, and that to be successful players in the e-world, libraries must continue to address web design and implementation issues. Librarians should be expert guides for users who are moving towards a new communication paradigm: a shift from face to face human contact to human-machine interaction, and from paper to electronic delivery. Librarians may play a leadership role in providing better web based library service facilities to their techno-savvy users.
Reddy, Nagi Y. and Ali, Yakub (2006) evaluated the services provided by the library in the IT environment and measured user satisfaction through a questionnaire survey conducted among the students and research scholars of the university. The result found that the users are largely satisfied with the library services provided by the IGM Library in the IT environment:

Table 4: Use of e-journals and databases (in percentages)
Subjects       e-Journals access    CD-ROM databases
Sciences       77                   75
Social Sc.     76                   66
Humanities     82                   60

The study once again reiterates the benefits of IT in libraries in meeting user requirements and keeping library clientele satisfied.
Ghosh and Ghosh (2009) conducted a study to examine the progress India has made in its move towards a knowledge-based economy. The Indian Government has demonstrated its commitment to developing the fundamental pillars of knowledge sharing infrastructure, knowledge workers and a knowledge innovation system. Libraries are identified as key players in building an inclusive knowledge economy (KE) for a country.
The important findings of the study were practice-based examples of how information and communication technology (ICT) projects are influencing contemporary Indian society, together with an account of government policies regarding ICT implementation and development towards a KE. The impediments in the process of building a KE in India are identified and recommendations are made.
Hussain, Akhtar, Mohd Asif Khan and Nishat Fatima Zaidi (2013) discussed the satisfaction level of users regarding research work, online database services and infrastructure facilities, and gave suggestions to make the services more beneficial to the library users of B-schools in Delhi and the NCR region. They found that ICT has great importance in every sphere of life, and libraries are not left apart from its impact; much depends upon the attitude of the librarian and the library professionals. Thus the attitude of library professionals is very important in the context of ICT application in the library.

Different Resources for Web Based Library Services
Today, users may access a variety of textual information resources. There are different kinds of web-based reference resources and services for accessing information from libraries, such as OPACs, gateways, portals, subject portals, electronic journals, online databases, subject directories and search engines. These resources overlap considerably in the type of information they cover, and sometimes it is difficult to distinguish between some of them. A library should have a good collection of these resources, such as selected web links, subscription resources and library materials, in well-organized pages in order to serve its users better. Many libraries and organizations provide digital reference service through collaborative services. Existing library consortia are adding digital reference to current shared services and networks of libraries, and some regional library consortia offer member libraries the opportunity to share reference questions with each other using the internet and other technologies.

OPAC
Online Public Access Catalogues (OPACs) form an important part of many digital libraries, covering electronic resources and databases in addition to the traditional bibliographic records.

Gateways
A gateway is defined as a facility that allows easier access to network based resources in a given subject area. Gateways provide a simple search facility and a much-enhanced service through a resource database and indexes, which can be searched through a web based interface. Information provided by gateways is catalogued by hand. Gateways cover a wide range of subjects, though some areas, such as music and religious studies, currently lack subject gateways. Some well-known gateways are the Internet Public Library (IPL), the Bulletin Board for Libraries (BUBL) and the National Information Services and Systems (NISS).

Portals
In the library community, portals may be defined as an amalgamation of services, where the amalgamation is achieved through seamless integration of existing services using binding agents such as customization and authentication services, search protocols such as Z39.50, loan protocols such as ISO 10161, and e-commerce. The result is a personalized service which allows the individual to access the rich content of both print-based and electronic systems. Portals are either commercial or free web facilities that offer information services to a specific audience.
The facilities offered range from web search and communication tools to email and news. There are three kinds of portals: consumer (or horizontal), vertical and enterprise. Consumer portals are aimed at consumer audiences and offer free email, games, chat, etc.; examples are Yahoo!, MSN and AOL. Vertical portals target a specified audience, such as a particular industry, and offer many of the consumer portal features; an example is VerticalNet. Enterprise portals are similar to consumer portals but are offered only to corporations or similar organizations; examples include Epicentric and Corporate Yahoo! These portals can best be understood as electronic pathfinders for users, pulling together in one place on a web site selected links to subject- or interest-oriented resources located on the WWW.

Subject Portals
Web search engines were developed initially by computer scientists, borrowing techniques from information retrieval research such as best match searching and relevance ranking. Information professionals are increasingly bringing their skills to help organize the growing wealth of internet resources. A good example of their influence is the development of subject-specific web search engines known as subject portals, where evaluation of the material covered is a major concern. Two prime UK subject portals are SOSIG (Social Science Information Gateway), covering social science resources, and OMNI (Organising Medical Networked Information), covering medical resources. The aim of a subject portal is to list and review the most important sites on the web relevant to its subject.

Electronic Journals
Electronic journals form a large part of the collection of a library providing web based services. Today many journals are available electronically; some are full text and some contain only bibliographic information with abstracts. A major advantage of electronic journals is that they are constantly updated and easy to access; a disadvantage is that breaching copyright law is very easy. They are available as bitmaps, PostScript, PDF, ASCII, SGML and HTML. Library services may be delivered to users on CD-ROM, through email or through the web. Some international societies and associations have developed their own digital libraries through which users can access all their publications; services are available to members of the society or association through subscription.
Some popular ones are:
ACM Digital Library (http://portal.acm.org/portal.cfm)
EBSCO databases (http://search.epnet.com/)
Emerald Full Text (http://iris.emeraldinsight.com/)
IEL Online (http://www.ieee.org/)
OCLC (http://www.oclc.org)
Springer Verlag Link (http://www.springerlink.com/)

Online Databases
These are large collections of machine-readable data that are maintained by commercial agencies and accessed through communication lines. Many libraries subscribe to them for easy access to current information. The disadvantage is that often only bibliographic data is presented, not full text, and the information cannot be accessed when the system is down for any reason. Examples are Ei Compendex, SciFinder Scholar, Web of Science and Current Contents.

Search Engines


Search engines are huge databases of web page files that have been assembled automatically by machines, whereas subject directories are human-compiled and maintained; a search engine indexes every page of a website, while subject directories link only to homepages. Search engine is the popular term for an information retrieval (IR) system. A search engine is computer software that searches a collection of electronic materials to retrieve citations or documents; the retrieved materials may be text documents, facts that have been extracted from text, images, or sounds. A query is a question phrased so that it can be interpreted properly by the search engine. Depending on the type of software, it may be a collection of commands, a statement in full or partial sentences, one or more keywords, or, in the case of non-text searching, an image or sequence of sounds to be matched.

Subject Directories
Subject directories differ from search engines in that search engines are populated by robots that find and index sites, whereas humans making editorial decisions populate subject directories. Subject directories basically index the home pages of sites and can be classified as general, academic, commercial or portal. Among the well-known subject directories are the Argus Clearinghouse (www.clearinghouse.net) and Yahoo (www.yahoo.com). Their strengths include relevance, effectiveness and the relatively high quality of content; their weakness is that they lack depth in their coverage of subjects.

Advantages of Using Web Services
Exposing existing functions on the network: A web service is a unit of managed code that can be remotely invoked using HTTP; that is, it can be activated using HTTP requests. Web services therefore allow you to expose the functionality of your existing code over the network, and once it is exposed, other applications can use the functionality of your program.
Connecting different applications, i.e. interoperability: This is the most important benefit of Web Services.
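Returning to the first advantage, exposing an existing function so that it can be activated by HTTP requests can be sketched minimally in Python. Everything here is illustrative: the catalogue lookup function, the /lookup/&lt;isbn&gt; route and the JSON response shape are hypothetical stand-ins, not a prescribed design.

```python
# Sketch: exposing an existing local function over HTTP so that other
# applications can invoke it across the network. All names are hypothetical.
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

def lookup_title(isbn):
    """The pre-existing function we want to expose; the catalogue is a stub."""
    catalogue = {"0001": "Digital Libraries"}
    return catalogue.get(isbn, "not found")

class LookupHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Treat the request path as /lookup/<isbn> and call the local function.
        isbn = self.path.rsplit("/", 1)[-1]
        body = json.dumps({"isbn": isbn, "title": lookup_title(isbn)}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, fmt, *args):
        pass  # keep the demo quiet

def start_service():
    # Port 0 asks the OS for a free port; the chosen port is in server_address.
    server = HTTPServer(("127.0.0.1", 0), LookupHandler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    return server

if __name__ == "__main__":
    server = start_service()
    url = "http://127.0.0.1:%d/lookup/0001" % server.server_address[1]
    with urllib.request.urlopen(url) as resp:
        print(json.loads(resp.read()))
    server.shutdown()
```

Once the function is reachable at a URL, any HTTP-capable application, whatever its language or platform, can invoke it, which is exactly the interoperability point.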
Web services typically work outside of private networks, offering developers a non-proprietary route to their solutions. The inherent interoperability that comes with using vendor-, platform- and language-independent XML technologies and the ubiquitous HTTP as a transport means that any application can communicate with any other application using web services. The client only requires the WSDL definition to exchange data effectively with the service, and neither party needs to know how the other is implemented or in what format its underlying data is stored. These benefits allow organizations to integrate disparate applications and data formats with relative ease. Web services are also versatile by design: they can be accessed by humans via a web-based client interface, or by other applications and other web services. Services developed this way are therefore likely to have a longer life-span, offering a better return on the investment in the developed service. Web services also let developers use their preferred languages and share data and services among themselves, and other applications can use the services of the web services. For example, a VB or .NET application can talk to Java web services and vice versa; web services thus make applications platform- and technology-independent.
Standardized protocol: Web services use standardized industry protocols for communication. All four layers (the Service Transport, XML Messaging, Service Description and Service Discovery layers) use well-defined protocols in the web services protocol stack. This standardization of the protocol stack gives business many advantages, such as a wide range of choices, a reduction in cost due to competition, and an increase in quality.
Low cost of communication: Web services use SOAP over HTTP for communication, so you can use your existing low-cost internet connection to implement them. This solution is much less costly than proprietary solutions like EDI/B2B. Besides SOAP over HTTP, web services can also be implemented on other reliable transport mechanisms such as FTP.
Usability: Web services allow the business logic of many different systems to be exposed over the web. This gives your applications the freedom to choose the web services that they need. Instead of re-inventing the wheel for each client, you need only include additional application-specific business logic on the client side, and you can develop services and/or client-side code using the languages and tools that you want.
Reusability: Reusability is another positive side-effect of web services' interoperability and flexibility. One service might be utilized by several clients, all of which employ the operations provided to fulfill different business objectives. Instead of creating a custom service for each unique requirement, portions of a service are simply re-used as necessary. Web services provide not a component-based model of application development, but the closest thing possible to zero-coding deployment of such services.
This makes it easy to reuse web service components as appropriate in other services, and also to deploy legacy code as a web service.
Deployability: Web services are deployed over standard internet technologies. This makes it possible to deploy web services even across a firewall, to servers running on the internet on the other side of the globe. Also, thanks to the use of proven community standards, underlying security (such as SSL) is already built in. A client can even combine data from multiple web services to, for instance, present a user with an application that updates sales, shipping and ERP systems from one unified interface, even if the systems themselves are incompatible. Because the systems exchange information via web services, a change to the sales database, for example, will not affect the service itself. All these benefits add up to significant cost savings. Easy interoperability removes the need to create highly customized, and expensive, applications for integrating data. Existing investments in systems development and infrastructure can easily be utilized and combined to add additional value. Since web services are based on open standards, their cost is low and the associated learning curve is smaller than that of many proprietary solutions. Finally, web services take advantage of ubiquitous protocols and the web infrastructure that already exists in every organization, so they require little if any additional technology investment.

Web Services for Libraries and Information Seekers
Due to the tremendous growth and continuous development of technology, the role of the library has become more responsive in making users techno-savvy. Technological developments have affected not only the formats and sources of information, but also how and where library services are provided. Libraries and their resources have partially moved to the virtual world of the internet, and as a result library users can access the resources from outside the physical library (Bhatnagar, 2005). Over the last few years, web services and the service-oriented architecture (SOA) have become dominant themes in IT across many industries. Web-based computing, service orientation and cloud computing increasingly displace the client/server approach favored by libraries in the past. In library automation, one major trend involves evolving or rebuilding automation systems to adopt this new approach to software. Purveyors of both open source and proprietary library automation products increasingly emphasize the ways in which they embrace openness, support application programming interfaces (APIs) or implement web services. More loosely, the term "web service" is applied to any online service delivered from a web site; since countless applications and services emanate from the web, such usage is commonplace in articles in non-IT publications, another example of a generic name coined for a specific technology.

Additional Services
Led by advances in technology, the range of services available is constantly increasing, and with it the expectations of library users.
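Many of the services listed below ultimately amount to a client program calling a library's web service and parsing the XML it returns. A minimal client-side sketch (the endpoint URL and the record format are hypothetical assumptions, not any real library API):

```python
# Sketch: a client consuming a hypothetical library catalogue web service.
# fetch() shows how the XML would be retrieved over HTTP; the sample below
# stands in for a live response so the parsing can be demonstrated offline.
import urllib.request
import xml.etree.ElementTree as ET

def fetch(url):
    """Retrieve raw XML from a service endpoint (not called in this demo)."""
    with urllib.request.urlopen(url) as resp:
        return resp.read().decode("utf-8")

def parse_response(xml_text):
    """Extract (title, author) pairs from a catalogue search response."""
    root = ET.fromstring(xml_text)
    return [(rec.findtext("title"), rec.findtext("author"))
            for rec in root.iter("record")]

# A sample response of the assumed shape, standing in for a live call:
sample = """<results>
  <record><title>Digital Libraries</title><author>W. Arms</author></record>
  <record><title>Library 2.0</title><author>M. Casey</author></record>
</results>"""

print(parse_response(sample))
# → [('Digital Libraries', 'W. Arms'), ('Library 2.0', 'M. Casey')]
```

Because the response is plain XML over HTTP, the same client logic works regardless of the language or platform the library's service is implemented in.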
Depending on the aims and purpose of the library, some of the following examples might be regarded as core services, others as value added services:
- OPACs giving in-library access to the catalogue
- remote access to library catalogues
- community information
- creation of digitized content to provide remote access to special collections and local studies materials
- tailored mailing lists and bulletin boards for groups of users such as reading groups, homework clubs and distance learners
- personalized services, such as informing a user that, based on previous reading habits, a new book, video or web site has arrived which may be of interest
- online enquiry services
- requests and renewals by e-mail or other forms of remote access
- co-operative networked services between libraries in different sectors, and between libraries and museums and other advice agencies
- document delivery


- Development of services which will extend access to all and further the aim of increasing social inclusion, for example enabling housebound people to select their own books.
Vendors, too, must ensure that their automation products are able to interoperate and thrive in this growing realm of web services.

Conclusion
Web services are the latest candidate technology for enabling distributed computing. Libraries increasingly need to extract data, connect with external systems, and implement functionality not included with their delivered systems. Rather than relying on product developers for enhancements to meet these needs, libraries increasingly demand the ability to exploit their systems using APIs, web services or other technologies. In India, all the universities have good libraries and most of them are providing web based services such as Current Awareness Services (CAS) and Selective Dissemination of Information (SDI), but a few of them are still lagging behind due to a lack of basic infrastructure and skilled manpower. It is the right time for library professionals to come forward firmly and convert the traditional library into a teaching library, providing organized training to users in the use of e-resources (Bhardwaj and Walia, 2012).

References
1. S.K. Pathak, Ashrushikta Misra and Gitanjalee Sahoo (2008). Future of Web Based Library and Information Services: An Indian Scenario. 6th Convention PLANNER, Nagaland, 2008, pp. 406-414. Available at http://ir.inflibnet.ac.in/bitstream/1944/1156/1/36.pdf
2. Raj Kumar Bhardwaj and Parmjeet K. Walia (2012). Library Philosophy and Practice 2012. Available at http://www.webpages.uidaho.edu/~mbolin/bhardwaj-walia.pdf
3. Bhatnagar, Anjana (2005). Web-Based Library Services. 3rd Convention PLANNER 2005, Assam Univ., Silchar, 10-11 Nov. 2005, pp. 426-434. Available at http://ir.inflibnet.ac.in/bitstream/1944/1418/1/58.pdf
4. Reddy, Nagi Y. and Ali, Yakub (2006). Information technology based services in a university library: A user satisfaction survey. Annals of Library and Information Studies, Vol. 53, March 2006, pp. 15-17. Available at http://pdf.aminer.org/000/245/888/satisfaction_with_internet_based_services.pdf
5. Ghosh, M. & Ghosh, I. (2009). ICT and Information Strategies for a Knowledge Economy: The Indian Experience. Program: Electronic Library and Information Systems 43, no. 2: 187-201.
6. Hussain, Akhtar (2013). The ICT based library and information services: a case study of B-Schools in Delhi and NCR region. Library Philosophy and Practice (e-journal), Paper 1011. Available at http://digitalcommons.unl.edu/cgi/viewcontent.cgi?article=2443&context=libphilprac
7. http://www.altova.com/whitepapers/webservices.pdf
8. http://www.ukoln.ac.uk/public/earl/issuepapers/range.html
9. http://www.tutorialspoint.com/webservices/what_are_web_services.htm
10. http://www.service-architecture.com/articles/web-services/web_services_definition.html
11. http://www.tutorialspoint.com/webservices/why_web_services.htm
12. https://social.msdn.microsoft.com/Forums/en-US/435f43a9-ee17-4700-8c9d-d9c3ba57b5ef/advantagesdisadvantages-of-webservices?forum=asmxandxml


SOCIAL MEDIA AND DIGITAL LIBRARIANSHIP
Dr. Anil Kumar Dhiman
Ranjendra Kumar Bharti

Abstract
Social networking is the process of building relationships among a group with a common interest, while Social Networking Sites (SNSs) are the web-based services which allow users to connect with others, share information and show their interests. There exist many social networking sites, such as LinkedIn, Twitter, Flickr, Myspace and Facebook, which are very popular among teenagers, youngsters and even somewhat older people. SNSs are emerging as a new medium not only for the exchange of thoughts but also among the library fraternity for various purposes. This paper attempts to define the concept of social networking and to show how social networking media and sites can be used in library services.
Key Words: Digital Libraries, Social Media, Social Networking Sites.

1. Introduction
Digital libraries, which are sets of digital objects together with services that allow a community of users to access and re-use those objects (Meghini et al., 2010), have evolved as a result of advances in information and communication technologies (Dhiman, 2003; Dhiman & Rani, 2012). Further, the internet and the revolution in information distribution have significantly changed the relationship between librarians and library patrons over the last few decades. These developments have led to the formation of social media in the digital environment. Social media are the group of internet-based applications that are built on the ideological and technological foundations of Web 2.0 and that allow the creation and exchange of user-generated content (Kaplan & Michael, 2010). They enable social interaction through the internet. Collins & Quan-Haase (2012) add that social media has become an increasingly familiar tool employed in academic libraries to market services and resources to current and prospective patrons. Lucky (2012) mentions that many digital library websites have incorporated social media tools and technologies, for example social networking and social tagging. Thus, modern libraries may make use of social media through social networks.

2. What is Social Networking?
Social networks have emerged as a revolution in Web 2.0 services, which are those applications that make the most of the intrinsic advantages of the platform: delivering software as a continually updated service that gets better the more people use it, consuming

and remixing data from multiple sources, including individual users, while providing their own data and services in a form that allows remixing by others, creating network effects through an architecture of participation (Dhiman & Sumankumari, 2010). With the advancement of technology, they have undergone dramatic growth in popularity in recent years. A social network refers to the articulation of a social relationship, ascribed or achieved, among individuals, families, households, villages, communities, regions, etc., whereas social networking refers to a range of web-enabled/IT-enabled software programs that let the user interact and work collaboratively with other users, including the ability to browse, search, invite friends to connect and interact, and to share film reviews, comments, blog entries, favorites, discussions, events, videos, ratings, music, classified ads and tags with the web world. Social networking sites are web-based services that allow individuals to construct a public or semi-public profile within a bounded system, articulate a list of other users with whom they share a connection, and view and navigate their list of connections and those made by others within the system. They support ever-widening circles of contacts, inviting convergence among the hitherto separate activities of email, messaging and website creation. Thus, they provide a lot of opportunities for youngsters, especially students, as well as for other professionals.

3. Types of Social Networking Media
There exist many variations of social media, but the most popular include blogs, Facebook, photo sharing, podcasts, Friendster, LinkedIn, MySpace, RSS feeds, Twitter, YouTube, Weibo, wikis, etc. Xie & Stevenson (2013) have listed the types and definitions of the most important social media with examples, as follows:

Table 1: Important Social Media Types and Examples
1. Blogs: allow a user to share thoughts and opinions on subjects in a diary-like fashion in a series of posts; also used to create discussion or informational sites published online. Example: blogs.
2. Microblogs: allow users to communicate under a handle or username that the user creates; users write short messages for their followers. Example: Twitter.
3. Photo-sharing: online image and video hosting sites that allow users to share, comment and connect through posted images. Examples: Flickr, Pinterest.
4. Podcasts: multimedia digital files stored on the internet and available to download, similar to radio broadcasts that are available freely online. Example: podcasts.
5. RSS Feeds: Rich Site Summary (or Really Simple Syndication) feeds comprise frequently updated web feeds of news, events and blog entries which a user can subscribe to and follow; RSS takes current headlines from different websites and pushes them down to your computer for quick scanning.
6. Social Networks: online platforms that allow users to communicate and connect with each other via interests, backgrounds and activities as part of a large social network. Examples: Facebook, Twitter, Reddit.
7. Video: content distribution of videos, typically available free to the public. Example: YouTube.
8. Wikis: allow users to create and edit webpage content online; hyperlinks and crosslinks connect pages, and users are allowed and encouraged to edit. Example: wikis.

4. Uses of Social Media Tools/Sites in Libraries
Social media was created for users to communicate and connect. Dickson & Holley (2010) describe outreach to faculty and students: some outreach methods focus on programs aimed at faculty, in the hope that faculty will encourage library use among their students, while other approaches focus more on the student population, including embedded librarians and collaboration with student groups. From an information retrieval perspective, social networking is relevant to information seeking and sharing by providing speedy, convenient information to the information seeker. It is extremely important to learn how users interact with libraries and other cultural institutions, and social networking provides libraries an innovative and effective way of connecting with their users and publicizing library services and events (Chen et al., 2012). Social networking sites may facilitate collaboration and promote effective communication between librarians and their patrons. They may also act as a generator of the free flow of information between libraries and their users. Dhiman (2012) has discussed various roles of social networking sites in library services. Today, many libraries across the world make use of Twitter or other tools to connect themselves with important information sources. Chan (2012) adds that social networking sites may also be used as a medium for the marketing of library services. Some important types of social media, with their possible roles in libraries, are discussed below:

4.1. Blogs
Blogs are a simple and efficient way for librarians to stay informed and for libraries to disseminate information in a timely manner (Dhiman & Sharma, 2008). They are an extension of what we already do: they identify, organize, and make information accessible in libraries. Blogs give us an opportunity to be more responsive, to reach out to faculty and students via library blogs to highlight news, post student/faculty book reviews and invite comments, announce events, list new acquisitions, etc. Further, blogs can be used for the promotion of reading and books in different ways, which is one of the main activities of an academic library.
4.2. Facebook
Facebook was founded by Mark Zuckerberg in 2004. It allows users to leave comments and to message each other via widgets and blogs, and it is this interactivity that distinguishes Facebook from other, static websites. Facebook enables users to choose their own privacy settings and choose who can see specific parts of their profile. It engages users and pushes content to them. Facebook also gives an opportunity to build a community and to receive fast responses to feedback. As far as libraries are concerned, Facebook may help students or research scholars to develop the practical research skills they need in a world where knowledge construction and dissemination make increasing use of online information networks (Parveen, 2011). It has also mobilized library services among the younger generation of LIS professionals. Besides, Facebook has facilitated the development of professional relationships in and beyond libraries (Graham et al., 2009).
4.3. Flickr
Flickr is an online image and video hosting site that allows users to share, comment and connect through posted images. Flickr allows users to post photographs and to create discussion groups.
While Flickr is known largely as a photo-sharing website, it also allows the posting of photographs of the library and its staff, providing a virtual tour of the library itself while simultaneously putting a human face to the building. Libraries can also post material from special collections on a Flickr account.
4.4. Pinterest
Pinterest is also an image-sharing site, which allows users to upload images that grab attention quickly without explanatory text. Smeaton & Davis (2014) say that the premise is that pinning interesting images gathers followers. Images which are seen as interesting are then re-pinned and shared amongst the Pinterest community, which can lead users back to the library board.
4.5. Twitter

Twitter is a popular social media tool for people to connect and communicate with one another. Twitter allows registered users to post brief messages for other users who follow them, and a library can tweet messages regarding its resources (new book alerts, etc.) and services. Many students have also joined the network and become followers. The library can also follow useful tweets from other information sources.
4.6. YouTube
YouTube is a video-sharing site on the web. YouTube enables users to embed their videos onto other websites, including other tools such as Facebook, blogs, wikis, or the library website. Kroski (2007a) mentions that such videos can be useful for students.


4.7. Wikis
A wiki, unlike a blog (Boxen, 2008), does not present user-supplied content in chronological order; Wikipedia is a well-known example. Collaborative repositories and bibliographies can be created through social bookmarking sites, historical and cultural collections can be built through media-sharing applications, and relationships can be formed with like-minded individuals in social networks. Thus, it is seen that blogs and wikis, which encourage interaction and collaboration among users, are an important component of a new outreach toolkit. Besides, there are social bookmarking sites, such as Reddit, StumbleUpon and Digg, that allow users to organize and share links to websites. Libraries can make use of them to popularize, promote, disseminate and also market their services in the online environment. A typical model for the marketing of library services through social media (based on Alkindi & Al-Suqri, 2013) is shown in figure 1. Alkindi & Al-Suqri mention that the library can do marketing by using several types of applications provided by SNSs, such as Facebook applications. These applications can be used to improve library services in order to attract users and provide them with the best services possible. They can also support the best use of library collections by users and, at the same time, can enhance some services to facilitate the use of information resources. Further, as social media offer the chance to address customer service issues and complaints, the services of the library can also be improved on the basis of inputs received from customers.
5. Conclusion
Social media and digital libraries connect learners to a wide range of informational resources. Faisal (2009) rightly mentions that the emergence of online social networks and their expanding user base demand immediate attention from the side of academic libraries. The use of social media in new-generation libraries has been steadily climbing in the past few years.
Digital libraries that are part of a parent institution are beginning to seek out their own unique social media pages. Hence, the use of social networking sites is increasing among people, especially among the youth in colleges and universities. Similarly, it is affecting the working environment of library professionals, who are making use of social networking sites not only for exchanging information among themselves but also for publicizing their libraries and for promoting library activities among the researchers, faculty and students in the academic environment.
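As a simple illustration of the multi-channel publicity described above, the sketch below formats one library announcement for several social channels behind a single call. Everything here is an assumption made for illustration: the channel names, the length limits (the 140-character cap reflects Twitter's limit at the time this paper was written) and the helper names are not from the paper, and real posting would go through each platform's own tools.

```python
# Illustrative multi-channel announcement helper: one library update is
# formatted for several social media channels. Channel names and limits
# are assumptions for this sketch, not a real API integration.
CHANNEL_LIMITS = {"twitter": 140, "facebook": 5000, "blog": None}

def format_update(channel, message):
    """Trim a message to fit the channel's length limit, if any."""
    limit = CHANNEL_LIMITS[channel]
    if limit is not None and len(message) > limit:
        return message[: limit - 3] + "..."
    return message

def broadcast(message):
    """Return the channel-specific versions a library would publish."""
    return {channel: format_update(channel, message) for channel in CHANNEL_LIMITS}

posts = broadcast("New arrivals this week: 40 titles in computer science. "
                  "Visit the library homepage for the full list.")
```

The point of the design is that library staff write the announcement once; the per-channel formatting (and, in a real system, the per-platform posting step) is handled in one place.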


What is needed, however, is that librarians think seriously about these tools and have faith in the use of social networking sites for their libraries.
6. References
1. Alkindi, Salim Said & Al-Suqri, Mohammed Nasser (2013). Social Networking Sites as Marketing and Outreach Tools of Library and Information Services. Global Journal of Human Social Science Arts, Humanities & Psychology, 13 (2). https://globaljournals.org/GJHSS_Volume13/1-SocialNetworking-Sites-as-Marketing.pdf (accessed on 02.1.2015).
2. Boyd, D., & Ellison, N. (2007). Social Network Sites: Definition, History and Scholarship. Journal of Computer-Mediated Communication, 13 (1), 210-230.
3. Boxen, J. (2008). Library 2.0: A Review of the Literature. The Reference Librarian, 49 (1), 21-34.
4. Chan, Christopher (2012). Marketing the Academic Library with Online Social Network Advertising. Proceedings of the IATUL Conferences, Paper 21. http://docs.lib.purdue.edu/iatul/2012/papers/21 (accessed on 02.1.2015).
5. Chen, D.Y., Chu, S.K. & Xu, S.Q. (2012). How Do Libraries Use Social Networking Sites to Interact with Users. ASIST, 1-10.
6. Collins, G. & Quan-Haase, A. (2012). Social Media and Academic Libraries: Current Trends and Future Challenges. ASIST 2012 Conference. https://www.asis.org/asist2012/proceedings/Submissions/272.pdf (accessed on 02.1.2015).
7. Dhiman, A.K. (2003). Basics of Information Technology for Librarians and Information Scientists. 2 Vols. New Delhi: Ess Ess Publications.
8. Dhiman, A.K. (2012). Social Networking Sites: Implications and Applications for Libraries. In Shantanu Ganguly, Rama Nand Malviya and P.K. Bhattacharya (Eds.), Creating Digital Library in Globalized E-Society (pp. 261-269). New Delhi: D.P.S. Publishing House.
9. Dhiman, A.K. & Sharma, H. (2008). Blogging and Uses of Blogs in Libraries. In Jagdish Arora (Ed.), Automation to Transformation (pp. 437-45). Ahmedabad: INFLIBNET.
10. Dhiman, A.K. & Rani, Yashoda (2012). Manual of Digital Libraries. 2 Vols. New Delhi: Ess Ess Publications.
11. Dhiman, A.K. & Sumankumari, R. (2010). Library 2.0: The Concepts and Prospects. PEARL - A Journal of Library and Information Science, 4 (1), 66-71.
12. Dickson, A. & Holley, R.P. (2010). Social Networking in Academic Libraries: The Possibilities and the Concerns. New Library World, 111 (11/12), 468-479. http://digitalcommons.wayne.edu/slisfrp/33 (accessed on 02.1.2015).
13. Farkas, M. (2007). Your Stuff, Their Space. American Libraries, 38 (11), 36.
14. Faisal, S.L. (2009). Blogs and Online Social Networks as User Centric Service Tools in Academic Libraries: An Indian Library Experience. In Anonymous (Ed.), Globalizing Academic Libraries (pp. 488-495). Delhi: Mittal Publication.
15. Graham, J.M., Faix, A., & Hartman, L. (2009). Crashing the Facebook Party: One Library's Experiences in the Students' Domain. Library Review, 58 (3), 228-236.
16. Kaplan, Andreas M. & Haenlein, Michael (2010). Users of the World, Unite! The Challenges and Opportunities of Social Media. Business Horizons, 53 (1), 61.
17. Kroski, E. (2007a). The Social Tools of Web 2.0: Opportunities for Academic Libraries. Choice, 44 (12), 2011-21.
18. Kroski, E. (2007b). Folksonomies and User-based Tagging. In N. Courtney (Ed.), Library 2.0 and Beyond: Innovative Technologies and Tomorrow's User (pp. 91-104). Westport, Conn.: Libraries Unlimited.

19. Social Networking Sites for Intimacy, Privacy and Self-Expression. New Media & Society, 10 (3), 393-411.
20. Lucky, S. (2012). Evolving and Emerging Trends in Digital Libraries User Interfaces. www.caisacsi.ca/proceedings/2012/caisacsi2012_48_rathi.pdf (accessed on 02.1.2015).
21. Meghini, C., Spyratos, N., & Yang, J. (2010). A Data Model for Digital Libraries. International Journal on Digital Libraries, 11 (1), 41-56.


22. Mishra, C. (2008). Social Networking Technologies (SITs) in Digital Environment: Its Possible Implications on Libraries. eprints.rclis.org/16844/1/Social%20networking%20in%20Library.pdf (accessed on 02.1.2015).
23. Parveen, N. (2011). Use of Social Networking Site (Facebook) in Making Awareness among the Library and Information Science Professionals of University Libraries of UP: A Case Study. International Journal of Digital Library Services, 1 (1), 9-17.
24. Smeaton, Kathleen & Davis, Kate (2014). Using Social Media to Create a Participatory Library Service: An Australian Study. Library and Information Research, 38 (117), 54-76.
25. Xie, Iris & Stevenson, Jennifer (2013). Social Media Applications in Libraries. Online Information Review, 38 (4), 502-523.
26. Xie, Iris & Stevenson, Jennifer (2014). Functions of Twitter in Digital Libraries. 77th ASIS&T Annual Meeting, October 31-November 4, 2014, Seattle, WA, USA. https://www.asis.org/asist2014/proceedings/submissions/posters/276poster.pdf (accessed on 02.1.2015).


PARAMOUNT PRACTICES ADOPTED IN ACADEMIC LIBRARY AND ITS CHALLENGES
Sarita Bhargava
Alka Chaturvedi
Abstract
This paper opens with an introduction to the role of educational or academic libraries and the impact of ICT on them in the present context. It focuses on the current challenges faced by educational libraries and how they may be overcome by using paramount practices, and it highlights the broad areas under which the relevant works are classified. The steps adopted in academic libraries are discussed, and the paper concludes that with the adoption of paramount practices in educational libraries there will be continuous improvement in overall performance within the institution.
Keywords: Academic Libraries, Paramount Practices, Information Literacy, User Education, Library Services
Introduction
Present information and communication technologies (ICT) have made a tremendous impact on the functions of academic libraries. Developments and changes in ICT have changed users' expectations of academic libraries in different ways, and the ways of building a library collection and offering services to end users differ from past practice. Thus, to effectively meet the demands of end users, academic libraries need to identify and adopt good and paramount practices and benchmarks which will ultimately enhance the value-based services of libraries in an academic environment (NAAC, 2003). In the present-day scenario of fast-paced educational innovation, continuous review and improvement of the overall functions of library and information centres becomes necessary. At times there is hesitation on the part of some institutions to share their paramount-practices data.
In the present age of information explosion, libraries and information resource centres play not just an important learning-support function; the library itself has been emerging as a site of learning, sometimes more important than even the classroom. In the huge and heterogeneous higher education system there are numerous innovative practices, but only a few have been selected for discussion here, with a view to the overall enhancement of institutional effectiveness. (Umesh, K. Y. (2012). Best practices adopted in academic libraries and information resource centres. International Journal of Information Dissemination and Technology.)
Literature Review
Catherine Edwards and Graham Walton (2000) observe that libraries in colleges and universities are changing faster than their respective parent institutions. Essentially everything in and around the library is changing: services and technologies,

NAAC, Best Practices in Library and Information Services (case presentations): this document discusses how, to effectively meet the demands of end users, academic libraries need to identify and adopt good and paramount practices and benchmarks which will ultimately enhance the value-based services of libraries in an academic environment.
Thea Farley (Librarian at D.J. Freeman, London): The effective management of change is a crucial issue for academic libraries in the 1990s and beyond, as change is impinging on every aspect of their work. Through a consideration of aspects of organisational theory, changes in academic libraries, and human resource management, this paper demonstrates the pressing need for attention to change and its effect on people in an organisational setting. A case study is used to illuminate a literature review and to ground the conclusions of the study in the experiences of staff in an academic library in a time of change.
Preeti Mahajan, Panjab University (10-3-2005): The primary objective of libraries is to organize and provide access to information, and it remains the same although the format and methods have changed drastically. Under the present scenario of declining budgets and higher subscription costs of journals in India, it is becoming very difficult to meet the demands of library/information users. The only solution to the problem is the pooling and sharing of resources, print as well as electronic, by way of consortia. New technology has provided great opportunities for the delivery of services within consortia. More and more libraries must unite, which of course requires a change in attitudes, practices and policies to get the maximum benefit.
Umesh, K. Y. (2012), International Journal of Information Dissemination and Technology: The paper throws light, with an introduction, on the role of academic libraries and the impact of ICT on them in the present context.
The article stresses the current challenges faced by academic libraries and how they can be overcome by using best practices. It also highlights the broad areas under which the works are classified, along with a list under each category. The processes adopted in academic libraries are discussed, and it concludes that with the adoption of best practices in academic libraries there will be continuous improvement and better overall performance in the institution/organization.
Role of the Academic Libraries
The role of the library and information centre in a college is aimed at realizing the educational goals of the college or the parent organization. The college library not only provides a stimulus to reading by procuring materials for study and research, by introducing an open access system, by providing long opening hours and by organizing the library resources in a systematic way, but also feeds the intellect of the student and supports the research of the faculty, thus serving the teaching and research needs of the institution. The college library and information resource centre acts as a vehicle for disseminating information, through the related computer technologies and paramount practices, for utilization by its community of users and also for the exchange of information among its users.
Challenges Faced by Academic Libraries
- Explosive growth of information and documents
- Increased cost of documents and information materials
- Increase in users' information needs
- New role of the librarian and greater responsibilities
- Latest techniques and concepts in the handling of information
- New electronic information environment
- Creation of databases and their security
- Marketing of library and information services
The library and information centre of an institution plays a central role in facilitating the dissemination and creation of new knowledge. In the current learning environment, the library and information centre as a learning resource has taken up increasingly more academic space and time in the life of the user; thus it is time to identify and adopt new ways that will lead libraries and information centres to improve their processes and activities, thereby optimizing resource utilization and delivering high-quality, value-added services to their users. A paramount practice may be innovative: a programme, policy, strategy, philosophy, process or practice that solves a problem or creates new opportunities and positively impacts the institute. Institutional quality is the aggregate of the paramount practices followed in different areas of institutional performance. Broadly, the use of technology and innovative ideas leads to paramount practices evolving in the library and information environment. Thus a 'paramount practice' in a library, in simple terms, is that practice which improves an existing function or activity and helps in the effective implementation or use of a process, thereby leading to continuous improvement in the overall performance of the library (NAAC, 2006).
With this information as background, an attempt has been made to highlight the practices that are adopted in academic libraries. The paramount practices are classified under the following broad areas:
Management and administration of a library
- In-service programmes
- Observation of other libraries' practice
- Staff promotional policy
- Maintenance of the service area
- Special deposit scheme
- Resource generation through external membership
- Resource generation through internet services
- Student participation programme

Collection and services
- Collection development in different formats
- Compact storage of less-used collections
- Library book exhibitions
- Extended library opening hours
- Extended hours of service
- Extent of use of services
User education
- Initiation for freshers
- Preparatory course for student projects
- User orientation
- Information aids
- Library use statistics
- Library paramount user award
- User feedback practice through different formats
- Suggestion box and timely response
Use of technology in libraries
- On-line information retrieval (Internet access)
- Free browsing unit (Internet access)
- Broadband internet centre
- Library homepage for information dissemination
- A strong and dynamic library website
- User feedback through the library homepage
- Access to e-resources
- Information retrieval through web OPAC
- Campus-wide LAN facility
- Database creation using international standard formats
- Electronic surveillance system (CCTV)
Paramount Practices Adopted in Academic Library
Book Display Programme
Organize programmes on important dates and occasions and on renowned personalities. This provides an opportunity for users to know the various types of information resources available on a particular topic in the library and information hub.
Orientation Programme
This paramount practice is to create knowledge among students about the library resources, the library services, good reading habits, and creative programmes and activities for maximum utilization of the library. In other words, enlighten the

fresh students at the beginning of each academic year about the importance of the library, thereby exposing them to the various sections of the library, the library resources and the various library services.
Educating the User
Academic libraries have a great role and responsibility in creating awareness among their users, which will help them make use of the library resources, facilities and services more effectively and efficiently. This can be done through:
1. User orientation, which may be individual or in groups (Kulkarni, 2009).
2. Library brochures, circulars, pamphlets and handouts.
Staff-Users Meet
Academic libraries should organize various programmes, including orientations, lectures on related issues and topics, workshops and seminars, which focus on issues useful to the users as well as to the staff. Libraries may organize programmes on information handling in the present digital era, knowledge networking, the role of librarians in the electronic era, subject searching, time management, public relations and knowledge-based systems. This helps to keep the staff and the users abreast of the latest developments and trends in library principles and practices, thereby bridging the gap between the staff and the users.
Developing Virtual Presence
Libraries can use Web 2.0 applications such as social networking, blogging, RSS feeds, audio and video streaming, wikis, etc., to interact with users and deliver information services.
Demonstrations and Exhibitions
Libraries should organize demonstrations and exhibitions to create awareness about their collections and services. This can be done inside the library by separately displaying the special collections and literary works of specific authors or groups of authors, thereby creating awareness about a particular author or body of literary work among users, and attracting even people from different sections of society, such as parents, management members, relatives of staff members and the public.
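The RSS feeds mentioned under "Developing Virtual Presence" can be consumed with very little code. The sketch below is a minimal, hedged example using only the Python standard library; the feed URL is a placeholder and the element layout assumed is the standard RSS 2.0 channel/item structure, neither of which comes from this paper.

```python
# Minimal RSS reader sketch: pulls current headlines from a feed so a
# library can scan them quickly or republish them on its homepage.
import urllib.request
import xml.etree.ElementTree as ET

def parse_headlines(rss_xml, limit=5):
    """Return (title, link) pairs for the newest items in an RSS 2.0 document."""
    root = ET.fromstring(rss_xml)
    items = root.findall("./channel/item")
    return [(item.findtext("title", ""), item.findtext("link", ""))
            for item in items[:limit]]

def fetch_headlines(feed_url, limit=5):
    """Download a feed (network access required) and return its headlines."""
    with urllib.request.urlopen(feed_url) as resp:
        return parse_headlines(resp.read(), limit)
```

A library could run such a script on a schedule (for example against `https://example.org/library/news.rss`, a hypothetical feed) and republish the resulting headlines on a homepage or notice board.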
Information Brochures
Information brochures and pamphlets are also important means of creating awareness about the facilities, services and collections of the library; users can be given information brochures at the time of their enrolment as registered members. Information brochures may cover reprography or photocopying facilities, latest publications and latest additions to the library, the CD/DVD list, book bank facilities, library rules and regulations, electronic resources, and the list of online information services.

Web Based Services
Libraries can provide various web-based services through a strong library website updated with services such as a virtual tour, a virtual reference desk, ask-the-librarian, full-text articles, a help desk, lecture notes, electronic announcements, e-books, a digital suggestion box, project reports, frequently asked questions, dissertations, Facebook pages, etc.
Conclusion
Academic libraries should take the initiative in broad activities that enhance the standing of the books and documents in the library. They should also create an environment and conditions for keeping abreast of new knowledge and the uses of modern technological achievements in the field, aid the diffusion of the latest information, and support the training of professional personnel through a unified system, so that knowledge can be disseminated from the source to its beneficiaries or users in the most efficient and effective way through the adoption of paramount practices. When the library staff work in coordination and with team spirit, under the dynamic leadership of the librarian and with the involvement of paramount practices, they are sure to benefit the user community and thereby contribute to the overall performance in achieving the goals and objectives of the institution. Also, by adopting effective means in the library and implementing them at the appropriate time, we can create a friendly environment, thereby bridging the gap between the library and the users, which in total will result in optimum utilization of resources and in effectively meeting the challenges along with the goals and objectives of the institution in which we work. Thus, the paramount practices adopted should bridge the gap between the library and the users for effective and maximum utilization of resources, which will result in the advancement and promotion of higher educational goals and the vision and mission of the library.
Paramount practices evolve through stages of failure and correction, and through borrowed paramount practices we can enhance our competitive advantage as well as institutional effectiveness. The author hopes that the practices outlined here will serve as one of the ways to strengthen the library and information services of academic libraries.
References
1. Best practice. Available at http://en.wikipedia.org/wiki/Best_practices.
2. Edwards, C., & Walton, G. (2000). Change and Conflict in Academic Library. Retrieved July 20, 2009, from Library Management: http://www.emeraldlibrary.com
3. Farley, T., Broady-Preston, J., & Hayward, T. (1998). Academic Libraries, People and Change: A Case Study of the 1990s. Retrieved July 15, 2009, from OCLC Systems & Services: http://www.emeraldinsight.com/1640140420.html
4. Mahajan, P. (2005, Fall). Academic Libraries in India: A Present-Day Scenario. Retrieved July 10, 2009, from Library Philosophy and Practice, 8 (1): http://www.libr.unl.edu:2000/LPP/lppv8n1.htm
5. NAAC (2003). Best Practices in Library and Information Services: Case Presentations. Best Practices Series, NAAC.
6. Umesh, K. Y. (2012). Best Practices Adopted in Academic Libraries and Information Resource Centres. International Journal of Information Dissemination and Technology, 2 (3), 186-188.


VIRTUAL LIBRARY - A GLOBAL SYMBOL OF THE INFORMATION ACCESS PARADIGM
Dr. Rakesh Shrivastava
Abstract
Virtual libraries offer opportunities for learning that are not possible in their physical counterparts. Whereas physical libraries operate with designated hours, virtual libraries are available anytime and anywhere there is an Internet connection. "A paradigm shift takes place from libraries as collectors of items to libraries as facilitators of access to all kinds of information, provided by anybody, located anywhere in the world, accessible at any time" (Grothkopf, 2000). Virtual libraries are organized collections of digital information. They are constructed collections organized for a particular community of users, and they are designed to support the information needs of that community (Saracevic, 2000). Virtual libraries can offer resources from many sources and in many formats, including audio and video. The items in these virtual collections do not have to reside on one server, but they share a common interface to assist the user in accessing the collection. The emphasis in virtual libraries is on organization and access, not on physical collections (Baldwin & Mitchell, 1996). The mere presence of virtual libraries, however, does not cause learning to occur; it is how these libraries are utilized by students and teachers that will enable learning. Virtual libraries present a new paradigm for information access and learning. They have the ability to transform the relationship between learners and resources, facilitating both formal and informal learning. With careful design and the support of skilled information professionals, virtual libraries can provide powerful environments for student learning.
Keywords: Virtual Library, Structure and Design of a Virtual Library, National Virtual Library of India
Introduction
Rapid advances in information technologies have revolutionized the role of libraries.
As a result, libraries face new challenges, competitors, demands and expectations. Libraries are redesigning services and information products to add value to their services and to satisfy the changing information needs of the user community. Traditional libraries still handle largely printed materials that are expensive and bulky, but information seekers are no longer satisfied with printed materials alone. They want to supplement printed information with more dynamic electronic resources, and demand for digital information is increasing. Most libraries today offer a wide range of online services to their users. The internet and web technologies are no longer new to any academician; hence, it is time for a library to become virtual and develop its online presence in order to further facilitate and enrich the educational process. In this direction, virtual libraries provide a new way of serving the new generation of library users. Virtual libraries are the new vision of the libraries of the future. A virtual library is a kind of digital library which provides a portal to information that is available electronically elsewhere; it is referred to in this way to emphasize that the library does not itself hold the content. Librarians have used

this term for a decade or more to denote a library that provides access to distributed information in electronic format through pointers provided locally. A virtual library has been defined by Gapen (1993) as the concept of remote access to the contents and services of libraries and other information resources, combining an on-site collection of current and heavily used materials in both print and electronic form with an electronic network which provides access to, and delivery from, external worldwide library and information resources. This combination of local collections and worldwide access makes virtual libraries a global symbol of the information access paradigm. The virtual library has changed the traditional focus of librarians on the selection, cataloguing and management of information resources such as books and periodicals; it puts the emphasis on access, without the need to allow for the time required by these technical processes. Virtual libraries have induced libraries, scholars, publishers and document delivery vendors to develop new partnerships that are working for the good of scholarly communication in both developed and developing countries.
What is a virtual library?
"Virtual reality (VR) is an environment that is simulated by a computer. Most virtual reality environments are primarily visual experiences, displayed either on a computer screen or through special stereoscopic displays, but some simulations include additional sensory information, such as sound through speakers or headphones. Some advanced and experimental systems have included limited tactile feedback." (Wikipedia, 2005b) Given this definition of VR, a virtual library could be a library environment simulated by a computer. Another definition of a virtual library is synonymous with a digital library: "Virtual Library means library without walls. The resources are available in digital format, there is no paper, microforms etc. The resources are locally held or accessed through computer networks."
(Wikipedia, 2005a) "The World Wide Web Virtual Library was the first index of content on the World Wide Web. It was started by Tim Berners-Lee, the creator of HTML and the Web itself, in 1991 at CERN in Geneva." (Wikipedia, 2005c) The new library-documentary model being generated from these premises has been named in different ways: electronic library, digital library, hybrid library, library without walls, or simply virtual library. We shall refer to this new documentary model as a virtual library, since it is the type of library which carries out its function exclusively in a virtual environment (indeed, virtuality has made it possible), and in addition because it works with digital documents. A virtual library becomes the gateway that provides integrated access to all sorts of resources: digital collections, and resources which can be located on any Internet site, whether public or private. It becomes a uniform interface, understandable to the user, with access to remote information, which users can reach immediately, whenever they decide they want it. The interactive environment enables services to be defined that adapt to the community served, while at the same time it allows for the setting up of a line of work which

demands a library model which is totally flexible and has the capacity to adapt itself constantly to the new needs expressed by its users.

Special features of a virtual library
1. It provides speedy and wide access in a global manner. Greater emphasis is on access and not on collection.
2. Virtual libraries can empower the user and promote informal learning, and offer lifelong learning opportunities.
3. Virtual libraries can offer just-in-time learning and save time.
4. Virtual libraries can offer individuals just-for-me learning.
5. Virtual libraries minimize the digital divide because virtual library services can be supported globally.

Virtual vs physical libraries
Virtual libraries offer opportunities for learning that are not possible in their physical counterparts. Whereas physical libraries operate with designated hours, virtual libraries are available any time and anywhere there is an Internet connection. "A paradigm shift takes place from libraries as collectors of items to libraries as facilitators of access to all kinds of information, provided by anybody, located anywhere in the world, accessible at any time" (Grothkopf, 2000). Virtual libraries, especially those with customized collections, facilitate just-in-time learning. Riel (1998) described just-in-time learning as learning needed for a particular task or purpose. Just-in-time learning can be independent of time and place (Riel, 1998; Weinberger, 1997). Schools with virtual libraries can make resources available just in time for specific assignments. Virtual libraries provide immediate access to a range of resources not available in physical collections. Virtual libraries often contain more up-to-date information than physical collections. Their sources can be searched more efficiently than those in physical libraries, and the information they contain can be updated more frequently.
Well-designed virtual library collections are organized and managed to increase the productivity and efficiency of the user (Saracevic, 2000; Schamber, 1990). Roes (2001) believed that virtual libraries complement other virtual learning environments such as those provided in distance education and courses offered online. Virtual libraries can empower the user and promote informal learning. Marchionini and Maurer (1995a) saw advancing informal learning as the most important change created by virtual libraries. Virtual libraries which are customized for the learning needs of particular users, whether schools, classes, or individuals, enable just-for-me learning. Just-for-me learning can be tailored to individual learning styles, preferences, and other characteristics of the learner or community of learners. Teacher librarians who have selected online resources for specific classes, teachers, or student groups are facilitating just-for-me learning. Neuman (1997) recognized that any library must have a range of resources to meet the information needs of different users, and she saw the variety of formats and methods of navigation that can be used in virtual libraries
as one of their greatest strengths. Resources in a virtual library can be organized so that sources for a particular group of users are easily identified. Virtual libraries can be customized for particular schools, grades, and subjects. This variety of formats in presentation and navigation is quite different from that of a physical library. Virtual libraries have the ability to transform practices and values for those who work in schools and libraries because of the processes that are enabled through these virtual resources (Bruce & Leander, 1997). Bruce and Leander, however, were concerned that the people who work in libraries might transfer values embedded in physical libraries to virtual libraries, thus preventing this transformation from occurring. Marchionini and Maurer (1995a) also saw the possibility of changed practices in virtual libraries, and believed that virtual libraries offered the potential for users to become authors and publishers as well as readers in this online environment, blurring the line between reader and author.

Structure and design of a virtual library environment
As Koganuramath described, the virtual library environment means that virtual teams, virtual communication and the electronic environment are now a reality for the library. Users are able to view and request information resources either from the library Intranet site or over the Internet, and contact staff by phone and e-mail for more general research requests. Library services will be entirely virtual (Koganuramath, Muttayya, 2007). Delivering virtual information services differs from traditional information service delivery: clients are unable to visit the library to preview resources, collect material or access services in person. The virtual environment has had a dramatic impact on the way team members operate.
As emphasized by Cascio (1999), one of the most challenging aspects of virtual teams is the absence of physical interaction and the lack of synergy associated with verbal and non-verbal communication. As with servicing remote clients, working in a virtual team increases the importance of communication and of willingness to interact via new electronic tools and resources such as databases (Mohrman, 1999). All communication must be conducted through e-mail, phone or fax. Information literacy training must be delivered innovatively, as traditional face-to-face training sessions are no longer possible; training may even be delivered via telephone links while users search networked databases through Intranets or the Internet. There is an increased dependence on information technology to access information resources. As a result, library users often expect technical support from librarians, and queries regarding network and access problems become increasingly common. This means librarians must keep abreast of current technical developments and know when and where to refer clients with problems beyond their knowledge.

Organization
A virtual library is an organized set of links to items (documents, software, images, databases etc.) on the network. The purpose of a virtual library is to enable users of a site to find information that exists elsewhere on the network. Virtual libraries (VL) are a natural growth of the ability of modern client-server protocols (especially HTTP and Gopher) to
provide seamless links to information anywhere on the Internet. The first VLs were menus of links about a particular topic. They were thrown together by site managers to help users find items of interest. As the sheer volume of information has grown, this approach has become increasingly difficult to maintain: automation, cooperation and more flexible designs are becoming essential. Much attention has focused on the development of automated systems for indexing network information. Many of these systems are non-selective in building indexes; others are designed to index information only for a particular suite of sites. However, the real advantage of a virtual library, especially one associated with a special interest network, is that it focuses on material relevant to a particular topic. The design is intended to be a fruitful mixture of automation with human participation, of flexible searching with "guided tours" of the information. Important issues in running a virtual library include finding the "records" (i.e. the links to items of relevant interest), managing the records, and providing access to the records. The VL can be developed by a special interest network group, and any or all nodes in this network can participate in the management of its virtual library.

Coordinating centre
One node of the network acts as the coordinating centre for the VL (the work might be divided amongst several nodes). Its main role is to collate and process records. If the VL includes a central main database, this would normally be maintained by the coordinating centre.

Team of editors
The virtual library is managed by a team of editors; there may be one or many. Each editor has responsibility for (say) a given theme or topic. There is a coordinating editor (i.e. at the VL coordinating centre) who supervises the merging of incoming entries.
General editing functions include: supervising automated searches; evaluating incoming items; editing email and web submission forms; locating and entering new relevant entries; assessing the quality of incoming entries; supervising the validation and merging procedures; creating views; and responding to user queries.

Records collection


An important principle in operating a VL is to distribute the work as widely as possible. Ideally, the editors should have to do little searching for records themselves. There are three main sources of records:
1. Manual: direct collation of records by the editors (this is still the main method of compilation at most sites);
2. Passive: public submissions from users via email or WWW forms;
3. Active: automated searches using Web walkers, worms, spiders, harvesters, etc.

Structure of records
The records maintained by the library must include enough information to identify what the item is, where it is, and how to maintain it. The submission form provides the following fields:
1. URL for the source
2. A title for the item
3. A brief informative description of the item
4. Contact for the item (usually the site maintainer): name and email address
5. About this submission: name and email address
6. Indexing details: standard keywords/headings and other keywords
7. Datestamp for the record

Organization of records
As with any library, the records in a VL need to contain enough details to allow them to be indexed adequately. Full-text indexes of filenames (cf. Archie) or titles (cf. Veronica) are useful, but can be both unreliable and wasteful. It is therefore useful to include a series of keywords with each record. By drawing keywords from a standard list, and allowing that list to be augmented by user-supplied terms, the VL can build up a rich set of classifications. These categories will also reflect the thinking of its users. As conceived here, a VL consists primarily of files containing lists of records, with each record including the information described above. For maintenance purposes, one effective design is to build the files chronologically, e.g. by date-stamping and storing the updates file for each month. All methods of accessing records (e.g. a word search) simply filter these files. There are two chief ways of retrieving the records. Searches filter the stored records to retrieve those that satisfy a specified search criterion. The VL can also provide views of the information. Views are collated subsets of records, prepared by the editors (or interested users) to help guide users to relevant information. Most early VLs were really just views, but without an underlying database structure. Views can either be simple HTML documents containing items copied from the database, or else pre-canned filters for pulling out and displaying records from the database.

Indian scenario
Research, academics and general users from various sectors today depend more and more on digital information. Corresponding to this demand, an increasing amount of digitized data, and services based on such data, are being initiated. Digital information today serves as an important knowledge asset. While the proliferation of digital data and information services heralds a new and exciting era, it also presents many issues and challenges. The common user is often clueless about the existence of resources that are useful. The problem is akin to having a large collection of printed materials in the form of books and journals: only when it is organized and services are provided is it called a 'library'. In fact, an immense amount of information has been digitized under several projects carried out by various agencies and government departments.
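The record structure, keyword searches and views described in the Organization sections above can be made concrete in code. The sketch below is purely illustrative: the class, field and function names and the sample data are all invented for this example, not taken from any particular VL implementation.

```python
from dataclasses import dataclass, field
from datetime import date
from typing import List

# Fields mirror the submission form described above (all names hypothetical).
@dataclass
class VLRecord:
    url: str
    title: str
    description: str
    contact_name: str
    contact_email: str
    keywords: List[str] = field(default_factory=list)  # standard headings + user-supplied terms
    datestamp: date = field(default_factory=date.today)

def search(records, term):
    """A search: filter stored records that satisfy a keyword criterion."""
    term = term.lower()
    return [r for r in records
            if term in (kw.lower() for kw in r.keywords) or term in r.title.lower()]

def view(records, topic):
    """A view: a collated subset of records for a topic, newest first."""
    return sorted(search(records, topic), key=lambda r: r.datestamp, reverse=True)

records = [
    VLRecord("http://example.org/astro", "Astronomy Links",
             "Curated astronomy resources", "A. Editor", "editor@example.org",
             keywords=["astronomy", "science"], datestamp=date(2005, 3, 1)),
    VLRecord("http://example.org/maps", "Historical Maps",
             "Scanned map archive", "B. Editor", "maps@example.org",
             keywords=["maps", "history"], datestamp=date(2005, 4, 15)),
]

print([r.title for r in view(records, "astronomy")])  # ['Astronomy Links']
```

In this design, as in the text, a view is just a pre-canned filter over the same stored records that a free search uses; editors would simply save the topic argument rather than maintain a separate document.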
Virtual Information Centre (http://www.vic-ikp.info/vic_new/index.asp): ICICI Knowledge Park, Bangalore has set up a VIC to provide access to digital resources of member partners. K-Library is a part of VIC. It covers four domains, namely Biotechnology, Networking and Telecommunication, Pharmaceutical Sciences and Material Science. Resource coverage includes electronic journals and newsletters, books, discussion forums, conferences, portals, preprints and e-prints, science and research news, etc.
SunSite India (http://sunsite.serc.iisc.ernet.in/virlib): SunSITE India is a joint initiative of the Indian Institute of Science and Sun Microsystems. A large public software archive is maintained on the SunSITE India server.
e-Gate (http://www.drdo.org/egate/index.html): It is a part of the Defense Research
and Development Organization, a Government of India undertaking. The e-Gate provides links to resources for Aerospace, Chemistry, Computers, Electronics, Academic, Defense, Research and General topics. It also has a listing of various search engines, newspapers and patents.
IUCAA (http://www.iucaa.ernet.in/library/): The IUCAA library is one of the most advanced modern libraries specializing in Astronomy and Astrophysics in India.

National Virtual Library of India
A high-level committee, the National Mission on Libraries (NML), was constituted under the chairmanship of Prof. Deepak Pental on 20th March 2012 by the Government of India. The Raja Rammohun Roy Library Foundation (RRRLF), an autonomous body under the Ministry of Culture, is the nodal agency for the National Mission on Libraries for administrative, logistic, planning and budgeting purposes. RRRLF will also be the main executing agency for: (a) upgradation of existing libraries and setting up of model libraries; (b) capacity building; and (c) survey of libraries (National Virtual Library of India, 2012). At its first meeting, held on 18th May 2012, NML constituted the working group on the setting up of the National Virtual Library, networking and ICT applications in libraries. Dr. H.K. Kaul, Director, DELNET, Delhi was appointed Chairman of the working group. The target users of the National Virtual Library (NVL) will be students, researchers, doctors, professionals, and novice users, including educationally, socially, economically and physically disadvantaged groups. The NVL can be built incorporating many modules to cater to their information needs. An immense amount of information has been digitized under several projects carried out by various agencies and government departments like MOC, HRD, DIT, CDAC, Prasar Bharati, AIR, State Governments, etc. Most of these resources are available on the web, but they are dispersed; there is no comprehensive database built for all such resources.
The true potential can only be exploited for use by the masses when the information is usefully organized and presented through user-friendly services, including multilingual services. The digital era has also brought in a 'digital divide', splitting society into digital haves and have-nots. It is no longer a matter of choice but a compulsion to find ways and means of bridging the gap between the two sections, and a virtual library will go a long way in filling this gap. Research, academics and general users from various sectors today depend more and more on digital information. Corresponding to this demand, an increasing amount of digitized data and services based on such data are being initiated. Digital information today serves as an important knowledge asset (National Virtual Library of India, 2012).

Concluding Remark
The virtual library is "a place, not a format" (Abram, 1999), and many people spend a lot of time in this virtual place (Cyber Atlas staff, 2002). Teenagers in particular prefer the Internet as an information source to traditional print sources (Lenhart, Rainie, & Lewis, 2001). Virtual libraries require connectivity: if there is no Internet connection, the virtual library is
inaccessible. Although Internet use is becoming more widespread, many people still do not have Internet access (Cyber Atlas staff, 2002). The term digital divide describes the gap between those people with access to the Internet and information technology tools and those without (Digital Divide Basics, 2002). This digital divide exists not only between countries, but within countries. Connectivity, however, is not the only concern with the use of virtual libraries. Even if students have access to virtual libraries, they may not possess the skills to access and utilize the information effectively. Hargattai (2002) noted considerable differences in people's online skills in locating particular information, and used the term second-level digital divide to describe the group of people who have access to the Internet but lack the skills to utilize online information efficiently. The mere presence of virtual libraries, however, does not cause learning to occur; it is how these libraries are utilized by students and teachers that will enable learning. Virtual libraries present a new paradigm for information access and learning. They have the ability to transform the relationship between learners and resources, facilitating both formal and informal learning. With careful design and the support of skilled information professionals, virtual libraries can provide powerful environments for student learning.

References:

1. Abram, S. (1999). Are you building your library with the right stuff? Computers in Libraries, 19(8), 80-86. Retrieved November 22, 2002, from Infotrac database.
2. Baldwin, C.M., & Mitchell, S. (1996). Collection issues and overview. Untangling the Web. Retrieved November 22, 2002, from http://www.library.ucsb.edu/untangle/ba1dwin.html
3. Bruce, B.C., & Leander, K.M. (1997). Searching for digital libraries in education: Why computers cannot tell the story. Library Trends, 45, 746-771. Retrieved January 19, 2002, from EBSCO database.
4. Cascio, W.F. (1999). Virtual workplaces: Implications for organizational behavior. In C.L. Cooper & D.M. Rousseau (Eds.), The virtual organization. Trends in Organizational Behavior, Vol. 6, pp. 1-14. Chichester: John Wiley & Sons.
5. Cyber Atlas staff. (2002). The world's online populations. CyberAtlas, INT Media Group. Retrieved November 20, 2002, from http://cyberatlas.internet.com/big_picture/geographics/article/0,,5911151151,00.html
6. Digital Divide Basics. (2002). Digital Divide Network. Retrieved November 20, 2002, from http://www.digitaldividenetwork.org/content/sections/index.cfm?key=2
7. Gapen, K.G. (1993). The virtual library: Knowledge, society, and the librarian. In L.M. Saunders (Ed.), The Virtual Library: Visions and Realities, pp. 1-14. Westport: Meckler.
8. Grothkopf, U. (2000). Astronomy libraries 2000: Context, coordination, cooperation. European Southern Observatory, Garching, Germany. Retrieved November 22, 2002, from http://www.eso.org/gen-fac/libraries/astrolib2000/astrolib2000.html
9. Hargattai, E. (2002). Second-level digital divide: Differences in people's online skills. First Monday, 7(4). Retrieved November 22, 2002, from http://www.firstmonday.org/issues/issue7_4/hargittai/index.html
10. Koganuramath, Muttayya (2007). Virtual library: An overview. Theme paper, 5th International CALIBER-2007, Panjab University, Chandigarh, 8-10 February 2007, p. 536. INFLIBNET Centre, Ahmedabad.
11. Lenhart, A., Rainie, L., & Lewis, O. (2001). Teenage life online: The rise of the instant-message generation and the Internet's impact on friendships and family relationships. Washington, DC: Pew Internet and American Life Project. Retrieved November 22, 2002, from http://www.pewinternet.org/reports/pdfs/PIP_Teens-Report.pdf
12. Marchionini, G., & Maurer, H. (1995a). Digital libraries in education: Promises, challenges and issues. Retrieved November 22, 2002, from http://www.ils.unc.edu/~march/cacm95/sub8.html
13. Marchionini, G., & Maurer, H. (1995b). How do libraries support teaching and learning? Communications of the ACM. Retrieved November 22, 2002, from http://www.ils.unc.edu/~march/cacm95/mainbody.html
14. Mohrman, S.A. (1999). The context for geographically dispersed teams and networks. Center for Effective Organizations, CEO Publication G99-5(364). http://www.marshall.usc.edu/ceo
15. National Virtual Library of India. (2012). National Mission on Libraries, Ministry of Culture, Government of India. http://www.nmlindia.nic.in/pages/display/38-national-virtual-library-of-india-(nvli)
16. Neuman, D. (1997). Learning and the digital library. Library Trends, 45(4), 687-708. Retrieved November 20, 2002, from EBSCO database.
17. Riel, M. (1998). Education in the 21st century: Just-in-time learning or learning communities. Center for Collaborative Research in Education, University of California Irvine. Retrieved November 22, 2002, from http://www.gse.uci.edu/vkiosk/faculty/riel/jit-learning/
18. Roes, H. (2001). Digital libraries in education. D-Lib Magazine, 7(7/8). Retrieved November 22, 2002, from http://www.dlib.org/dlib/july01/roes/07roes.html
19. Saracevic, T. (2000). Digital library evaluation: Toward an evolution of concepts. Library Trends, 49, 350-370. Retrieved November 22, 2002, from EBSCO database.
20. Schamber, L. (1990). Library and information services for productivity. (ERIC Document Reproduction Service No. ED 327320). Retrieved November 22, 2002, from EBSCO database.
21. Weinberger, M.I. (1997). Just in time learning with Electric Library. Library Trends, 45, 623-638.
22. Wikipedia, the free encyclopedia. (2005a). Virtual library. http://en.wikipedia.org/wiki/Virtual_library
23. Wikipedia, the free encyclopedia. (2005b). Virtual reality. http://en.wikipedia.org/wiki/Virtual_reality
24. Wikipedia, the free encyclopedia. (2005c). World Wide Web Virtual Library. http://en.wikipedia.org/wiki/World_Wide_Web_Virtual_Library
25. The WWW Virtual Library: http://vlib.org/


BLOG: A MARKETING TOOL FOR LIBRARY SERVICES

Anamika Shrivastava
Krishan Kant Yadav
Sumit Rajput

Abstract
Blogs are a simple and efficient way for librarians to stay informed and for libraries to disseminate information in a timely manner. Blogging is a new medium which gives anyone with an internet connection the chance to share whatever is on their mind. People can now update their blogs and potentially reach everyone with an internet connection. This paper discusses blogs and their benefits, especially in the field of library services, and how blogs can be used as a marketing tool for libraries.
Key Words: Blog, Types of Blogs, Benefits of Blogs, Blogging for Library Services

Introduction
Blogs are a distinct type of website or portal which shares information on specific topics or wider categories. A blog usually includes features like blog posts, videos, comments, links to other websites, widgets, etc. Blogs are simply a focused version of a website that typically has only a single subject. This allows the author to focus in detail on their area of expertise, and draw in readers who are looking for this specific content. Their goal is to entice readers to stay, and to start social conversations and interactions. Blogs work much like a journal or diary, and the most recent posts appear at the top of the page. Librarians have had to learn how to do a lot with just a little in order to promote awareness of their programs and services. They have seized the opportunities to market libraries in the real world via traditional media: newspapers, corporate newsletters, radio, and TV. Many libraries produce brochures, pathfinders, and their own newsletters. So it is no surprise to see librarians stepping up to the plate and spreading the word online with blogs. Savvy librarians have identified blogs as another means to market libraries and their services. A blog can be created by just one author or done collaboratively by a community of authors.
Blogs can be updated from several times per day to just a few times per week or month. Some blogs encourage interactivity between the writer and audience by allowing readers to post comments and questions about entries.

Literature Review
Faisal, S.L. (2009) found that, to reach users where they are, libraries should revamp their service strategies by incorporating tools like blogs and online social networks. Blogs and online social networks are two leading Web 2.0 technologies that can be adopted as part of online services in academic libraries. Blogs can be used as a library information,
publicity and feedback tool. The emergence of online social networks and their expanding user base demands immediate attention from academic libraries.
Dhiman, Anil Kumar & Sharma, Hemant (2008) considered that, along with Web 2.0, blogging is gaining popularity among library professionals. Blogging could be an efficient and effective alternative for information and knowledge transfer, resulting in a more productive workforce. Blogging is also fast becoming a cornerstone tool in the online dissemination and consumption of information, which makes it worth briefly explaining a concept that extends blogging even further: syndication, whose premise is simple enough.
Sheikh Mohd Imran (2011) investigated, as case studies, twelve national libraries that use Web 2.0, and found that the last two decades have witnessed the rapid transformation of the library in applying information technology. Blogs were the second most commonly used Web 2.0 technology in national libraries because of their benefits. Some notable advantages of blogs are that libraries can use cheap or free software, and blogs require minimal maintenance and staff time. Additionally, blogs allow library users to freely exchange ideas on different library topics in ways that traditional publications or services cannot offer.
Michelle McLean (2010) discussed how Casey-Cardinia Library Corporation (CCLC) conducted a survey of online and in-building users over a two-week period to discover their awareness and use of CCLC's five library blogs. The survey results confirmed in general the expected usage of CCLC's blogs. The main benefits of using blogs for this purpose have been the ease of use for multiple content authors and the ability to disseminate content easily to a wide range of virtual locations. The results from the survey indicate that many users were reading the blogs for the purposes for which they were created.
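The syndication concept mentioned in the literature review can be illustrated concretely: an RSS feed is simply an XML file listing a blog's recent posts, which readers and aggregators poll to pick up new headlines. Below is a minimal sketch using Python's standard library; the feed content, blog name and URLs are invented for this example.

```python
import xml.etree.ElementTree as ET

# A minimal, invented RSS 2.0 feed such as a library blog might publish.
FEED = """<?xml version="1.0"?>
<rss version="2.0">
  <channel>
    <title>Example Library Blog</title>
    <item>
      <title>New arrivals: April</title>
      <link>http://example.org/blog/new-arrivals-april</link>
    </item>
    <item>
      <title>Database workshop on Friday</title>
      <link>http://example.org/blog/db-workshop</link>
    </item>
  </channel>
</rss>"""

def headlines(feed_xml):
    """Extract (title, link) pairs from an RSS 2.0 feed string."""
    root = ET.fromstring(feed_xml)
    return [(item.findtext("title"), item.findtext("link"))
            for item in root.iter("item")]

for title, link in headlines(FEED):
    print(title, "->", link)
```

In practice an aggregator would fetch the feed from the blog's URL rather than a string, but the parsing step is the same: this is what lets a library's announcements appear automatically in its users' feed readers.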
Blog
A weblog, also called a blog, is a journal maintained by a blogger containing information that is instantly published to their blog website. Blogging is a very popular activity. "Blog" is an abbreviated version of "weblog", a term used to describe websites that maintain an ongoing chronicle of information. A blog features diary-type commentary and links to articles on other websites, usually presented as a list of entries in reverse chronological order. Many blogs focus on a particular topic, such as web design, home staging, sports, or mobile technology. Some are more eclectic, presenting links to all types of other sites. Others are more like personal journals, presenting the author's daily life and thoughts. Blogs are diverse, ranging from personal diaries to news sites that monitor developments on anything from Outer Mongolia to copyright law. Evan Williams, the creator of Blogger, describes them this way: the blog concept is about three things: frequency, brevity and personality. Today, blogs range from the simple "what's new" type of listing to sophisticated Web sites with dozens of bells and whistles: site search capabilities; topical categories; daily, weekly, and monthly archives; built-in mailing list management functions; and RSS headline syndication. Blog content ranges from the interesting and insightful to the mundane and useless.

Types of Blogs
Some popular types of blog are given below.
Niche Expert Blogs: The primary reason an author creates a niche blog is to make money. The goal is to develop content which readers find valuable, with the simplest form of blog entries being posts that are easily shared, or can be quickly read and understood. They also offer a wealth of targeted advertising opportunities, as consumers searching for reviews on, say, a specific camera model are only one step away from a purchase. Niche experts focus on a huge range of different topics, from fashion and celebrity columns to tech news and the latest gadgets. There are also many on fitness, dieting, and basically whatever subject you can imagine. Their overall goal is simply to draw users in, and profit from affiliate sales and advertising.

Once established, a niche blog can keep income coming in, which is excellent for busy people who are blogging on the side, part time.
Business Blogs: The business blog is one of the fastest growing segments in the blogosphere. Its goal is to produce high-quality content which your customers appreciate; in turn, this draws even more traffic to your business's website. The concept behind this type of blog is that you fill it with information that supports your business, and it is no surprise that many business blogs produce some of the highest-quality information on the web. They have high standards for the work they produce, as it all links back to the business's branding. This means they will only publish the best and most relevant articles, as the goal is to demonstrate their expertise in the industry, using the blog as another method of marketing in the digital age.
Professional Blogs: Professional blogs sit between a niche site and a business blog, in a mix that is a little like both. Usually the creators are writing about a topic they truly love, and the blog is an integral part of their business. Most bloggers operating a professional blog will only have one or two sites live, which is very different from niche blogs, where people usually have multiple sites running at one time. Using their blogs as their base, professional bloggers make money through a wide range of different means. Straying away from the advertising and affiliate markets, professional bloggers use their sites to promote their own courses, eBooks, subscription services, consulting or any other digital products they have produced.
Journals: Journals are a much more informal type of blog, and are usually set in a narrative style. Authors of a journal use the medium to write content on a wide range of topics. Depending on what they are blogging for, many journal-type blogs often have a large audience. It depends on the writer how often they post content, but many have daily or very regular posting schedules. Authors may put in a couple of ads to help pay for hosting costs, but their blog is normally just a place to have fun, a hobby which allows them to connect with other people and make friends.
Branding Blogs: Bloggers who use this type of blog are seeking to make a name for themselves in the industry they are working in. They have a long-term goal of making money from their blog, but they shy away from short-term advertising to get a little cash. This market is set up around building your expertise on a particular topic; perhaps you are a writer and want to create interactions with your readers to connect on a deeper level.
Promotional Blogs: These blogs are the byproduct of bloggers who are out there promoting their own work, commenting and posting about their products on other sites. This type of blog promotion works really well for writers who are selling a new book, or want to increase awareness of a unique product that will be remembered by readers. The promoters often have great insight to share with an audience, and give a unique view on their topic of expertise.
Blogs come in all shapes and sizes, and many of them will fall into more than one of these categories at the same time. There are some more types of blogs:
Schools: WordPress is a great way for teachers and students to collaborate on classroom projects.
Non-profits: Foundations, charities, and human rights groups find blogs to be great tools to raise awareness and money for their causes.
Politics: Members of parliament, political parties, government agencies, and activists use blogs to connect with their constituencies.
Military: Members of the military blog to report what they see happening in various parts of the world and to stay in touch with their families.
Private: Some people make their blogs private to share photos and information within families, companies, or schools.
Sports: Fans and writers blog about various sports.
SEO blogs: Blogs that are written for search engines instead of humans. These blogs are dedicated to trying to fool Google and other search engines into ranking them, or the sites they link to, highly. WordPress.com is not meant for this type of activity.
Affiliate marketing blogs: Blogs with the primary purpose of driving traffic to affiliate programs, get-rich-quick schemes, multi-level marketing (MLM) blogs and pyramid schemes. To be clear, people writing their own original book, movie or game reviews and linking
them to Amazon, or people linking to their own products on Etsy, do NOT fall into this category.
Warez blogs: Blogs that promote pirated copies of ebooks, software packages, music, movies, games, etc.
Automated blogs: Blogs that are generated by computers, including randomly generated blogs and blogs that re-publish press releases, marketing material, search engine results, link dumps or any other mass-produced content.
Blogging for librarians
In this digital age, as librarians move beyond their traditional work, they are becoming information providers in many ways. They need to stay up to date with current information, and blogs help librarians in many ways:
Writing a blog keeps you current. You'll want to know what's going on in the world before you start talking about it. Posting regularly to a blog encourages you to actively engage in the process of information seeking and current awareness.
Blogs are an advocacy tool. If you want change, you have to talk about it. Blogs are a great forum, not only for exposing the world to the issues facing both libraries and librarians, but also for thinking through your ideas and cultivating means of expressing them effectively.
Blogs build community. Some provide up-to-date information on local events, fulfilling their role as a news and information source for their community. Other librarians use blogs to announce new library acquisitions, promoting the services that they work so hard to provide.
Visibility - People will hear about you. People discover information online today, so if you want to be discovered and become a visible expert, then blogging and social media will provide you with that visibility. Conducting a Google search is how we find people and information today, not the library or the town square (unless you are a stalker).
Keep unique - One of the problems with librarianship is image. Stereotypes of librarians abound.
Publishing a blog is an opportunity to demonstrate your individuality and thereby work to dispel some of those pervasive myths. Even if you don't think of yourself as unique and fear being redundant, your voice is yours and yours alone, so join the chorus.
Synergize and synthesize focus - The act of writing and expressing yourself online will provide you with internal and external feedback that sharpens the intuition and focuses the mind. The benefit of a social, interactive web is that it taps into the global mind, providing a feedback loop that will sharpen and amplify your purpose and passion if you are willing to listen.
Become a champion researcher - You will also find the best ways to obtain information as you hunt down information and resources for your blog. You will identify global bloggers, resources and experts that will provide the insights and information you need to provide ongoing value to your readers.
Improve writing and video skills - Just the discipline of sitting down and writing will improve your writing skills. If you spend some time interviewing people on video, your


video skills will improve, whether that is the lighting, the structure of the set or even the sound quality.
Blog as a Marketing Tool for Library Services - Libraries have long marketed their services through channels such as corporate newsletters, radio, and TV. Many libraries produce brochures, pathfinders, and their own newsletters. It is not surprising to see libraries using blogs as marketing tools as well, which can

reposition the library into a more interactive, reachable, open, and collaborative working place which promotes the creation and sharing of knowledge, where students, faculty and the library system are the main participants.

There are dozens of ways that libraries are using blogs already. The most obvious application is for library news, which you need to be able to update frequently and easily. Blogging software helps make this job easy enough for anyone to do it. Here are other ways to use blogs to your advantage:
Promote Library Events: Create a blog that promotes library events and programs. Reach out beyond the visitors to your regular Web site. Set up an RSS feed for your blog and alert everyone in your community that they can include your headlines on their sites or can use an RSS newsreader to see what's up at the library.
Support Your Dedicated Users: An obvious hit with most library visitors is finding out what new books, videos, CDs, or DVDs have been added to the collection. Think about setting up topics on your blog for each genre: mysteries, horror, science fiction, romance, and so on. In an academic library, prepare special alerts about new resources and Web sites for particular departments or colleges.
Engage Your Community: Post new book reviews and book award lists. Invite comments and suggestions. Create an online book discussion area by asking readers to recommend books to others.
Support Your Community: Librarians are always looking for ways to offer value-added services. Can you offer a special service with the blog and reach a new audience? A local election news blog that posts announcements about candidate Web sites, nominations, and meetings might be a natural project for libraries that are mandated to make local council minutes and agendas available to the public.
Building New Ties: Are you trying to reach a new area of your community? What about offering a blog in another language to provide short entries on upcoming programs and new resources? Perhaps you are trying to reach out to teachers in order to market library services and to make sure that school visits work effectively for the library and the schools.
What about starting a blog-style newsletter that's just for teachers? You can focus on special services for teachers, programs for schools, new research resources, book lists, and seasonal Web sites of interest. Some blogs allow you to have extended entries and include feature articles.
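The RSS feed suggested above for promoting library events can be generated with a few lines of standard-library code. The sketch below builds a minimal RSS 2.0 feed of library announcements; the feed title, URLs, and item data are illustrative placeholders, not taken from any particular library system.

```python
import xml.etree.ElementTree as ET

def build_rss(title, link, items):
    """Build a minimal RSS 2.0 feed as a string.

    `items` is a list of (title, link, description) tuples.
    All names and URLs here are hypothetical examples.
    """
    rss = ET.Element("rss", version="2.0")
    channel = ET.SubElement(rss, "channel")
    ET.SubElement(channel, "title").text = title
    ET.SubElement(channel, "link").text = link
    for item_title, item_link, desc in items:
        item = ET.SubElement(channel, "item")
        ET.SubElement(item, "title").text = item_title
        ET.SubElement(item, "link").text = item_link
        ET.SubElement(item, "description").text = desc
    return ET.tostring(rss, encoding="unicode")

# Example: one announcement for an upcoming library program.
feed = build_rss(
    "Library News",
    "https://example.org/blog",
    [("Author talk on Friday", "https://example.org/blog/1", "Join us at 6 pm.")],
)
```

Serving the resulting string with the `application/rss+xml` content type is enough for most newsreaders to pick up the headlines; a production feed would normally also include dates and unique GUIDs per item.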

A great library blog requires three ingredients: inspiration, motivation, and dedication. Inspiration is that very cheerful moment when your new marketing idea meshes perfectly with a blog as the delivery vehicle. Motivation is the energy that puts good ideas into practice and helps launch the new blog. Dedication is what comes next: the hard work that keeps the blog updated with pithy, lively posts on a constant basis. Blogs can be very effective tools for reaching online audiences. Library and information professionals may enjoy the benefits of blogs for a wide variety of purposes, including publication records, the annual progress report of the library, messages to new college and university students, and many other messages and purposes.
Conclusion
Blogs have exploded on the Web because they have made it much easier to publish content online. Weblogs are an excellent way to stay current. News travels down the blogging pipelines long before it appears in print and, in many cases, in online magazines and journals. The power of the blog lies not only in the ease of publishing new content, but also in its ability to automatically archive old posts and refresh the content of the main page. By visiting the blogs of other librarians, you gain the perspective of others working in your field, confronting similar issues and exploring solutions. Librarians also experience the wonderful diversity of opinions, perspectives and personalities that make up our field. Librarians are great filters of information, and relying on a select group to provide your daily information can be a great time-saver. One very significant factor in the growth of blogs is the blogging community. Almost everyone likes to know that his or her work is read and recognized by others. Bloggers can receive almost-instant feedback as visitors and other bloggers comment on posts and explore mutual ideas.
Promoting your library's services, resources, and programs online can be a lot easier with the help of a blog.
References
1. Darlene Fichter (2003). Why and How to Use Blogs to Promote Your Library's Services. 17(6). Available at http://www.infotoday.com/mls/nov03/fichter.shtml
2. Faisal, S. L. (2009). Blogs and Online Social Networks as User Centric Service Tools in Academic Libraries: An Indian Library Experience. ICAL 2009 Library Services, pp. 488-495. Available at http://crl.du.ac.in/ical09/papers/index_files/ical-83_162_349_1_RV.pdf
3. Trivedi, Mayank (2010). Blogging for Libraries and Librarians. Library Philosophy and Practice 2010. Available at http://www.webpages.uidaho.edu/~mbolin/trivedi-blog.pdf
4. Dhiman, Anil Kumar & Sharma, Hemant (2008). Blogging and Uses of Blogs in Libraries. International CALIBER-2008, pp. 437-445. Available at http://ir.inflibnet.ac.in/bitstream/1944/1268/1/47.pdf
5. Sheikh Mohd Imran (2011). Impact and application of Web 2.0 in libraries: A case study of 12 national libraries of the developed nations. BJIS, v.5, n.2, p. 41-56, Jul./Dec. 2011. Available at dialnet.unirioja.es/descarga/articulo/4357239/2.pdf
6. Michelle McLean (2010). Evaluating Web 2.0: user experiences with public library blogs. VALA Conference. Available at http://researchbank.rmit.edu.au/eserv/rmit:3419/n2006016983.pdf?origin=publication_detail

KNOWLEDGE MANAGEMENT AND LIBRARIES OF TECHNOLOGICAL ERA
Shraddha Shahane
Manjula Chauhan
Abstract
This article explains knowledge management and its different perspectives. It describes the meaning, definition, objectives, need, components, principles, tools, strategies and process of knowledge management. It also describes the relation between information technology and knowledge management, explains the importance of knowledge management for librarians and information professionals, and discusses the barriers to knowledge management faced by librarians and information professionals.

I. Introduction
In the age of globalization and increased worldwide competition, many organizations are looking for new ways to gain competitive advantage, and they are trying to use a variety of organizational resources to do so. Today, knowledge, as an intangible asset, has taken precedence over traditional organizational resources such as capital and labor. Knowledge in an organization resides within individuals as well as in working processes; these forms are known as tacit and explicit knowledge respectively. Knowledge Management is an emerging field, much touted and hyped since the late 1990s. Knowledge Management is a complex process which deals with the creation, acquisition, packaging and application of knowledge. KM is 'an interdisciplinary field that is concerned with systematic, effective management and utilization of an organization's knowledge resources. It encompasses creation, storage, retrieval, and distribution of an organization's knowledge similar to records and information management'. KM as an emerging discipline focuses on the various management processes that facilitate finding, identifying, capturing, creating, storing, sustaining, applying, sharing and renewing knowledge to improve an organization's performance.

II. Review of Literature
1. Lee (2005) suggests that librarians and libraries in the digital and knowledge age should be in charge of knowledge management in their respective organizations in order to leverage intellectual assets and to facilitate knowledge creation. The management of information has long been regarded as the domain of librarians and libraries. Librarians and information professionals are trained to be experts in information searching, selecting, acquiring, organizing, preserving, repackaging, disseminating and serving.


2. Raja et al. (2009) suggest that information technology and systems can provide effective support in implementing knowledge management. Provision of adequate budgetary support, professional training and a proactive outlook are key factors for an effective knowledge management strategy. Knowledge Management helps library and information professionals in improving the services being rendered to their users. Information professionals have to recast their roles as knowledge professionals: they not only have to provide information but also have to acquire skills to keep themselves updated so as to cope intelligently and objectively with effective and efficient knowledge management in academic libraries.
3. Husain & Nazim (2013) note that skills in information management can be very beneficial to KM, but these are not sufficient and there is a need to acquire additional competencies in the fields of communication, human resource management, change management and project management. Based on an extensive review of literature, their study provides a theoretical foundation for further research to investigate the problems and prospects of implementing KM in libraries.
4. Asogwa (2012) observes that KM strengthens relationships and inter-networking between libraries, librarians, and users, and enables libraries to mine and extract the wealth of knowledge in their employees. Information technologies, the information explosion, multiple formats of information and changing user expectations have reshaped the work of academic librarians, transforming them from custodians of recorded human intellect to knowledge navigators; they have migrated from librarians to cyberians and knowledge engineers.

III. Meaning of Knowledge Management
Knowledge management can be understood by considering its two constituent words, knowledge and management, separately.
Knowledge - Knowledge is a product of human experience. Managing it can be defined as creating, sustaining, applying, and renewing the knowledge resources of an organization, including its relationships between seekers and service providers. Knowledge can be broadly divided into the following types:
A. Tacit knowledge - This is a complex form of knowledge with two dimensions, technical and cognitive. It is personal knowledge, held in the human mind, which is difficult to formalize and difficult to communicate.
B. Explicit knowledge - This is formal and easy to communicate to others. It is the knowledge of rationality: policies, rules, specifications and formulae. It is also known as declarative knowledge.


C. Externalized knowledge - One aspect of tacit knowledge is the cognitive dimension, which comprises beliefs, ideals, values and mental models.
D. Cultural knowledge - Cultural knowledge includes assumptions and beliefs. It is used to understand, describe and explain reality as well as conventions. It also forms the framework among organizational members for recognizing new information and evaluating alternative interpretations and actions.
Management - Management is a mental process. It can be defined as the process of coordinating the total resources of an organization towards the accomplishment of the desired goals of that organization through the execution of a group of inter-related functions such as planning, organizing, staffing, directing and controlling.
Knowledge Management - Knowledge Management is a process which deals with knowledge creation, acquisition, packaging and application or reuse of knowledge. KM is a very broad field and includes, by necessity, many people of diverse educational and experiential backgrounds. Bell (1973) describes knowledge as organized statements of facts or ideas, presenting a reasoned judgment or an experimental result, which is transmitted to others through some communication medium in some systematic form. Knowledge management has been heavily influenced by the growth and application of computer technology to data and information management. Knowledge plays an important role in the modern world of organizations. Knowledge management is a newly emerging interdisciplinary business model that places knowledge within the framework of an organization. It is a journey that moves an organization (library) from its present knowledge-chaotic environment to a knowledge-centric system (Taylor, 1999, p. 139-151). Knowledge Management involves people, technology and processes.
Among its foundations is the utilization and exploitation of the organization's skills, talents, thoughts, ideas, commitments, motivations and imagination. Peter Drucker (1993) predicted that knowledge would become the chief source of production. The concept of Knowledge Management was popularized in the business world during the last decade of the 20th century, and its applications have now spread to other organizations including government agencies, research and development, universities and others. Librarians and information professionals are trained to be experts in information searching, selecting, acquiring, organizing, preserving, repackaging, disseminating and serving. In the 21st century the library will inevitably face the new subject of Knowledge Management. It basically consists of the following four steps:
Collection of knowledge
Organization
Data protection and presentation
Dissemination of knowledge and information

IV. Objectives of Knowledge Management
The main objective of knowledge management is to ensure that the right information is delivered to the right person just in time, in order to take the most appropriate decision. Knowledge management is the systematic process of finding, selecting, organizing, distilling and presenting information in a specific area of interest. The objectives are as follows:
To promote the collection, processing, storage and distribution of knowledge.
To examine the concepts of knowledge and KM.
To determine the scope of KM in the LIS profession.
To examine the opportunities and threats for LIS professionals emerging from the rise of KM.
To identify the competencies LIS professionals require for their involvement in KM practice.
To trace the theoretical approaches and concepts cited in the studies, as well as the relationships between them.
To identify the tools and practices of knowledge management proposed in studies.
To synthesize research through a concept map.

V. Components of Knowledge Management
Some essential components of knowledge management are the following:
Treating the knowledge component of business activities as an explicit concern of the business, reflected in strategy, policy and practice at all levels of the organization.
Making a direct connection between an organization's intellectual assets, both explicit and tacit, and positive business results.
Identifying and mapping intellectual assets within the organization.
Generating new knowledge for competitive advantage within the organization.
Making vast amounts of corporate information accessible.
Sharing best practices.
Technology that enables all of the above.


VI. Principles of Knowledge Management
Thomas H. Davenport (1998) formulated ten principles of knowledge management, given below:
1. Knowledge management is expensive.
2. Effective management of knowledge requires hybrid solutions of people and technology.
3. Knowledge management is highly political.
4. Knowledge management requires knowledge managers.
5. Knowledge management benefits more from maps than models, more from markets than from hierarchies.
6. Sharing and using knowledge are often unnatural acts.
7. Knowledge management means improving knowledge work processes.
8. Knowledge access is only the beginning.
9. Knowledge management never ends.
10. Knowledge management requires a knowledge contract. (p. 43-57)

VII. Tools for Knowledge Management
Librarians should first and foremost have knowledge of the tools, skills, and competencies needed for effective knowledge management and take steps to acquire them. Key types of knowledge-related tools, effective in managing and handling information and knowledge and thereby maintaining the knowledge base of the organization, are given below:
Intranets/Extranets
Electronic Document Management
Data Analysis
Data Warehousing
Help Desk Technologies
Mapping Tools
Machine Learning
Workflow Management Systems
Groupware
Information Retrieval Tools
Metadata
Portals
Agent Technologies

VIII. Strategies
Knowledge management can help transform the library into a more efficient knowledge-sharing organization. Organizations have tried knowledge capture incentives, including making content submission mandatory and incorporating rewards into performance measurement plans. Considerable controversy exists over whether such incentives work in this field, and no consensus has emerged. Hansen et al. propose a simple framework distinguishing two opposing KM strategies:
One strategy involves actively managing knowledge (push strategy). In such an instance, individuals strive to explicitly encode their knowledge into a shared knowledge repository, such as a database, as well as retrieving knowledge they need that other individuals have provided to the repository. This is commonly known as the codification approach to KM.
The other strategy involves individuals making knowledge requests of experts associated with a particular subject on an ad hoc basis (pull strategy). In such an instance, expert individuals provide their insights to the particular person or people needing them. This is commonly known as the personalization approach to KM.
Other KM strategies and instruments include:
Storytelling (as a means of transferring tacit knowledge)
Cross-project learning
After-action reviews
Knowledge mapping (a map of knowledge repositories within a company accessible by all)
Communities of practice
Expert directories (to enable knowledge seekers to reach the experts)
Best practice transfer

Knowledge fairs

IX. Process of Knowledge Management
P. Galagan (1997) proposed a process of knowledge management, given below:
Generating new knowledge.
Accessing knowledge from external sources.
Representing knowledge in documents, databases, software and so forth.
Embedding knowledge in processes, products, or services.
Transferring existing knowledge around an organization.
Using accessible knowledge in decision-making.
Facilitating knowledge growth through culture and incentives.
Measuring the value of knowledge assets and the impact of knowledge management.

X. Information Technology and Knowledge Management
The combination of computers, databases, and telecommunications, especially the Internet, provides librarians with an incredible number of options for improving the way libraries function as organizations. To facilitate the implementation of knowledge management, a well-defined and operational knowledge management system should be in place. As companies become more explicitly reliant on effective management of their knowledge and information, the opportunities for information professionals are opening up (Abell & Wingar 2005, p.7). The latest information technology should be used in libraries: it helps to maximize the benefits of knowledge management and provides confidence to librarians. In this regard, the library director or librarian should consider himself the chief knowledge officer of the entire organization and should work together with the chief information officer and the heads of the planning department, the computer and information technology center, the human resource management department, the finance department and others to design and develop such a system. They must also be prepared to take the risk of self-promotion in competitive markets for higher-level jobs (Abell & Oxbrow 2001). Libraries can use extranets, the internet and available software programs to facilitate the capture, analysis, organization, storage and sharing of internal and external information resources for effective knowledge exchange among users. The benefits include:

Reduced service costs
Saved time for users as well as staff
Quality and quantity improvement
Improved user services
Improved customer/user satisfaction through a more professional approach to service delivery
Improved productivity
Maximized benefits from information technology
Confidence to manage and cover risks in achieving organizational goals
Reduced risks and errors
Faster and easier recovery of data and dissemination of information

XI. Knowledge Management and Libraries and Information Centers
Libraries are the backbone of information dissemination in any organization, and the different services offered by libraries are mainly designed to fulfill the goals and missions of the organization. As learning organizations, libraries should provide strong leadership in knowledge management. Raja et al. note that information technology and systems can provide effective support in implementing knowledge management; librarians should train themselves and their staff to develop appropriate knowledge management systems and use information technologies to equip libraries to provide better, faster and pinpointed services (p. 701-704). The main aim of a library is to provide the right information to the right user at the right time. Libraries should improve their knowledge management in all of the key areas of library services, and librarians play a central role in developing the necessary processes. In the digital age, libraries face challenges from both within (academia) and without (the business sector). Implementation of knowledge management enhances the traditional functions of the library. In the current digital and networked knowledge age, the size of information sources on the web is growing exponentially. The ultimate purpose of KM is to increase the effectiveness and sustainability of organizations. Libraries should use new approaches to capture web information through cooperative efforts such as Dublin Core metadata and cooperative online resource catalogues. Other new methods can also be applied, such as data mining, text mining, content management, search engines, natural language searching, the concept of yellow pages, and such technologies in information visualization as two-dimensional or three-dimensional knowledge mapping. KM injects new blood into the library culture, resulting in a sharing and learning culture. Useful websites and knowledge sources should be regularly searched for, selected from the internet and included in OPACs.
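The Dublin Core metadata mentioned above is a small, fixed vocabulary (elements such as title, creator, subject) used by OPACs and OAI-PMH harvesters. As a sketch of the idea, the snippet below builds a simple Dublin Core record with only the Python standard library; the field values are made up for illustration, and the wrapper element is a minimal stand-in rather than a full OAI-PMH envelope.

```python
import xml.etree.ElementTree as ET

# Namespace URI for simple Dublin Core (the dc: elements, version 1.1).
DC_NS = "http://purl.org/dc/elements/1.1/"
ET.register_namespace("dc", DC_NS)

def dc_record(fields):
    """Wrap a mapping of Dublin Core element names -> values in XML.

    `fields` uses plain element names such as "title" or "creator".
    """
    root = ET.Element("metadata")
    for name, value in fields.items():
        ET.SubElement(root, "{%s}%s" % (DC_NS, name)).text = value
    return ET.tostring(root, encoding="unicode")

# Example record with hypothetical values.
record = dc_record({
    "title": "Annual Report of the Central Library",
    "creator": "Central Library",
    "type": "Text",
})
```

Because every element comes from a shared, well-known vocabulary, records produced this way can be harvested and merged across libraries, which is exactly the cooperative cataloguing benefit the section describes.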

XII. Barriers to Knowledge Management in Libraries


Every library professional who works in an academic, public or special library wants to use the techniques of knowledge management to achieve organizational goals and provide better service to users, but the following barriers often prevent this:
There is no co-operation between senior and junior staff.
Lack of business knowledge.
Generally, junior staff will not share their knowledge and ideas when they feel there is no benefit in terms of salary increases.
Lack of understanding of the interplay between information and organizational objectives.
Not every library can invest in modern technology and its management.
Lack of communication skills.
Lack of staff training.
Lack of sufficient budget/funds.
Lack of tools and technologies.
Lack of a centralised policy for library cess.
Poor team and leadership skills.
Concern with external information resources rather than internal organizational knowledge assets.

XIII. Conclusion
Knowledge management is a wide, interdisciplinary field that goes beyond individual skills and qualifications to embrace the many aspects of managing a key resource. Knowledge management helps library and information professionals in improving the services rendered to their users. Information professionals have to recast their roles as knowledge professionals. Knowledge is growing very fast in every aspect of life, and it is becoming very difficult for knowledge professionals to capture and disseminate the available information to the deserving person without using the emerging technologies. The library is a knowledge-based organization, where the organization and maintenance of recorded knowledge is a practice as old as civilization itself. Now, librarians have to acquire skills to keep themselves updated so as to cope intelligently and objectively with effective and efficient knowledge management in libraries. The best knowledge creators are academics, and knowledge creation is best performed by universities. As learning and knowledge organizations, universities should empower their libraries to develop campus-wide knowledge management systems. Library and information professionals are among the best

knowledge creators. Knowledge management is a powerful tool for promoting innovation and for realizing and re-engineering the various aspects of the day-to-day activities of an organization. Therefore the utility of knowledge management in a library cannot be ignored. For this, information technology and systems can provide effective support in implementing knowledge management.

References
1. Abell, A. & Oxbrow, N. (2001). Competing with knowledge: the information professionals in the knowledge management age. London: Library Association Publishing. 33(4), p. 288. Retrieved online from www.informationr.net/ir/reviews/revs070.html on 8/12/15
2. Abell, A. & Wingar, L. (2005). The commercial connection: realizing the potential of information skills. Business Information Review, 22(3), 172-181.
3. Asogwa, B. E. (2012). Knowledge Management in Academic Libraries: Librarians in the 21st Century. Journal of Knowledge Management Practice, 13(2), 1-8. Retrieved online from www.tlainc.com/articl301.htm on 12/01/15
4. Bell, D. (1973). The Coming of Post-industrial Society: A Venture in Social Forecasting. New York: Basic Books. p. 175. Retrieved online from https://www.os3.nl/_.../daniel_bell__the_coming_of_post-industrial_soc. on 05/01/2015
5. Davenport, T. H., DeLong, D. W., & Beers, M. C. (1998). Successful Knowledge Management Projects. Sloan Management Review, 39(2), 43-57.
6. Drucker, P. (1993). Post-capitalist Society. Oxford, Great Britain: Butterworth-Heinemann. p. 1-265. Retrieved online from https://books.google.co.in/books?isbn=0262632616 on 06/01/2015
7. Galagan, P. (1997). Smart Companies (Knowledge Management). Training and Development, 51(12), 20-25.
8. Husain, S. & Nazim, M. (2013). Concepts of Knowledge Management among Library & Information Science Professionals. International Journal of Information Dissemination and Technology, 3(4), 264-269. Retrieved online from www.ijidt.com on 12/12/2014
9. Harper, R. (2013). Knowledge Management through the Lens of Library and Information Science: A Study of Job Advertisements. Library Trends. Retrieved online from www.ripublication.com on 4/12/2014
10. Kebede, G. (2010). Knowledge management: An information science perspective. International Journal of Information Management, 30, 416-424. Retrieved online from www.researchgate.net on 10/01/2015
11. Lee, H. W. (2005). Knowledge Management and the Role of Libraries. Asian Division, Library of Congress. Retrieved online from www.white-clouds.com/iclc/cliej/cl19lee.html on 27/12/2014
12. Onyancha, O. B. & Ocholla, D. N. (2010). Conceptualising 'knowledge management' in the context of library and information science using the core/periphery model. Retrieved online from www.researchgate.net on 14/12/2014
13. Patil, S. S. (2013). Knowledge Management in Libraries. International Journal of Digital Libraries and Knowledge Management, 3(2), 71-74. Retrieved online from http://www.ripublication.com on 20/01/2015
14. Rath, P. (2007). Library and Information Science Education and Skills in the Knowledge Era. p. 130. Retrieved online from www.naclin.org/pravakar%20rath.ppt on 25/01/2015
15. Raja, Md., Wasim, R., Ahmad, Md. Z., & Sinha, A. K. (2009). Knowledge Management and Academic Libraries in IT Era: Problems and Positions. Retrieved online from crl.du.ac.in on 25/01/2015
16. Sarrafzadeh, M. (2005). The implications of knowledge management for the library and information professions. Journal of Knowledge Management, 2(1). Retrieved online from www.actkm.org on 05/02/2015


17. Sarrafzadeh, Maryam (2008). The implications of knowledge management for the library and information professions. Retrieved online from https://researchbank.rmit.edu.au/eserv/rmit:13384/Sarrafzadeh.pdf on 15/12/2014
18. Sharma, A. K. (2010). Knowledge management and new generation of libraries information services: A concept. International Journal of Library and Information Science, 1(2), 24-30. Retrieved online from http://www.academicjournals.org/ijlis on 15/12/2014
19. Taylor, R. M. (1999). Steps on the Path of Knowledge. Australian Library Journal, 49(2), 139-151.
20. en.wikipedia.org/wiki/Knowledge_management
21. www.academicjournals.org/ijlis


Section III

IT Applications in Management


A STUDY OF GLOBALIZATION AND ITS IMPACT ON MANAGEMENT EDUCATION IN INDIA

Sourabh Jain

Abstract

In the present scenario, globalization has a multi-dimensional impact on the system of management education. It promotes and provides new tools and techniques in different areas (such as e-learning, flexible learning, distance education programs, and overseas training). Globalization will mean many different things for education. In the near future, it will mean a more competitive and deregulated educational system. Globalization has also created awareness of jobs for skilled management professionals. According to the Economic Survey 2013-2014, India is going to be the youngest nation with the largest workforce in the world, which gives it the potential to become an economic superpower; the National Youth Policy created in this context focuses on education, employment skills, entrepreneurship and other areas. Future technologies place management education on the cusp of a revolutionary change in meeting the growing requirements of industry. In India, management education is a very important part of the education system. After the deregulation of management education in 1991, there has been rapid growth in management educational institutions owing to the positive aspects of Liberalization, Privatization and Globalization (LPG). Several management institutes provide management education in India, but only a few have been accredited by international accreditation councils to international standards.

Keywords: Globalization, Management Education

Introduction

The term 'globalization' means integration of economies and societies through cross-country flows of information, ideas, technologies, goods, services, capital, finance and people. Cross-border integration can have several dimensions: cultural, social, political and economic. In fact, some people fear cultural and social integration even more than economic integration.
Globalization is the tendency of businesses, technologies, or philosophies to spread throughout the world, or the process of making this happen. The global economy is sometimes referred to as globality, characterized as a totally interconnected marketplace. The spread of restaurant chains around the world is an example of globalization; the fact that they adapt their menus to suit local tastes is an example of glocalization, a combination of globalization and localization. Management education is definitely changing the work culture of business: through management studies, students develop their personality, knowledge, communication skills and decision making, which are essential for organizational growth as well as individual growth.

Historical Development

Nothing is permanent; only change is permanent. Globalization is a feature of a changing world. It is no longer a recent phenomenon in the world, and since India

is a major player of the twenty-first century, we are facing its socio-economic impacts. Initial enthusiasm for globalization as a beneficial set of processes has yielded to an understanding that the phenomenon is largely associated with increasing social inequality within and between countries, as well as instability and conflict. Globalization has been a historical process. During the pre-World War I period of 1870 to 1914, there was rapid integration of the economies in terms of trade flows, movement of capital and migration of people. The growth of globalization was mainly led by the technological forces in the fields of transport and communication. Indeed, there were no passport and visa requirements, and very few non-tariff barriers and restrictions on fund flows. The globalization process was slow between the First and the Second World Wars. After World War II, all the leading countries resolved not to repeat the mistakes they had committed previously by opting for isolation. Although after 1945 there was a drive to increase integration, it took a long time to reach the pre-World War I level. In terms of percentage of exports and imports to total output, the US could reach the pre-World War I level of 11 per cent only around 1970. Most of the developing countries like India, Pakistan, Bangladesh, and Sri Lanka, which gained independence from colonial rule in the immediate post-World War II period, followed an import substitution industrialization regime. The Soviet bloc countries were also shielded from the process of global economic integration. However, times have changed. In the last two decades, the process of globalization has proceeded with greater vigour. The former Soviet bloc countries are getting integrated with the global economy. More and more developing countries are turning towards an outward-oriented policy of growth. Yet, studies point out that trade and capital markets are no more globalized today than they were at the end of the 19th century.
Nevertheless, there are more concerns about globalization now than before because of the nature and speed of transformation. The business sector in India is highly promising in the present scenario. The impact of globalization has changed business procedures in India in terms of psychology, methodology, technology, mindset, work culture, etc. Newer challenges and newer opportunities appear day by day before Indian industries, and they are profitable and promising. The fundamental scope of doing business in India lies with its people. The huge population of India has created a large unsaturated market of consumers. This is one of the reasons why global companies are very much interested in doing business in India. In the post-globalization era this scope has increased immensely for global multinational companies, as the Government of India has also played a very crucial and supportive role in this respect through liberalized policies and legislative structure.

Management Education in India

Education is an essential component of the social and economic development of a nation. India has one of the largest education sectors in the world. According to the annual report (2013-2014) published by the Ministry of Human Resource Development, there were 44 Central Universities, 130 Deemed Universities and 500 Colleges at the time of independence. At present, there are 504 Universities and university-level institutions (as on 30.09.2014): 322 State Universities, 192 State Private Universities, 45 Central Universities, 128 Deemed Universities, 33 institutions of national importance established under Acts of Parliament, and five institutions established under various State legislations. In 2008-2009 the number of institutes increased at an exceptional rate, and it can thus be called the golden year in respect of the establishment

of institutes. In the last five years the number of AICTE-approved colleges has increased across disciplines, and the number of management institutes has also grown.

IIMs & B-Schools: IIM Calcutta and IIM Ahmedabad were established in 1961 and 1962, respectively. IIMC tied up with the Alfred P. Sloan School of Management (MIT), the Government of West Bengal and the Ford Foundation. IIMA, in its initial years, tied up with the Harvard Business School. Many other management institutions followed: the third IIM was established in Bangalore in 1973, and the fourth IIM came up in Lucknow. Many private bodies opened numerous management institutions across the country, and thereafter there was an immense rush in the number of B-Schools. Government-recognized organizations were also established to control and maintain standards in the sector. (i) A Standing Finance Committee (SFC) proposal of Rs. 97.00 crore for the Fellow Programme in Management (FPM) in the IIMs at Ahmedabad, Bangalore, Calcutta, Lucknow, Indore and Kozhikode has been finalized and funds were released; and (ii) four Regional Centres were established in IIM Bangalore, Lucknow, Indore and Kozhikode under the NMTT Programme for imparting training to faculty and administrative heads of various institutes.

All India Council for Technical Education (AICTE)

The All India Council for Technical Education (AICTE) was established in November 1945; it now functions under the Department of Higher Education, Ministry of Human Resource Development. AICTE is responsible for proper planning and co-ordinated development of the technical education and management education system in India.
It is a national-level government body that gives approval to technical and management programmes and institutions. The National Board of Accreditation (NBA) was set up by AICTE in September 1994 in order to assess the qualitative competence of educational institutions from the Diploma level to the Post-graduate level in Engineering and Technology, Architecture, Pharmacy, Town Planning and Management. The Government of India (Ministry of Human Resource Development) constituted a National Working Group to look into the role of AICTE in the context of the proliferation of technical institutions, maintenance of standards and other related matters. The Working Group recommended that AICTE be vested with the necessary statutory authority for making it more effective, which would consequently require restructuring and strengthening with necessary infrastructure and operating mechanisms. The Council is a 51-member body and has a Chairman, a Vice-Chairman and a Member Secretary with tenure appointments. The details of the approved programs/institutions and intakes for the year 2013-14 (up to October 2013) are summarized below: the Council granted approval to 171 institutions in the year of reporting, with an additional intake of 14,898 in the various technical/management courses.

Growth of Management Institutions

Year                    Management institutions    Added in year
2008-2009               1523                       1345
2009-2010               1940                       1131
After 2010, till 2014   4604                       3473

Source: www.aicte-india.org

NAAC: The National Assessment and Accreditation Council (NAAC) was established in 1994 with its headquarters at Bangalore. NAAC is an autonomous body established by the University Grants Commission (UGC) of India to assess and accredit institutions of higher education in the country. AIMS: The Association of Indian Management Schools (AIMS), founded in 1988, is a non-profit professional organization. The structure of management education in India is divided into major divisions as outlined hereunder: institutions of national importance and university departments; colleges affiliated to the universities; non-university autonomous institutions; distance/correspondence-based institutions; and un-affiliated institutions.

Some Important Facts about Indian Management Education

Merely 10% of graduates from business schools manage to get hired by corporate India. Campus recruitments have gone down by 40% in the same period. Only 5% of undergraduate students in the country want a master's degree after graduation. Some students are time-bound, as they already know what they want: their priorities are to get a job at 22, get married by 24 and get on with the job of starting a family by 25. About 2,00,000 students choose to go abroad every year. Brain drain in higher education is also one of the reasons for the drop in enrolment for management education in India. Lack of good placements is another reason that students are losing interest in higher education.

Review of Literature

Globalization describes the broadening and strengthening of world links which have taken place progressively since World War II, and have now reached a stage where almost no one is completely untouched by events originating outside their own country and where international constraints increasingly restrict independent national action (Stewart, 1996). He has analyzed the links between education and globalization.
The growth of globalization has increased the opportunities for countries which have good levels of education, but it is difficult for countries with weak levels of education. It is difficult even for developed

countries unless they invest in good education. It is widely argued that the nature of contemporary globalization is best viewed as a multifaceted rather than a singular condition, and that it is associated with various consequences at the economic, political and socio-cultural levels, playing an important role in formulating policies and qualitative research. Secondly, it is argued that the knowledge and information revolution associated with globalization has created both a positive climate and challenges for comparative education. Education helps to enrich human lives, to empower people and thereby to raise human well-being (Stewart, 1996). He also discussed two types of causal chain: first, how education enhances human capacity, and secondly, the ways in which some features of globalization have exacerbated educational conditions, making it difficult for such countries to succeed in the global economy. The last ten years have seen massive expansion of Master of Business Administration (MBA) provision around the world, with virtually every university business school having one programme and some having more than one. Blass and Weight (2005) mentioned that the MBA is positioned as a qualification that is plagued by market confusion as to what it actually represents and what its value is. A pre-emptive post-mortem is carried out into the future of the MBA and of the future senior manager/leader, which highlights the gap between research and practice, league tables, e-learning, and attempts at internationalization as some of the causes of the current malaise. Quality of education worldwide in general, and specifically in India, has suffered drastically due to massive expansion. Gupta (2007) identified various reasons for the decline in standards: lack of appropriate infrastructure, shortage of adequately qualified faculty, compromise in research activities and the conversion of educational institutions into factories.

Challenges/Issues of Management Education

1. Traditional setup
2. Industry-institute interaction
3.
Entrepreneurship training
4. Quality of faculty
5. Cost of education
6. Global competition

SWOT Analysis of the Indian Management Education System

Strengths: The Indian education system moulds growing minds with a huge amount of information and knowledge, and gives greater exposure to subject knowledge. Indians are rich in theoretical knowledge. India has abundant resources and manpower (NASA, MAC). The cost of education is very low. The number of management education institutions in India is large compared to developed countries.

Weaknesses: The weaknesses of the Indian management education system are: lack of adequate upgradation of the curriculum; no nationwide benchmark, common course content or common exam procedure; lack of specialized or modular courses; learning considered a one-step process; exam-oriented education; no fixed parameters; lack of industry-institute interaction; rigidity in the curriculum; lack of multidisciplinary courses; the role of the teacher confined to teaching alone; and lack of policy makers.

Opportunities: India is rich in human as well as physical resources and has a large number of management education institutions; it can therefore produce a large number of outputs with high managerial skill. With more autonomy, the curriculum can be made more realistic, practically biased and job-oriented. Students will be regarded more as customers. The system can provide highly skilled managerial labour to the country.

Threats: Similarly, the threats to the Indian management education system are: lack of interest and interaction from industry in developing and collaborating in the research field; the threat from within of deteriorating standards of education due to a lack of benchmarks for the quality of institutions; loss of quality standards by management institutions as more and more students opt for education abroad; lack of teamwork; and the attitude of people who fail to work collectively on a common platform.

Impacts of Globalization on Management Education

There are both positive and negative impacts of globalization on management education. The positive impacts of globalization are: global sharing of knowledge, skills, and intellectual assets that is necessary for multiple developments at different levels; and mutual support, supplement and benefit to produce synergy for the various developments of countries, communities and individuals.
Further positive impacts are: creating values and enhancing efficiency through such global sharing and mutual support while serving local needs and growth; promoting international understanding, collaboration, harmony, and acceptance of cultural diversity across countries and regions; and facilitating communications and interactions, and encouraging multi-cultural contributions at different levels among countries. The negative impacts of globalization are: increasing the technological gaps and digital divides between advanced countries and less developed countries; creating more legitimate opportunities for a few advanced countries for a new form of colonization of developing countries; and increasing inequalities and conflicts between areas and cultures, while promoting the dominant cultures and values of some advanced areas.

Suggestions

The study focused on the impact of globalization on management education and accreditation systems in India. From the extensive literature survey, this paper has drawn valuable

suggestions to improve the quality standards in management education in India. Global management education is a highly challenging market. Internationalization of business has forced institutions to build up managerial talent with a global orientation. Management education should bring about the acquisition of a portfolio of competencies. Every executive needs a global mindset to face competition from anywhere and in any form. To provide this kind of global knowledge, the Indian management education system has to improve its quality standards in line with the global market. The norms for setting up institutes and universities have been prescribed by regulatory bodies such as the UGC and AICTE in terms of physical and academic infrastructure and human resources, but there is no separate regulatory body for management institutions, and common assessment criteria are used for accrediting both technical and management education institutions. The formal systems of accreditation offered by the NBA and the NAAC have to improve, and it is suggested that they follow separate criteria for management education. The government has to focus on quality education from these institutions and should also establish a strong body to regulate the institutions so that they maintain standards and offer quality education.

Conclusions

The quality of management education in India must be viewed as a critical issue for social and international development and economic growth. The objective of this study was to provide a small contribution towards identifying the impact of globalization on management education through a conceptual framework. This paper gives emphasis to the theoretical and conceptual references which play an important role in understanding the various blockades to providing quality management education in India. India is witnessing a new era in the field of management education. Many corporate groups like Reliance, Nirma, Tata, Starlight, etc. have promoted management institutes.
Some foreign universities are also coming to India, but the government should issue guidelines so that the fee structure remains within certain limits and those who are from economically weaker backgrounds also have some opportunity.

References

1. AICTE (2004). Report of the High Power Committee for Mobilization of Additional Resources for Technical Education. All India Council for Technical Education, New Delhi.
2. Government of India (2007). Approach Paper to the Ninth Five Year Plan: 1997-2002. Planning Commission, New Delhi.
3. Rani, Geetha, P. (2008). Financing Education in India in the Economic Reform Period: Focus on Intra-sectoral Allocation of Resources to Education. In Globalization and Challenges of Education, NIEPA, 2006.
4. Dr. C. Rangrajan, Chairman, Economic Advisory Council to the Prime Minister.
5. Patil & Pudlowski. The Globalization of Indian Economy: A Need for Internationalization of Higher Technical Education.
6. Sagar, K.V. Globalization of Education.
7. Reports of UGC, AICTE & Ministry of HRD, Government of India.
8. Sharma, Jayant, Saini, Kanti Mohan, & Joshi, Mahesh Chandra (2012). Globalization: Its Impact on Management Education in India. Pacific Business Review International, 5(2), July 2012.
9. Jain, Smrita, Jain, Vibhor, & Bhardwaj, Sachin (2011). Impact of Globalization on Management Education in India. IJRFM, 1(5), September 2011. ISSN 2231-5985.
10. Banad, Mahadevi S. & Talawar, Mahadev. Impact of Globalization on Indian Technical Education System. Dept. of Electrical & Electronics Engg., B.V.V.S. Polytechnic (Autonomous), Bagalkot, Karnataka, India.
11. Subrahmanyam, A. & Raja Shekhar, B. (2014). Globalization and Management Education in India: A Framework for Analysis. International Journal of Managerial Studies and Research (IJMSR), 2(3), April 2014, pp. 20-26. ISSN 2349-0330 (Print), ISSN 2349-0349 (Online). www.arcjournals.org
12. Gupta, S. (2007). Professional Education in India: A Case Study of Management Education in NCR. IME, Sahibabad, Ghaziabad.


KNOWLEDGE MANAGEMENT AND ITS UTILIZATIONS

Mohd Ateek

Abstract

The paper tries to cover different aspects of knowledge management (KM) and its applications. The relevance of KM for organizations finds mention in this paper, which states that KM is a cross-disciplinary domain, and the value of knowledge for an organization is described in detail. The paper explores the developing context of knowledge management and highlights some emerging implications of value-added, knowledge-based information in the global information society. Lastly, in the 21st century the library will inevitably face the new subject of knowledge management. The paper profiles the different types of knowledge initiatives with examples (Kumar Rajesh, 2008).

Key words: Knowledge Management (KM), Knowledge Sharing (KS)

Introduction

Knowledge Management (KM) is the hottest topic of the day. KM is concerned with developing the organization in such a manner as to derive knowledge from information. Knowledge exists in the minds of people, not in technology. Technology can capture information, but it cannot convert information into knowledge. Only human beings can convert information into knowledge, when it is read and assimilated after retrieval. Useful technologies for this phase of the KM process include statistical analysis software, data mining tools, decision support systems, artificial intelligence, and visualization tools. According to Beynton, the four steps in getting started with KM are: making knowledge visible; building knowledge intensity; developing a knowledge culture; and building knowledge infrastructure. The four steps are interdependent, in that embarking on one without the others will hinder the acceptance and success of knowledge management as a major organizational focus.
Some well-run organizations have been doing these four steps for many years, while others are beginning to recognize their importance and the extent to which they need to be integrated with how work gets done. KM is not owned by any one group in an organization, nor by any one profession or industry. But if library and information specialists want to be key persons in the emerging KM phenomenon, they need to understand the multiple perspectives of the other players. The aim of KM in libraries is to promote knowledge exchange among library staff and users, to strengthen knowledge internetworking and to quicken knowledge flow (Amudhavalli, A., 2008). The focus of knowledge management is on doing the right thing instead of doing things right. It provides a framework within which the organization views all its processes as knowledge processes, and all business processes involve the creation, dissemination and application of knowledge towards organizational sustenance and survival.
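As a toy illustration of the retrieval side of this process, the kind of task the statistical analysis and data mining tools mentioned above support, here is a minimal inverted index in Python. It is a sketch only; the class and method names are invented for this example and do not come from any system discussed in the paper:

```python
from collections import defaultdict

class KnowledgeIndex:
    """Toy inverted index: maps each word to the documents containing it,
    so stored information becomes findable (a precondition for knowledge)."""

    def __init__(self):
        self.index = defaultdict(set)  # word -> set of document ids
        self.docs = {}                 # document id -> full text

    def add_document(self, doc_id, text):
        """Register a document and index every word it contains."""
        self.docs[doc_id] = text
        for word in text.lower().split():
            self.index[word].add(doc_id)

    def search(self, word):
        """Return the ids of all documents mentioning the word."""
        return sorted(self.index.get(word.lower(), set()))

idx = KnowledgeIndex()
idx.add_document("d1", "Knowledge management in academic libraries")
idx.add_document("d2", "Data mining tools for knowledge acquisition")
print(idx.search("knowledge"))  # ['d1', 'd2']
print(idx.search("libraries"))  # ['d1']
```

Real library systems use far richer indexing (stemming, ranking, metadata), but the core idea, that technology makes captured information retrievable while humans supply the interpretation, is the one the paper emphasizes.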

Every few years, a new technological development or management philosophy captures the attention of strategic thinkers in organizations, and today knowledge management is the hottest subject of the day. The question is: what is this activity called knowledge management, and why is it so important to each and every one of us? First there was total quality management, and then Business Process Reengineering. There is no doubt that the last couple of years have seen a surge of interest in knowledge management, and also in the internet. The knowledge economy is a new concept that has appeared worldwide in recent years. In knowledge management, management refers to effectively identifying, acquiring, developing, resolving, using, storing and sharing knowledge, and to creating an approach to information and knowledge. People's demands for information and knowledge are increasing step by step, and this has provided a good environment for library management. How to manage knowledge will become an important subject facing libraries. Knowledge management in libraries should be focused on effective research and development of knowledge, creation of knowledge bases, and exchange and sharing of knowledge between library staff and users (Kumar Rajesh, 2008).

Key to Knowledge Management in Libraries

Information technology is a tool for knowledge management in libraries. Knowledge acquisition is the starting point of knowledge management in libraries, and the application of information technology enlarges the scope of knowledge acquisition. The human brain alone does not suffice to accomplish such an important task in modern society, where knowledge changes with each passing day.
Only a closely knitted link of knowledge resources and workers by computer networks, constructing knowledge networks in libraries, is essential for the making of the knowledge warehouse of libraries.

Knowledge Creation: Knowledge creation revolves around the activities that result in the conversion of knowledge. The process of conversion involves creating tacit knowledge through sharing, moving from tacit knowledge to explicit knowledge, enhancing explicit content by combining codified knowledge, and using explicit knowledge to create new tacit knowledge through thinking and sharing.

Knowledge Sharing: Sharing knowledge requires a different kind of environment, a human and information system, to reduce the knowledge gap. Knowledge sharing requires a different set of tools and a mind-set that appreciates the following:
1. Knowledge/learning is by people, i.e., it is a human activity.
2. Thinking creates knowledge.
3. Knowledge is created as it gets used and is dynamic; it flows through organizations and communities in many ways.
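The "knowledge warehouse" and sharing ideas above can be sketched as a tiny repository that codifies contributions and retrieves them by tag. This is purely illustrative; the class, its methods and the sample entries are hypothetical, not part of any system the paper describes:

```python
class KnowledgeRepository:
    """Toy knowledge repository: stores codified (explicit) knowledge items
    with tags so that they can be retrieved and shared across an organization."""

    def __init__(self):
        self.items = []

    def add(self, author, content, tags):
        """Codify a contribution, recording who shared it and how it is tagged."""
        self.items.append({"author": author,
                           "content": content,
                           "tags": set(tags)})

    def by_tag(self, tag):
        """Improve knowledge access: list every item carrying a given tag."""
        return [item["content"] for item in self.items if tag in item["tags"]]

repo = KnowledgeRepository()
repo.add("librarian-1", "How to configure the OPAC proxy", ["opac", "how-to"])
repo.add("librarian-2", "Vendor contacts for e-journal renewals", ["e-journals"])
print(repo.by_tag("how-to"))  # ['How to configure the OPAC proxy']
```

A real repository would add full-text search, access control and versioning; the point here is only the store, tag and retrieve cycle through which tacit know-how becomes shareable explicit knowledge.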


Definitions: Knowledge management enables the creation, communication and application of knowledge of all kinds to achieve goals. Knowledge management is a conscious strategy of getting knowledge to the right people at the right time. Knowledge management provides access to experience, knowledge and expertise that create new capabilities, enable superior performance, encourage innovation, leverage the existing information and knowledge assets of the organization, facilitate information and knowledge dissemination across boundaries, and integrate information and knowledge into day-to-day business processes. Knowledge management involves technology and principles that serve to promote a learning environment; it covers the identification and analysis of available and required knowledge, and the subsequent planning and control of actions to develop knowledge assets so as to fulfil organizational objectives. In simple terms, knowledge management means the management of knowledge.

Objective: Knowledge innovation is the core of the knowledge economy society. As bases for the collection, processing, storage and distribution of knowledge and information, libraries represent an indispensable link in the scientific research process. Library work is a component of knowledge innovation; libraries act as bridges for turning the results of knowledge innovation into realistic productive forces. The aim of knowledge management in libraries is to promote knowledge exchange among library staff and users, to strengthen knowledge internetworking and to quicken knowledge flow. In the knowledge economy era, libraries will carry out research on the development and application of information resources, the construction of virtual libraries, and the protection of intellectual property rights in the electronic era, thus founding the base for knowledge innovation (Kumar Rajesh, 2008). The objectives of knowledge management include the following: to create knowledge repositories, which store both knowledge and information, often in documentary form;
To improve knowledge access, or to provide access to knowledge and to facilitate its transfer among individuals; to enhance the knowledge environment so that it is conducive to more effective knowledge creation, transfer and use; and to manage knowledge as an asset and to recognize the value of knowledge to an organization (Majeed, K.C., 2008).

Features of KM: The following are the important features of KM:
1. KM should not be thought of as being about people only but about technology also; therefore the human resources department and the technology department should drive KM together.


2. KM implements the concept of sharing information and expertise, by which employees not only share their knowledge but also make it available to the entire organization.
3. Knowledge is a very different kind of asset from assets such as property and equipment, and at the same time people are always reluctant to change. It is observed that for many workers and managers, knowledge has become a source of authority; therefore they may not like to share it. This can be a stumbling block when introducing KM in any organization.
4. KM is a subject that accepts intellectual capital as the main management asset.
5. KM provides an environment and opportunities for learning while doing.

Components of KM: KM basically involves the following three components:
1. People, who create, share and use knowledge.
2. Processes, which acquire, organize, share and transfer knowledge.
3. Technology, the enabler and facilitator that provides access to KM (Amudhavalli, A., 2008).

Information Technology: a Tool for Knowledge Management in Libraries. Knowledge acquisition is the starting point of knowledge management in libraries. The application of information technology enlarges the scope of knowledge acquisition, raises knowledge acquisition speed and reduces knowledge acquisition cost. Unaided acquisition is impossible in a society in which knowledge changes with each passing day. It becomes possible to link knowledge sources and knowledge workers closely by computer networks, thus constructing knowledge networks in libraries and realizing single-point access to information. The knowledge acquired must be accumulated and converged into the knowledge warehouse of libraries. The priority of information technology in the field of knowledge storage finds expression not only in quantity, but also in the retrieval, sorting and security of the knowledge. Information technology is also indispensable in the application and exchange of knowledge and other fields.
It functions as a source and tool for knowledge innovation (Kumar Rajesh, 2008).

Conclusion: Knowledge management is not owned by any one group in an organization, nor by any one profession or industry. Knowledge management requires a holistic and multidisciplinary approach to management processes and an understanding of the dimensions of knowledge work. It is an evolution of good management practice. If librarians want to be key players in the emerging knowledge management phenomenon, they have to understand the multiple perspectives of the other players. Knowledge management occupies a very prominent position in the creation of knowledge innovation systems. To meet the challenges of the knowledge economy and enter the knowledge age of the 21st century, libraries should work together with IT professionals and others to develop appropriate knowledge management systems, and knowledge services based on high-speed information networks should be carried out (Majeed, K.C., 2008).

References


1. Second International CALIBER Conference, Feb 11-13, 2004, Jamia Millia Islamia University, New Delhi, pp. 588-589 & 591-593.
2. Kutty, M. Bava and Majeed, K.C. (2008). Future Librarianship in Knowledge Society. Ess Publications, New Delhi, pp. 28 & 32.
3. Amudhavalli, A. (Ed.) (2008). Dynamics in Digital Information Systems. Ess Publications, New Delhi, pp. 69-70 & 151-152.
4. Kumar, Rajesh (2008). Recent Technological Trends in Management and Library Systems: Issues and Challenges. Wisdom Publications, pp. 320-322.


MANAGING KNOWLEDGE AND ITS IMPLICATIONS
Meenakshi Shrivastava
Abstract
To achieve success in the hypercompetitive and increasingly complicated global economy, every manager should know what knowledge is and when, where and how to use that knowledge in the best possible way. This paper provides a review and interpretation of the knowledge management literature in different fields with an eye toward identifying important areas for research. It also presents a detailed process view of organizational knowledge management with a focus on the potential role of information technology in this process. It is framed in terms of the organizational characteristics and managerial practices required to establish an effective knowledge transfer process in an organization. The purpose of this paper is to provide an analytical survey of the way previously established theories are used as a foundation for knowledge management.
Key Words: Knowledge Management, Knowledge Management Process, Organizational Knowledge Management
Introduction
Knowledge management is all about delivering the right knowledge to the right person at the right time. This in itself may not seem so complex, but it implies a strong tie to corporate strategy, an understanding of where and in what forms knowledge exists, creating processes that span organizational functions, and ensuring that initiatives are accepted and supported by organizational members. Knowledge management may also include new knowledge creation, or it may solely focus on knowledge sharing, storage, and refinement. This does not mean that knowledge management is only for corporate or strategic use; it also helps a person in his or her daily life. It is important to remember that knowledge management is not about managing knowledge for knowledge's sake; the overall objective is to create value and to leverage, improve, and refine the firm's or individual's competences and knowledge assets to meet organizational goals and targets.
Knowledge management is a field that arose rapidly out of practical intellectual demands on management. Within a very short period, virtually every executive was characterizing their most important responsibility as 'leveraging organizational knowledge'. Understanding how pre-existing theories have been used to build knowledge management matters because these theories substantiate and legitimate the field as a field of science. Theories harmonize research aims that justify methods used, in turn, to justify the theories themselves. If knowledge management theories had emerged solely from artificial intelligence theories, then the legitimacy of the newer field would be based solely on the legitimacy of the older. If instead knowledge management theories emerged from a broad range of other fields, its legitimacy as a field of science would be broader and stronger. The following sections trace the evolution of the term 'knowledge management' with regard to its definitions and the kinds of knowledge. Finally, the last two sections offer a brief discussion

on the relationships among the new knowledge management theories and some conclusions.
History
A number of management theorists have contributed to the evolution of knowledge management, among them Peter Drucker, Paul Strassmann, and Peter Senge in the United States. Drucker and Strassmann have stressed the growing importance of information and explicit knowledge as organizational resources, and Senge has focused on the "learning organization," a cultural dimension of managing knowledge. The concept of knowledge management started in the 1970s. The 1980s saw the development of systems for managing knowledge that relied on work done in artificial intelligence and expert systems, giving us such concepts as "knowledge acquisition," "knowledge engineering," "knowledge-based systems," and computer-based ontology. The phrase "knowledge management" entered the lexicon in earnest. To provide a technological base for managing knowledge, a consortium of U.S. companies started the Initiative for Managing Knowledge Assets in 1989. By 1990, a number of management consulting firms had begun in-house knowledge management programs, and several well-known U.S., European, and Japanese firms had instituted focused knowledge management programs. Knowledge management was introduced in the popular press in 1991, when Tom Stewart published "Brainpower" in Fortune magazine. By the mid-1990s, knowledge management initiatives were flourishing, thanks in part to the Internet. The International Knowledge Management Network (IKMN), begun in Europe in 1989, went online in 1994 and was soon joined by the U.S.-based Knowledge Management Forum and other KM-related groups and publications. Knowledge management efforts have a long history, including on-the-job discussions, formal apprenticeship, discussion forums, corporate libraries, and professional training and mentoring programs.
With the increased use of computers in the second half of the 20th century, specific adaptations of technologies such as knowledge bases, expert systems, group decision support systems, intranets, and computer-supported cooperative work have been introduced to further enhance such efforts. In the enterprise, early collections of case studies recognized the importance of the knowledge management dimensions of strategy, process, and measurement. In short, knowledge management programs can yield impressive benefits to individuals and organizations if they are purposeful, concrete, and action-oriented; organizational learning processes are essential to the success of a knowledge management strategy; and measurement, benchmarking, and incentives are essential to accelerate the learning process and to drive cultural change.
Research
KM emerged as a scientific discipline in the early 1990s. It was initially supported solely by practitioners: Skandia hired Leif Edvinsson of Sweden as the world's first Chief Knowledge Officer (CKO), and Hubert Saint-Onge (formerly of CIBC, Canada) had started investigating KM long before that. The objective of CKOs is to manage and maximize the

intangible assets of their organizations. Gradually, CKOs became interested in practical and theoretical aspects of KM, and the new research field was formed. Since its establishment, the KM discipline has been gradually moving towards academic maturity. First, there is a trend toward higher cooperation among academics; in particular, there has been a drop in single-authored publications. Second, the role of practitioners has changed: their contribution to academic research declined dramatically from 30% of overall contributions up to 2002 to only 10% by 2009 (Serenko et al. 2010). A broad range of thoughts on the KM discipline exists; approaches vary by author and school. As the discipline matures, academic debates have increased regarding both the theory and practice of KM, including the following perspectives:
Techno-centric: a focus on technology, ideally technologies that enhance knowledge sharing and creation.
Organizational: a focus on how an organization can be designed to best facilitate knowledge processes.
Ecological: a focus on the interaction of people, identity, knowledge, and environmental factors.
Regardless of the school of thought, the core components of KM roughly include people, processes and technology (or culture, structure and technology), depending on the specific perspective (Spender & Scherer 2007). The practical relevance of academic research in KM has been questioned (Ferguson 2005), with action research suggested as having more relevance (Andriessen 2004), along with the need to translate the findings presented in academic journals into practice (Booker, Bontis & Serenko 2008).
Definition of Knowledge Management
Understanding knowledge management requires an understanding of knowledge and the knowing process, and how these differ from information and information management. Knowledge management (KM) is an effort to increase useful knowledge within the organization.
Improvements in knowledge management promote those factors that lead to superior performance: organizational creativity, operational effectiveness and quality of products and services (Wiig, 1993, p. xv). One concept regards knowledge-base management within the field of expert systems. The other concept regards knowledge management as an organizational resource, a usage appearing as early as 1989 in the management literature (Adler, 1989). A working definition of this broader view of organizational knowledge is 'information embedded in routines and processes which enable action'. Knowledge is a fluid mix of framed experience, values, contextual information, and expert insight that provides a framework for evaluating and incorporating new experiences and

information. In organizations, it often becomes embedded not only in documents or repositories but also in organizational routines, processes, practices, and norms (Davenport & Prusak, 1998, p. 5). Knowledge is also used to 'describe a particular situation or condition', and is distinguished from information by the addition of 'truths, beliefs, perspectives and concepts, judgments and expectations, methodologies, and know-how'. Knowledge has been variously analyzed at different levels of abstraction for the purpose of management. One simple analysis distinguishes information (know-what) from combinational skill (know-how) (Birkett, 1995; Kogut & Zander, 1997), useful for differentiating basic management techniques for passive, stored knowledge from those best suited to active knowledge. Another distinguishes professional knowledge from firm-specific knowledge (Tordoir, 1995), important for determining whether to 'make' or 'buy' knowledge. Yet another distinguishes scientific, philosophical, and commercial knowledge (Demarest, 1997), useful for managing the goals of the knowledge production process, each type embodying different goals such as knowledge conventions, truth, and effective performance. These hazy distinctions can create boundary problems beyond the concept of knowledge itself, clouding the distinction between knowledge management research and the other fields of research that underlie it. For practical reasons, we must ultimately rely on the words of the researchers themselves, and trust them to label 'knowledge management' as such, thus giving us a reasonably clear boundary line by which to distinguish knowledge management from other intellectual fields that may be distinct from, but related to, it. For example, complexity theory could be used to explain knowledge management systems. However, it is important to distinguish 'knowledge management' from the underlying 'complexity theory' rather than conflate the two fields. With hopes that the original distinctions described above will prevail, this paper will continue to use 'knowledge' and 'knowledge management' as described in the seminal literature.
The Knowledge Management Process
The knowledge management process is necessarily loose and collaborative because knowledge is recognized to be fuzzy and messy (Allee, 1997).
It is also a difficult process because the human qualities of knowledge, such as experience, intuition, and beliefs, are not only the most valuable, but are also the most difficult to manage and maximize (Davenport & Prusak, 1998). The knowledge management process integrates theories from at least four distinct fields. First, theories about organizational culture, for example, tacit and articulated knowledge, are applied in the development of the concept of a knowledge culture. Second, organizational structure theories are used to develop ideals for knowledge organizational structures. Third, established work in organizational behavior supplies theories of

innovation, learning, and memory for new knowledge management concepts regarding knowledge creation and codification. Fourth, work in knowledge-based systems (KBS) leads to theories about knowledge-support infrastructures.
Organizational culture
Because knowledge is related to humans, knowledge management draws heavily on theories dealing with organizational culture. Particularly central are theories regarding the storage and transfer of knowledge within an organizational culture. Manipulation of knowledge is an essentially human process that cannot be separated from culturally based interpretation and reflection. According to Schein (1985), cultural values are an important mechanism through which an organizational culture reveals its presence. They are a reflection of the underlying cultural assumptions and they correspond to a set of social norms defining social interaction and communication in a particular context. Therefore, cultural values have an impact on the behavior and the attitude of the organizational members. When cultural values have been shared for long enough, culture becomes a product of group experience (Schein, 2004). In this context of shared beliefs and knowledge, Nonaka & Takeuchi (1995) draw on the concepts of tacit and articulated knowledge to introduce four modes of knowledge conversion.
Dimensions
Different frameworks for distinguishing between different types of knowledge exist. One proposed framework for categorizing the dimensions of knowledge distinguishes between tacit knowledge and explicit knowledge. Tacit knowledge represents internalized knowledge that an individual may not be consciously aware of, such as how he or she accomplishes particular tasks. Explicit knowledge, on the other hand, represents knowledge that the individual holds consciously in mental focus, in a form that can easily be communicated to others (Alavi & Leidner, 2001).
Similarly, Hayes and Walsham (2003) describe content and relational perspectives of knowledge and knowledge management as two fundamentally different epistemological perspectives. The content perspective suggests that knowledge is easily stored because it may be codified, while the relational perspective recognizes the contextual and relational aspects of knowledge, which can make knowledge difficult to share outside of the specific location where the knowledge is developed.


Early research suggested that a successful KM effort needs to convert internalized tacit knowledge into explicit knowledge in order to share it, and the same effort must permit individuals to internalize and make personally meaningful any codified knowledge retrieved from the KM effort. For knowledge to be made explicit, it must be translated into information that can be used at the right time. The field of knowledge management contributes several recent models of the knowledge creation process. Tacit knowledge is socialized into the organization from its customers and knowledge alliance partners. This knowledge is processed iteratively through five processes and combined to support a product or service output of the organization. In this sense, externalization is the process by which tacit knowledge is transformed into articulated knowledge, and internalization is the reverse process. Later on, Ikujiro Nonaka proposed a model (SECI, for Socialization, Externalization, Combination, Internalization) which considers a spiraling knowledge process of interaction between explicit knowledge and tacit knowledge. In this model, knowledge follows a cycle in which implicit knowledge is 'extracted' to become explicit knowledge, and explicit knowledge is 're-internalized' into implicit knowledge. A second framework for categorizing the dimensions of knowledge distinguishes between embedded knowledge of a system outside of a human individual (e.g., an information system may have knowledge embedded in its design) and embodied knowledge representing a learned capability of a human body's nervous and endocrine systems (Sensky 2002). A third proposed framework distinguishes between the exploratory creation of "new knowledge" (i.e., innovation) and the transfer or exploitation of "established knowledge" within a group, organization, or community. The full scope of knowledge management (KM) is not something that is universally accepted.
It is about making sure that an organization can learn, and that it will be able to retrieve and use its knowledge assets in current applications as they are needed. In the words of Peter Drucker, it is "the coordination and exploitation of organizational knowledge resources, in order to create benefit and competitive advantage" (Drucker 1999). Bukowitz and Williams (1999) link KM directly to tactical and strategic requirements: its focus is on the use and enhancement of knowledge-based assets to enable the firm to respond to these issues.
Conclusion
An exploratory case study obviously has its limitations when considering the results in wider contexts. Only three groups were studied, one from each academic realm (humanist, scientific and technical), which restricts the conclusions drawn. In developing the purpose for this research, its process definitions, and its evaluation approaches, knowledge management researchers have drawn from research in information economics, organizational culture, structure and behavior, artificial intelligence, and organizational performance.

Furthermore, when knowledge includes abstract concepts and philosophical thinking, the emphasis should be on physical and structural applications that increase face-to-face contacts, informal discussions and activity outside working hours. In general, bias is one key to understanding the knowledge-creation process and it should be considered, especially in international knowledge creation, both in and outside academia. In addition, not all group members were interviewed, which means that the results do not represent the views of the entire groups.
References


1. Adler, P.S. (1989). When knowledge is the critical resource, knowledge management is the critical task. IEEE Transactions on Engineering Management 36(2), 87-94.
2. Ahn, J.-H. and Chang, S.-G. (2004). Assessing the contribution of knowledge to business performance: the KP3 methodology. Decision Support Systems 36(4), 403-416.
3. Alavi, M. and Leidner, D.E. (2001). Review: knowledge management and knowledge management systems: conceptual foundations and research issues. MIS Quarterly 25(1), 107-136.
4. Allee, V. (1997). 12 principles of knowledge management. Training & Development 51(11), 71-74.
5. Almeida, P., Song, J.Y. and Grant, R.M. (2002). Are firms superior to alliances and markets? An empirical test of cross-border knowledge building. Organization Science 13(2), 147-161.
6. Amidon, D.M. and Macnamara, D. (2003). The 7 C's of knowledge leadership: innovating our future. In Handbook on Knowledge Management (Holsapple, C.W., Ed.), Vol. 1: Knowledge Matters, 539 pp. Springer-Verlag, Berlin.
7. Argyris, C. and Schön, D. (1978). Organizational Learning: A Theory of Action Perspective. Addison-Wesley, Reading, MA.
8. Baird, L. and Henderson, J.C. (2001). The Knowledge Engine: How to Create Fast Cycles of Knowledge-to-Performance and Performance-to-Knowledge. Berrett-Koehler Publishers, San Francisco, CA.
9. Baskerville, R. and Pries-Heje, J. (1999). Knowledge capability and maturity in software management. The DATA BASE for Advances in Information Systems 30(2), 26-43.


CUSTOMER PERCEPTION TOWARDS CASH ON DELIVERY
Dr. Tarika Singh, Dr. Seema Mehta, Brajendra Singh Sengar, Sunil Upadhayay, Manish Dubey

Abstract
The purpose of the present paper was to find out the perception of customers towards cash on delivery for the online shopping they do. One of the major reasons behind the growth of e-commerce in India is the Cash on Delivery (COD) model adopted by many shopping websites. The current study takes into consideration the online shoppers of the Gwalior region. A self-designed questionnaire was used to solicit the responses of one hundred respondents. Factor analysis was applied to find out the underlying factors of customer perception towards COD. The study resulted in two major factors, Belief & Secure and Marketing Communication, which form the perception of customers.
Key Words: Perception, Online shopping, Cash on delivery, E-commerce
Introduction
As online shoppers become progressively global and multicultural, more cross-cultural online promotion has been employed by e-marketers to sway potential customers in the Indian online shopping environment (Arora and Dengra, 2010; Mathen and Abhishek, 2014). Back in the late 90s and the first decade of the 21st century, online travel sites encouraged and built trust for customers to transact over the internet. While customers patronized online portals for services due to the convenience that they offered, the sale of products through the online medium did not take off as expected. With service available 24/7, 365 days a year, online shopping offered an opportunity which traditional brick-and-mortar outlets could never offer. Over the years, with busy schedules and lack of time, online shopping portals offered customers the convenience of logging in at their own free time, which appealed to customers in metropolitan areas (tier-I towns). The online retailers also brought in a number of initiatives to encourage customers to opt for online shopping. In this section, we list some of the initiatives implemented by online retailers.
The COD payment option has been the innovation that has added a lot of momentum to online shopping in India; it addressed two major issues related to payment which online retailers faced. To capitalize on this and appeal to the Indian customer to try online shopping, e-commerce companies have started a number of innovative services.

COD is used in other emerging markets such as China, Russia and Vietnam too. The popular perception is that Flipkart, India's biggest online retailer, pioneered it in India. COD offers a fairly risk-free trial process for a new user. Trust is a driver for COD, considering that none of India's e-commerce companies are huge brands. In India, online shopping has increased drastically in B2C (Business to Customer) marketing, and every website shows its uniqueness through the different features it provides.
Perception
Okiko (2014) said that perception leads to decision making and action taking: the meaning you give to a stimulus you perceive will fundamentally shape the choices and actions you take in response to it. Yakup and Diyarbakirlioglu (2011) noted that psychology's understanding of perception has progressed by combining a variety of techniques; psychophysics, for example, measures the effect on perception of varying the physical qualities of the input.
Cash on Delivery
Investopedia defines cash on delivery as a transaction usually done through a shipping company that allows both the seller and the buyer of the product to minimize the risk of fraud or default. COD allows the purchaser to pay at the time of delivery instead of having to pay upfront. Payment is made to the shipping company, and the shipping company then relays the payment back to the seller. Cash on delivery, also known as COD, is a method of payment for goods at the time they are delivered. Payment may be made by cash, credit card or personal check, depending on which forms of payment the seller finds acceptable. Probably one of the most highly practiced forms of cash on delivery today is one we think very little of but, in the US, use often: pizza delivery. Pizzas are delivered on a daily basis in staggering numbers; some customers pay by phone prior to having their pizza delivered, but many still present the delivery person with cash or a check, and hopefully a suitable tip.
Review of Literature
The current literature on consumer online purchasing decisions and cash on delivery has mainly concentrated on identifying the factors which affect the willingness of consumers to engage in internet shopping. According to Wolfinbarger and Gilly (2001), consumers shop online for both goal-oriented and experiential reasons, and Li et al. (1999) proposed that frequent web buyers are higher in convenience orientation but lower in experiential orientation than occasional web buyers. Electronic payment is not second nature to Indian consumers (yet) and its adoption has been slow, so almost all e-commerce companies have begun pushing COD. Across categories,

players are reporting between 40-60% of their overall transactions coming through COD. Gong, Maddox and Stump (2013) found that Chinese and American consumers hold significantly different perceptions regarding the relative advantage, ease of use, and risk of shopping on the Internet. Rastogi (2010) suggests that assessment of consumer buying behaviour can contribute to a better understanding of consumer buying behaviour in respect of online shopping. Hemamalini (2013) found that product involvement, attitude and reason for online shopping varied with different product types. Zuroni and Goh (2012) studied the factors influencing online shopping. The findings revealed that there was a significant relationship between e-commerce experience and attitude towards online shopping among the respondents. The study also indicated that there was a significant relationship between product perception and attitude towards online shopping among the respondents. Turban et al. (2002) argued that an elegant web site design will serve its intended audiences better, and subsequent authors agree that design is crucial for online shopping. Commitment is one of the important factors that have the most influential effect on online shopping; commitment is closely associated with risk, since it is a measure of confidence that providers will deliver on their promises (Vijayasarathy and Jones, 2000). According to Jun et al. and Van Riel et al. (2003), to be considered reliable, online service providers must deliver the promised services within the promised time frame. Burke et al. (2002) likewise stressed the role of consumer perception in online shopping. COD will continue to be a prominent payment mode in India in the near future, but e-commerce players will continue to encourage and incentivize consumers to move towards other payment modes.
Increased consumer confidence in online buying, trust in portals, awareness and improved consumer protection measures will help accelerate such a shift.
Objectives of the Study
To design, develop and standardize a measure for customer perception towards cash on delivery.
To identify the underlying factors of customer perception towards cash on delivery.
Research Methodology
The study was exploratory in nature and the survey method was used for data collection. All individuals of the Gwalior region who were using the internet constituted the population of this research. An individual customer using online purchasing was treated as the sampling element of the current research. The sample size was 100 respondents. A non-probability purposive sampling technique was used. The responses were collected on a

Likert-type scale of 1 to 5 for all the variables of a self-developed questionnaire measuring customer perception towards cash on delivery.
Tools for Data Analysis
Item-to-total correlation was used to check the internal consistency of the questionnaire. A reliability test was applied to check the reliability of the questionnaire with the help of Cronbach's alpha. Factor analysis was applied to identify the underlying factors of customer perception towards cash on delivery.
Result and Discussion
To fulfill the objectives of the research, the questionnaire for customer perception towards cash on delivery was filled in by the respondents. First, the item-to-total correlation of the scale was computed in SPSS 16.0 and the corresponding improvement in reliability was also considered. The reported item-total statistics are as follows:
Item-Total Statistics

Item | Item-Total Correlation
1. You are aware to the online service. | .524
2. You think that COD is the good service. | .674
3. In India some people not believe on the COD facility. | .637
4. You think that online selling product having no quality. | .415
5. Indian people believe on the market service. | .695
6. You think that most of the educated person used online shopping service. | .286
7. You think that COD facility is good for those people who have no credit card. | .559
8. You think that COD provide the high level of security of the payment system. | .691
9. You think that COD is the good payment system to the other payment system. | .179
10. COD gives the easiest facility of the payment system. | .476
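The item-to-total correlations reported above can be reproduced with a few lines of NumPy. The response matrix below is hypothetical (the study's raw 100-respondent data are not published), and the function implements the standard corrected item-total computation, correlating each item with the sum of the remaining items; whether SPSS's corrected or uncorrected variant was used in the study is not stated.

```python
import numpy as np

# Hypothetical Likert responses: 8 respondents x 4 items (the study used
# 100 respondents and 10 items; this small matrix is only for illustration).
X = np.array([
    [5, 4, 5, 2],
    [4, 4, 4, 3],
    [3, 2, 3, 3],
    [2, 2, 1, 4],
    [5, 5, 4, 1],
    [1, 2, 2, 5],
    [4, 3, 4, 2],
    [2, 1, 2, 4],
], dtype=float)

def item_total_correlations(X):
    """Corrected item-total correlation: each item is correlated with the
    sum of the *other* items, so an item cannot inflate its own score."""
    total = X.sum(axis=1)
    r = np.empty(X.shape[1])
    for j in range(X.shape[1]):
        rest = total - X[:, j]                  # total score excluding item j
        r[j] = np.corrcoef(X[:, j], rest)[0, 1]
    return r

r = item_total_correlations(X)
print(np.round(r, 3))
```

Items with low or negative correlations (like the reverse-keyed fourth column in this toy data) are candidates for rewording, reverse-scoring, or removal, which is the "corresponding improvement in reliability" the study refers to.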

Reliability Test
The reliability of the questionnaire was computed using SPSS software. The Cronbach's alpha reliability coefficient was computed over all items in the questionnaire.
Reliability Statistics


Cronbach's Alpha: .830
N of Items: 10
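Cronbach's alpha is straightforward to compute from the item variances and the variance of the total score: alpha = k/(k-1) x (1 - (sum of item variances)/(variance of total)), where k is the number of items. The sketch below uses simulated data (the study's responses are not available), so it illustrates the computation rather than reproducing the reported 0.830; it also includes Bartlett's test of sphericity, which the factor analysis section reports alongside the KMO measure.

```python
import numpy as np

def cronbach_alpha(X):
    """Cronbach's alpha: k/(k-1) * (1 - sum(item variances) / var(total)),
    where 'total' is each respondent's summed score across the k items."""
    k = X.shape[1]
    item_vars = X.var(axis=0, ddof=1)           # per-item sample variances
    total_var = X.sum(axis=1).var(ddof=1)       # variance of the total score
    return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

def bartlett_sphericity(X):
    """Bartlett's test of sphericity: tests H0 that the correlation matrix
    is an identity matrix (no inter-item correlation, in which case factor
    analysis would be pointless). Returns the chi-square statistic and df."""
    n, p = X.shape
    R = np.corrcoef(X, rowvar=False)
    chi2 = -(n - 1 - (2 * p + 5) / 6.0) * np.log(np.linalg.det(R))
    return chi2, p * (p - 1) // 2

# Simulated data: 100 respondents, 5 items sharing one common factor,
# so the items are internally consistent by construction.
rng = np.random.default_rng(0)
common = rng.normal(size=(100, 1))
X = common + 0.5 * rng.normal(size=(100, 5))

alpha = cronbach_alpha(X)
chi2, dof = bartlett_sphericity(X)
```

With strongly correlated items alpha approaches 1 and the Bartlett chi-square far exceeds its critical value; independent items drive alpha toward 0 and make the test non-significant. This is why the study treats an alpha of 0.830 and a significant sphericity test as licence to proceed with factor analysis.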

It can be seen in the above table that the value of Cronbach's alpha was 0.830, which is greater than the commonly recommended value of 0.7, so the reliability of the measure is considered adequate and the statements in the questionnaire were treated as reliable.
VALIDITY
Face validity was tested for the questionnaire and it was found to be high.
FACTOR ANALYSIS
The computed value of the Kaiser-Meyer-Olkin Measure of Sampling Adequacy was .833, and Bartlett's Test of Sphericity gave an approximate chi-square of 287.424 (df 45), significant at the .000 level.
KMO and Bartlett's Test
Kaiser-Meyer-Olkin Measure of Sampling Adequacy: .833
Bartlett's Test of Sphericity: Approx. Chi-Square = 287.424, df = 45, Sig. = .000
Principal component factor analysis with Varimax rotation and Kaiser Normalization was applied. The factor analysis resulted in 2 factors. The factor names, eigenvalues, percentage of variance explained, converged items and factor loadings are given below.
Factor 1: Belief and Secure (Eigenvalue 4.026, 40.260% of variance explained)
3. In India some people not believe on the COD facility. (.722)
5. Indian people believe on the market service. (.833)
6. You think that most of the educated person used online shopping service. (.432)
8. You think that COD provide the high level of security of the payment system. (.802)
Factor 2: Awareness, Quality and Ease of Payment (Marketing Communication) (Eigenvalue 1.109, 11.091% of variance explained)
1. You are aware to the online service. (.615)
2. You think that COD is the good service. (.817)
4. You think that online selling product having no quality. (.557)
7. You think that COD facility is good for those people who have no credit card. (.607)
9. You think that COD is the good payment system to the other payment system. (.316)
10. COD gives the easiest facility of the payment system. (.686)
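The Varimax rotation named above can be sketched directly: it is an orthogonal rotation that maximizes the variance of the squared loadings within each factor, so each item ends up loading strongly on one factor and weakly on the others, the "simple structure" visible in the two-factor table. The loading matrix below is hypothetical, not the study's output; the function follows Kaiser's standard SVD-based iteration.

```python
import numpy as np

def varimax(loadings, tol=1e-6, max_iter=500):
    """Kaiser's varimax: find an orthogonal rotation R that maximizes the
    variance of squared loadings per factor; returns the rotated loadings."""
    L = np.asarray(loadings, dtype=float)
    p, k = L.shape
    R = np.eye(k)
    d = 0.0
    for _ in range(max_iter):
        Lr = L @ R
        u, s, vt = np.linalg.svd(
            L.T @ (Lr ** 3 - (1.0 / p) * Lr @ np.diag((Lr ** 2).sum(axis=0)))
        )
        R = u @ vt
        d_new = s.sum()
        if d_new < d * (1 + tol):   # objective stopped improving
            break
        d = d_new
    return L @ R

# Hypothetical unrotated loadings: 6 items on 2 factors. Before rotation every
# item loads on both factors; after varimax each item favours a single factor.
L = np.array([
    [0.70,  0.50],
    [0.65,  0.55],
    [0.60,  0.45],
    [0.60, -0.50],
    [0.55, -0.55],
    [0.50, -0.45],
])
Lr = varimax(L)
print(np.round(Lr, 2))
```

Because the rotation is orthogonal, each item's communality (the row sum of squared loadings) is unchanged; only the split between factors moves, which is why rotated loadings such as .722 or .833 can be read as an item's alignment with a single factor.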

Description of Factors
Belief and Secure: This factor emerged as the most important determinant in the research, with a total variance of 40.260. The converged items are: In India some people not believe on the COD facility (.722), Indian people believe on the market service (.833), you think that most of the educated person used online shopping service (.432), and you think that COD provide the high level of security of the payment system (.802).
Marketing Communication: This factor has a total variance of 11.091. The converged items are: You are aware to the online service (.615), you think that COD is the good service (.817), you think that online selling product having no quality (.557), you think that COD facility is good for those people who have no credit card (.607), you think that COD is the good payment system to the other payment system (.316), and COD gives the easiest facility of the payment system (.686).
Implications
The research is intended to be useful for further research studies where researchers want to find out customer perception towards COD. The study has resulted in a standardized measure to evaluate customer perception towards COD.
Conclusion
The study was conducted to find out customer perception towards COD. A self-designed measure was used for the study. The data was solicited from 100 respondents of the Gwalior region. The factor analysis resulted in two factors for customer perception towards COD, namely 1) Belief & Secure and 2) Marketing Communication. The ability to shop from across the globe, at an affordable price, in a secured environment, with infinite choices has made e-shopping attractive. According to Pinto (2013), payment options like cash on delivery have totally eliminated the need for giving any kind of sensitive information online during shopping. New payment options like cash on delivery and internet banking have considerably reduced apprehensions about transaction security. End users also help create a favorable or less favorable image among other users, who read the information posted and consider it in their information search or purchase.
Interactivity with a target brand in social media is perceived differently by consumers depending on the message it conveys (Shin, 2008).

References
1. Ankur Kumar Rastogi (2010). A Study of Indian Online Consumers & Their Buying Behaviour. International Research Journal, I(10). ISSN 0975-3486, RNI: RAJBIL 2009/30097.
2. Apoorva Arora and Mukta Dengra (2010). Attraction of Consumers towards Online Shopping in Indore: A Descriptive Study. Altius Shodh Journal of Management & Commerce. ISSN 2348-8891. http://www.altius.ac.in/pdf/31.pdf.
3. Carolyne Moraa Okiko. Perceived Factors Influencing Intention to Leave among the Sales Agents in CFC Life Assurance, Kenya. http://erepository.uonbi.ac.ke/bitstream/handle/11295/75817.
4. Durmaz Yakup and Ibrahim Diyarbakirlioglu. A Theoretical Approach to the Role of Perception on the Consumer Buying Decision Process. Asian Journal of Business and Management Sciences, 1(4), 217-221. ISSN 2047-2528. www.ajbms.org.
5. Social Media and its Impact on Consumers Behavior. International Journal of Economic Practices and Theories, 4(2), 2014, Special issue on Marketing and Business Development. e-ISSN 2247-7225. www.ijept.org.
6. Gefen, D. (2002). Reflections on the Dimensions of Trust and Trustworthiness among Online Consumers. The DATA BASE for Advances in Information Systems, 33(3), pp. 38-53.
7. Internet Research: Electronic Networking Applications and Policy, 12(2), 165-180.
8. Consumption, Markets & Culture, 4(2), 187-205.
9. Graff, M., Davies, J., & McNorton, M. (2008). Cognitive Style and Cross Cultural Differences in Internet Use and Computer Attitudes. European Journal of Open, Distance and E-Learning.
10. Munn, P. (1998). Parental Influence on School Policy: Some Evidence from Research. Journal of Education Policy, 13(3), 379-394.
11. Harkness, S., Super, C. M. (1991). Parental Beliefs and Theories on Early Childhood Education. Journal of Developmental Psychology, 2, 193-202.
12. Hemamalini, K. (2013). Influence of Product: An Empirical Study in the Indian Context. International Journal of Marketing Studies, 5(5), 41. doi:10.5539/ijms.v5n5p41. ISSN 1918-719X, E-ISSN 1918-7203. Canadian Center of Science and Education.
13. Hoover-Dempsey, K. V., Walker, J. M. T., Jones, K. P., Reed, R. P. (2002). Teachers Involving Parents (TIP): Results of an In-service Teacher Education Program for Enhancing Parental Involvement. Teaching and Teacher Education, 18(7), pp. 843-867.
14. Jacqueline S. E., & Pamela E. D. (2005). Influences of Parents' Education on Their Children's Educational Attainments. London Review of Education, 3(3), 191-204.
15. Li, H., Kuo, C. and Russell, M. G. (1999). The Impact of Perceived Channel Utilities, Shopping Orientations, and Demographics on the Consumer's Online Buying Behavior. Journal of Computer-Mediated Communication, 5(2).
16. Nidhi Mathen and Abhishek (2014). Online Promotions: Exploring the Emerging Opportunity in the Indian Market. WP2014-01-09, IIMA Working Papers, Indian Institute of Management Ahmedabad, Research and Publication Department.
17. Shin, S. (2008). Understanding Purchasing Behaviors in a Virtual Economy: Consumer Behavior Involving Virtual Currency in Web 2.0 Communities. Interacting with Computers, 20, 433-446.
18. Tina D., & Adriean M. (2001). Examining the Educational Expectations of Latino Children and Their Parents as Predictors of Academic Achievement. -Hall, Inc.
19. Vijayasar. Internet Research: Electronic Networking Applications and Policy, 10(3), 191-202.
20. Watson, J. C. (2005). Internet Addiction Diagnoses and Assessment: Implications for Counselors. Journal of Professional Counseling Practice and Research, 33(2), 17-30.
21. Wen Gong, Lynda M. Maddox, Rodney L. Stump (2012). Attitudes toward Online Shopping: A Comparison of Online Consumers in China and the US. IJED, 2(1), 28-35.
22. Shopping in China. Journal of Asia Business Studies, 7(3), 214-230.
23. Zuroni, M. J., & Goh, H. L. (2012). Factors Influencing E-commerce Purchases through Online Shopping. International Journal of Humanities and Social Science, 2(4).

APPLICATION OF TOTAL QUALITY MANAGEMENT IN LIBRARIES Navin Bhargava, Sunder Lal, Sanjay Kumar Soni Abstract Libraries stress maintaining administration, building the collection and serving the users, and have always been committed to providing a high quality of service to their users. To help libraries in the process of improvement, total quality management is an approach that libraries use to improve their internal processes and increase user satisfaction. The purpose of this paper is to present an overview of Total Quality Management (TQM) in libraries. Key Words: Total Quality Management, Quality, Libraries, User Satisfaction, Management Introduction Libraries have been important institutions since time immemorial. The library is obviously the source of the power of knowledge in higher education and research. The basic function of any library is to acquire, organize and disseminate knowledge or information resources to the users; however, the techniques or processes may differ from library to library. Libraries have always been committed to providing a high quality of service to their users. In the past, acquiring more resources, buying more books and moving to larger premises were considered improvements in quality; nowadays that approach is no longer valid. A better resolution to improve quality is to capture, control, preserve, store and deliver the right information to the right user at the right time. To help libraries in the process of improvement, total quality management is an overall strategy to accelerate enhancements in processes, products and services. Total Quality Management (TQM) The concept of Total Quality Management was developed by the American W. Edwards Deming. It was originally introduced in Japan after World War II for improving production quality and services.
Total Quality Management is a management concept under which the organization continuously improves its ability to deliver high-quality products and services to users. TQM can lead to decreased costs related to corrective or preventive maintenance, higher overall performance and an enlarged base of happy and loyal users. TQM assimilates fundamental management techniques, existing improvement efforts and technical tools under a disciplined approach. Total: everyone within the organization is involved in creating and maintaining the quality of the services and products offered. Quality: the organization, through individual and collective actions, focuses on meeting customer requirements.

Management: in managing the systems, the emphasis lies on continuously improving the system in order to achieve the best results.

TQM represents a fundamental shift in management, requiring changes in organizational processes, strategic priorities, and individual beliefs and attitudes. It focuses on the needs of the customer, both internal and external, and realigns the organization from detection to prevention of problems. TQM is thus an approach through which the needs of the customer and the community and the objectives of the organization are satisfied in the most efficient and cost-effective way by maximizing the potential of all employees.

Elements of TQM
To meet users' expectations, the organization bears the responsibility of managing the different dimensions of TQM in an effective way. These important elements shape the process of improving and maintaining the quality of services offered by the organization. Maghaddam and Moballeghi (2007) listed the following elements of TQM:
Customer-driven quality
Top management leadership and commitment
Continuous improvement
Fast response
Employee participation
A TQM culture

Basic Tools of Quality Management
TQM offers a set of principles, methods and tools that can be applied to improve an organization. According to Jaafar (1998), the tools used for measuring and documenting the quality of products, processes and services are the following:
Control charts
Pareto charts
Cause and effect diagrams
Run charts
Histograms
Scatter diagrams
Flow charts

Objectives of TQM


TQM is a set of management practices to help organizations increase their quality and service levels; both management and employees are involved in the process. According to Halim, TQM is designed to accomplish the following objectives:
Process improvement
Defect prevention
Priority of effort
Developing cause-effect relationships
Measuring system capacity
Developing improvement checklists and check forms
Helping teams make better decisions
Developing operational definitions
Separating trivial from significant needs
Observing behavior changes over a period of time

Implementing TQM in Libraries
TQM was initially applied as a management philosophy in the manufacturing sector. Following its massive success, the philosophy is being applied in various service sectors, and libraries and information centers are no exception. Application of TQM to library services started in the late 1980s as an American response aimed at a customer-driven way of meeting the requirements and expectations of customers. The concept has become more relevant in the current technological era, especially due to the application of information technology in libraries and changes in information consciousness among users (Raina, 1995). Many libraries have implemented TQM successfully. Harvard College Library created a task force in order to develop a new organizational culture (Clack, 1993); with the help of consultants, Harvard learned about TQM and found that its principles of service excellence, team work, ongoing training and skill building, process/systems focus, continuous improvement and cooperation across boundaries could help make the changes they needed. In 1992 the British Library Document Supply Centre embarked on its TQM programme, as greater awareness of customer needs, budget constraints and increasing competition in the document supply business made it vital that the centre re-examine quality and service (Pilling, 1997).
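Of the basic quality tools named in this paper, a Pareto chart is the easiest to sketch in code: complaint or defect categories are sorted by frequency, and the cumulative share identifies the "vital few" causes worth attacking first. A minimal illustration follows; the complaint categories and counts are invented for the example, not taken from any library study.

```python
# Pareto analysis of hypothetical library complaint categories.
complaints = {
    "late reshelving": 48,
    "catalogue errors": 21,
    "noise": 12,
    "opening hours": 9,
    "photocopier faults": 6,
    "other": 4,
}

total = sum(complaints.values())
cumulative = 0
for category, count in sorted(complaints.items(), key=lambda kv: -kv[1]):
    cumulative += count
    share = 100.0 * cumulative / total
    print(f"{category:20s} {count:3d}  cumulative {share:5.1f}%")
```

In this made-up data the top two categories already account for 69% of all complaints, which is the kind of finding a library team would use to prioritize its improvement effort.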
In another study, Pilling (1997) stressed that several major features of TQM are highly relevant for libraries, such as: the emphasis on customers; the delegation of work; the involvement of staff at all levels; a focus on process rather than function; and the need for continuous improvement. Byrd (1998) examined TQM implementation in three community college libraries and/or learning resources centers in the USA. The results showed that the leadership role is significant in promoting among staff the goal of never-ending improvement and in maintaining the momentum of the quality effort. The results also indicated that TQM does have the power to transform libraries in the following broad categories: management; cross training; staff development; and technology. A comparison of the remarks of three staff about what advice they would give to a community college library thinking about implementing quality reveals that they would inform fellow colleagues of four things:
1. TQM takes time and work;
2. ;
3. top managers at the institutional level must relinquish control in favor of participatory management for TQM to work; and
4. the TQM effort must involve everyone in the institution.
A paper published in Library Management addressed the process of implementing TQM in libraries, stating that it involves a conceptual change among library professionals and a cultural transformation in organizational operations (Wang, 2006). In India, the library of the Indian Institute of Management, Lucknow (IIML) has been conducting innovative continuing professional development programs for professionals engaged in the library and information sector for many years. More than 30 such programs have been conducted in the context of library and information systems and services, in areas such as: quality management; marketing; information technology applications; human relations; and communication. Feedback from the participants has consistently revealed that the programs were very well received, and at times even surpassed expectations. As a result, significant positive changes have been observed in these libraries and information systems as far as their collection development, collection organization, service design and delivery activities are concerned (Raina, 2005).

Principles of TQM to Improve Library Services

According to Sirkin (1993), the following are principles of TQM to improve library services:
Create service brochures and information kits.
Conduct a user survey about library services.
Improve signage.
Change hours of operation.
Provide a more convenient material return.
Simplify checkout of material.
Use flexibility in staff assignments.
Co-operate with local government.
Ask vendors to give product demonstrations.
Give new staff a thorough orientation.
Create interdepartmental library advisory groups.
Improve the physical layout of the library.
Track complaints.
Develop an active outreach program.
Open satellite offices.
Develop user and staff training materials.
Target services to specific groups.
Offer electronic document delivery.
Follow the mission statement.

Benefits of TQM in Libraries
According to Miller (1994), if implemented carefully, quality management principles yield positive benefits to libraries, such as:
Incremental changes lead to continuous improvement; quick solutions may yield only partial results.
Forces library managers to develop leadership skills instead of relying on the power of position to obtain results.
Increases staff participation in decision-making and ownership of decisions and directions once charted.
Improves the level of training given to staff, thus increasing skills.
Helps break down barriers between library departments and improves communication within the organization.
Provides a method of improving services to users in a period of limited resources.

Conclusion
TQM assimilates fundamental management techniques, existing improvement efforts and technical tools under a disciplined approach. TQM can lead to decreased costs related to corrective or preventive maintenance, higher overall performance and an enlarged base of happy and loyal users. Libraries are ideal places to implement TQM. By formulating a strategic plan and following it with commitment, librarians can transform and improve their

library. The time has now come for a new approach to managing libraries with recent techniques of management; TQM will definitely be helpful for improving libraries.

References
1. Journal of Information Dissemination and Technology, 2(4), pp. 266-269.
2. The Implication of TQM for Library and Information Services. ASLIB Proceedings, pp. 283-288.
3. Virginia, Charlottesville.
4. pp. -43.
5. http://Voctech.org.bn/virtual_lib/programme/regular/;ibrary98/TQM/20/for/20/libraries.pdf. Retrieved May 14, 2014.
6. http://ais.org.ge/conference98_text/conference_e.pdf. Retrieved Apr 27, 2014.
7. Libraries News, 55(7), pp. 406-422.
8. The Electronic Library, 26(6), pp. 912-920.
9. 3rd International Conference, IFS Ltd, Springer-Verlag, London, pp. 133-154.
10. Customer First: Total Quality and Customer Service at the British Library.
11.
12. http://archive.ifla.org/IV/ifla71/papers/042e-La-Raina.pdf. Retrieved May 15, 2014.
13. pp. 71-83.
14. p. 606. 18(1/2), pp. -6. 27(9).

IMPORT EXPORT AND EXCHANGE RATE: EVIDENCE FROM OIC Dr. Navita Nathani Amit Jain Vikas Shrivastava Jaspreet Kour Abstract This paper is an attempt to explore some of the reasons why import and export prices change in response to a change in the exchange rate. In theory, a weaker rupee should raise the cost of foreign goods for Indian consumers, thereby reducing Indian demand for imports, while boosting foreign demand for Indian goods by making exports more price-competitive abroad. However, economic research suggests that the link between the exchange rate and the prices of imported goods is more complex, with fluctuations affecting Indian import and export prices to varying degrees, depending on the industry. These studies showed that the effect of an exchange rate change depends on the firms involved. In our study the main emphasis was on the export, import and exchange rate time series against the dollar with reference to oil producing countries. The evidence from the least squares test suggested that there is a relationship between the exchange rate and imports and exports. The results further found bidirectional causality from the exchange rate to imports and imports to the exchange rate, and similarly from the exchange rate to exports and exports to the exchange rate. Key Words: Import, Export, Exchange rates, causality test, OIC Introduction The relationship between exchange rates and import prices is important to understand the nature of import flows as well as the behavior of consumer prices. For example, a weaker dollar is usually considered to be a key mechanism for increasing the international competitiveness of Indian producers. However, economists have generally found that prices of imported goods do not usually respond one-to-one to changes in the exchange rate. For example, between February 2002 and July 2008, the dollar fell by almost 35 percent against a broad index of foreign currencies, while U.S.
Department of Labor (DOL), Bureau of Labor Statistics (BLS) price index for all imports excluding petroleum rose by 20 percent, and the price index for imported consumer goods rose by only 6 percent. The lack of a strong historical relationship between the dollar and import prices is often cited as a factor affecting broad measures of core inflation. Exchange rate studies usually focus on the rate of exchange-rate pass-through, that is, the impact of a change in the exchange rate on prices in the importing country. There are three prominent explanations of why exchange-rate pass-through to Indian import prices might be low:
Exporters adjust their profit margins to absorb part of the exchange rate change.

Exporters set their prices in the local currency of the importing country, and these prices do not fluctuate with the exchange rate, at least in the short run.
Cross-border production leads to lower pass-through when production costs are denominated in different currencies.
In addition, Marazzi and Sheets (2007) found a decrease in exchange-rate pass-through for product markets in which Chinese exports gained market share, at least through 2004, when the Chinese yuan was pegged to the dollar. Definition of Import Import is when you buy something from another country and have it shipped to you. Import: to send or ship in (from a different location). Import is the process of bringing in products that other nations lack, or cannot produce in sufficient quantity, to meet domestic demand. Definition of Export Export is when you sell something to another country and then ship it. Export: to send or ship out (to a different location). Exports are goods and services taken out of one country to another country for purposes of trade. Export is the process of distributing products that a nation can produce beyond its domestic demand to another nation. Definition of Exchange Rate The exchange rate is the price of one country's currency expressed in another country's currency; in other words, the rate at which one currency can be exchanged for another. For example, the higher the exchange rate for one euro in terms of yen, the lower the relative value of the yen. An exchange rate is a number used to calculate the difference in value between money from one country and money from another country: the ratio at which the principal units of two currencies may be traded, the price of one country's money in relation to another's. Exchange rates may be fixed or flexible. An exchange rate is fixed when two countries agree to maintain a fixed rate through the use of monetary policy.
Historically, the most famous fixed exchange-rate system was the gold standard; in the late 1850s, one ounce of gold was defined as being worth 20 U.S. dollars and 4 pounds sterling, resulting in an exchange rate of 5 dollars per pound. An exchange rate is flexible when it is determined through supply and demand; the rate will fluctuate with a country's exports and imports. Most world trade currently takes place with flexible exchange rates that fluctuate within relatively fixed limits. See also exchange control, foreign exchange.
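The gold-standard example above implies a cross rate: when each currency is defined against the same commodity, the exchange rate between the two currencies is simply the ratio of the two parities. A small sketch of that arithmetic, using the figures from the text:

```python
# Under a commodity standard, each currency is defined against the commodity,
# and the exchange rate between two currencies is the ratio of those parities.
usd_per_ounce = 20.0   # dollars per ounce of gold (as in the text)
gbp_per_ounce = 4.0    # pounds per ounce of gold

usd_per_gbp = usd_per_ounce / gbp_per_ounce
print(usd_per_gbp)  # → 5.0 dollars per pound, matching the text
```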

Exchange Rate: the price for which the currency of a country can be exchanged for another country's currency. Factors that influence the exchange rate include (1) interest rates, (2) inflation rate, (3) trade balance, (4) political stability, (5) internal harmony, (6) a high degree of transparency in the conduct of leaders and administrators, (7) general state of the economy, and (8) quality of governance. Literature Review Gopinath, Itshoki, and Rigobon (2007) suggested that currency invoicing did matter. Using unpublished monthly BLS data for 1994-2005, they estimated the average long-term pass-through for imports priced in dollars to be 0.25, whereas for import goods not priced in dollars, the average pass-through was 0.95, or almost 1. Gopinath, Itshoki, and Rigobon further reported that in 2004, 93 percent of U.S. imports were priced in dollars, up from 88 percent in 1994. This indicated the strong presence of LCP in U.S. imports. Konya and Singh (2008) investigated the presence of an equilibrium relationship between exports and imports in India using annual data for the period from 1949/50 to 2004/05. The export and import variables were measured in current prices in both Indian Rupee and U.S. dollar currencies. Indian exports and imports were found to be integrated of order one. The Johansen cointegration method was then performed on the data, and failed to reject the no-cointegration hypothesis. The paper concluded that Indian exports and imports do not exhibit a cointegration relationship and therefore India is in violation of its international budget constraint. Rammadhan and Naseeb (2008) studied the existence of a long-run relationship between oil exports and imports in four Gulf Cooperation Council (GCC) countries: Kuwait, Oman, Saudi Arabia and the United Arab Emirates (UAE). Qatar was excluded for lack of data.
The Johansen method of cointegration was applied and found strong evidence of a long-run relationship between oil exports and imports in three of the members, the exception being Kuwait. The slope coefficients in the Johansen regression were close to unity in the case of Oman, Saudi Arabia and the UAE. This suggested that the long-run trade balance between oil exports and imports will be in equilibrium, and trade policies were effective in sustaining this long-run equilibrium. Atkeson & Burstein (2008) suggested an alternative model in which firms face Cournot competition with nested constant elasticity of substitution across several sectors. If the elasticity of substitution is lower across sectors than within sectors, then the elasticity of demand faced

will depend on firms' market share, which is in turn determined by their productivity. In the extreme, a low-productivity firm with a market share approaching zero faces a high elasticity of substitution within its own sector, while a high-productivity firm with a market share approaching one faces the lower cross-sectoral elasticity of substitution. Uddin (2009) studied the time series behavior of total exports and total imports in Bangladesh. The Johansen cointegration method was applied to the data, and revealed the existence of a long-run equilibrium relationship between the two variables. Long-term causality was also investigated and found to be bidirectional between exports and imports in Bangladesh. Fitzgerald & Haller (2012) applied an approach using micro data from Irish export firms trading the same products domestically and in the UK, with some limited analysis of the role of invoice currency in observed pass-through. They found that when goods are invoiced in local currency (GBP), the relative mark-up across the two markets moves one-for-one with exchange rate fluctuations; that is, exporting firms absorb the full extent of exchange rate changes. In contrast, there is no evidence for mark-ups on goods invoiced in the producer currency (Irish pounds or Euros) being influenced by exchange rate changes. However, the structure of their data prevents robust analysis of producer currency-invoiced trades, as destination data are available only for a cross-section of observations, not the full panel. Berman et al. (2012) explored the issue of heterogeneous PTM associated with differences in firm performance. They argued that more productive firms are likely to face a lower elasticity of demand, leading them to react to exchange rate depreciations by increasing their mark-ups, while lower-productivity exporters instead pass exchange rate savings through to customers and increase the volume of their exports. Berman et al.
(2012) discussed three alternative mechanisms through which this relationship may arise. In a Melitz & Ottaviano (2008) model with linear demand and horizontal product differentiation, the price elasticity of demand increases with the price faced by consumers. As high-productivity firms charge lower prices, these firms face a lower demand elasticity. A real depreciation leads to a fall in the prices faced by consumers, and exporters react by increasing their mark-up. Finally, Berman et al. (2012) developed an extension of the Corsetti & Dedola (2005) model of distribution costs incurred in the local currency. If firms face a per-unit distribution cost payable in the importer currency, a depreciation implies that the production cost accounts for a lower proportion of the consumer price relative to the distribution cost. This reduces the elasticity perceived by the exporter in relation to the export price. High-performance exporters again increase their export price more than others. Using detailed data on destination-specific export value and volume for French exporters, Berman et al. (2012) find evidence of heterogeneous PTM, including support for the hypothesis of local currency-denominated distribution costs. Specifically, they use Goldberg & Campa's (2010) estimates of distribution cost by sector and destination interacted with the real exchange rate to show that high distribution costs appear to increase the price elasticity, and decrease the volume elasticity, of exports to exchange rate changes.

Objectives of the Study
1. To compute the time series of three variables: export, import and exchange rates.
2. To check the stationarity of the data series.
3. To know the present trends of imports, exports and exchange rates in OI countries.
4. To establish the relationship between imports and the exchange rate.
5. To establish the relationship between exports and the exchange rate.

Research Methodology
The study was causal and analytical in nature. The population was all OI countries in the import-export market, with individual OIC member countries as the sampling element. The sample was the OI countries (Malaysia, Pakistan, Jordan, Turkey, Indonesia, Algeria, Yemen, Egypt, Mexico, Denmark, Gambia). A non-probability convenient sampling technique was used. For the purpose of data collection, secondary data from the apex indices of each nation were used. The study used E-Views for data analysis.

1. A unit root test was applied to check the stationarity of the data series.
2. The least squares test and then the Granger causality test were used in the time series analysis to examine the association and direction of causality between the two economic series.

Results and Discussion
Unit Root Test
To examine the dynamic relationship between import, export and exchange rate, unit root tests were employed on the series in their log-levels and log-differenced forms. ADF tests with and without intercept were applied until the data became stationary. The ADF test is based on the following regression (A1):

dy_t = a + b*t + g*y_{t-1} + sum over i = 1..p of d_i * dy_{t-i} + e_t

where dy_t denotes the first difference of the series and e_t is a white-noise error term.

An important practical issue for the implementation of the ADF test is the choice of the maximum lag in equation (A1). An insufficiently small number of lags will result in a test of incorrect size, while too large a choice of lags results in a test of lower power.

Unit root table
Null Hypothesis   P value   Decision   Result
LIMP              0.0000    Rejected   Variable is stationary
LEXP              0.0000    Rejected   Variable is stationary
LEX               0.0000    Rejected   Variable is stationary

Ordinary Least Squares
Ordinary least squares (OLS) regression is one of the most popular statistical techniques, used to predict the values of a dependent variable from one or more explanatory variables. It can also indicate the strength of the relationship between the variables through the probability values of the coefficients.
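The study's equation Y = c(1) + c(2)*X can be sketched with plain NumPy. This is an illustration only: the data below are simulated (the true slope of -0.4 and the 55-observation sample size are invented to echo the study's setup; the study itself used E-Views on the OIC series). The function reports the slope t-statistic, R-squared, and the Durbin-Watson statistic used in the results that follow.

```python
import numpy as np

def ols_summary(x, y):
    """Fit y = c1 + c2*x by least squares; return (beta, t-stat, R^2, Durbin-Watson)."""
    X = np.column_stack([np.ones(len(x)), x])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    rss = resid @ resid
    sigma2 = rss / (len(y) - 2)
    se_slope = np.sqrt(sigma2 * np.linalg.inv(X.T @ X)[1, 1])
    tstat = beta[1] / se_slope
    r2 = 1.0 - rss / np.sum((y - y.mean()) ** 2)
    dw = np.sum(np.diff(resid) ** 2) / rss  # ~2 means no first-order autocorrelation
    return beta, tstat, r2, dw

rng = np.random.default_rng(2)
x = rng.normal(size=55)                  # 55 observations, as in the study's sample
y = 1.0 - 0.4 * x + rng.normal(size=55)  # illustrative true slope of -0.4
beta, tstat, r2, dw = ols_summary(x, y)
print(round(beta[1], 3), round(tstat, 2), round(r2, 3), round(dw, 2))
```

The Durbin-Watson statistic always lies between 0 and 4: values near 2 indicate no first-order autocorrelation in the residuals, values near 0 strong positive autocorrelation, and values near 4 negative autocorrelation.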


The results of the least squares estimation (NLS and ARMA) are presented below, using the dependent and explanatory variables, the equation being Y = c(1) + c(2)*X. Least squares was first applied to the first-differenced series; in most of the cases the model was not a good fit, so logs of the series were computed to generate new series and least squares was estimated again on the new series. In the test results the probability values are low and the t-statistics (-2.473873 and -1.936608) are significant. The R-squared values of 0.105300 and 0.067272 indicate the fit of the model, and the Durbin-Watson statistics indicate autocorrelation in the residuals. Hence it can be concluded from the study that the exchange rate and imports, and the exchange rate and exports, are associated. The series were further tested through the Granger causality test to find the directional relationship between these variables.

Least square table
Hypothesis    Coeff.  T-stats     Prob.   Std. Error  R-squared  Durbin-Watson
LEX vs LIMP   C(2)    -2.473873   0.0167  0.177048    0.105300   2.284892
LEX vs LEXP   C(2)    -1.936608   0.0582  0.139995    0.067272   0.105300

[Figure: Residual, Actual and Fitted plot for Exchange Rate and Import]

[Figure: Residual, Actual and Fitted plot for Exchange Rate and Export]

Optimum Lag Test To find out the optimum lag, optimal lag test was applied through VAR method. Result showed that the value of Akaike Information Criteria (AIC) was minimum in case of lag 2. Hence, granger causality test was applied with lags 2. Granger Causality Test A contemporaneous causality relation between import and exchange rate was applied. The null hypothesis was that import does not contemporaneously lead exchange rate in the market. According to the Granger causality tests presented in Table (1), we reject the null hypothesis, which implies that exchange rate does not contemporaneously lead import and similar to import does not contemporaneously lead exchange rate. The positive causality from import to exchange rate and exchange rate to import indicates that changes in the current import may cause changes in the next series, resulting in changes in the current exchange rate in the same direction in the given sampling frame that is from 2005 to 2012. The study revealed that last 8 years were following the positive causality between import and exchange rate and similar to exchange rate and import. Table (1) of Granger Causality Test Pairwise Granger Causality Tests Date: 04/29/14 Time: 14:35 Sample: 1 55 Lags: 2 313

Null Hypothesis:

Obs

F-Statistic

Probability

LIMP does not Granger Cause LEX

52

6.37741

0.00354

4.11996

0.02246

LEX does not Granger Cause LIMP

According to the Granger causality tests presented in Table (2), both null hypotheses are rejected, implying that export leads exchange rate and, likewise, exchange rate leads export. This positive bidirectional causality indicates that changes in current exports may cause changes in the exchange rate in the same direction, and vice versa, within the given sampling frame from 2005 to 2012. The study thus revealed a positive causality between export and exchange rate, and between exchange rate and export, over the last 8 years.

Table (2): Granger Causality Test
Pairwise Granger Causality Tests
Date: 04/29/14  Time: 14:37
Sample: 1 55  Lags: 2

Null Hypothesis:                    Obs    F-Statistic    Probability
LEXP does not Granger Cause LEX      52      3.76292        0.03049
LEX does not Granger Cause LEXP              4.11733        0.02251

Summary of Granger Causality Results

Null Hypothesis                      Probability    Rejected/Not Rejected
LIMP does not Granger Cause LEX        0.00354        Rejected
LEX does not Granger Cause LIMP        0.02246        Rejected
LEXP does not Granger Cause LEX        0.03049        Rejected
LEX does not Granger Cause LEXP        0.02251        Rejected

This table shows a positive bidirectional causality relationship between import and exchange rate, and similarly between export and exchange rate.

Suggestions of the Study

The study was done by taking the data on total import and export inflows and outflows in OIC countries and the total exchange rate of the OIC, together with country-wise import and export inflows and outflows and country-wise exchange rates. Future research can extend this by taking sector-wise imports and exports, year-wise exchange rates, or the imports, exports and exchange rate of individual countries, so that a broader study may be done. It is also suggested that the study be replicated with data covering a longer time duration.

Conclusion

The relationship between imports, exports and exchange rate has been the subject of intensive research in OIC countries. This relationship is of significant importance because it reflects the stability of the foreign trade situation of a country. The main objective of this study was to investigate the relationship between imports, exports and exchange rate in the OIC economies. A time series methodology of unit root tests, the ordinary least squares method and Granger causality tests was applied, using annual data on real imports, exports and exchange rate for the period from 2005 to 2012. The results of the ADF unit root tests suggest that the three variables, imports, exports and exchange rate, are integrated of order one. The evidence from the least squares test is that there is a relationship between exchange rate and imports, and likewise between exchange rate and exports, in OIC countries. The causal relationship between the variables was investigated by specifying a Granger causality model. The result found bidirectional causality between exchange rate and imports, and similarly between exchange rate and exports.

References

1. Abeysinghe, Tilak and Tan Lin Yeok. 1998. Exchange rate appreciation and export competitiveness: The case of Singapore. Applied Economics, 30, 51-55.
2. Adebiyi, Michael Adebayo. 2007. Does money tell us anything about inflation in Nigeria? The Singapore Economic Review, 52(1), 117-134.
3. Ahmed, Habib. 2001. Exchange Rate Stability: Theory and Policies from an Islamic Perspective. Research Paper No. 57, Islamic Development Bank, Jeddah.
4. Amin, Ruzita Mohd and Zarinah Hamid. 2009. Towards an Islamic Common Market: Are OIC Countries Heading the Right Direction? IIUM Journal of Economics and Management, 17(1), 133-76.
5. Bussiere, Matthieu and Marcel Fratzscher. 2002. Towards a New Early Warning System of Financial Crises. European Central Bank Working Paper No. 145.
6. Choudhury, Masudul Alam. 1996. The Nature of Globalization and the Muslim World. Emerald, 22(5/6).
7. Hassan, Kabir. 2009. Economic Performance of the OIC Countries and the Prospect of an Islamic Common Market. Working Paper 461.
8. Duasa, Jarita. 2009. Asymmetric cointegration relationship between real exchange rate and trade variables: The case of Malaysia. MPRA Paper No. 14535, http://mpra.ub.uni-muenchen.de/14535.
9. Duasa, Jarita. 2008. Impact of exchange rate shock on prices of imports and exports. MPRA Paper No. 11624, http://mpra.ub.uni-muenchen.de/11624/.
10. Enders, Walter. 1995. Applied Econometric Time Series, 1st ed. United States of America.
11. Ghali, Khalifa H. 2004. Energy use and output growth in Canada: a multivariate cointegration analysis. Energy Economics, 26, 225-238. www.sciencedirect.com.


12. Gujarati, Damodar. 2009. Basic Econometrics, 5th edition. McGraw-Hill, New York, USA.
13. Konya, L. and J. Singh. 2008. Are Indian exports and imports cointegrated? Applied Econometrics and International Development, 8, 177-186.
14. Uddin, J. 2009. Time series behavior of imports and exports of Bangladesh: Evidence from cointegration analysis and error correction model. International Journal of Economics and Finance, 1(2), 156-162.
15. Rammadhan, M. and A. Naseeb. 2008. The long-run relationship between oil exports and aggregate imports in the GCC: Cointegration analysis. Journal of Economic Cooperation, 29(2), 69-84.

ANNEXURE

UNIT ROOT TEST

Null Hypothesis: LIMP has a unit root
Exogenous: Constant
Lag Length: 1 (Automatic based on AIC, MAXLAG=1)

                                          t-Statistic    Prob.*
Augmented Dickey-Fuller test statistic     -8.566445     0.0000
Test critical values:    1% level          -3.562669
                         5% level          -2.918778
                         10% level         -2.597285

*MacKinnon (1996) one-sided p-values.

Augmented Dickey-Fuller Test Equation
Dependent Variable: D(LIMP)
Method: Least Squares
Date: 04/29/14  Time: 14:41
Sample (adjusted): 4 55
Included observations: 52 after adjustments

Variable        Coefficient    Std. Error    t-Statistic    Prob.
LIMP(-1)        -1.702465      0.198736      -8.566445      0.0000
D(LIMP(-1))      0.421773      0.128585       3.280102      0.0019
C                9.005220      1.077750       8.355575      0.0000

R-squared              0.673393    Mean dependent var       0.015658
Adjusted R-squared     0.660062    S.D. dependent var       3.040471
S.E. of regression     1.772723    Akaike info criterion    4.038872
Sum squared resid      153.9849    Schwarz criterion        4.151444
Log likelihood         102.0107    F-statistic             50.51359
Durbin-Watson stat     1.889107    Prob(F-statistic)        0.000000

Null Hypothesis: LEXP has a unit root
Exogenous: Constant
Lag Length: 1 (Automatic based on AIC, MAXLAG=1)

                                          t-Statistic    Prob.*
Augmented Dickey-Fuller test statistic     -8.656211     0.0000
Test critical values:    1% level          -3.562669
                         5% level          -2.918778
                         10% level         -2.597285

*MacKinnon (1996) one-sided p-values.

Augmented Dickey-Fuller Test Equation
Dependent Variable: D(LEXP)
Method: Least Squares
Date: 04/29/14  Time: 14:40
Sample (adjusted): 4 55
Included observations: 52 after adjustments

Variable        Coefficient    Std. Error    t-Statistic    Prob.
LEXP(-1)        -1.722135      0.198948      -8.656211      0.0000
D(LEXP(-1))      0.440987      0.128861       3.422195      0.0013
C                9.708127      1.165538       8.329308      0.0000

R-squared              0.675355    Mean dependent var       0.005091
Adjusted R-squared     0.662104    S.D. dependent var       3.926027
S.E. of regression     2.282152    Akaike info criterion    4.544076
Sum squared resid      255.2028    Schwarz criterion        4.656648
Log likelihood         115.1460    F-statistic             50.96707
Durbin-Watson stat     2.001070    Prob(F-statistic)        0.000000


Null Hypothesis: LEX has a unit root
Exogenous: Constant
Lag Length: 0 (Automatic based on AIC, MAXLAG=1)

                                          t-Statistic    Prob.*
Augmented Dickey-Fuller test statistic     -8.982624     0.0000
Test critical values:    1% level          -3.560019
                         5% level          -2.917650
                         10% level         -2.596689

*MacKinnon (1996) one-sided p-values.

Augmented Dickey-Fuller Test Equation
Dependent Variable: D(LEX)
Method: Least Squares
Date: 04/29/14  Time: 14:39
Sample (adjusted): 3 55
Included observations: 53 after adjustments

Variable        Coefficient    Std. Error    t-Statistic    Prob.
LEX(-1)         -1.220766      0.135903      -8.982624      0.0000
C                7.075621      0.867394       8.157328      0.0000

R-squared              0.612720    Mean dependent var       0.031657
Adjusted R-squared     0.605126    S.D. dependent var       4.117861
S.E. of regression     2.587623    Akaike info criterion    4.776362
Sum squared resid      341.4855    Schwarz criterion        4.850713
Log likelihood         124.5736    F-statistic             80.68753
Durbin-Watson stat     2.055000    Prob(F-statistic)        0.000000


LEAST SQUARE TEST

Dependent Variable: LEX
Method: Least Squares
Date: 04/29/14  Time: 14:17
Sample (adjusted): 2 55
Included observations: 54 after adjustments
LEX = C(1) + C(2)*LIMP

Variable    Coefficient    Std. Error    t-Statistic    Prob.
C(1)         8.123168      0.989282       8.211175      0.0000
C(2)        -0.437994      0.177048      -2.473873      0.0167

R-squared              0.105300    Mean dependent var       5.824781
Adjusted R-squared     0.088094    S.D. dependent var       2.615455
S.E. of regression     2.497596    Akaike info criterion    4.704868
Sum squared resid      324.3754    Schwarz criterion        4.778535
Log likelihood         125.0314    Durbin-Watson stat       2.284892

Dependent Variable: LEX
Method: Least Squares
Date: 04/29/14  Time: 14:19
Sample (adjusted): 2 55
Included observations: 54 after adjustments
LEX = C(1) + C(2)*LEXP

Variable    Coefficient    Std. Error    t-Statistic    Prob.
C(1)         7.339715      0.855781       8.576623      0.0000
C(2)        -0.271116      0.139995      -1.936608      0.0582

R-squared              0.067272    Mean dependent var       5.824781
Adjusted R-squared     0.049335    S.D. dependent var       2.615455
S.E. of regression     2.550123    Akaike info criterion    4.746494
Sum squared resid      338.1625    Schwarz criterion        4.820160
Log likelihood         126.1553    Durbin-Watson stat       2.347098

GRANGER CAUSALITY TESTS

Pairwise Granger Causality Tests
Date: 04/29/14  Time: 14:35
Sample: 1 55  Lags: 2

Null Hypothesis:                    Obs    F-Statistic    Probability
LIMP does not Granger Cause LEX      52      6.37741        0.00354
LEX does not Granger Cause LIMP              4.11996        0.02246

Pairwise Granger Causality Tests
Date: 04/29/14  Time: 14:37
Sample: 1 55  Lags: 2

Null Hypothesis:                    Obs    F-Statistic    Probability
LEXP does not Granger Cause LEX      52      3.76292        0.03049
LEX does not Granger Cause LEXP              4.11733        0.02251


A STUDY ON ATTITUDE OF BANK CUSTOMERS ABOUT CREDIT CARD

Dr. Nandan Velankar, Satyam Dubey, Ankit Sharma

ABSTRACT

Earlier studies indicated that some people held a positive attitude towards credit cards: availability of emergency funds through credit cards and shopping without paying cash immediately contributed most towards this positive attitude, while others held a negative attitude, for which the higher rate of interest charged was the main cause. The present study is an attempt to find out the factors determining the attitude of bank customers about credit cards. Validity, reliability and exploratory factor analysis tests were applied to make the study more effective and trustworthy. Finally, the significance of the outcomes was tested by applying an independent t-test to understand the difference in attitude about credit cards (dependent variable) across the gender of respondents (independent variable). Key Words: Credit Card, Attitude, Exploratory factor analysis, Bank customers. Introduction India has come out of self-binding shackles to look "young" again, and the enthusiasm shared by the young workforce of the country is driving the economy like never before. In the present day world, no one wants to be bothered by the presence of huge cash in his or her wallet, and Indians are no exceptions. The unprecedented growth in the number of credit card users has stimulated the Indian economy to a significant extent. The arrival of malls, multiplexes, online shopping stores and shopping complexes has contributed to the growth of the use of plastic cards. It will not be wrong to say that such a scenario in the context of the Indian market is driven less by style statement and more by needs. The benefits of plastic money have offered unmatched ways to create equilibrium and an amicable solution when it comes to purchases and the inability to possess or carry cash. Modern day Indian customers find it easier to make credit card payments rather than carry too much cash.
The introduction of credit card facilities to pay for mobile, electricity, movie tickets and other related transactions has also contributed to the growth of plastic money in the country. Conceptual Framework

Attitude An attitude is an expression of favor or disfavor toward a person, place, thing, or event (the attitude object). The prominent psychologist Gordon Allport once described attitudes as "the most distinctive and indispensable concept in contemporary social psychology". Attitude can be formed from a person's past and present. Attitude is also measurable and changeable, and it influences the person's emotions and behavior. Credit card A credit card is referred to as 'plastic money'. Carrying a lot of cash can be cumbersome and risky, and sometimes you run short of it just when you most need it. A credit card is a smart solution to these problems: a safe and convenient alternative to cash. Most people associate a credit card with prestige, and it also says that the customer has taken on the onus of being responsible enough to be extended credit. A credit card can therefore be considered a good substitute for cash or cheques. Although a sizeable majority of respondents believe that having a credit card is good because one can pay later for current purchases, a large majority also agrees with the statements that credit cards are bad because one has to pay interest if he/she does not pay the full outstanding balance, and that they can lead to a loss of control over expenses. Literature Review Phyllis M. Mansfield and Mary Beth Pinto (2013) noted that studies of customers' differing attitudes toward credit cards and of card ownership, outstanding balance, and repayment have been conducted in the last twelve years, making this a relatively recent phenomenon. M.S. Ramananda and K. Ravichandran (2012) studied consumer attitude towards credit card services and bank performance in the credit card business: people valued not having to carry cash and felt empowered to spend money wherever they wanted with their credit card, within the limits prescribed by the concerned banks, since payment was a crucial part of the whole transaction. J.C.
Arias and Robert Miller (2010) found that the marketing of credit cards to college students, both to their parents and to the students themselves, was extremely effective. While not measured to statistical relevance, it was clear to a high level of confidence that the messages of control and the rationalization of emergencies were working very well.

Another study (2010) indicated that some people held a positive attitude towards credit cards: availability of emergency funds and shopping without paying cash immediately contributed most towards this positive attitude, while others held a negative attitude, caused mainly by the higher rate of interest charged. Mansfield and Pinto (2007) found that credit card holders must be aware of various features of a credit card such as safety and security, utility, operational difficulties and support provided by the credit card department; similarly, bankers should extend credit card facilities only to creditworthy customers. Their article brings out the relationship, if any, between age, occupation and education with reference to the use of credit cards, and also establishes awareness and utilization of credit cards associated with card features. Dr. Mrs. N. Yesodha Devi and Mrs. A. Gomati (2007) found that respondents were aware of and using facilities like the interest-free credit period and insurance cover, whereas only a very small number of respondents were aware of facilities like the add-on-card facility, accident cover and rolling credit, and they were not using them. Another study observed that credit cards have the capacity to drive customers towards a lifestyle: credit cards convey certain values and lifestyle patterns about their users, and their use enables customers to attain a desired lifestyle. Objectives of the Study 1. To design, develop and standardize the measure to evaluate the attitude of bank customers about credit cards. 2. To find out the factors determining the attitude of bank customers about credit cards. 3. To analyze the attitude of bank customers about credit cards on gender basis. 4. To open new vistas for further research. Research Methodology The study is exploratory in nature. It is aimed at evaluating the attitude of bank customers about credit cards.
The study analyzes this relationship in the Indian context. The questionnaire survey method was applied to take responses from respondents. The population was all individual credit card customers of different banks. The sampling frame was credit card customers of different banks of Gwalior city. The individual respondent was the sampling element. The sample size was 200 individuals, including 100 males and 100 females. Non-probability quota sampling was used to draw the sample. For data collection, a self-designed questionnaire was utilized. The measure was a Likert-type 1 to 5 scale, where 1 indicates minimum agreement and 5 indicates maximum agreement.

Tools used for data analysis were: item-to-total correlation, to check the internal consistency of the questionnaire; a reliability test with Cronbach's alpha; factor analysis, to find the underlying factors of the attitude of bank customers about credit cards; and a t-test, to compare the attitude of bank customers about credit cards on gender basis.

Empirical Analysis

Reliability and Consistency Measure: The reliability of the questionnaire was tested using SPSS, and the reliability measure is given below:

Cronbach's Alpha    Cronbach's Alpha Based on Standardized Items    N of Items
     .813                             .815                              13
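Cronbach's alpha, as reported above, can be computed directly from the respondent-by-item score matrix. A minimal NumPy sketch on simulated Likert data (not the study's responses):

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: respondents x items matrix of Likert scores."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

rng = np.random.default_rng(3)
# Simulate 200 respondents x 13 items sharing a common trait, so items correlate.
trait = rng.normal(size=(200, 1))
items = np.clip(np.round(3 + trait + rng.normal(scale=0.8, size=(200, 13))), 1, 5)

alpha = cronbach_alpha(items)
print(f"Cronbach's alpha = {alpha:.3f}")
```

Values above the conventional 0.7 threshold, like the 0.815 reported here, indicate acceptable internal consistency.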

It is considered that a reliability value above 0.7 is good, and the obtained value (0.815) is quite higher than the standard value, so all the items in the questionnaire are highly reliable.

Consistency measure for Attitude of Bank Customers about Credit Card

Item        Scale Mean      Scale Variance    Corrected Item-      Squared Multiple    Alpha If Item
            If Item Deleted If Item Deleted   Total Correlation    Correlation         Deleted
VAR00001    46.5200         40.673            .440                 .302                .803
VAR00002    46.4550         41.274            .435                 .354                .803
VAR00003    46.7600         40.997            .423                 .379                .804
VAR00004    46.6450         40.662            .409                 .342                .806
VAR00005    46.6000         40.503            .484                 .441                .799
VAR00006    46.5550         42.419            .340                 .266                .810
VAR00007    46.6450         39.245            .547                 .436                .794
VAR00008    46.4450         42.489            .333                 .214                .811
VAR00009    46.6750         40.120            .504                 .421                .798
VAR00010    46.6250         42.487            .316                 .268                .812
VAR00011    46.5350         39.828            .539                 .407                .795
VAR00012    46.1550         40.041            .505                 .371                .798
VAR00013    46.1850         38.996            .571                 .421                .792


It can be seen from the item-total statistics that dropping any item does not increase the reliability significantly, so it was decided not to drop any question from the measure and it was used as is for further analysis. Validity was checked through face validity and found to be very high. Item-to-total correlation was computed using SPSS and all the items in the measure were accepted.

Factor Analysis

Kaiser-Meyer-Olkin Measure of Sampling Adequacy         .788
Bartlett's Test of Sphericity    Approx. Chi-Square     684.556
                                 df                     78
                                 Sig.                   .000
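The extraction step that follows an adequacy check of this kind, principal components retained under Kaiser's eigenvalue > 1 criterion, can be sketched with NumPy. The 13-item data below are simulated, not the survey responses:

```python
import numpy as np

rng = np.random.default_rng(4)

# Simulate 200 respondents x 13 items driven by a few underlying factors.
loadings = rng.normal(size=(13, 4))
scores = rng.normal(size=(200, 4))
X = scores @ loadings.T + rng.normal(scale=0.5, size=(200, 13))

# Principal component extraction works on the item correlation matrix.
corr = np.corrcoef(X, rowvar=False)
eigvals = np.sort(np.linalg.eigvalsh(corr))[::-1]

retained = eigvals[eigvals > 1.0]            # Kaiser criterion
explained = 100 * retained / eigvals.sum()   # % of variance per retained factor
print("eigenvalues:", np.round(retained, 3))
print("% variance:", np.round(explained, 2))
```

Each retained factor's share of variance is its eigenvalue divided by the number of items, which is exactly how the eigenvalues and % of variance in the table below are related (e.g. 4.098 / 13 = 31.525%).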

The KMO score is 0.788, which is quite higher than 0.5; thus the sample is adequate for factor analysis. Bartlett's Test of Sphericity is highly significant (p < 0.05), hence the correlation matrix is not an identity matrix and factor analysis is appropriate. Principal component factor analysis with Varimax rotation and Kaiser Normalization was performed. Details about the factors (factor name, eigenvalue, % of variance, converging statements and loading values) are given in the table:

Factor Name (Eigen Value, % of Variance)    Variable Convergence / Statement                                                            Loading Value

Convenient Purchasing (4.098, 31.525%)
    05. The customer tends to overspend out of immense happiness.                                                                        .793
    03. Interest factor discourages me to opt for a credit card.                                                                         .782
    01. I feel credit card is the most convenient instrument for spending.                                                               .628

Status Symbol (1.470, 11.308%)
    02. I feel having a credit card uplifts my status.                                                                                   .804
    Facility of free credit period attracts me to opt for a credit card.                                                                 .639
    09. Rewards and benefit packages offered on credit cards attract me to opt for a credit card.                                        .609
    07. Credit card is considered a status symbol.                                                                                       .608

Utility (1.281, 9.856%)
    Interest on overdrawing discourages me to opt for a credit card.                                                                     .737
    12. It is easy to handle.                                                                                                            .732
    13. It is a useful instrument in emergencies.                                                                                        .578
    11. Various types of credit cards as per requirement attract me to opt for a credit card.                                            .526

Benefits (1.109, 8.532%)
    06. Credit card leads to a spending habit.                                                                                           .767
    08. Membership fees attract me to opt for a credit card.                                                                             .631

Description of factor analysis:

1. Convenient Purchasing: the most important factor that came out of the study, explaining 31.525% of variance with a total eigenvalue of 4.098. The included variables are "The customer tends to overspend out of immense happiness" (.793), "Interest factor discourages me to opt for a credit card" (.782) and "I feel credit card is the most convenient instrument for spending" (.628).

2. Status Symbol: this factor has 4 variables and explains 11.308% of variance, with a total eigenvalue of 1.470. The included variables are "I feel having a credit card uplifts my status" (.804), "Facility of free credit period attracts me to opt for a credit card" (.639), "Rewards and benefit packages offered on credit cards attract me to opt for a credit card" (.609) and "Credit card is considered a status symbol" (.608).

3. Utility: this factor explains 9.856% of variance, with a total eigenvalue of 1.281. The included variables are "Interest on overdrawing discourages me to opt for a credit card" (.737), "It is easy to handle" (.732), "It is a useful instrument in emergencies" (.578) and "Various types of credit cards as per requirement attract me to opt for a credit card" (.526).

4. Benefits: this factor explains 8.532% of variance, with a total eigenvalue of 1.109. The included variables are "Credit card leads to a spending habit" (.767) and "Membership fees attract me to opt for a credit card" (.631).

Independent t-Test

The independent t-test compares the means of two unrelated groups on the same continuous dependent variable. Here it is applied to understand the difference in mean attitude about credit cards (dependent variable) across the gender of respondents (independent variable).

H0: There is no significant difference between gender and attitude about credit cards as a bank customer.

Group Statistics

Gender    N      Mean       Std. Deviation    Std. Error Mean
Male      106    49.8962    7.48068           .72659
Female    94     50.9681    6.08091           .62720

Independent Samples Test (95% Confidence Interval of the Difference)

                              Levene's Test          t-test for Equality of Means
                              F        Sig.     t         df         Sig.        Mean         Std. Error    Lower       Upper
                                                                     (2-tailed)  Difference   Difference
Equal variances assumed       .484     .487     -1.103    198        .271        -1.07186     .97174        -2.98815    .84444
Equal variances not assumed                     -1.117    196.560    .265        -1.07186     .95985        -2.96478    .82106
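The Levene-then-t-test sequence shown in the table maps directly onto SciPy. The group scores below are simulated to echo the group statistics, not the actual survey responses:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)

# Simulated attitude scores for the two groups (means/SDs echo the group statistics).
male = rng.normal(loc=49.9, scale=7.5, size=106)
female = rng.normal(loc=51.0, scale=6.1, size=94)

# Levene's test checks the equal-variance assumption first.
lev_stat, lev_p = stats.levene(male, female)
equal_var = bool(lev_p > 0.05)

# Choose the pooled or Welch t-test based on the Levene result.
t_stat, p_value = stats.ttest_ind(male, female, equal_var=equal_var)
print(f"Levene p = {lev_p:.3f}, t = {t_stat:.3f}, p = {p_value:.3f}")
```

A two-tailed p above .05, as in the table (.271), means the null hypothesis of equal group means is not rejected.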

On the basis of the independent sample t-test, the column labeled Levene's Test for Equality of Variances indicates a p value (.487) greater than the standard value (.05), so equal variances are assumed. The significance (2-tailed) value (.271) is also greater than .05, so H0 is not rejected. Thus, there is no significant difference between gender and attitude about credit cards as a bank customer. Conclusion The study performed the necessary analysis to derive the research findings. The questionnaires were filled in by individual respondents. To test validity, a face validity test was applied, and to check the reliability of the questionnaire a reliability test was applied. Factor analysis was applied to find the underlying factors of the attitude of bank customers about credit cards, and many factors came out of the study that reflect this attitude. An independent t-test was applied to compare the attitude of bank customers about credit cards on gender basis. By testing the hypothesis it was concluded that there is no significant difference between gender and attitude about credit cards as a bank customer. References Allport, Gordon. 1935. "Attitudes," in A Handbook of Social Psychology, ed. C. Murchison. Ahmed, Afshan and Ayesha Amanullah. 2006-07. Who pays for credit cards? Federal Reserve Bank of Chicago. Arias, J.C. and Miller, R. 2010. Ausubel, Lawrence M. 1991. The failure of competition in the credit card market. American Economic Review, 81(1): 50-81. Bowers, Jeans. 1979. Consumer credit use by low income consumers who have had a consumer education course: An exploratory study. 13(2): 340-341. Calem, Paul S. and Loretta J. Mester. 1995. Consumer behavior and the stickiness of credit-card interest rates. American Economic Review, 85(5): 1327-1336. Consumer Research, Vol. 32, June, pp. 130-45.


ORGANIZATIONAL CULTURE: A STUDY OF INTERNET SITES Dr. Garima Mathur, Dr. Richa Banerjee, Sonali Srivastava Abstract The rate of usage of the Internet has grown exponentially, and organizations are using websites to display their culture to potential applicants. Internet recruitment by organizations has risen noticeably over the past decade, and organizations now frequently use the internet for recruitment purposes. The present study compared viewer perceptions of organizational culture across two websites, one displaying culture-neutral and the other culture-specific features. The results indicated that the website communicating strong culture-related features was more likely to be assessed positively by job seekers. Keywords: INTRODUCTION Culture refers to the values, beliefs or a combination of both in a society. Organizational culture is the understanding shared by all the individuals who are part of the company of its values and beliefs, norms of acceptable and unacceptable behavior, policies, and the expectations of organizational members towards each other. The culture of an organization is a product of history, a variety of external and internal influences, and the priorities and values of key people in it. Culture is reflected in artifacts: rituals, design of space, furniture and ways of dealing with various phenomena. Organizational culture consists of the uniquely patterned beliefs, feelings, values and behavior that connect the members to the organization and simultaneously distinguish that culture from the cultures of other organizations. Smircich (1983) stated that organizational culture is a fairly stable set of taken-for-granted assumptions, shared beliefs, meanings, and values that bring forth a new way of understanding organizational life; similarly, Denison (1984) defined organizational culture as a set of values, beliefs, and behavior patterns that form the core identity of an organization. There are numerous aspects related to culture.
The culture may be innovative, where individuals are motivated to be creative. Management may focus on long-term aspects or on each specific issue. The culture may also be outcome or process oriented, internally or externally oriented, customer or cost-control focused. The organization may emphasize either team working or individual working orientation. As stated above, according to Braddy et al. (2006) there are nine dimensions of culture organizations can convey to applicants via their websites, depending upon which attributes they believe are most descriptive of their values. These culture attributes include

innovation, emphasis on rewards, supportiveness, outcome-orientation, attention-to-detail and team orientation, among others (O'Reilly, Chatman, & Caldwell, 1991). Moreover, the pace of technological improvement has also triggered changes in organizational culture. Employees, whether new, old or prospective, perceive the culture of the organization on the basis of its technological advancement, especially its use of the internet. The internet has changed the processes of organizations; the way organizations used to do their work has changed altogether. For example, the Internet replaced the traditional recruiting system with Web-based recruiting (iLogos, 2004), as reviewed by Lyons & Marler (2011), and the use of the Internet in recruitment has increased tremendously (Cober, Brown, Blumental, Doverspike, & Levy, 2000). Way back in 1999, Elliot & Tevavichulada (1999) reported that internet recruiting was the most used human resource function compared to any other function. This led to an interesting field of research where researchers tried to investigate the attributes that attract an applicant. Allen et al. (2007) stated that at the initial level prospective candidates rely mostly on websites to gather information about the organization. Dineen, Ling, Ash, and DelVecchio (2007) studied the impact of web site attributes on organizational attraction and applicant self-selection behaviors. Website attributes help individuals assess their suitability for the job. Hence, it has become an area of concern for organizations to pay attention to website contents. Website contents are the means through which an idea needed to achieve a website objective is given shape; these features are used to persuade, sell or reassure the visitors of the website. A content format is the means by which a feature is presented.
In traditional recruitment methods, organizations used to provide job seekers with information about vacant positions only. Organizational recruitment websites, on the other hand, allow organizations to provide prospective applicants with both job descriptions and other information about the organization. A study connected to this reported that both website design features and information about organizational values, policies, awards, and goals influenced perceptions of organizational culture (Phillip W. Braddy, Adam W. Meade and Christina M. Kroustalis, 2006). They further stated that culture attributes were more strongly conveyed by culture-specific website content features than by culture-neutral website content features. REVIEW OF LITERATURE Braddy et al. (2008) conducted an experimental study whereby they tested the effects of organizational familiarity, website usability, and website attractiveness of recruitment website displays on viewers' evaluations of organizations. In the pretest-posttest research

they discovered that organizations which maintain appealing and easy-to-navigate websites improved their organizational evaluation scores from the pretest measures to the posttest measures. However, organizational culture is a much more important aspect to be analysed. In the context of this research the work of Braddy et al. (2009) is worth considering: they examined the effects of careers website content features, such as pictures, on perceptions of organizational culture, considered various culture attributes while evaluating website content features, and tried to discover whether website features could convey organizational culture attributes or not. Lyons & Marler (2011) conducted research on aesthetic features of websites, where organizational image mediates the relationship between website features and organizational attraction and also moderates the relationship between perceived P-J fit and organizational attraction. Earlier, Braddy et al. (2006) studied the aspects of recruitment websites, including design features and information about organizational values, policies, awards, and goals, that may give rise to perceptions of organizational culture. Another study (2004) suggests that web sites influence the impressions job seekers form of an organization and, ultimately, applicant attraction. Furthermore, Palmer (2008) reported the relationships between job seekers' self-efficacy, computer self-efficacy, and motivation to reduce uncertainty and their corresponding perceptions of person-organization fit and attraction to the organization; in that study, computer self-efficacy moderated the relationship between web site characteristics and perception of the organization. Pfieffelmann et al. (2010) used a person-organization (P-O) fit framework to investigate job seeker attraction to organizations in the context of e-recruitment: recruitment information posted on real corporate web sites was presented to active job seekers in order to better understand reactions to online recruiting.
In similar lines, Kroustalis and Meade (2007) researched P-O fit as well as the use of pictures on recruitment websites and their effect on the likelihood of perceiving organizational culture. Another study of perceptions of organizational attractiveness reported that web site orientation and outcome expectancy influenced organizational attractiveness perceptions through the perceived usability of the website. Xiang and Hui (2007) analyzed the online organization attraction process; their study indicated that organizations use the internet to present human resource policies and organizational social information to applicants, which significantly affects attraction, with an interaction effect with organizational social information. OBJECTIVES 1. To standardize a measure in the Indian context to evaluate the perception of viewers towards organizational culture. 2. To compare viewer perception of organizational culture on the basis of website content.


METHODOLOGY Procedure: The study used an experimental design. The potential applicants

were shown two websites with different contents and were then asked to fill in a questionnaire on a scale of 1 to 5. The contents of the two websites were manipulated to examine their effect on the variables under research. The first website contained only the minimum required features, and the second website contained features communicating various dimensions of organizational culture. Sample: The data was collected from final-year students of the post graduate program of the institution. All the students who were seeking jobs were asked to stay in the experimental room; those who intended to join their family profession or to start their own business were requested not to participate in the experiment. From a list of 246 eligible students, 120 students were selected randomly with the help of the simple random method. The students were aged between 21 and 25 years. Measure: The questionnaire measured an organizational culture characterized by each of nine culture attributes. Items assessing diversity were adapted from Braddy et al. (2006), whereas items assessing the other eight attributes were adapted from O'Reilly et al. (1991). The reliability was .817 for all 32 items in the questionnaire. RESULTS AND DISCUSSION The data was collected through an experimental design in which the respondents filled in the questionnaire twice: first after viewing the first website, which communicated a weak organizational culture, and second after viewing the next website, which included contents communicating more organizational culture dimensions. The data was then put to various tests. Since the same respondents assessed both websites, the difference between them regarding perception of organizational culture was analysed through a paired t-test. Analysis of Paired t-test The following hypothesis was formed to test the significance of the difference.
Ho1: There is no significant difference in the perception of organizational culture between the two websites.

Table 1: Differences in Perception of Organizational Culture between the two websites

Sample      Mean       N     Std. Deviation
Website 1   182.1833   120   30.91408
Website 2   218.8750   120   18.60585
t-value = -10.798, p = .000; null hypothesis: rejected

The calculated value of t is significant; therefore the null hypothesis is rejected. The result shows a significant difference in the perception of organizational culture between the two websites.
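A paired t-test like the one in Table 1 divides the mean of the pairwise differences by its standard error. A minimal sketch in Python (the ratings below are invented for illustration, not the study's data):

```python
import math

def paired_t(before, after):
    """Paired t-test statistic: mean of pairwise differences over its standard error."""
    assert len(before) == len(after)
    n = len(before)
    diffs = [a - b for b, a in zip(before, after)]
    mean_d = sum(diffs) / n
    # Sample variance of the differences (n - 1 in the denominator).
    var_d = sum((d - mean_d) ** 2 for d in diffs) / (n - 1)
    se = math.sqrt(var_d / n)  # standard error of the mean difference
    return mean_d / se         # degrees of freedom = n - 1

# Hypothetical total scores of the same 5 respondents on two websites.
# (The paper's t is negative because it subtracts website 2 from website 1;
# here we compute after - before, so a positive t means website 2 scored higher.)
website1 = [150, 160, 170, 155, 165]
website2 = [152, 163, 171, 159, 167]
t = paired_t(website1, website2)
print(round(t, 4))  # 4.7068
```

The resulting t is compared against the t distribution with n - 1 degrees of freedom to obtain the p-value.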

Ho2: There is no difference in the perceived global P-O fit measure between website 1 and website 2.

Table 2: Differences in P-O Fit between the two websites

Sample      Mean     N     Std. Deviation
P-O fit 1   5.1833   120   1.66518
P-O fit 2   6.4583   120   1.28923
t-value = -6.416, p = .000; null hypothesis: rejected

The calculated value of t is significant; therefore the null hypothesis is rejected. The result shows a significant difference in the perceived P-O fit measure between website 1 and website 2. The paired t-test was applied to test the effects of the two websites' features on shaping perceptions of organizational culture between the two websites, i.e., website 1 and website 2, in general. The paired t-test result indicated that website features had a significant effect on the perception of organizational culture: the website containing culture-specific features had more effect on people than the website containing weak features. Hypothesis 2 stated that there is no significant difference in the perceived P-O fit measure between the two websites. The test results indicated that there is a difference in the perceived global P-O fit measure between the null website and the culture-specific website. This suggests that job seekers' P-O fit perceptions are strongly shaped by the contents of the website. The results indicated that the more strongly a company conveyed the culture attribute under consideration, the more favorable were the P-O fit perceptions formed by participants with strong culture preferences, while individuals with weak culture preferences formed less favorable P-O fit perceptions. In sum, organizations wishing to convey any of the mentioned culture attributes should include the relevant features on their websites as specified above.
Limitations of the Study
1. Participants in this study were relatively young, so it is difficult to infer whether these results would generalize to older job seekers.
2. Participants viewed websites maintained by a fabricated pharmaceutical company; the degree of realism incorporated in them was certainly lower than would be the case with most real company websites.
3. The websites conveyed different dimensions of organizational culture, but there are many other features and types of information, e.g., company size and growth and expansion plans, that organizations may include on their careers websites that were not investigated in this study.
4. The study was conducted at Gwalior.
5. The total number of respondents in our study is small due to the limitation of time.
CONCLUSION
The effect of website content on perceptions of organizational culture has been studied in this research. This study examined the effects that features (pictures, testimonials, organizational policies, and awards won) on websites had on shaping


viewers' perceptions of organizational culture, and showed that these website features can be used to effectively convey different aspects of culture to job seekers. In this study we showed website 1, which contained null website features that conveyed a weak sense of the culture dimension under consideration, and website 2, in which the website features conveyed a strong sense of a given culture dimension. Website 1 is termed the null website and website 2 the culture-specific website. This study also highlighted the importance of conveying organizational culture to job seekers by illustrating the effects their culture perceptions have on their P-O fit perceptions and, in turn, their attraction to organizations.
References
1. Allen, D. G., Mahto, R. V., & Otondo, R. F. (2007). Web-based recruitment: Effects of information, organizational brand, and attitudes toward a Web site on applicant attraction. Journal of Applied Psychology, 92(6), 1696-1708.
2. Braddy, P. W., Meade, A. W., Michael, J. J., & Fleenor, J. W. (2009). Internet recruiting: Effects of website content features on viewers' perceptions of organizational culture. International Journal of Selection and Assessment, 17(1), 19-34.
3. Braddy, P. W., Meade, A. W., & Kroustalis, C. M. (2006). Organizational recruitment website effects on viewers' perceptions of organizational culture. Journal of Business and Psychology, 20, 525-543.
4. Online recruiting: The effects of organizational familiarity, website usability, and website attractiveness on viewers' impressions of organizations. Journal of Business and Psychology, 37(3), 522-553.
5. Braddy, P. W., Thompson, L. F., Wuensch, K. L., & Grossnickle, W. F. (2003). Internet recruiting: The effects of web page design features. Social Science Computer Review, 21, 374-385.
6. Cable, D. M., & Judge, T. A. (1996). Person-organization fit, job choice decisions, and organizational entry. Organizational Behavior and Human Decision Processes, 67, 294-311.
7. Caldwell, D. F., & O'Reilly, C. A. (1990). Measuring person-job fit with a profile-comparison process. Journal of Applied Psychology, 75, 648-657.
8. Cober, R. T., Brown, D. J., Keeping, L. M., & Levy, P. E. (2004). Recruitment on the Net: How do organizational Web site characteristics influence applicant attraction? Journal of Management, 30, 623-646.
9. International Journal of Selection and Assessment, 11, 158-169.
10. Cober, R. T., Brown, D. J., Blumental, A. J., Doverspike, D., & Levy, P. (2000). The quest for the qualified job surfer: It's time the public sector catches the wave. Public Personnel Management, 29, 479-496.
11. Denison, D. R. (1984). Bringing corporate culture to the bottom line. Organizational Dynamics, 13(2), 4-22.
12. 356-72.
13. Elliot, R. H., & Tevavichulada, S. (1999). Computer literacy and human resource management: A public/private sector comparison. Public Personnel Management, 28, 259-274.
14. iLogos. (2004). Internet recruiting data and stats. Retrieved March 22, 2010, from http://www.recruitersnetwork.com/poll/trendwatch/2004/1.htm
15. Industrial and Organizational Psychology, New York.
16. Lyons, B. D., & Marler, J. H. (2011). Got image? Examining organizational image in web recruitment. Journal of Managerial Psychology, 26(1), 58-76.
17. O'Reilly, C. A., Chatman, J., & Caldwell, D. F. (1991). People and organizational culture: A profile comparison approach to assessing person-organization fit. Academy of Management Journal, 34, 487-516.
18.
19. Pfieffelmann, B., Wagner, S. H., & Libkuman, T. (2010). Recruiting on corporate web sites: Perceptions of fit and attraction. International Journal of Selection and Assessment, 18(1), 40-47.
20. Journal of Applied Psychology, 68, 147-154.
21. Smircich, L. (1983). Concepts of culture and organizational analysis. Administrative Science Quarterly, 28(3), 339-358.
22. the applicant 312.
23. Journal of Vocational Behavior, 63(2), 242-263.
24.


OPTIMIZATION AND NATURE-INSPIRED ALGORITHMS: A REVIEW Vani Agrawal Pratiksha Kulshrestha

Abstract Optimization is everywhere, from engineering design to business planning and from the routing of the Internet to holiday planning. In almost all these activities, we are trying to achieve certain objectives or to optimize something such as profit, quality or time. As resources, time and money are always limited in real-world applications, we have to seek out solutions that use these valuable resources under various constraints. Nowadays, computer simulation has become an indispensable tool for solving such optimization problems with various efficient search algorithms. Also, to satisfy the aim of solving various complex optimization problems, some nature-inspired algorithms have been developed. This paper focuses on optimization, its concepts and nature-inspired algorithms. Key words: Optimization, No Free Lunch Theorems, Nature-Inspired Algorithms. INTRODUCTION Optimization is the art of selecting the best alternative(s) amongst a given set of options. The process of finding the largest or smallest possible value which a given function can attain in its domain of definition is known as optimization. The applicability of optimization differs across disciplines. For instance, mathematicians are interested in finding the maxima or minima of a real function over an allowable set of variables, whereas in computing and engineering the goal is to maximize the performance of a system or application with minimal runtime and resources. In the case of the business industry, the aim is to optimize the efficiency of a production process or the quality of the product. The examples stated above show that optimization is indeed a part of our day-to-day life. Various techniques have emerged for tackling various types of optimization problems.
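As a small worked illustration of finding the minimum of a function over its domain, a gradient-descent sketch in Python (the quadratic objective and step size are arbitrary choices for illustration):

```python
def minimize(df, x0, step=0.1, iters=1000):
    """Crude gradient descent: repeatedly move against the derivative."""
    x = x0
    for _ in range(iters):
        x -= step * df(x)
    return x

# Minimize f(x) = (x - 3)^2, whose minimum is at x = 3.
f = lambda x: (x - 3) ** 2
df = lambda x: 2 * (x - 3)     # derivative of f
x_star = minimize(df, x0=0.0)
print(x_star)  # converges very close to 3.0
```

Such gradient-based methods find a local minimum; the sections below discuss why global and derivative-free problems need different techniques.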
Subsequently, various nature-inspired algorithms have been developed to see if they can cope with such problems. Among these, Artificial Bee Colony Optimization, Biogeography-based Optimization and Particle Swarm Optimization are popular due to their high efficiency. OPTIMIZATION Optimization means finding an alternative with the most cost-effective or highest achievable performance under the given constraints, by maximizing desired factors and minimizing undesired ones. In comparison, maximization means trying to attain the highest or maximum result or outcome without regard to cost or expense. The function to be optimized may be linear, non-linear or fractional. Sometimes even an explicit mathematical formulation of the function may

not be available. Often the function has to be optimized in a prescribed domain which is specified by a number of constraints in the form of equalities and inequalities. This domain is called the search space. The process of optimization addresses the problem of determining those values of the independent variables which do not violate the constraints and at the same time give an optimal value of the function being optimized. Obtaining the solution of many real-life problems is not possible without the help of robust optimization techniques. Optimization problems arise in various fields of science, engineering and industry. In view of their practical utility there is a need to develop efficient and robust computational algorithms which can solve problems numerically, irrespective of their size.
Statement of an optimization problem
An optimization or a mathematical programming problem can be stated as follows:

Find X = (x1, x2, ..., xn)^T which minimizes f(X)        (1.1)

subject to the constraints

g_j(X) <= 0,  j = 1, 2, ..., m
l_j(X) = 0,   j = 1, 2, ..., p

where X is an n-dimensional vector called the design vector, f(X) is termed the objective function, and g_j(X) and l_j(X) are known as the inequality and equality constraints, respectively. The number of variables n and the number of constraints m and/or p need not be related in any way. The problem stated in Eq. (1.1) is called a constrained optimization problem. Some optimization problems do not involve any constraints and can be stated as

Find X = (x1, x2, ..., xn)^T which minimizes f(X)        (1.2)

Such problems are called unconstrained optimization problems [1].
No free lunch theorems
One of the more interesting developments in optimization theory was the publication of the No Free Lunch (NFL) theorems [2]. This theorem states that the performance of all the

optimization algorithms, amortized over the set of all possible functions, is equivalent. The implications of this theorem are far reaching, since it implies that no algorithm can be designed so that it will be superior to a linear enumeration of the search space, or even a purely random search. The theorem is only defined over finite search spaces, however, and it is as yet not clear whether the result applies to infinite search spaces, e.g., R^D. All computer implementations of search algorithms effectively operate on finite search spaces, though, so the theorem is directly applicable to all existing algorithms. Although the NFL theorem states that all algorithms perform equally well over the set of all functions, it does not necessarily hold for all subsets of this set. The set of all functions over a finite domain includes the set of all permutations of this domain.
Method for global optimization
Global optimization focuses on determining the best of the local minima. Designing global optimization techniques is not an easy task since, in general, there is no criterion to decide whether a global optimal solution has been achieved. In view of the practical necessity, and with fast computing machines readily available, many computational techniques are now being reported in the literature for solving nonlinear optimization problems. The methods currently available in the literature for solving nonlinear global optimization problems may be broadly classified as deterministic methods and probabilistic methods. The deterministic methods try to guarantee that a neighborhood of a global optimum is attained. Such methods do not use any stochastic techniques, but use the same sequence of states to produce the same output for the same input on the same underlying machine. However, they are applicable only to a restricted class of functions. On the other hand, probabilistic methods are used to find a near-optimal solution.
This is achieved by assuming that good solutions are near to each other in the search space. This assumption is valid for most real-life problems. The probabilistic methods make use of a stochastic approach to search for the global optimal solutions. Although probabilistic methods do not give an absolute guarantee, they are sometimes preferred over the deterministic methods because they are applicable to a wider class of functions. In most optimization problems, complicated systems are modeled with complicated multi-dimensional functions that cannot be easily addressed. In such cases, algorithmic procedures that take full advantage of modern computer systems can be implemented to solve the underlying optimization problems numerically. Thus, computation accuracy, time criticality and implementation effort become important aspects of the numerical optimization procedure. Typical properties of such problems are the existence of discontinuities, the lack of an analytical representation of the objective function, and noise. In these circumstances, the applicability and efficiency of classical deterministic optimization algorithms are questionable. The main drawbacks of deterministic algorithms are that they are not robust, i.e., they can only be applied to a restricted class of problems, and they are too time consuming or sometimes unable to solve real-world problems. Therefore, a new class of algorithms has come into existence: nature-inspired algorithms.
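The probabilistic approach described above can be illustrated with the simplest such method, pure random search: sample the search space at random and keep the best point found. A minimal sketch in Python (the objective function is an arbitrary two-minimum example, not from the paper):

```python
import random

def random_search(f, lo, hi, n_samples=5000, seed=0):
    """Probabilistic global optimization: uniform sampling, keep the best point."""
    rng = random.Random(seed)  # fixed seed makes the run reproducible
    best_x, best_f = None, float("inf")
    for _ in range(n_samples):
        x = rng.uniform(lo, hi)
        fx = f(x)
        if fx < best_f:
            best_x, best_f = x, fx
    return best_x, best_f

# A function with two local minima near x = -2 and x = +2;
# the +0.3x term makes the one near x = -2 the global minimum.
f = lambda x: (x * x - 4) ** 2 + 0.3 * x
x_best, f_best = random_search(f, -5.0, 5.0)
print(x_best, f_best)
```

No guarantee of optimality is given, but with enough samples the best point found is very likely near the global minimum, which is exactly the trade-off the text describes.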


Nature Inspired Algorithms Researchers are gradually realizing that several systems observed in nature are able to cope efficiently with similar optimization problems. Thus, the trend to study and incorporate models of natural processes in optimization algorithms has gained ground, and researchers have analyzed such behaviors and designed algorithms that can be used to solve numerical optimization problems in many science and engineering domains. The advantage of nature-inspired algorithms is that they are applicable to a wider set of problems, as they do not require convexity, continuity or an explicit definition of functions, and they use a stochastic search strategy. At the same time, they have some inherent drawbacks: they converge to the global optimum only probabilistically and sometimes get stuck at local optima. Researchers are continuously working to get rid of these drawbacks. There are two fundamental processes which drive the members of a society, or a set of potential solutions, to update in the field of nature-inspired algorithms: the variation process, which enables exploring different areas of the search space, and the selection process, which ensures the exploitation of previous experience. However, it has been shown that these algorithms may occasionally stop proceeding towards the global optimum even though the set of potential solutions has not converged to a local optimum [3, 4]. Population-based optimization algorithms find near-optimal solutions to difficult optimization problems by taking their motivation from nature. Fitness-based updating of all the potential solutions is a common feature of all population-based algorithms; hence the population is moved towards better solution areas of the search space. Two important classes of population-based optimization algorithms are evolutionary algorithms [5] and swarm intelligence-based algorithms [7].
Genetic Algorithms (GA) [8], Genetic Programming (GP) [8], Evolution Strategy (ES) [9] and Evolutionary Programming (EP) [10] are the popular evolutionary algorithms. In recent years, swarm intelligence has also attracted the interest of many research scientists. Swarm intelligence is a meta-heuristic method in the field of artificial intelligence that is used to solve optimization problems. It is based on the collective and cooperative behavior of social insects, flocks of birds, or schools of fish. However, a swarm can be considered as any collection of interacting agents or individuals. These individuals sometimes can solve

complex tasks without centralized control. Bonabeau defined swarm intelligence as "any attempt to design algorithms or distributed problem-solving devices inspired by the collective behavior of social insect colonies and other animal societies". Researchers have analyzed the intelligent behavior of swarms and designed algorithms that can be used to solve nonlinear, nonconvex or combinatorial optimization problems in many science and engineering domains. Previous research [12-26] has shown that algorithms based on swarm intelligence have great potential to find solutions of real-world optimization problems. The algorithms based on swarm intelligence that have emerged in recent years include Ant Colony Optimization (ACO) [15], Particle Swarm Optimization (PSO) [16], Bacterial Foraging Optimization (BFO) [27], Artificial Bee Colony Optimization (ABC) [28], etc.
Nature as an optimizer
1. Aircraft wing design is inspired by birds to minimize drag.

2. Wind turbine design is an inspiration from the humpback whale to maximize maneuverability (enhanced lift devices control flow over the flipper and maintain lift at high angles of attack).

3. The Bionic Car is an inspiration from the boxfish, so as to minimize drag and maximize the rigidity of the exoskeleton.


4. The shape of the bullet train is inspired by the kingfisher to minimize micro-pressure waves.
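Of the swarm-based methods listed above, PSO is easy to sketch: each particle's velocity is pulled toward its own best position and the swarm's global best. The following is a minimal toy implementation on the sphere function with conventional parameter values (w = 0.7, c1 = c2 = 1.5); it is an illustration, not any specific published variant:

```python
import random

def pso(f, dim=2, n_particles=20, iters=100, seed=1):
    """Minimal Particle Swarm Optimization, initialized in the box [-5, 5]^dim."""
    rng = random.Random(seed)
    w, c1, c2 = 0.7, 1.5, 1.5  # inertia, cognitive and social coefficients
    xs = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(n_particles)]
    vs = [[0.0] * dim for _ in range(n_particles)]
    pbest = [x[:] for x in xs]            # personal best positions
    pbest_f = [f(x) for x in xs]
    g = min(range(n_particles), key=lambda i: pbest_f[i])
    gbest, gbest_f = pbest[g][:], pbest_f[g]  # global best
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vs[i][d] = (w * vs[i][d]
                            + c1 * r1 * (pbest[i][d] - xs[i][d])
                            + c2 * r2 * (gbest[d] - xs[i][d]))
                xs[i][d] += vs[i][d]
            fx = f(xs[i])
            if fx < pbest_f[i]:        # update personal best
                pbest[i], pbest_f[i] = xs[i][:], fx
                if fx < gbest_f:       # update global best
                    gbest, gbest_f = xs[i][:], fx
    return gbest, gbest_f

sphere = lambda x: sum(v * v for v in x)  # global minimum 0 at the origin
best, best_f = pso(sphere)
print(best_f)
```

The variation process of the text corresponds to the randomized velocity update; the selection process corresponds to keeping personal and global bests.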

Conclusion Researchers have confirmed that several systems observed in nature are able to solve efficiently many complex optimization problems which cannot be dealt with by existing deterministic algorithms. Thus, the trend to study and incorporate models of natural processes in optimization algorithms has gained popularity. Researchers have analyzed the food-foraging behaviors of natural species and designed algorithms that can be used to solve numerical optimization problems in many science and engineering domains. These algorithms have been developed to see if they can cope with the challenging problems of optimization. References

1. Rao, S. S. (2009). Engineering Optimization: Theory and Practice. John Wiley & Sons, Inc., Hoboken, New Jersey.
2. Wolpert, D. H., & Macready, W. G. (1997). No free lunch theorems for optimization. IEEE Transactions on Evolutionary Computation, 1(1), 67-82.

3. -132, 2009. -

4.

Mezura-

5.

evolutionary computation, pages 485-492. ACM,2006. Sindhya, K. (2012). An Introduction to Nature Inspired Algorithms. University of Jyvaskyla.


6. Science, 1990. 7. pp. 297-309, 1998. 8. 9.

-IEEE press, 2009. Conference of the North American on Fuzzy Information Processing Society (NAFIPS), pages 524-527. IEEE, 1996.

10. Oxford University Press, USA, 1999. 11.

and

12. Xinintelligence and bio13. VSV Rao, R.S. Shekh 6. IEEE, 2012. 14. 15.

Dorigo, M., & Stützle, T. (2004). Ant Colony Optimization. MIT Press.
Kennedy, J., & Eberhart, R. (1995). Particle swarm optimization. In IEEE International Conference on Neural Networks, Proceedings, volume 4, pages 1942-1948. IEEE.

16.

ractical approach to global

17. on Evolutionary Computation (CEC), volume 2, pages 1980-1987. IEEE, 2004. 18. or Optimization (NICSO), 129 (4) : 221-238, 2008. 19. 316, 2004. 20. KN Krishnanand, P Amruth, MH Guruprasad, Sharschchandra V Bidargaddi, and Debasish Ghose. Of IEEE International Conference on Robotics and Automation, pages 958-963. IEEE, 2006. 21. -369, 2007. 22. Swanti Satsangi, Ashish Gula Science, Engineering and Technology, number 63. World Academy of Science, Engineering and Technology, 2012. 23. -212, 1996. 24. Sung Soo Kim, Il-

zation for sonnet ring -

1626, 2008. 25.

-inspired computing: Theory and applications j. Journal of Universal Computer Science, 18(13): 1757-1759, 2012.

26. (IJSIR), 1(1): 1-16, 2010.


Antecedents of Performance of Students in Mathematics * Smrita Bhadouria

** Reeta Chauhan *** Shailja Bhakar

Abstract The main objective of this study was to find the factors that affect the performance of students in the field of mathematics across different classes. The impact of factors that affect students' attitude and students' perception towards mathematics, such as parental involvement, teacher factor and diligence, was identified. The study also identified the effect of demographic variables, i.e., class, gender and amount of study after school or college, on students' attitude, students' perception, parental involvement, teacher factor and diligence. This study was based on the survey method. The sample size was 270, collected from 12th grade, undergraduate and postgraduate students studying in private management institutions. The findings indicated that parents' involvement in the academic performance of students leads to a positive perception of students towards mathematics, but students' perception alone does not contribute to higher performance in mathematics; students' attitude should also be positive towards mathematics. A positive attitude can be attained if the teacher factor is strong and students' diligence, i.e., hard work towards mathematics, is high. Keywords: Attitude, Perception, Parental Involvement, Teacher Factor, Diligence. INTRODUCTION: In this modern era mathematics is the backbone of all the services which encircle human life, because mathematics is a great motivator for all humans: the career starts with zero but it never ends (infinity) (Vignesh, R.). This study was essential as an outcome of the poor performance of students in mathematics at the 12th grade, undergraduate and postgraduate levels. Mathematics plays an important role in every field, such as planning, business, money handling and research; mathematics helps to understand the concepts of profit, data presentation, data handling, construction and optimization. The study of mathematics and the sciences is facing major problems in the whole world.
In this era students consider mathematics a difficult subject and do not take mathematics as a main subject. This research concentrates on the factors that affect student performance in mathematics. Once the factors that lead to improvement in performance in mathematics are known, they can be improved and students' interest in mathematics can further be generated. The literature has indicated that parental involvement, teacher factor and diligence are important variables that affect students' performance. Parental involvement is the amount of involvement parents show in

their wards' studies; teacher factor is evaluated through the way in which a teacher explains the topics, provides motivation to the students towards learning and pays attention to individual students' learning; finally, diligence is carefulness, attentiveness, etc., towards learning. Therefore, in the current research the combined effect of these variables, i.e., parental involvement, teacher factor and diligence as independent variables, was evaluated on the dependent variables students' attitude and students' perception, taken as indicators of student performance.

Review of literature: Al-Agili et al. (2012) identified the key factors affecting students' achievement in mathematics. The sample size of this study was 201, and six factors were identified, including teacher attribution, classroom climate, students' attitude and students' anxiety towards mathematics. The results indicated that teacher attribution had the highest impact, and students' attitude the lowest impact, on students' achievement in mathematics. The results also indicated that students' attitude towards mathematics had the lowest association with student achievement, teaching pedagogy, teacher attribution and classroom climate, and a high association with students' anxiety. Bhakar et al. (2014) revealed that the selection of a student's career in the field of mathematics depends on the student's attitude towards mathematics. The results indicated a strong contribution of the usefulness of mathematics and self-confidence to their career choice, whereas there was no contribution of students' interest and diligence to their career choice in the field of mathematics. The authors revealed that students' interest, self-confidence, usefulness and career choice were highest in the 10th class and diligence was highest in the 11th class. There was a significant effect of gender on students' interest towards mathematics.

Githua (2013) explored the strong and considerable relationship between students' perceptions of formative assessment in mathematics and their motivation to learn mathematics in secondary schools of the Nairobi and Rift Valley provinces, Kenya. The findings of this study indicated that only some characteristics of students' perceptions of formative assessment in mathematics classrooms were influenced by the gender gap.

Mbugua et al. (2012) explored three factors (student, socio-economic and school) influencing the performance of secondary school students in mathematics. Student factors include the entry behavior of the student, and the motivation and attitude of the student towards mathematics. Socio-economic factors include the educational background of parents as well as their sources of income. Finally, school factors contain the accessibility and usage of teaching and learning facilities, school type, teacher characteristics, teacher workload and teacher attitude towards mathematics. Mokhtar et al. (2012) applied factor analysis, using the maximum likelihood method of extraction and varimax orthogonal rotation, and identified four factors affecting the performance of students in Business Mathematics at UiTM Kedah. These four factors were students' attitude, the responsibility of the teacher in learning and teaching, peers, and the interest of students.

Murray (2013) conducted a study on the factors that influence mathematics achievement, based on student data. The variables for the study were prior academic performance, self-efficacy, self-regulation, academic resources, performance and learning style. The results indicated a positive relationship between the independent variables, and all the variables were good predictors of mathematics achievement. The author proposed that academic performance can be improved by classroom learning and a variety of learning styles; students need exposure to a selection of learning styles in the teaching and learning sequence, and students with prior knowledge of mathematics have high mathematics self-efficacy, positive attitudes and access to the necessary academic resources.

Yemi et al. (2013) examined a sample of 120 male and female students from senior secondary schools; the survey method was used, t-tests and chi-square tests were applied, and all the teachers teaching mathematics were used as samples for the study. The results indicated that teacher qualification and gender differences affect the performance of students in mathematics. The study discovered that the performance of female students was weaker than that of male students in mathematics. It also discovered that teacher qualification affects the performance of the students; therefore a lack of qualified mathematics teachers affects the performance of students negatively. Brew (2011) proposed a study on 140 students from high school and 32 teachers, based on mathematical activities and classroom learning factors that support the performance of senior high school students. ANOVA and the F-test were applied. The results indicated that there are many factors that affect the performance of the students: if the teacher has good knowledge and an effective teaching style in mathematics, then the performance of the students definitely improves. The study also indicated that a limited number of students in class, and the facilities provided by the teacher during classes, such as study material, assignments and textbooks, help the students to perform better in mathematics. Adeyinka et al. (2013) focused on the effect of teachers' motivation on the performance of students in mathematics. Data was collected from four government schools. The results indicated that teachers have to be motivated and effective to improve the performance of students in mathematics; teachers' non-monetary benefits and payment can relatively influence students' academic performance in mathematics. Promotion and training of teachers can also affect the performance of students.
Manoah et al. (2011) reported the overall scores of all the students on four elements of the mathematics curriculum, namely content, objectives, evaluation and methods. Differences on the differentiating variable gender were also evaluated on all these elements. Male students were positive towards the objectives and contents of the mathematics curriculum, while they were neutral on methods and evaluation. Female students were positive towards objectives, neutral towards content and negative towards both evaluation and methods of the mathematics curriculum.

Objectives of study:


1. To standardize questionnaires on student attitude, student perception, parental involvement, teacher factor and diligence. 2. To identify the impact of parental involvement, teacher factor and diligence as independent variables on student attitude and student perception as dependent variables. 3. To identify differences between class, gender and amount of study on all the variables of the study. 4. To identify areas for future research. Research Methodology:

The study is causal in nature. Data were collected through the survey method. The population included 12th grade, undergraduate and postgraduate students from private management institutes in the Gwalior region. The individual student was the sampling element. A non-probability purposive sampling technique was used to select the sample of 270 students. Standardized questionnaires (Breiteig et al., 2005; Kyoung Um et al., 2005; Kiamanesh, 2004; Lamb & Fullarton, 2001) were used to measure the antecedents of students' performance in mathematics. The questionnaires were designed on a 5-point Likert scale, where 1 stands for strongly disagree and 5 for strongly agree, to collect quantitative data. The data were analyzed using PASW 18 for calculating reliability and underlying factors. Cronbach's alpha was used to check the reliability of the questionnaires, and validity was checked through the face validity method. MANOVA was applied to find differences across the demographic variables of the study (class, gender and amount of study), and MANCOVA was applied to find the impact of the independent variables on the dependent variables.

Reliability Test: Reliability coefficients were computed using PASW 18 for the questionnaires on student attitude, student perception, parental involvement, teacher factor and diligence. The reliability test values of all the questionnaires are given below:

S.No.  Variable Name         Cronbach's Alpha  No. of Items
1      Student's Attitude    0.866             10
2      Student's Perception  0.699             6
3      Parental Involvement  0.595             4
4      Teacher Factor        0.758             4
5      Diligence             0.477             2
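The alpha values above were produced with PASW 18. As a rough illustration of what the coefficient measures, Cronbach's alpha can be computed directly from raw item scores with the standard formula alpha = k/(k-1) · (1 - Σ item variances / variance of total scores). The sketch below is plain Python on a made-up toy data set, not the study's actual responses:

```python
from statistics import variance

def cronbach_alpha(items):
    """Cronbach's alpha from a list of item-score columns.

    items: list of k lists, each holding one item's scores for all
    respondents.  Uses sample (n-1) variances, as SPSS/PASW does.
    """
    k = len(items)
    n = len(items[0])
    # Total scale score for each respondent.
    totals = [sum(item[i] for item in items) for i in range(n)]
    item_var_sum = sum(variance(item) for item in items)
    return k / (k - 1) * (1 - item_var_sum / variance(totals))

# Toy example (hypothetical data): three perfectly consistent items
# give alpha = 1.0; real scales fall below that.
toy = [[1, 2, 3, 4], [1, 2, 3, 4], [1, 2, 3, 4]]
alpha = cronbach_alpha(toy)
print(round(alpha, 3))  # perfectly correlated items -> 1.0
```

The toy data set is purely illustrative; with real multi-item scales the item columns are only partially correlated, which pulls alpha below 1.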

The table indicates that the Cronbach's alpha coefficient exceeds 0.5 for all the questionnaires of the study except diligence (0.477), which is close to that threshold, indicating that the questionnaires were sufficiently reliable for the study.

Kaiser-Meyer-Olkin Measure of Sampling Adequacy and Bartlett's Test of Sphericity:

S.No.  Variable Name         KMO Test of Adequacy  Bartlett's Test of Sphericity (Chi-Square)  Significance Level
1      Student's Attitude    0.871                 1269.047                                    0.000
2      Student's Perception  0.744                 254.928                                     0.000
3      Parental Involvement  0.679                 101.666                                     0.000
4      Teacher Factor        0.758                 254.725                                     0.000
5      Diligence             0.558                 38.973                                      0.000
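The chi-square values in the table were produced by PASW. For reference, Bartlett's sphericity statistic can be computed from a p x p item correlation matrix R as chi² = -(n - 1 - (2p + 5)/6) · ln|R|, with p(p-1)/2 degrees of freedom. A minimal plain-Python sketch follows; the 2-item correlation matrix used in the example is hypothetical, not the study's item data:

```python
import math

def det(m):
    """Determinant of a small square matrix via Gaussian elimination."""
    a = [row[:] for row in m]
    n = len(a)
    d = 1.0
    for i in range(n):
        # Partial pivoting for numerical stability.
        p = max(range(i, n), key=lambda r: abs(a[r][i]))
        if p != i:
            a[i], a[p] = a[p], a[i]
            d = -d
        if a[i][i] == 0:
            return 0.0
        d *= a[i][i]
        for r in range(i + 1, n):
            f = a[r][i] / a[i][i]
            for c in range(i, n):
                a[r][c] -= f * a[i][c]
    return d

def bartlett_sphericity(corr, n):
    """Bartlett's test statistic for H0: correlation matrix is identity.

    corr: p x p item correlation matrix, n: sample size.
    Returns (chi_square, degrees_of_freedom).
    """
    p = len(corr)
    chi2 = -(n - 1 - (2 * p + 5) / 6.0) * math.log(det(corr))
    return chi2, p * (p - 1) // 2

# Hypothetical 2-item correlation matrix with r = 0.5 and the study's
# sample size n = 270.  An identity matrix (uncorrelated items) would
# give chi-square = 0, and H0 could not be rejected.
chi2, df = bartlett_sphericity([[1.0, 0.5], [0.5, 1.0]], 270)
print(round(chi2, 3), df)
```

Large chi-square values, as in the table above, mean the correlation matrix is far from an identity matrix, which is what makes factor analysis worthwhile.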

The Kaiser-Meyer-Olkin (KMO) Measure of Sampling Adequacy is an index used to evaluate the appropriateness of factor analysis. High values (between 0.5 and 1.0) indicate that factor analysis is appropriate; values below 0.5 indicate that it may not be. The KMO values for student's attitude, student's perception, parental involvement, teacher factor and diligence were 0.871, 0.744, 0.679, 0.758 and 0.558 respectively, indicating that the sample was adequate and the data suitable for factor analysis.

Bartlett's Test of Sphericity: Bartlett's Test of Sphericity is a test statistic used to examine the hypothesis that the variables are uncorrelated in the population, i.e. that the population correlation matrix is an identity matrix: each variable correlates perfectly with itself (r = 1) but has no correlation with the other variables (r = 0). The chi-square values shown in the table were significant at the 0.000 level for all five measures; the null hypothesis is therefore rejected, indicating that the data were suitable for factor analysis.

Factor Analysis: Principal component factor analysis with varimax rotation was applied to find the underlying factors of the questionnaire on student's attitude. The analysis converged on two factors, Curious and Attentive, after three iterations.

Factor analysis for student attitude:

Factor     Eigenvalue  % of Variance  Items Converged                                      Loading
Curious    3.580       35.798         2  Mathematics is interesting for me.                0.864
                                      3  Mathematics is an easy subject for me.            0.829
                                      6  Mathematics is one of the subjects I like most.   0.816
                                      4  I enjoy learning Mathematics.                     0.784
                                      1  Mathematics is exciting for me.                   0.772
Attentive  2.882       28.815         14 I am always punctual in the classroom.            0.856
                                      16 I always concentrate in the classroom.            0.805
                                      15 I always do my homework on time.                  0.773
                                      11 I keep good attendance in Mathematics lectures.   0.608
                                      17 I always identify my problems in solving          0.591
                                         mathematical questions.


Factor Analysis: Principal component factor analysis with varimax rotation was applied to find the underlying factors of the questionnaire on student's perception. The analysis converged on two factors, Considerate and Noteworthy, after three iterations.

Factor analysis for student perception:

Factor       Eigenvalue  % of Variance  Items Converged                                                  Loading
Considerate  1.784       29.739         26 I need to do well in mathematics to please myself.            0.773
                                        29 Most of my friends think it is important for me to do         0.731
                                           well in mathematics at school.
                                        27 I need to do well in mathematics to please my parents.        0.700
Noteworthy   1.718       28.629         35 I would like a job involving mathematics.                     0.825
                                        37 I need to do well in mathematics to get into the              0.769
                                           university/post-school course I prefer.
                                        36 I need to do well in mathematics to get the job I want.       0.615
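A small consistency check that can be run on the two factor tables: in a principal component analysis of k standardized items, total variance equals k, so each factor's percentage of variance is its eigenvalue divided by k. The sketch below reproduces the reported percentages from the eigenvalues (10 attitude items, 6 perception items):

```python
def pct_variance(eigenvalue, n_items):
    """Percentage of total variance explained by one principal
    component: with standardized items, total variance = n_items."""
    return eigenvalue / n_items * 100

# Attitude questionnaire: 10 items.
print(round(pct_variance(3.580, 10), 2))  # ~35.80 (table: 35.798)
print(round(pct_variance(2.882, 10), 2))  # ~28.82 (table: 28.815)
# Perception questionnaire: 6 items.
print(round(pct_variance(1.784, 6), 2))   # ~29.73 (table: 29.739)
print(round(pct_variance(1.718, 6), 2))   # ~28.63 (table: 28.629)
```

The small discrepancies in the last decimal place come from the eigenvalues themselves being rounded to three decimals in the tables.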

MANCOVA: Multiple analysis of covariance was applied with parental involvement, teacher factor and diligence as independent variables and student attitude and student perception as dependent variables.

Tests of Between-Subjects Effects

Source                Dependent Variable  Type III Sum of Squares  df   Mean Square  F       Sig.
Corrected Model       Student_Attitude    6651.011a                3    2217.004     55.301  .000
                      Student_Perception  2105.829b                3    701.943      62.389  .000
Intercept             Student_Attitude    1486.009                 1    1486.009     37.067  .000
                      Student_Perception  260.625                  1    260.625      23.164  .000
Parental_Involvement  Student_Attitude    19.761                   1    19.761       .493    .483
                      Student_Perception  597.172                  1    597.172      53.077  .000
Teacher_Factor        Student_Attitude    1580.029                 1    1580.029     39.413  .000
                      Student_Perception  151.386                  1    151.386      13.455  .000
Diligence             Student_Attitude    2301.892                 1    2301.892     57.419  .000
                      Student_Perception  322.989                  1    322.989      28.707  .000
Error                 Student_Attitude    10663.789                266  40.089
                      Student_Perception  2992.778                 266  11.251
Total                 Student_Attitude    353124.000               270
                      Student_Perception  109686.000               270
Corrected Total       Student_Attitude    17314.800                269
                      Student_Perception  5098.607                 269

a. R Squared = .384 (Adjusted R Squared = .377)
b. R Squared = .413 (Adjusted R Squared = .406)
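The entries in the table are internally consistent and can be re-derived from the sums of squares: each F statistic is the effect mean square divided by the error mean square, and the footnoted R² values equal 1 - SS_error / SS_corrected-total. A short sketch using figures from the table:

```python
def f_ratio(ms_effect, ms_error):
    """F statistic for a between-subjects effect."""
    return ms_effect / ms_error

def r_squared(ss_error, ss_corrected_total):
    """R-squared of the fitted model."""
    return 1 - ss_error / ss_corrected_total

# Student_Attitude column of the MANCOVA table above.
ms_error = 40.089  # Error mean square, df = 266
print(round(f_ratio(2217.004, ms_error), 2))      # corrected model, ~55.30
print(round(f_ratio(19.761, ms_error), 3))        # parental involvement, ~0.493
print(round(r_squared(10663.789, 17314.800), 3))  # footnote a, ~0.384
print(round(r_squared(2992.778, 5098.607), 3))    # footnote b, ~0.413
```

The same two identities reproduce every F and R² value in the MANOVA table later in the paper as well.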

The model with parental involvement, teacher factor and diligence as independent variables and student attitude and student perception as dependent variables was tested through the F-values of the corrected model in the tests of between-subjects effects table. The model had a good fit, as indicated by F-values of 55.301 and 62.389, significant at the 0.000 level. Parental involvement, teacher factor and diligence all had a significant effect on student perception (F = 53.077, 13.455 and 28.707, all significant at the 0.000 level). Teacher factor and diligence also had a significant effect on student attitude (F = 39.413 and 57.419, significant at the 0.000 level), whereas parental involvement did not have a significant effect on student attitude (F = 0.493, significant at the 0.483 level).

MANOVA: Multiple analysis of variance was applied taking gender, amount of study and class as independent variables and parental involvement, teacher factor, diligence, student attitude and student perception as dependent variables.

Box's Test of Equality of Covariance Matrices (a)

Box's M  371.356
F        1.407
df1      210
df2      7121.667
Sig.     .000

Tests the null hypothesis that the observed covariance matrices of the dependent variables are equal across groups. a. Design: Intercept + Gender + Amount_of_Study + Class + Gender * Amount_of_Study + Gender * Class + Amount_of_Study * Class + Gender * Amount_of_Study * Class

Box's M was applied to check the equality of error covariances; the F-value of 1.407, significant at 0.000, indicated that the observed covariance matrices of the dependent variables were not equal across groups.

Levene's Test of Equality of Error Variances (a)

Variable              F      df1  df2  Sig.
Student_Attitude      1.571  17   252  .072
Student_Perception    .735   17   252  .765
Parental_Involvement  1.192  17   252  .271
Teacher_Factor        1.894  17   252  .019
Diligence             1.667  17   252  .049

Tests the null hypothesis that the error variance of the dependent variable is equal across groups. a. Design: Intercept + Gender + Amount_of_Study + Class + Gender * Amount_of_Study + Gender * Class + Amount_of_Study * Class + Gender * Amount_of_Study * Class
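Levene's test is, in effect, a one-way ANOVA on the absolute deviations of each observation from its group mean: unequal variances show up as group differences in those deviations. A self-contained plain-Python sketch on made-up toy data (not the study's data):

```python
def levene_f(groups):
    """Levene's test statistic (mean-centered variant): one-way ANOVA F
    computed on z_ij = |x_ij - mean_i|.  Returns (F, df1, df2)."""
    # Absolute deviations from each group's own mean.
    z = []
    for g in groups:
        m = sum(g) / len(g)
        z.append([abs(x - m) for x in g])
    k = len(z)
    n = sum(len(g) for g in z)
    grand = sum(sum(g) for g in z) / n
    means = [sum(g) / len(g) for g in z]
    ss_between = sum(len(g) * (m - grand) ** 2 for g, m in zip(z, means))
    ss_within = sum((x - m) ** 2 for g, m in zip(z, means) for x in g)
    f = (ss_between / (k - 1)) / (ss_within / (n - k))
    return f, k - 1, n - k

# Toy groups (hypothetical): the second group is far more spread out,
# so the test statistic is large.
f, df1, df2 = levene_f([[1, 2, 3], [1, 2, 9]])
print(round(f, 3), df1, df2)  # 8.0 1 4
```

In the table above, df1 = 17 and df2 = 252 because the design crosses gender, amount of study and class into 18 cells over 270 respondents.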

Levene's test for equality of error variances indicated that the error variances of student attitude, student perception and parental involvement were equal across groups (F = 1.571, 0.735 and 1.192, significant at 0.072, 0.765 and 0.271 respectively), whereas for teacher factor and diligence the error variances were not equal (F = 1.894 and 1.667, significant at 0.019 and 0.049).

Tests of Between-Subjects Effects

Source                            Dependent Variable    Type III Sum of Squares  df   Mean Square  F         Sig.
Corrected Model                   Student_Attitude      2879.805a                17   169.400      2.957     .000
                                  Student_Perception    567.299b                 17   33.371       1.856     .022
                                  Parental_Involvement  226.447c                 17   13.320       1.359     .158
                                  Teacher_Factor        488.561d                 17   28.739       2.347     .002
                                  Diligence             108.146e                 17   6.362        1.966     .014
Intercept                         Student_Attitude      116945.392               1    116945.392   2041.583  .000
                                  Student_Perception    35575.007                1    35575.007    1978.435  .000
                                  Parental_Involvement  16973.123                1    16973.123    1731.442  .000
                                  Teacher_Factor        19454.206                1    19454.206    1589.053  .000
                                  Diligence             3373.987                 1    3373.987     1042.739  .000
Gender                            Student_Attitude      590.149                  1    590.149      10.303    .002
                                  Student_Perception    87.595                   1    87.595       4.871     .028
                                  Parental_Involvement  18.585                   1    18.585       1.896     .170
                                  Teacher_Factor        42.524                   1    42.524       3.473     .064
                                  Diligence             .127                     1    .127         .039      .843
Amount_of_Study                   Student_Attitude      129.417                  2    64.709       1.130     .325
                                  Student_Perception    117.616                  2    58.808       3.270     .040
                                  Parental_Involvement  60.748                   2    30.374       3.098     .047
                                  Teacher_Factor        116.993                  2    58.497       4.778     .009
                                  Diligence             17.918                   2    8.959        2.769     .065
Class                             Student_Attitude      1.435                    2    .718         .013      .988
                                  Student_Perception    14.514                   2    7.257        .404      .668
                                  Parental_Involvement  .121                     2    .060         .006      .994
                                  Teacher_Factor        22.241                   2    11.120       .908      .405
                                  Diligence             30.688                   2    15.344       4.742     .010
Gender * Amount_of_Study          Student_Attitude      37.077                   2    18.539       .324      .724
                                  Student_Perception    .317                     2    .158         .009      .991
                                  Parental_Involvement  3.462                    2    1.731        .177      .838
                                  Teacher_Factor        26.178                   2    13.089       1.069     .345
                                  Diligence             3.395                    2    1.698        .525      .592
Gender * Class                    Student_Attitude      319.844                  2    159.922      2.792     .063
                                  Student_Perception    90.230                   2    45.115       2.509     .083
                                  Parental_Involvement  7.978                    2    3.989        .407      .666
                                  Teacher_Factor        10.297                   2    5.148        .421      .657
                                  Diligence             2.659                    2    1.330        .411      .663
Amount_of_Study * Class           Student_Attitude      423.185                  4    105.796      1.847     .120
                                  Student_Perception    128.670                  4    32.168       1.789     .132
                                  Parental_Involvement  60.065                   4    15.016       1.532     .193
                                  Teacher_Factor        117.616                  4    29.404       2.402     .050
                                  Diligence             31.129                   4    7.782        2.405     .050
Gender * Amount_of_Study * Class  Student_Attitude      53.019                   4    13.255       .231      .921
                                  Student_Perception    57.286                   4    14.322       .796      .528
                                  Parental_Involvement  15.444                   4    3.861        .394      .813
                                  Teacher_Factor        86.813                   4    21.703       1.773     .135
                                  Diligence             9.875                    4    2.469        .763      .550
Error                             Student_Attitude      14434.995                252  57.282
                                  Student_Perception    4531.309                 252  17.981
                                  Parental_Involvement  2470.327                 252  9.803
                                  Teacher_Factor        3085.146                 252  12.243
                                  Diligence             815.395                  252  3.236
Total                             Student_Attitude      353124.000               270
                                  Student_Perception    109686.000               270
                                  Parental_Involvement  51527.000                270
                                  Teacher_Factor        59417.000                270
                                  Diligence             11352.000                270
Corrected Total                   Student_Attitude      17314.800                269
                                  Student_Perception    5098.607                 269
                                  Parental_Involvement  2696.774                 269
                                  Teacher_Factor        3573.707                 269
                                  Diligence             923.541                  269

a. R Squared = .166 (Adjusted R Squared = .110)
b. R Squared = .111 (Adjusted R Squared = .051)
c. R Squared = .084 (Adjusted R Squared = .022)
d. R Squared = .137 (Adjusted R Squared = .078)
e. R Squared = .117 (Adjusted R Squared = .058)

The model with gender, amount of study and class as independent variables and student attitude, student perception, parental involvement, teacher factor and diligence as dependent variables was tested through the F-values of the corrected model in the tests of between-subjects effects table. The model had a good fit for student attitude, student perception, teacher factor and diligence (F = 2.957, 1.856, 2.347 and 1.966, significant at 0.000, 0.022, 0.002 and 0.014 respectively). For parental involvement the model had a poor fit (F = 1.359, significant at 0.158).

Gender had a significant effect on student attitude and student perception (F = 10.303 and 4.871, significant at 0.002 and 0.028), but not on parental involvement, teacher factor or diligence (F = 1.896, 3.473 and 0.039, significant at 0.170, 0.064 and 0.843). The results indicate a significant difference between male and female students' attitude and perception.

Amount of study had a significant effect on student perception, parental involvement and teacher factor (F = 3.270, 3.098 and 4.778, significant at 0.040, 0.047 and 0.009), but not on student attitude or diligence (F = 1.130 and 2.769, significant at 0.325 and 0.065). The results indicate that students differing in amount of study differ significantly in perception, parental involvement and teacher factor.

Class had a significant effect on diligence (F = 4.742, significant at 0.010) but no significant effect on student attitude, student perception, parental involvement or teacher factor (F = 0.013, 0.404, 0.006 and 0.908, significant at 0.988, 0.668, 0.994 and 0.405). From these results it can be understood that there is a significant difference in the diligence of students belonging to different classes, i.e. 12th grade, undergraduate and postgraduate.

Post Hoc Multiple Comparisons (Tukey HSD) by Amount of Study

(Amount of study: 1 = 1 hr, 2 = 2 hrs, 3 = 3 hrs. Mirror-image comparisons (J vs I) are omitted; they carry the same values with reversed signs.)

Dependent Variable    (I)  (J)  Mean Difference (I-J)  Std. Error  Sig.  95% CI Lower  95% CI Upper
Student_Attitude      1    2    -2.5766                1.43673     .174  -5.9639       .8107
                      1    3    -5.0647*               1.42045     .001  -8.4136       -1.7158
                      2    3    -2.4881*               .99276      .034  -4.8287       -.1475
Student_Perception    1    2    -1.5586                .80497      .131  -3.4564       .3393
                      1    3    -2.6023*               .79585      .004  -4.4787       -.7260
                      2    3    -1.0438                .55622      .148  -2.3552       .2676
Parental_Involvement  1    2    -.1532                 .59435      .964  -1.5544       1.2481
                      1    3    -1.2712                .58762      .080  -2.6566       .1142
                      2    3    -1.1180*               .41069      .019  -2.0863       -.1497
Teacher_Factor        1    2    -1.0631                .66421      .247  -2.6290       .5029
                      1    3    -1.9107*               .65668      .011  -3.4589       -.3625
                      2    3    -.8477                 .45896      .157  -1.9297       .2344
Diligence             1    2    -.2973                 .34147      .659  -1.1024       .5078
                      1    3    -.6834                 .33760      .108  -1.4794       .1125
                      2    3    -.3861                 .23595      .232  -.9424        .1702

Based on observed means. The error term is Mean Square(Error) = 3.236. *. The mean difference is significant at the .05 level.

The post hoc test was applied to identify individual differences between students with different amounts of study (1 hr, 2 hrs and 3 hrs) on all the variables of the study: student attitude, student perception, parental involvement, teacher factor and diligence. Students who spent 3 hrs on studies at home had a significantly different attitude compared to those who spent 1 hr or 2 hrs. Students who spent 3 hrs had significantly different perception and teacher factor scores compared to those who spent 1 hr. Parental involvement was also significantly different between students who spent 3 hrs and students who spent 2 hrs on studies. No significant difference was found in the diligence of students who spent 1 hr, 2 hrs or 3 hrs on studies.
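As a consistency check on the amount-of-study table, the half-width of each Tukey confidence interval divided by its standard error should recover the same implied critical value on every row (for three groups it corresponds to q(0.05; 3, df)/sqrt(2) from the studentized range distribution, an assumption about SPSS's construction rather than something reported in the paper). A sketch using two rows of the Student_Attitude comparisons:

```python
def implied_critical(lower, upper, se):
    """Half-width of a Tukey HSD confidence interval divided by the
    standard error of the mean difference: recovers the critical
    value used to build the interval."""
    return (upper - lower) / 2 / se

# Two rows of the Student_Attitude post hoc table (1 hr vs 2 hrs and
# 1 hr vs 3 hrs); both rows should imply the same critical value.
c1 = implied_critical(-5.9639, 0.8107, 1.43673)
c2 = implied_critical(-8.4136, -1.7158, 1.42045)
print(round(c1, 2), round(c2, 2))
```

Both rows give roughly 2.36, as expected when a single critical value is applied across all pairwise comparisons.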

Post Hoc Multiple Comparisons (Tukey HSD) by Class

(Class: 1 = 12th grade, 2 = undergraduate, 3 = postgraduate. Mirror-image comparisons (J vs I) are omitted; they carry the same values with reversed signs.)

Dependent Variable    (I)  (J)  Mean Difference (I-J)  Std. Error  Sig.  95% CI Lower  95% CI Upper
Student_Attitude      1    2    -3.0787                1.36248     .064  -6.2910       .1335
                      1    3    -3.1968*               1.20771     .023  -6.0441       -.3494
                      2    3    -.1181                 1.09241     .994  -2.6936       2.4575
Student_Perception    1    2    -.2824                 .76337      .927  -2.0822       1.5173
                      1    3    -.5463                 .67665      .699  -2.1416       1.0490
                      2    3    -.2639                 .61206      .903  -1.7069       1.1791
Parental_Involvement  1    2    -.2083                 .56364      .927  -1.5372       1.1205
                      1    3    .2014                  .49961      .914  -.9765        1.3793
                      2    3    .4097                  .45191      .637  -.6557        1.4752
Teacher_Factor        1    2    -1.5417*               .62988      .040  -3.0267       -.0566
                      1    3    -.5694                 .55833      .565  -1.8858       .7469
                      2    3    .9722                  .50503      .134  -.2185        2.1629
Diligence             1    2    -1.1481*               .32382      .001  -1.9116       -.3847
                      1    3    -.7315*                .28704      .031  -1.4082       -.0547
                      2    3    .4167                  .25963      .245  -.1955        1.0288

Based on observed means. The error term is Mean Square(Error) = 3.236. *. The mean difference is significant at the .05 level.

The post hoc test was applied to identify individual differences between students of different classes (12th grade, undergraduate and postgraduate) on all the variables of the study: student attitude, student perception, parental involvement, teacher factor and diligence. A significant difference was found between the attitude of 12th grade and postgraduate students, and a significant difference was also found in teacher factor between 12th grade and undergraduate students. Further, differences were found in diligence between 12th grade and undergraduate students and between 12th grade and postgraduate students. No differences were found between 12th grade, undergraduate and postgraduate students on parental involvement or student perception.

Descriptive Statistics

Group                     Student   Student     Parental     Teacher  Diligence
                          Attitude  Perception  Involvement  Factor
Gender: Male (1)          33.4470   19.0530     13.0152      13.7879  5.9242
Gender: Female (2)        37.0072   20.8000     13.8623      14.9493  6.4928
Amount of study: 1 hr     31.9189   17.8649     12.8108      13.0811  5.7838
Amount of study: 2 hrs    34.4955   19.4234     12.9640      14.1441  6.0811
Amount of study: 3 hrs    36.9836   20.4672     14.0820      14.9918  6.4672
Class: 12th grade         32.7407   19.3148     13.5000      13.6667  5.5185
Class: Undergraduate      35.8194   19.5972     13.7083      15.2083  6.6667
Class: Postgraduate       35.9375   19.8611     13.2986      14.2361  6.2500
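The Tukey mean differences reported earlier are simply differences of the group means in this descriptive table; a quick cross-check in plain Python using the class rows:

```python
# Group means for the three classes, taken from the descriptive
# statistics table above (12th grade, undergraduate, postgraduate).
attitude = {"12th": 32.7407, "UG": 35.8194, "PG": 35.9375}
parental = {"12th": 13.5000, "UG": 13.7083, "PG": 13.2986}

# These reproduce the "Mean Difference (I-J)" column of the class
# post hoc table: e.g. 12th vs UG attitude = -3.0787 and
# 12th vs UG parental involvement = -0.2083.
print(round(attitude["12th"] - attitude["UG"], 4))  # -3.0787
print(round(parental["12th"] - parental["UG"], 4))  # -0.2083
```

The same identity holds for the amount-of-study post hoc table against the amount-of-study rows of the descriptives.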

Descriptive statistics indicated that the attitude, perception and diligence of female students were higher than those of male students; parental involvement and teacher factor were also higher for female students. Students who spent three hours on studies had higher diligence and perception and a more positive attitude towards studies than those who spent one or two hours, and parental involvement and teacher factor were also higher for these students. Attitude and perception were highest among postgraduate students, followed by undergraduates, and lowest among 12th grade students. Parental involvement was highest for undergraduate students, followed by 12th grade students, and lowest for postgraduate students. Teacher factor and diligence were highest for undergraduate students, followed by postgraduate students, and lowest for 12th grade students.

Conclusion

Questionnaires on student's attitude, student's perception, parental involvement, teacher factor and diligence were standardized using Cronbach's alpha. The reliability test indicated that the questionnaires were reliable and could be used for the current study. Principal component factor analysis with varimax rotation was applied to find the underlying factors of the questionnaires on student's attitude and student's perception; the analysis generated two factors for each.

Results of the multiple analysis of covariance indicated that parental involvement, teacher factor and diligence significantly affect student perception, and that teacher factor and diligence have a significant effect on student attitude. Results of the multiple analysis of variance indicated that gender differences exist in student attitude and student perception. Parental involvement and teacher factor were found to be higher with increased amount of study, which led to higher student perception, and the diligence of students also varied with class. Finally, it can be concluded that parents' involvement in the academic performance of students leads to a positive perception of mathematics, but perception alone does not lead to higher performance in mathematics; students' attitude towards mathematics should also be positive. A positive attitude can be attained when the teacher factor is strong and students' diligence, i.e. hard work in mathematics, is high. Attitude, perception, parental involvement, teacher factor and diligence of female students were higher than those of male students, and all of these variables increased with increasing amount of study.

References:

1. Adeyinka, A., Asabi, O., & Adedotun, O. (2013). Teacher motivation on students' performance in mathematics in government secondary schools, Makurdi LG Area. International Journal of Humanities and Social Science Invention, 2(5), 35-41.
2. Al-Agili, M.Z.G., Mamat, M.B., Abdullah, L., & Maad, H.A. (2012). The factors influencing students' achievement in mathematics: A case for Libyan students. World Applied Sciences Journal, 17(9), 1224-1230. ISSN 1818-4952.
3. Bhakar, S., Chauhan, R., & Bhadouria, S. (2014). The impact of students' attitude and belief on their career choice. In S.S. Bhakar, Vinod K. Bhatnagar, & Richa Banerjee (Eds.), Sustainability Management and the Power of Innovation, 1, 539-556. New Delhi: Bloomsbury India. ISBN 978-93-82951-49-0.
4. Breiteig, T., Grevholm, B., & Kislenko, K. (2005). Beliefs and attitudes in mathematics teaching and learning. Retrieved December 24, 2014 from http://prosjekt.uia.no/lcm/papers/TB_BG_KK_Beliefs_rev.pdf
5. Brew, L. (2013). Mathematical activities and classroom based factors that support senior high school students' mathematical performance. British Journal of Arts and Social Sciences, 2(1), 11-20.
6. Kiamanesh, A.L. (2004). Factors affecting Iranian students' achievement in mathematics. Paper presented at the First IEA International Research Conference, Cyprus. Retrieved December 24, 2014 from http://www.iea.nl/fileadmin/user_upload/IRC/IRC_2004/Papers/IRC2004_Kiamanesh.pdf
7. Githua, B.N. (2013). Secondary school students' perceptions of mathematics formative evaluation and the perceptions' relationship to their motivation to learn the subject by gender in Nairobi and Rift Valley Province, Kenya. Asian Journal of Social Sciences & Humanities, 2(1), 174-183. ISSN 2186-8492.
8. Lamb, S., & Fullarton, S. (2001). Classroom and school factors affecting mathematics achievement: A comparative study of the US and Australia using TIMSS. Annual Conference of the American Educational Research Association, Seattle, Washington; Australian Journal of Education, 46(2), 154-171.
9. Manoah, S., Indoshi, F.C., & Othuon, L.O.A. (2011). Influence of attitude on performance of the students in mathematics curriculum. International Research Journal, 2(3), 965-985.
10. Mbugua, Z.K., Kibet, K., Muthaa, G.M., & Nkonke, G.R. (2012). Factors contributing to students' poor performance in mathematics at Kenya Certificate of Secondary Education in Kenya: A case of Baringo County, Kenya. American International Journal of Contemporary Research, 2(6), 87-91.
11. Mokhtar, S.F., Yusof, Z.Md., & Misiran, M. (2012). Factors affecting students' performance in mathematics. Journal of Applied Sciences Research, 8(8), 4133-4137. ISSN 1819-544X.
12. Murray, J. (2013). The factors that influence mathematics achievement at the Berbice Campus. International Journal of Business and Social Sciences, 4(10), 150-164.
13. Um, E.K., Corter, J., & Tatsuoka, K. (2005). Motivation, autonomy support, and mathematics performance: A structural equation analysis. Supported by the National Science Foundation (REC No. 0126064), 1-13.
14. Yemi, T.M., & Adeshina, A.N.G. (2013). Factors influencing effective learning of mathematics at senior secondary schools within Gombe Metropolis, Gombe State, Nigeria. Journal of Education and Practice, 4(25), 61-66.


Antecedents of Acceptance and Recommendation: A Study of WhatsApp Pinky Sodhi * Shailja Bhakar** Abhay Dubey***

Abstract: The current study was conducted to identify antecedents of acceptance and recommendation of WhatsApp. The study was conducted on a sample of 100 male and female students studying in undergraduate and postgraduate courses in different institutions in the Gwalior region. The findings indicated that the main motives for using social networking mobile applications are enjoyment and relaxation; since more and more people can connect on one platform easily through these applications, the feeling of enjoyment and relaxation increases, and that is the main reason current users recommend such applications to new users.

Key words: Personalization, Effort Expectancy, Flow, Recommendation Intention, Acceptance, WhatsApp

Introduction

In the age of smartphones, social networking mobile applications have gained booming popularity. There has been a drastic change in the ways of social interaction and communication due to advances in Internet technologies. Social networking mobile applications like Facebook Messenger, WhatsApp, Google Hangouts, Viber, WeChat, Line, etc. provide users with a platform where they can keep in touch with their family and friends 24x7 and share videos, images, life events, voice messages, and more. WhatsApp is a cross-platform messenger that allows smartphone users on different platforms, such as Android, iPhone, BlackBerry, and Windows Phone, to share messages and other kinds of information with each other without any extra cost. Along with sending plain and multimedia messages individually, users of WhatsApp can create customized groups based on their interests and can also send messages to multiple recipients at one go through the broadcast feature. Users can also share their location and send and receive contacts through WhatsApp. All of these features and facilities have made WhatsApp a very popular and widely used social networking mobile application. People of various age groups can use it easily because of its convenient interface, fast installation, and free availability.

*, **, *** Assistant Professor, Prestige Institute of Management, Gwalior

In the current study, antecedents of acceptance and recommendation of WhatsApp were identified. From the literature, factors that influence the acceptance and recommendation of social networking applications were identified; specifically, the study of Lee, Kim, & Choi (2012) was used, and the factors identified were Personalization, Effort Expectancy, and Flow. The study used these factors as independent variables and Recommendation Intention and Acceptance as dependent variables.

Literature Review

Dhami, Agarwal, Chakraborty, Singh, & Minj (2013) found that users' trust in Facebook was significantly affected by perceived security and perceived privacy. The findings also indicated that privacy alone does not have any effect on information sharing, whereas privacy combined with trust had a significant effect on information sharing. John (2013) found a significant effect of previous computer experience and computer knowledge on computer self-efficacy. Self-efficacy also improves the usefulness of information perceived by users as well as intention to use the system. A significant relationship was also found between perceived usefulness and intention to use, while no significant relationship was found between self-efficacy and social factors, or between anxiety and usage intention. Lin & Lu (2011) found that enjoyment is the major factor responsible for increased involvement of users in social networking sites. The results also indicated that the usefulness of social networking sites and enjoyment are significantly affected by perceived complementarity and number of peers. Gender differences were found in continued intention to use information technology, but both males and females indicated that enjoyment increases continued intention to use social networking sites. Cheung, Chiu, & Lee (2011) found that, of all the variables, Social Presence and Group Norms had a significant influence on We-Intention to use Facebook. Of the five value factors (Purposive Value, Self Discovery, Maintaining Interpersonal Interconnectivity, Social Enhancement, and Entertainment Value), the two social factors (Maintaining Interpersonal Interconnectivity and Social Enhancement) as well as Entertainment Value had a significant effect on We-Intention, whereas social identity was found to have no effect.

Lee & Suh (2013) developed a model based upon earlier models such as Network Externality, Innovation Diffusion Theory, and the Technology Acceptance Model, and conducted empirical research using structural equation modeling through AMOS 16. The model included five factors, namely Perceived Ease of Use, Compatibility, Members, Actual Use, and Perceived Usefulness, and it was found that Perceived Usefulness, Perceived Ease of Use, Members, and Compatibility had a significant effect on Actual Use. The difference between Facebook usage and Twitter usage was also studied: usage of Facebook was significantly affected by Perceived Usefulness and Members, whereas usage of Twitter was affected by Perceived Ease of Use and Compatibility. Shambare (2014) developed a model based upon the Technology Acceptance Model and conducted research using structural equation modeling. The results indicated that WhatsApp's Perceived Ease of Use significantly affects its Perceived Usefulness, Attitude towards using it, and Usage Intention. Further, Perceived Usefulness had a significant impact on Intention to use WhatsApp, while no significant effect of Perceived Usefulness was found on Attitude towards using WhatsApp, nor of Attitude towards using WhatsApp on Usage Intention. Yeboah & Ewur (2014) conducted a study using the survey method: initially 50 students were interviewed, and 500 filled-in questionnaires on usage of WhatsApp were then collected. The results indicated that social networking applications such as WhatsApp have a negative effect on students' academic performance. The findings also indicated that usage of WhatsApp has no significant effect on communication, flow of information or idea sharing amongst students, and that WhatsApp degrades students' grammatical constructions and spellings. Maheshwari (2014) identified differences between the usage of WhatsApp by beginners and long-time users. Similarities were found among the features used by both types of users, the most used being sending videos and images and group chat, but there was a difference in the number of friends they communicated with: beginners communicated with a minimum number of friends, while long-time users communicated with the maximum number of friends, including international chatting.

Objectives
1. To standardize questionnaires on Personalization, Effort Expectancy, Flow, Recommendation Intention and Acceptance.
2. To identify the impact of Personalization, Effort Expectancy, and Flow as independent variables on Acceptance and Recommendation Intention as dependent variables.
3. To identify differences between genders and classes on all the variables of the study.
4. To identify new areas for further research.

Research Methodology
The study was causal in nature, with the survey method being used for data collection. Data were collected from male and female WhatsApp users studying in postgraduate and undergraduate programmes in the Gwalior region. Individual respondents were taken as the sample element, and the non-probability quota sampling technique was used to identify the sample. Data were collected using self-designed questionnaires on Personalization, Effort Expectancy, Flow, Recommendation Intention and Acceptance on a Likert-type scale of 1 to 7, where 1 indicated minimum agreement and 7 indicated maximum agreement. Cronbach's Alpha reliability test was applied to check the reliability of the questionnaires, and Principal Component Factor Analysis with Varimax rotation was applied to find out the underlying factors of the questionnaires. Multiple Analysis of Variance was applied to identify differences between gender and qualification on all the variables of the study, and Multiple Analysis of Covariance was applied to find out the relationship between Personalization, Effort Expectancy, and Flow as independent variables and Recommendation Intention and Acceptance as dependent variables.

Results
Reliability Analysis
Cronbach's Alpha reliability coefficients were calculated using PASW 18. The Cronbach's Alpha coefficient represents internal consistency reliability. The results for all the questionnaires of the study are given below:

S. No.  Variable Name              Cronbach's Alpha  No. of Items
1       Personalization            0.829             3
2       Effort Expectancy          0.892             3
3       Flow                       0.593             2
4       Recommendation Intention   0.743             2
5       Acceptance                 0.715             2

If the computed reliability of a measure is greater than 0.7, the measure can be considered reliable. As the table above indicates, the Cronbach's Alpha reliability value for all the measures was higher than, or very near to, 0.7; therefore, all the questionnaires can be considered reliable.

KMO and Bartlett's Test of Adequacy and Sphericity

S. No.  Variable Name              KMO Value  Bartlett's Test Chi-Square  Sig.
1       Personalization            0.709      113.096                     .000
2       Effort Expectancy          0.721      180.566                     .000
3       Flow                       0.500      21.053                      .000
4       Recommendation Intention   0.500      40.955                      .000
5       Acceptance                 0.500      36.281                      .000
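The alpha coefficient reported above can be reproduced outside PASW; the following is a minimal sketch of the standard Cronbach's alpha formula (the function name and sample scores are illustrative, not the study's data):

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) matrix of Likert scores."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]                          # number of items in the scale
    item_vars = items.var(axis=0, ddof=1)       # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)   # variance of the summed scale
    return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

# Perfectly consistent items yield an alpha of 1.0
scores = np.array([[1, 1, 1], [4, 4, 4], [7, 7, 7], [3, 3, 3]])
print(round(cronbach_alpha(scores), 3))  # → 1.0
```

In practice alpha rises with inter-item correlation and with the number of items, which is one reason the two-item Flow scale (0.593) falls short of the 0.7 benchmark.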

The Kaiser-Meyer-Olkin measure of sampling adequacy was 0.5 or higher for all the measures, indicating that the sample was adequate to consider the data suitable for factor analysis. Bartlett's Test of Sphericity was evaluated through Chi-Square values, which were significant at the 0.000 level, indicating that the correlation matrices differed significantly from identity matrices and that the data could be considered for factor analysis.
Factor Analysis
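Bartlett's statistic follows a standard chi-square approximation based on the determinant of the correlation matrix; a minimal sketch (the function name and simulated data are illustrative):

```python
import numpy as np
from scipy import stats

def bartlett_sphericity(data):
    """Bartlett's test that the correlation matrix is an identity matrix.

    data: (n, p) matrix of questionnaire responses.
    Returns (chi_square, p_value); a small p-value supports factorability.
    """
    n, p = data.shape
    corr = np.corrcoef(data, rowvar=False)
    chi2 = -(n - 1 - (2 * p + 5) / 6.0) * np.log(np.linalg.det(corr))
    df = p * (p - 1) / 2.0
    return chi2, stats.chi2.sf(chi2, df)

# Three strongly correlated "items": sphericity is clearly rejected
rng = np.random.default_rng(0)
base = rng.normal(size=(100, 1))
items = np.hstack([base + 0.1 * rng.normal(size=(100, 1)) for _ in range(3)])
chi2, p = bartlett_sphericity(items)
print(f"chi-square = {chi2:.3f}, p = {p:.6f}")
```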

Principal Component Factor Analysis with Varimax rotation was applied on Personalization, Effort Expectancy, Flow, Recommendation Intention and Acceptance to find out the underlying factors of the questionnaires. Each questionnaire converged on one factor only; therefore, the names of the variables can be retained as they are by future researchers.
MANOVA
Box's M Test of Covariance

Box's M   57.425
F         1.155
df1       45
df2       22812.296
Sig.      .220

The test evaluates the null hypothesis that the observed covariance matrices of the dependent variables are equal across groups (Design: Intercept + Gender + Qualification + Gender * Qualification). Overall homogeneity of error variances among the groups formed on the basis of gender and qualification, for all the dependent variables (Personalization, Effort Expectancy, Flow, Recommendation Intention and Acceptance) together, was tested through Box's M Test. The F value was 1.155, significant at .220, indicating that the error variances were equal across all the groups and the data were homogeneous.
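The varimax step of the extraction can be written down directly. The sketch below is the classic SVD-based iteration, not the PASW internals, and the loading matrix is illustrative:

```python
import numpy as np

def varimax(loadings, gamma=1.0, max_iter=100, tol=1e-8):
    """Orthogonal varimax rotation of a (items x factors) loading matrix."""
    L = np.asarray(loadings, dtype=float)
    p, k = L.shape
    R = np.eye(k)                    # accumulated rotation matrix
    d = 0.0
    for _ in range(max_iter):
        Lr = L @ R
        u, s, vt = np.linalg.svd(
            L.T @ (Lr ** 3 - (gamma / p) * Lr @ np.diag((Lr ** 2).sum(axis=0)))
        )
        R = u @ vt
        d_new = s.sum()
        if d_new < d * (1 + tol):    # criterion stopped improving: converged
            break
        d = d_new
    return L @ R

# Illustrative two-factor loading matrix; an orthogonal rotation
# redistributes loadings but preserves each item's communality.
L = np.array([[0.8, 0.1], [0.7, 0.2], [0.1, 0.9], [0.2, 0.8]])
rotated = varimax(L)
```

For the single-factor solutions reported here, rotation is trivial; the step matters only when two or more factors are extracted.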

Levene's Test of Equality of Error Variances

Variable                   F      df1  df2  Sig.
Personalization            .912   3    96   .438
Effort Expectancy          .727   3    96   .538
Flow                       1.349  3    96   .263
Recommendation Intention   .184   3    96   .907
Acceptance                 .227   3    96   .877

The test evaluates the null hypothesis that the error variance of each dependent variable is equal across groups (Design: Intercept + Gender + Qualification + Gender * Qualification). Homogeneity of variances among the groups formed on the basis of gender and qualification was tested independently for each dependent variable through Levene's Test. The F values were 0.912 for Personalization (significant at 0.438), 0.727 for Effort Expectancy (0.538), 1.349 for Flow (0.263), 0.184 for Recommendation Intention (0.907), and 0.227 for Acceptance (0.877), indicating that there was no difference between the variances of the groups formed on the basis of gender and qualification, and hence that the groups were homogeneous for all these variables.
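The same homogeneity check can be run with SciPy. Note that `scipy.stats.levene` defaults to `center='median'` (the Brown-Forsythe variant), so `center='mean'` is needed to mirror the classic Levene test that SPSS/PASW reports. The group data below are hypothetical, not the study's:

```python
import numpy as np
from scipy import stats

# Four hypothetical Gender x Qualification cells of 25 respondents each
rng = np.random.default_rng(42)
groups = [rng.normal(loc=5.0, scale=1.5, size=25) for _ in range(4)]

# center='mean' matches the Levene test reported by SPSS/PASW
stat, p = stats.levene(*groups, center='mean')
print(f"F = {stat:.3f}, p = {p:.3f}")  # a large p supports equal error variances
```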

Tests of Between-Subjects Effects

Source           Dependent Variable         Type III SS   df   Mean Square   F         Sig.
Corrected Model  Personalization            128.360a      3    42.787        1.569     .202
                 Effort Expectancy          404.990b      3    134.997       5.601     .001
                 Flow                       30.000c       3    10.000        1.323     .271
                 Recommendation Intention   8.350d        3    2.783         .385      .764
                 Acceptance                 9.480e        3    3.160         .363      .780
Intercept        Personalization            89401.000     1    89401.000     3277.463  .000
                 Effort Expectancy          87202.090     1    87202.090     3617.844  .000
                 Flow                       10160.640     1    10160.640     1344.741  .000
                 Recommendation Intention   11004.010     1    11004.010     1520.766  .000
                 Acceptance                 10941.160     1    10941.160     1257.364  .000
Gender           Personalization            92.160        1    92.160        3.379     .069
                 Effort Expectancy          176.890       1    176.890       7.339     .008
                 Flow                       11.560        1    11.560        1.530     .219
                 Recommendation Intention   5.290         1    5.290         .731      .395
                 Acceptance                 7.840         1    7.840         .901      .345
Qualification    Personalization            4.840         1    4.840         .177      .675
                 Effort Expectancy          141.610       1    141.610       5.875     .017
                 Flow                       14.440        1    14.440        1.911     .170
                 Recommendation Intention   .810          1    .810          .112      .739
                 Acceptance                 1.000         1    1.000         .115      .735
Gender *         Personalization            31.360        1    31.360        1.150     .286
Qualification    Effort Expectancy          86.490        1    86.490        3.588     .061
                 Flow                       4.000         1    4.000         .529      .469
                 Recommendation Intention   2.250         1    2.250         .311      .578
                 Acceptance                 .640          1    .640          .074      .787
Error            Personalization            2618.640      96   27.278
                 Effort Expectancy          2313.920      96   24.103
                 Flow                       725.360       96   7.556
                 Recommendation Intention   694.640       96   7.236
                 Acceptance                 835.360       96   8.702
Total            Personalization            92148.000     100
                 Effort Expectancy          89921.000     100
                 Flow                       10916.000     100
                 Recommendation Intention   11707.000     100
                 Acceptance                 11786.000     100
Corrected Total  Personalization            2747.000      99
                 Effort Expectancy          2718.910      99
                 Flow                       755.360       99
                 Recommendation Intention   702.990       99
                 Acceptance                 844.840       99

a. R Squared = .047 (Adjusted R Squared = .017)
b. R Squared = .149 (Adjusted R Squared = .122)
c. R Squared = .040 (Adjusted R Squared = .010)
d. R Squared = .012 (Adjusted R Squared = -.019)
e. R Squared = .011 (Adjusted R Squared = -.020)

The model with Gender and Qualification as independent variables and Personalization, Effort Expectancy, Flow, Recommendation Intention and Acceptance as dependent variables was tested through the F values of the Corrected Model row in the Tests of Between-Subjects Effects table. The F values were 1.569, 5.601, 1.323, 0.385, and 0.363, significant at 0.202, 0.001, 0.271, 0.764, and 0.780 respectively, indicating that the model had a good fit in the case of Effort Expectancy, whereas in all other cases the model had a poor fit. Gender differences were found in Effort Expectancy, tested through an F value of 7.339 significant at the 0.008 level, whereas in all other cases, i.e. Personalization, Flow, Recommendation Intention and Acceptance, no differences were found between male and female students, the F values being 3.379, 1.530, 0.731, and 0.901, significant at 0.069, 0.219, 0.395, and 0.345 respectively. Differences were found between postgraduate and undergraduate students in the case of Effort Expectancy, tested through an F value of 5.875 significant at the 0.017 level, whereas in all other cases, i.e. Personalization, Flow, Recommendation Intention and Acceptance, no differences were found between postgraduate and undergraduate students, the F values being 0.177, 1.911, 0.112, and 0.115, significant at 0.675, 0.170, 0.739, and 0.735 respectively. No differences were found for the interaction between Gender and Qualification on any of the variables of the study, i.e. Personalization, Effort Expectancy, Flow, Recommendation Intention and Acceptance, the F values being 1.150, 3.588, 0.529, 0.311, and 0.074, significant at 0.286, 0.061, 0.469, 0.578 and 0.787 respectively.
MANCOVA
Multiple Analysis of Covariance was applied between Personalization, Effort Expectancy, and Flow as independent variables and Acceptance and Recommendation Intention as dependent variables.


Tests of Between-Subjects Effects

Source             Dependent Variable         Type III SS  df   Mean Square  F       Sig.
Corrected Model    Recommendation Intention   268.121a     3    89.374       19.730  .000
                   Acceptance                 333.170b     3    111.057      20.837  .000
Intercept          Recommendation Intention   4.618        1    4.618        1.019   .315
                   Acceptance                 4.785        1    4.785        .898    .346
Personalization    Recommendation Intention   7.893        1    7.893        1.742   .190
                   Acceptance                 4.923        1    4.923        .924    .339
Effort Expectancy  Recommendation Intention   6.530        1    6.530        1.442   .233
                   Acceptance                 2.569        1    2.569        .482    .489
Flow               Recommendation Intention   158.798      1    158.798      35.056  .000
                   Acceptance                 245.701      1    245.701      46.099  .000
Error              Recommendation Intention   434.869      96   4.530
                   Acceptance                 511.670      96   5.330
Total              Recommendation Intention   11707.000    100
                   Acceptance                 11786.000    100
Corrected Total    Recommendation Intention   702.990      99
                   Acceptance                 844.840      99

a. R Squared = .381 (Adjusted R Squared = .362)
b. R Squared = .394 (Adjusted R Squared = .375)

The model with Personalization, Effort Expectancy, and Flow as independent variables and Acceptance and Recommendation Intention as dependent variables was tested through the Corrected Model row of the Tests of Between-Subjects Effects table in the Multiple Analysis of Covariance. The F values indicated that the model had a good fit for both the dependent variables, Recommendation Intention and Acceptance, with F values of 19.730 and 20.837 significant at the 0.000 level. A significant effect of Flow was found on Recommendation Intention and Acceptance, tested through F values of 35.056 and 46.099 significant at the 0.000 level, whereas no significant effect of Personalization or Effort Expectancy was found on Recommendation Intention or Acceptance, the F values being 1.742 and 0.924, and 1.442 and 0.482, significant at 0.190, 0.339, 0.233 and 0.489 respectively.
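The "Corrected Model" F in these tables is the overall fit test of a linear model. The sketch below is only the single-dependent-variable analogue computed with NumPy (the study's MANCOVA in PASW tests both dependent variables jointly), with illustrative data:

```python
import numpy as np
from scipy import stats

def model_f_test(X, y):
    """Overall F-test for the linear model y ~ X (intercept added),
    analogous to the 'Corrected Model' row of a between-subjects table."""
    n = len(y)
    Xd = np.column_stack([np.ones(n), X])          # design matrix with intercept
    beta, *_ = np.linalg.lstsq(Xd, y, rcond=None)
    y_hat = Xd @ beta
    df_model = Xd.shape[1] - 1
    df_error = n - Xd.shape[1]
    ss_model = ((y_hat - y.mean()) ** 2).sum()     # explained sum of squares
    ss_error = ((y - y_hat) ** 2).sum()            # residual sum of squares
    F = (ss_model / df_model) / (ss_error / df_error)
    return F, stats.f.sf(F, df_model, df_error)

# One strong predictor yields a large F and a tiny p (illustrative data)
rng = np.random.default_rng(1)
x = np.linspace(0.0, 1.0, 50)
y = 3.0 * x + 0.1 * rng.normal(size=50)
F, p = model_f_test(x.reshape(-1, 1), y)
print(f"F = {F:.1f}, p = {p:.2e}")
```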

Conclusion
Questionnaires on Personalization, Effort Expectancy, Flow, Recommendation Intention and Acceptance were standardized using Cronbach's Alpha reliability and Principal Component Factor Analysis with Varimax rotation; the results indicated that the questionnaires were reliable for the current study, and the factor analysis converged on one factor for each of the variables. The results of the Multiple Analysis of Variance indicated that postgraduate and undergraduate students differ on Effort Expectancy of WhatsApp, and a significant difference was also found in Effort Expectancy between male and female users

of WhatsApp. The findings also indicated that liking for the features of WhatsApp was higher among undergraduate students than among postgraduate students, and that female students liked the features of WhatsApp more than male students did. Results of the Multiple Analysis of Covariance indicated that Flow as an independent variable had a significant effect on Acceptance and Recommendation Intention, which indicates that individuals use and recommend WhatsApp for enjoyment and relaxation. Finally, it can be concluded from the results that the main motive for using social networking mobile applications is enjoyment and relaxation; since more and more people can connect on one platform easily through these applications, the feeling of enjoyment and relaxation increases, and that is the main reason current users recommend such applications to new users. Since undergraduate students have more time to chat with friends and a lesser burden of studies and career, they can be a very attractive segment for such application developers to target. The results also supported the view that females chat more than males and can therefore be a very attractive segment for such application developers.

References
Cheung, C. M., Chiu, P.-Y., & Lee, M. K. (2011). Online Social Networks: Why Do Students Use Facebook? Computers in Human Behavior, 1337-1343.
Dhami, A., Agarwal, N., Chakraborty, T. K., Singh, B. P., & Minj, J. (2013). Impact of Trust, Security and Privacy Concerns in Social Networking: An Exploratory Study to Understand the Pattern of Information Revelation in Facebook. In B. Kalra, D. Garg, R. Prasad, & S. Kumar (Eds.), Proceedings of the 2013 3rd IEEE International Advance Computing Conference (IACC) (pp. 465-469). Ghaziabad, India: IEEE.
John, S. P. (2013). Antecedents and Effects of Computer Self-Efficacy on Social Networking Adoption Among Asian Online Users. AMCIS 2013 Proceedings (pp. 115). Chicago, Illinois: http://aisel.aisnet.org.
Lee, H. S., Kim, T. G., & Choi, J. Y. (2012). A Study on Factors Affecting Smart Phone Application Acceptance. IPEDR, 27, 27-34.
Lee, J., & Suh, E. (2013). An Empirical Study of the Factors Influencing Use of Social Network Service. PACIS 2013 Proceedings, Paper 181. http://aisel.aisnet.org/pacis2013/181
Lin, K.-Y., & Lu, H.-P. (2011). Why People Use Social Networking Sites: An Empirical Study Integrating Network Externalities and Motivation Theory. Computers in Human Behavior, 1152-1161.
Maheshwari, D. P. (2014). Frequency of Using WhatsApp Messenger among College Students in Salem District, Tamil Nadu. International Journal of Computer Science and Mobile Applications, 2 (ISSN: 2321-8363), 12-22.
Shambare, R. (2014). The Adoption of WhatsApp: Breaking the Vicious Cycle of Technological Poverty in South Africa. Journal of Economics and Behavioral Studies, 6(7), 542-550.
Yeboah, J., & Ewur, G. D. (2014). The Impact of WhatsApp Messenger Usage on Students' Performance in Tertiary Institutions in Ghana. Journal of Education and Practice, 5(6), 157-164.
