Annals of Library and Information Studies Vol. 58, June 2011, pp. 139-150
Evaluation indicators of library websites of selected research institutions in India

Jaya Kalra1 and R K Verma2

1Research Scholar, Kurukshetra University, Kurukshetra, Email: [email protected]
2Scientist G, National Institute of Science Communication and Information Resources (CSIR-NISCAIR), 14-Satsang Vihar Marg, New Delhi-110067, Email: [email protected]
An attempt has been made to evaluate the library websites of selected research institutions in India both quantitatively and qualitatively on the basis of the Web Impact Factor, a pre-defined checklist of indicators, and an online questionnaire survey. Studies the existing procedures and practices of evaluation indicators of websites at national and international levels as reported in the literature. Reveals that there are many inconsistencies and terminological issues, and that several effective methodologies/techniques for website evaluation practiced at the international level are not being used in such studies in India. Two major quality components, viz. ‘usability’ and ‘usefulness’, covering the various evaluation indicators at the ‘indicative’ and ‘illustrative’ levels, have been examined. Concludes that there is scope for further research on developing and standardizing the indicators with multiple evaluation perspectives at various levels. Suggests an active and crucial role for library and information scientists in the evaluation process, from design to content management, in future.
Introduction

With the application of information and communication technologies in libraries, the concept and role of the library and the librarian are changing dramatically, especially with the advent of the internet. The importance of the internet and the World Wide Web in libraries can no longer be questioned, and critically evaluating websites is essential for conducting quality research. With the advent of the World Wide Web, the availability and accessibility of information in electronic formats in libraries and other types of organizations has become easier because of the web’s graphic and interactive capabilities. These capabilities allow users to search databases, view full-text articles including pictures and tables, and so on. As a result, organizations of all types are recognizing the importance of the World Wide Web as a tool, not only for gaining access to information, but also as a means of disseminating information about their activities, products and services. However, even a high-quality library may have a low-quality website that turns users off. Different libraries have a wide range of target groups, levels of service, resources, etc. Some organizations and institutions are designing and developing their own library websites. It has been observed that, despite the efforts made by in-house experts or outside agencies, most library websites are not updated regularly. At the same time, the content and information available on library websites are often not up to the mark.

Many libraries have created websites to serve their patrons and the general information community, but how useful these websites are beyond providing information about the library and its collections is yet to be fully explored. Although a body of literature pertaining to the systematic study of the content and structure of websites is still developing, studies of the content of library and information websites have not been explored much in India. Since the web itself is still developing and websites are in a constant state of development, current literature on what makes a website useful is lacking. Many guidelines and recommendations on what makes a good or useful website abound, especially on the web, e.g. Yale University (http://webstyleguide.com), the research-based National Cancer Institute guidelines (http://usability.gov), the IBM web design guidelines, and the W3C guidelines. However, no concrete standards for library websites have been set yet. The most serious and widespread shortcomings observed with regard to library websites are: variations in terminology related to website quality and evaluation; the lack of multilevel approaches/methodologies for evaluating library
websites; and the absence of an established standard mechanism for obtaining users’ feedback on website design. In this context, an attempt has been made to evaluate the library websites (or simply the websites) of selected research institutions in India.

Objectives of the study

● To study the existing procedures and practices of evaluation indicators (henceforth referred to as ‘indicators’) of websites at national and international levels;
● To study selected websites both quantitatively and qualitatively on the basis of calculation of the Web Impact Factor (WIF), testing with a pre-defined checklist of indicators, and a questionnaire survey; and
● To address certain issues for future studies in the area of website evaluation which may help in improving the quality of such websites.
Review of literature
Literature search led to studies on the efforts made by foreign institutions, particularly academic institutions, for evaluating websites. Buchanan and Salako1 highlight the issue of evaluating usability and usefulness as a challenge of ‘what’ to measure and ‘how’ to achieve it. They identified attributes combined with an integrated measurement framework derived from the ‘goal’, ‘question’, and ‘metric’ paradigm which could be adapted and further refined on a case-by-case basis.

Similarly, Tsakonas and Papatheodorou2 present a model which analyses the attributes of the components of electronic information services, taking ‘system’, ‘user’ and ‘content’ into account as a framework in the process of their mutual interaction. For usefulness, they suggest relevance, format, reliability, level, and timeliness as the five resource attributes, whereas for usability, again five attributes, viz. ease of use, aesthetic appearance, navigation, terminology, and learnability, have been taken into account.

Chao3 provides detailed features (indicators) as a checklist at both the indicative (macro) and illustrative (micro) levels. At the indicative level these are summed up as: presentation, suitable background, colour, font, icon, image, size, layout, and text; an organized and consistent scheme, reliable links, a concise home page, integration, a convenient e-mail address to a responsible party, and links to the library’s/parent institution’s home pages. At the illustrative level, the inclusion of the library’s/institution’s names and logos, online forms for request or feedback, information about links, pertinent instructions or warning statements for file/document types, and clear, coherent and concise headings and clearly titled screens, etc. are included. In this context, Sasikala4 reports findings from a case study of the Pace University Library, which framed a set of questions at the illustrative (micro) level that are helpful in evaluating the validity and usability of a website. For example, a few of the questions covered are: Is there another information source that you know of where you could find this information or check for accuracy? If so, where? Can you identify the author or producer of the site? Who are they? Is contact information for the author or producer available? Is the author affiliated with an organization, agency, company, or institution? If so, which one? Is there a link to more information about the organization?

Kumar and others5 conducted a research study in which they analyzed and compared the contents and usability of six Indian Institutes of Management (IIMs’) library websites. A checklist was designed to evaluate the content of the websites on the basis of a theoretical analysis of the possible roles of IIM library websites and of previous evaluations. They found that only 67 percent of the IIMs’ library websites give information on e-books, while 85 percent provide information on e-databases and CD-ROM databases. It was also observed that only 16 percent of the library websites have information on DVD collections, institutional repository and news archives, and only 50 percent give information related to INDEST. The general information about the library includes the mission statement of the library, working hours, library rules, sections, committees and other information. It also includes authority, copyright, domain name, webmaster and aesthetic features of the site.

Chowdhury6 provided a detailed checklist of usability features for digital libraries: (i) interface features: types of interface (e.g. simple vs expert search interface); language(s) of the interface; navigation options, shortcuts, and system information; screen features, i.e. use of colours, typography, layout and graphics; and personalization of the interface, e.g. permanent choice of interface language and/or retrieval level, number of records on one page, and sort options; (ii) search process and database/resource selection: options for selection and cross-database search facilities; (iii) query formulation; (iv) search options for text, multimedia, specific collections, etc., and access points: search fields; (v) search operators; (vi) results manipulation: format(s) for display of records; number of records that can be displayed; navigation in the list of records; marking of records; sort options; and printing, exporting and e-mailing of records; and (vii) help: appropriateness; usability; consistency of terminology, design and layout; and linguistic correctness.

Koovakkai and Thensi7 conducted a study of library-related information on thirty-eight university websites of two South Indian states, Kerala and Tamil Nadu. They found that only 54.54 percent of these websites have a direct link to libraries and 31.82 percent have an indirect link. They also found that only 50 percent of the websites provide library history and general information about libraries, 57.89 percent provide details regarding the total library collection, 47.37 percent give information about library working hours, and 13.64 percent of the state universities provide information about library staff on their websites.

Raju and Harinarayana8 studied 30 library websites of top science universities for their design features with special reference to ‘usability’. They made use of the National Cancer Institute guidelines for the same. The usability features included are: FAQs; time out; link back to home page; news and events; navigation and its various types, e.g. global, top-horizontal, left-vertical, embedded, local, and breadcrumb (hierarchical trails such as ‘You are here’ on the website); links; graphics and multimedia; logo of the library website; and a global search feature. This shows that, besides indicators at the indicative and illustrative levels, a few features pertaining to ‘usefulness’ are also covered, for example FAQs and news and events. This implies that there is no sharp line of distinction, or standard, between the two, though broadly, at the indicative level, the former (usability) pertains mainly to the design or accessibility aspect and the latter (usefulness) covers content display and the availability of required information in the desired formats. Thus at the indicative level, ‘usability’ includes effectiveness, efficiency, aesthetic appearance, terminology, navigation, and learnability as key parameters, while ‘usefulness’ covers reliability, relevance and currency as the key attributes of the system. The literature survey shows that there are wide variations in the status of the various library websites.

Terminological issues

Usability vs. Usefulness
According to the International Organization for Standardization (ISO 9241-11:1998), “usability refers to the extent to which a product can be used by specified users to achieve specified goals with effectiveness, efficiency and satisfaction in a specified context of use”. Usefulness, on the other hand, is the quality of being useful, or the extent to which something is useful; it concerns the actual use of the resource, for example, information about the library collection, services, up-to-dateness of information, etc. There are different methods for evaluating the usability of a website, e.g. web surveys, focus groups, group tests, thinking aloud, observations and transaction logs. As far as usefulness is concerned, its measurement is mainly based on feedback from a user survey incorporating a pre-defined checklist which may include parameters such as about the library, working hours, library rules, library staff, membership, information sources, etc. However, there is no universal agreement on what these two quality aspects, i.e. usability and usefulness, include in their scope. For example, as mentioned in the previous section, inconsistencies in usage have been observed where FAQs and news and events have been included under usability9 though these features fall under the scope of ‘usefulness’. In some papers the terms ‘accessibility’, ‘availability’ and ‘quality’ features are also used without reference to the above two aspects. It has been observed that there is some agreement on the usage of these two major quality aspects, i.e. usability and usefulness; however, which indicators should be included under them at each level (indicative and illustrative) has not been streamlined and standardized. This scenario is reflected in the conceptual diagram given in Figure 1.
[Fig. 1—Conceptual framework: evaluation (quality criteria) branches into ‘usability’ and ‘usefulness’, each examined at the indicative (macro) level (e.g. easy availability, navigation, terminology for usability; format, relevance, reliability for usefulness) and at the illustrative (micro) level (e.g. colour code, back link to home page, appropriate content headings for usability; specified downloading, content vs work task, credibility of information for usefulness).]

Website/Portal level

Both the terms ‘portal’ and ‘website’ have been used in previous studies.
A portal is generally a vehicle for gaining access to a multitude of ‘services’, whereas a website is a destination in itself. As such, the term website refers to a location on the internet that is unique and can be accessed through a URL. By that definition, a web portal is also a website. However, there is a distinction between the two terms based on the subject and content of the website. A website is also a web portal if it transmits information from several independent sources that can be, but are not necessarily, connected in subject, thus offering a public service function for the visitor which is not restricted to presenting the view(s) of one author. The Indian website evaluation literature also treats library websites as library portals. For example, Malhan and Rao10 advocated a portal approach to library websites, and Shokeen11 highlights seven attributes, viz. context, content, community, customization, communication, connection, and commerce. In another paper, Krishnamurthy and Chan12 introduce a new library portal, i.e. Library Search Aid, which performs metasearching across the library’s four online search resources: the ISIB Online subscription, ISIB Periodical holdings, the Science Direct consortium, and the Union Catalog of Publications in Bangalore Libraries, which were stored separately on the open web.
Methodologies/techniques at international level

In the context of qualitative evaluation indicators, various methods/techniques are used at the international level, such as focus group sessions, the card sort protocol, the user-centered approach, the think aloud protocol, and survey questionnaires, to name a few. A short description of three of these, viz. the card sort protocol, the user-centered approach and the think aloud protocol, is presented for reference. These techniques do not seem to have been practiced much in India, as is evident from the literature survey.

Card sort protocol
This method is used, typically as a first step, when a new website is being planned and its various elements need to be organized. Jakob Nielsen13 describes this usability method as “generative” and one used when you “don’t have a design, and [your] goal is to find out how people think about certain issues” – the issue for the redesign team being how to organize the vast array of library-supported collections and services on the website. The protocol allows the team to focus solely on the information architecture of the site (i.e., where to put what) without concern for the design.
A card sort protocol is executed using note cards containing words or phrases. Participants are asked to organize the cards into groups most meaningful to them, putting cards with similar concepts together in the same group. Depending on the goals of the protocol, the categories are either predefined and named, or the participant determines the number of categories and their names.

User-centered approach
Usability evaluation can be both formative and summative, and is commonly conducted by inspection and/or test, the former without the involvement of the user, while the latter typically includes it. Inspection methods include heuristic evaluation, cognitive walkthrough, and action analysis, while test methods include questionnaires, thinking aloud, and field observation, as described by Holzinger14. In contrast to usability, usefulness is much more dependent upon user involvement. It can be considered during the formative stages of system design (based on user input/statement of requirements or a functioning prototype/simulation), but its evaluation is dependent upon user interaction. This may include an attempt to discreetly video record participants completing tasks, which can prove difficult in practice, as both interface and user must be in detailed and close shot to facilitate observation. Video recording can also be time consuming, both in setup and in later analysis.

Think aloud protocol
Based on the information gathered from the structured analysis, the surveys, and the card sort protocol, a professional website designer, Guerard Design Office, developed a prototype of the new UCLA Library website15. This prototype served as the basis for another usability test known as a “think aloud protocol”. For this test, participants are asked to complete tasks using a prototype website and to “think aloud”, i.e. to say everything they are thinking while they complete the tasks. One observer notes whatever the subject says and whether and how the participant completes the task. Another observer directs the participants and answers any questions asked. This type of user test provides essential real-time feedback on potential problems in the design and organization of a website.
Evaluation indicators for the study
The unfiltered, free-form nature of the web presents unique challenges in determining a website’s appropriateness as an information source. Many guidelines and checklists for evaluating internet sources are available on the internet, as mentioned in the earlier section. Generally, information sources have to be evaluated in terms of coverage, scope, intended users, timeliness, authority, objectivity, and documentation. In the current study, the indicators for the usability and usefulness aspects have been based on the attributes of the components of electronic information services that affect user interaction, and on their correlation in usability and usefulness evaluation, as studied by Tsakonas16. These are: (1) usability: effectiveness, efficiency, aesthetic appearance, terminology, navigation, learnability, content presentation, labels, search process, and easy availability; and (2) usefulness: relevance, reliability, level of details, clarity, specificity, timeliness, authenticity, time constraints, completeness, and topicality. These indicators are used at the ‘indicative’ level and represent the ‘what’ aspect, while for the ‘how’ aspect various examples at the ‘illustrative’ level, as encountered by information seekers, are used for evaluation.

Methodology used in the study

Quantitative aspects
Among the quantitative approaches, the concept of using the Web Impact Factor, i.e. measuring average link frequencies, as a quantitative indicator is very much in use. The concept was introduced in 1998 by Ingwersen17. Though it was based on an analogy between hyperlinks and citations, as an adaptation of Garfield’s (1972) ‘Impact Factor’ for the web18, the time periods used for the WIF and the journal IF are different. Though the WIF can be regarded as a useful tool for measuring the relative visibility of an organization on the web, it should not be taken as the only indicator of the use, visibility and popularity of a website. Ramesh Babu et al.19, in a paper on ranking universities, indicate other measures as well, with the result that different webometric measures lead to different rankings of the same institutions. So the question arises as to what the true evaluation indicator in quantitative terms should be. In the current preliminary study, an attempt is made to assess the relative visibility of the websites of 46 research institutions through the Web Impact Factor (WIF), employing the Yahoo Explorer facility. For this purpose the formula WIF = A/B has been used, where A = the total number of inlinks (links coming into a site from other sites) only, and B = the number of web pages of the website that are indexed by the search engine, not all web pages available on the website. Only inlinks were taken into account, to keep the measure simple and also because the objective was to assess the importance of the indicators and not the ranking of institutions, which would require further parameters such as external-link WIF, revised WIF, and the use of self links, sublinks and sub-domains, etc. on a continuous basis. The data for the respective WIFs were collected twice at an interval of 15 days, represented as ‘old’ and ‘revised’ in Table 1.
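As a minimal illustration of the calculation described above (not code used in the study), the WIF of a site can be computed once the inlink count and the indexed-page count have been noted down from the search engine. The site names and counts below are hypothetical placeholders:

```python
# Minimal sketch of the WIF calculation described in the text: WIF = A / B,
# where A = number of inlinks reported for the site and
# B = number of the site's pages indexed by the search engine.
# The site names and counts below are hypothetical, not data from the study.

def web_impact_factor(inlinks: int, indexed_pages: int) -> float:
    if indexed_pages == 0:
        raise ValueError("cannot compute WIF: no indexed pages reported")
    return inlinks / indexed_pages

counts = {
    "library.institute-a.example.in": {"inlinks": 1520, "indexed_pages": 2230},
    "library.institute-b.example.in": {"inlinks": 310, "indexed_pages": 405},
}

for site, c in counts.items():
    print(f"{site}: WIF = {web_impact_factor(c['inlinks'], c['indexed_pages']):.3f}")
```

Repeating the same counts after an interval (15 days in the study) yields the ‘old’ and ‘revised’ WIFs of Table 1.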
Qualitative aspects

For the qualitative study, 10 indicators at the ‘indicative’ level were taken for each of the usability and usefulness components. A checklist of indicators for usability and usefulness was used, based on certain minimum-level criteria and the information scientist’s perception (the researcher in the present case). The details are available in Annexures I and II. The number of available indicators (out of a total of 10) under each institution was decided using a predefined set of common questions as the criteria. The resulting figures from this exercise of checkpoints (at the ‘illustrative’ level) have been tabulated in Table 1.
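A hypothetical sketch of how the checklist tally behind Table 1 could be recorded is given below; the indicator names follow Annexure I, while the yes/no judgements shown are invented for illustration only:

```python
# Hypothetical sketch: counting how many of the 10 usability indicators of
# Annexure I a library website satisfies (the per-site scores of Table 1).
# The judgements below are invented examples, not observations from the study.

USABILITY_INDICATORS = [
    "aesthetic appearance", "content presentation", "easy availability",
    "effectiveness", "efficiency", "labels", "learnability",
    "navigation", "search process", "terminology",
]

def usability_score(judgements: dict) -> int:
    """Number of indicators judged to meet the minimum criteria (out of 10)."""
    return sum(1 for name in USABILITY_INDICATORS if judgements.get(name, False))

example_site = {name: True for name in USABILITY_INDICATORS}
example_site["aesthetic appearance"] = False
example_site["labels"] = False
example_site["terminology"] = False

print(usability_score(example_site))  # prints 7, i.e. 7 of 10 indicators available
```

The same tally is repeated with the 10 usefulness indicators of Annexure II.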
Feedback survey

A questionnaire survey was conducted to investigate which indicators (features or attributes) relevant to the system and content are most important and affect the successful completion of the work tasks of information seekers. For this purpose, an online questionnaire built with KwikSurveys.com was used, with three parts, viz. (A) general, (B) usability features, and (C) usefulness features. There are 5, 11 and 15 questions in the three parts respectively. Section (A) pertains to general information about the status, background, computer literacy level, etc.; Section (B) contains the usability-related questions; and Section (C) covers the usefulness-related aspects, including two open-ended questions soliciting information on any new
information services and new features. Every question employed a four-point scale excluding a neutral value. These questions investigated the assigned and perceived importance of the specific features, ranging from least important to very important. A total of 70 questionnaires were e-mailed to scientists and library and information scientists affiliated to 46 research institutions selected at random from the list provided at URL: http://cyberjournalist.org.in/frame3.html

Analysis

From Table 1, it is observed that in more than 50 percent of cases the WIFs have registered a fall, and almost no change was noticed in respect of 7 (15%) institutions. A comparison of the indicators under the usability and usefulness components for individual institutions indicates that in more than 50 percent (27) of the institutions there is no direct relationship between the two aspects, as perceived through a test with a common set of criteria for all the institutions. For the two sets of figures, a correlation coefficient was also calculated using the online calculator at URL: http://www.easycalculation.com/statistics/correlation.php. It was found to be 0.08539. This shows that no significant relationship exists between the indicators of the usability and usefulness components. Though inconclusive in the absence of a systematic correlation analysis with a large sample, it certainly indicates that there are websites which are very good in usability, i.e. ‘design’ features, but whose usefulness, i.e. ‘content’, may not be up to the mark, and vice versa. This also calls for a further systematic study with a questionnaire survey using a larger sample, supplemented by different methodologies like focus groups, think aloud, and card sort, which are prevalent at the international level but, as seen from the literature survey, not practiced in India. It would also be interesting to explore further the correlation, if any, between the WIFs (i.e. quantitative indicators) and the qualitative indicators under usability and usefulness used in website evaluation with predefined checkpoints.
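For reference, the correlation reported above can also be computed directly rather than through the online calculator; the sketch below implements Pearson’s coefficient and, for brevity, feeds in only the first six usability/usefulness pairs of Table 1, so the printed value is illustrative and will differ from the 0.08539 obtained from all 46 pairs:

```python
# Pearson correlation between the usability and usefulness indicator counts
# (columns 3 and 4 of Table 1). Only the first six rows are listed here for
# brevity; the study used all 46 pairs.
from math import sqrt

def pearson(x, y):
    n = len(x)
    mean_x, mean_y = sum(x) / n, sum(y) / n
    cov = sum((a - mean_x) * (b - mean_y) for a, b in zip(x, y))
    sd_x = sqrt(sum((a - mean_x) ** 2 for a in x))
    sd_y = sqrt(sum((b - mean_y) ** 2 for b in y))
    return cov / (sd_x * sd_y)

usability  = [3, 7, 7, 5, 3, 5]   # first six institutions in Table 1
usefulness = [3, 3, 3, 5, 10, 5]

print(round(pearson(usability, usefulness), 5))
```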
Table 1—List of selected research institutions along with their WIFs, and number of available indicators under usability and usefulness components

Sl. No. | Institution | WIF (old) | WIF (revised) | Usability indicators (out of 10) | Usefulness indicators (out of 10)
1 | Agharkar Research Institute, Pune | 0.682 | 0.69 | 3 | 3
2 | Birla Institute of Technology and Science, Pilani | 0.761 | 0.78 | 7 | 3
3 | Central Arid Zone Research Institute, Jodhpur (Rajasthan) | 0.444 | 0.43 | 7 | 3
4 | Central Building Research Institute, Roorkee (U. P.) | 1.59 | 2.18 | 5 | 5
5 | Central Council for Research in Homoeopathy, Delhi | 1.346 | 1.32 | 3 | 10
6 | Central Electrochemical Research Institute, Karaikudi | 0.701 | 0.69 | 5 | 5
7 | Central Electronics Research Institute, Pilani | 0.652 | 0.614 | 3 | 7
8 | Central Inland Capture Fisheries Research Institute, West Bengal | 1.058 | 1.058 | 5 | 5
9 | Central Institute of Agricultural Engineering, Bhopal | 1.873 | 1.82 | 3 | 3
10 | Central Marine Fisheries Research Institute, Kochi | 6.91 | 7.241 | 3 | 7
11 | Central Mechanical Engineering Research Institute, Durgapur | 0.23 | 0.22 | 3 | 10
12 | Central Plantation Crops Research Institute, Kasaragod | 1.256 | 0.5 | 5 | 5
13 | Centre for Development Studies, Trivandrum | 4.175 | 4.17 | 3 | 7
14 | Centre for Earth Science Studies, Trivandrum | 2.224 | 2.2 | 7 | 3
15 | Council of Scientific and Industrial Research, Delhi | 2.345 | 2.29 | 5 | 8
16 | Defence Research and Development Organisation, Delhi | 0.467 | 1.519 | 3 | 3
17 | Electronic Research and Development Centre, Trivandrum | 4.3 | 2.76 | 3 | 3
18 | The Energy and Resources Institute, Delhi | 0.277 | 0.198 | 3 | 8
19 | Forest Research Institute, Dehradun | 2.97 | 5.78 | 7 | 3
20 | Harish Chandra Research Institute, Allahabad | 2.082 | 2.04 | 10 | 5
21 | Indian Council of Agricultural Research, Delhi | 0.622 | 0.07 | 10 | 8
22 | Indian Council of Medical Research, Delhi | 0.416 | 0.42 | 10 | 10
23 | Indian Institute of Chemical Biology, Kolkata | 1.905 | 1.88 | 10 | 3
24 | Indian Institute of Management, Ahmedabad | 1.431 | 1.45 | 10 | 5
25 | Indian Institute of Pulses Research, Kanpur | 0.798 | 1.081 | 3 | 10
26 | Indian Institute of Technology, Delhi | 0.365 | 0.359 | 5 | 5
27 | Indian Statistical Institute, Kolkata | 0.143 | 0.43 | 10 | 10
28 | Institute for Plasma Research, Ahmedabad | 0.057 | 0.057 | 3 | 2
29 | Institute of Mathematical Sciences, Chennai | 2.32 | 2.13 | 3 | 3
30 | Institute of Physics, Bhubaneswar | 2.16 | 2.38 | 7 | 3
31 | Kerala Forest Research Institute, Peechi | 0.726 | 0.708 | 5 | 5
32 | National Botanical Research Institute, Lucknow | 1.083 | 1.072 | 7 | 3
33 | National Brain Research Centre, Delhi | 0.565 | 0.56 | 3 | 3
34 | National Centre for Biological Sciences, Bangalore | 0.11 | 0.103 | 10 | 10
35 | National Geophysical Research Institute, Hyderabad | 2.761 | 2.73 | 3 | 3
36 | National Institute for Interdisciplinary Science and Technology, Trivandrum | 0.617 | 0.617 | 7 | 3
37 | National Institute of Mental Health and Neuro Sciences, Bangalore | 4.483 | 4.35 | 2 | 2
38 | National Institute of Ocean Technology, Chennai | 0.905 | 0.8 | 7 | 3
39 | National Institute of Oceanography, Goa | 3.118 | 3.061 | 5 | 5
40 | National Institute of Science Communication and Information Resources, Delhi | 1.578 | 1.54 | 10 | 8
41 | National Transportation Planning and Research Centre, Trivandrum | 7.636 | 1.09 | 5 | 5
42 | Petroleum Conservation Research Association, Delhi | 1.072 | 1.048 | 5 | 5
43 | Rajiv Gandhi Centre for Biotechnology, Trivandrum | 1.25 | 1.23 | 3 | 7
44 | Science and Engineering Research Council, Delhi | 0.379 | 0.37 | 7 | 3
45 | Sree Chitra Tirunal Institute of Medical Sciences and Technology, Trivandrum | 2.553 | 2.517 | 3 | 10
46 | Tata Institute of Fundamental Research, Mumbai | 0.086 | 0.085 | 2 | 8
Usability

It was seen that representation from various subject fields was ensured while finalizing the list of institutions. Thirty respondents (42.85%) completed the questionnaire, and most of them were from the information science field (80%), followed by scientists from the medical, agriculture and engineering fields with only about a 20 percent share. From the responses of the 30 respondents, and assuming relative values ranging from 1 to 4 for the options ‘not very important’ to ‘very important’, two groups were formed for convenience: one by adding the figures of the first two columns and another by adding those of the last two columns. It was found (Table 2) that indicators like search process (29, 96%), navigation (27, 90%), easy availability (26, 86%), efficiency (26, 86%), learnability (26, 86%), labels (22, 73%), and effectiveness (21, 70%) were assessed to be either important or very important by more than 50 percent of the respondents. The remaining features, viz. content presentation (16, 53%) and terminology (15, 50%), were placed in the ‘uncertain’ or ‘moderate’ category. The respondents did not attach any importance to aesthetic appearance, since only 10 (33%) of them felt it was important.
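The grouping described above can be reconstructed from Table 2 as follows (an illustrative reconstruction, not code from the study): for each checkpoint the ‘important’ and ‘very important’ counts are added and expressed as a share of the 30 respondents.

```python
# Illustrative reconstruction of the Table 2 grouping: add the 'Important' and
# 'Very important' counts per checkpoint and express the total as a percentage
# of the 30 respondents. Only a few checkpoints are listed here.
table2 = {
    # checkpoint: (not very important, it depends, important, very important)
    "Navigation": (0, 3, 14, 13),
    "Labels": (1, 7, 12, 10),
    "Effectiveness": (2, 7, 19, 2),
    "Terminology": (0, 15, 7, 8),
}

RESPONDENTS = 30
for checkpoint, (nvi, dep, imp, vimp) in table2.items():
    favourable = imp + vimp
    print(f"{checkpoint}: {favourable} ({favourable / RESPONDENTS:.0%})")
# Navigation -> 27 (90%), Labels -> 22 (73%), matching the figures quoted in the text
```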
Usefulness

From Table 3, the analysis of the 30 responses leads to the observation that reliability (27, 90%), authenticity (26, 87%), level of details (25, 83%), specificity (25, 83%), relevance (23, 77%), timeliness (20, 67%), time constraints (17, 57%), and completeness (17, 57%) are the features most liked by the scientists. Topicality (15, 50%) may be treated as being at the ‘uncertain’ level.
Table 2—Survey analysis – usability

Checkpoints | Not very important | It depends | Important | Very important
Navigation | 0 | 3 | 14 | 13
Labels | 1 | 7 | 12 | 10
Easy Availability | 3 | 1 | 9 | 17
Effectiveness | 2 | 7 | 19 | 2
Aesthetic Appearance | 7 | 12 | 9 | 1
Learnability | 1 | 4 | 16 | 10
Terminology | 0 | 15 | 7 | 8
Efficiency | 1 | 3 | 20 | 6
Content Presentation | 3 | 11 | 14 | 2
Search Process | 0 | 1 | 7 | 22
Table 3—Survey analysis – usefulness

Checkpoints | Not very imp./Rarely/Easy | It depends/Sparingly | Important/Regularly/Difficult | Very imp./Frequently/Very difficult
Relevance | 1 | 5 | 16 | 7
Reliability | 0 | 3 | 19 | 8
Level of Details | 2 | 2 | 20 | 5
Completeness | 0 | 13 | 7 | 10
Topicality | 2 | 13 | 12 | 3
Time Constraints | 2 | 11 | 14 | 3
Clarity | 11 | 15 | 4 | 0
Authenticity | 0 | 4 | 18 | 8
Timeliness | 0 | 10 | 16 | 4
Specificity | 0 | 2 | 18 | 7
However, it was observed that the respondents gave the least importance to the ‘clarity’ feature (4, 13%). Regarding the open-ended questions, a few suggestions from the questionnaire survey were: provision of up-to-date and authentic contact information for the author or producer of the site; more links leading to further information about the organization, along with the dates of creation of the various links and the date the website was last revised; provision of an advanced search facility, online help, indications of the scope of coverage and treatment of the subject (scholarly or general), feedback, provision for users to add new links, and an online question and answer facility; immediacy, accuracy and currency of information to be treated as the most important factors in fulfilling users’ information needs; and usability training to be imparted to the staff managing the contents of the website. The analysis in respect of the other parameters, like status, organization, subject areas, reasons for using the website, subject-wise distribution, and computer literacy level, did not yield any conclusive results in view of the poor response rate (42.85 percent only), so these are not discussed. However, it appears that information and computer literacy play a more important role than the scientific discipline, and that respondents from various fields share almost the same concerns about the quality of interaction.

Limitations

Since the present study is preliminary in nature and does not cover a large sample, the actual users of a particular research institution’s library website could not be taken separately. Instead, the survey was conducted collectively for all users who might have used the respective websites.

Discussion

As observed from the analysis of the survey, reliability, authenticity, level of details, specificity, relevance, timeliness, time constraints, and completeness are the indicators most liked by the respondents. Of these, two indicators, viz. relevance and timeliness, have been considered critical for the successful conclusion of the respondents’ work tasks, as also reported in previous studies concerning the reliability of online scholarly information by Liu20. Easy-to-use systems lead to much better performance, and an easy-to-navigate system contributes to fast learning.
Aesthetic appearance, however, is not important and does not affect the work/task either. The two concepts, viz. ‘usability’ and ‘usefulness’, constitute the objects of two main evaluation research areas for further exploring the complex phenomena involved in increasing the quality of user interaction. While there may be agreement on the number of indicators for both usability and usefulness at the indicative (macro) level, standardization of the illustrative, i.e. ‘how’ aspect, level indicators is not an easy task, because of the complexities in the perceived interaction of website users with different cognitive and computer literacy levels, and in the absence of any agreed-upon methodologies/techniques. Given the limitations of quantitative indicators, the qualitative indicators seem to be more reliable and better explored in the earlier studies, particularly at the international level. The Indian literature survey reveals that, as far as the methodologies and techniques of evaluating websites are concerned, we are lagging far behind. Just as with the concept of ‘quality’ in its conventional form, website evaluation also has to be designed focusing on the interests of the intended audience (customers) at different intervals of time.

Conclusion

The present preliminary study leads to certain important issues for quality improvement of websites, particularly in the context of usability and usefulness. An earlier study21 found that some evaluation parameters, i.e. five features each (in contrast to 10 in the present study) under usefulness and usability, are interconnected and that users do not find discriminating differences between them. However, the current study does not find any significant relationship between the two, as revealed by the correlation coefficient in the qualitative study. The quantitative study in terms of WIF also does not lead to any conclusive findings. This aspect remains open for future studies, where more rigorous and larger sampling would be required to arrive at meaningful findings. This may require agreement on the adoption of evaluation parameters not only for usefulness features at the illustrative (micro) level, where the role of LIS professionals is more important, but also for standardizing usability features at the indicative (macro) level. Another aspect is closely related to the context of ‘work’ and ‘content’, suggesting more features that increase relevance to
the intended audience. This relates to one of the suggestions made by a respondent in the current preliminary feedback survey, namely that ‘there should be provision of online feedback surveys on the website’ for regular interaction, and that websites should be regularly redesigned based on users’ feedback. The above findings point to the necessity of filling certain gaps related to the standardization of evaluation indicators under the scope of ‘usability’ and ‘usefulness’, both at the indicative (macro) level and at the illustrative (micro) level. This may include the formulation of guidelines (mandatory and optional) with suitable examples/illustrations under the scope of each indicator. At this juncture, the role of LIS professionals as intermediaries for users comes into play. They should assess how well the information resources are selected, represented, organized, structured and managed under the various typical activities/services such as collection development; user studies; information organization; classification and indexing; resource discovery (metadata); access and file management; information retrieval; and standards. It is believed that the findings and issues emerging from this preliminary study will provide useful input towards streamlining the issue of evaluation indicators of library websites. Further, by adopting the suggested approach, the effective involvement of library professionals in evaluating library websites from the design to the content management stage will go a long way towards effective evaluation of the same.
References

1. Buchanan S and Salako A, Evaluating the usability and usefulness of a digital library, Library Review, 58(9) (2009) 638-651.
2. Tsakonas G and Papatheodorou C, Analysing and evaluating usefulness and usability in electronic information services, Journal of Information Science, 32(5) (2006) 400-419.
3. Chao H, Assessing the quality of academic libraries on the web: the development and testing of criteria, Library and Information Science Research, 24 (2002) 169-194.
4. Sasikala C, Evaluation of websites: a study, Proceedings of the 4th International Convention on Automation of Libraries in Education and Research (INFLIBNET, Ahmedabad), (2003) 1-12.
5. Kumar S B T, Prithvi R K R, Naik A S and Reddy R, Content analysis of Indian Institute of Management library websites: an analytical study, Proceedings of the 7th International Convention on Automation of Libraries in Education and Research (INFLIBNET, Pondicherry), (2009) 194-201.
6. Chowdhury S, Landoni M and Gibb F, Usability and impact of digital libraries: a review, Online Information Review, 30(6) (2006) 656-680.
7. Koovakkai D and Thensi P, Library related information on the university sites: a study of the websites of universities in two south Indian states, IASLIC Bulletin, 53(2) (2008) 114-118.
8. Raju N V and Harinarayana N S, An analysis of usability features of library websites, Annals of Library and Information Studies, 55 (2008) 111-122.
9. Ibid.
10. Malhan I V and Rao S, Portal approach to library websites: libraries need to discover new integrated platforms, Proceedings of the 4th International Convention on Automation of Libraries in Education and Research (INFLIBNET, Gulbarga), (2006) 627-635.
11. Shokeen N S, Use of portals for improved access to libraries and information services in the web environment, H.A. Bulletin, 45(1-2) (2009) 5-8.
12. Krishnamurthy M and Chan W S, Implementation of library portals for information resources: a case study of the Indian Statistical Institute, Bangalore (ISIB), The International Information and Library Review, 37 (2005) 45-50.
13. Nielsen J, Card sorting: how many users to test, Available at: www.useit.com/alertbox/20040719.html (accessed on 10 September 2010).
14. Holzinger A, Usability engineering for software developers, Communications of the ACM, 48(1) (2005) 71-74.
15. Turnbow D, Kasianovitz K, Snyder L, Gilbert D and Yamamoto D, Usability testing for web redesign: a UCLA case study, OCLC Systems & Services, 21(3) (2005) 226-234.
16. Tsakonas. Op. cit.
17. Ingwersen P, The calculation of web impact factors, Journal of Documentation, 54(2) (1998) 236-243.
18. Noruzi A, The Web Impact Factor: a critical review, The Electronic Library, 24(4) (2006) 490-500.
19. Ramesh Babu B, Jeyshankar R and Nageswara R P, Websites of Central Universities in India: a webometric analysis, Bulletin of Library and Information Technology, 30(4) (2010) 33-43.
20. Liu Z, Perceptions of credibility of scholarly information on the web, Information Processing and Management, 40(6) (2004) 1027-1038.
21. Tsakonas. Op. cit.
Annexure I
‘Usability’ table: each indicator is listed with what it means and its scope of coverage/examples (based on minimum criteria)

1. Aesthetic appearance
What it means: Appearance is visually appealing, not cluttered or busy.
Scope/examples: The home page may or may not require scrolling to another page, but not at the cost of visual appeal, which may distract the user and not be helpful.

2. Content presentation
What it means: Consistent page headings and associated links for easy page recognition.
Scope/examples: Users generally scan a page instead of reading it consecutively. They spend minimum time on it, so they should be able to recognize how to get to the required information, e.g. ‘Contact information’ should be accessible from every page.

3. Easy availability
What it means: How easy it is to use all the functions provided by the system.
Scope/examples: (i) While interacting, there is a clear indication of visited and non-visited links through colour coding, and the graphic/text combination is balanced; (ii) the ‘undo’ or ‘back’ function is easy and user input is not lost with the ‘back’ button; (iii) page loading/information retrieval in progress conveys accurate status messages.

4. Effectiveness
What it means: (i) Assesses the use of the website to achieve the desired task with ease; (ii) how important system effectiveness is in the information-seeking process.
Scope/examples: (i) Users should not get lost while interacting with various resources such as databases, catalogue search, etc.; (ii) appropriate and informative feedback should be provided about the sources and what is being searched for; (iii) the necessity of remembering a user name and password in every interaction should be obviated.

5. Efficiency
What it means: Pages load fast; speed is important.
Scope/examples: The downloading process should not lead to a ‘hang’ situation, and the time-out feature should be clearly specified.

6. Labels
What it means: A user interface control which displays text on a form; a static control having no interactivity.
Scope/examples: Toolbars, buttons, icons and drop-down features should be sensibly presented and labelled, and should be unambiguous.

7. Learnability
What it means: (i) Assesses how easy it is to learn the site, i.e. the progression from novice to skilled user; (ii) a process of self-instruction or attending structured courses for making the system more usable.
Scope/examples: The most important features and the purpose of the site should be easily determined, e.g. help file, guide, online tutorials, online learning materials, etc.

8. Navigation
What it means: The ability to traverse a site using the available navigation tools, e.g. back buttons and links. Includes a link back to the home page and a navigation scheme through many layers of links between pages.
Scope/examples: The majority of visitors do not land directly on the home page but deep within the site from a search engine, so well-structured navigation will enable them to pick up navigation immediately, e.g. a search with the term ‘nucssi’ under Google search does lead to the required link directly under the link NUCSSI.

9. Search process
What it means: (i) Enables different types of searches for different skill levels and preferences; (ii) various options for searching on the home page, like an A-Z list or a general ‘search’.
Scope/examples: (i) Most users prefer a guided tour before performing their required search; (ii) the search function should include the library catalogue, and the types and strategies of searches with examples; (iii) it should be clearly stated what is searched when a user clicks on ‘search’, i.e. whether the ‘catalogue’, the ‘library website’, the ‘institution’ or even the ‘internet’; (iv) search functions should have default values with specifiable preferences, besides cross-database search facilities; (v) facilities for printing, exporting and navigation in the list of records; (vi) marking of records, sort options, and e-mailing of records.

10. Terminology
What it means: Adequacy and comprehensibility of the terms and phrases used to describe functions or content, which should be familiar to users.
Scope/examples: (i) The distinction among the terms ‘serial’, ‘periodical’ and ‘journal’ should be clearly brought out with respect to scope, i.e. whether annual reports, professional journals, popular magazines, etc. are covered under the appropriate headings; (ii) the scope of the term ‘e-resources’ should be defined, i.e. whether it covers only databases, or also e-journals and external links to other resources, etc.
Annexure II
‘Usefulness’ table: each indicator is listed with what it means and its scope of coverage/examples (based on minimum criteria)

1. Authenticity
What it means: Refers to the truthfulness of origins, attributions, commitments, sincerity, devotion, and intentions.
Scope/examples: (i) Whether the data on the physical availability status of a particular book, identified through the respective library website, turns out to be true after verification from the library; (ii) for knowing the acquisition policy or the list of Indian journals covered by a particular library, the most authentic source would be the library website; (iii) the need to pay for access to required information is not specified properly.

2. Completeness
What it means: Sufficient/full information is available relating to a particular topic.
Scope/examples: (i) Insufficient information to understand the structure of digital libraries, e.g. incomplete bibliographic data such as phone numbers, e-mails, city codes, etc.; (ii) access to information published in the past.

3. Clarity
What it means: Clearness of thought.
Scope/examples: When an attempt is made to contact a staff member for a particular task through the ‘Staff directory’, it is not clear whether temporary staff working on projects are also included.

4. Format
What it means: A resource attribute connecting to users’ work practice.
Scope/examples: Printed content should be output in a usable format, e.g. while downloading an application or registration form, only the contents of the form are saved or printed and not the associated labels/emblem information.

5. Level of details
What it means: The various representations of information provided, e.g. abstracts, full texts, etc.
Scope/examples: Whether full-text information is available or just a summary/abstract is given.

6. Relevance
What it means: How the content corresponds to the work task.
Scope/examples: When the user searches (through the OPAC) for ‘subscribed journals’ only but finds a list that also includes gifts, exchange items, online resources, CD-ROMs and digital publications.

7. Reliability
What it means: How credible the resource is and how well it satisfies present and future aspects of the work task.
Scope/examples: When trying to obtain an article through interlibrary loan, outdated information is available on the site, unlike in previous transactions.

8. Specificity
What it means: There should be a close association between a category of information and the associated link/sublink.
Scope/examples: (i) Information on ‘subject experts’ should be provided under a specific category like ‘Reference & Research Help’ or ‘Subject experts’ and not under the ‘About the Library’ link; (ii) when an article is searched in a full-text database by the title “usefulness of library website”, the hit results also retrieve articles on “usefulness of website or web portal”.

9. Timeliness
What it means: How current the information resource is and how well it satisfies the information need.
Scope/examples: The date and time the site was last updated, and by whom (desirable), should be displayed.

10. Topicality
What it means: Currently of interest; contemporary; relating to a particular topic or topics.
Scope/examples: Finding and sorting material for a research project on a specific topic/subject, e.g. what a library website should cover under ‘e-resources’, such as subscribed e-resources, open access resources, digital contents, audio-video, CD-ROMs, patents, standards, etc.