ERIC Number: EJ848664
Record Type: Journal
Publication Date: 2004
Pages: 22
Abstractor: As Provided
ISBN: N/A
ISSN: ISSN-1547-9714
EISSN: N/A
Web Log Analysis: A Study of Instructor Evaluations Done Online
Klassen, Kenneth J.; Smith, Wayne
Journal of Information Technology Education, v3 p291-312 2004
This paper focuses on developing a relatively simple method for analyzing web-logs. It also explores the challenges and benefits of web-log analysis. The study of student behavior on this site provides insights into website design and the effectiveness of this site in particular. Another benefit realized from the paper is the ease with which these concepts can be discussed with students. The purpose and context of the website used are easily understood by instructors and students, providing the basis for some rich discussions regarding web-logs in general, web-log analysis and its challenges, the use of data for decision making, and other aspects. Web-logs record every page that users access during their visit to a site, thus providing a picture of their behavior based on a number of things, including the order in which they hit pages (i.e., the "traversal path"), the length of time spent on each page, whether they go to pages in error, and whether they return more than once to a page. This is a vast data resource that many companies have difficulty using efficiently and effectively. Using the online instructor evaluation site provided a finite database to study; in this case, raw data of approximately 35,000 records represented 3,368 student-class evaluations completed over a two-week period (a small web-log). Open-source software was used to calculate a number of statistics, and it was found that students do access the evaluation site from home, taking advantage of the 24-hour availability. However, many students fill out evaluations on weekdays between 9 am and 3 pm, suggesting that a fair proportion do the evaluations between classes while they are on campus. The main portion of this work involved identifying and analyzing the traversal paths of the students through the site. There was a huge number of unique paths, highlighting the difficulty of isolating common user behaviors. It was found that students are able to complete the entire process (signing in, filling out one or more evaluations, etc.) in just over eight minutes on average. Students spend, on average, 82 seconds answering 12 multiple-choice questions and 110 seconds filling out qualitative comments, suggesting that they spend enough time to evaluate the course adequately. In fact, students provide more qualitative comments on the online evaluations than they do on the in-class paper evaluations. Further path analysis helped to show why there are so many unique paths; for instance, if a user hits "reload", the page registers again in the web-log, creating a different path from that of a user who loads the page only once, even though these two students are identical from a user-need standpoint. The use of the "Back" button in the web browser and the ease with which users can jump from page to page around the site also create many different paths. This study found that the vast majority of visitors to the site were able to complete evaluations, and to do so in an efficient, time-effective manner. Thus, this is a well-designed site that helps and directs users to the appropriate pages. It was determined that future analyses would benefit from some prior planning by the web-log administrator, including additional information in the form of "markers" in the logs to track user behavior, which would reduce the complexity of the analyses required. The main reason for evaluating courses online is that it can drastically reduce the administrative cost and time required. The results here suggest that this is also an effective and efficient method for both students and instructors. Finally, as mentioned above, the information presented here can provide the basis for interesting class discussion of these issues. (Contains 5 figures and 5 tables.)
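To illustrate the kind of session and traversal-path reconstruction the abstract describes, the sketch below groups raw web-log hits by visitor, splits them into sessions, and computes the traversal path and the time spent between consecutive page hits. This is not the authors' method or software; the NCSA Common Log Format, the use of the client IP as a visitor key, the "access.log" file name, and the 30-minute session timeout are all assumptions made for illustration.

```python
# Minimal sketch: reconstruct per-visitor traversal paths and dwell times
# from a web-log. Log format, visitor key (client IP), and the 30-minute
# session timeout are assumptions, not details taken from the paper.
import re
from collections import defaultdict
from datetime import datetime, timedelta

# NCSA Common Log Format: "ip - - [timestamp] \"GET /path HTTP/1.1\" ..."
LINE_RE = re.compile(r'(\S+) \S+ \S+ \[(.*?)\] "(?:GET|POST) (\S+)')
SESSION_GAP = timedelta(minutes=30)  # assumed idle gap that ends a session

def parse_line(line):
    """Return (ip, timestamp, path) or None if the line does not match."""
    m = LINE_RE.match(line)
    if not m:
        return None
    ip, ts, path = m.groups()
    return ip, datetime.strptime(ts.split()[0], "%d/%b/%Y:%H:%M:%S"), path

def sessions(log_lines):
    """Group hits by IP, then split each visitor's hits into sessions."""
    hits = defaultdict(list)
    for line in log_lines:
        parsed = parse_line(line)
        if parsed:
            ip, ts, path = parsed
            hits[ip].append((ts, path))
    for ip, visits in hits.items():
        visits.sort()
        current = [visits[0]]
        for ts, path in visits[1:]:
            if ts - current[-1][0] > SESSION_GAP:
                yield ip, current
                current = []
            current.append((ts, path))
        yield ip, current

def summarize(session):
    """Traversal path plus seconds spent between consecutive page hits.
    (The dwell time on the final page is unknown, since there is no later hit.)"""
    path = [p for _, p in session]
    dwell = [(b[0] - a[0]).total_seconds() for a, b in zip(session, session[1:])]
    return path, dwell

if __name__ == "__main__":
    with open("access.log") as f:  # hypothetical log file name
        for ip, sess in sessions(f):
            traversal, dwell_seconds = summarize(sess)
            print(ip, " -> ".join(traversal), dwell_seconds)
```

A sketch like this also makes the abstract's point about path explosion concrete: a "reload" or a "Back"-button revisit appears as an extra hit and therefore as a distinct traversal path, even though the visitor's intent is unchanged.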
Descriptors: Web Sites, Student Behavior, Path Analysis, Internet, Course Evaluation, Computer Assisted Testing, Navigation (Information Systems), Search Strategies, Online Searching, User Satisfaction (Information), Critical Incidents Method, Computer Software Evaluation, Use Studies
Informing Science Institute. 131 Brookhill Court, Santa Rosa, CA 95409. Tel: 707-537-2211; Fax: 480-247-5724; Web site: http://JITE.org
Publication Type: Journal Articles; Reports - Research
Education Level: Higher Education
Audience: N/A
Language: English
Sponsor: N/A
Authoring Institution: N/A
Grant or Contract Numbers: N/A