Summary of the Student IEI Surveys for 1997-1998



I. Introduction

The Instructional Enhancement Initiative (IEI) aims to enhance the undergraduate educational experience in the College of Letters and Science at UCLA by providing broad support for new instructional technologies. Initiative funds support many activities, including: a) creation of new student computer labs, b) expansion of existing labs, c) extension of computer lab hours, d) creation of Web pages for all classes, and e) staff to run the computer labs, assist faculty in using the instructional technologies, and teach students how to use these resources. Few policies have had such a major impact on undergraduate education.

The UCLA College of Letters and Science Instructional Computing Committee has taken on the challenge of sampling the impact of the Initiative on students and faculty and making recommendations to improve its effectiveness. The attached report describes the results of three broad surveys of students in the 1997-1998 academic year, the first year of the IEI. Although only a small portion of the funds collected through the IEI directly supports class Web pages, such pages are the most visible, and contentious, component of the IEI. As such, much of the survey focused on use of the Web. An executive summary is presented first, followed by the statistical results. The committee's intent was not to evaluate instructors or their use of online instructional resources, but rather to identify changes in the educational experience of students as a result of the resources provided by the IEI.

II. Background on the Survey

The College of Letters and Science Instructional Computing Committee (ICC), with assistance from the Office of Instructional Development (OID), developed a short survey questionnaire for a cursory evaluation of student response to the IEI. The survey focused on course Web sites, with an emphasis on identifying areas for improvement rather than assessing content. The first two surveys did not explore other resources supported by the IEI such as student computing centers, but the Spring Quarter survey did. Issues such as the pedagogical value and effectiveness of Web-based instruction were specifically not addressed, as these would be significant research projects in their own right.

For logistical as well as economic reasons, OID recommended sampling approximately 5,000 - 10,000 class attendees each quarter rather than all student enrollments, which total approximately 140,000. The ICC thus developed a list of 60-70 courses each quarter for the survey, spanning small and large classes, upper and lower division, and all four divisions of the College (Humanities, Life Sciences, Physical Sciences, Social Sciences). Since the intent of the survey was to gauge the impact of online instructional resources on students, only courses with active Web sites were selected. The ICC is well aware that some courses made minimal use of the Web for instruction; those courses were not selected.

The forms were delivered, collected, handled, and processed using the existing mechanisms of the Evaluation of Instruction Program under OID. The ICC intended the survey to provide broad indications of areas for improvement, not to provide evaluation data on individual instructors and their courses. Therefore, individual course data were not distributed, and the student response forms were returned to the individual faculty members.

III. Summary of Results

A. Response

Of the 60-70 courses selected each quarter, 40-50 returned completed questionnaires. This is a reasonable response rate given the time pressures in most classes, the fact that faculty had no input on whether their classes were selected, and the fact that administering the survey was voluntary. The numbers of students answering the questionnaires were 3,398, 4,207, and 4,125 for the Fall 1997, Winter 1998, and Spring 1998 Quarters, respectively.

B. Distribution of Students by Year in School and Gender

The students surveyed were evenly distributed by year in school: each category (Freshman, Sophomore, Junior, and Senior) comprised roughly 25% of respondents each quarter. Somewhat surprisingly, female respondents outnumbered male respondents by roughly 60% to 40% over the three quarters.

C. Residence

Roughly a third (36%) of the respondents each quarter lived in the residence halls.

D. Experience Using the Web

The vast majority of students reported experience using the Web prior to the surveyed course. Remarkably, this figure rose from 87% in Fall Quarter 1997 to 97.5% by Spring Quarter 1998, with a full 80% of students rating their ability as intermediate or advanced. This is an important result, as it can allay faculty fears that students are unable to retrieve course materials posted on a class Web site. Efforts toward basic Web training for students may need to focus only on introductory classes in Fall Quarter.

E. Frequency of Access to the Course Web Site

The Fall and Winter Quarter surveys asked only whether students had accessed the course Web site; 85% and 93%, respectively, had. The Spring Quarter survey asked the frequency of access, and 50% of the respondents reported accessing the course Web site once or more per week.

F. Point of Access to the Course Web Site

The three principal points from which students accessed course Web sites were remote dial-in via Bruin OnLine (36%), the residence halls (27%), and campus computer labs (22%).

G. Satisfactory Access to the Course Web Site

Only a small minority (roughly 10%) of students reported unsatisfactory access to the course Web sites. The principal reasons for dissatisfaction were speed of connection, problems making a connection, and lack of information. The intent of these two questions was to gauge hardware issues, but one interpretation of the "lack of information" response is that some students answered based on the content of the course Web site. The problem of making a connection is a campus-wide issue with which the administrators at Communications Technology Services and Bruin OnLine are familiar. Of the 4,125 respondents to the Spring 1998 survey, only 119 cited access to a computer as a problem.

H. Contact with Instructors, Teaching Assistants and Other Students

A specific requirement of the course Web sites under the IEI was some mechanism for interaction with the instructor. Over 60% of the respondents indicated that the Web site facilitated their contact with the instructor in some way, with more than 40% indicating either moderate or significant enhancement of contact with faculty. In contrast with the instructor requirement, student-teaching assistant and student-student interaction mechanisms were not required under the IEI. Nevertheless, about 40% of respondents indicated that the Web site facilitated their contact with the TA, and 40% indicated that it facilitated their contact with other students.

I. Useful Components of Course Web Sites

The five components of the course Web sites that students found most useful were (in order): Exams/Exam Keys, Lecture Notes, Assignments/Problems, Discussion Bulletin Board, and Syllabi.

J. Use of Student Computing Labs

The Spring Quarter survey found that 60% of the respondents used on-campus student computing labs. (Note that other surveys indicate roughly 70% of students own their own computers.) Only 8% of the respondents reported dissatisfaction with the support staff in the on-campus student computing labs.

K. Written Comments About Class Web Sites and the IEI

Of the nearly 12,000 forms returned, about half had written comments (positive and/or negative) on the back of the evaluation form. All of the comments were recorded and categorized. Comments from the Fall Quarter 1997 survey were approximately 40% positive and 60% negative. The negative comments diminished significantly as the academic year progressed.

Positive comments tended to echo the numerical data on what students found most useful, with the noteworthy addition of elements that provided "visual interest and clarity". Aside from this last item, the useful elements of the Web sites largely duplicate non-Web activities. Centralizing these features is no doubt attractive, but the relative value of the different components of the Web sites will likely change dramatically as the sites mature and grow in sophistication. The items providing "visual interest and clarity" may be the true harbingers of educational change.

The negative comments were by and large focused on the administration of the Initiative, and specifically the fees; overall, they were not directed against the concept of online and other new instructional resources. There was a general feeling that the administration of the fees was unfair, owing either to the perception that faculty were not making full use of the Web or to the perception that the value received was insignificant relative to the cost. Many students held the perception that 100% of the fees supports only the course Web pages. There was also dissatisfaction that the Initiative was imposed without sufficient discussion or consultation with students.

Some comments included technical suggestions that will allow faculty and Web site coordinators to improve the content and presentation of the course Web sites.

IV. Comments on Interpretation of the Survey Results

With nearly 12,000 student responses, these data provide an interesting starting point for what is sure to be a revolution in the delivery of instruction. However, since the sampling rate and rate of return were somewhat low, caution should be applied in interpreting these results too broadly. Further, responses to innovations are often biased toward the extremes (both positive and negative).

Three points from the beginning of this report should be reemphasized. First, the intent was not to evaluate instructors, but rather to identify changes in the educational experience of students using the new instructional resources. Second, only courses with active Web sites were selected for the surveys. Third, the pedagogical value and effectiveness of Web-based instruction were specifically not addressed.

V. Future Steps

The ICC plans to conduct a content analysis of the written comments. In addition, OID plans to reprocess the survey forms in order to provide finer levels of discrimination.

The ICC and OID plan to sample courses once in the 1998-1999 academic year. Emphasis will be placed on adjusting existing questions to gather more detailed information and adding new questions to gain better insight on the impact of the IEI on undergraduate education at UCLA.

VI. Statistical Results

The following is a summary of the statistical results for the IEI student surveys.

1. Year in school

Fall 1997 Winter 1998 Spring 1998
Freshman 23.8% 23.1% 22.4%
Sophomore 25.5% 21.6% 21.8%
Junior 27.2% 26.2% 26.0%
Senior 22.0% 27.7% 28.9%
Graduate 1.0% 1.0% 0.5%
Other 0.4% 0.4% 0.4%

2. Gender

Fall 1997 Winter 1998 Spring 1998
Male 39.2% 44.7% 40.0%
Female 60.8% 55.3% 60.0%

3. Do you live in a residence hall?

Fall 1997 Winter 1998 Spring 1998
Yes 36.3% 35.8% 35.7%
No 63.7% 64.2% 64.3%

4. Prior to this course, did you have experience using the Web? (Fall and Winter) / Rate your experience using the Web. (Spring)

Fall 1997 Winter 1998
Yes 87.0% 93.2%
No 13.0% 6.8%

Spring 1998
No experience 2.5%
Beginner 17.3%
Intermediate 53.1%
Advanced 27.1%

5. Did you access your course Web site at least once? (Fall and Winter) / How frequently did you access the course Web site? (Spring)

Fall 1997 Winter 1998
Yes 84.8% 93.3%
No 15.2% 6.7%

Spring 1998
Never 2.5%
A few times 17.3%
Once per week 53.1%
Several times per week 27.1%
Daily 6.7%

6. How did you primarily access your course Web site?

Fall 1997 Winter 1998 Spring 1998
Campus Residence Hall 27.0% 26.8% 27.0%
Campus Computer Lab 22.9% 21.1% 23.3%
Bruin OnLine 36.1% 39.1% 34.9%
Internet Service Provider 11.0% 10.6% 12.0%
Other 3.0% 2.5% 2.8%

7. When you accessed the course Web site:

A. How satisfactory was the access?

Fall 1997 Winter 1998 Spring 1998
Highly Unsatisfactory 6.5% 5.5% 4.3%
Unsatisfactory 8.0% 4.6% 6.3%
Neutral 16.9% 12.8% 16.2%
Satisfactory 52.7% 53.2% 55.6%
Highly Satisfactory 16.0% 23.9% 17.7%

B. If access was not satisfactory, was it due primarily to (select one):

Fall 1997 Winter 1998 Spring 1998
Making a connection 21.5% 20.4% 18.2%
Speed of connection 23.5% 29.1% 27.9%
Access to a computer 12.6% 12.1% 8.7%
The computer 1.7% 2.9% 3.7%
The software 3.2% 3.6% 4.1%
Web experience 3.7% 5.0% 4.1%
Lack of information 18.2% 11.0% 18.0%
Lack of training 2.8% 3.2% 2.3%
Other 12.7% 12.7% 13.0%

8. Did the course Web site facilitate your contact with the following individuals:

A. With the instructor?

Fall 1997 Winter 1998 Spring 1998
Not at all 37.3% 31.0% 36.5%
Only slightly 21.4% 19.9% 21.4%
Moderately 27.6% 29.7% 28.6%
Significantly 13.7% 19.4% 13.5%


B. With the teaching assistant?

Fall 1997 Winter 1998 Spring 1998
Course did not have TA 8.0% 14.9% 14.9%
Not at all 51.7% 47.4% 45.9%
Only slightly 14.6% 15.8% 16.1%
Moderately 17.5% 16.4% 16.3%
Significantly 8.2% 5.6% 6.8%

C. With other students in the course?

Fall 1997 Winter 1998 Spring 1998
Not at all 63.5% 56.3% 61.0%
Only slightly 17.9% 20.5% 17.8%
Moderately 14.5% 16.7% 15.0%
Significantly 4.1% 6.5% 6.3%

9. Which components of the course Web site did you find to be the most useful (select all that apply)?

Fall 1997 Winter 1998 Spring 1998
Discussion bulletin board 15.2% 14.7% 14.5%
Syllabus 13.2% 12.8% 13.5%
Lecture notes 19.6% 16.6% 21.5%
Exams/Exam keys 23.9% 25.2% 20.9%
Useful links 5.4% 6.6% 7.6%
Assignments/Problems 17.5% 21.0% 17.0%
Other 5.1% 3.2% 5.0%

10. How often did you use the student computing labs (e.g., CLICC, the Science Learning Center, departmental labs)?

Spring 1998
Never 39.1%
A few times 29.8%
Once per week 12.4%
Several times per week 14.6%
Daily 4.2%

11. How helpful were the support staff at the student computing lab you used most?

Spring 1998
No opinion 52.4%
Highly unsatisfactory 2.3%
Unsatisfactory 5.8%
Satisfactory 33.7%
Highly satisfactory 5.8%

12. Please suggest how the Web site for this class could be improved for future students in the course.

Written responses.

13. Are there any additional comments you would like to make about the use of the Web in this course?

Written responses.

Questions on this report may be directed to:

Professor Craig A. Merlic
Chair, Instructional Computing Committee
College of Letters and Science
1312 Murphy Hall
UCLA 143801
