Guidelines for the administration,
use, and interpretation of the "Student Perception of Teaching" (SPOT)
Revised by the Faculty Senate spring 2000 and spring 2004
The reliability of data gathered by way of student evaluation instruments
depends, in part, on the establishment of a set of common practices for
administration and use. The following statements constitute a set of guidelines
for the administration and use of SPOT.
- Administration
- Analysis and reporting
- Guidelines for appropriate use of SPOT results
- Warnings against inappropriate use of SPOT results
- Guidelines for interpretation of SPOT results
A. Administration
- Administration of the instrument shall ordinarily be conducted during the last ten class days of the semester (last five class days in a summer session) at a time convenient to the instructor. Administration times will be determined by agreement of the dean, the department chairperson, and the faculty member. Days when tests are being administered or returned shall be avoided when possible.
- SPOT shall ordinarily be used by all instructors in all courses every semester, including summer sessions. Paper versions of SPOT will be used in traditional classroom settings, and online courses will use an online version of SPOT. Recognizing, however, that some courses rely heavily on specialized, non-classroom learning experiences (e.g., field-based, laboratory-based, or performance-based courses), exceptions may be established at the departmental level by mutual consent of a faculty member and chairperson. In such cases, some method of student evaluation shall be implemented by the department chairperson.
- Should departments wish to use additional evaluation instruments, these departmental instruments shall be administered after the administration of SPOT.
- Administration of the paper SPOTs shall be delegated to an individual other than the instructor. That individual may be a student or another designated person.
- A brief standardized statement of instruction shall be presented
to each class prior to the administration of SPOT.
- During the administration of the paper SPOTs, the instructor shall
leave the classroom and its vicinity.
- Departments shall avoid practices which compromise student anonymity
(i.e., student names and/or identification numbers shall not appear
on evaluation forms).
- Following administration of the paper SPOTs, the evaluation forms shall be sealed in an envelope and returned immediately to the departmental office. Department chairpersons will keep these secure and will forward them to Computing Services for processing. No analysis or interpretation is to be made by anyone prior to processing of the SPOT forms by Computing Services.
B. Analysis and reporting
- Academic departments/units shall deliver the administered forms, with blank forms removed, to Computing Services by the last working day of final exams (within one week of final exams for summer sessions) for analysis. The analysis will not be done until after all grades have been submitted to the registrar.
- Three copies of a course section summary (for each instructor, if team-taught) shall be prepared: one for the instructor, one for the department chairperson, and one for the instructor's dean. This summary shall contain, for each item Q1 through Q16 and for any optional supplemental items, the percentage of responses in each response category. For Q16 the summary shall contain the response mean, the individual's response standard deviation, the individual's minimum and maximum responses, the number of students enrolled in the section, the number of students responding, the departmental response mean, and the departmental response standard deviation. In addition, one copy of the response frequencies of all SPOT items, including the demographic information, shall be provided.
- Computing Services shall also provide to each instructor and his or her department
chairperson and dean a Question 16 Section Summary for each section
evaluated by SPOT. That summary shall contain:
- course and section number, instructor's name, and semester (or summer session);
- the section mean on Question 16;
- a histogram of the responses to Question 16 by students in that section.
- Every personnel action recommendation for reappointment, promotion, tenure (RTP), or post-tenure review should contain a summary, in a standard format, of the individual's SPOT results for Q16 (at least) over the most recent two-and-one-half years, together with a visual representation of trends. (An accumulation of Question 16 Section Summaries over that period would accomplish this.) All RTP recommendations shall include a qualitative interpretation of SPOT results by the department chairperson and may include, at the individual's discretion, the individual's own qualitative interpretation. All statistical calculations and quantitative analyses performed by anyone other than Computing Services (which is discouraged) must be clearly identified as such.
C. Guidelines for appropriate use of SPOT results
- Data from individual faculty gathered through the use of SPOT shall be treated with confidentiality and with recognition of the need for continued study of the meaning and validity of these data. These data shall not be released by anyone other than the faculty member to anyone who is not directly involved with evaluation for the purpose of reappointment, promotion, tenure, post-tenure review, or annual departmental review, or to anyone who is not directly involved with the development of norms, without the written permission of the faculty member. Each department shall use a release form that will enable instructors to designate other individuals or groups to have access to evaluation information. In addition, quantitative data shall not be released from the department, or comparable administrative unit, without an accompanying written interpretation of the data by the appropriate evaluating officer and, if he/she chooses, by the faculty member. The evaluating officer's interpretation shall explain how an instructor's scores compare with peers in the same department, discipline, or course assignment, as appropriate. Because numerous studies have indicated that both peer and student evaluations are necessary for a valid assessment of teaching effectiveness, it is strongly suggested that peer and student evaluations be given similar emphasis in personnel recommendations.
- The Evaluation Committee of
the Faculty Senate is charged with regularly
reviewing both student and peer evaluation procedures, and with reporting
and making recommendations for improvement to the Senate.
- Instructors shall be given no access to individual response forms
prior to submission of grades and completion of processing by Computing Services.
- In the case of a formal appeal of a reappointment, promotion, tenure, or post-tenure-review recommendation, all parties involved directly in the appeal shall be allowed access to the archived data pertinent to that case.
- Individual SPOT results, when combined with qualitative interpretation
by the department chairperson and with peer evaluations of teaching,
can contribute to measuring an individual's teaching effectiveness
and to identification
of areas of strength and areas where improvement is possible. Under
those conditions, SPOT results are appropriately used for annual merit
summaries, consideration for salary raises, RTP, and post-tenure-review decisions.
D. Warnings against inappropriate use of SPOT results
- Standard deviations reported by section (or, respectively, by department) for each item measure the extent to which student responses are "scattered" within that section (or department). They do not measure the manner in which instructor means are distributed, and hence should not be used to determine what percentile an instructor's mean score represents (or even how good or how bad a mean score is).
- Means for the sixteen SPOT items must not be "averaged" to
produce a "combined SPOT score."
- Mean scores for two or more courses must not be averaged to obtain an "overall SPOT score" for an individual.
- Averaging SPOT scores from several different courses across several semesters to obtain an "overall individual SPOT score" is likewise inappropriate.
- Direct comparisons of ratings from the version of SPOT used from fall 1992 through summer 2004 to ratings from the revised version implemented in fall 2004 are not appropriate.
E. Guidelines for interpretation of SPOT results
Guidelines for SPOT ratings collected 1992-2004:
- There is strong evidence that the SPOT questions as a whole give
a valid measure of characteristics of effective teaching, and that
the results are reliable. Moreover, there are ample reasons to support
the Question 16 section mean as the best single measure of student
perception of teaching.
- SPOT scores should, whenever possible, be viewed in the context of the immediately preceding five semesters. Comparisons should be general and should not ascribe meaning to the precision with which means are reported. (For example, a mean of 4.22 on Question 16 for a certain course might properly be described as lying in the second highest quintile of UNCW Question 16 scores, but should not be viewed as meaningfully different from a slightly higher or lower mean in that range.)
- The receipt of a Q16 section mean in the lowest quintile is not necessarily an indication of poor teaching. Only 2.6% of student responses campus-wide to Question 16 are "poor," and if every student were to answer "average" to Question 16, the mean (3.00) would lie in the lowest quintile. However, receipt of Question 16 means in the lowest quintile over a period of several semesters may indicate an opportunity for improvement. Examination of other SPOT items, consultation with the department chairperson, and peer evaluations may reveal ways to improve student perception of teaching.
Guidelines for SPOT ratings collected with the revised version implemented in fall 2004:
Revised SPOT questions have been selected from reliability-tested instruments at UNCW and other institutions, and have been edited according to the best judgment and experience of UNCW faculty. In every case, revisions to the 1992-2004 SPOT instrument were made to improve the design of the survey as a whole, the survey questions themselves, and the quality of the information collected.