Classroom Assessment Techniques
Attitude Surveys


Eileen Lewis
Department of Chemistry
Cañada College

Elaine Seymour
Bureau of Sociological Research
University of Colorado, Boulder

Eileen Lewis

Early in my teaching career, I noticed that students could memorize equations, solve problems, and even use terms fairly correctly. However, further questioning revealed that students' knowledge was pretty shallow...I changed the assessments of students' understanding - emphasizing making sense of phenomena, connecting ideas, and giving explanations...Students knew they had to understand the concepts, not just be able to parrot them back...

This type of survey provides valuable information on student perceptions of their classroom experience. This includes general attitudes toward the course, the discipline, and their own learning. The results from this survey can also help you identify elements in your course which best support student learning.

While attitudinal surveys may take many forms and address a range of issues, they typically consist of a series of statements with which students are asked to express their agreement or disagreement on a rating scale.


Instructor Preparation Time: Very little time is needed to use a valid, existing survey. Large amounts of time are required to develop a survey that is reliable and measures what is intended.
Preparing Your Students: No training is required, but a description of the survey's goals and scales should be read to students as well as included in the survey form itself.
Class Time: Varies with length, but rarely more than 20 minutes.
Disciplines: Appropriate for all.
Class Size: Appropriate for all.
Special Classroom/Technical Requirements: None, although an optical scanning device may be useful to read and analyze data in large classes.
Individual or Group Involvement: Typically individual.
Analyzing Results: Depends very much on class size and length of the survey. In large classes, the use of scanning forms and optical readers makes the task easier.
Other Things to Consider: To ensure meaningful results, student responses must be guaranteed anonymity. These surveys can be given before and after a course to measure gains, or mid-course to guide corrections to classroom teaching methods. Demographic data may be included in the survey so that responses can be correlated with gender, major, or ethnicity.

An attitudinal survey (also known as an affective survey) can provide information on student perceptions (emotions, feelings, attitudes) of their classroom experience. For example, it can reveal perceptions of:

  • the content of a course
  • specific components of a course
  • course components which aid or are detrimental to learning
  • the effects of course innovations
Attitudinal surveys may also focus on students' needs in taking a course and how well those needs are met; on student interest in or appreciation for the subject matter or field; on student confidence in their ability to perform in a course; or on their beliefs about the nature of the discipline itself, e.g.:
  • the nature of a discipline (chemistry, physics, mathematics, engineering)
  • the nature of learning within a discipline
  • their ability to learn within a course
  • useful strategies for learning within a course or discipline
  • their own learning style or preferences for learning

Figure 1: Sample statements from an attitudinal survey on students' learning:
  Please use the 7-point scale to indicate your agreement or disagreement with each statement.
Record all responses on your Scantron form.


Scale: 1 = Strongly Disagree (SD); 2 = Disagree (D); 3 = Neutral (N); 4 = Agree (A); 5 = Strongly Agree (SA); 6 = Not Applicable (NA); 7 = Don't Know

 8. Often in lab I didn't understand the concept behind the lab experiment.
 9. I like labs where I get to help design an experiment to answer a question.
10. This course provided opportunities for me to help design experiments to answer a question.
11. It was clear how the lab experiments fit into this course.
12. Doing labs in this class was like following a recipe in a cookbook.
13. The lab manual for this course was well-written (easy to understand).

Assuming that all the following activities are equally well-implemented, I learn well by ...
33. doing homework assignments.
34. using diagrams and other visual media.
35. using computer-based materials.
36. reading a (good) textbook.
37. working with my lab partner.
38. getting good help / tutorial aid.
39. doing hands-on activities.
40. listening to lecture.
45. completing lab notebooks or lab reports.
46. reading and re-reading materials.

I know I understand when ...
49. I can work problems in the book.
50. I can apply ideas to new situations.
51. I get a good grade on an exam.
52. I can explain the ideas to someone else.
53. I can see how concepts relate to one another.

Assessment Purposes
Depending on the questions asked, an attitudinal survey can give instructors information about students' learning styles or preferences for ways of learning. This allows instructors to choose among instructional approaches that best meet students' needs. Instructors can also discover which components of their course contribute most significantly to students' learning.

General information on students' beliefs about the nature of science/mathematics/engineering is helpful in designing activities to foster a more realistic view of a discipline and of what members of that discipline do. For example, students might be asked to express their agreement with the statement, "Science, as it is practiced in the real world, is objective and unbiased."

An added benefit of this type of survey is that students are prompted to reflect on their own learning preferences, strengths, or styles. This often helps students become better managers of their own learning and encourages them to engage in more fruitful activities.

While the questions or statements on an attitudinal survey may seem obvious, they are in fact the result of considerable work to ensure that each item measures what it is intended to measure (validity) and yields consistent results across students and groups (reliability). For these reasons, the outcomes of surveys written without checks on validity and reliability are often meaningless.

Additionally, for best results, students must be guaranteed anonymity. This means that if the instructor analyzes the data, no student identification should be requested. You may ask for demographic information (gender, ethnicity, major, etc.) and look for correlations across those variables. If you want to correlate student responses with their performance, you must have someone else gather and analyze the data, and explicitly let the students know you are doing so. Data analysis can be very time-consuming in large classes unless you have optical scanning response forms and an optical reader (e.g. Scantron® forms and a scanner). With these resources, data can be scanned and imported directly into a statistical analysis or spreadsheet program. For small classes, you may provide additional space for students to elaborate on their ideas.

Teaching Goals

Suggestions for Use
Attitudinal surveys can be used at the beginning of a course to identify the ways students perceive they learn best or to determine their attitudes toward the course or discipline; this information can then be used to adapt instructional strategies. Another use is a comparison between attitudes at the beginning of the semester (pretest) and at the end of the semester (posttest), which lets the instructor discover the impact of the course on student perceptions of the topics mentioned above. Surveys can also be given at any point during the semester to guide corrections to existing curricula or methods in a course.

The surveys can be taken in any location and require little time. A very long survey with eighty or so items usually requires no more than 20 minutes. Gender, ethnicity, major, year in school, and previous coursework can all be included in a survey.

Another feature that makes data collection and analysis easier is an optical scanning form and scanner (e.g. Scantron), which allow the data to be imported directly into a spreadsheet or statistical application for analysis.
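As a concrete sketch of that import step (the file format, column names, and responses below are invented for illustration), per-item responses from a scanner's CSV export can be pulled apart with only the Python standard library:

```python
import csv
import io

# Hypothetical scanner export: one row per student, one column per
# survey item, responses coded 1-7 as in Figure 1.
scanned = io.StringIO("""student,q8,q9,q10
A001,2,5,4
A002,1,4,4
A003,3,5,5
""")

items = {}  # item name -> list of integer responses
for row in csv.DictReader(scanned):
    for field, value in row.items():
        if field != "student":
            items.setdefault(field, []).append(int(value))

# Per-item means, ready for a spreadsheet or statistics package.
for name, values in sorted(items.items()):
    print(f"{name}: mean {sum(values) / len(values):.2f}")
```

In practice you would read the real export with `open(...)` rather than the in-memory sample shown here.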

Step-by-Step Instructions


  • If students circled a response, then you will need to tally the number of students who chose each of the responses represented. Then tally the responses to related questions and create a summary of responses.

    For example, the following four statements can be clustered into two categories.

    1. Science, as it is practiced in the real world, is objective and unbiased.
    2. Chemists work to uncover universal laws that already exist in nature.
    3. Chemists construct theories that explain what they observe in nature.
    4. It is important to be skeptical about the results of scientific experiments.
    The first two statements reflect an idealized view of science as truth, whereas the last two represent a more realistic view of the nature of science. Students' responses to statements (1) and (2) versus (3) and (4) therefore provide a measure of the sophistication of their views of the nature of science.

  • If you are using a form that can be scanned, the resulting data can be delivered as an electronic file and loaded into a data analysis program, which lets you analyze the results of individual statements and aggregate results across similar statements, as above. Simple frequency distributions and the percentage of responses in each category are a good place to begin viewing your students' perceptions. For example, if most of your students agreed with the statement "Doing labs in this class was like following a recipe in a cookbook," you might want to change your laboratory activities.
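The tallying and clustering described above can be sketched in a few lines (the responses and cluster labels below are invented for illustration; only the 1-7 coding follows Figure 1):

```python
from collections import Counter

# Hypothetical circled responses, keyed by statement number.
# Coding as in Figure 1: 1=SD ... 5=SA, 6=NA, 7=Don't Know.
responses = {
    1: [1, 2, 2, 1, 3, 2, 4],  # "Science ... is objective and unbiased."
    2: [2, 1, 2, 3, 2, 2, 1],  # "Chemists work to uncover universal laws ..."
    3: [4, 5, 4, 4, 5, 4, 5],  # "Chemists construct theories ..."
    4: [5, 4, 4, 5, 5, 4, 4],  # "... skeptical about the results ..."
}

def frequency_table(values):
    """Percentage of responses falling in each scale category 1-7."""
    counts = Counter(values)
    return {cat: 100 * counts.get(cat, 0) / len(values) for cat in range(1, 8)}

# Per-statement summary.
for stmt, values in responses.items():
    print(stmt, frequency_table(values))

# Aggregate related statements into the two clusters described above.
clusters = {"idealized view": [1, 2], "realistic view": [3, 4]}
for name, stmts in clusters.items():
    pooled = [v for s in stmts for v in responses[s]]
    agree = sum(1 for v in pooled if v in (4, 5))  # Agree or Strongly Agree
    print(f"{name}: {100 * agree / len(pooled):.0f}% agree")
```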

If you use the surveys as pre/post measurements, you may find a statistical program useful for detecting statistically significant changes in your students' views or attitudes. This type of pre/post survey is particularly useful if you have incorporated curricular changes into your course. A simple comparison of the student means for each statement will let you see trends in your students' attitudes toward your course and its components. For example, assume that the mean for statement 11 in Figure 1, "It was clear how the lab experiments fit into this course," was low (less than or equal to 2, where 1 = strongly disagree and 2 = disagree). This would indicate that your students do not understand how the lab fits into your course. It's always a good idea to make friends with someone in science or mathematics education; a conversation with them about your survey data can help with the analysis.
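A minimal sketch of that pre/post comparison, using only the Python standard library (the response data are invented, and reading |t| above roughly 2 as "a real shift" is only a rule of thumb; a real analysis would use a statistical package and proper p-values):

```python
from statistics import mean, stdev

# Hypothetical pre- and post-course responses to statement 11,
# "It was clear how the lab experiments fit into this course,"
# coded 1 = strongly disagree ... 5 = strongly agree
# (NA / Don't Know responses already removed), one pair per student.
pre = [2, 1, 2, 2, 3, 2, 1, 2]
post = [4, 3, 4, 5, 4, 3, 4, 4]

def paired_t(before, after):
    """t statistic for paired samples: mean difference over its standard error."""
    diffs = [a - b for b, a in zip(before, after)]
    return mean(diffs) / (stdev(diffs) / len(diffs) ** 0.5)

print(f"pre mean:  {mean(pre):.2f}")   # a mean <= 2 flags a problem area
print(f"post mean: {mean(post):.2f}")
print(f"paired t:  {paired_t(pre, post):.2f}")  # |t| well above ~2 suggests a real shift
```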

Commonly used statistical packages:

Pros and Cons

  • Students are accustomed to taking surveys and using multiple choice responses so the experience is a familiar and comfortable one. Even very quiet and reticent students are usually comfortable expressing their ideas in this format and are generally pleased that the instructor is interested.
  • Instructors can quickly gain information about students' learning styles, attitudes toward a course or field, and self-assessment of skills and knowledge they bring to a course. This often results in making instruction more focused and effective.
  • Survey findings can be expressed in easily understood percentages or means.
  • If an attitudinal survey is done early in a course, students may become involved in how the course is taught. This generally improves class morale, encourages students to be more actively involved in the class, and enhances communication.
  • Sharing survey results with a class helps the students see the diversity of opinions and styles within their class and makes them more accepting of a variety of valid approaches to the same content.
  • Often when students see a variety of approaches to learning, they are encouraged to experiment with those different approaches.
  • The act of completing the survey can promote reflection and increase students' self-awareness of their learning styles and attitudes.
  • Instructors may have an unpleasant surprise in discovering that their students' views are quite different, perhaps in direct opposition, to their own views or those of experts in the field.
  • Some students may fear that their responses will not be anonymous and therefore be less candid than they might wish to be.
  • The discovery of how students really feel about a given course can be depressing to instructors.
  • The preferences and needs expressed by students may not fit well with the instructor's plans for a course. In this case, the instructor must either ignore students' expressed preferences or change components of the course.
  • Surveys usually provide a broader range of data, but less detail, than qualitative methods such as individual interviews or focus groups.

Theory and Research
Research has found that effective teachers share several characteristics (Angelo & Cross, 1993; Davis, 1993; Reynolds, 1992; Murray, 1991; Shulman, 1990). Two of these characteristics are particularly relevant:

  • Effective teachers use frequent assessment and feedback to regularly evaluate what they do in the classroom and whether their students are really learning.
  • Effective teachers try to anticipate the concepts that will be difficult for their students and to develop teaching strategies that present these concepts in ways that make them more accessible to students. This requires becoming familiar with students' preparation, knowledge, and abilities as well as adjusting teaching strategies to maximize the class's learning.
There is substantial research concluding that surveys administered to students can be both valid and reliable, providing a wealth of knowledge about students' attitudes, behavior, and values (Hinton, 1993). The attitudinal survey discussed here provides one method for obtaining valuable information about classroom components, teaching strategies, usefulness of instructional materials, organization, pacing, and workload. This information can then be used to engage in practices that improve teaching effectiveness.

Additional information on how to design, administer, and interpret your own surveys can be found in Theall & Franklin (1990), Davis (1993), and Braskamp and Ory (1994).


  • Maryland Physics Expectation (MPEX) survey
    web URL: http://www.physics.umd.edu/rgroups/

  • Dunn Learning Styles - The instrument assesses five different stimuli: environmental, emotional, sociological, physical, and psychological. Within each of these stimuli are a total of about 20 elements which individuals receive feedback on. A number of faculty have reported success with this instrument and the information it provides about students' learning styles.

Angelo, T. A., and Cross, K. P. (1993). Classroom assessment techniques: A handbook for college teachers, 2nd ed. San Francisco: Jossey-Bass.

Braskamp, L. and Ory, J. (1994). Assessing faculty work: Enhancing individual and institutional performance. San Francisco: Jossey-Bass.

Centra, J. A. (1973). Effectiveness of student feedback in modifying college instruction. Journal of Educational Psychology, 65(3), 395-401.

Davis, B. G. (1993). Tools for teaching. San Francisco: Jossey Bass.

Fowler, F. J. (1993). Survey research methods. Newbury Park, CA: Sage.

Gamson, Z. and Chickering, A. (1977). Seven principles for good practice in undergraduate education. AAHE Bulletin, 39, 5-10.

Henderson, M. E., Morris, L. L., & Fitz-Gibbon, C. T. (1987). How to measure attitudes. Newbury Park, CA: Sage.

Murray, H. G. (1991). Effective teaching behaviors in the college classroom. In J. C. Smart (ed.), Higher education: Handbook of theory and research, Vol. 7 (pp. 135-172). New York: Agathon.

National Research Council (1997). Science teaching reconsidered: A handbook. Washington, D. C.: National Academy Press.

Reynolds, A. (1992). What is competent beginning teaching? A review of the literature. Rev. Educ. Res. 62, 1-35.

Shulman, L. S. (1990). Aristotle had it right: On knowledge and pedagogy (Occasional paper no.4). East Lansing, MI: The Holmes Group.

Shulman, L. S. (1991). Ways of seeing, ways of knowing - ways of teaching, ways of learning about teaching. Journal of Curriculum Studies, 23, (5) 393-395.

Theall, M. and J. Franklin, Eds. (1990). Student ratings of instruction: Issues for improving practice. New Directions for Teaching and Learning, No. 43. San Francisco: Jossey-Bass.

Eileen Lewis
Department of Chemistry
Cañada College

Eileen Lewis Early in my teaching career, I noticed that students could memorize equations, solve problems, and even use terms fairly correctly. However, further questioning revealed that students' knowledge was pretty shallow. So even though my "lectures" encouraged conceptual understanding, my exams were more traditional. Since students are pretty efficient about learning what they need to be successful in a course, I changed the assessments of students' understanding - emphasizing making sense of phenomena, connecting ideas, and giving explanations about why in everything they observed or did. Writing these questions was much harder, but it changed the type of learning that went on in the class. Students knew they had to understand the concepts, not just be able to parrot them back and solve algorithmic problems. These new assessments also served them in future courses because they really understood and had made connections between concepts.

Elaine Seymour
Bureau of Sociological Research
University of Colorado, Boulder

Elaine Seymour Elaine Seymour is the Director of Ethnography and Evaluation Research, Bureau of Sociological Research, University of Colorado, Boulder, a position she has held since 1989. She received a Ph.D. in Sociology from the University of Colorado, an M.A. in Education from the University of Glasgow, Scotland, and a B.A. with Honors in Economics and Political Science from Keele University, England. Her academic honors include Doctoral Fellowships from the National Institute of Mental Health and the University of Colorado, Teaching Excellence Awards, and a Fulbright Teaching Scholarship. Seymour's recent work in assessment includes the development of a prototype Field-Tested Learning Assessment Guide (FLAG) and the development of the Student Assessment of their Learning Gains (SALG) classroom evaluation instrument.
