

How to implement this successfully

CEILS worked with a cross-campus committee at UCLA to create a document outlining ideas for adjusting your assessments to make them more inclusive, effective, and authentic; it includes key lessons learned from remote teaching.

Click here to view the UCLA document: “Ideas and Recommendations for Alternative + Remote Assessments”

It begins with a table summarizing the alternatives to high-stakes exams, including key considerations to make assessments more equitable and effective; almost all of these can be done remotely. 

 

More effective and equitable assessments

  • Integrate active learning techniques (click here) during lecture, discussion and lab.
  • Have drafts due throughout the quarter.
  • Have students grade one another’s work using structured, rubric-based protocols.
    • CCLE has a “Workshop” assignment type, in which each student submission can automatically be assigned to a few other students to grade against a rubric. Graders earn more points for assigning scores similar to those the other graders gave the same work (i.e., there is a reward for grading carefully; see the sketch after this list for one way such a reward could be computed).
    • Use Calibrated Peer Review to help students learn not only how to grade one another, but what good examples look like.
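
To make the Workshop reward concrete, here is a minimal Python sketch of one way an agreement-based reward could be computed. The formula is an illustrative assumption, not CCLE’s actual algorithm: a grader earns full credit for matching the mean of the other graders’ scores, and less as their score diverges.

```python
from statistics import mean

def grading_rewards(peer_scores: dict[str, float], max_points: float = 10.0) -> dict[str, float]:
    """Reward each peer grader for agreeing with the other graders of the
    same submission. `peer_scores` maps grader name -> the score that
    grader assigned to one submission.

    Illustrative formula only (not CCLE's): the reward falls off linearly
    with distance from the mean of the *other* graders' scores.
    """
    rewards = {}
    for grader, score in peer_scores.items():
        others = [s for g, s in peer_scores.items() if g != grader]
        if not others:
            rewards[grader] = max_points  # nothing to compare against
            continue
        deviation = abs(score - mean(others))
        # Full credit at zero deviation, none at max_points or more apart.
        rewards[grader] = max(0.0, max_points * (1 - deviation / max_points))
    return rewards

# Three students graded the same essay out of 10. Ana and Ben agree
# closely and earn higher rewards than Caro, whose outlier grade is
# penalized for diverging from the consensus.
print(grading_rewards({"Ana": 8.0, "Ben": 7.5, "Caro": 3.0}))
```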

Grading on a Curve as a Systemic Issue of Equity in Chemistry Education (2022) – Challenges multiple popular arguments for curving.

UCLA Report to the EVC: Enhancing Student Success and Building Inclusive Classrooms at UCLA: Report to the Executive Vice Chancellor and Provost (2015) (see pages 29-34) – Discusses how, at UCLA, norm-referenced grading (“curving”) is associated with greater disparities between underrepresented, first-generation, and Pell Grant students and their majority peers.

UCLA Engineering: In October 2018, the Associate Dean of UCLA’s School of Engineering issued a memorandum with guidance on assessment and grading. In particular, pages 5-7 provide reasoning for using “criterion-referenced” grading (assigning grades based on mastery rather than curving), guidance on course design, and why and how to clearly state the grading policy in the syllabus.
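
To make the criterion- vs. norm-referenced distinction concrete, here is a minimal Python sketch contrasting the two approaches. The letter cutoffs and z-score bands are illustrative assumptions, not UCLA policy.

```python
from statistics import mean, pstdev

def criterion_referenced(score: float) -> str:
    """Grade against fixed, published mastery cutoffs (illustrative values)."""
    for cutoff, letter in [(90, "A"), (80, "B"), (70, "C"), (60, "D")]:
        if score >= cutoff:
            return letter
    return "F"

def norm_referenced(score: float, all_scores: list[float]) -> str:
    """Grade "on a curve": the same score maps to different letters
    depending on how classmates performed (illustrative z-score bands)."""
    z = (score - mean(all_scores)) / pstdev(all_scores)
    if z >= 1.0:
        return "A"
    if z >= 0.0:
        return "B"
    if z >= -1.0:
        return "C"
    return "D"

scores = [55, 62, 68, 71, 75, 83]
for s in scores:
    print(s, criterion_referenced(s), norm_referenced(s, scores))
# Under the curve, the letter each student earns depends on how
# classmates scored; under fixed cutoffs, every student who reaches
# a mastery threshold earns the corresponding grade, with no forced
# competition for a limited number of A's.
```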

NY Times Op-Ed (2015): Why We Should Stop Grading on a Curve

Driving Up or Dialing Down Competition in Introductory STEM Courses: Individual and Classroom Level Factors (2014): Research finding that norm-referenced grading is associated with higher student perceptions of competition; faculty can “dial down” competitiveness by structuring collaboration into courses.

Schinske and Tanner (2014): Teaching More by Grading Less (or Differently)

VALUE Rubrics: These AAC&U rubrics include detailed language for describing and assessing skills such as critical thinking, problem solving, and quantitative literacy. Adapting such rubrics can help you build better assignments while being clearer about the skills you want your students to develop and demonstrate.

 

 

Fink, L. Dee, “A Self-Directed Guide to Designing Courses for Significant Learning”: This workbook-style guide will walk you through the different elements of backward course design, including designing for assessment of student learning.

Use “exam wrappers” to enhance study skills

“All too often when students receive back a graded exam, they focus on a single feature – the score they earned. Although this focus on ‘the grade’ is understandable, it can lead students to miss out on several learning opportunities that such an assessment can provide.” (Ambrose et al., How Learning Works, 2010)

Exam wrappers are post-exam assignments that help students metacognitively reflect on what they can learn from a graded exam, including the following sets of skills (https://www.cmu.edu/teaching/designteach/teach/examwrappers/):

  1. identify their own individual areas of strength and weakness to guide further study;
  2. reflect on the adequacy of their preparation time and the appropriateness of their study strategies; and
  3. characterize the nature of their errors to find any recurring patterns that could be addressed.
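
If wrappers are collected electronically, a few lines of Python can aggregate the class’s self-reported error types to reveal recurring patterns (item 3 above). This is a minimal sketch; the category labels are made-up examples, not part of the CMU materials.

```python
from collections import Counter

# Each student's wrapper asks them to tag the errors they made on the
# exam; the category labels below are made-up examples.
wrapper_responses = [
    ["careless arithmetic", "misread the question"],
    ["concept gap: equilibria", "careless arithmetic"],
    ["ran out of time"],
    ["concept gap: equilibria", "misread the question"],
]

# Tally error types across the whole class; the most common patterns
# suggest what to revisit in lecture before the next exam.
error_counts = Counter(tag for response in wrapper_responses for tag in response)
for tag, count in error_counts.most_common():
    print(f"{count}x {tag}")
```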

Here are sample exam wrappers provided by Carnegie Mellon’s Eberly Center:

  • The Physics exam wrapper was completed by students during recitation when the first exam was turned back to students. Papers were collected at the end of class, reviewed by the instructor, and then returned to students just before the next exam, as a reminder of their self-discoveries from a few weeks prior.
  • The Biology exam wrapper was given to students during lecture on the day when the first exam was turned back.

Click here to access an Exam Wrapper Google Doc that you can copy and edit for your class. (Be sure to check out the other examples above to see if any components are more relevant to your course.)

Using rubrics to be more transparent and equitable

There are online grading platforms that allow instructors and TAs to score assignments with open-response questions, such as exams and homework problems. UCLA now has a licensing agreement with Gradescope (www.gradescope.com) for their online grading tool, which can be incorporated into CCLE course sites.


Why use Gradescope?

Benefits to instructors and TAs:

  • Rubric-based scoring system allows for more consistent and fair grading
  • Choice of using a positive (points added for correct answers) or negative (points deducted for wrong answers) rubric.
    • Positive scoring rubrics reward students for their achievements and set a positive tone in the classroom – one focused on what students are doing right rather than on what they are doing wrong (see the sketch after this list).
  • Streamlines the grading process, achieving greater grading efficiencies in large-enrollment courses
    • TAs can replicate the same feedback to multiple students without having to re-write the same comments over and over again (a very time-intensive effort)
    • Changes to a rubric item propagate to every submission graded with it, so TAs save time because they do not have to regrade questions by hand
  • Electronic grading provides flexibility because instructors and TAs can grade assignments from anywhere (home, a café, their lab, their office)
  • Academic integrity issues are mitigated by a repository of electronic, scanned copies of every student’s exam or assignment
  • Tracks and reports analytics, giving instructors and TAs substantially more feedback on student mastery of concepts and skills
  • Great online documentation with instructions (including video tutorials), suggestions and short-cuts
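
As a minimal illustration of the positive/negative rubric choice above, the Python sketch below shows how the two styles can award identical totals while framing the feedback differently. The rubric items are hypothetical examples, not Gradescope’s API.

```python
def positive_score(earned_items: list[tuple[str, float]]) -> float:
    """Positive rubric: start at zero and add points for each thing done right."""
    return sum(points for _, points in earned_items)

def negative_score(max_points: float, deductions: list[tuple[str, float]]) -> float:
    """Negative rubric: start at full credit and subtract points for each error."""
    return max_points - sum(points for _, points in deductions)

# A hypothetical 10-point problem. Both styles yield 7/10, but the
# positive rubric's feedback names what the student did right.
print(positive_score([("correct setup", 4.0), ("valid method", 3.0)]))      # 7.0
print(negative_score(10.0, [("sign error", 2.0), ("missing units", 1.0)]))  # 7.0
```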

Benefits to students:

  • Transparency in the points breakdown (partial credit) for the score a student receives on a question, because the points assigned are linked to comments explaining what was done correctly or incorrectly
  • Students receive more feedback from TAs grading their assignments
  • The typed, rubric-linked feedback a student receives is easier to read and often more detailed than handwritten comments from a TA
  • Electronic return of graded assignments permits rapid feedback that students can use to gauge learning gains and progress toward mastery
  • Integration with CCLE allows secure sign-in with UCLA BOL account

There’s an app for that!

Here is an app you can share with your students to scan their homework into a PDF: https://acrobat.adobe.com/us/en/mobile/scanner-app.html


How to add Gradescope to your CCLE course site

Since UCLA now has a licensing agreement with Gradescope, instructors may incorporate this grading tool into their CCLE course sites as follows:

  1. Sign in to CCLE, and turn editing on.
  2. Navigate to the section of your CCLE site where you want the link for Gradescope to appear, and click Add an activity or resource. Select External tool, and click the Add button.
  3. In the box for Activity name, enter a name, such as “Gradescope”.
    From the drop-down menu for Preconfigured tool, select Gradescope.
    Scroll down and click the button Save and return to course.
  4. Click the link that has been created. This will open Gradescope in a new browser tab, and Gradescope will ask if you want to link this course with a new or existing course in Gradescope. Assuming you have not already created a Gradescope course for this class, select A new Gradescope course, and click Link course.
  5. After editing settings in Gradescope as desired, select Roster from the left panel. Click the Sync Moodle Roster button, and then Sync Roster. Go through the resulting roster in Gradescope and make sure your TA(s) have been assigned the “TA role”, rather than being listed as “Student”. (This is a known bug in the CCLE-Gradescope link.)

Tips for Using Gradescope at UCLA

For additional tips on using Gradescope, please click here to see the FAQ document created by CEILS Instructional Consultants Will Conley (Math) and Josh Samani (Physics). Both are available to answer questions and provide best-practice advice about Gradescope. Please email media@ceils.ucla.edu if you need assistance, and we’ll get the message to Will and Josh.

“Two-stage” exams to make an exam a learning experience 

Students first take an individual exam, which is often worth the majority of their exam grade; afterward, they work on a second stage (either selected questions from the original exam or related, deeper questions), typically in small groups.
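
Weightings vary by instructor; a common arrangement makes the individual stage worth most of the grade, sometimes with a guarantee that the group stage can only raise a student’s score. Here is a minimal Python sketch, assuming an 85/15 split (an illustrative choice, not a fixed rule):

```python
def two_stage_score(individual: float, group: float,
                    individual_weight: float = 0.85) -> float:
    """Combine the two stages of a two-stage exam.

    The 85/15 split is an assumed example. The max() implements an
    optional safeguard: the group stage can never lower the score a
    student earned individually.
    """
    combined = individual_weight * individual + (1 - individual_weight) * group
    return max(individual, combined)

print(two_stage_score(individual=72.0, group=95.0))  # 75.45 -> group stage helped
print(two_stage_score(individual=90.0, group=60.0))  # 90.0  -> no penalty
```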

For a complete description, see this 3-page summary provided by the American Association of Physics Teachers: https://www.physport.org/recommendations/files/Group%20Exams.pdf

This site provides some of the evidence supporting the benefits of two-stage exams (excerpt below): https://learning.northeastern.edu/two-stage-exams/

“Two-stage exams and other collaborative testing models have been tested for immediate learning (individual responses to the same questions three days after the exam; Gilley & Clarkston, 2014), retention at one month (individual scores on repeat questions on the subsequent exam; Cortright et al., 2003; Vogler & Robinson, 2016), and retention two months later (individual scores on repeat questions on the final exam; Vogler & Robinson, 2016; Knerim et al., 2015; Zipp, 2007), with most studies showing positive outcomes at all stages.”

Assessment Repositories

A repository of reliable, validated assessment instruments and references for existing evaluation tools used to measure changes in student learning or student attitudes.

The 2010 User-Friendly Handbook for Project Evaluation published by NSF

Online Evaluation Resource Library – (OERL)

Concept Inventories (FC, Bioliteracy Project, Bibliography, Q4B)
Please inquire with CEILS Director about specific instruments available to UCLA faculty.

List of Concept Inventories

Other Biology Concept Inventories

Validated assessment tools for Physics, Astronomy, and Mathematics
This is a collection of resources that gather together information about published diagnostic tests or instruments that probe conceptual understanding, in various topic areas relating to (but not restricted to) the physical sciences.

Field-tested Learning Assessment Guide – (FLAG)
The FLAG offers broadly applicable, self-contained modular classroom assessment techniques (CATs) and discipline-specific tools for STEM instructors interested in new approaches to evaluating student learning, attitudes and performance.

Student Assessment of their Learning Gains – (SALG)
SALG is a free, customizable online survey instrument that asks students to self-assess the gains they made in a course.

Development of Biological Experimental Design Concept Inventory – (BEDCI)
This CI can be used to diagnose specific examples of non-expert-like thinking in students and to evaluate the success of teaching strategies that target conceptual changes.

The Experimental Design Ability Test – (EDAT)
The EDAT measures students’ understanding of what constitutes a good experimental design.

The Laboratory Course Assessment Survey (LCAS)
The LCAS was developed for course-based undergraduate research experiences (CUREs) to measure students’ perceptions of three design features of biology lab courses.

Development and Validation of a Rubric for Diagnosing Students’ Experimental Design Knowledge and Difficulties – (REDI)
This rubric assesses acquired knowledge about the processes of science and helps diagnose the key difficulties in student understanding.

Results Analysis Concept Inventory
The concept inventory was created to assess students’ comprehension of data analysis. These skills are necessary for making informed decisions and drawing reliable conclusions, with applications beyond science to everyday life.

Developing a Test of Scientific Literacy Skills (TOSLS)
TOSLS is designed to assess the scientific literacy skills that are necessary in a STEM field.

SPARST (multiple choice)
While it is still in development, this online, multiple-choice test is designed to assess reasoning skills, data analysis, and science communication.

Lawson Classroom Test of Scientific Reasoning (CTSR)
The Lawson CTSR is designed to measure concrete and formal operational reasoning, focusing on proportional thinking, probabilistic thinking, correlational thinking, and hypothetico-deductive reasoning. It is intended for students ranging from middle school to introductory college.

Critical Thinking Assessment Test – (CAT)
The CAT Instrument is a unique tool designed to assess and promote the improvement of critical thinking and real-world problem solving skills.

California Critical Thinking Skills Test – (CCTST)
This test has been used by various studies to assess gains in critical thinking. Here is the Google Scholar list of publications related to the creation and use of the test.

In-Class Concept Questions (CLASS)
This report examines the role of conceptual questions, which students answer using personal response systems or “clickers,” in promoting student learning.

Course-based Undergraduate Research Experiences (CURE)
Initiated in 2013, CUREnet aims to address topics, problems, and opportunities inherent to integrating research experiences into undergraduate courses.

Science Motivation Questionnaire
Designed to give education researchers and science instructors insight into the factors that affect students’ determination to complete their science degrees.

Survey of Undergraduate Research Experiences (SURE)
SURE examines the hypothesis that undergraduate research enhances the educational experience of science undergraduates and helps retain talented students in science careers.

Undergraduate Research Student Self-Assessment
The URSSA focuses on assessing student learning from undergraduate research, rather than whether they like it. The assessment looks at research outcomes such as evidence-based thinking, skills such as lab work and communication, conceptual knowledge, linkages among ideas in their field and with other fields, and preparation and clarity towards a career or educational path after graduating.

Perceived Cohesion (Bollen and Hoyle)
This theoretical definition of perceived cohesion says individuals’ perceptions of their own cohesion to a group has two dimensions: sense of belonging and feelings of morale. The study proposes that group members’ perceptions of cohesion are important for the behavior of the individual as well as the group as a whole.

Sense of Community Index (Chipuer)
This paper outlines the psychological sense of community (PSC) in the neighborhood for adults and adolescents, and workplace PSC for adults, using true/false and three-point response formats.

Views of Nature of Science Questionnaire (VNOS)
The VNOS couples survey responses with individualized interviews to understand student perceptions about research and science education.

Views about Science Survey
This survey aims to capture the preconceptions students bring to science concepts before classroom instruction, by probing their views of the epistemology of science and its social context.

The Project Ownership Survey: Measuring Differences in Scientific Inquiry Experiences (POS) (Hanauer et al.)
Project ownership is one of the psycho-social factors involved in student retention in the sciences. This instrument is designed to measure project ownership.

Self-authorship (Creamer, Baxter Magolda)
This instrument measures factors that influence individuals’ construction of knowledge, identities, and relationships.

Grit Scale (Duckworth)
The Grit Scale measures trait-level perseverance and passion for long-term goals. Among adolescents, the Grit–S longitudinally predicted GPA and, inversely, hours watching television.

The Role of Efficacy and Identity in Science Career Commitment Among Underrepresented Minority Students (Chemers)
This instrument examines the roles of efficacy and identity in students’ commitment to pursuing careers in science.

Using the Multiple Lenses of Identity: Working with Ethnic and Sexual Minority College Students (Estrada)
This instrument investigates contexts that influence ethnic and sexual minority self-concept.

There are many attitudinal surveys that can be used to assess students’ shifts in attitude toward a given discipline. One example is the Mathematics Attitudes and Perceptions Survey.