
Center for Teaching Advancement and Assessment Research

116 College Avenue
Rutgers, The State University of New Jersey
New Brunswick, NJ 08901
http://ctaar.rutgers.edu/
Phone: (848) 932-7466
Fax: (732) 932-1845

Guidelines for developing departmental assessment plans and conducting assessments

A good departmental learning outcome assessment plan addresses three areas:

    The plan's goals

    The plan's methods

    The plan's implementation, or 'closing the loop' procedures

Designing an assessment plan

A learning outcome assessment plan should provide answers to the following three questions.

Q1. Learning goals and objectives: What should the students completing our program or major know, value and do?
Q2. Methods of Assessment:  How can we determine if our students know, value and do what we intend?
Q3. Implementation: How will we use the information we gather to change, add to, or restructure our program or major so that students know, value and do what we find important and appropriate?


Q1: Learning Goals and Objectives:  Learning goals are broad statements about educational outcomes. Objectives are markers on the road to the learning goals. For example:

Learning Goal:  Our students will apply basic theoretical concepts to real-world problems.

Objective: Our students will design, develop and complete a research project illustrating the application of basic theory to real world problems.

Learning goals and objectives have meaning only when stated in active language, specifying what students will know, value and do when they complete the course, program or major. They should be specific and clear.

Q2: Methods of Assessment: There are three basic approaches: direct, indirect and authentic assessment.

   Direct assessments involve a review of student work by a group of faculty. Examples of student work include papers and essays written for a class or a project, assignments completed for a class, and performances presented.

   Indirect assessments include surveys, focus groups, informal discussions and interviews. Surveys of students, alumni and other stakeholders can be very useful and informative.

    Authentic assessments involve reviews of the outcomes of student activities outside the classroom, typified by field work, internships and civic engagements. The hallmark of authentic assessment is that the student's 'voice', that is, their principles and standards as a practitioner, is heard throughout the work that is done.

Q3: Implementation, or 'Closing the Loop': An assessment plan must specify how the information gathered will be used. The results of completed assessments should lead to changes in the program or major, its curriculum and its courses. For example, a review of student work products by a group of faculty (a direct assessment), or the results of a survey of seniors approaching graduation (an indirect assessment), may lead to a reorganization of material within a course, a change in course sequencing, or a change in course offerings.


To 'close the loop' or implement change, schools and departments need an administrative structure that can make the change happen. For departments, the structure below is typical:

    Curriculum and Assessment Committee: Most departments and programs have a standing committee whose function is to oversee and coordinate the development of the department's curriculum. This faculty committee suggests changes in the curriculum and is the departmental clearing-house for assessments and research on the department's curriculum. Changes recommended by the committee are voted on by the faculty and, if approved, forwarded to the school's curriculum committee for eventual approval by the faculty of the school in which the department is located.

    Standing sub-committees: Many departments have standing sub-committees reporting to the main Curriculum and Assessment committee or the department chair or vice-chair.  They are established to assess the development of particular components of the department’s curriculum. For example, many departments have sub-committees that focus attention on the introductory course, on departmental honors courses or on capstone courses.

      Ad hoc sub-committees: When an issue of special concern arises, departments will often form an ad hoc committee of faculty to address it. These committees create and conduct assessments and report to the department head or the Curriculum and Assessment Committee.

Using Rubrics for Assessments

A small group of faculty can review a collection of student work and rate it according to a rubric based on the outcomes the department wants its students to achieve. Below is a simple assessment rubric. Each faculty member would rate the students' work as to whether it fails to meet, meets or exceeds expectations on a number of characteristics for the chosen learning goal.

Rubric to assess learning outcomes for a generic learning goal

Each criterion below is rated as 'Fails to meet expectations', 'Meets expectations' or 'Exceeds expectations':

    The work is well organized.
    The work uses appropriate analytic techniques.
    The work shows clear understanding of basic concepts.
    The application of basic concepts is sensible and reasonable.

More developed rubrics give more detail for “Fails, Meets or Exceeds” standards. For example, see this rubric for assessing critical thinking.

Good assessment rubrics are simple to understand, clearly stated, and easy to use. For example, in the rubric above, we can expand the 'analytic techniques' criterion as follows.

 

The work uses appropriate analytic techniques.

    Fails to meet expectations: Fails to use, or to use correctly, the statistical methods taught in required courses for the major.

    Meets expectations: Uses the statistical methods taught in courses required for the major appropriately and correctly.

    Exceeds expectations: Uses basic statistical methods with insight and vision; understands their limits and seeks useful and appropriate alternatives.

Rubrics are best developed collegially and collaboratively. They take time and thought to develop. The better the design and the more complete the structure, the easier they are to use, and the more useful the results from using them.
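Once several faculty have rated a set of student work against a rubric, the ratings need to be tallied by criterion before the department can interpret them. As a purely hypothetical illustration (the criteria, sample ratings and function names below are invented for this sketch, not part of any CTAAR procedure), the tallying step might look like:

```python
from collections import Counter

# Criteria from the generic rubric above; rating levels as in the rubric.
CRITERIA = [
    "The work is well organized.",
    "The work uses appropriate analytic techniques.",
    "The work shows clear understanding of basic concepts.",
    "The application of basic concepts is sensible and reasonable.",
]
LEVELS = ("Fails to meet expectations", "Meets expectations", "Exceeds expectations")

def tally(ratings):
    """Aggregate per-reviewer ratings into per-criterion counts.

    ratings: a list with one dict per faculty reviewer,
             mapping criterion -> chosen rating level.
    Returns a dict mapping criterion -> Counter of rating levels.
    """
    totals = {criterion: Counter() for criterion in CRITERIA}
    for reviewer in ratings:
        for criterion, level in reviewer.items():
            totals[criterion][level] += 1
    return totals

# Made-up example: three faculty reviewers rating one student project
# on the first two criteria only.
sample = [
    {CRITERIA[0]: LEVELS[1], CRITERIA[1]: LEVELS[0]},
    {CRITERIA[0]: LEVELS[2], CRITERIA[1]: LEVELS[1]},
    {CRITERIA[0]: LEVELS[1], CRITERIA[1]: LEVELS[1]},
]
result = tally(sample)
print(result[CRITERIA[0]][LEVELS[1]])  # 2 reviewers chose "Meets expectations"
```

A summary like this makes disagreements among reviewers visible at a glance, which is often where the most useful curricular discussion starts.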

Student performance on standardized content exams in a discipline is also a direct assessment. Examples include the Force Concept Inventory in physics, the Test of Understanding of College Economics (TUCE), and the American Chemical Society (ACS) Standardized Examination in Organic Chemistry.

Other types of direct assessments include placement in internships, placement of majors in graduate programs (for undergraduates), and placement in employment (both graduate and undergraduate).



 
