Center for Teaching Advancement and Assessment Research
116 College Avenue
Rutgers, The State University of New Jersey
New Brunswick, NJ 08901
Phone: (848) 932-7466
Fax: (732) 932-1845

Frequently Asked Questions (for instructors) - Online SIRS

Blue Surveys

What is Blue?
Blue is a new survey tool that we will begin using in Fall 2018. It will eventually host all SIRS surveys, replacing both the older Sakai survey system and the EvaluationKit pilot. Blue has several advantages over the current systems.
Is Sakai going away?
Not immediately. In Fall 2018, Rutgers announced plans to transition to Canvas as the primary course management system; however, Sakai will continue until a transition plan is devised and implemented.

The decision to use Blue was made independently, and Blue works well in Sakai as a direct replacement of the existing Sakai survey component. The only changes you will see in Sakai are that the current “Survey Dashboard” will eventually be replaced by the equivalent Blue survey dashboard at the same link, and the “All Surveys” item on the “My Workspace” tab may eventually be removed.

Additionally, Blue surveys now appear as a Sakai tool that instructors may choose to add directly to their course sites (look for it in the “LTI Plugin Tools” section of the tools page when creating or editing a Sakai site).
Which courses will have surveys in Blue?
We are beginning the transition to Blue with courses that are taught through Canvas, and with academic units that had previously used EvaluationKit. For Fall 2018 this includes:
  • School of Social Work
  • School of Management and Labor Relations
  • School of Communication & Information
  • School of Business - Camden
  • Bloustein School of Planning and Public Policy
  • Rutgers Law School
What is the survey link for Blue?
All courses (regardless of course web platform) can use the old Sakai link, which will show both Sakai and Blue surveys until the transition is complete; it is the best link for students to see all of their surveys at once. You may also use the direct link to Blue. Students and faculty will all receive the link via email at the start of the survey process. Canvas and Blackboard will have their own links within those systems, but can use the Sakai link as well (just add it manually). In addition, CTAAR will keep all older SIRS links active, and we will redirect them to Blue when appropriate; the old links will continue to work with the new system.
Will instructors or students need to do anything differently?
No. CTAAR will try to make the transition as seamless as possible. Instructors will see some new features, including:
  • The ability to change survey dates directly in Blue
  • A streamlined method of adding additional questions
How do I add questions to surveys in Blue?
You will receive an email when your survey is available for editing in Blue, and can choose from pre-existing questions simply by clicking a button. Additionally, several slots are available for entering questions of your own. Full instructions are available for adding questions in Blue.
How does integration with Canvas, Blackboard, and Sakai work?
We have not yet finalized the exact details, but in general, when the survey starts, Canvas and Blackboard will prompt students to complete the survey when they visit their course sites, and will continue to remind them each time they visit. In some cases Canvas and Blackboard may partially block access to the course site until the student completes the survey.

Sakai will continue to show a “Survey Dashboard” as it currently does, but Sakai is not able to send pop-up reminders. Faculty and students will not see any change in the way the surveys currently work.
Does Blue send email reminders?
Yes, Blue will send email reminders to both students and faculty. For Canvas and Blackboard courses, students will get both the email reminders and the in-course prompt to complete the survey.
How can I tell if a specific survey will be in Sakai, Blue, or EvaluationKit?
We post a complete, searchable list on our Current Surveys page; each course listing includes a link to the correct survey system as well as the survey dates.

EvaluationKit Pilot (ending)

What is the EvaluationKit pilot?
CTAAR has been piloting a new survey tool, EvaluationKit, in place of the old survey tool in Sakai. The new tool has additional features, such as integration with Blackboard and Canvas, but the SIRS survey itself is unchanged.
Why not continue with EvaluationKit?
While EvaluationKit was an improvement over the Sakai survey tool, we feel that Blue is a better fit for Rutgers University. Blue has many of the same benefits as EvaluationKit, plus some additional advantages including a survey dashboard in Sakai (EvaluationKit only directly supported Canvas and Blackboard).
What is the link? How do I log in to EvaluationKit?
Log in with your Rutgers NetID and password (the link will redirect you to Rutgers CAS and on to EvaluationKit). The same link is used for faculty, students, and administrators. We use CAS Single Sign-On, so you can log in through other Rutgers systems first and EvaluationKit will recognize your login automatically.

If you are both a student and an instructor, you may need to change your view in EvaluationKit. After you log in, look in the upper-right corner near your name to change your view.

Note that when we complete the transition to Blue, this link will go to Blue instead of EvaluationKit.
Canvas shows links to surveys; do they need to remain active?
Yes. You can use the Canvas settings page to move the “Student Instructional Rating Survey” link up or down, but it is not possible to remove it, and disabling the link in Canvas has no effect. The link will always show while your survey is running, and is automatically hidden at all other times.

Survey Results

How do I see the results of my surveys?
We will email the results of the surveys as an “.html” file attachment to each instructor on the day after the registrar closes the grading period, or approximately 3 weekdays after the last day of exams. Exact calendar dates vary from year to year, but roughly:
  • Fall Semester: results distributed first week of January
  • Spring Semester: results distributed prior to Memorial Day
  • Summer session: results for all sessions distributed in the last week of August
  • Winter session: results distributed prior to the start of the Spring semester
You should save the file attachment for future reference (note that Microsoft OneDrive and Google Drive will not display the file correctly; you need to save/download it to your own computer before viewing the file).
The survey report sent by email is misaligned and difficult to read.
We send the report as an ".html" attachment to an email message. Some email software may reformat the report only when viewed within the email message, causing the font and text size to change. Simply save the attachment, then open the saved file to view the results with the original formatting.

When using Rutgers Connect, click the “Download” link beneath the file name. Do not use the “Save to OneDrive” option: OneDrive currently opens the HTML code for editing instead of displaying the document, so if you save to OneDrive you will still need to download the file before viewing.
When I open the file, I only see strange gibberish or code
If you saved the file to OneDrive or Google Drive, you need to click “Download” in order to view the file. This is a limitation of the Microsoft and Google cloud storage services; as long as you save the file anywhere else it will work properly. The file opens in any web browser, on any type of computer, tablet or phone.
Why does my computer say that the attachment is "possibly harmful"?
The attachments we send are safe. If you are using Google Chrome to read your email, Chrome will alert you to "possibly harmful" files every time you receive certain types of attachments. This warning is like a "stop" sign when driving - it's meant to alert you to the possibility of danger, and once you determine that it is safe you can proceed. In this case, since you know who sent you the file and why, you know it is safe. Click "Keep" to continue saving the file (however if you ever receive HTML attachments unexpectedly, that is cause for concern).
Can I get a PDF/Word/Excel copy of my results?
We send the report as an ".html" attachment which can be saved, forwarded, etc. like any other file, viewed on all types of computers, tablets and phones, or easily converted to any other format. Be sure to save or download the HTML file attachment; saving the entire email message will cause problems. If you would like to convert to PDF or Excel, please follow the instructions below:
There is no advantage to using the PDF format, but if you would like a PDF version, you may “print to PDF” after saving and opening the “.html” file (you may need Adobe Acrobat, a free alternative, or Google Chrome). To use Google Chrome, open your HTML document, choose “Print”, then click “Change” to switch from your printer to a PDF file. Macintosh computers have a built-in PDF function that works in any web browser, as does Microsoft Windows 10.
[Image: Google Chrome print options - click “Change” to choose PDF]
Word or Excel
You can open the HTML file directly in Microsoft Excel or Microsoft Word. Either right-click your saved HTML file and choose “Open with…”, or use the “File --> Open…” button in Excel or Word (you may need to change the “files of type…” option while opening), then “Save as…” an Excel or Word document. Excel works best; Word will alter the formatting, requiring adjustments to the font size, page orientation, and table column widths. Copy and paste also works; however, opening the HTML file directly usually gives better results.
Will the online format change the faculty rating? Will students who do not attend class but respond to the survey bias the outcome?
In our studies to date, changes in the average rating are not significant but we are continuing to collect data to determine if the online system affects the ratings. On average, individual ratings varied by ±0.48 points between Fall 2007 paper surveys and Fall 2008 online surveys. For comparison, individual ratings varied by an average of ±0.40 points between Fall 2006 and Fall 2007, both semesters using paper surveys. Comparisons were limited to instructors teaching the same course for more than one semester (i.e., we did not compare ratings for the same instructor teaching different courses, nor did we compare ratings for the same course taught by different instructors).
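The paired comparison described above can be sketched as a mean absolute difference over instructors teaching the same course in both semesters. The ratings below are made up purely for illustration; they are not CTAAR's data, and the function name is hypothetical:

```python
def mean_abs_change(ratings_a, ratings_b):
    """Average absolute change in rating, paired by instructor/course.

    Each position in the two lists is the same instructor teaching the
    same course in two different semesters.
    """
    assert len(ratings_a) == len(ratings_b)
    diffs = [abs(a - b) for a, b in zip(ratings_a, ratings_b)]
    return sum(diffs) / len(diffs)

# Hypothetical ratings for four instructors, one semester vs. the next
semester_1 = [4.2, 3.8, 4.5, 3.1]
semester_2 = [4.0, 4.3, 4.4, 3.5]
print(mean_abs_change(semester_1, semester_2))  # 0.3 on average
```

Restricting the comparison to matched instructor/course pairs, as the study does, avoids confounding the survey-format change with differences between instructors or between courses.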
How are the results used?
University-wide, the survey results are used as part of the faculty promotion and tenure review process. While the use of the survey data varies within individual academic units, it is often used as part of a review process for improving the curriculum, implementing changes to teaching strategies, reappointment review for part-time lecturers and teaching assistants. Many faculty and instructors use the survey data, in particular the comments, to assess and improve their own teaching methods.
Who gets to see the results?
The summary statistics of anonymous student responses for faculty and part-time lecturers are available online to the entire university community, beginning with data from 2001 (Fall and Spring semesters only). Older data is available on CD-ROM at the University Libraries. Data for teaching assistants is no longer published because of the requirements of the Family Educational Rights and Privacy Act (FERPA). Student comments are not published.

CTAAR distributes all the reports, including the data for teaching assistants and the student comments, directly to the academic departments and to the individual instructors shortly after the grading period ends. By request, CTAAR also provides the raw, numerical student response data to departments that want to run their own statistical analysis.
Who gets to see the comments?
CTAAR sends the comments directly to the instructors and to the academic departments. The comments are completely anonymous and grouped by question. Comments are not published.

Student Responses

Will the number of students who respond drop?
The response rate (the number of students who completed the survey divided by the enrollment) for the online surveys averages between 50% and 65% each semester. This represents a drop for some departments, and an increase for others. We are closely tracking response rates as we implement our online ratings, and slowly expanding the use of online surveys so we can make adjustments. Based on evidence from other universities, we expect to see an initial drop in the overall number of students replying to the surveys, followed by a gradual increase as students and instructors become more familiar with the system. More importantly, the evidence suggests that the change in response rate does not significantly affect the average ratings for individual instructors or departments as a whole.
Response rates for individual surveys may either fall or rise depending on how the instructor communicates the details of the survey to the students, and are affected by factors such as class size, attendance policies, and mode of instruction (lecture versus lab, etc.).

Please refer to our information about increasing participation.
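As a minimal sketch of the response-rate arithmetic defined above (the function name and figures are hypothetical, not CTAAR's implementation):

```python
def response_rate(responses, enrollment):
    """Response rate = students who completed the survey / enrolled students."""
    if enrollment == 0:
        return 0.0  # avoid division by zero for an empty roster
    return responses / enrollment

# Example: 39 completed surveys in a 60-student course
rate = response_rate(39, 60)
print(f"{rate:.0%}")  # prints "65%"
```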
If the response rate is very low, can one student unfairly impact my ratings?
Relative to the enrollment in the course, the impact of an outlier may be exaggerated by a low response rate. Although possibly disproportionate, these responses do reflect some student opinions and care must be taken when interpreting the survey. Multiple surveys across courses and semesters should be considered together, and outliers should be recognized as such. Please refer to the guidelines for interpreting the Student Instructional Rating Survey. SIRS is not intended to be the sole determiner in the assessment of teaching; other evidence of teaching ability can and should be included to offset the impact of an outlier.

If the outlier is due to student error (e.g., filling out the wrong survey, reversing the scale), please refer to our policy for requesting corrections.
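To see how a low response rate magnifies a single outlier, consider a hypothetical 5-response sample (the numbers are invented for illustration only):

```python
# Hypothetical: a 50-student course where only 5 students respond,
# four rating the course 4-5 and one outlier rating it 1.
responses = [5, 5, 4, 5, 1]

mean_with_outlier = sum(responses) / len(responses)
mean_without = sum(responses[:-1]) / len(responses[:-1])

print(mean_with_outlier)  # 4.0  - one response moved the mean by 0.75
print(mean_without)       # 4.75
```

With 40 responses instead of 5, the same single outlier would shift the mean far less, which is why multiple surveys across courses and semesters should be read together.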
Can I get a list of students who did or did not respond to the survey?
No, the system is designed to protect student identities and does not report who did or did not respond to the survey.
A student made a mistake on the survey form, can it be corrected?
Prior to the survey due date, students can go back and edit their own responses.

We do not interpret the students' responses on the survey, and we cannot examine student responses while the survey is still running. Any instructor who feels that a student has incorrectly submitted comments for another instructor or reversed the answer scale should communicate this to their department chair. The department chair should request in writing that the survey responses be reviewed and reprocessed.

In many cases students report that they have made a mistake, but they merely misremember the details of filling out the survey. Requests for corrections should only be made after reviewing the survey results and ascertaining that a mistake does in fact exist and impacts the outcome. All requests for a department must be submitted together, because each correction may affect the entire department mean.

Survey Process

Which courses are using the online survey?
You can view a full list of surveys in the online system, for the current semester. All Winter and Summer session courses use the online survey system. For Fall and Spring semesters, most departments have chosen to use the online system but a small number of departments continue to use the paper system.
Are Biomedical and Health Sciences (legacy UMDNJ) courses included?
As of Fall 2014, all School of Nursing courses use SIRS. For other units, participation in the SIRS process depends in part on which student registration system is used for a course. RBHS courses that were formerly UMDNJ and continue to use the UMDNJ registration system will not be included in SIRS, and should continue to use their existing processes (RBHS departments may contact us to discuss the use of SIRS). Courses that were in "joint Rutgers/UMDNJ" programs, use the Rutgers registration system, and used SIRS before the merger will continue to use SIRS.
What questions are on the survey?
The questions are identical to the ones on the paper survey. We have an example of the online survey, and we can also provide interested people with a fully functional demonstration. Please contact us to arrange a demonstration.
Can I add my own questions?
As of Spring 2016, instructors can now add their own questions provided we know their NetID when we create their survey (NetIDs are required to correctly associate a specific survey with an instructor's Sakai login). Please refer to the instructions for adding additional questions. Questions added to SIRS should be consistent with the purpose of collecting course feedback, and may be posted publicly at SIRS Results. For other types of questions, instructors may prefer to use alternate methods to run their own questions, such as a poll or an anonymous quiz in Sakai or a Google Docs web form.

By request we can add a standardized set of additional questions if they are to be used department-wide.
Can I see how many students have replied to my surveys?
We will send an email update every Tuesday and Thursday during the run of your survey to let you know how many students have replied up to that point. You can also check on our current surveys web page (data updated daily).
Can you please stop sending so many emails to me?
Yes! As of May 2013, we can honor your preferences for how often you want us to send you updates about your surveys. Write to us to let us know that you would like either only one email a week or no emails (the default is two per week). If you are teaching more than one course, we have already reduced your emails so that you will receive only one message listing the progress of all your surveys (if you still get more than one, please let us know; it merely means we don't have consistent email addresses for you, which we can easily correct).

Please note that for students, email reminders will always be sent every 3rd or 4th day until they complete the survey. Instructors can request that we adjust the frequency of student emails, but it affects all students in the class - we cannot change this setting for individual students.
How are surveys for lectures with multiple recitations or labs handled?
This depends partly on how your department lists the course in the Schedule of Classes. If the lecture and labs or recitations are listed separately (e.g., the lecture is listed as "102:01" and the labs are listed as "103:01", "103:02", etc.), we will create a survey for each section independently, as with any other course. If the lecture is made up of multiple sections but does not have its own course number (e.g., "101:01", "101:02", and "101:03" all meet together in a lecture hall one day a week), we will create a survey for each of the individual sections (one for each TA), plus a survey that combines all of the sections together (for the lecturer). If we do not know the names of the TAs, we will contact you or the department administrator.

If you teach both the lecture and a recitation for the same course you will typically have two surveys so you can gather feedback about the lecture and the recitation separately. However we will need to alter the survey titles so the students can tell the surveys apart. Please contact us to request the change.
How are courses with more than one lecturer handled?
We will create a separate survey for each lecturer, provided we know the lecturers' names. We collect instructors' names from the Schedule of Classes and from department administrators at the start of the semester. You can specify dates to run the surveys, so if you would like each survey to run at a different point in the term for the different lecturers, please contact us or ask your department to enter the dates when updating the survey information, at least two weeks before the survey should run.
What can I do to get more students to take the survey?
Faculty and instructors are essential to ensuring that the students respond to the survey. Above all else, communicate with your students regarding the importance of the survey to improve your own teaching, as well as the importance to the university as a whole. Consider taking the following actions:
  • Set aside some class time and have the students complete the survey in class. This will often result in response rates equal to the paper surveys.
  • While the survey is running, direct your students to the survey site and tell them to click "All Surveys" after logging in.
  • Do not rely on our email reminders - students may not read the email, and we cannot send email to students who do not provide an accurate email address to the university. Talk about the survey in class.
  • Include a statement on your syllabus that you expect all students to complete the SIRS survey.
  • Use informal, midcourse surveys throughout the term.
  • When the survey begins, take some class time to discuss the importance of the survey.
  • Give the students personal examples of how you have used prior surveys to improve your teaching.
  • Inform the students that the surveys are used by the University in promotion, tenure, and reappointment decisions.
  • Assure students that their comments and responses will always remain anonymous.
  • Invite students to view published survey data from previous semesters.
  • Read more about student participation
How do you enforce student participation?
The survey is voluntary. The system notifies students by email of the availability of the survey; students who do not reply will receive repeated reminders until they respond or the survey ends. Additional methods of enforcement cannot be implemented until the university community has an opportunity to discuss the implications and practicality; however, instructors have a great deal of flexibility for encouraging student participation.
Why do students need to log in? Does this violate their anonymity?
Student log-in information is used only to determine which surveys a student can take, and to prevent the students from responding more than once to the same survey. The survey software never reveals the students' identities, and all reports generated by CTAAR only include anonymous, aggregate data. See the privacy policy for more information. It is important that you communicate to your students that you will only see anonymous data, and only after final grades have been submitted.
My students cannot log in to the survey link that I sent them.
If you sent the survey link to your students, please note that two problems frequently break the link:
  1. The period at the end of the sentence may break the link if it directly follows the link. Please put the link on a line by itself with no additional punctuation, or make sure you put a space between the end of the link and the period (e.g., “Please go to .”).
  2. If you use Outlook Web Access, do not copy and paste the link. There is a known bug in some versions of Microsoft Outlook Web / Exchange 2003 that breaks links. You must forward or retype links from Outlook Web instead of copying.
Can students who withdraw from the course take the survey?
No. The survey system only allows students who are currently enrolled in the course to take the survey. Roster information is updated daily.  
Why is my course enrollment in your email wrong?
While the survey is running, we send periodic updates to instructors to let them know the current response rate. We only update the enrollment numbers in this email at certain points in the semester, so at times the enrollment number in the email will be slightly out of date. This does not affect the survey since the survey runs from the student perspective. When a student logs into the system, it checks to see if that individual student is registered for the course - the total enrollment is inconsequential for making that determination. We update the enrollment numbers before issuing the final report, but if you find a discrepancy in the final report please let us know and we will correct it.

Additionally, due to the way the registrar reports enrollment numbers, students who withdraw with a grade of “W” will be counted in the enrollment despite never having access to the survey. Please write to us if you would like your enrollment adjusted to reflect these withdrawals.




