Improving teaching quality through qualitative feedback… using machines


POINT OF VIEW

During our performance evaluations, our reporting officers might give suggestions on how to improve the courses we teach. They should be able to know what happened both within the course, as well as about the course at a macro level. If they are aware of the positive comments provided for other courses, these can also be shared with other faculty for improvement in their teaching.

Swapna Gottipati

Associate Professor of Information Systems


In brief

  • Student Evaluation of Teaching (SET) is commonly used in higher education as feedback for course instructors’ performance, but it only measures what the questions ask.
  • Associate Professor Swapna Gottipati’s latest project seeks to extract insights from students’ open-ended feedback to improve the curriculum. It can also help faculty members gain deeper insights into their teaching practices, curriculum, as well as assessment development.
  • Utilising Natural Language Processing, Assoc Prof Gottipati examined five main aspects across students’ quantitative and qualitative feedback: 1) Topics, 2) Sentiments, 3) Suggestions, 4) Time, and 5) Correlation. These will help course instructors fine-tune their teaching, and help course managers and university administrators with macro-level management of the curriculum.

Student Evaluation of Teaching, or SET, is commonly used in higher education as feedback for course instructors’ performance. Students rate their teachers quantitatively, scoring their performance on a numerical scale on questions such as “Teacher is prepared for class” and “I have learnt a lot from this teacher”.

While numerical ratings provide a tangible measurement of an instructor’s classroom performance, they only measure what the questions ask. Open-ended questionnaires reflect the big picture much better, but they remain relatively under-utilised as a source of feedback for identifying ways to improve how a course is taught.

Why is that?

“Suppose I'm teaching some 300 students, 400 students and these students are giving me feedback – it's a large data set and manually gaining insights which are important for changing my teaching process or improving my teaching process is tedious and painstaking,” explains Swapna Gottipati, Associate Professor of Information Systems (Education) at SMU. “Hence, I have to depend on some kind of tool or machine, and this is one such attempt to generate the data insights very quickly for the faculty to gain the insights from qualitative feedback.”

Aspects of feedback

Professor Gottipati is referring to her recently concluded project “Learning Analytics on Qualitative Student Feedback to Improve Teaching and Learning in Higher Education”, which was supported by the Ministry of Education (MOE) Tertiary Education Research Fund. Utilising SMU’s in-house feedback system that collates students’ end-of-course evaluations, Professor Gottipati proposes a learning analytics system called the ‘Course Feedback Analytics System (CFAS)’ to “help faculty members to gain deeper insights on their teaching practices, curriculum as well as assessment development”.

Utilising Natural Language Processing (NLP), Professor Gottipati examined five main aspects across students’ quantitative and qualitative feedback: Topics, Sentiments, Suggestions, Time, and Correlation.

"Firstly, it is the topic, which are the issues that the students are talking about,” she tells the Office of Research and Tech Transfer. "Secondly, it is about the ranking of the topics and issues which are the most important ones because we want to prioritise them. 

"The third is to understand the perceptions, like sentiments or opinions of the students. Taking a faculty’s teaching style as an example – perhaps he or she is not very engaging or speaks very slowly, or in a very low voice. These are all negative feedback. It is called sentiment.

"And the last one is the quick summaries of the comments. This means a visualisation or some kind of user-friendly visuals or reports that can help us to identify what needs to be addressed."

Professor Gottipati explains that the NLP models are rule-based and grammar-based: the rules built into a model can extract adjectives and identify what those adjectives refer to, as well as the topics that students are writing about in their feedback.
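The article does not name the library behind these rules. As a minimal sketch of grammar-based extraction of adjectives and the nouns they describe, the snippet below uses spaCy’s dependency parse; both the tool and the sample sentence are illustrative assumptions.

```python
# Illustrative sketch: dependency rules that pair adjectives with what they describe.
import spacy

# Assumes the small English model is installed:
#   pip install spacy && python -m spacy download en_core_web_sm
nlp = spacy.load("en_core_web_sm")

feedback = "The slides were helpful but the weekly quizzes felt unclear."

for token in nlp(feedback):
    # Attributive adjective ("weekly quizzes"): its head is the noun it modifies.
    if token.pos_ == "ADJ" and token.dep_ == "amod":
        print(f"{token.text} -> {token.head.text}")
    # Predicative adjective ("slides were helpful"): the subject of the linking
    # verb tells us which topic the opinion is about.
    elif token.pos_ == "ADJ" and token.dep_ == "acomp":
        subjects = [w.text for w in token.head.lefts if w.dep_ in {"nsubj", "nsubjpass"}]
        print(f"{token.text} -> {', '.join(subjects) or '?'}")
```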

Using the information

All the data extracted from open-ended answers would be worth little if it could not be used. To that end, CFAS will feature what Professor Gottipati calls interactive ‘doughnut’ graphics that are not just basic bar graphs and pie charts but those that “expand and pop out the respective negative or positive feedback for the given topics and so on”.
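The CFAS visuals themselves are not shown in the article. The fragment below is only a static stand-in for the ‘doughnut’ idea, drawn with matplotlib’s ring-style pie chart over made-up comment counts; the expand and pop-out interactivity described would sit in a dashboard layer on top.

```python
# Illustrative, static stand-in for the interactive doughnut described above.
import matplotlib.pyplot as plt

# Hypothetical numbers of student comments per topic.
topics = ["Delivery", "Content", "Assessment", "Workload"]
counts = [42, 35, 18, 25]

fig, ax = plt.subplots()
ax.pie(
    counts,
    labels=topics,
    autopct="%1.0f%%",
    startangle=90,
    wedgeprops={"width": 0.4},  # hollow centre turns the pie into a doughnut
)
ax.set_title("Student comments by topic (illustrative data)")
plt.show()
```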

Ultimately, these will help not only the course instructor in fine-tuning their teaching, but will also help course managers and university administrators with macro-level management of the curriculum.

“During our performance evaluations, our reporting officers might give suggestions on how to improve the courses we teach,” says Professor Gottipati, who is also the Interim Associate Dean of Undergraduate Education at the School of Information Systems (SIS). “They should be able to know what happened both within the course, as well as about the course at a macro level. If they are aware of the positive comments provided for other courses, these can also be shared with other faculty for improvement in their teaching."

Originally published at https://research.smu.edu.sg/news/2020/may/12/improving-teaching-quality-through-qualitative-feedback%E2%80%A6-using-machines

Last updated on 28 May 2020