A research partnership story

Dr Tracey-Ann Palmer from the University of Technology Sydney (UTS) is delighted to share the story of a very successful research partnership between UTS and Pymble Ladies’ College. The research study we undertook would not have been possible without the support of the dedicated staff, committed teachers, and enthusiastic students at Pymble. The strong research mindset at the school played a vital role in making this collaboration such a resounding success.

The project

My story starts with an opportunity to apply for university funding aimed at fostering collaboration between faculties within the university. Nick Hopwood, Mun Yee Lai and I come from the School of Education within the Faculty of Arts and Social Sciences at UTS, where we research best-practice teaching and learning in schools. We partnered with the UTS Data Science Institute (DSI), which does leading-edge research on using data to understand how the world works. DSI had developed remarkable software capable of tracking human-computer interactions: it could monitor facial expressions, gaze patterns and keystrokes, and even detect emotions. Loaded onto a standard computer, the software collects a vast dataset of measurements. Given the surge in online learning during the Covid-19 era, we recognised the potential of this technology to enhance educational experiences.

The partnership

Data experts Kun Yu and Yifei Dong from the DSI were keen to work with us because they wanted to use their software with school students and needed our expertise in schools to make that happen. They also needed our help in making sense of the data so they could train AI to interpret what their software was detecting.

We were delighted to work with DSI and explore how such new technology could be of value to teachers and their students. After some discussion, the joint team decided to focus on engagement and to conduct a pilot study to find out if the tracking software could ‘see’ student engagement with an online task.

You may not be familiar with the intricacies of university funding processes, but it is quite common for us to receive invitations with incredibly tight timelines for funding applications. This project was no exception and presented us with a challenging timeframe to develop our proposal. Our faculty has extensive experience researching in schools, so we assembled our team of experts and, working with our new colleagues from DSI, created a broad plan to collect data in a school.

We contacted Sarah Loch to ask if Pymble would be a partner in the study. Fortunately, the research aligned perfectly with the new Data Science course offered by the school and with the school’s overall Digital Intelligence strategic pillar guiding innovation in this area, and the teachers were eager to test the technology. Our funding application was prepared, submitted and successful!

We subsequently developed a detailed research plan and obtained the necessary ethics approvals from the university and Pymble’s ethics committee. Nick Hopwood and Kun Yu were invited to speak to the application at a committee meeting, which helped students and teachers to understand it more fully and to meet university academics with experience in research. We were particularly impressed by the existence of such a committee at Pymble.

The plan

We know that schools are busy places and that coming into a school to conduct research can be complicated. This is especially true when undertaking pilot studies, where the outcomes are particularly uncertain. This is where the research mindset fostered by the Pymble Institute comes to the fore.

The development of the pilot involved extensive cooperation with Pymble’s Data Science instructors, namely Cedric Le Bescont, Anthony England, Glen McCarthy and Kim Maksimovic, as well as Sarah Loch and Victoria Adamovich, who provided research support. Collaboratively, we formulated a comprehensive data collection strategy. Given the experimental nature of the pilot, it was immensely beneficial to work with a team that was remarkably responsive and adaptable. Our meetings fostered fruitful discussions, bringing together individuals with open minds, all focused on making the best use of teacher and student time.

Through the workshops and discussions, we agreed on the following research design:

  • Teachers collaboratively designed an online task.
  • Students performed the task while being monitored through school laptops using the tracking software.
  • Students participated in focus groups to share their experiences.
  • The data collected from the laptops was analysed by the data science experts.
  • Teachers were presented with the data and shared their insights and ideas.
  • Each student received an individualised report based on their data and participated in a group feedback session.

The pilot

The pilot study was conducted in late 2022, and data analysis was carried out during the first half of 2023. We had some logistical issues (as all good pilot studies do) but nothing that stood in the way of us collecting excellent data.

The data we gathered included:

  • Emotions, gaze patterns, keystrokes, and mouse movements from 25 Data Science students in Years 9 and 10 who completed the task.
  • Insights and experiences shared by 20 students in five focus groups.
  • Input from four teachers in a focus group, where they reviewed the data and suggested how the results could enhance their teaching and student learning.

We also provided each participating student with a personalised report with their data and conducted a feedback session to discuss the results and gather their input.

The product

We got some very interesting results. The data collected is extensive, and the DSI team is still working through some of it. The focus groups showed that students found the software unobtrusive; they quickly forgot about it, as it appeared as if they were working on a regular computer.

The results of the study were intriguing. Through our focus group discussions, we observed that students approached the online task in three distinct ways, which we categorised as learning dispositions: The Whizz, The Worker, and The Worrier. We then analysed the tracking data for each student and predicted their learning disposition accordingly. To validate our analysis, we sought feedback from the teachers.

The teacher focus group session was both exciting and fruitful. By examining samples of student data for each learning disposition, the teachers confirmed that our analysis, based on the tracking data, aligned with their perceptions of the students’ approaches. It was a clear match in almost all cases. We were thrilled to discover that the tracking software could identify different learning styles, and the teachers recognised its potential as a valuable tool for identifying individual student needs. They also believed the software could aid in identifying student emotions, enabling them to be more responsive in online learning environments.

Thanks to our collaborative partnership within the university and with Pymble Ladies’ College, the pilot study proved to be an enormous success. While our initial aim was to evaluate the value of the tracking software, we now believe it can effectively track emotions and monitor students’ task engagement and actions. However, determining genuine task engagement – deep interest rather than mere completion – remains a more complex challenge. We find this aspect fascinating and intend to conduct further research in this area.

The postscript

This research endeavour became possible due to the collaborative efforts of our university and the College. As a pilot study, we did not anticipate finding all the answers. Nevertheless, we gained valuable insights into what works and what doesn’t, discovered answers to unanticipated questions, and obtained valuable hints for addressing our initial inquiries.

That’s the essence of research – an open mind and a willingness to explore. In research, there are no failures, only opportunities to gain more information about what doesn’t work. This is what makes research both wonderful and, occasionally, a bit messy.

This article was originally published in Illuminate, Edition 9.
