HSE University Launches Project on Automated Exam Checking

During this summer’s exams, HSE School of Foreign Languages and the Centre for Computational Educational Sciences launched a pilot project for the automated checking of student exams. Ekaterina Kolesnikova, Head of the School of Foreign Languages, Dmitry Abbakumov, Head of the Centre for Computational Educational Sciences, and Daria Kravchenko, Deputy Head of the centre, spoke about the project and its future prospects.

The Start of the Project

Dmitry: The HSE administration asked us to expand the centre’s purview beyond online courses. We decided we could evaluate the quality of measurement materials, analyze the effectiveness of evaluation tools, and offer evidence-based adjustments to improve the system. We could also analyze how the learning process goes in general and what difficulties students face.

Ekaterina: The pandemic not only posed new challenges but also showed us which innovations needed to be implemented as soon as possible to make English language teaching at the University more resilient to uncertainty. In some ways, the changes that have taken place over the past six months are the key to stable future development. In spring we realized that remote education requires not just an online test, but an online exam on a suitable platform.

Tasks and Difficulties

Ekaterina: The task turned out to be difficult and required us to solve many issues related to developing the test’s structure and content, building an organizational model for its implementation, and designing a communication matrix for all the participants (more than 10,000 people).

The difficulties were both systemic (typical for a group of students, for example from a particular campus) and individual (when a particular person had a problem with something).

Content planning and test specification suited to an online format proved difficult. In a short time, we had to significantly increase the number of previously planned tasks and the number of unique exam variants, taking into account the platform’s capabilities, additional exam dates, and a special approach to organizing the exam for students with disabilities.

From the very first stage, at the beginning of the pandemic in April, it was important to reassure our colleagues that we would not reschedule the exams but were ready to deal with the situation. We had to rely on the fact that none of us was prepared to fail: we were able to calmly resolve difficult situations or, in some cases, stay up working all through the night in order to upload materials to the platform in time.

We knew that we had experts with the necessary competencies to implement this project at the School of Foreign Languages and the Centre for Computational Educational Sciences, but it was important to help all our colleagues feel like a team working together on one task.

Implementation

Dmitry: First, we collected all the necessary information and presented it in the form of a visual dashboard at a school meeting. The heads of departments and educational programmes took great interest in it because they could see that this information was directly relevant to their students.

This was a successful starting point, and then we began preparing for a much larger-scale task: conducting an online exam for 8,000 students. The risks were high because, by this point, quarantine had been imposed.

Daria: All tests were held on the Moodle platform. There were two assessment systems: an automated one, which checks assignments automatically, and a semi-automated one, which lets teachers check student essays. Written tasks were uploaded to the Disk manually, and then a special script that we developed sorted the essays into folders according to degree programme.
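
A minimal sketch of what such a sorting script might look like (the folder layout, the roster file, and the filename convention below are assumptions for illustration, not details of the actual script):

```python
import csv
import shutil
from pathlib import Path

# Hypothetical layout: essays land in one flat upload folder, and a
# roster CSV maps each student's digital ID to a degree programme.
UPLOADS = Path("uploads")         # assumed source folder
SORTED = Path("sorted_essays")    # assumed destination root
ROSTER = Path("roster.csv")       # assumed columns: student_id,programme

def load_roster(path: Path) -> dict[str, str]:
    """Map anonymous student IDs to degree programme names."""
    with path.open(newline="", encoding="utf-8") as f:
        return {row["student_id"]: row["programme"] for row in csv.DictReader(f)}

def sort_essays() -> None:
    roster = load_roster(ROSTER)
    # Assumed naming convention: <student_id>.pdf
    for essay in UPLOADS.glob("*.pdf"):
        programme = roster.get(essay.stem, "unmatched")
        dest = SORTED / programme
        dest.mkdir(parents=True, exist_ok=True)
        shutil.copy2(essay, dest / essay.name)

if __name__ == "__main__":
    sort_essays()
```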

Each teacher had a special link they could use to log in and start checking the tasks. The teacher then entered the grade for the written exam in the document, and the final grade was calculated automatically. Even though the exam period lasted only four days, we were able to ensure that this huge system operated smoothly and was easy for teachers to use.
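
The interview does not describe how the final grade was weighted; the sketch below only illustrates the kind of automatic combination Daria mentions, with a purely hypothetical 60/40 split between the automated and written parts:

```python
def final_grade(auto_score: float, written_score: float,
                auto_weight: float = 0.6) -> float:
    """Combine the automatically checked part with the teacher-graded
    written part. The 60/40 weighting is a hypothetical example, not
    the actual HSE formula."""
    combined = auto_weight * auto_score + (1 - auto_weight) * written_score
    return round(combined, 1)

# e.g. final_grade(8.0, 9.0) -> 8.4
```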

Dmitry: We ensured anonymity: teachers did not know whose work they were checking, and each student had their own digital ID. This is important because complete anonymization guarantees maximum objectivity in assessment.
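
The article does not say how the digital IDs were generated; one common way to implement this kind of anonymization is a keyed hash, sketched here under that assumption:

```python
import hashlib
import hmac

# A secret key known only to the exam administrators; keeping it away
# from graders means a teacher cannot reverse the ID back to a student.
SECRET_KEY = b"replace-with-a-real-secret"  # placeholder value

def digital_id(student_identifier: str) -> str:
    """Derive a stable, anonymous ID from a student identifier.
    HMAC-SHA256 is one standard choice; the interview does not say
    which scheme was actually used."""
    digest = hmac.new(SECRET_KEY, student_identifier.encode("utf-8"),
                      hashlib.sha256).hexdigest()
    return digest[:12]  # shortened for readability on grading sheets

# e.g. digital_id("student@edu.hse.ru") -> a 12-character hex string
```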

We conducted some preliminary training for teachers and prepared video instructions on using our system. Later, we compiled a survey for teachers, which showed that 85% were satisfied with the exam format.

Ekaterina: We understood our responsibility to the University and wanted the HSE School of Foreign Languages to become the first and, as it turned out, the only department in Russia to create an online English exam for a university. The exam was conducted using verified, standardized materials, with proctoring and analytics, for 8,500 students across four campuses. Moscow, St. Petersburg, Perm, and Nizhny Novgorod experienced no geographical or temporal differences, and all department managers synchronized their teams’ work.

Developing the Project

Dmitry: Thanks to this pilot, we are starting a new project in September: developing automated entrance testing for the School of Foreign Languages. The Centre for Computational Educational Sciences will work on the methodology for implementing a special analytical system that will ‘guide’ each student from admission to graduation. It’s an ambitious project!