"Test Activity in Moodle", was a pilot project promoted by the Universidade Aberta’s rectory, aimed at using the Moodle Test (MT) tool to conduct online tests, replacing the face-to-face exams and/or submissions of final electronic written tests, on the Moodle platform. The project involved seven curricular units (CU) from four departments of the Universidade Aberta (UAb), a Portuguese public university of distance education.
The project objectives were to i) promote among the participating teachers the skills necessary to successfully develop, monitor, and evaluate online tests using the Moodle platform; ii) implement a viable and robust process that would guarantee the academic integrity of the online tests; and iii) evaluate the success of the process in comparison with previous practices.
A team of researchers from the Distance Education and Elearning Laboratory (LE@D) undertook the evaluation of the project to address the third objective, identifying its most favourable aspects as well as critical aspects to be improved.

Good practices

The UAb has maintained over the years, as part of its undergraduate degrees, a final face-to-face assessment (Global e-folio or exam). Students enrolled in the continuous assessment modality also have to complete other assignments in the form of e-folios, while students who choose not to follow continuous assessment take a high-stakes test (exam) at the end of the year (Pereira et al., 2007). Given that it was impossible to carry out face-to-face assessments during the pandemic, the UAb started moving its final assessments online using Moodle, the platform where the CUs are delivered. In this form of assessment, an assessment brief/matrix and an answer sheet are made available to students, and at the end of the set time each student submits the test through the submission device created specifically for this purpose.
In the 2nd semester of 2021, a pilot project was developed to explore the use of the Moodle Test (MT) tool in these final exams, replacing the existing procedure and seeking to optimise the whole process.

Method

The team responsible for the evaluation of this project defined a mixed-methods approach (Creswell, 2003), adopting instruments of a qualitative (interviews and document analysis) and quantitative (questionnaire survey) nature, and established four stages in the process.

Stage 1: analysis of the matrices underlying the development of the tests, to characterise the tests regarding their structure, the skills/objectives under evaluation, and the types of questions used, among other aspects.

Stage 2: design, development, and application of an online survey addressed to students about the experience of taking the online test with the MT tool (Global e-folio and exams), to analyse their degree of satisfaction with this experience. To inform the survey design, online interviews were first conducted with students from different areas who had gone through the experience under analysis. The final survey presented 36 items, divided into 6 sections (see table), and was sent to the 1373 students involved in the pilot.
The items in sections 2, 3, 4, and 5 adopted a 5-point Likert scale (1 = Strongly Disagree to 5 = Strongly Agree).

Stage 3: inferential statistical treatment of the data collected via the survey, as well as qualitative analysis of the open questions.

Stage 4: interviews conducted with the teachers involved, after the entire evaluation process was completed, to assess their satisfaction regarding the different objectives of the project, the process experienced, and the results obtained.

Sample

The sample comprised 379 students: 31.7% (n=120) male, 67.8% (n=257) female, and 0.5% (n=2) other gender. The mean age was 40.73 years (SD=8.52), ranging from 21 to 64 years. The age difference between males (M=41.03, SD=8.57) and females (M=40.69, SD=8.47) was not statistically significant [t(373)=0.358, p=.720]. Almost half of the sample (47.8%) considered themselves to have high digital skills, and a further 17.2% very high. Regarding teachers, all seven involved were interviewed.
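The group comparison reported above is a standard independent-samples t-test. As an illustrative sketch only (the function and the toy data below are our own, not the study's analysis script), it can be computed with pooled variance as follows:

```python
import math
from statistics import mean, stdev

def independent_t_test(a, b):
    """Student's independent-samples t-test with pooled variance,
    the kind of test used in the study to compare the ages of male
    and female respondents. Returns the t statistic and the degrees
    of freedom (n_a + n_b - 2)."""
    na, nb = len(a), len(b)
    va, vb = stdev(a) ** 2, stdev(b) ** 2            # sample variances
    pooled = ((na - 1) * va + (nb - 1) * vb) / (na + nb - 2)
    se = math.sqrt(pooled * (1.0 / na + 1.0 / nb))   # SE of the mean difference
    t = (mean(a) - mean(b)) / se
    return t, na + nb - 2

# Toy example with two small, similar samples: t stays close to 0,
# mirroring the non-significant difference reported in the study.
t, df = independent_t_test([40, 42, 39, 41], [41, 40, 42, 38])
```

A p-value would then be obtained from the t distribution with `df` degrees of freedom (e.g. via `scipy.stats`); the sketch above keeps to the standard library for clarity.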
Regarding the development of skills to design and deliver the online tests, the teachers gave a positive evaluation, feeling that during the training offered they had developed the skills necessary to use the MT tool. Most students were quite satisfied with the conditions and procedures adopted, a feeling also expressed in the open answers to the questionnaire, where the positive aspects mentioned outweighed the negative ones. The conditions for taking the MT were also evaluated very positively by most respondents, who highlighted, as the most evident aspects, the shorter amount of time spent, its control, and its security, related to the fact that answers are automatically recorded. The open answers collected at the end of the survey corroborated the results obtained in the closed questions. It should also be noted that students from the Statistics for Social Sciences CU gave the most positive evaluation of the pre-conditions for taking the tests, compared with students from the other CUs, probably because they were the most experienced with this type of test, as their teacher had used it throughout the semester.
Concerning the difficulties felt while taking the test, the only issues mentioned were the restricted navigation and the settings used to time each question. These results were corroborated in the open-ended questions, where they were listed as negative aspects, occasionally associated with others, such as not being able to see the full text in questions requiring open answers and fear of, or lack of confidence about, technical problems. Regarding satisfaction with the characteristics of the tests, most students considered the MT tool adequate for both objective and open questions, although in the latter case the level of agreement was lower.
Considering the students' overall satisfaction with the process, the experience was positive for most students, and a significant majority would like to use this tool in all their CUs. Teachers considered the experience of using the MT very positive, even though they emphasised that transposing the existing version of the test to the MT tool required more work. They also emphasised the greater ease and speed of the correction and feedback process, especially for answers to objective questions, highlighted the good receptivity of their students, and considered themselves overall quite satisfied with the experience.
The analysis of the collected data and its triangulation allowed us to conclude that the delivery of final exams and global e-folios using the MT tool was evaluated as very positive by both students and teachers, and both groups were in favour of continuing and expanding this pilot experience.
Funded by the European Union. Views and opinions expressed are however those of the author(s) only and do not necessarily reflect those of the European Union or the European Education and Culture Executive Agency (EACEA). Neither the European Union nor EACEA can be held responsible for them.