Experiences of conducting online exam proctoring in low-resource settings: a Sri Lankan case study


Dilky N. Felsinger

B. Sc (Hons) in Computer Science (Reading)

University of Colombo School of Computing, Colombo, Sri Lanka

Thilina Halloluwa

Doctor of Philosophy, Senior Lecturer, Department of Information Systems Engineering, University of Colombo School of Computing, Colombo, Sri Lanka

C. L. Ishani Fonseka

B.Sc (Hons) in Computer Science, Lecturer, Department of Information Systems Engineering, University of Colombo School of Computing, Colombo, Sri Lanka

Abstract

Online tests are becoming increasingly common as online learning gains popularity. Online assessments are more prone to academic dishonesty than traditional exams, and institutions have used ad hoc online proctoring techniques to prevent academic misconduct during online tests. In contexts with unreliable infrastructure, such as slow internet connections and a lack of compatible equipment, it is challenging to work out how to employ technology to proctor online tests swiftly and efficiently. The goal of this study was to gain a better understanding of the context and requirements for scaling up online exams in Sri Lanka and to investigate what options exist for improving the subsequent deployment of online proctoring systems in low-resource settings in order to maintain the quality of academic programs. In this work, an in-class assignment for a Software Engineering course at a higher education institution in Sri Lanka was manually proctored using video conferencing software in a completely remote context. A post-exam feedback survey was used to collect information about participants' experiences with online proctoring: the devices used, internet connectivity, the factors that led to stress, and perceptions about cheating during the exam. The data obtained from both the assignment and the survey were analyzed in order to make well-informed decisions about the future deployment of online proctoring technologies. The evidence presented in this paper strongly suggests that, to unlock the true potential of e-learning technologies, online exam proctoring solutions should be developed only after a careful examination of the constraints of the environment in which they will be used, such as internet connection speed, stability and affordability, and sudden power outages.

Keywords: Online Proctoring; Online Education; Online Exams.

INTRODUCTION

Global adoption of online learning has increased rapidly in the recent past [1]. Access to quality learning material from world-renowned universities (e.g., [2], [3]) and the ability to learn at one's own pace by engaging in Massive Open Online Courses (MOOCs) are some of the benefits of online learning. Despite these advantages, online learning faces challenges in conducting online exams while ensuring academic integrity and protecting the credibility of programs [1], [4]. Academic dishonesty is more common in online tests than in physical exams, since any material a candidate might need is available with a single click or a voice command [5]. Siddhpura & Siddhpura [6] identify cheating, collusion, impersonation, contract cheating and plagiarism as the possible types of academic misconduct in online exams. Gazing away from the screen, talking, using unauthorized materials, positioning oneself out of the view of the webcam and frequently switching the active window can be identified as behaviour associated with academic misconduct in online exams [7]. While using a false identity and consulting outside sources are two of the most frequent cheating strategies, candidates nevertheless come up with new strategies on a regular basis, and there are ways to abuse internet access or navigation privileges despite careful monitoring [8].

Several proctoring services provide specialized proctoring software that helps instructors conduct online examinations [9]. Examination candidates are required to have a webcam and an internet connection to access the proctoring platform and connect with well-trained proctors. These services offer high reliability at prices calculated per session, which becomes very costly when the requirement is to conduct a mass-scale examination.

Three types of remote proctoring have been documented [10]. The first is live proctoring, which is very similar to a traditional in-person proctored exam but requires a proctor with higher technological proficiency: a real-time human proctor monitors the process remotely (for example, by watching the students live online via Zoom). The second type is a video recording of the test taker that is later watched and judged by a human proctor. The third kind involves automation, in which the technology records the exam and detects and flags instances of potential student misconduct using complex systems that combine low-level features into high-level features through artificial intelligence, image processing algorithms, biometric technologies and cloud technologies [11], [12]. Many studies describe an Automated Online Exam Proctoring system as containing two main components: exam candidate authentication and detection of academic misconduct [12], [13]. In most studies addressing the problems of online proctoring, priority has been given to user authentication, and successful methods have been implemented to correctly identify the test taker using technologies such as biometrics and continuous authentication [14], [15]. To detect academic misconduct, previous work has utilized gaze estimation, head pose detection, keystroke metrics, mouse movement data and similar signals [16], [17].

The authors of [16] proposed a Visual Analytics system that consists of two major modules: a suspected-case detection engine and a visualization. The detection engine analyses head poses and mouse movements to determine abnormal behaviour of an exam candidate. The focus of [16] has been improving the reliability and convenience of exam reviewers by effectively visualizing the results of the detection engine at three levels of detail: reviewers can use the Student List View, Question List View and Behaviour View, along with a playback facility, to catch wrongdoers. However, the experiments of the study were conducted on a device with an Nvidia Titan Xp GPU, which is an expensive computational device. It is therefore questionable whether the proposed system could be used in low-resource environments, even if rebuilt to run on a CPU.
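
To make the underlying idea of such detection engines concrete, the following is a minimal illustrative sketch, not the system from [16]: it flags intervals in which an estimated head-yaw series stays outside a tolerated range for a sustained period. The input format, threshold values and function name are assumptions for illustration only.

```python
# Minimal illustrative sketch (not the system described in [16]):
# flag exam-time intervals where the estimated head yaw stays outside
# a tolerated range for longer than a given duration.
# Thresholds and the input format are assumptions for illustration.

from typing import List, Tuple

def flag_suspicious_intervals(
    head_yaw_series: List[float],   # estimated yaw angle (degrees) per video frame
    fps: float = 1.0,               # frames per second of the analysed stream
    max_abs_yaw: float = 30.0,      # yaw beyond this is treated as "looking away"
    min_duration_s: float = 5.0,    # sustained deviations shorter than this are ignored
) -> List[Tuple[float, float]]:
    """Return (start_s, end_s) intervals of sustained off-screen gaze."""
    intervals, start = [], None
    for i, yaw in enumerate(head_yaw_series):
        if abs(yaw) > max_abs_yaw:
            if start is None:
                start = i
        else:
            if start is not None and (i - start) / fps >= min_duration_s:
                intervals.append((start / fps, i / fps))
            start = None
    if start is not None and (len(head_yaw_series) - start) / fps >= min_duration_s:
        intervals.append((start / fps, len(head_yaw_series) / fps))
    return intervals

# Example: one frame per second, candidate looks away for ~8 s starting at t = 3 s.
if __name__ == "__main__":
    yaws = [0, 5, -3] + [45] * 8 + [2, 0]
    print(flag_suspicious_intervals(yaws))   # [(3.0, 11.0)]
```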

The authors of [17] proposed a method for creating summaries of pre-recorded videos from sequences of abnormal behaviour of exam candidates, modelled using a two-state hidden Markov model with head poses as observations. The video summaries, generated using a skim-generation technique, aid the post-examination review: invigilators save time by reviewing the summaries rather than videos that are hours long. Although these proposed systems address the scalability of e-proctoring in large-scale exams, the operational needs of their solutions are not explicitly stated, so it is uncertain whether they can be applied in low-resource settings.
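
As an illustration of the modelling idea only, the sketch below decodes a two-state hidden Markov model ("normal" vs "abnormal") over discretised head-pose observations with the Viterbi algorithm; all probabilities are illustrative assumptions rather than values from [17].

```python
# Minimal sketch of the modelling idea in [17]: a two-state hidden Markov model
# ("normal" vs "abnormal" behaviour) whose observations are discretised head poses
# (0 = facing the screen, 1 = looking away). Viterbi decoding recovers the most
# likely state sequence, from which abnormal segments could be skimmed.
# All probabilities below are illustrative assumptions, not values from the paper.

import numpy as np

states = ["normal", "abnormal"]
start_p = np.array([0.9, 0.1])
trans_p = np.array([[0.95, 0.05],     # P(next state | normal)
                    [0.20, 0.80]])    # P(next state | abnormal)
emit_p = np.array([[0.9, 0.1],        # P(obs | normal):   mostly facing the screen
                   [0.2, 0.8]])       # P(obs | abnormal): mostly looking away

def viterbi(obs):
    """Most likely hidden-state sequence for a list of 0/1 observations."""
    T, N = len(obs), len(states)
    logv = np.full((T, N), -np.inf)
    back = np.zeros((T, N), dtype=int)
    logv[0] = np.log(start_p) + np.log(emit_p[:, obs[0]])
    for t in range(1, T):
        for j in range(N):
            scores = logv[t - 1] + np.log(trans_p[:, j])
            back[t, j] = int(np.argmax(scores))
            logv[t, j] = scores[back[t, j]] + np.log(emit_p[j, obs[t]])
    path = [int(np.argmax(logv[-1]))]
    for t in range(T - 1, 0, -1):
        path.append(back[t, path[-1]])
    return [states[s] for s in reversed(path)]

# Example: a burst of "looking away" observations is decoded as an abnormal segment.
print(viterbi([0, 0, 1, 1, 1, 1, 0, 0]))
```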

Muzaffar et al. [18] have reported that developing countries contribute more to online exam research than developed countries. Despite this higher contribution, developing countries have often failed to implement their solutions due to unstable infrastructure, so the proposals have remained limited to research publications. Proctoring solutions that are operational in real environments [12], [13] come from developed countries such as the USA and Spain, with stable infrastructure and high internet speeds. Muzaffar et al. [18] further report that economic conditions and existing e-learning infrastructure are highly important for the adoption of online exams in a country, and recommend considering network infrastructure, hardware requirements, implementation complexity and training requirements when designing feasible online exam solutions.

Institutions have tried to implement affordable ad hoc online proctoring methods to deter academic misconduct in online exams. The study in [19] used YouTube to livestream the footage from two web cameras to proctor a student during an examination. The authors discourage using generic video streaming services to proctor online exams because this relies heavily on a third party and raises privacy concerns [19]. Many universities have adopted remote exam proctoring through video-conferencing software [20], [21]. This method is convenient and affordable because it does not require additional hardware or software. However, when thousands of students take an online test simultaneously, manually monitoring each student's physical environment and everything the student does during the examination through video-conferencing software in order to detect fraudulent activity is a tedious task [22].

Table 1

Summary of related work

ProctorU
Nature of the proctoring method: manual, semi-automated or fully automated
Features: facial recognition; real-time audio analysis; prevention of the use of external applications; recording of audio, webcam and screen
Operational constraints of the solution discussed: No

ExamSoft
Nature of the proctoring method: semi-automated or fully automated
Features: facial recognition; prevention of website access and recording of web traffic; recording of audio, webcam and screen
Operational constraints of the solution discussed: No

LockDown Browser
Nature of the proctoring method: not applicable (browser lockdown tool)
Features: prevents students from opening unauthorized websites, copying and pasting text, taking screenshots, and printing during an assessment
Operational constraints of the solution discussed: No

[16]
Nature of the proctoring method: semi-automated (reviewers use the Student List View, Question List View and Behaviour View with a playback facility to catch wrongdoers)
Features: head pose estimation; behaviour modelling for mouse movements
Operational constraints of the solution discussed: No

[17]
Nature of the proctoring method: semi-automated (proctors review the video summaries)
Features: head pose estimation; video summarization (skim generation)
Operational constraints of the solution discussed: No

[19]
Nature of the proctoring method: manual (proctors watch the YouTube livestream)
Features: three-camera setup recording the surroundings, the candidate's face and his/her screen; livestreaming of the recordings via YouTube
Operational constraints of the solution discussed: Yes

Our method
Nature of the proctoring method: manual (proctors watch the recorded videos submitted by the students)
Features: three-camera setup using Zoom, recording the surroundings, the candidate's face and his/her screen
Operational constraints of the solution discussed: This method is used to determine the operational and environmental constraints of e-proctoring.

The goal of this study was to understand the context and requirements for scaling up online examinations in Sri Lanka, as well as to determine what could be improved in subsequent deployments of online proctoring systems to maintain academic program quality in low-resource settings. The considerable gaps in access to quality internet connections, up-to-date devices and study-friendly environments among students in Sri Lanka are barriers to the adoption of online learning [4]. Online proctoring mechanisms should therefore be put in place in such a way that they do not obstruct the education of students from low-resource environments. As shown in Table 1, neither the commercial solutions nor the research work explicitly address the operational feasibility of the proposed solutions.

The study consisted of an in-class assignment of a Software Engineering course that was proctored using Zoom video conferencing software. After completing the assignment, the participants were sent a post-exam feedback survey to report their experience with online proctoring. The study revealed to what extent the available tools can be used to conduct proctored online exams and the difficulties an exam candidate faces when completing an examination in an environment with unstable infrastructure. Finally, the findings were used to make informed decisions about the features of an online proctoring solution that would be well suited to the available infrastructure.

1. RESEARCH METHODS

Requirements for the proctoring of online exams

In onsite exams, students may attempt to send signals or exchange chits while looking at each other. Examiners in the face-to-face classroom therefore have to walk around, carefully observe the students, and warn them if they engage in any suspicious behavior. Remote online exams must meet certain requirements to be comparable to onsite exams in terms of ensuring academic integrity. Two main types of evidence should be obtained in online exams to ensure that academic integrity is not violated.

Is this the right candidate for the exam?

A big challenge in remote online exams is making sure that there is no room for contract cheating or impersonation. Institutions must correctly identify the exam candidate, verify that the candidate is a student of the institution, and make sure that the same candidate answers the exam from start to finish. There should therefore be evidence of continuous authentication from the start to the end of the exam.

What does the exam candidate do during the examination?

Unlike in onsite exams, exam candidates in remote online exams have many more opportunities to engage in academic misconduct. The activities of an exam candidate during an online exam are of two sorts. The first is the activities he or she performs in the physical learning place during the examination period, such as using a mobile phone to search for answers or seeking help from someone in the room. The second is the activities that happen on the device, such as accessing other computer applications including the internet browser, using functions like copying, pasting or printing, or running virtual machines, all of which could lead to academic misconduct. Video evidence recorded using a webcam and evidence of system activity during the examination should be obtained from the start to the end of the exam.
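
As one hedged illustration of how on-device evidence could be gathered, the sketch below periodically logs the names of running processes with the psutil library; the watch-list and logging interval are assumptions, and real proctoring tools additionally track the active window, clipboard and printing, which is beyond this sketch.

```python
# A hedged sketch of collecting on-device evidence: periodically log the names of
# running processes so an invigilator can later check for unauthorised applications.
# The watch-list and the logging interval are illustrative assumptions; real
# proctoring tools additionally monitor the active window, clipboard and printing,
# which is beyond this sketch.

import json
import time
import psutil   # pip install psutil

WATCH_LIST = {"chrome", "firefox", "vmware", "virtualbox", "teamviewer"}  # assumed examples

def snapshot(flag_only: bool = False) -> dict:
    """Return a timestamped snapshot of running process names."""
    names = sorted({(p.info["name"] or "").lower() for p in psutil.process_iter(["name"])})
    flagged = [n for n in names if any(w in n for w in WATCH_LIST)]
    return {
        "timestamp": time.strftime("%Y-%m-%dT%H:%M:%S"),
        "flagged": flagged,
        "processes": flagged if flag_only else names,
    }

def log_session(path: str = "activity_log.jsonl", interval_s: int = 30, duration_s: int = 3600):
    """Append one snapshot every interval_s seconds for the exam duration."""
    end = time.time() + duration_s
    with open(path, "a", encoding="utf-8") as f:
        while time.time() < end:
            f.write(json.dumps(snapshot(flag_only=True)) + "\n")
            f.flush()
            time.sleep(interval_s)

if __name__ == "__main__":
    print(snapshot(flag_only=True))
```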

There are challenges to overcome in designing a proctoring solution suitable for Sri Lankan universities. Class sizes are often between 100 and 200 students, and exams are conducted at a scheduled time for the whole class. The universities do not have enough staff members to offer live proctoring when exams go online, and publicly funded universities are not financially strong enough to procure third-party commercial proctoring services. The proctoring solution should use only common, widely available devices without imposing a financial burden on the student. The speed, stability and affordability of internet connections and sudden power outages are further problems students face in adapting to online exams.

Proctored online assessment

We used Zoom video conferencing software to proctor a compulsory assignment given to second-year computer science undergraduates following the Software Engineering module at the University of Colombo School of Computing, Sri Lanka. The exam consisted of a case study, and participants were allowed to use only the module textbook during the exam. Participants had to write their answers on paper within one hour, then scan and upload them to the Moodle course page. No live proctors were present; instead, the recorded video of the Zoom call had to be submitted together with the answer scripts. This practice follows the recommendation of the authors of [23], who advised a uniform online exam policy in which a camera recording each student's computer screen and room is required to deter cheating in online exams. The primary camera, the secondary camera and the screen recording are the three main components of the setup, shown in Figure 1. The hardware and software setup of the proctoring method, the procedure including the instructions given, and the observations are discussed below.

Figure 1. Proctoring Apparatus Setup

Setup

The requirements exam candidates had to meet were a webcam (either built-in or a peripheral device), a smartphone, access to Zoom video conferencing software and an internet connection with a standard quality. The setup of the proctoring method and the purpose of each component are as follows:

Primary Camera

In this setup, the built-in webcam of the exam candidate's laptop or a USB webcam is used as the primary camera to capture the candidate's face. It primarily provides evidence for candidate authentication. This camera does not provide enough information on what the student does in his/her physical exam space.

Secondary Camera

Since a built-in camera has a narrow field of view, a smartphone camera was used to obtain evidence of what the student does during the exam, of the workstation and of the surroundings. We instructed the participants to set up the smartphone at a distance where the workstation is clearly visible and a wider view of the surroundings is provided.

Screen Recording

We needed to obtain evidence of the activities happening on the participant's screen to fulfil all our requirements; specifically, we wanted to make sure that the participant used only the module textbook during the exam. The screen sharing feature of Zoom was used to record the activities on the screen of the candidate's device.

After setting up the hardware, we instructed the participants to create a Zoom meeting, keep both devices joined to the meeting, and share the screen. To count as successful proctoring evidence, the video recording should ideally contain both camera streams and the screen recording. No live proctor was present; the submitted videos were reviewed later.

Procedure

The exam proctoring procedure consists of three stages. The instructions the candidates had to follow are detailed below.

Before the exam

- Setup the laptop/desktop computer with a webcam to provide a clear view of the face, in a place with good lighting conditions.

- Join both the computer and the smartphone to the Zoom call with video on; unmute the microphone on only one of the two devices.

- Provide a 360-degree view of your workstation and surroundings using the smartphone.

- Place the smartphone at an appropriate distance to provide a clear view of the surroundings.

- Verify your identity by showing your University ID card in front of the primary camera.

- Show the papers you will write on to the primary camera to confirm that they are blank.

- Start screen sharing on Zoom. Use the `Screen' option to record all the windows open, not only a specific window.

- Access the case study through Moodle.

During the exam

- Write the answers on the paper shown earlier.

- Do not stop the videos from either of the devices or the screen recording until the end of the exam.

After the exam

- Show the answer scripts to the webcam, scan and upload them to the Moodle course page.

- End the Zoom meeting, save the recording to your device and upload it as well.

Moreover, we instructed the participants to provide information about the device they used for the exam and details of their network data consumption during the exam. The data usage monitoring software GlassWire was used to measure the data usage of the session: participants were instructed to select the time window of the exam in GlassWire and report the data usage that occurred in that period. A screenshot of the system information was requested to obtain device details such as the processor type and main memory capacity. The purpose of collecting this information was to understand the device types commonly available to students, which informs the requirements and constraints for developing a holistic examination system.
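
As a hedged aside, the same device and network-usage details could in principle be collected programmatically rather than via screenshots; the sketch below uses the standard platform module and the psutil library, and is an assumption for illustration, not part of the procedure used in this study.

```python
# A hedged alternative to manual screenshots: the same device details (processor,
# core count, total RAM) could be collected programmatically with the standard
# platform module and psutil. This was not part of the study procedure, which
# relied on GlassWire and screenshots; it is shown only as an illustration.

import platform
import psutil

def device_details() -> dict:
    return {
        "processor": platform.processor() or platform.machine(),
        "physical_cores": psutil.cpu_count(logical=False),
        "logical_cores": psutil.cpu_count(logical=True),
        "total_ram_gb": round(psutil.virtual_memory().total / 2**30, 1),
        "os": f"{platform.system()} {platform.release()}",
    }

def network_usage_snapshot() -> dict:
    """Cumulative bytes sent/received since boot; the difference between snapshots
    taken at the start and the end of the exam approximates session data usage."""
    io = psutil.net_io_counters()
    return {"bytes_sent": io.bytes_sent, "bytes_recv": io.bytes_recv}

if __name__ == "__main__":
    print(device_details())
    start = network_usage_snapshot()
    # ... exam session ...
    end = network_usage_snapshot()
    print({k: (end[k] - start[k]) / 2**20 for k in start})  # MB used during the session
```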

Post-exam feedback survey

A survey consisting of 16 questions was used to collect data about the participants' experience of the proctored exam. We sent the survey to all the participants regardless of whether they had consented to the use of their recorded videos for research purposes, because participants who withheld consent due to privacy concerns still preferred to answer an anonymous survey about their experience. The survey consisted of 13 default questions and 3 conditional questions shown depending on the answer to the previous question. The survey was anonymous, voluntary, and contained both closed- and open-ended questions. The questions were designed to capture experiences with the devices used and internet connectivity, the factors that led to stress, and perceptions about cheating during the exam. Of the 197 participants, 147 (74%) responded to the survey. The data required minimal cleaning, since all the participants had answered the questionnaire genuinely; only one record had to be removed due to duplication. Because of the exploratory nature of this study, the quantitative results are presented using summary statistics. The quantitative data was downloaded into a spreadsheet for analysis and the creation of data visualizations. The qualitative data from the surveys was downloaded into a spreadsheet and manually coded deductively according to the questions asked. Because the data was aligned with the survey questions, coding was relatively simple.
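
The kind of summary statistics used for the quantitative data can be reproduced with a few lines of pandas, assuming the responses were exported to a CSV file; the file name and column names below are assumptions for illustration, not the actual export format.

```python
# A minimal sketch of the summary statistics used for the quantitative survey data,
# assuming the responses were exported to a CSV file. The file name and the column
# names ("instructions_clear", "internet_issue", "device_issue") are assumptions for
# illustration, not the study's actual export format.

import pandas as pd

df = pd.read_csv("survey_responses.csv").drop_duplicates()  # one duplicate record was removed

# Distribution of 5-point Likert responses on the clarity of instructions (cf. Figure 2).
likert = df["instructions_clear"].value_counts().sort_index()
print((likert / len(df) * 100).round(1))

# Cross-tabulation of internet issues vs device issues (cf. Table 3).
print(pd.crosstab(df["internet_issue"], df["device_issue"], margins=True))
```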

2. THE RESULTS

Proctored Online Assessment

In our online exam, we collected 126 videos from those of the 197 participants who provided consent for research purposes. The length of a video varies from 45 to 90 minutes, as participants were allowed to submit early if they had finished all the questions. The videos capture the participants' behavior, the events happening in their surroundings and the materials they referred to during the exam.

Network Data

A total of 126 records of internet data usage, including upload and download volumes, were collected. Many of the records had to be discarded because they were inaccurate or did not provide the required information, leaving 67 usable records. Summary statistics of the time spent and the upload and download data usage computed from these 67 records are given in Table 2. Since two devices were connected to the Zoom call, data consumption occurred on both devices.

Table 2

Summary statistics of network data usage (67 usable records)

Average time spent: 52 minutes
Average data uploaded: 480 MB
Average data downloaded: 148 MB

Device Details

Device details such as processor type, clock speed and total physical memory (RAM) were collected. Only 49 records contained device information. Over 80% of the participants had either an Intel Core i5 or i7 processor, although this may not be the case across all faculties. The most common RAM capacity was 8 GB, but there were instances as low as 4 GB and as high as 32 GB.

Video Recordings

We were able to derive some interesting insights by manually examining the video data. Unavailability of a view does not always amount to academic misconduct; in this experiment, exam candidates had to deal with situations such as technical glitches in their electronic devices, and if a candidate had to get up and fix the secondary device because of a problem, going missing from the front view is not academic misconduct. The purpose of the secondary proctoring device is to observe the writing station and surroundings of the exam candidate, but most students did not have the facilities to fully serve this purpose: most of the time the device was not positioned in a way that gave a clear view of the candidate's surroundings. In some instances all three views were not available simultaneously, which may raise concerns for a proctor.

Upon examining the video records, it became clear that designing a solution based only on these observations would not be enough. Therefore, we used a post-exam feedback survey to better understand the challenges faced by the exam candidates during proctored online exams.

Post-exam Feedback Survey

In this section, we describe the survey findings, beginning with an explanation of the quantitative data and then moving on to the qualitative data.

Quantitative Data

First, we wanted to know whether the participants had clearly understood the instructions we provided regarding the assignment, the proctoring setup and the procedure. Participants' agreement or disagreement with that statement (n = 146) was gauged on a 5-point Likert scale (see Figure 2). Fifty-three participants agreed with the statement and, in total, 52% of the participants were clear on the instructions given, while 30% neither agreed nor disagreed. Only 17% said that they did not clearly understand the instructions.

Figure 2. Participants' responses to the statement “The instructions about the assessment format and proctoring setup were communicated clearly.”

The proportion of participants who had issues with their devices was close to 50% (see Figure 3), and 66% of the participants had issues with their internet connection (see Figure 4).

Figure 3. Participants' responses to “Did you have any issues with the device (e.g. computer, laptop, smartphone) during the assessment?”

Figure 4. Participants' responses to “Did you have any issues with your internet connection during the assessment?”

According to the data in Table 3, 46 of the 146 participants (31.5%) had issues with both the internet connection and their devices. A further 34.9% had issues with the internet connection only, while 20.5% had issues only with the devices they used. Only 13% of the participants reported that they did not encounter any issues with either the internet connection or the devices. In other words, about 87% of the participants had at least one issue, an alarming indication of the practical problems of this proctoring setup (a short arithmetic check is given after Table 3).

Table 3

Summary of counts of participants who had issues with the internet connection and with their devices during the assessment (n = 146)

                                      Device issues: Yes    Device issues: No    Total
Internet connection issues: Yes       46 (31.5%)            51 (34.9%)           97
Internet connection issues: No        30 (20.5%)            19 (13.0%)           49
Total                                 76                    70                   146
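
The percentages above follow directly from the counts in Table 3; a quick arithmetic check in plain Python, with the values taken from the table:

```python
# Arithmetic check of the shares reported for Table 3 (n = 146).
n = 146
both, internet_only, device_only, neither = 46, 51, 30, 19

def share(k: int) -> float:
    return round(100 * k / n, 1)

print(share(both))           # 31.5
print(share(internet_only))  # 34.9
print(share(device_only))    # 20.5
print(share(neither))        # 13.0
print(share(n - neither))    # 87.0 -> participants who had at least one issue (127 of 146)
```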

Qualitative Data

Findings relating to problems with internet connections, issues with the devices used, factors that led to stress during the exam, and factors that made cheating easy or hard during the assessment, obtained as responses to the open-ended questions, are discussed below.

Issues regarding internet connectivity

Participants faced several types of trouble during the exam due to low bandwidth, instability and sudden loss of the internet connection. The most common issue was having to reconnect both devices to the Zoom session after a sudden disruption of the internet connection. One respondent mentioned, “I lost my connection when trying to access the internet with two devices and had to record again and again”.

We noticed participants leaving their seats to adjust their smartphones in some instances, as one comment illustrates:

“The need to use two devices to log in to the meeting several times displayed that my network bandwidth is unstable and on one occasion my mobile phone was disconnected. After I noticed that, I went back and re-connected it and then continued doing the assignment.”

We can therefore conclude that the instability of the internet connection is one of the reasons for the unavailability of certain views: when the connection is unstable, Zoom automatically turns off the video and the participant may not be aware of it. Many students reported that the internet connection was often unstable unless it was a fiber connection, which is only available in urban areas. In cases of power outages, participants tried to connect the devices via a mobile hotspot. One comment regarding mobile data signals is given below:

“It's really hard to manage our bandwidth if we don't have a fiber connection. Else if we got a power cut it's hard to connect via mobile hotspot, during power cuts 4G connection becoming to 2G or 3G (but the bandwidth is really low at that time)”

Another issue that a considerable number of participants faced was difficulty in uploading the recorded video within the given time limit. One participant commented, “Connection was not stable, and it took me more than 6 hours to upload the assessment videos”. The size of a video file ranged between 150 MB and 220 MB, and this was a problem that could not be seen from the videos themselves. Some candidates could also use this issue as an excuse to tamper with the videos. Hence, there should be a mechanism that reports the candidate's behaviour as soon as the exam is completed, whether the video is submitted or not.

Issues regarding devices used

Poor battery health in both laptops and smartphones was the major concern participants had regarding the devices. Using Zoom on a smartphone for more than an hour proved problematic for many participants. Some responses regarding issues caused by poor battery health are given below.

“My phone is having a low RAM and when the camera is turned on the battery drains first. Luckily, I was able to finish the assessment before it died... phone storage space is also very low... I had to delete several necessary applications in order to download zoom to my mobile phone ”

“My mobile phone runs out of battery very fast, so it is difficult to stay in zoom for around 1 hour of time without charging. ”

“Smartphone storage full and Battery is weak, so I have to keep the charger connected during the assessment, which makes the mobile heated. ”

Other than battery health issues, broken built-in webcams, echo in microphones and technical glitches in laptops were among the issues reported regarding the devices. For the above reasons, participants were not in favor of using the smartphone as a proctoring device for a prolonged time.

Factors that led to stress during the assessment

The time constraints of the assessment, the feeling of being recorded and the fear of the technical issues discussed above added stress and anxiety during the exam that would not arise in an exam hall. 57.8% of the participants strongly agreed with the statement that this proctored assessment was more stressful than a traditional invigilated exam.

Several respondents reported that there were many other factors to pay attention to, on top of answering the questions, which is not the case in an exam hall: “Unlike an exam centre, our home environment is not peaceful, it's too noisy, which makes it more difficult to concentrate in an exam. When we are in an examination hall, there is less probability that external factors affect our results, only the knowledge we gain by studying will affect our exam results. But when it is online, other external factors affect our results, like electricity, or internet connection, our home environment. In an exam centre we only need to worry "how to answer the questions", but online we also worry what will happen when other external factors affect the exams, thus we lose our concentration. Trust me I want to be supportive and help the academics to conduct the exam successfully, but as these factors disturb and take away my concentration, I'm not able to be supportive, I'm sorry about that. ”

The participants said that external factors such as noisy environments, interruptions by family members and device malfunctions made it difficult for them to concentrate on answering the questions; they did not have a peaceful exam environment at home. For example, one commented, “There were a few disturbances such as background noises or neighbors interrupting by knocking on door. Additionally, there wasn't a comfortable place to do the exam at the house since I have no separate room. Anyone could come to the house at the time and interrupt what I was doing.”

The major stress factor of this assessment was the fear of being accused of cheating when facing a technical issue. A few responses that resonate with this are listed below.

“I had no proper place to set up my back camera so had to tape it to a wall - and it fell off multiple times. I had to check it constantly and reposition it multiple times ”

“If there is a power cut during an online exam, we do not know how it will affect our exam, and how do we prove it to examiners? And if the power comes again, will it give us back the lost time? If so, how? These factors are stressing me. ”

From the above statements, it is evident that the participants were worried about proving their innocence to the invigilators if they encountered an issue during the assessment.

Furthermore, many participants felt that the need to constantly check on the proctoring setup reduced their attention to answering the questions. Some examples are given below:

“Continuously I have to check out the devices are working well or not and also if one of the devices or both disconnected then I have to re-connect. That is time consuming. So those are stressful things when it comes to an exam. ”

“It increased the stress since many uncontrollable factors contributed to the final output other than our performance. Such as device performance, battery lifetime of the device, network connection, meeting the deadline with uploading speed and everything had to be looked into during the assessment and that reduced the focus on the assessment. ”

Figure 5. Student responses to the statement: “The online assessment made cheating ...”

Factors that made engaging in misconduct easy or hard

Before asking for opinions on cheating in a proctored exam setup, we asked the respondents to rate whether the proctoring setup made cheating easy or hard on a 5-point Likert scale (see Figure 5). 82 of the 146 respondents said it was neither hard nor easy, and only 19 of the 146 believed that cheating was easy in this proctored assessment. According to the findings of [24], a lack of clarity in examination instructions regarding what resources might be used leads to confusion over what constitutes cheating; therefore, we explicitly stated that participants could use only the module textbook for the assessment.

The participants mentioned that, because of the time constraints and the proctoring setup, they did not have enough time to consider cheating: “I do not think I had the proper motivation to even attempt the assignment successfully given the conditions and the stress. That alone discouraged any attempts to score any higher than I could at my best via cheating in anyway”. This aligns with the findings of [24]. However, one respondent who felt that cheating was somewhat easy in this setup stated that “examiners have to go through every second in every recording to find out if any cheating happens”. It is evident that some students consider the difficulty of manually reviewing videos that are hours long to be an opportunity to cheat.

DISCUSSION

This study concludes that it is impractical to develop a fully automated online proctoring system in the current context. The best-suited proctoring method is semi-automated proctoring that reports the actions of an exam candidate during an exam session; after examining these reports, the final decision should be taken by invigilators. Moreover, there should be a fine balance between deterring academic misconduct through proctoring and the inconvenience the proctoring causes students. Our findings show that a complex proctoring setup is itself a factor that affects students' performance during an exam, regardless of academic misconduct. Academics should exercise caution when creating proctoring systems, especially in developing nations, where the available resources make it difficult to map online exams completely onto traditional exams. Based on these conclusions, we advise adding the following considerations to the typical features of an online proctoring system for low-resource settings.

The main cause of most of the issues faced by the students was the secondary proctoring device (the mobile phone). Setting up a secondary device provides some useful information, but the impediments of having to set it up exceed the advantages: the two-device setup doubles the network data consumption and doubles the work of the invigilators, because they must review two videos per candidate. The main device issue reported was the battery drainage of the mobile phones, and candidates said that they had to pay extra attention to keeping the mobile phone properly connected. An ideal solution should therefore eliminate the need for the two-camera setup.

The difficulties candidates face because of unstable internet connections are the next important problem to address when designing online proctoring solutions. A solution should depend minimally on internet connectivity. Identifying the performance constraints of a proposed solution, to determine whether on-device proctoring is feasible on common devices, is another necessary step. Such on-device proctoring would enable an offline mode in which the internet is required only at the start of the exam and at the end for submission purposes. Once the candidate is authenticated, the system should run offline and submit a behaviour report as soon as the exam is over, instead of sending a bulky video, which reduces the upload time. If an offline mode is not achievable, an ideal solution should at least include reports about internet connectivity: if the connection drops, the timestamps of the disconnected period should be recorded so that the student can continue answering when the connection becomes stable again without losing writing time. If a behaviour report is sent instead of uploading the video, the video must be saved on the device until an examiner requests it after inspecting the behaviour report (a minimal sketch of such a report is given below). Another possible feature that could narrow the gap in access to e-learning would be extending the solution to mobile phones and tablets, since many students in Sri Lanka reportedly use mobile devices to engage in online learning activities [4].
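
The following is a minimal sketch of the behaviour report idea described above, assuming a JSON-based format; the event types, field names and the incremental-logging design are illustrative assumptions made for this sketch, not a specification produced by the study.

```python
# A minimal sketch of the behaviour report proposed above, assuming a JSON-based
# format. The event types, field names and thresholds are illustrative assumptions,
# not a specification produced by this study. The report is written incrementally on
# the device during the exam (offline) and uploaded at the end instead of a bulky video.

import json
import time
from dataclasses import dataclass, field, asdict
from typing import List, Optional

@dataclass
class Event:
    timestamp: str          # local time, ISO-8601
    kind: str               # e.g. "face_missing", "looking_away", "disconnected", "reconnected"
    detail: str = ""

@dataclass
class BehaviourReport:
    candidate_id: str
    exam_id: str
    started_at: str = field(default_factory=lambda: time.strftime("%Y-%m-%dT%H:%M:%S"))
    events: List[Event] = field(default_factory=list)
    disconnect_started: Optional[float] = None
    lost_seconds: float = 0.0   # total disconnected time, to be credited back to the candidate

    def log(self, kind: str, detail: str = "") -> None:
        self.events.append(Event(time.strftime("%Y-%m-%dT%H:%M:%S"), kind, detail))

    def connection_lost(self) -> None:
        self.disconnect_started = time.time()
        self.log("disconnected")

    def connection_restored(self) -> None:
        if self.disconnect_started is not None:
            self.lost_seconds += time.time() - self.disconnect_started
            self.disconnect_started = None
        self.log("reconnected")

    def save(self, path: str) -> None:
        with open(path, "w", encoding="utf-8") as f:
            json.dump(asdict(self), f, indent=2)

# Example usage: the report is small enough to submit immediately after the exam,
# while the full video stays on the device until an examiner requests it.
report = BehaviourReport(candidate_id="2020cs001", exam_id="SE-assignment-1")
report.log("authenticated", "ID card shown to primary camera")
report.connection_lost()
report.connection_restored()
report.save("behaviour_report.json")
```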

CONCLUSIONS AND PROSPECTS FOR FURTHER RESEARCH

Along with the rise of online education, online examination is becoming more popular. However, it remains unclear how to use technology to proctor online tests simply and efficiently with limited resources. Our findings strongly suggest that online exam proctoring solutions should be developed only after carefully examining the constraints of the environment in which they are intended to operate. Based on the outcomes of this study, future research could look at a variety of topics in e-exam proctoring, such as offline test proctoring, live transmission of action-detection records, or securely saving the behaviour report until the exam is completed. Integrating these functions into a full online examination system would aid the seamless operation of online education in contexts where internet connectivity and infrastructure are unreliable. To scale the availability of trustworthy online exam processes, possible future work includes developing e-proctoring solutions that can operate on a variety of devices such as mobile phones and tablets, which would be highly beneficial in facilitating online education for low-income segments of society.

REFERENCES (TRANSLATED AND TRANSLITERATED)

[1] K. A. A. Gamage, E. K. de Silva, and N. Gunawardhana, “Online delivery and assessment during COVID-19: Safeguarding academic integrity,” Educ Sci (Basel), vol. 10, no. 11, pp. 1-24, 2020, doi: 10.3390/educsci10110301.

[2] edX Inc., “Schools and Partners | edX,” edX Inc., 2019. https://www.edx.org/schools-partners (accessed Jun. 06, 2021).

[3] “Global Freshman Academy Start Earning College Credit | edX.” https://www.edx.org/gfa?track=gfa-banner (accessed Jun. 06, 2021).

[4] R. Hayashi, M. Garcia, A. Maddawin, and K. P. Hewagamage, “Online Learning in Sri Lanka's Higher Education Institutions during the COVID-19 Pandemic,” vol. 5, no. 151, p. 12, 2020.

[5] M. Amzalag, N. Shapira, and N. Dolev, “Two Sides of the Coin: Lack of Academic Integrity in Exams During the Corona Pandemic, Students' and Lecturers' Perceptions,” J Acad Ethics, pp. 1-21, Apr. 2021, doi: 10.1007/s10805-021-09413-5.

[6] A. Siddhpura and M. Siddhpura, “Plagiarism, Contract Cheating And Other Academic Misconducts In Online Engineering Education: Analysis, Detection And Prevention Strategies,” in 2020 IEEE International Conference on Teaching, Assessment, and Learning for Engineering (TALE), Dec. 2020, pp. 112-119. doi: 10.1109/TALE48869.2020.9368311.

[7] “Academic Misconduct for Online Exams | Office of Teaching, Learning & Technology.” https://teach.uiowa.edu/academic-misconduct-online-exams (accessed Jul. 13, 2021).

[8] “Ten Clever Ways Students Cheat In Online Proctored Exams and How To Prevent Them.” https://blog.mettl.com/cheating-in-online-exams/ (accessed Jul. 15, 2021).

[9] S. Arno, A. Galassi, M. Tommasi, A. Saggino, and P. Vittorini, “State-of-the-art of commercial proctoring systems and their use in academic online exams,” International Journal of Distance Education Technologies, vol. 19, no. 2, pp. 41-62, 2021, doi: 10.4018/IJDET.20210401.oa3.

[10] M. J. Hussein, J. Yusuf, A. S. Deb, L. Fong, and S. Naidu, “An Evaluation of Online Proctoring Tools,” Open Praxis, vol. 12, no. 4, p. 509, 2020, doi: 10.5944/openpraxis.12.4.1113.

[11] Y. Atoum, L. Chen, A. X. Liu, S. D. H. Hsu, and X. Liu, “Automated Online Exam Proctoring,” IEEE Trans Multimedia, vol. 19, no. 7, pp. 1609-1624, 2017, doi: 10.1109/TMM.2017.2656064.

[12] M. Labayen, R. Vea, J. Florez, N. Aginako, and B. Sierra, “Online Student Authentication and Proctoring System Based on Multimodal Biometrics Technology,” IEEE Access, vol. 9, pp. 72398-72411, 2021, doi: 10.1109/ACCESS.2021.3079375.

[13] S. Idemudia, “A Smart Approach of E-Exam Assessment Method Using Face Recognition to Address Identity Theft and Cheating,” vol. 14, no. 10, pp. 515-522, 2016.

[14] H. S. G. Asep and Y. Bandung, “A Design of Continuous User Verification for Online Exam Proctoring on M- Learning,” Proceedings of the International Conference on Electrical Engineering and Informatics, vol. 2019- July, no. July, pp. 284-289, 2019, doi: 10.1109/ICEEI47359.2019.8988786.

[15] Y. Khlifi and H. A. El-Sabagh, “A novel authentication scheme for E-assessments based on student behavior over E-learning platform,” International Journal of Emerging Technologies in Learning, vol. 12, no. 4, pp. 6289, 2017, doi: 10.3991/ijet.v12i04.6478.

[16] H. Li, M. Xu, Y. Wang, H. Wei, and H. Qu, “A Visual Analytics Approach to Facilitate the Proctoring of Online Exams,” vol. 17, no. 21, 2021, doi: 10.1145/3411764.3445294.

[17] M. Cote, F. Jean, A. B. Albu, and D. Capson, “Video summarization for remote invigilation of online exams,” in 2016 IEEE Winter Conference on Applications of Computer Vision, WACV 2016, May 2016. doi: 10.1109/WACV.2016.7477704.

[18] A. W. Muzaffar, M. Tahir, M. W. Anwar, Q. Chaudry, S. R. Mir, and Y. Rasheed, “A systematic review of online exams solutions in e-learning: Techniques, tools, and global adoption,” IEEE Access, vol. 9, pp. 32689-32712, 2021, doi: 10.1109/ACCESS.2021.3060192.

[19] N. Chotikakamthorn and S. Tassanaprasert, “Affordable Proctoring Method for Ad-hoc Off-campus Exams,” in SIGITE 2020 - Proceedings of the 21st Annual Conference on Information Technology Education, Oct. 2020, pp. 266-272. doi: 10.1145/3368308.3415421.

[20] “Proctoring a Closed-Book Exam in Zoom | Department of Government.” https://gov.harvard.edu/proctoring- closed-book-exam-zoom (accessed Jun. 06, 2021).

[21] “Proctoring a Live Online Exam with Zoom - NIU - Center for Innovative Teaching and Learning.” https://www.niu.edu/citl/resources/guides/proctoring-with-zoom.shtml (accessed Jun. 06, 2021).

[22] S. Manoharan and X. Ye, “On Upholding Academic Integrity in Online Examinations,” in 2020 IEEE Conference on e-Learning, e-Management and e-Services (IC3e), Nov. 2020, pp. 33-37. doi: 10.1109/IC3e50159.2020.9288468.

[23] E. Bilen and A. Matros, “Online cheating amid COVID-19,” J Econ Behav Organ, vol. 182, pp. 196-211, Feb. 2021, doi: 10.1016/j.jebo.2020.12.004.
