Comprehensive licensure review and adaptive quizzing assignments for enhancement of end-of-programme exit examination scores in Saudi Arabia: a quasi-experimental study


BMJ Open. 2023; 13(7): e074469.

Published online 2023 Jul 12. doi:10.1136/bmjopen-2023-074469

PMCID: PMC10347515

PMID: 37438057

Original research

Monir M Almotairy,1 Adnan Innab,1 Naji Alqahtani,1 Ahmed Nahari,2 Reem Alghamdi,3 Hamza Moafa,4 and Dalal Alshael1



Abstract

Objective

This study explores the effectiveness of a comprehensive licensure review and adaptive quizzing assignments intervention in improving the performance of undergraduate senior nursing students on the end-of-programme exit exam.

Design

A quasi-experimental single-group design was used to compare pretest and post-test scores through computerised adaptive tests.

Setting

The setting was a nursing college in Saudi Arabia.

Participants

The study included 292 senior nursing students enrolled in the Bachelor of Science in Nursing programme.

Intervention

A comprehensive licensure review bundled with adaptive quizzing assignments was delivered over 15 weeks in the academic year 2021–2022. The bundle was guided by the elaboration theory, and it included 3-hour synchronous lectures every week and 23 adaptive quizzing assignments that covered weekly content.

Primary and secondary outcome measures

Students’ mastery scores and the percentage of correct answers were the primary and secondary measures, respectively. Both measures were collected in the pretest and post-test (exit examination). Additionally, demographic characteristics were collected in the pretest using an online survey.

Results

The overall mean of the mastery score was statistically significantly higher in the exit exam (M=2.51, SD=1.70) than in the pretest (M=1.45, SD=0.44; p<0.001). Although the overall mean of the mastery score in the exit exam did not reach the cut-off score, students who demonstrated the required knowledge and satisfactory performance in the pretest achieved a mastery score above the cut-off. The percentage of correct answers was statistically significantly higher in the exit exam (M=58.59%, SD=9.50) than in the pretest (M=49.32%, SD=9.78; p<0.001). A statistically significant difference in students’ performance based on gender, age and grade point average was observed.

Conclusions

A comprehensive licensure review and adaptive quizzing assignments intervention bundle fostered the performance of undergraduate nursing students in the end-of-programme exit exam.

Keywords: medical education & training, education & training (see medical education & training), health education

STRENGTHS AND LIMITATIONS OF THIS STUDY

  • The study utilised computerised adaptive tests along with a comprehensive licensure review for senior nursing students in Saudi Arabia.

  • The intervention bundle was designed based on the blueprints of the National Council Licensure Examination for Registered Nurses and Saudi Nursing Licensing Exam.

  • The study lacked a control group, and all participants were recruited from one university, which could limit the generalisability of the findings.

Introduction

The Saudi Commission for Health Specialties periodically announces the performance of students from different healthcare specialties in national licensure exams. The licensure exams ensure that applicants have acquired the necessary knowledge, skills and competencies to safely provide care that meets predetermined nursing standards.1 2 Thus, the exams assure the public that nurses meet the criteria to practice.1

The Saudi Nursing Licensing Exam (SNLE) is a computer-based non-adaptive test used to assess nurses’ ability to provide safe and effective nursing care in Saudi Arabia.3 Nursing students from 39 public and private nursing school programmes across the country have shown a wide disparity in passing scores in the SNLE. The pass rates for nursing students have declined from 98% in 2017 to 66% in 2023, clearly indicating a need to improve educational preparation in prelicensure/undergraduate programmes.4 The SNLE pass rate is a widely used method of assessing nursing programme effectiveness.5 A high pass rate for first-time licensure exam takers is considered a major indicator of nursing programme quality.6 7 Thus, implementing measures to help reduce students’ performance discrepancies in the licensure exam is vital.

A low pass rate can negatively impact not only students but also the nursing profession.8 Failing the licensure exam may delay nursing students’ entry into the nursing profession, thus reducing the number of newly graduated nurses joining the healthcare workforce.9 The experience of failure can be damaging to students and may cause low self-esteem, depression, isolation, and the stigma of being unsuccessful.8 10–13 Failure may increase the financial burden on newly graduated nurses due to reduced employment opportunities.10 13 Therefore, nursing students must receive the necessary preparation to succeed in the licensure exam.

Several studies have reported various strategies and activities to improve recent graduates’ pass rates in licensure exams.14 15 For example, standardised testing, review courses, test-taking strategies and coaching led by faculty members were provided to senior nursing students in the USA to improve pass rates in the National Council Licensure Examination for Registered Nurses (NCLEX-RN).16–18 Similarly, a focused remediation programme was implemented for students at risk of failing the NCLEX-RN, which resulted in a significant improvement in a comprehensive predictor exam employed to evaluate students’ readiness for the NCLEX-RN in the USA.14 19

In Saudi Arabia, Almarwani20 developed a review course aimed at enhancing students’ passing rates in the SNLE and found that the course improved students’ performance on an SNLE-style multiple choice test developed by a faculty member. However, to the best of our knowledge, no previous study in Saudi Arabia has evaluated the impact of a comprehensive licensure review bundled with adaptive quizzing assignments on students’ performance in end-of-programme exit exams delivered using a computerised adaptive testing approach. Such adaptive exit exams provide individualised performance evaluations to determine students’ strengths and weaknesses that impact their potential to pass nursing exit exams and, ultimately, licensure exams. Therefore, this study explored the effectiveness of a comprehensive licensure review bundled with adaptive quizzing assignments to improve students’ performance in the end-of-programme exit examination in an undergraduate nursing programme in Saudi Arabia.

Theoretical framework

Our study was guided by the elaboration theory, developed by Reigeluth and colleagues in 1983, which proposes that the design of educational content must follow a sequence from simple to complex.21 Its basic premise is a shift in teaching style from a teacher-centred to a learner-centred approach.22 The instructor organises and delivers the educational material to suit the learners’ needs and help achieve the desired learning outcomes.21 This theory also values a holistic sequence of instruction that can improve students’ motivation to learn. Thus, we designed the comprehensive licensure review and adaptive quizzing assignments bundle to provide educational content that can expand nursing students’ knowledge and analytical and cognitive skills. These skills are crucial to delivering safe and high-quality care.

The educational content delivered in this study incorporated the blueprint and structure of the NCLEX-RN and SNLE tests. Moreover, based on the elaboration theory, the intervention employed adaptive quizzing activities as a motivational tool for students to (1) learn from example questions, which helps them create a mental connection between new information and prior knowledge; (2) use analogies by elaborating on new information and connecting it with prior knowledge, which allows students to see patterns and relationships between concepts; and (3) aid them in comprehending new information. In general, quizzes have been used as a motivational tool to encourage students to read.23 24

Furthermore, Reigeluth and Stein proposed that learners’ motivation to accomplish a predetermined objective is crucial for acquiring knowledge and skills.21 Thus, bundling comprehensive licensure reviews with adaptive quizzing assignments should motivate students to perform better on assignments and improve their analytical and cognitive skills. Our study sought to motivate nursing students to acquire the necessary knowledge and skills to pass the programme’s exit exam and the licensure exam, and to ultimately ensure their ability to safely handle complex healthcare needs.

Methods

Aims

The specific aims of the study were to examine (1) students’ overall mastery score and percentage of correct answers before and after a comprehensive licensure review and adaptive quizzing assignments intervention, and (2) the association of students’ demographic characteristics with their overall mastery score and percentage of correct answers, before and after a comprehensive licensure review and adaptive quizzing assignments intervention.

Study design

We utilised a quasi-experimental (pretest and post-test) single-group design. The research team followed the TREND statement checklist for non-randomised controlled trials.

Sample and setting

Nursing students were recruited from a nursing college at a public university in Saudi Arabia. The inclusion criteria were senior nursing students in the Bachelor of Science in Nursing (BSN) programme for the academic year 2021–2022. The sample size was calculated using G*Power software V.3.1 (Heinrich-Heine-Universität, Düsseldorf, Germany). With a significance level of 0.05, an effect size of 0.3 and a power of 0.8, the minimum sample size was determined to be 109.
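For readers who wish to reproduce this kind of a-priori power analysis outside G*Power, the sketch below shows an equivalent calculation in Python with statsmodels. It assumes the paired t-test used for the primary analysis as the test family and treats the effect size of 0.3 as Cohen’s d; because the paper does not state which G*Power test family produced the figure of 109, the number returned here may differ from the reported minimum.

```python
# Hypothetical re-creation of the reported a-priori power analysis
# (G*Power inputs: alpha = 0.05, effect size = 0.3, power = 0.80).
# The exact G*Power test family is not stated in the paper, so the
# paired t-test used for the primary analysis is assumed here.
from statsmodels.stats.power import TTestPower

analysis = TTestPower()  # one-sample / paired t-test power
n_required = analysis.solve_power(
    effect_size=0.3,       # Cohen's d assumed by the authors
    alpha=0.05,            # two-sided significance level
    power=0.80,
    alternative="two-sided",
)
print(f"Minimum sample size (paired t-test): {n_required:.0f}")
# Note: the required n depends on the test family selected in G*Power,
# so this figure will not necessarily equal the 109 reported in the paper.
```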

Study intervention

The study investigators designed an intervention bundle comprising a comprehensive licensure review and adaptive quizzing assignments to support senior BSN students’ transition to practice through synthesising acquired knowledge and mastering cognitive skills. Specifically, the intervention bundle aimed to help students recognise the knowledge and skills essential for registered nurses in various healthcare settings, and integrate evidence-based practice in nursing care for patients across the lifespan. The intervention bundle was also designed to foster students’ cognitive abilities in demonstrating effective clinical and decision-making skills. Students were expected to identify and manage healthcare needs across the patient lifespan and use multidisciplinary communication to enhance the healthcare provided to patients and their families. In addition, the intervention bundle was expected to promote the assimilation of professional ethical principles and values into the delivery of appropriate and culturally competent nursing care.

The intervention bundle was delivered over 15 weeks and included two main components: (1) comprehensive licensure review content presented through 3-hour synchronous lectures every week and (2) adaptive quizzing assignments (table 1). The weekly lectures were presented to reflect the content outline of the NCLEX-RN and SNLE test plans. An online adaptive testing system delivered the 23 adaptive quizzing assignments using NCLEX-RN type questions (table 1). Students were required to achieve a mastery score of 8 (highest score) in each of the 23 assignments by the end of week 15. The score fluctuated based on their sustained performance on adaptive quizzes.25 Thus, the number of questions or quizzes for each assignment was not established ahead of time, as the adaptive quizzing system determined students’ performance throughout the assignments.
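The adaptive mechanics described above can be pictured with a minimal sketch. The vendor’s item-selection and mastery-scoring algorithm is not published here, so the rule below (raise the mastery level after a run of correct answers, lower it after a run of incorrect ones, stop at level 8) is purely an assumption used to show why the number of questions per assignment cannot be fixed in advance; the function name and parameters are hypothetical.

```python
import random

def run_adaptive_assignment(p_correct_at_level, target_level=8, max_items=500):
    """Illustrative adaptive-quizzing loop (not the vendor's algorithm).

    The mastery level starts at 1 and moves up or down with sustained
    performance; the assignment ends when the target level (8) is reached,
    so the number of questions served differs from student to student.
    """
    level, streak, items_served = 1, 0, 0
    while level < target_level and items_served < max_items:
        items_served += 1
        correct = random.random() < p_correct_at_level(level)
        if correct:
            streak = streak + 1 if streak >= 0 else 1
        else:
            streak = streak - 1 if streak <= 0 else -1
        if streak >= 3:        # sustained correct answers -> harder items
            level, streak = level + 1, 0
        elif streak <= -3:     # sustained incorrect answers -> easier items
            level, streak = max(level - 1, 1), 0
    return level, items_served

# Example: a student whose accuracy drops as item difficulty increases.
final_level, n_items = run_adaptive_assignment(lambda level: 0.9 - 0.05 * level)
print(final_level, n_items)
```

Running the sketch with different accuracy profiles shows how stronger students reach the target level with fewer items, which is why the assignment length was left open-ended.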

Table 1

Timeline for the comprehensive licensure review, adaptive quizzing assignments and data collection

Course content (3 hours weekly):
  Week 1: Medication calculations
  Week 2: Fluid and electrolytes/acid–base balance
  Week 3: Cardiovascular system-1
  Week 4: Cardiovascular system-2
  Week 5: Respiratory system
  Week 6: Neurological system
  Week 7: Renal system
  Week 8: Musculoskeletal system
  Week 9: Endocrine system
  Week 10: Immunology, haematology and oncology
  Week 11: Gastrointestinal system
  Week 12: Special topics in women’s health
  Week 13: Special topics in paediatric and adolescent health
  Week 14: Community and psychiatric health
  Week 15: Unit management and leadership

Delivery of the intervention bundle (every week):
  • Test-taking strategies
  • Case studies
  • Scenarios
  • Videos, animations, infographics and audio materials on anatomy, physiology, pathophysiology, clinical manifestations, diagnostic procedures, multidisciplinary management approaches, and pharmacological and non-pharmacological management
  • Practicum Q&A on the weekly content

Adaptive quizzing assignments:
Students completed weekly assignments on 23 topics relevant to the weekly content. For instance, the course content for week 1 focused on medication calculations, so after completing their weekly session, students were required to complete an assignment on medication calculations through the online adaptive quizzing system. The 23 topics on the online adaptive quizzing system were:
  1. Medication calculations
  2. Acid–base balance
  3. Fluid and electrolyte balance
  4. Cardiovascular disorders
  5. Respiratory disorders
  6. Neurosensory disorders
  7. Genitourinary disorders
  8. Musculoskeletal disorders
  9. Endocrine and metabolic disorders
  10. Immunologic and haematologic disorders
  11. Oncologic disorders
  12. Gastrointestinal disorders
  13. Antepartum period
  14. Intrapartum period
  15. Postpartum period
  16. The neonate
  17. Infant
  18. Toddler
  19. Preschooler
  20. School-age child
  21. Adolescent
  22. Psychiatric and mental health nursing
  23. Prioritisation and delegation

Data collection:
  Week 1: Demographic characteristics collected in an online survey; diagnostic exam (pretest) delivered as computerised adaptive testing, 75 questions, two hours, measuring the mastery score and percentage of correct answers.
  Week 15: End-of-programme exit exam (post-test) delivered as computerised adaptive testing, 75 questions, two hours, measuring the mastery score and percentage of correct answers.

Intervention fidelity

The comprehensive licensure review materials were obtained from NCLEX-RN comprehensive review textbooks widely used by prelicensure graduates to prepare for the NCLEX-RN examination in the USA. Furthermore, the researchers evaluated the appropriateness of using these textbooks to prepare for the SNLE by reviewing the SNLE’s blueprint against the textbook content. The adaptive quizzing assignments, pretest and post-test (ie, end-of-programme exit exam) were conducted using a valid online adaptive learning and testing source for the NCLEX-RN comprehensive licensure examination.

Study measures

Demographic characteristics

Demographic data on participants’ age, gender, cumulative grade point average (GPA) and whether nursing was their first choice programme at university were collected.

Dependent variables

The study had two dependent variables (mastery score in the exam and percentage of correct answers). The students’ mastery scores were collected in the pretest and post-test. The mastery score was rated on an 8-point scale ranging from 1 (easiest) to 8 (most difficult) in the online adaptive testing system. A minimum score of 4 was required to pass the NCLEX-RN exam. Progressing from a lower to a higher mastery score was based on students’ sustained performance on previous questions.25 The mean of students’ mastery scores was calculated. The second measure was the percentage of correct answers in the pretest and post-test.

Data collection

The online adaptive quizzing system was used to conduct two assessment exams at two points during the intervention. The diagnostic comprehensive exam (pretest) was conducted in the first week of the intervention to assess the students’ baseline mastery score. The end-of-programme exit exam (post-test) was conducted in the last week of the intervention to evaluate the students’ improvement (table 1). Each of the two assessment exams comprised 75 questions that had to be completed within 2 hours.

Data on participants’ mastery scores and correct answers for each exam were extracted from the online adaptive testing system. Furthermore, data on demographic characteristics were collected using an online survey sent to students at the beginning of the intervention. Data on mastery scores, correct answers and demographic characteristics were merged using the participants’ university identification number as a primary identifier across different data sheets.
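As a sketch of this merging step, assume the three exports (pretest scores, exit-exam scores and the demographic survey) are flat files keyed by the university identification number. All file and column names below are hypothetical, since the paper only specifies the identifier used to link the data sheets.

```python
import pandas as pd

# Hypothetical file and column names; the paper only states that the
# university ID was used as the primary identifier across data sheets.
pretest = pd.read_csv("pretest_scores.csv")      # student_id, mastery_pre, pct_correct_pre
posttest = pd.read_csv("exit_exam_scores.csv")   # student_id, mastery_post, pct_correct_post
survey = pd.read_csv("demographics.csv")         # student_id, age, gender, gpa, first_choice

merged = (
    pretest
    .merge(posttest, on="student_id", how="inner")  # keep students with both exams
    .merge(survey, on="student_id", how="left")
)

# Students who did not complete the intervention drop out of the inner join,
# mirroring the analytic sample of 287 of the 292 enrolled students.
print(merged.shape)
```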

Data analysis

Data were analysed using IBM SPSS Statistics for Mac V.28 software. Central tendency measures (eg, mean and median), dispersion measures (eg, SD and variance) and frequencies and percentages for categorical measures were used to present the descriptive results. A paired sample t-test was used to evaluate the mean difference in students’ performance before and after the intervention. The differences in the study variables (ie, mastery scores and percentages of correct answers) according to demographic characteristics were measured using an independent sample t-test and Pearson’s product–moment correlation. Statistical significance was set at p<0.05.
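Below is a minimal sketch of the reported analyses using SciPy rather than SPSS, continuing with the hypothetical merged data frame from the previous snippet: a paired-sample t-test for the pre/post comparison, an independent-samples t-test (Welch’s version, as used for the gender comparison in the Results) and Pearson product–moment correlations for age and GPA.

```python
from scipy import stats

# Paired t-test: change in mastery score from pretest to exit exam.
t_paired, p_paired = stats.ttest_rel(merged["mastery_post"], merged["mastery_pre"])

# Independent-samples t-test: exit-exam mastery score by gender.
# equal_var=False gives Welch's t, as reported for the gender comparison;
# the "female"/"male" coding of the gender column is an assumption.
female = merged.loc[merged["gender"] == "female", "mastery_post"]
male = merged.loc[merged["gender"] == "male", "mastery_post"]
t_gender, p_gender = stats.ttest_ind(female, male, equal_var=False)

# Pearson product-moment correlations with age and GPA.
r_age, p_age = stats.pearsonr(merged["age"], merged["mastery_post"])
r_gpa, p_gpa = stats.pearsonr(merged["gpa"], merged["mastery_post"])

print(t_paired, p_paired, t_gender, p_gender, r_age, r_gpa)
```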

Patient and public involvement

The study did not involve patients or the public.

Ethical consideration

Institutional Review Board approval was granted by the Standing Committee for Scientific Research Ethics at King Saud University (KSU-HE-22-001). Participants consented to the use of their data for research purposes, and they were informed that their information would be kept strictly confidential. Participants were also informed that the data would be reported in an aggregated form to maintain their confidentiality. Access to participants’ responses and identifiers was restricted to the principal investigator.

Results

Of the 292 students who participated in the study, nearly 56% (n=162) were female, and 67% (n=196) did not choose nursing as their first option at university. The mean age was 22 years (SD=0.94) and the mean GPA was 4.15 (SD=0.41). Five students did not complete the intervention, so data on the 287 students who completed it were used in the analysis. The overall mean mastery score (table 2) increased significantly from 1.45 (SD=0.44) in the pretest to 2.51 (SD=1.71) in the post-test (t(286)=11.24, p<0.001, d=0.66). The mean percentage of correct answers (table 2) increased significantly from 49.32 (SD=9.78) in the pretest to 58.59 (SD=9.50) in the post-test (t(286)=15.76, p<0.001, d=0.93).

Table 2

Comparison of students’ mean scores before and after the intervention (n=287)

Variable                      | n   | Pretest M (SD) | Post-test M (SD) | t     | P value | 95% CI (lower, upper)
Overall mastery score         | 287 | 1.45 (0.44)    | 2.51 (1.70)      | 11.24 | <0.001  | 0.87, 1.25
Percentage of correct answers | 287 | 49.32 (9.78)   | 58.59 (9.50)     | 15.76 | <0.001  | 8.11, 10.43
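As a quick consistency check, the reported effect sizes follow from the paired-design conversion d = t/√n, assuming Cohen’s d was computed on the paired differences (the standard convention for this design):

```python
import math

# t statistics and n taken from the Results paragraph above.
n = 287
for label, t in [("mastery score", 11.24), ("percentage correct", 15.76)]:
    d = t / math.sqrt(n)   # Cohen's d for a paired design: d = t / sqrt(n)
    print(f"{label}: d ≈ {d:.2f}")
# Prints d ≈ 0.66 and d ≈ 0.93, matching the reported effect sizes.
```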


Although the overall post-test mastery score improved, it did not meet the minimum 4-point score required to pass the NCLEX-RN exam. The data were therefore analysed to determine whether students’ performance differed according to their pretest performance. Students were categorised into three groups based on their overall mastery score in the exit exam. Group 1 comprised students whose mastery scores in the exit exam were equivalent to the overall mean mastery score in the pretest. Group 2 comprised students whose mastery scores in the exit exam were equal to or greater than the overall mean mastery score in the exit exam. Group 3 comprised students whose mastery scores fell between those of Groups 1 and 2.
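One plausible reading of this grouping rule, expressed against the hypothetical merged data frame from the earlier snippets, is sketched below. The paper does not spell out the exact cut-points (in particular, what “equivalent to the pretest mean” means numerically), so the thresholds here are illustrative assumptions only.

```python
# Assumed cut-points: Group 1 = exit-exam mastery at or below the pretest mean,
# Group 2 = at or above the exit-exam mean, Group 3 = in between.
pretest_mean = merged["mastery_pre"].mean()    # 1.45 in the paper
exit_mean = merged["mastery_post"].mean()      # 2.51 in the paper

def assign_group(score):
    if score <= pretest_mean:
        return "Group 1"
    if score >= exit_mean:
        return "Group 2"
    return "Group 3"

merged["group"] = merged["mastery_post"].apply(assign_group)
print(merged["group"].value_counts())   # the paper reports n = 79, 99 and 109
```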

Based on further analysis (table 3), the mastery score of students in Group 1 did not increase significantly from the pretest to the post-test (p=0.08). However, the mean percentage of correct answers increased significantly from 42.03 (SD=9.53) in the pretest to 48.77 (SD=6.57) in the post-test (t(78) = 5.97, p<0.001, d=0.67). For students in Group 2, the mean mastery score increased significantly from 1.67 (SD=0.55) in the pretest to 4.35 (SD=1.73) in the post-test (t(98) = 14.77, p<0.001, d=1.49), and the mean percentage of correct answers increased significantly from 54.29 (SD=8.15) in the pretest to 67.78 (SD=5.60) in the post-test (t(98) = 12.67, p<0.001, d=1.27). For students in Group 3, the mean mastery score increased significantly from 1.42 (SD=0.35) in the pretest to 1.76 (SD=0.21) in the post-test (t(108) = 10.24, p<0.001, d=0.98), and the mean percentage of correct answers increased significantly from 50.08 (SD=8.07) in the pretest to 57.36 (SD=5.46) in the post-test (t(108) = 9.54, p<0.001, d=0.91).

Table 3

Comparison of groups’ mean score before and after the intervention (n=287)

Variable                        | n   | Pretest M (SD) | Post-test M (SD) | t     | P value  | 95% CI (lower, upper)
Group 1
 Mastery score                  | 79  | 1.20 (0.16)    | 1.24 (0.11)      | 1.81  | 0.08     | 0.003, 0.06
 Percentage of correct answers  | 79  | 42.03 (9.53)   | 48.77 (6.57)     | 5.97  | <0.001*  | 4.49, 8.99
Group 2
 Mastery score                  | 99  | 1.67 (0.55)    | 4.35 (1.73)      | 14.77 | <0.001*  | 2.31, 3.03
 Percentage of correct answers  | 99  | 54.29 (8.15)   | 67.78 (5.60)     | 12.67 | <0.001*  | 11.37, 15.59
Group 3
 Mastery score                  | 109 | 1.42 (0.35)    | 1.76 (0.21)      | 10.24 | <0.001*  | 0.27, 0.41
 Percentage of correct answers  | 109 | 50.08 (8.07)   | 57.36 (5.46)     | 9.54  | <0.001*  | 5.76, 8.79


Note: The mastery score was rated on an 8-point scale ranging from 1 (easiest) to 8 (most difficult) in the online adaptive testing system. The percentage of correct answers in the pretest and post-test reflects the number of correct answers relative to the overall number of questions in each test. Group 1: mastery scores in the exit exam were equivalent to the overall mean mastery score in the pretest. Group 2: mastery scores in the exit exam were equal to or greater than the overall mean mastery score in the exit exam. Group 3: mastery scores fell between those of Groups 1 and 2.

Compared with male students, female students had significantly higher mastery scores in the pretest (Welch’s t=5.61, p<0.001) and post-test (Welch’s t=9.20, p<0.001) (online supplemental table 1). Furthermore, they had a significantly higher percentage of correct answers in the pretest (t=5.42, p<0.001) and post-test (t=9.09, p<0.001). Age had a weak statistically significant correlation with mastery scores in the post-test (r=−0.14, p=0.03) and the percentage of correct answers in the pretest (r=−0.14, p=0.04) and post-test (r=−0.15, p=0.03). GPA had a moderate statistically significant correlation with mastery scores and percentages of correct answers in the pretest (r=0.43, p<0.001; r=0.47, p<0.001, respectively) and post-test (r=0.39, p<0.001; r=0.44, p<0.001, respectively).

Supplementary data

bmjopen-2023-074469supp001.pdf

Discussion

Our study examined nursing students’ mastery scores and percentages of correct answers before and after a comprehensive licensure review and adaptive quizzing assignments intervention bundle. It also examined the association between students’ demographic characteristics and their mastery scores and percentages of correct answers before and after the intervention. The findings supported the view that the comprehensive licensure review and adaptive quizzing assignments bundle improved students’ mastery scores and percentages of correct answers in the end-of-programme exit exam. Further analysis showed that the group that performed better at baseline (pretest) achieved mastery scores of more than four and higher percentages of correct answers in the post-test compared with the other groups. By contrast, the mastery scores of students in Group 1 did not improve significantly. Considering that students’ GPAs had a moderate positive significant association with mastery scores in the pretests and post-tests, these findings suggest the importance of mastering the knowledge and skills acquired in previous courses in the undergraduate nursing programme as a minimum requirement to excel in the end-of-programme exit exam. Furthermore, our findings were consistent with previous studies that reported the benefits of incorporating adaptive quizzing in nursing programmes to improve students’ learning,26 end-of-programme exam scores27 28 and, ultimately, nursing licensure exam scores.29 30

Our findings also revealed that younger students had higher post-test mastery scores and higher percentages of correct answers both before and after the intervention compared with older students. These findings were inconsistent with a previous study of Saudi nursing interns, in which age was not associated with student performance in the licensure exam.31 One possible explanation is that students in Saudi Arabia typically enrol in public colleges immediately after graduating from high school and therefore belong to the same age cohort, so older students are often those who have fallen behind in their study programme because of underperformance. Furthermore, although the intervention was delivered equally to students regardless of gender, female nursing students had a higher baseline mastery score, percentage of correct answers and GPA compared with male students. This finding was consistent with previous studies reporting that female nursing students had higher RN exam readiness scores20 32 and NCLEX-RN passing rates compared with male students.31 33

Implications and recommendations

The study findings have important implications for nursing academic administrators and educators seeking to improve students’ performance in the nursing end-of-programme exit exam. Along with providing the knowledge and skills needed for nursing practice, academic programmes should integrate a comprehensive licensure review bundled with adaptive quizzing to foster students’ cognitive skills and improve their performance in the end-of-programme examination. Implementing this strategy can ultimately prepare students for the next step in their nursing career following graduation, such as attempting the licensure examination. Moreover, the gender differences in performance in the end-of-programme exit exam warrant further evaluation by nursing educators to ensure that both male and female students excel in the exam.

Additionally, the study findings indicated that students with higher GPAs performed better in the pretest and end-of-programme exit exams compared with other students. Accordingly, faculty members should ensure that adequate academic counselling is available for students who have underperformed in the baseline test. Furthermore, programme directors could monitor students’ performance throughout the programme and consider providing additional remediation activities for students with lower GPAs prior to the exit exam. Monitoring students’ performance would help to ensure their readiness for and success in the exit exam. Moreover, this study showed that younger students performed better in the exit exam compared with older students, which warrants further evaluation to identify the influence of age on students’ performance in the exam. Future studies should also evaluate the impact of a comprehensive licensure review and adaptive quizzing assignments bundle on students’ performance in the SNLE.

Limitations

Our study had some limitations. First, it did not have a control group for comparison, which may have affected the internal validity of the study intervention. Second, the sample was restricted to one university, which could limit the generalisability of the findings due to possible differences in students’ characteristics. Future large-scale studies should include a control group and participants from different universities to reach generalisable conclusions.

Conclusion

We examined the effectiveness of comprehensive licensure reviews and adaptive quizzing assignments to improve students’ performance in the end-of-programme exit exam. The intervention improved students’ performance through their achievement of higher mastery scores and percentages of correct answers in the exit exam. The findings suggest that a comprehensive licensure review and adaptive quizzing assignments can foster students’ knowledge and cognitive skills to facilitate a successful transition to nursing practice.


Acknowledgments

The authors extend their appreciation to the Deputyship for Research and Innovation, Ministry of Education in Saudi Arabia for funding this research (IFKSUOR3-091-3).

Footnotes

Twitter: @malmotairy1, @DrAdnanAlmanie, @Naji_PhD, @AhmNahari, @Dr_ReemSaeed

Contributors: MMA, AI, AN and NA conceived the investigation; MMA and RA designed and delivered the investigation; MMA, DA, NA, AN, AI and HM managed the collection of data through pre-exams and post-exams; and MMA and AI conducted the analysis. All authors drafted the manuscript, contributed to the interpretation and intellectual revision of the manuscript and agreed on the final version. MMA is the author acting as a guarantor in this study.

Funding: This research received funding from the Deputyship for Research and Innovation, Ministry of Education in Saudi Arabia (IFKSUOR3–091–3).

Competing interests: None declared.

Patient and public involvement: Patients and/or the public were not involved in the design, conduct, reporting or dissemination plans of this research.

Provenance and peer review: Not commissioned; externally peer reviewed.

Supplemental material: This content has been supplied by the author(s). It has not been vetted by BMJ Publishing Group Limited (BMJ) and may not have been peer-reviewed. Any opinions or recommendations discussed are solely those of the author(s) and are not endorsed by BMJ. BMJ disclaims all liability and responsibility arising from any reliance placed on the content. Where the content includes any translated material, BMJ does not warrant the accuracy and reliability of the translations (including but not limited to local regulations, clinical guidelines, terminology, drug names and drug dosages), and is not responsible for any error and/or omissions arising from translation and adaptation or otherwise.

Data availability statement

Data are available upon reasonable request.

Ethics statements

Patient consent for publication

Not applicable.

Ethics approval

This study was approved by the Standing Committee for Scientific Research Ethics at King Saud University (Ref No. KSU-HE-22-001).

References

1. National Council of State Boards of Nursing. RN practice analysis: linking the NCLEX-RN® examination to practice [online]. 2021. Available: https://www.ncsbn.org/public-files/21_NCLEX_RN_PA.pdf

2. National Council of State Boards of Nursing. What you need to know about nursing licensure and boards of nursing [online]. n.d. Available: https://www.ncsbn.org/Nursing_Licensure.pdf

3. Saudi Commission for Health Specialties. Saudi nursing licensing exam (SNLE): applicant guide [online]. Available: https://scfhs.org.sa/sites/default/files/2022-06/SNLE%20Applicant%20Guide%20-%20March%202022.pdf [Accessed Mar 4].

4. Saudi Commission for Health Specialties. License exam summary—Saudi universities [online]. Available: https://public.scfhs.org.sa/SCFHS_License_Exam.html [Accessed Mar 2023].

5. Al-Alawi R, Alexander GL. Systematic review of program evaluation in baccalaureate nursing programs. J Prof Nurs 2020;36:236–44. doi:10.1016/j.profnurs.2019.12.003

6. Giddens JF. Changing paradigms and challenging assumptions: redefining quality and NCLEX-RN pass rates. J Nurs Educ 2009;48:123–4. doi:10.3928/01484834-20090301-04

7. Jeffreys MR. Jeffreys’s nursing universal retention and success model: overview and action ideas for optimizing outcomes A–Z. Nurse Educ Today 2015;35:425–31. doi:10.1016/j.nedt.2014.11.004

8. Roa M, Shipman D, Hooten J, et al. The costs of NCLEX-RN failure. Nurse Educ Today 2011;31:373–7. doi:10.1016/j.nedt.2010.07.009

9. Thompson CW. Retaining graduate nurses who experience NCLEX failure: recommendations for supporting retest success. Nurse Leader 2023;21:299–302. doi:10.1016/j.mnl.2022.11.013

10. Atemafac J. Consequences for nursing graduates of failing the National Council Licensure Examination (NCLEX) [dissertation]. Minnesota: Walden University, 2014. Available: https://www.proquest.com/openview/6ff1ca5275c5a6e89b24bb8fa9807f7e/1?pq-origsite=gscholar&cbl=18750

11. Kasprovich T, VandeVusse L. Registered nurses’ experiences of passing the NCLEX-RN after more than one attempt. J Nurs Educ 2018;57:590–7. doi:10.3928/01484834-20180921-04

12. Lutter SL, Thompson CW, Condon MC. Tutoring for success: empowering graduate nurses after failure on the NCLEX-RN. J Nurs Educ 2017;56:758–61. doi:10.3928/01484834-20171120-11

13. McGillis Hall L, Lalonde M, Kashin J. People are failing! Something needs to be done: Canadian students’ experience with the NCLEX-RN. Nurse Educ Today 2016;46:43–9. doi:10.1016/j.nedt.2016.08.022

14. Meehan CD, Barker N. Remediation for NCLEX-RN success in high-risk nursing students. Teach Learn Nurs 2021;16:254–7. doi:10.1016/j.teln.2021.02.003

15. Quinn BL, Smolinski M, Peters AB. Strategies to improve NCLEX-RN success: a review. Teach Learn Nurs 2018;13:18–26. doi:10.1016/j.teln.2017.09.002

16. Czekanski K, Hoerst BJ, Kurz J. Instituting evidence-based changes to improve first-time NCLEX-RN® pass rates. J Nurs Regul 2018;9:11–8. doi:10.1016/S2155-8256(18)30049-8

17. Czekanski K, Mingo S, Piper L. Coaching to NCLEX-RN success: a postgraduation intervention to improve first-time pass rates. J Nurs Educ 2018;57:561–5. doi:10.3928/01484834-20180815-10

18. Hyland JR. Building on the evidence: interventions promoting NCLEX success. Open J Nurs 2012;2:231–8. doi:10.4236/ojn.2012.23036

19. Conklin PS, Cutright LH. A model for sustaining NCLEX-RN success. Nurs Educ Perspect 2019;40:176–8. doi:10.1097/01.NEP.0000000000000326

20. Almarwani AM. The effect of integrating a nursing licensure examination preparation course into a nursing program curriculum: a quasi-experimental study. Saudi J Health Sci 2022;11. doi:10.4103/sjhs.sjhs_87_22

21. Reigeluth CM, Stein FS. The elaboration theory of instruction. In: Reigeluth CM, ed. Instructional-design theories and models: an overview of their current status. New York: Routledge, 1983:335–81.

22. Wilson B, Cole P. A critical review of elaboration theory. ETR&D 1992;40:63–79. doi:10.1007/BF02296843

23. Lei SA, Bartlett KA, Gorney SE, et al. Resistance to reading compliance among college students: instructors’ perspectives. Coll Stud J 2010;44:219–29.

24. Stratton G. Does increasing textbook portability increase reading rates or academic performance? Inquiry 2011;16:5–16.

25. Phelan JC. An investigation of student use of PassPoint and NCLEX-RN® outcomes. 2021. Available: https://download.lww.com/efficacy/WP_PassPoint.pdf

26. Simon-Campbell EL, Phelan JC. Effectiveness of an adaptive quizzing system as an institutional-wide strategy to improve student learning and retention. Nurse Educ 2016;41:246–51. doi:10.1097/NNE.0000000000000258

27. Parcell H, Morton K, Froble D, et al. Evaluating the effect of pre-exam adaptive quizzing on nursing student exam scores. Nurs Educ Perspect 2022;43:E100–2. doi:10.1097/01.NEP.0000000000000988

28. Presti CR, Sanko JS. Adaptive quizzing improves end-of-program exit examination scores. Nurse Educ 2019;44:151–3. doi:10.1097/NNE.0000000000000566

29. Cox-Davenport RA, Phelan JC. Laying the groundwork for NCLEX success: an exploration of adaptive quizzing as an examination preparation method. Comput Inform Nurs 2015;33:208–15. doi:10.1097/CIN.0000000000000140

30. Oliver BJ, Pomerleau M, Potter M, et al. Optimizing NCLEX-RN pass rate performance using an educational microsystems improvement approach. J Nurs Educ 2018;57:265–74. doi:10.3928/01484834-20180420-03

31. Butcon VE, Pasay-An E, Indonto MCL, et al. Assessment of determinants predicting success on the Saudi nursing licensure examination by employing artificial neural network. J Educ Health Promot 2021;10:396. doi:10.4103/jehp.jehp_652_20

32. Monroe HE. Determining readiness to take the National Council Licensure Examination for Registered Nurses® [dissertation]. Colorado: University of Northern Colorado, 2019. Available: http://www.proquest.com/docview/2302691385/abstract/D233A270344E43F1PQ/2

33. Havrilla E, Zbegner D, Victor J. Exploring predictors of NCLEX-RN success: one school’s search for excellence. J Nurs Educ 2018;57:554–6. doi:10.3928/01484834-20180815-08

