An Application of Item Response Theory to Basic Science Multiple-Choice Test Development and Validation
dc.creator | Ani, Mercy I. | |
dc.creator | Ekeh, David O. | |
dc.creator | Fuente, Jayson A. Dela | |
dc.creator | Obodo, Abigail C. | |
dc.date | 2022-11-02 | |
dc.date.accessioned | 2023-08-21T08:15:48Z | |
dc.date.available | 2023-08-21T08:15:48Z | |
dc.description | The study employed an instrumentation research design and applied item response theory (IRT) to develop and validate a Basic Science multiple-choice test. A sample of 600 junior secondary school II students was randomly selected from 20 government co-educational secondary schools in the Udi education zone of Enugu State, Nigeria. The study was guided by six research questions. A 40-item Basic Science multiple-choice test was constructed by the researchers and used to collect data. Three experts subjected the instrument to content and face validation to ensure its validity; two of them were from the departments of Science Education and Educational Foundations, respectively. A reliability index of 0.85 was obtained. The data generated were analyzed using the maximum likelihood estimation technique of the BILOG-MG computer program. The analysis showed that all 40 test items had appropriate indices and constituted the final instrument, which was used to assess students' ability in Basic Science. The results confirmed the reliability of the Basic Science multiple-choice items under the three-parameter logistic (3PL) model. The findings further revealed that the multiple-choice Basic Science test items were difficult and that differential item functioning existed between male and female learners. Recommendations in line with the findings were made, including that teachers and examination bodies should adopt and encourage IRT in the development of test instruments used to measure students' ability in Basic Science and other subjects. | en-US |
dc.format | application/pdf | |
dc.identifier | https://journals.researchparks.org/index.php/IJIE/article/view/OSF.IO | |
dc.identifier | 10.17605/FH4UD | |
dc.identifier.uri | http://dspace.umsida.ac.id/handle/123456789/17676 | |
dc.language | eng | |
dc.publisher | Research Parks Publishing LLC | en-US |
dc.relation | https://journals.researchparks.org/index.php/IJIE/article/view/OSF.IO/3448 | |
dc.source | International Journal on Integrated Education; Vol. 5 No. 11 (2022): IJIE; 4-15 | en-US |
dc.source | 2620-3502 | |
dc.source | 2615-3785 | |
dc.source | 10.31149/ijie.v5i11 | |
dc.subject | Achievement Test | en-US |
dc.subject | Basic Science | en-US |
dc.subject | Differential Item Functioning (DIF) | en-US |
dc.subject | Item Response Theory (IRT) | en-US |
dc.subject | Multiple Choice | en-US |
dc.title | An Application of Item Response Theory to Basic Science Multiple-Choice Test Development and Validation | en-US |
dc.type | info:eu-repo/semantics/article | |
dc.type | info:eu-repo/semantics/publishedVersion | |
dc.type | Peer-reviewed Article | en-US |