Determining the performance of five multiple choice scoring methods in estimating examinee’s ability / Lau Sie Hoe ... [et. al]
Despite the current popularity of performance-based assessment and the emergence of new assessment methods, multiple-choice (MC) items remain a major form of assessment. The conventional Number Right (NR) scoring method, where one point is awarded for a correct response and zero for any other response, has been consiste...
Saved in:
Main Authors: Lau, Sie Hoe; Paul Lau, Ngee Kiong; Ling, Siew Eng; Hwa, Tee Yong
Format: Thesis
Language: English
Published: 2006
Subjects: H Social Sciences (General)
Online Access: https://ir.uitm.edu.my/id/eprint/94756/1/94756.pdf
id |
my-uitm-ir.94756 |
record_format |
uketd_dc |
spelling |
my-uitm-ir.94756 2024-05-08T22:57:49Z Determining the performance of five multiple choice scoring methods in estimating examinee’s ability / Lau Sie Hoe ... [et. al] 2006 Lau, Sie Hoe; Paul Lau, Ngee Kiong; Ling, Siew Eng; Hwa, Tee Yong H Social Sciences (General) Study and teaching. Research Despite the current popularity of performance-based assessment and the emergence of new assessment methods, multiple-choice (MC) items remain a major form of assessment. The conventional Number Right (NR) scoring method, where one point is awarded for a correct response and zero for any other response, has been consistently criticized for failing to credit partial knowledge and for encouraging guessing. Various alternative scoring methods, such as Number Right with Correction for Guessing (NRC), Elimination Testing (ET), Confidence Weighting (CW) and Probability Measurement (PM), have been proposed to overcome these two weaknesses. However, to date, none has been widely accepted, although the theoretical rationale behind the various scoring methods under Classical Test Theory (CTT) is sound. A major cause of concern is the possibility that complicated scoring instructions might introduce other factors which may affect the reliability and validity of the test scores. Studies on whether examinees can realistically be trained to follow the new test instructions have been inconclusive. Whether they can consistently follow the test instructions throughout the whole test remains an open question. There have been intensive comparison studies of scores obtained through various CTT scoring methods against NR scores. What has yet to be explored is the comparison of these scores with Item Response Theory (IRT) ability estimates. 2006 Thesis https://ir.uitm.edu.my/id/eprint/94756/ https://ir.uitm.edu.my/id/eprint/94756/1/94756.pdf text en public masters Universiti Teknologi MARA Sarawak Institute of Research, Development and Commercialisation |
institution |
Universiti Teknologi MARA |
collection |
UiTM Institutional Repository |
language |
English |
topic |
H Social Sciences (General) |
spellingShingle |
H Social Sciences (General) Lau, Sie Hoe; Paul Lau, Ngee Kiong; Ling, Siew Eng; Hwa, Tee Yong Determining the performance of five multiple choice scoring methods in estimating examinee’s ability / Lau Sie Hoe ... [et. al] |
description |
Despite the current popularity of performance-based assessment and the emergence of new assessment methods, multiple-choice (MC) items remain a major form of assessment. The conventional Number Right (NR) scoring method, where one point is awarded for a correct response and zero for any other response, has been consistently criticized for failing to credit partial knowledge and for encouraging guessing. Various alternative scoring methods, such as Number Right with Correction for Guessing (NRC), Elimination Testing (ET), Confidence Weighting (CW) and Probability Measurement (PM), have been proposed to overcome these two weaknesses. However, to date, none has been widely accepted, although the theoretical rationale behind the various scoring methods under Classical Test Theory (CTT) is sound. A major cause of concern is the possibility that complicated scoring instructions might introduce other factors which may affect the reliability and validity of the test scores. Studies on whether examinees can realistically be trained to follow the new test instructions have been inconclusive. Whether they can consistently follow the test instructions throughout the whole test remains an open question. There have been intensive comparison studies of scores obtained through various CTT scoring methods against NR scores. What has yet to be explored is the comparison of these scores with Item Response Theory (IRT) ability estimates. |
format |
Thesis |
qualification_level |
Master's degree |
author |
Lau, Sie Hoe; Paul Lau, Ngee Kiong; Ling, Siew Eng; Hwa, Tee Yong |
author_facet |
Lau, Sie Hoe; Paul Lau, Ngee Kiong; Ling, Siew Eng; Hwa, Tee Yong |
author_sort |
Lau, Sie Hoe |
title |
Determining the performance of five multiple choice scoring methods in estimating examinee’s ability / Lau Sie Hoe ... [et. al] |
title_short |
Determining the performance of five multiple choice scoring methods in estimating examinee’s ability / Lau Sie Hoe ... [et. al] |
title_full |
Determining the performance of five multiple choice scoring methods in estimating examinee’s ability / Lau Sie Hoe ... [et. al] |
title_fullStr |
Determining the performance of five multiple choice scoring methods in estimating examinee’s ability / Lau Sie Hoe ... [et. al] |
title_full_unstemmed |
Determining the performance of five multiple choice scoring methods in estimating examinee’s ability / Lau Sie Hoe ... [et. al] |
title_sort |
determining the performance of five multiple choice scoring methods in estimating examinee’s ability / lau sie hoe ... [et. al] |
granting_institution |
Universiti Teknologi MARA Sarawak |
granting_department |
Institute of Research, Development and Commercialisation |
publishDate |
2006 |
url |
https://ir.uitm.edu.my/id/eprint/94756/1/94756.pdf |
_version_ |
1804889937271586816 |