Research Article

Rubrics in Terms of Development Processes and Misconceptions

Year 2023, Volume: 14 Issue: 3, 222 - 234, 30.09.2023
https://doi.org/10.21031/epod.1251470

Abstract

The present study aimed to examine the development process of rubrics in theses indexed in the national thesis database and to identify any misconceptions presented in these rubrics. A qualitative research approach utilizing document analysis was employed. Theses were selected through criterion sampling, based on a literature review and criteria established by expert opinion, resulting in 395 theses being included in the study. Data were collected through a "thesis review form" developed by the researchers and analyzed using descriptive analysis. Findings indicated that approximately 27% of the 395 theses contained misconceptions, with a disproportionate share of these misconceptions found in master's theses. By field, the highest rates of misconceptions were observed in health, social sciences, special education, and fine arts, while the lowest rates were found in education and linguistics. Additionally, theses with misconceptions presented less validity and reliability evidence than those without misconceptions, and this difference was statistically significant for both types of evidence. Among theses without misconceptions, the most frequently reported validity evidence was expert opinion, and the most frequently reported reliability evidence was the percentage of agreement. The findings were discussed in relation to the existing literature, and recommendations were proposed.
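
The abstract mentions two quantitative elements that can be made concrete: the percentage of agreement reported as reliability evidence, and a significance test of whether theses with and without misconceptions differ in how often they present validity or reliability evidence. The short Python sketch below illustrates how such figures are commonly computed. It is not the authors' analysis code: the counts are hypothetical placeholders, and the chi-square test of independence is shown only as one plausible choice, since the abstract does not name the statistic used.

```python
# Illustrative sketch only; the counts below are hypothetical, NOT the study's data.
from scipy.stats import chi2_contingency

def percent_agreement(rater_a, rater_b):
    """Share of items on which two raters assign the same code, as a percentage."""
    matches = sum(a == b for a, b in zip(rater_a, rater_b))
    return 100 * matches / len(rater_a)

# Hypothetical coding decisions for ten theses by two independent reviewers.
rater_a = ["yes", "no", "yes", "yes", "no", "yes", "no", "yes", "yes", "no"]
rater_b = ["yes", "no", "yes", "no", "no", "yes", "no", "yes", "yes", "no"]
print(f"Percentage of agreement: {percent_agreement(rater_a, rater_b):.1f}%")

# Hypothetical 2x2 table: rows = misconception present / absent,
# columns = validity evidence reported / not reported.
table = [[40, 65],    # theses with misconceptions
         [190, 100]]  # theses without misconceptions
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, df = {dof}, p = {p:.4f}")
```

A test of this kind, applied to the observed counts, is one way the reported difference in validity and reliability evidence could be evaluated for statistical significance.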

References

  • Brookhart, S. M. (2013). How to create and use rubrics for formative assessment and grading. ASCD.
  • Brookhart, S. M. (2018). Appropriate criteria: Key to effective rubrics. Frontiers in Education, 3(22), 1-12. https://doi.org/10.3389/feduc.2018.00022
  • Brookhart, S. M., & Chen, F. (2015). The quality and effectiveness of descriptive rubrics. Educational Review, 67(3), 343–368. https://doi.org/10.1080/00131911.2014.929565
  • Corbin, J., & Strauss, A. (2008). Basics of qualitative research: Techniques and procedures for developing grounded theory (3rd ed.). Sage.
  • Çolak-Ayyıldız, A. (2022). Alternatif eğitim konusunda yapılmış lisansüstü eğitim tezlerinin incelenmesi [Examination of graduate theses on alternative education]. Gümüşhane Üniversitesi Sosyal Bilimler Dergisi, 13(3), 877-886.
  • Dochy, F., Gijbels, D., & Segers, M. (2006). Learning and the emerging new assessment culture. In L. Verschaffel, F. Dochy, M. Boekaerts, & S. Vosniadou (Eds.), Instructional psychology: Past, present and future trends. Elsevier.
  • Forster, N. (1995). The analysis of company documentation. In C. Cassell & G. Symon (Eds.), Qualitative methods in organizational research: A practical guide. Sage.
  • Jonsson, A., & Svingby, G. (2007). The use of scoring rubrics: Reliability, validity and educational consequences. Educational Research Review, 2(2), 130-144. https://doi.org/10.1016/j.edurev.2007.05.002
  • Koyuncu, M. S., Şata, M., & Karakaya, İ. (2018). Eğitimde ölçme ve değerlendirme kongrelerinde sunulan bildirilerin doküman analizi yöntemi ile incelenmesi [Examination of papers presented at congresses on measurement and evaluation in education through document analysis]. Journal of Measurement and Evaluation in Education and Psychology, 9(2), 216-238. https://doi.org/10.21031/epod.334292
  • Lane, S., & Tierney, S. T. (2008). Performance assessment. In T. L. Good (Ed.), 21st century education: A reference handbook (Vol. 1). SAGE.
  • Lawshe, C. H. (1975). A quantitative approach to content validity. Personnel Psychology, 28(4), 563-575. https://doi.org/10.1111/j.1744-6570.1975.tb01393.x
  • Morrison, G. R., & Ross, S. M. (1998). Evaluating technology-based processes and products. New Directions for Teaching and Learning, 74, 69-77. https://doi.org/10.1002/tl.7407
  • Moskal, B. M. (2000). Scoring rubrics: What, when and how? Practical Assessment, Research, and Evaluation, 7(3), 1-5. https://doi.org/10.7275/a5vq-7q66
  • Moskal, B. M., & Leydens, J. A. (2000). Scoring rubric development: Validity and reliability. Practical Assessment, Research, and Evaluation, 7(10), 1-6. https://doi.org/10.7275/q7rm-gg74
  • Ocak, İ., & Yeter, F. (2018). Investigation of national theses and articles on “the nature of science” between 2006-2016 years. Journal of Theoretical Educational Science, 11(3), 522-543. https://doi.org/10.30831/akukeg.344726
  • Panadero, E., & Jonsson, A. (2013). The use of scoring rubrics for formative assessment purposes revisited: A review. Educational Research Review, 9(1), 129-144. https://doi.org/10.1016/j.edurev.2013.01.002
  • Reddy, Y. M., & Andrade, H. (2010). A review of rubric use in higher education. Assessment & Evaluation in Higher Education, 35(4), 435-448. https://doi.org/10.1080/02602930902862859
  • Reynolds-Keefer, L. (2010). Rubric-referenced assessment in teacher preparation: An opportunity to learn by using. Practical Assessment, Research, and Evaluation, 15(8), 1-9. https://doi.org/10.7275/psk5-mf68
  • Rezaei, A. R., & Lovorn, M. (2010). Reliability and validity of rubrics for assessment through writing. Assessing Writing, 15(1), 18-39. https://doi.org/10.1016/j.asw.2010.01.003
  • Wiggins, G. (1998). Educative assessment. Jossey-Bass.
  • Wilson, F. R., Pan, W., & Schumsky, D. A. (2012). Recalculation of the critical values for Lawshe’s content validity ratio. Measurement and Evaluation in Counseling and Development, 45(3), 197-210. https://doi.org/10.1177/0748175612440286
  • Yenilmez, K., & Yaşa, E. (2008). İlköğretim öğrencilerinin geometrideki kavram yanılgıları [Primary school students' misconceptions in geometry]. Uludağ Üniversitesi Eğitim Fakültesi Dergisi, 21(2), 461-483.
  • Zembat, İ. Ö. (2010). Kavram yanılgısı nedir? [What is a misconception?]. In M. F. Özmantar, E. Bingölbali, & H. Akkoç (Eds.), Matematiksel kavram yanılgıları ve çözüm önerileri [Mathematical misconceptions and solution suggestions]. Pegem Akademi.

Details

Primary Language: English
Journal Section: Articles
Authors

Fuat Elkonca 0000-0002-2733-8891

Görkem Ceyhan 0000-0001-9342-6876

Mehmet Şata 0000-0003-2683-4997

Publication Date: September 30, 2023
Acceptance Date: September 13, 2023
Published in Issue: Year 2023, Volume: 14, Issue: 3

Cite

APA Elkonca, F., Ceyhan, G., & Şata, M. (2023). Rubrics in Terms of Development Processes and Misconceptions. Journal of Measurement and Evaluation in Education and Psychology, 14(3), 222-234. https://doi.org/10.21031/epod.1251470