Gauging Information and Computer Skills for Curriculum Planning


Online Submission, Paper presented at the ALISE Annual Conference (Dallas, TX, Jan 17-20, 2012)

Background: Librarians of all types are expected to possess information and computer skills so they can actively assist patrons in accessing information and in recognizing reputable sources. Mastery of these skills is a high priority for library and information science programs, since graduate students enter with varied, multidisciplinary backgrounds. The curriculum must therefore include ways to assess the information and computer skills of students both entering and exiting the program.

Purpose: To assess the information and computer skills of graduate library science students using the ETS "iSkills" exam. Previous research demonstrated successful results among undergraduate students across various disciplines. The purpose of this study was to evaluate the exam's effectiveness as a screening and assessment tool for a specific population, i.e., library science graduate students.

Setting: The study was conducted from 2009 to 2011 at Clarion University's ALA-accredited library and information science graduate program in northwest Pennsylvania.

Study Sample: A total of 33 graduate library science students participated in the study over two academic years.

Research Design: Other quantitative; interview.

Data Collection and Analysis: Students were invited to take the exam each fall semester. Variables collected for descriptive statistics included the exam score, gender, undergraduate and graduate GPA, years of work experience, undergraduate major, and whether English was the student's native tongue. In addition, a t test between the reported and cut scores and a correlational analysis of the reported scores and skill categories were conducted. Additional data were collected through a focus group and individual student interviews for content analysis.

Findings: No statistically significant differences in reported scores were found by gender, native tongue, undergraduate major, GPA, or years of work experience, although scores tended to decline as years of work experience increased. A t test between the reported scores and the cut score revealed significant differences. Strong positive correlations between the reported scores and most skill categories were also statistically significant. Interview and focus group data revealed recurring themes, such as the time allotted for the exam, the testing location, netbook computer problems, the testing software, the clarity of exam questions, and the overall purpose of the exam, i.e., critical thinking.

Conclusion: The value of the exam for screening and assessing the information and computer skills of graduate students remains questionable. While the testing environment can be improved, the heart of the exam, i.e., critical thinking, can be demonstrated better through authentic assessments within the curriculum. Computer skills could then be addressed with an assessment checklist in the orientation and introductory courses.

Citation: Krueger, J. M., & Ha, Y. (2012). Gauging information and computer skills for curriculum planning. Presentation at the ALISE Conference, Dallas, TX. (Contains 5 tables.)
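The analyses described above — a one-sample t test of reported scores against a fixed cut score, and a correlation between overall scores and skill-category subscores — can be sketched in plain Python. The data below are hypothetical illustrative values, not the study's actual scores, and the function names are this sketch's own:

```python
from statistics import mean, stdev
from math import sqrt

def one_sample_t(scores, cut_score):
    """t statistic testing whether the mean score differs from a fixed cut score."""
    n = len(scores)
    return (mean(scores) - cut_score) / (stdev(scores) / sqrt(n))

def pearson_r(x, y):
    """Pearson correlation, e.g. overall scores vs. a skill-category subscore."""
    mx, my = mean(x), mean(y)
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = sqrt(sum((a - mx) ** 2 for a in x) * sum((b - my) ** 2 for b in y))
    return num / den

# Hypothetical data for illustration only -- not the study's scores.
reported = [455, 480, 500, 465, 490, 510]
subscores = [60, 68, 75, 62, 70, 78]

t = one_sample_t(reported, cut_score=500)  # negative: sample mean below cut score
r = pearson_r(reported, subscores)         # near 1: strong positive correlation
```

In practice the t statistic would be compared against a t distribution with n − 1 degrees of freedom to obtain a p value; a library such as SciPy (`scipy.stats.ttest_1samp`, `scipy.stats.pearsonr`) would handle that step directly.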

Descriptors: Curriculum Development, Graduate Students, Introductory Courses, Grade Point Average, Focus Groups, Testing, Statistical Analysis, Interviews, Content Analysis, Evaluation Methods, Screening Tests, Student Evaluation, Critical Thinking, Check Lists, Performance Based Assessment, Computer Literacy, Information Skills, Library Skills, Information Literacy, Library Education, Computer Assisted Testing, Cutting Scores

Author: Krueger, Janice M.; Ha, YooJin


