
This semester I provided two workshops for the part-time librarians I work with, who do most of the teaching in our one-shot library/research instruction program.  Although I see them every day, it’s rare that we carve out time to meet as a group, and getting together even depends on some librarians coming in on their time off.  But we get so much out of sharing our experiences with each other that we’re all willing to give a little extra to make it work.  At these meetings I had a chance to facilitate discussion about the Framework, which might seem a little late since it was first adopted nearly three years ago, but the timing was good for us: we recently got support from our college administrators to purchase the Credo InfoLit Modules, and it’s helping us think about the scope of our instruction in new ways.

In particular, we’ve been thinking about how to reach beyond our one-shots in new ways.  The information literacy lessons from Credo are one way to reach students before or after we see them in the library.  With a little coordination between the librarian and the professor who’s requesting instruction, students can be introduced to concepts like the value of information or the role of iteration in planning a search strategy before coming to the library.  Or they can get step-by-step, self-paced practice with MLA citations to follow up on our in-class discussions about how they should expect to use various types of sources in their analysis or argument.

Another way we’re reaching beyond our one-shots is to have students create their own research guide during the session so they can follow their own steps again when they are ready to resume their searches.  To ensure that students will have access to their self-made research guides, we have started adapting Carolyn Caffrey Gardner’s contribution to CORA: Community of Online Research Assignments, called Strategic Searching Spreadsheet, which is a Google Sheet template.  We customize the steps on the sheet according to what students need when they come in.  Sometimes we focus on generating keywords or selecting databases.  Sometimes we focus on choosing and analyzing sources.  When students return to the Google sheet, they have all of their notes available to them.  And we also find that the sheet makes it possible for us to do more accurate formative assessment, allowing us to correct misconceptions or highlight database features as we see students needing the additional information during the class.

As we’re planning for how we’re going to reach beyond our one-shots to provide additional support to students, we use the frames to help us identify likely barriers to students’ success given the requirements of their assignments.  Since it’s not always possible to do formative assessments to find out what our students know and need, we turn to the frames to help us reflect on where we should focus our instruction in class and beyond.  And now we can also draw upon a list of predictable misunderstandings, identified through surveys and focus groups, that Lisa Janicke Hinchliffe and her team have compiled in partnership with Credo.  These predictable misunderstandings help us to anticipate and address where our students may get stuck and they remind us of the importance of what Mike Rose calls “intelligent errors” that are signposts along students’ path to developing new skills, beliefs, and behaviors.  

I’ve written about intelligent errors in this blog before, since Carolyn Radcliff, Hal Hannon, and I presented a panel at ACRL 2015 about the connection between threshold concepts and intelligent errors.  It’s a robust approach to formative assessment because it reminds us that mistakes are part of learning and that, while some mistakes demonstrate a lack of attention, most errors that students make are their best effort to meet our expectations and should be treated by students and instructors alike as markers of progress rather than signs of failure.  Treating intelligent errors as markers of progress means using them as a starting point for reflection and discussion and observing them closely to see what they reveal about students’ growth.  Taking this approach in our own teaching, and sharing it with faculty who express frustration about students’ persistent research mistakes, can lead to new collaborations that actively engage students.

Suppose that you think students should be knowledgeable about the rights and responsibilities of information creation. Furthermore, they should be able to recognize social, legal, and economic factors affecting access to information. These two statements form the basis of Module 4, The Value of Information, of the Threshold Achievement Test for Information Literacy (TATIL). In this post, I will describe the development of TATIL knowledge test questions. How do we go from a concept to a set of fully formed, sound test questions?

It begins with outcomes and performance indicators written by members of the TATIL advisory board and inspired by the ACRL Framework for Information Literacy. An iterative process of review and revision, guided by TATIL project leader Dr. April Cunningham, results in the foundation for writing test questions.

...continue reading "Genesis of a Test Question"

Sometime around 1996 I attended a conference on communication studies. I was working on a master’s degree in Comm Studies, and this was my first conference in an area outside of librarianship. I was happy to discover a presentation on research related to libraries, specifically nonverbal behaviors of reference librarians. As the researcher described her findings and quoted from student statements about their interactions with librarians, I experienced a range of emotions. Interest and pride soon gave way to embarrassment and frustration. The way I remember it now, there were a host of examples of poor interactions. “The librarian looked at me like I was from Mars,” that sort of thing. Most memorable to me was one of the comments/questions from an audience member: “Librarians need to fix this. What are they going to do about it?” — as though this study had uncovered a heretofore invisible problem that we should urgently address. (Did I mention feeling defensive, too?) I didn’t dispute the findings. What I struggled with was the sense that the people in the room thought that we librarians didn’t already know about the importance of effective communication and that we weren’t working on it. Was there room for improvement? For sure! But it wasn’t news to us.

I thought about that presentation again recently after viewing a webinar by Lisa Hinchliffe about her research project, Predictable Misunderstandings in Information Literacy: Anticipating Student Misconceptions To Improve Instruction. Using data from a survey of librarians who provide information literacy instruction to first year students, Lisa and her team provisionally identified nine misconceptions that lead to errors in information literacy practice. For example, first year students “believe research is a linear (uni-directional) process (and therefore do not see it as an iterative process and integrated into their work).” The project is a partnership with Credo. See the press release or view the webinar slides.

...continue reading "We’re Working On It: Taking Pride in Continuous Instructional Improvement"

The cornerstone of the Threshold Achievement Test for Information Literacy is the set of outcomes and performance indicators we wrote, inspired by the ACRL Framework for Information Literacy for Higher Education.

Working with members of our Advisory Board, we first defined the information literacy skills, knowledge, dispositions, and misconceptions that students commonly demonstrate at key points in their education: entering college, completing their lower division or general education requirements, and preparing for graduation. These definitions laid the groundwork for analyzing the knowledge practices and dispositions in the Framework in order to define the core components that would become the focus of the test. Once we decided to combine frames into four test modules, we used the performance indicators to guide item writing for each of the four modules. Further investigation of the Framework dispositions through a structural analysis led us to identify and define information literacy dispositions for each module.

...continue reading "From Framework to Outcomes to Performance Indicators, Plus Dispositions!"

After three years of development, two years of field testing, and countless hours of creative innovation and hard work, Carrick Enterprises is proud to announce the availability of the Threshold Achievement Test for Information Literacy!

We were fortunate to work with many librarians, professors, measurement and evaluation experts, and other professionals on the development of this test. We are grateful for the opportunity to collaborate with these creative people and to benefit from their insights and wisdom.


Test Item Developers
Jennifer Fabbi – Cal State San Marcos
Hal Hannon – Palomar and Saddleback Colleges
Angela Henshilwood – University of Toronto
Lettycia Terrones – Los Angeles Public Library
Dominique Turnbow – UC San Diego
Silvia Vong – University of Toronto
Kelley Wantuch – Los Angeles Public Library

Test Item Reviewers
Joseph Aubele – CSU Long Beach
Liz Berilla – Misericordia University
Michelle Dunaway – Wayne State University
Nancy Jones – Encinitas Unified School District

Cognitive Interviewers
Joseph Aubele – CSU Long Beach
Sophie Bury – York University, Toronto
Carolyn Gardner – CSU Dominguez Hills
Jamie Johnson – CSU Northridge
Pearl Ly – Skyline College
Isabelle Ramos – CSU Northridge
Silvia Vong – University of Toronto

Field Test Participants
Andrew Asher – Indiana University
Joseph Aubele – California State University, Long Beach
Sofia Birden – University of Maine Fort Kent
Rebecca Brothers – Oakwood University
Sarah Burns Feyl – Pace University
Kathy Clarke – James Madison University
Jolene Cole – Georgia College
Gloria Creed-Dikeogu – Ottawa University
David Cruse – Adrian College
April Cunningham – Palomar College
Diane Dalrymple – Valencia College
Christopher Garcia – University of Guam
Rumi Graham – University of Lethbridge
Adrienne Harmer – Georgia Gwinnett College
Rosita Hopper – Johnson & Wales University
Suzanne Julian – Brigham Young University
Cynthia Kane – Emporia State University
Martha Kruy – Central Connecticut State University
Jane Liu – Pomona College
Talitha Matlin – California State University at San Marcos
Courtney Moore – Valencia College
Colleen Mullally – Pepperdine University
Dena Pastor – James Madison University
Benjamin Peck – Pace University
Carolyn Radcliff – Chapman University
Michelle Reed – University of Kansas
Stephanie Rosenblatt – Cerritos College
Heidi Senior – University of Portland
Chelsea Stripling – Florida Institute of Technology
Kathryn Sullivan – University of Maryland, Baltimore County
Rosalind Tedford – Wake Forest University
Sherry Tinerella – Arkansas Tech
Kim Whalen – Valparaiso University

Standard Setters
Joseph Aubele – California State University, Long Beach
Stephanie Brasley – California State University Dominguez Hills
Jennifer Fabbi – California State University San Marcos
Hal Hannon – Palomar and Saddleback Colleges
Elizabeth Horan – Coastline Community College
Monica Lopez – Cerritos College
Natalie Lopez – Palomar College
Talitha Matlin – California State University San Marcos
Cynthia Orozco – East Los Angeles College
Stephanie Rosenblatt – Cerritos College

The Threshold Achievement Test for Information Literacy (TATIL) measures student knowledge and dispositions regarding information literacy. The test is inspired by the Association of College and Research Libraries' Framework for Information Literacy for Higher Education and by expectations set by the nation's accrediting agencies. TATIL offers librarians and other educators a better understanding of their students' information literacy capabilities. These insights identify areas for improvement, guide course instruction, affirm growth following instruction, and help prepare students to be successful in learning and life. Each test is made up of a combination of knowledge items and disposition items.
...continue reading "It’s Here! Announcing the Threshold Achievement Test for Information Literacy!"