
Winter-Summer 2018  

Student Learning and Information Literacy Committee (ACRL-SLILC)


Interview with April Cunningham, Palomar College, and Carolyn 
Radcliff and Rick Wiggins, Carrick Enterprises 
 
 
The Student Learning and Information Literacy Committee (SLILC) interviewed 
Carolyn Radcliff, Information Literacy Librarian, and Rick Wiggins, Chief 
Technology Officer, both of Carrick Enterprises, along with April Cunningham, 
Project Leader for the Threshold Achievement Test for Information Literacy, 
this semester to learn more about the Threshold Achievement Test and how it 
can help librarians better understand the information literacy capabilities of 
their students. SLILC members individually took a module of the test and 
submitted questions based on their experiences to help shape the dialogue for 
this interview. Read on to learn more about Carrick Enterprises and their 
approach to designing and providing resources to assess student learning in 
the 21st century.  
 
SLILC: Do you believe that students are properly trained in information literacy to 
become researchers and lifelong learners? 
 
Carolyn Radcliff (CR): When we look at college student performance on assessments 
such as SAILS and TATIL, we see much of what you would expect. There is a vast range 
of performance levels. Students make progress year to year from the time they enter 
college to the time they leave. We also see that while many students are comfortable 
with basic searching and simple source evaluation, fewer students are aware of the 
full range of their options, opportunities, and obligations with regard to information 
discovery and use. 
  
It’s exciting to be part of the effort to improve information literacy abilities of college 
students, both to help them be successful in college and so that they are discerning 
consumers and responsible contributors to the information ecosystem throughout 
their lives. 
 
SLILC: What are some of the reasons why you chose to create a test to survey student 
learning and students’ approach to research in the 21st century? 
 
RW: We want to contribute to the national discussion on information literacy with 
the overarching goal of elevating students’ abilities and dispositions. We wanted to 
build on the success of Project SAILS by creating a next-generation assessment that 
was responsive to the ​ACRL Information Literacy Framework​. So part of it was keeping 
up with the changing times. Creating a meaningful test requires a significant 
investment of time from content experts as well as programmers. We wanted to offer 
schools a second option (SAILS being the first) that would allow them to inquire into 
students’ information literacy in a way that they probably couldn’t do on their own, 
given limited resources. Once we began the work, we were also motivated by the 
challenge of considering both information literacy knowledge practices and 
dispositions for our assessment. 
 
SLILC: How did using the Framework as a guiding document change the focus of the 
TATIL (within the context of TATIL’s predecessor, Project SAILS)? 
 
CR: There was really no connection between the development of SAILS and that of 
TATIL, except that our involvement with SAILS gave us good experience to build on. 
We carried 
over from the SAILS project our commitment to student privacy, ease of test 
administration, and validity and reliability. 
  
The Framework gave us a different way to look at information literacy and what could 
be expected of an expert. With TATIL we had to first develop outcomes and 
performance indicators that were filtered through the Framework. Through that 
process we ended up with four modules, each with knowledge performance indicators 
and dispositions. 
 
SLILC: Were any of the six Frames especially challenging to develop test questions 
about? If so, which ones, and how did your team address that challenge? 
 
AC: TATIL addresses the six frames in four modules. This approach stems from our 
work at the beginning when we explored how students would demonstrate 
information literacy within the context of the Framework. We found that some of the 
frames were so closely related that it didn’t make sense for us to test them separately. 
Here is how we grouped the frames into modules:   

(1) Evaluating Process & Authority: "Information Creation as a Process" and "Authority is Constructed and Contextual" 
(2) Strategic Searching: "Searching as Strategic Exploration" 
(3) Research & Scholarship: "Research as Inquiry" and "Scholarship as a Conversation" 
(4) The Value of Information: "Information Has Value" 

Each module had its own challenge. Here are some examples. In Module 1, Evaluating 
Process & Authority, the challenge was to create realistic scenarios that present an 
audience need and ask a student to address that need. We needed to make complex 
scenarios readable. For Module 2, Strategic Searching, we had to create environments 
similar to what students would see in a database. Module 3, Research & Scholarship, 
required us to identify the core research practices that are common across 
disciplines, while adding questions that were somewhat discipline-specific but at a 
level that undergraduates could be expected to handle. For Module 4, The Value 
of Information, the trick was to create questions that get at new ways students create 
information. 
  
Luckily we had very creative and technically skilled professionals addressing these 
challenges. From custom programming to item development by teams to item review, 
cognitive interviews, and data analysis, we are set up to identify items that work, 
revise items that need it, and eliminate items that do not meet our standards for 
inclusion. 
 
SLILC: Are the questions always asked in the same order or is the order randomized?  
 
RW: There are several different sequences of the questions for a module. When a 
student begins the test, one of these sequences is randomly selected for their use. Each 
module also includes Decision Task Item Sets, composed of three or four items, where 
the questions build upon each other and are always asked in the same order. 
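
To make that concrete, here is a minimal Python sketch of how randomized ordering can coexist with fixed-order item sets. The names (MODULE_SEQUENCES, questions_for_student, the question IDs) are hypothetical, and the sketch illustrates the general approach rather than Carrick's actual implementation.

    import random

    # Hypothetical sketch: each module ships with several pre-built
    # question sequences; one is chosen at random per test-taker.
    # Decision Task Item Sets are indivisible blocks (tuples), so their
    # three or four items always appear together, in a fixed order.
    MODULE_SEQUENCES = [
        ["Q1", "Q4", ("DT1a", "DT1b", "DT1c"), "Q2", "Q3"],
        ["Q3", ("DT1a", "DT1b", "DT1c"), "Q1", "Q2", "Q4"],
        ["Q2", "Q1", "Q4", ("DT1a", "DT1b", "DT1c"), "Q3"],
    ]

    def questions_for_student(sequences=MODULE_SEQUENCES):
        """Pick one sequence at random, then flatten item-set blocks
        so their questions stay adjacent and in order."""
        chosen = random.choice(sequences)
        flat = []
        for entry in chosen:
            if isinstance(entry, tuple):  # Decision Task Item Set
                flat.extend(entry)
            else:
                flat.append(entry)
        return flat

    print(questions_for_student())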
 
SLILC: What skill/ability are the questions that ask test takers to rate strategies on 
the "usefulness" scale meant to identify? 
 
AC: These are information literacy disposition items, which investigate metacognitive 
awareness of strategies that align with information literacy dispositions. From the 
beginning, we’ve known that we wanted to address dispositions, as well as 
knowledge, in any new instrument we created. And in reading the literature, especially 
in education, we found a way to do that with scenario-based problem solving items. 
Through our rhetorical analysis of the Framework we identified four information 
literacy dispositions: mindful self-reflection, productive persistence, responsibility to 
community, and toleration for ambiguity. Each TATIL module evaluates one or more 
disposition with items that present a scenario describing an ill-defined information 
literacy challenge related to the content of the module. Students evaluate the 
usefulness of various strategies for addressing the challenge. Their choices allow us to 
determine how strongly inclined a student is toward the relevant disposition. These 
items do not have correct answers and they are scored and reported separately from 
the knowledge items. 
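
As an illustration of how usefulness ratings might be turned into a disposition score, here is a minimal Python sketch assuming a simple keyed-average model: strategies that reflect the disposition count as rated, while misaligned strategies are reverse-scored. The function names, five-point scale, and scoring rule are assumptions for illustration; the interview does not describe TATIL's actual scoring model.

    from statistics import mean

    RATING_MAX = 5  # assumed five-point usefulness scale

    def disposition_score(ratings, keys):
        """ratings: {strategy: usefulness rating, 1-5}
        keys: {strategy: +1 if it reflects the disposition,
               -1 if it runs against it (reverse-scored)}"""
        scored = []
        for strategy, rating in ratings.items():
            if keys[strategy] == 1:
                scored.append(rating)
            else:
                scored.append(RATING_MAX + 1 - rating)  # reverse-score
        return mean(scored)

    # Hypothetical "productive persistence" scenario, three strategies.
    keys = {"revise search terms": 1, "ask a librarian": 1, "give up": -1}
    ratings = {"revise search terms": 5, "ask a librarian": 4, "give up": 1}
    print(disposition_score(ratings, keys))  # higher = stronger inclination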
  
You can read more about information literacy dispositions in these essays by me 
and another Advisory Board member, Hal Hannon. 
Getting at the Dispositions - http://blog.informationliteracyassessment.com/?p=739 
Dispositions and Training Transfer - http://blog.informationliteracyassessment.com/?p=760
 
SLILC: There is a question in the Strategic Searching module about search tools, 
where the learner matches the names of tools to a description of what they contain. 
When one of our testers got to this question, there were several tools included that 
they had never used before because their library doesn't subscribe to them. Can 
questions like this one be customized to reflect the tools a library subscribes to 
and/or teaches/promotes through LibGuides, etc.? It would seem without the ability 
to customize test questions like this one, the assessment results will not accurately 
reflect students' learning at that particular library, which seems problematic. Can 
you comment on this? 
 
AC: It is important to us to ensure that the assessment is not specific to any one 
library or set of libraries. With that in mind, we generally did not refer to existing 
resources in the test questions. (One exception is Google, which, although not 
universally used, is well-known as an Internet search engine.) We invented source 
titles, authors, excerpts, quotations, abstracts, database names, and more. In the test 
question mentioned, the tools do not exist, except for Google. That said, there is no 
option for customizing the questions. 
  
RW:​ This brings up a bigger point, about the validity and reliability of the test 
questions. A critical component of the test development process is to ensure that every 
test question functions properly. This is why, even after all the expert review of draft 
questions, we conduct field testing and bring in a psychometrician to perform 
analyses of the questions based on responses from hundreds of students. We are 
checking to see if the questions perform as expected. Are there any questions that are 
exceptionally difficult, even for students who answered most questions correctly? 
Does each item help to differentiate novices from more advanced test-takers? Items 
with problems are either fixed through revision or eliminated from the test. All the 
questions in the finished modules, Evaluating Process & Authority and Strategic 
Searching, meet established standards for effectiveness. 
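
For readers curious about what such analyses involve, below is a minimal Python sketch of two classical test theory statistics that address exactly these questions: item difficulty (the proportion of students answering correctly) and item discrimination (a point-biserial correlation between an item and the rest of the test). These are standard psychometric checks, not necessarily the specific analyses Carrick's psychometrician performs.

    from statistics import mean, pstdev

    def item_difficulty(responses, item):
        """Proportion answering the item correctly; values near 0 flag
        items that are exceptionally difficult."""
        return mean(row[item] for row in responses)

    def item_discrimination(responses, item):
        """Point-biserial correlation between an item and the total score
        on the remaining items; low or negative values mean the item fails
        to separate novices from more advanced test-takers."""
        item_scores = [row[item] for row in responses]
        rest_scores = [sum(row) - row[item] for row in responses]
        mi, mr = mean(item_scores), mean(rest_scores)
        cov = mean((x - mi) * (y - mr)
                   for x, y in zip(item_scores, rest_scores))
        return cov / (pstdev(item_scores) * pstdev(rest_scores))

    # Toy data: rows are students, columns are items (1 = correct).
    responses = [
        [1, 1, 1],
        [1, 0, 1],
        [0, 1, 0],
        [0, 0, 0],
    ]
    print(item_difficulty(responses, 0))                # 0.5
    print(round(item_discrimination(responses, 0), 2))  # 0.71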
 
SLILC: Would you like to share any other thoughts with our audience about your 
future endeavors or the future of the TATIL?  
 
CR: Now that we're nearing completion of the second half of the assessment, we've 
been able to redirect our efforts toward listening more closely to our customers. We're 
interviewing people about their use of TATIL modules, seeking ways to improve the 
test management process, the testing experience, and the reports in order to meet 
their needs. We love hearing about their information literacy programs and talking 
with them about how assessments can be leveraged to improve information literacy 
on their campuses. 
 
This concludes our interview with Carrick Enterprises. If you have questions 
or are interested in speaking with Carolyn Radcliff or other members of their team 
about the Threshold Achievement Test, visit their website: 
https://thresholdachievement.com/. 
