
Saturday, August 3, 2013

Web-based Polling in Library Instruction

Jared Hoppenfeld's article "Keeping students engaged with web-based polling in the library instruction session" introduces web-based polling as a means to promote student engagement.  His literature review covers several areas: the main constituents of library instruction in academic libraries (Millennials), active learning, Audience Response Systems and web-based polling, and mobile technologies.  Indeed, he provides a thorough overview of the topic, bringing to light some web-based polling sites I was not previously aware of: Text the Mob and SMS Poll.

Of particular interest to me, he offers suggestions on the types of questions a library instructor might ask during a typical library instruction session.  Hoppenfeld starts with an icebreaker, such as "How happy are you that college football season is here?"  If the class takes place closer to Valentine's Day, he might ask about profits from chocolate sales (243).  This signals to students that the class will not be a regular library lecture, and it also introduces them to the polling software.

Hoppenfeld's second set of questions deals more with student knowledge.  Where are they coming from?  What have they tried when conducting research?  He may ask where they would expect to find a journal article: in a catalog or in a database.  "An open-ended poll is also used to find out what resources the students have previously used for their research" (243).  This offers an opportunity to discuss what they have tried and to explain why they would want to take advantage of library resources.  What are the pros and cons of searching Google, Wikipedia, or About.com?

 

Wednesday, June 20, 2012

Teaching with Xtranormal, Poll Everywhere, Wikis, and Skype

It appears that many librarians today believe that technology needs to be used in library instruction to catch the attention of "digital natives," the current set of college students.  Nicole Eva and Heather Nicholson write: "Library instruction is viewed by many students as being less than enthralling.  Students may not understand how important the library can be for their academic endeavours, or they may think that they know all they need to know.  As a result, librarians often seek new and innovative ways to engage classes" (1-2).  These authors do not promote technology for its own sake; rather, they tout it as a tool for engaging students in collaborative ways.

They believe students "are familiar and comfortable" with technology, so it can be utilized as a means to deliver content.  In their words: "Technology can make library instruction more engaging, more entertaining and more interactive" (2).  Their article, "DO Get Technical!  Using Technology in Library Instruction," highlights four technology tools library instructors can use to engage students: Xtranormal, Poll Everywhere, wikis, and Skype.

Xtranormal allows individuals to create their own little videos with pre-fabricated characters, backgrounds, and voices.  If you can type, you can create an Xtranormal video: typing words into dialogue boxes creates the audio component, which a machine reads aloud in the accent of your choosing.  "You can make one Xtranormal video for free; after that you must buy points.  The more points you buy the less expensive they are, but generally it costs only two to three dollars for a basic movie" (2).  They recommend it as an amusing, inexpensive tool for sharing information and teaching students.

Students in their classes have enjoyed the humor and entertainment of being introduced to a topic or listening to a summary this way.  They also suggest: "Students could also create their own videos in order to demonstrate their understanding of a topic" (3).  The machine-generated voices and the gestures throughout the videos increase the humor.  They have created a few publicly available Xtranormal videos.

The next tool they explain is Poll Everywhere.  It lets students answer questions anonymously in real time, and the questions can be inserted into a PowerPoint presentation.  Nicholson and Eva write: "A basic account allows up to 30 responses per question and unlimited questions for free, and upgrades range from $15 to $1,400 per month" (4).  Students can text their answers from cell phones or respond from any computer with internet access.  They explain: "As we have seen with classroom clickers, this is a great way to encourage class participation" (4).  "The polls are updated instantly, and the PowerPoint slide changes dynamically as students enter their answers" (4).  The results let instructors correct misunderstandings and solidify the learning; they can also prompt further discussion.

Nicholson and Eva also promote the use of wikis in library instruction, highlighting their effectiveness as a collaborative learning tool.  Like Poll Everywhere, wikis "exist in 'the cloud' with no downloads required" (5).  Private wikis can also be purchased, so that only the students in a class can access the project.  They explain the essential aspects of a wiki: "The premise behind wikis is that they are collaborative; all users can edit or create new entries.  Student participation in a wiki is an effective way to promote active learning" (6).  Participating in this endeavor turns on the light for many students as they begin to understand how information is created, edited, and shared.  In turn, they begin to realize how important it is to evaluate the information they find.

Finally, they talk about Skype.  This online videoconferencing tool can be used to teach students in distance settings, though this requires hardware such as microphones, video cameras, and speakers, not to mention high-speed internet access.  Nicholson and Eva have taken advantage of the technology to instruct students at distance sites, so they speak from experience (7).  I appreciated that they mentioned how they anticipated and prepared for technical difficulties.  When the visual feed was lost on the distance site's end, the instructor there was able to display the presentation slides that had been emailed previously while the library instructors continued speaking and teaching.  The instructor could demonstrate along with the librarians as they both walked through the presentation (8). 

Nicole Eva and Heather Nicholson believe that these technologies are "unique, effective, EASY, and low or no-cost.  When used correctly and where warranted, these applications can be useful in engaging students in sessions in which they might otherwise tune out" (9).  Even their own lack of technological experience did not keep them from succeeding, and they believe others can have the same kind of fruitful experience in the library instruction room (9).

Work Cited
Eva, Nicole, and Heather Nicholson.  "DO Get Technical!  Using Technology in Library Instruction."  Partnership: The Canadian Journal of Library and Information Practice and Research 6.2 (2011): 1-9.
University of Lethbridge logo.  Fiat lux is Latin for "Let there be light."
Nicole Eva and Heather Nicholson work in the University of Lethbridge Library.

Tuesday, June 12, 2012

Clickers, Participation, Assessment Outcomes, and Library Instruction

Clickers, or personal response systems, may encourage participation and help students enjoy library instruction more.  Emily Chan and Lorrie Knight, from the University of the Pacific, conducted a study that found this to be true.  They also learned that assessment outcomes do not necessarily improve as a result of using clickers.

Published in Communications in Information Literacy, their article "Clicking with Your Audience: Evaluating the Use of Personal Response Systems in Library Instruction" first identifies the makeup of college students participating in their study.  They belong to the Millennial generation who "tend to share these main character traits: feeling special, being sheltered, having confidence, preferring team or group activities, favoring the conventional, feeling pressured, and needing to achieve" (193).  They make the following claim: "Library instruction, often delivered through one-shot sessions, may seem out of touch to Millennials if it does not incorporate technology in a meaningful and entertaining manner" (193).  With this premise as their foundation, they propose the usage of personal response systems (PRS or clickers) to engage students.

In their literature review, they show that others have found that use of PRS helps students get involved in the classroom, promotes conversation, and enhances learning (193).  They note that PRS make lectures and class activities more lively and less "stagnant" (193).  As mentioned elsewhere, clickers allow instructors to adjust their teaching in the moment, because they can see what the students know.  An atmosphere of active engagement and learning may therefore be easier to establish with clickers (193).

Little had been written about PRS and actual learning outcomes, so Chan and Knight designed their study to examine this.  They cite Anne C. Osterman's 2008 article that identifies library instructors' two greatest fears: (1) boring students and (2) teaching above their heads.  They refer to another article when they write: "The use of clickers can prompt greater classroom interactivity through an assessment of students' understanding of IL concepts" (194).  They also cite a finding that clickers increase student involvement in the classroom as well as their use of library resources (194).  In short, this study looks at student enjoyment, engagement, and achievement as they relate to the implementation of clickers in the classroom.

As with other studies, they prepare the reader by defining the constituents involved--in this case, freshmen at the University of the Pacific--and by explaining the course objectives of the freshman seminar courses and the library evaluations gathered before the study took place.  "At the end of each library session, students completed a brief evaluation measuring their achievement of learning outcomes.  The Assessment Office tabulated and analyzed the results, which proved to be inconclusive" (195).  Librarians convinced their library dean to fund a second instruction room equipped with more technology, such as a smart board, a computer for each participant, and clickers.  This allowed the librarians to conduct an experiment to see if the technology influenced student learning outcomes.

Surprisingly enough, they found that the classes without clickers scored better than the ones with them.  They write: "The students in Classroom NC (non-clicker) scored significantly higher in the assessment than the students who had their library session in Classroom C (clickers) (P value < 0.001)" (197).  That is not to say that there were no positive outcomes for students receiving instruction with the clickers.  Chan and Knight write: "The students in the technology-rich Classroom C found the library sessions to be more enjoyable, organized, well-presented, and participatory" (197).  Perhaps these positive results would continue to justify the use of clickers in the classroom.

No doubt the authors were perplexed that the technology did not increase content retention; however, they offer some reasons why the students in the technology-rich classroom may not have achieved higher scores on the assessment measures.  They do so by noting potential benefits of a paper assessment:
  1. Students can use the paper assessment itself as a resource
  2. Students can self-regulate order and pace during the test time
  3. Students can see all the questions from the start (similar to reason #1 above)
  4. Students can review and correct their answers before turning them in to be graded
  5. A paper test gives students more control over how they use their time than a test offered with technology, especially if the instructor changes the questions (198)
If librarians use the clicker technology to assess learning, these reasons may be worth remembering. 
Boulder Chain Lakes area in the White Clouds of Idaho.  The lakes in the photo may be Sliderock Lake (left) and Shelf Lake (right).  Photo by Spencer Jardine, 2010.
Here are a few other things worth mentioning from this article.  Students in classes with clickers seemed to enjoy the instruction more and found it more organized, well-presented, and participatory than students in classes without them (199).  Millennials may expect and want technology to be used.  Indeed, Chan and Knight also mention another study suggesting that "the use of clickers can restart the attention span of students" (199).  Sometimes this is necessary to bring students back to the subject at hand.

The authors see clickers as useful tools to invite participation, adjust to student needs, and get things going at the beginning of library instruction sessions.  They write: "With the clickers' ability instantly to poll the audience, library faculty used warm-up questions as icebreakers in order to foster a more collaborative and engaging environment" (199).  They had wanted the clickers to increase content retention, but the students in the non-clicker classroom out-performed their peers in the classroom with clickers.  Naturally, as the authors mention, other researchers should look at how learning outcomes are influenced by the use of technology in the library instruction classroom.

Chan, Emily K., and Lorrie A. Knight.  "Clicking with Your Audience: Evaluating the Use of Personal Response Systems in Library Instruction."  Communications in Information Literacy 4.2 (2010): 192-201.  Print.

Thursday, May 31, 2012

Effectiveness of Clickers in Big, Intro to Psychology Classes

A group of researchers from the University of Delaware, in a study that looked at the effectiveness of personal response systems, found that modest use of "clickers" increased exam performance.  They did not see evidence that clickers actually increased engagement in their study.  In a reference to the literature they write: "According to some researchers, students like clickers, and students also believe clickers make them feel more engaged" (45).  As far as their own students went, however, they note: "Although Dr. B reported that students 'got a kick out of them,' clickers had only marginal effects on self-reports of student engagement, attendance, and reading in this study--effects that may be attributable to Type I error" (48). 

At the time of the study, freshmen at the University of Delaware marked which classes they wanted to take their first year, and a computer assigned their schedules.  Morling, McAuliffe, Cohen, and DiLorenzo mention more than once that this random assignment made their study more reliable: students' personal preferences did not determine when they took the psychology class.  Morling et al. explain that the teachers used the clicker system only minimally:
Both professors taught using an interactive lecture style.  Both professors taught the earlier section without clickers ('traditional' sections) and the later section with clickers.  In clicker sections, at the beginning of class, the instructor posted five multiple-choice, fact-based questions, based on the day's required reading.  Students earned extra credit for answering these questions correctly.  Later in the class period, if relevant, the instructor would briefly elaborate on a clicker question that most students had misunderstood.  Other than this change, instructors taught the two sections identically.  (46)
Data gathered from the exam results did indicate that "exam scores were higher for clicker sections than for traditional sections" (47).  This held true for both teachers, each of whom taught two sections of a large introductory psychology class.  Morling et al. summarize it in more formal language: "Our data suggest that using clickers to quiz students in class on material from their reading resulted in a small, positive effect on exam performance in large introductory psychology classes" (47).

Further studies might consider looking at teaching methods used in conjunction with the technology.  For example, they suggest looking at concept inventories, group discussions, and Just in Time Teaching (JiTT), which could all be joined with clickers to see how they might enhance learning (48).  For more clarification, the authors write: "In our study, the instructors used clickers very minimally--to administer quizzes, publicly display the results, and quickly correct any widespread misunderstandings" (47-48). 

Moreover, the article addresses the possibility that some students cheated while taking the reading quizzes, though the authors concede that this may actually have promoted a cooperative learning environment, which would have improved engagement in the class (49).  Overall, this was a good article: it found a positive result of using clickers via a scientific study, rather than relying on anecdotes or the fact that the technology was trendy at the time.

Works Cited
Morling, Beth, Meghan McAuliffe, Lawrence Cohen, and Thomas M. DiLorenzo.  "Efficacy of Personal Response Systems ("Clickers") in Large, Introductory Psychology Classes."  Teaching of Psychology 35.1 (2008): 45-50.

Stowell, Jeffrey R., and Jason M. Nelson.  "Benefits of Electronic Audience Response Systems on Student Participation, Learning, and Emotion."  Teaching of Psychology 34.4 (2007): 253-58.

Thursday, May 24, 2012

Audience Response Systems in the Classroom

Heidi Adams and Laura Howard write: "Audience Response Systems, commonly known as clickers, are gaining recognition as a useful classroom tool" (54).  In their short article, "Clever Clickers: Using Audience Response Systems in the Classroom," they define Audience Response Systems (ARS), describe the two major types of systems (radio frequency and infrared), show how the systems can be used to gather feedback, check for understanding, and assess student learning, and provide specific ideas for using ARS in the classroom.

As a tool to promote student learning, an ARS requires each student to answer questions with a remote control, and results are shown right away.  Adams and Howard write: "Since the educators are able to see the results instantly, it permits them to evaluate student understanding at that very moment and provides an opportunity to adjust the lesson accordingly to improve student comprehension" (54).  It helps instructors know whether students got it.  ARS can be used and adapted to meet the needs of each student and each class.

Of particular value, this article offers twenty ideas for using ARS in the classroom.  Here are just a few:
  • Comprehension Testing
  • Drill and Practice
  • Review Games
  • Questionnaires/Surveys
  • Voting
  • Checking for understanding during a lecture
  • Fact Finding or Pre- and Post-Tests (55)
Adams and Howard cite some of the literature in making their point that ARS are good for students because they increase engagement in the classroom; students also seem to like class more.  They write: "With clickers, every student answers every question.  Additionally, the questions will spark more questions from students that will lead to further discussion and understanding regarding the material" (55).  Moreover, the on-the-spot assessment or feedback lets students know whether they got an answer right (56).  They do not have to guess, which seems to cement the learning process.

Naturally, the ARS do not solve all problems and have a few drawbacks.  Adams and Howard claim: "As with any other type of learning, if ARS is used too often, students tire of it" (56).  In other words, students like the newness of the technology, but with time will become less interested in it.  Still, they assert "that the advantages such as instant feedback and increased student engagement far outweigh the downsides" (56).  The potential of these systems does seem fairly expansive.
Qwizdom Clicker.  See "Spotlight on Education: Sandwood's S.A.IN.T Academy Hosts First Annual Media Day." Duval County Public Schools.
A sidebar in the article lists a half dozen brand names of ARS.

Work Cited
Adams, Heidi, and Laura Howard.  "Clever Clickers: Using Audience Response Systems in the Classroom."  Library Media Connection 28.2 (October 2009): 54-56.

Wednesday, May 23, 2012

How does empowering versus compelling students influence participation attitudes?

A group of researchers at Brigham Young University collaborated on an article in Active Learning in Higher Education.  Published in 2007, it discusses a gap in the research on clickers, or Audience Response Systems (ARS).  Many studies have looked into participation factors: how clickers prompt discussions, how they uncover misconceptions, and how they offer immediate feedback.  They write: "Studies that explored direct measures of student learning with audience response systems have shown mixed results [...] Although these studies showed mixed results, most studies that looked at indirect measures of student learning (including levels of participation and engagement) found positive results" (236-37).

What about the groups of students who typically hold back and do not participate?  They cite studies showing that females tend to participate less "because they worry that they might appear to dominate discussion" (237).  Students from other cultures also participate less frequently, not wanting to give incorrect answers, "fearful of 'losing face'" (237).  These researchers call this group "reluctant participators" and note that other studies on ARS have not looked into this demographic (237).
"Reluctant Participants," by Middle Age Biker.
They found that students perceive ARS to be less helpful when they are used for grading or marking attendance.  Technical difficulties with the systems emerged as the number one complaint, followed by cost to students, grading, and mandatory attendance.  Students had to pay $40 apiece, so "a significant number of students were critical of cost effectiveness" (240).  However, when the ARS were used to provide formative feedback, students perceived the use of the technology as positive.

On the whole, however, "the overall reaction to the use of the ARS in the pilot was positive.  [...] For all of the measures except one, a strong majority of the students 'agreed' or 'somewhat agreed' that the ARS was helpful to them in the learning experience" (238, 240).  Nonetheless, reluctant participators tended to view it as less helpful than the rest of the group.  Moreover, "students in classes where the ARS was not used for grading viewed the ARS to be a more helpful tool" (242).  Using the systems without tying them to grades thus appears to be a major advantage.

During the survey, students were given opportunities to comment on what they liked as well as what they did not like about ARS.  "One student wrote, 'It was nice to have an immediate response so I could know whether or not I was doing something right'" (248).  Other students appreciated knowing what their peers thought on issues or content related to the class. The authors included this positive comment as well: "'The best part is that we could see the results of the surveys and what not instantly, and they were applicable to us in the class, not some foreign group of individuals.  It brought the issues to life which we were discussing in class'" (248).  This emphasizes the potential power of these systems, bringing forth relevant issues that can energize discussions.

At this point it seems appropriate to go back to the article's introduction and mention what happens when students have limited opportunities to participate and the method involves raising hands.  The authors write: "When classes have limited opportunities for students to respond, participation can be unbalanced in favor of the most knowledgeable students who are most willing to respond in front of their peers" (234).  Personally, I have experienced this countless times.  Students often have good things to share, but it often happens that one or two students dominate any sort of discussion.  The strength of the ARS is that everyone can participate anonymously.

While we are talking about active learning in general, it may be helpful to consider what the authors of this study wrote: "A diverse body of educational research has shown that academic achievement is positively influenced by the amount of active participation of students in the learning process" (233-34).  In the past, response cards were used to increase student participation and performance in higher education.  Today many have adopted ARS to invite participation and improve performance.  Active learning makes a difference, but it helps to consider the subgroups who participate reluctantly.

Again, it is better to avoid using these systems for grading and attendance purposes, considering how many variables can obstruct the system's success.  The authors included this student comment: "'I think it's an unfair grading system.  I think they're great, as long as grades aren't based on them.  There are too many variables like battery failure, problems with being on the correct channel and so forth that interfere.  These variables make it unfair to base grades on clicker systems'" (240).  Clickers can be a powerful tool; however, students seem to dislike the tool when it appears faulty and their grades depend on its performance.

Instructors should review the purpose of the tools or methodologies they use in class.  The authors write: "The researchers in this study believe that the central concern of instructional technologists should be that of 'helping' students" (248).  Though the grading and attendance features may be helpful for instructors, if they create negative feelings in students that inhibit learning, then perhaps they should not be used to grade student work.  In their concluding remarks, the authors wrote: "Students perceived strategies that provided formative feedback and empowered them to evaluate their own performance as more helpful than strategies oriented towards grading and compelling participation" (251).  This explains the title of their article rather succinctly.

For researchers it may be helpful to know what they suggest for later studies with ARS.  "Future research could investigate a wider range of pedagogical strategies and the environments in which they are successful.  A measure of the goodness of the strategies could be student and instructor perceptions of their helpfulness in the learning process" (250).

Work Cited
Graham, Charles R., Tonya R. Tripp, Larry Seawright, and George L. Joeckel III.  "Empowering or Compelling Reluctant Participators Using Audience Response Systems."  Active Learning in Higher Education 8.3 (2007): 233-58.  Print.

Tuesday, May 15, 2012

Clickers: An Engaging Tool for Library Instruction

In 2008 Anne C. Osterman published an article online for librarians about the potential of student response systems (SRS), or clickers, in the library instruction setting.  College & Undergraduate Libraries published it under the title "Student Response Systems: Keeping the Students Engaged."  (It appears that the print version came out first, in 2007.)  She introduces the topic by mentioning many factors that work against participation in the library instruction classroom: an unfamiliar setting, a short opportunity (one shot at teaching library skills), and content many would not consider exciting.

Librarians do what they can to invite participation.  They work to tie the instruction directly to an assignment, develop hands-on exercises, create handouts, and sometimes divide classes into groups to work together (50).  Osterman writes: "These tools do little, however, to help with one more inherent difficulty of library instruction: a wide variety of experience levels among students" (50).  Then she identifies "the two greatest fears of a library instructor [...]: (1) boring the students because they've seen it all before; and (2) losing the students because the territory is too foreign to their knowledge and experience.  Both lead students to tune out" (50).

"bored-students."  by cybrarian77 on Flickr.com.

This resonates with my own experience.  These are two of my greatest fears, and I have wondered how to deal with them.  The most obvious thing to do is what Osterman calls "the last tool in the box: asking the students questions" (50).  Unfortunately, this does not always work, and Osterman recognizes that all the difficulties just mentioned make this effort less effective as well.  Encouragingly, she writes: "Never fear--there is another solution" (50).  The Student Response System can make a difference: increasing participation, engaging students of all personalities and abilities, and offering a mechanism that prompts the instructor to adjust to the classroom's needs and address deficiencies without belaboring subjects students have already mastered.

Osterman observes that instructors can ask students questions spontaneously or "on the fly" (51).  She suggests questions like:
  1. Have you ever used X (JSTOR, Academic Search Complete, CQ Researcher, etc.)?
  2. What kinds of materials do you think you would find in X (the library catalog, the Special Collections digital archives, the Primo search, etc.)?
  3. Should you cite Wikipedia in a research paper?  Should you do X?
Osterman explains that the poll remains open for responses, and then students can see what everyone else has answered (51).  Often this means that students who are embarrassed about answering incorrectly see that they are not the only ones who do not understand, so their embarrassment decreases dramatically and they can focus on the learning.

Osterman describes the two types of clicker systems: radio frequency and infrared.  Plus, she identifies some of the pros and cons of each (51).

In the next section of her article, she addresses the question: why use clickers?  Citing the extant educational literature, she gives at least five reasons:
  1. Combat a passive learning environment
  2. Promote active learning
  3. Help with participation problems
  4. Provide instant feedback
  5. Interrupt the lecture.
Additionally, she addresses the anonymous nature of the system: "Some instructors believe that anonymity makes students more comfortable and likely to participate, and this has been supported by research in students' opinions of these systems" (52).  As mentioned previously, the anonymity eliminates, or at least lessens, the fear of embarrassment (52).  What really gets me excited is the potential for increasing the level of learning that takes place in the library classroom.  Osterman claims: "Also by encouraging students to make an actual decision about a question, the SRS makes them less likely to sit back and let the information wash over them unabsorbed.  Instead they evaluate a question and answer with engaged minds" (52).

"Law Students Use PRS."  by jonalltree on Flickr.com

As you can tell, this article really caught my interest; I can hardly stop quoting from it.  The next section talks about how library instructors can and ought to adjust their instruction when using an SRS tool.  Osterman describes an average library workshop, then suggests that student questions and answers can determine which small parts of the instruction should be taught again, passed over entirely, or explained more thoroughly.  With some forethought, instructors could devise questions to generate discussions.  Likewise, students could be asked sensitive questions whose answers could then be compared to published data.  Along these lines, Osterman suggests that students could be asked about their incomes, and those figures could then be compared with U.S. Census Bureau data for their particular locale (53).

The system could also be used to ask students to predict what might happen.  When they are required to answer, they become more committed and, thus, engaged.  She offers a pair of questions related to Boolean operators, which invite the student to predict whether more or fewer results will be retrieved.  For the serious-about-learning types, she tells how some SRS collect the data for later analysis, which would allow instructors to adjust their methods even more (54).

We often hear that technology should not replace teaching, that it is just a tool to enhance learning.  This is true, and we should remember it.  As with any technology, pros and cons exist.  Osterman warns that with this technology less content may be taught, that it may "distract instructors from their teaching," and that students may forget clickers, use them to cheat, or even walk away with them.  Fortunately, the benefits of learning "might easily outweigh" the cost of less content being taught, and libraries that buy their own systems would not need to worry about students forgetting their clickers, though students could still walk out the door with them at the end of class if one wasn't careful (54).

The last section of the article discusses "The Experience of American University Library," where Anne Osterman works.  In it she talks a bit more about vendors, different systems, training library instructors, necessary adjustments, using the SRS in library training sessions, and sample questions to ask with the system.  Encouragement and support should be given to those using the system for the first time, and making the system available for individuals to practice with is best (55).

From the experiences of her colleagues as well as her own, Anne Osterman writes: "Just as many beginning library instructors try to teach too much in the short amount of time they have and gradually slim their material down to an amount that is digestible, some instructors found that their first attempts in creating questions for a class were too complex" (56).  She recommends that librarians use the same questions in a series of classes; this will help instructors know how one class is different from another.  Again, the question "Have you used X resource?" may be a great standby.  "Overall, the response from library instructors at American University Library who have used the system has been very positive" (56). 

In summary, Osterman repeats that the anonymity and novelty of the system generate an engagement with library instruction that increases learning.  If money is an issue, a home-grown system or a "Web-based voting system" may work (56).  The short list of references looked helpful as well.

This article drove home the idea that polling students can really increase engagement, participation, and learning in the classroom.  Anonymity helps students participate more readily, and simple questions need to be the norm.  I really liked the sample questions she included.  This was quite helpful.

Work Cited
Osterman, Anne C.  "Student Response Systems: Keeping the Students Engaged."  College & Undergraduate Libraries 14.4 (2008): 49-57.  Print.

Monday, May 14, 2012

Information Literacy and Clickers

In my ongoing research related to audience response systems, or clickers, I discovered an article written in August 2009 by Patricia A. Deleo, Susan Eichenholtz, and Adrienne Andi Sosin.  Titled "Bridging the Information Literacy Gap with Clickers," the article explains how a graduate course in Educational Leadership and Technology received information-literacy instruction with the help of clickers.  The authors used the term Classroom Performance System (CPS), but other education researchers call these tools Audience Response Systems.

They set forth the terms "digital natives" and "digital immigrants" and discuss the differences between those who are more technologically savvy and those who are less so.  Mostly, they point out that "even technologically competent students overestimate their ability to effectively search for and access information" (439).  Likewise, "graduate students display overconfidence with regard to both their research and technology skills" (439).  But how does an instruction librarian make students aware of their lacking skills while promoting learning at the same time?  Who likes to hear that they are not as competent as they think they are?

The authors of the article rightly claim that "attention to the differential level of each student's information literacy capabilities is necessary in designing information literacy instruction" (439).  With students of different technology and information-literacy abilities in the classroom, how does a library instructor teach so that all can learn without feeling entirely lost or utterly bored?  Deleo and company write: "Information literacy classes where technology skill competence widely varies among students complicates the pedagogical situation" (440).  What can a librarian do to succeed in this complex environment?

Certainly, Deleo and her colleagues make an apt observation: "We have discovered that making assumptions about student technology or research skills is not effective, predictable, or advisable" (440).  Well, if one cannot rely on assumptions, what direction should be taken?  Clickers can enable librarians to clear some of these hurdles gracefully.  "Clickers were initially adopted as a pre-lesson assessment tool to assist the librarian in setting an appropriate starting point at the students' levels" (440).
From "Accessibility in Education" by Lucy Greco.
One of the most valuable parts of this article may be the types of questions they asked with the clicker system.  An appendix to the article lists the questions they required students to answer, but here is a little taste of what they looked for.  They wanted to know whether students could distinguish the Library of Congress Classification System from the Dewey Decimal System, find books in the catalog, document references in APA format, tell popular publications from scholarly ones, and define internet terminology.

My favorite part of their article was the discussion.  It came alive and highlighted their positive experiences using the system, mentioning how it engaged students, enlivened discussions, created a sense of community, and increased interaction with the librarians.  They described how they promoted this engagement in conjunction with the technology: "After each student had clicked in their answers to a question they were instructed to turn to their nearest classmate and discuss that question and the answer they had chosen [...] As a result of inserting 'turn and talk' into the CPS procedure, the engagement level of the class rose significantly" (443).  I can see how this would generate even more interest. 
"Getting Interactive in the Classroom with Technology!" from eLearning @ Liverpool
It also takes some of the burden off the instructor, because some of the students may answer the questions correctly.  At any rate, this does promote critical thinking skills: students must either defend their answer or accept their classmate's.  Deleo, Eichenholtz, and Sosin write: "The process generated a higher level of anticipation for feedback as well" (443).  The authors explain how they would like to turn the one-shot instruction session into a two-shot class, so the areas in which students were deficient could be addressed more fully in a second session (443).

The authors conclude with comments about the future potential of clicker systems in library instruction.  Essentially, the continuation of this methodology, they argue, may rely on student behavior.  They write: "student willingness and the librarian's skill at conducting the clickers session will be the larger issue, not the technology" (444).  In summary, they recommend that librarians investigate this technology.

Work Cited
Deleo, Patricia A., Susan Eichenholtz, and Adrienne Andi Sosin. "Bridging The Information Literacy Gap With Clickers." Journal Of Academic Librarianship 35.5 (2009): 438-444. Library Literature & Information Science Full Text (H.W. Wilson). Web. 11 May 2012.