Thursday, May 31, 2012

Effectiveness of Clickers in Big, Intro to Psychology Classes

A group of researchers from the University of Delaware, in a study that looked at the effectiveness of personal response systems, found that modest use of "clickers" increased exam performance.  They did not, however, see evidence in their study that clickers increased engagement.  Referring to the literature, they write: "According to some researchers, students like clickers, and students also believe clickers make them feel more engaged" (45).  As for their own students, they note: "Although Dr. B reported that students 'got a kick out of them,' clickers had only marginal effects on self-reports of student engagement, attendance, and reading in this study--effects that may be attributable to Type I error" (48). 

At the time of the study, freshmen at the University of Delaware marked which classes they wanted to take their first year, and a computer assigned their schedules.  Morling, McAuliffe, Cohen, and DiLorenzo mention more than once that this effectively random assignment made their study more reliable: students' personal preferences did not determine when they took the psychology class.  Morling et al. explain that the instructors used the clicker system only minimally:
Both professors taught using an interactive lecture style.  Both professors taught the earlier section without clickers ('traditional' sections) and the later section with clickers.  In clicker sections, at the beginning of class, the instructor posted five multiple-choice, fact-based questions, based on the day's required reading.  Students earned extra credit for answering these questions correctly.  Later in the class period, if relevant, the instructor would briefly elaborate on a clicker question that most students had misunderstood.  Other than this change, instructors taught the two sections identically.  (46)
Data gathered from the exams did indicate that "exam scores were higher for clicker sections than for traditional sections" (47).  This held for both instructors; each of the two teachers taught two sections of a large, introductory psychology class.  Morling et al. summarize it in more formal language: "Our data suggest that using clickers to quiz students in class on material from their reading resulted in a small, positive effect on exam performance in large introductory psychology classes" (47).

Further studies might consider teaching methods used in conjunction with the technology.  For example, the authors suggest looking at concept inventories, group discussions, and Just-in-Time Teaching (JiTT), all of which could be combined with clickers to see how they might enhance learning (48).  By way of clarification, they write: "In our study, the instructors used clickers very minimally--to administer quizzes, publicly display the results, and quickly correct any widespread misunderstandings" (47-48). 

Moreover, the authors address the possibility that some students cheated while taking the reading quizzes, though they concede that this may actually have promoted a cooperative learning environment, which would have improved engagement in the class (49).  Overall, this was a good article: it found a positive effect of clickers through a scientific study rather than relying on anecdotes or on the fact that the technology was trendy at the time.

Works Cited
Morling, Beth, Meghan McAuliffe, Lawrence Cohen, and Thomas M. DiLorenzo.  "Efficacy of Personal Response Systems ("Clickers") in Large, Introductory Psychology Classes."  Teaching of Psychology 35.1 (2008): 45-50.

Stowell, Jeffrey R., and Jason M. Nelson.  "Benefits of Electronic Audience Response Systems on Student Participation, Learning, and Emotion."  Teaching of Psychology 34.4 (2007): 253-58.

Wednesday, May 30, 2012

Got Statistics?

According to its "What We Provide" page, the U.S. Census Bureau's American FactFinder provides statistical data about the population, the economy, and other characteristics of communities across the United States.  The American Community Survey gives data and estimates related to commute time to work, age, race, income, home value, veteran status, and more.  Search boxes allow interested individuals to search for statistics by topic, race/ancestry, industry, state, county, or place.
Image from Claremont Insider.
To test the system, I looked for one-year estimates of household incomes in Idaho.  Below are some 2010 estimates discovered in the search:
  • Total households in Idaho: 576,709
  • About 108,000 Idaho households make less than $20,000 annually = 18.7%
  • About 73,000 Idaho households make more than $100,000 annually = 12.7%
Source: Table S1901, Income in the Past 12 Months (in 2010 Inflation-Adjusted Dollars), 2010 American Community Survey 1-Year Estimates.
FactFinder makes it easy to limit by race or ethnicity.  Here are the numbers for income in the last 12 months for White, non-Hispanic households:
  • 96,000 White, non-Hispanic households earn less than $20,000 annually = 16.6% of all Idaho households
  • 65,000 White, non-Hispanic households earn more than $100,000 annually = 11.3% of all Idaho households
Source: Table B19001H, Household Income in the Past 12 Months (in 2010 Inflation-Adjusted Dollars) (White Alone, Not Hispanic or Latino Householder); universe: households with a householder who is White alone, not Hispanic or Latino.  2010 American Community Survey 1-Year Estimates.
Hispanic/Latino households, according to the American Community Survey estimates, earn the following over a 12-month period:
  • 13,000 Hispanic households earn less than $20,000/year = 2.3% of all Idaho households
  • 2,000 Hispanic households earn more than $100,000/year = 0.3% of all Idaho households
Source: Table B19001I, Household Income in the Past 12 Months (in 2010 Inflation-Adjusted Dollars) (Hispanic or Latino Householder); universe: households with a householder who is Hispanic or Latino.  2010 American Community Survey 1-Year Estimates.

A different look at the same numbers (the Python sketch after this list shows how these within-group shares are computed):
  • Total number of White, non-Hispanic households: 509,056
  • 18.9% of White, non-Hispanic households earn less than $20,000/year
  • 12.8% of White, non-Hispanic households earn more than $100,000/year
  • Total number of Hispanic/Latino households: 45,626
  • 28.5% of Hispanic/Latino households earn less than $20,000/year
  • 4.4% of Hispanic/Latino households earn more than $100,000/year
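
To double-check the arithmetic, here is a quick Python sketch (my own, not a FactFinder feature) that recomputes both views from the rounded household counts quoted above:

# Rounded 2010 ACS 1-year estimates for Idaho quoted above.
state_total = 576_709  # all Idaho households

groups = {
    "White, non-Hispanic": {"total": 509_056, "under_20k": 96_000, "over_100k": 65_000},
    "Hispanic/Latino": {"total": 45_626, "under_20k": 13_000, "over_100k": 2_000},
}

for name, g in groups.items():
    # Share of ALL Idaho households (the first set of figures)...
    statewide = g["under_20k"] / state_total * 100
    # ...versus share WITHIN the group itself (the "different look").
    within_low = g["under_20k"] / g["total"] * 100
    within_high = g["over_100k"] / g["total"] * 100
    print(f"{name}: {statewide:.1f}% of all households earn <$20k; "
          f"within the group, {within_low:.1f}% earn <$20k and {within_high:.1f}% earn >$100k")

The two denominators explain why the same 96,000 households show up as 16.6% in one list and 18.9% in the other.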
For more statistical resources see this Resources by Subject: Statistics page.  Personally, I like the statistics section on the Speech page.  It seems that many college students like to make reference to statistics in their speech and communications classes.

Thursday, May 24, 2012

Audience Response Systems in the Classroom

Heidi Adams and Laura Howard write: "Audience Response Systems, commonly known as clickers, are gaining recognition as a useful classroom tool" (54).  In their short article, "Clever Clickers: Using Audience Response Systems in the Classroom," they define Audience Response Systems (ARS), explain the two major systems (radio frequency and infrared), describe how the systems can be used to gather feedback, check for understanding, and assess student learning, and provide specific ideas for using ARS in the classroom. 

Used as a tool to promote student learning, the system requires each student to answer questions with a remote control, and results are shown right away.  Adams and Howard write: "Since the educators are able to see the results instantly, it permits them to evaluate student understanding at that very moment and provides an opportunity to adjust the lesson accordingly to improve student comprehension" (54).  It helps instructors know immediately whether students got it.  ARS can be used and adapted to meet the needs of each student and each class.
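
To make the mechanics concrete, here is a minimal, hypothetical Python sketch (my own illustration, not anything from Adams and Howard) of the core loop an ARS performs: collect one answer per student, tally the choices, and display the distribution instantly so the instructor can gauge comprehension on the spot.

from collections import Counter

def show_results(responses, correct):
    # responses: list of (student_id, choice) pairs received from the remotes.
    tally = Counter(choice for _, choice in responses)
    total = len(responses)
    for choice in sorted(tally):
        share = tally[choice] / total
        print(f"{choice}: {share:6.1%} " + "#" * round(share * 20))
    print(f"Answered {correct} correctly: {tally[correct] / total:.0%} of class")

# Hypothetical responses to one multiple-choice question.
show_results([("s1", "A"), ("s2", "C"), ("s3", "C"), ("s4", "B"), ("s5", "C")],
             correct="C")

If most of the class misses a question, the instructor sees it immediately and can reteach before moving on.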

Of particular value, this article offers twenty ideas for using ARS in the classroom.  Here are just a few:
  • Comprehension Testing
  • Drill and Practice
  • Review Games
  • Questionnaires/Surveys
  • Voting
  • Checking for understanding during a lecture
  • Fact Finding or Pre- and Post-Tests (55)
Adams and Howard cite some of the literature to make their point that ARS benefit students by increasing engagement in the classroom, and students seem to like class more when clickers are used.  They write: "With clickers, every student answers every question.  Additionally, the questions will spark more questions from students that will lead to further discussion and understanding regarding the material" (55).  Additionally, the on-the-spot assessment and feedback let students know whether they got a question right (56).  They do not have to guess, and this immediate confirmation seems to cement the learning. 

Naturally, the ARS do not solve all problems and have a few drawbacks.  Adams and Howard claim: "As with any other type of learning, if ARS is used too often, students tire of it" (56).  In other words, students like the newness of the technology, but with time will become less interested in it.  Still, they assert "that the advantages such as instant feedback and increased student engagement far outweigh the downsides" (56).  The potential of these systems does seem fairly expansive.
Qwizdom Clicker.  See "Spotlight on Education: Sandwood's S.A.IN.T Academy Hosts First Annual Media Day." Duval County Public Schools.
A sidebar in the article lists a half dozen brand names of ARS vendors.

Work Cited
Adams, Heidi, and Laura Howard.  "Clever Clickers: Using Audience Response Systems in the Classroom."  Library Media Connection 28.2 (October 2009): 54-56.

Wednesday, May 23, 2012

How does empowering versus compelling students influence participation attitudes?

A group of researchers at Brigham Young University collaborated on an article in Active Learning in Higher Education.  Published in 2007, it discusses a gap in the research on clickers, or Audience Response Systems (ARS).  Many studies have looked into participation factors: how clickers prompt discussions, how they uncover misconceptions, and how they offer immediate feedback.  The authors write: "Studies that explored direct measures of student learning with audience response systems have shown mixed results [...] Although these studies showed mixed results, most studies that looked at indirect measures of student learning (including levels of participation and engagement) found positive results" (236-37). 

What about the groups of students who typically hold back and do not participate?  The authors cite studies showing that females tend to participate less "because they worry that they might appear to dominate discussion" (237).  Students from other cultures also participate less frequently, not wanting to give incorrect answers, "fearful of 'losing face'" (237).  These researchers call this group "reluctant participators," noting that other studies on ARS have not examined this demographic (237). 
Reluctant Participants.  by Middle Age Biker.
They found that students perceive ARS to be less helpful when the systems are used for grading or marking attendance.  Technical difficulties also came back as the number one negative aspect of the systems, followed by their cost to students, grading, and mandatory attendance.  Students had to pay $40 apiece, so "a significant number of students were critical of cost effectiveness" (240).  However, when the ARS were used to provide formative feedback, students perceived the technology positively.

On the whole, however, "the overall reaction to the use of the ARS in the pilot was positive.  [...] For all of the measures except one, a strong majority of the students 'agreed' or 'somewhat agreed' that the ARS was helpful to them in the learning experience" (238, 240).  Nonetheless, reluctant participators tended to view it as less helpful than the rest of the group.  Moreover, "students in classes where the ARS was not used for grading viewed the ARS to be a more helpful tool" (242).  Keeping the system ungraded, then, seems to be key to students perceiving it as helpful.

During the survey, students were given opportunities to comment on what they liked as well as what they did not like about ARS.  "One student wrote, 'It was nice to have an immediate response so I could know whether or not I was doing something right'" (248).  Other students appreciated knowing what their peers thought on issues or content related to the class. The authors included this positive comment as well: "'The best part is that we could see the results of the surveys and what not instantly, and they were applicable to us in the class, not some foreign group of individuals.  It brought the issues to life which we were discussing in class'" (248).  This emphasizes the potential power of these systems, bringing forth relevant issues that can energize discussions.

At this point it seems appropriate to return to the article's introduction and what happens when students have limited opportunities to participate and the method involves raising hands.  The authors write: "When classes have limited opportunities for students to respond, participation can be unbalanced in favor of the most knowledgeable students who are most willing to respond in front of their peers" (234).  Personally, I have experienced this countless times.  Students often have good things to share, but one or two students frequently dominate any discussion.  The strength of the ARS is that everyone can participate anonymously.

While we are talking about active learning in general, it may be helpful to consider what the authors of this study wrote: "A diverse body of educational research has shown that academic achievement is positively influenced by the amount of active participation of students in the learning process" (233-34).  In the past, response cards were used, and they helped to increase student participation and performance in higher education.  Today many have taken up ARS to invite that same participation and performance.  Active learning makes a difference, but it helps to consider the subgroups who participate reluctantly. 

Again, it is better to avoid using these systems for grading and attendance purposes, considering how many variables can obstruct the system's success.  The authors included this student comment: "'I think it's an unfair grading system.  I think they're great, as long as grades aren't based on them.  There are too many variables like battery failure, problems with being on the correct channel and so forth that interfere.  These variables make it unfair to base grades on clicker systems'" (240).  Clickers can be a powerful tool; however, students seem to dislike the tool when it appears faulty and their grades depend on its performance.

Instructors should review the purpose of the tools or methodologies they use in class.  The authors write: "The researchers in this study believe that the central concern of instructional technologists should be that of 'helping' students" (248).  Though the grading and attendance features may be convenient for instructors, if they create negative feelings that inhibit learning, then perhaps the system should not be used to grade student work.  In their concluding remarks, the authors wrote: "Students perceived strategies that provided formative feedback and empowered them to evaluate their own performance as more helpful than strategies oriented towards grading and compelling participation" (251).  This explains the title of their article rather succinctly.

For researchers it may be helpful to know what they suggest for later studies with ARS.  "Future research could investigate a wider range of pedagogical strategies and the environments in which they are successful.  A measure of the goodness of the strategies could be student and instructor perceptions of their helpfulness in the learning process" (250).

Work Cited
Graham, Charles R., Tonya R. Tripp, Larry Seawright, and George L. Joeckel III.  "Empowering or Compelling Reluctant Participators Using Audience Response Systems."  Active Learning in Higher Education 8.3 (2007): 233-58.  Print.

Friday, May 18, 2012

Technology to Put in the Hands of Students and Librarians

Several weeks ago I attended the ILA Region 5 & 6 Conference at Snake River Community Library.  Here are some notes I took at one of my favorite sessions. 

Tech Talk: Integrating 21st Century Tools into School Libraries with Gena Marker, current president of ILA.  She is a school librarian at Centennial High School in Meridian, ID.

This session was rather exciting for me because it addressed how technology can be used to enhance learning and the educational experience.  Gena began by mentioning how many teachers get into a technology rut, still using software adopted a decade ago, and she encouraged librarians to put new tools in students' hands.  I believe one of Idaho's educational goals is to increase student fluency with technology; someone made reference to this in the session. 

Gena has put flip or pocket cameras (Sony Bloggie models, I believe) into her students' hands.  Students need to learn new information and how to present it; they will have to navigate the future, and new tools and technology can help them do it.

Animoto: it's great and it's free.  As an educator you can upgrade when you log in, which gives more access.  Create a video with a few clicks.  The free version allows for a 30-second video, but educators can do more: embed photos and video clips.  It is quick to use once you have tried it out, as with other sites in the cloud.  (With the flip cameras, it is necessary to be close to people to hear them.)  Animoto lets you choose music from its built-in music library.

            Photo Story 3: this is good.

Zamzar: a third-party service that converts almost any file type (avi, flv, wav, mov, etc.).  It has an easy three-step process:

  1. Upload the file
  2. Specify the new file type you want
  3. Enter your email address, so they can send you the file with the new extension

Let students create videos.  It forces creativity.

Any Video Converter: another third-party file converter.  This one may cost money, but it also does more in the way of screen captures, ripping DVDs, and so forth.

Prezi: an interactive whiteboard that zooms, flips, and embeds photos.  Create an educator account.  It's possible to download a presentation to a USB device.  Check out flip cameras from the library and learn by trial and error.

Glogster: she recommends that teachers and librarians use Glogster EDU, as it may be safer and can be managed in a private setting just for a class.  Use Glogster to show things; it is an alternative to PowerPoint.  "Poster yourself": it's a digital poster, but it allows you to embed audio and video, and thumbnail photos can be enlarged.  She encouraged students to focus more on the content than on design.  Some students in an honors class created a Glog that looked at Macbeth from a feminist perspective; they embedded a video that could be viewed and displayed the poster while they explained it to the class during a presentation. 

Xtranormal: students love this the most and can waste a lot of time here.  Type in the text, and a mechanical voice will come out of the cartoon character you have chosen.  You can also type in stage directions, like "walk forward three steps" or "point to the left."

Audacity: free podcasting software, available as a free download.  Use a microphone with a USB connector to attach to the computer.  Record a book review and share it in your online catalog; with Follett's Destiny (her OPAC vendor), she can do this. 

            Windows Live Movie Maker: it is good for making movies with photographs, recorded digital movies, and music.

            VoiceThread: collaborate to change a presentation or comment on it. 

            Wordle: create word clouds.  This is a fun design thing.

            Overall, this presentation interested me because of the way she tied the technology to student learning and creativity.  It gave me some ideas and made me want to try some of these things.


Tuesday, May 15, 2012

Clickers: An Engaging Tool for Library Instruction

In 2008 Anne C. Osterman published an article online for librarians about the potential of student response systems (SRS), or clickers, in the library instruction setting.  College & Undergraduate Libraries published it under the title "Student Response Systems: Keeping the Students Engaged."  (It appears that the print version came out first, in 2007.)  She introduces the topic by mentioning many factors that work against participation in the library instruction classroom: an unfamiliar setting, a short opportunity (one shot at teaching library skills), and content many would not consider exciting. 

Librarians do what they can to invite participation.  They work to tie the instruction directly to an assignment, develop hands-on exercises, create handouts, and sometimes divide classes into groups (50).  Osterman writes: "These tools do little, however, to help with one more inherent difficulty of library instruction: a wide variety of experience levels among students" (50).  Then she identifies "the two greatest fears of a library instructor [...]: (1) boring the students because they've seen it all before; and (2) losing the students because the territory is too foreign to their knowledge and experience.  Both lead students to tune out" (50).

"bored-students."  by cybrarian77 on Flickr.com.

This resonates with my own experience.  These are two of my greatest fears, and I have wondered how to deal with them.  The most obvious remedy is what Osterman calls "the last tool in the box: asking the students questions" (50).  Unfortunately, this does not always work, and Osterman recognizes that all the difficulties just mentioned make this effort less effective as well.  Encouragingly, she writes: "Never fear--there is another solution" (50).  The Student Response System can make a difference: increasing participation, engaging students of all personalities and abilities, and offering a mechanism that prompts the instructor to adjust to the classroom's needs and address deficiencies without belaboring subjects students have already mastered.
Osterman observes that instructors can ask students questions spontaneously or "on the fly" (51).  She suggests questions like:
  1. Have you ever used X (JSTOR, Academic Search Complete, CQ Researcher, etc.)?
  2. What kinds of materials do you think you would find in X (the library catalog, the Special Collections digital archives, the Primo search, etc.)?
  3. Should you cite Wikipedia in a research paper?  Should you do X?
Osterman explains that the poll remains open for a time, and then students can see how everyone else answered (51).  Often, students who would be embarrassed about answering incorrectly see that they are not the only ones who do not understand; their embarrassment decreases dramatically, and they can focus on the learning instead.

Osterman describes the two types of clicker systems: radio frequency and infrared.  Plus, she identifies some of the pros and cons of each (51).

In the next section of her article, she addresses the question: Why use clickers?  Citing the existing educational literature, she gives at least five reasons:
  1. Combat passive learning environment
  2. Promote active learning
  3. Help with participation problems
  4. Provide instant feedback
  5. Interrupt lecture.
Additionally, she addresses the anonymous nature of the system: "Some instructors believe that anonymity makes students more comfortable and likely to participate, and this has been supported by research in students' opinions of these systems" (52).  As mentioned previously, the anonymity eliminates, or at least lessens, the fear of embarrassment (52).  What really gets me excited is the potential for increasing the level of learning that takes place in the library classroom.  Osterman claims: "Also by encouraging students to make an actual decision about a question, the SRS makes them less likely to sit back and let the information wash over them unabsorbed.  Instead they evaluate a question and answer with engaged minds" (52).

"Law Students Use PRS."  by jonalltree on Flickr.com

As you can tell, this article really caught my interest; I can hardly keep from quoting it.  The next section discusses how library instructors can and ought to adjust their instruction when using an SRS tool.  She describes an average library workshop, then suggests that student questions and answers can determine which parts of the instruction should be retaught, passed over entirely, or explained more thoroughly.  With some forethought, instructors could devise questions to generate discussion.  Likewise, sensitive questions could be asked and the answers compared with published data.  Along these lines, Osterman suggests that students could be asked about their incomes, and those figures could then be compared with U.S. Census Bureau data for their particular locale (53). 

The system could also be used to ask students to predict what might happen.  When they are required to answer, they become more committed and, thus, engaged.  She offers a pair of questions related to Boolean operators, inviting students to predict whether more or fewer results will be retrieved.  For the serious-about-learning types, she notes that some SRS products collect the response data for later analysis, which would allow instructors to adjust their methods even more (54). 
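
Her prediction exercise can be illustrated with a toy model of Boolean retrieval (my own Python sketch, not an example from the article): AND can only narrow a result set, while OR can only broaden it, so students are asked to commit to a guess about which query returns more results before the search runs.

# Toy index: each "document" is just the set of its keywords.
docs = [
    {"clickers", "engagement", "library"},
    {"clickers", "assessment"},
    {"library", "instruction"},
    {"engagement", "instruction", "library"},
]

def search_and(a, b):
    # Boolean AND: a document must contain both terms.
    return [d for d in docs if a in d and b in d]

def search_or(a, b):
    # Boolean OR: a document may contain either term.
    return [d for d in docs if a in d or b in d]

print(len(search_and("clickers", "library")))  # 1 -- AND narrows
print(len(search_or("clickers", "library")))   # 4 -- OR broadens

Making that commitment before the counts are revealed is exactly the kind of engagement Osterman describes.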

We often hear that technology should not replace teaching, that it is just a tool to enhance learning.  This is true, and we should remember it.  As with any technology, pros and cons exist.  Osterman warns that with this technology less content may be taught, that it may "distract instructors from their teaching," and that students may forget clickers, use them to cheat, or even walk away with them.  Fortunately, the benefits for learning "might easily outweigh" the cost of covering less content, and libraries that buy their own systems would not need to worry about students forgetting their clickers, though students could still walk out the door with them at the end of class if one were not careful (54).

The last section of the article discusses "The Experience of American University Library," where Anne Osterman works.  In it she talks a bit more about vendors, different systems, training library instructors, necessary adjustments, using the SRS in library training sessions, and sample questions to ask with the system.  Encouragement and support should be given to those using the system for the first time, and making the system available for individuals to practice with is best (55). 

From the experiences of her colleagues as well as her own, Anne Osterman writes: "Just as many beginning library instructors try to teach too much in the short amount of time they have and gradually slim their material down to an amount that is digestible, some instructors found that their first attempts in creating questions for a class were too complex" (56).  She recommends that librarians use the same questions in a series of classes; this will help instructors know how one class is different from another.  Again, the question "Have you used X resource?" may be a great standby.  "Overall, the response from library instructors at American University Library who have used the system has been very positive" (56). 

In summary, Osterman repeats that the anonymity and novelty of the system generate an engagement with library instruction that increases learning.  If money is an issue, a home-grown system or a "Web-based voting system" may work (56).  The short list of references looked helpful as well.

This article drove home the idea that polling students can really increase engagement, participation, and learning in the classroom.  Anonymity helps students participate more readily, and simple questions need to be the norm.  I really liked the sample questions she included.  This was quite helpful.

Work Cited
Osterman, Anne C.  "Student Response Systems: Keeping the Students Engaged."  College & Undergraduate Libraries 14.4 (2008): 49-57.  Print.

Monday, May 14, 2012

Information Literacy and Clickers

In my ongoing research related to audience response systems, or clickers, I discovered an article written in August 2009 by Patricia A. Deleo, Susan Eichenholtz, and Adrienne Andi Sosin.  Titled "Bridging the Information Literacy Gap with Clickers," the article explains how a graduate course in an Educational Leadership and Technology program received information-literacy instruction with the help of clickers.  The authors use the term Classroom Performance System (CPS), but other education researchers call these tools Audience Response Systems.

They set forth the terms "digital natives" and "digital immigrants" and discuss the differences between those who are more technologically savvy and those who are less so.  Mainly, they point out that "even technologically competent students overestimate their ability to effectively search for and access information" (439).  Likewise, "graduate students display overconfidence with regard to both their research and technology skills" (439).  But how does an instruction librarian make students aware of their lacking skills while promoting learning at the same time?  Who likes to hear that they are not as competent as they think they are?

The authors of the article rightly claim that "attention to the differential level of each student's information literacy capabilities is necessary in designing information literacy instruction" (439).  With students of varying technology and information-literacy abilities in the classroom, how does a library instructor teach so that all can learn without feeling entirely lost or utterly bored?  Deleo and company write: "Information literacy classes where technology skill competence widely varies among students complicates the pedagogical situation" (440).  What can a librarian do to succeed in this complex environment?

Certainly, Deleo and her colleagues make an apt observation: "We have discovered that making assumptions about student technology or research skills is not effective, predictable, or advisable" (440).  If one cannot rely on assumptions, what direction should be taken?  Clickers can help librarians clear some of these hurdles gracefully: "Clickers were initially adopted as a pre-lesson assessment tool to assist the librarian in setting an appropriate starting point at the students' levels" (440).
From "Accessibility in Education" by Lucy Greco.
One of the most valuable parts of this article may be the types of questions the authors asked with the clicker system.  An appendix lists the questions they required students to answer, but here is a taste of what they looked for: whether students could distinguish aspects of the Library of Congress Classification System and the Dewey Decimal System, their ability to find books in the catalog, their skills at documenting references in APA format, their understanding of popular versus scholarly publications, and their knowledge of internet terminology. 
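
As a thought experiment (my own sketch, not code from the article), pre-lesson clicker answers like these could be tallied by topic so the librarian can pick a starting point at the students' actual level, as the authors describe:

from collections import defaultdict

# Hypothetical pre-lesson quiz responses: (topic, answered_correctly).
responses = [
    ("finding books in the catalog", True), ("finding books in the catalog", True),
    ("finding books in the catalog", False),
    ("APA citation", False), ("APA citation", False), ("APA citation", True),
    ("scholarly vs. popular", True), ("scholarly vs. popular", True),
]

by_topic = defaultdict(list)
for topic, correct in responses:
    by_topic[topic].append(correct)

# List topics weakest-first; instruction starts where scores are lowest.
for topic, results in sorted(by_topic.items(), key=lambda kv: sum(kv[1]) / len(kv[1])):
    print(f"{topic}: {sum(results) / len(results):.0%} correct")
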
My favorite part of the article was the discussion.  It came alive and highlighted their positive experiences using the system, mentioning how it engaged students, enlivened discussions, created a sense of community, and increased interaction with the librarians.  They described how they promoted this engagement in conjunction with the technology: "After each student had clicked in their answers to a question they were instructed to turn to their nearest classmate and discuss that question and the answer they had chosen [...] As a result of inserting 'turn and talk' into the CPS procedure, the engagement level of the class rose significantly" (443).  I can see how this would generate even more interest. 
"Getting Interactive in the Classroom with Technology!" from eLearning @ Liverpool
It also takes some of the burden off the instructor, because some of the students may answer the questions correctly.  At any rate, the procedure promotes critical thinking: students either defend their answer or accept their classmate's.  Deleo, Eichenholtz, and Sosin write: "The process generated a higher level of anticipation for feedback as well" (443).  The authors explain that they would like to turn the one-shot instruction session into a two-shot class, so the areas in which students were deficient could be addressed more fully in a second session (443).

The authors conclude with comments about the future potential of clicker systems in library instruction.  Essentially, the continuation of this methodology, they argue, may rely on student behavior.  They write: "student willingness and the librarian's skill at conducting the clickers session will be the larger issue, not the technology" (444).  In summary, they recommend that librarians investigate this technology.

Work Cited
Deleo, Patricia A., Susan Eichenholtz, and Adrienne Andi Sosin. "Bridging The Information Literacy Gap With Clickers." Journal Of Academic Librarianship 35.5 (2009): 438-444. Library Literature & Information Science Full Text (H.W. Wilson). Web. 11 May 2012.

Friday, May 11, 2012

"Participatory Technologies" Article by Meredith Farkas

Meredith Farkas has written a good article for librarians involved with information literacy.  She celebrates the advantages of incorporating participatory technologies into the information-literacy classroom.  Using Web 2.0 tools such as blogs and wikis in the classroom can increase student learning and responsibility.  In fact, she claims that these tools "have the potential to create a more engaging learning environment.  Increased learner autonomy give[s] students a greater sense of responsibility for their learning and has been shown to improve student achievement" (84). 

One of the great advantages of blogs in relation to information-literacy learning is that blogs encourage reflective thinking, which can potentially guide students to think about their own research process.  Farkas notes that blogs can invite "reflection within an environment of peer interaction" (84).  Students will often listen to their peers before their instructors; they may be following the rule "Don't trust anyone over 30."

Farkas also extols the constructivist model, which naturally downgrades the traditional model of the teacher as the authority figure.  Students can learn and grow more when they interact with one another, challenging each other's ideas.  Of course, she explains this a bit more eloquently: "Constructivist pedagogy views students as active participants in learning who construct knowledge based on their existing understanding as well as interactions with peers and their instructor.  Unlike in behaviorism, the instructor is not seen as being wholly responsible for student learning" (86).  She ties this teaching theory to Web 2.0 and calls it Pedagogy 2.0, though I have not verified whether she coined the term.
Microsoft Office Clipart.
As an instruction librarian myself, I found her final section, "Information Literacy and Pedagogy 2.0," the most appealing.  She emphasizes the importance of evaluation:
In a world where the nature of authority has come into question (Chang et al., 2008), students will need to evaluate information in more nuanced ways than they are currently being taught at most colleges and universities.  Information literacy needs to be increasingly focused on teaching evaluative skills to students; skills that go well beyond determining whether or not something is peer-reviewed.  (90)
Four or five years ago it seemed that librarians, myself included, still showed bogus websites to their students to raise awareness of the importance of evaluation.  In certain cases this may still be appropriate and may get students' attention.  However, college students need to evaluate information at a more sophisticated level, and Farkas' claim that "students will need to evaluate information in more nuanced ways" (90) makes sense.  Rather than looking at sources to see whether they are black or white, legitimate or bogus, genuine or fake, students need to determine whether information is relevant to their research question and to judge whether it is credible, objective, current, accurate, and authoritative. 

The hardest thing for students to determine may be the accuracy of information, so looking at credibility, objectivity, and authority may give them the clues they need.  Most of all, they should be concerned with the relevance of sources, yet some students seem too quick on the trigger in dismissing sources.  Part of college involves creatively understanding how a broader subject relates to a more specific paper topic.

Meredith Farkas addresses a newer issue, or at least one that caught me off guard: "Those teaching information literacy will also need to focus on developing in students the dispositions needed to be a successful consumer and producer of knowledge" (90).  It seems easy enough for a librarian to teach content and research strategies, but developing new dispositions in students seems a tall order, however desirable.  With one- or two-shot sessions, how much can library instructors really do? 

Undoubtedly, this work of influencing student attitudes in the direction of knowledge creation may seem daunting for library instruction, yet it may also insert some life into the instruction.  This goes beyond just showing the steps of how to use a database and taking advantage of the features that can be easily explored independent of the instructor.  Therefore, I agree with Farkas, though it may require some stretching for most library instructors.  "It is important for librarians to consider how we can help students develop the attitudes that will make them critical and effective information seekers through learning activities" (Farkas 90).  Indeed, librarians should take the time to reflect on how to inspire students in this direction, but it may start with librarians becoming more passionate and confident about their own information-seeking abilities.

How does this translate to the library instruction classroom?  It means that librarians need to get students actively engaged in the process.  Farkas writes:
Librarians still offering lecture-based information literacy instruction need to explore ways to make their instruction more engaging and student-centered through collaborative, problem-based learning.  The Library literature is replete with case studies suggesting creative active techniques for enhancing student learning.  (90)
She goes on to encourage more questions, dialogue, and group work.  Rather than creating a set outline, librarians should conduct formative assessment to understand the constituents of their classes (91).  Each class coming to the library consists of a different group of individuals with different experiences.  Bending the instruction focus to meet students' needs seems to be more effective.  Going a step beyond this, it seems that success in the library instruction room may increase if the formative assessment is sent out and completed prior to attending the library workshop.  This allows the librarian time to think about where adjustments should be made.  Not all librarians like to adjust on the fly.
I agree with Farkas when she says that students do not reflect: "Students rarely reflect on their research process, which can result in the need to re-learn skills they used in their last assignment" (91).  Working with the course instructor, a librarian may be able to leverage a reflective requirement, and a blog is one way to do it (91).  I have always liked this idea, and Farkas explains why blogging and wiki creation are such good ideas: "Blogs could also be used to have students investigate the social origins of information and identify bias within writing.  Students can engage with the peer review process through reviewing the work of their classmates on blogs and wikis" (92). 

Participatory technology, then, engages students in the peer-review process, invites them to critically assess the research process, provides "teachable moments" for the instructor, and increases student learning (92).  "These activities can generate an understanding of peer-review at a level far beyond simply checking a box in a database search interface" (92).  They can also increase the sense of community, enliven the classroom, allow the instructor to offer guidance and feedback, and lead to positive student learning outcomes.  Moreover, they may even improve writing and communication skills.

This article was very well written and offered a number of insights into the value of participatory technology as it relates to library instruction in higher education.

I recognize Farkas from American Libraries as the author of numerous technology columns.  She is a librarian at Portland State University and writes the blog Information Wants to be Free.

Work Cited
Farkas, Meredith.  "Participatory Technologies, Pedagogy 2.0 and Information Literacy."  Library Hi Tech 30.1 (2012): 82-94.