Showing posts with label teaching. Show all posts

Monday, April 1, 2013

Information Literacy Courses at Idaho State University

            The Association of College & Research Libraries (ACRL) defines information literacy as “a set of abilities requiring individuals to ‘recognize when information is needed and have the ability to locate, evaluate, and use effectively the needed information.’”[1]  Over the last decade the Oboler Library has advocated for and educated individuals across the ISU campus about information literacy.  ISU librarians once taught a two-credit library research course that helped students earn certification as media specialists or school librarians.  The College of Education sponsored this certification program; however, when one of the professors in that college retired, the program was discontinued.
            Since that time, librarians have promoted information literacy in workshops and presentations.  Library representatives on campus committees have also explained and advocated the importance of information literacy in today’s information-rich society.  The foundation they laid prepared faculty for the course proposals put forth in recent years.  As a result, Curriculum Council accepted a proposal in Fall 2011 to create a one-credit course titled LLIB/ACAD 1115: Information Research.  This change first appeared in ISU’s Undergraduate Catalog: 2012-2013; however, students first enrolled during the Fall 2011 and Spring 2012 semesters, taking it as an experimental course.  The Student Success Center assisted the Library in ensuring the class appeared on the class schedule that first year, and the course was cross-listed as an ACAD and LLIB course with the experimental number designation 1199.
            Initially, the class met twice a week during the second block of eight-week classes.  This changed in Fall 2012, when students began attending class once a week for sixteen weeks.  Student success seems to have increased with this change because the work was spread over a longer period rather than compressed into the last eight weeks of the semester, when students tend to be busiest.
            LLIB/ACAD 1115 seeks to help students accomplish the following objectives:
  • Identify sources of academic, popular, and professional research
  • Select relevant and credible sources in support of a research question
  • Summarize, interpret, and analyze sources
  • Document sources in an accepted style format
  • Navigate search engines, article databases, and library catalogs to find relevant sources
  • Demonstrate an ability to distinguish between primary and secondary sources
 
            Assignments every two weeks require students to find a specific type of source, such as a reference article, a book, a scholarly article, or a newspaper article.  They must explain how they found the source, summarize it, and evaluate its credibility.  Completing these assignments prepares them to create an annotated bibliography, the final project of the course.  In-class activities also keep students on track to complete the biweekly assignments.  Students who have completed the course often say that it should be required for all students, or that they wish they had taken it as freshmen because it would have been very helpful.
            Library faculty believed this course would be beneficial for all students.  Consequently, they put forth a proposal to expand it to a three-credit course that could fulfill a general education requirement for undergraduate students.  In recent years the General Education Requirements Committee (GERC) had drafted a revision of the requirements (www.isu.edu/gened), and information literacy appeared as one of the new objectives.  Beginning in Fall 2013, incoming students must meet either the critical thinking or the information literacy objective.  Since the Curriculum Council and GERC both approved the necessary proposals, LLIB 1115: Introduction to Information Research will be a course that fulfills the information literacy objective.
 
            LLIB 1115: Introduction to Information Research will be taught in Fall 2013 as a three-credit course with the following objectives:
  • Determine the nature and extent of information needed
  • Access the needed information effectively and efficiently
  • Evaluate information and its sources critically
  • Incorporate selected information into one’s knowledge base and value system
  • Use information effectively to accomplish a specific purpose
  • Understand the economic, legal, and social issues surrounding the use of information, and access and use information ethically and legally
            These objectives were adapted by the University and Library from the ACRL’s Information Literacy Competency Standards for Higher Education.
            Currently, the Library plans to teach five sections of LLIB 1115 with one section being taught entirely online in an asynchronous format where students will complete assignments independently and view course materials and recorded presentations online.  The other four sections will meet on the Pocatello campus in computer laboratories to accommodate the hands-on nature of the course instruction and assignments.  Oboler Library faculty look forward to this new endeavor and are working to make this course a valuable one that will contribute to future student success.




[1] Association of College and Research Libraries.  “Information Literacy Competency Standards for Higher Education.”  ACRL, 2013.  Web.  27 February 2013.  http://www.ala.org/acrl/standards/informationliteracycompetency#ildef.

Tuesday, August 21, 2012

Evaluation Form for a One-credit Information Literacy Course

Last year I taught a one-credit information literacy course for the first time.  As an experimental course, it was sponsored by ISU's Student Success Center.  It was titled ACAD 1199: Information Research and ran during the last eight weeks of the semester, meeting twice a week.  In the first semester I created a student evaluation form to learn how much work students put into the course, how effective the instructor was, and how useful the course was overall.  This was important to do because it gave me feedback directly from the students.

It is a bit lengthy, but I wanted to get a lot of information from the students before they left.  Have you created your own evaluation form, or do you administer one created by your institution?

Friday, June 22, 2012

Teaching Non-Traditional Students in the Library

Non-traditional students can be the most rewarding ones to teach in the library.  They often have more questions, are more lively, and seem to be more grateful for library instruction.

This week I taught a class full of non-traditional students.  I uploaded my outline presentation to Slideshare under the title "Library Research," though a more accurate title might be "TGE 0199: Library Instruction for Non-Traditional Students."  Most of these students had to earn a GED in order to make it to college, and this class was helping them transition into college.

See the presentation below:
We used the Cephalonia Method during our tour.  The questions given to students were color coded to correspond to the different floors of the Library.  I printed out call numbers to specific books and maps that helped to answer some of the questions; these I handed to students and coached them in finding the materials along the way.

How do you teach non-traditional students?  Do you ever teach classes that consist only of non-traditional students?  What are their strengths? 

On the whole I enjoyed teaching the non-traditional students because, in most cases, they seem more attentive, ask more questions, and are glad to learn.

Wednesday, June 20, 2012

Teaching with Xtranormal, Poll Everywhere, Wikis, and Skype

It appears that many librarians today believe technology needs to be used in library instruction to catch the attention of the "digital natives," or the current set of college students.  Nicole Eva and Heather Nicholson write: "Library instruction is viewed by many students as being less than enthralling.  Students may not understand how important the library can be for their academic endeavours, or they may think that they know all they need to know.  As a result, librarians often seek new and innovative ways to engage classes" (1-2).  These authors do not promote technology for its own sake; rather, they tout technology as a tool for engaging students in collaborative ways. 

They believe students "are familiar and comfortable" with technology, so it can be utilized as a means to deliver content.  In their words: "Technology can make library instruction more engaging, more entertaining and more interactive" (2).  Their article, "DO Get Technical!  Using Technology in Library Instruction," highlights four types of technology tools library instructors can use to engage students: Xtranormal, Poll Everywhere, wikis, and Skype.

Xtranormal allows individuals to create their own short videos with pre-fabricated characters, backgrounds, and voices.  If you can type, you can create an Xtranormal video.  Typing words into dialogue boxes creates the audio component; a machine reads the words, and you can choose what kind of accent you prefer.  "You can make one Xtranormal video for free; after that you must buy points.  The more points you buy the less expensive they are, but generally it costs only two to three dollars for a basic movie" (2).  They recommend it as an amusing, inexpensive tool for sharing information and teaching students. 

Students in their classes have enjoyed the humor and entertainment while being introduced to a topic or listening to a summary.  The authors also suggest: "Students could also create their own videos in order to demonstrate their understanding of a topic" (3).  The machine-generated voices and the gestures throughout the videos increase the humor.  They have created a few publicly available Xtranormal videos.
The next tool they explain is Poll Everywhere.  It lets students answer questions anonymously in real time, and the questions can be inserted into a PowerPoint presentation.  Nicholson and Eva write: "A basic account allows up to 30 responses per question and unlimited questions for free, and upgrades range from $15 to $1,400 per month" (4).  Students respond by texting answers from cell phones, though computers with internet access may certainly respond as well.  They explain: "As we have seen with classroom clickers, this is a great way to encourage class participation" (4).  "The polls are updated instantly, and PowerPoint slide changes dynamically as students enter their answers" (4).  The results allow instructors to correct students and solidify the learning; they can also prompt further discussion.

Nicholson and Eva also promote the use of wikis in library instruction, highlighting their effectiveness as a collaborative learning tool.  Like Poll Everywhere, wikis "exist in 'the cloud' with no downloads required" (5).  Private wikis can also be purchased, so that only the students in a class can access the project.  The authors explain the essential aspect of a wiki: "The premise behind wikis is that they are collaborative; all users can edit or create new entries.  Student participation in a wiki is an effective way to promote active learning" (6).  Participating in this endeavor turns on the light for many students as they begin to understand how information is created, edited, and shared.  Therefore, they begin to realize how important it is to evaluate the information they find.

Finally, they talk about Skype.  This online videoconferencing tool can be used to teach students in distance settings, though this requires hardware such as microphones, video cameras, and speakers, not to mention high-speed internet access.  Nicholson and Eva have taken advantage of the technology to instruct students at distance sites, so they speak from experience (7).  I appreciated that they mentioned how they anticipated and prepared for technical difficulties.  When the visual feed was lost on the distance site's end, the instructor there was able to display the presentation slides that had been emailed previously while the library instructors continued speaking and teaching.  The instructor could demonstrate along with the librarians as they both walked through the presentation (8). 

Nicole Eva and Heather Nicholson believe that these technologies are "unique, effective, EASY, and low or no-cost.  When used correctly and where warranted, these applications can be useful in engaging students in sessions in which they might otherwise tune out" (9).  Even their own lack of technological experience did not keep them from succeeding, and they believe others can have the same kind of fruitful experience in the library instruction room (9). 

Work Cited
Eva, Nicole, and Heather Nicholson.  "DO Get Technical!  Using Technology in Library Instruction."  Partnership: The Canadian Journal of Library and Information Practice and Research 6.2 (2011): 1-9.
University of Lethbridge logo.  Fiat lux is Latin for "Let there be light."
Nicole Eva and Heather Nicholson work in the University of Lethbridge Library.

Tuesday, June 12, 2012

Clickers, Participation, Assessment Outcomes, and Library Instruction

Clickers, or personal response systems, may encourage participation and help students enjoy library instruction more.  Emily Chan and Lorrie Knight, from the University of the Pacific, conducted a study that found this to be true.  They also learned that assessment outcomes do not necessarily improve as a result of using clickers. 

Published in Communications in Information Literacy, their article "Clicking with Your Audience: Evaluating the Use of Personal Response Systems in Library Instruction" first identifies the makeup of college students participating in their study.  They belong to the Millennial generation who "tend to share these main character traits: feeling special, being sheltered, having confidence, preferring team or group activities, favoring the conventional, feeling pressured, and needing to achieve" (193).  They make the following claim: "Library instruction, often delivered through one-shot sessions, may seem out of touch to Millennials if it does not incorporate technology in a meaningful and entertaining manner" (193).  With this premise as their foundation, they propose the usage of personal response systems (PRS or clickers) to engage students.

In their literature review they show that others have found that use of PRS helps students be involved in the classroom, promotes conversations, and enhances learning among students (193).  They note that PRS make lectures and class activities more lively and less "stagnant" (193).  As mentioned elsewhere, clickers allow instructors to adjust in the moment, because they can see what the students know.  Therefore, an atmosphere of active engagement and learning may be easier to establish with clickers (193). 

Not enough has been written about PRS and actual learning outcomes, so Chan and Knight examined this in their study.  They cite Anne C. Osterman's 2008 article that identifies library instructors' two greatest fears: (1) boring students and (2) teaching above their heads.  They refer to another article when they write: "The use of clickers can prompt greater classroom interactivity through an assessment of students' understanding of IL concepts" (194).  Additionally, they cite a finding that clickers increase student involvement in the classroom as well as their use of library resources (194).  In short, their study looks at student enjoyment, engagement, and achievement as they relate to the implementation of clickers in the classroom.

As with other studies, they prepare the reader by defining the constituents involved--in this case freshmen at the University of the Pacific--and explain the course objectives of the freshman seminar courses and the library evaluations gathered before the study took place.  "At the end of each library session, students completed a brief evaluation measuring their achievement of learning outcomes.  The Assessment Office tabulated and analyzed the results, which proved to be inconclusive" (195).  Librarians convinced their library dean to fund a second instruction room equipped with more technology, such as a smart board, a computer for all participants, and clickers.  This allowed the librarians to conduct an experiment to see if the technology influenced student learning outcomes. 

Surprisingly enough, they found that the classes without clickers scored slightly better than the ones with them.  They write: "The students in Classroom NC (non-clicker) scored significantly higher in the assessment than the students who had their library session in Classroom C (clickers) (P value < 0.001)" (197).  That is not to say that there were no positive outcomes for students taking the instruction with the clickers.  Chan and Knight write: "The students in the technology-rich Classroom C found the library sessions to be more enjoyable, organized, well-presented, and participatory" (197).  Perhaps these positive results would continue to justify the use of clickers in the classroom.

No doubt the authors must have been perplexed that the technology did not increase content retention; however, they offer some reasons why the students in technology-rich classroom may not have achieved higher scores on the assessment measures.  They do so by noting potential benefits of a paper assessment:
  1. Students can use the paper assessment itself as a resource
  2. Students can self-regulate the order and pace of their work during the test
  3. Students can see all the questions from the start (similar to reason #1 above)
  4. Students can review and correct their answers before turning them in to be graded
  5. Students can judge how to use their time; they are more in control of this than if the test is offered with technology, especially if the instructor changes the questions (198)
If librarians use the clicker technology to assess learning, these reasons may be worth remembering. 
Boulder Chain Lakes area in the White Clouds of Idaho.  The lakes in the photo may be Sliderock Lake (l) and Shelf Lake (r).  Photo by Spencer Jardine, 2010.
Here are a few other things worth mentioning from this article.  Classes with clickers seemed to enjoy the instruction more and felt it was more organized, well-presented, and participatory than those without them (199).  Millennials may expect and want technology to be used.  Indeed, Chan and Knight mention another study that suggests "the use of clickers can restart the attention span of students" (199).  Sometimes this is necessary to bring students back to the subject at hand. 

The authors see clickers as useful tools to invite participation, adjust to student needs, and get things going at the beginning of library instruction sessions.  They write: "With the clickers' ability instantly to poll the audience, library faculty used warm-up questions as icebreakers in order to foster a more collaborative and engaging environment" (199).  They had hoped the clickers would increase content retention, but the students in the non-clicker classroom outperformed their peers in the classroom with clickers.  Naturally, as the authors mention, other researchers should examine how learning outcomes are influenced by the use of technology in the library instruction classroom.

Chan, Emily K., and Lorrie A. Knight.  "Clicking with Your Audience: Evaluating the Use of Personal Response Systems in Library Instruction."  Communications in Information Literacy 4.2 (2010): 192-201.  Print.

Thursday, May 31, 2012

Effectiveness of Clickers in Big, Intro to Psychology Classes

A group of researchers from the University of Delaware, in a study that looked at the effectiveness of personal response systems, found that modest use of "clickers" increased exam performance.  They did not see evidence that clickers actually increased engagement in their study.  In a reference to the literature they write: "According to some researchers, students like clickers, and students also believe clickers make them feel more engaged" (45).  As far as their own students went, however, they note: "Although Dr. B reported that students 'got a kick out of them,' clickers had only marginal effects on self-reports of student engagement, attendance, and reading in this study--effects that may be attributable to Type I error" (48). 

At the time of the study, freshman students at the University of Delaware marked which classes they wanted to take their first year, and a computer assigned their schedules.  Morling, McAuliffe, and DiLorenzo mention more than once that this random assignment made their study more reliable; students' personal preferences did not determine when they took the psychology class.  Morling explains that the teachers used the clicker system only minimally:
Both professors taught using an interactive lecture style.  Both professors taught the earlier section without clickers ('traditional' sections) and the later section with clickers.  In clicker sections, at the beginning of class, the instructor posted five multiple-choice, fact-based questions, based on the day's required reading.  Students earned extra credit for answering these questions correctly.  Later in the class period, if relevant, the instructor would briefly elaborate on a clicker question that most students had misunderstood.  Other than this change, instructors taught the two sections identically.  (46)
Data gathered from the exam results did indicate that "exam scores were higher for clicker sections than for traditional sections" (47).  This occurred regardless of the teacher; two teachers each taught two sections of large, introductory psychology classes.  Morling et al. summarize it in more formal language: "Our data suggest that using clickers to quiz students in class on material from their reading resulted in a small, positive effect on exam performance in large introductory psychology classes" (47).

Further studies might consider looking at teaching methods used in conjunction with the technology.  For example, they suggest looking at concept inventories, group discussions, and Just in Time Teaching (JiTT), which could all be joined with clickers to see how they might enhance learning (48).  For more clarification, the authors write: "In our study, the instructors used clickers very minimally--to administer quizzes, publicly display the results, and quickly correct any widespread misunderstandings" (47-48). 

Moreover, the article addressed the possibility that some students cheated while taking the reading quizzes, though the authors concede that this may have actually promoted a cooperative learning environment, which would have improved engagement in the class (49).  Overall, this was a good article: it found a positive result of using clickers via a scientific study, rather than relying on anecdotes or the fact that the technology was trendy at the time.

Works Cited
Morling, Beth, Meghan McAuliffe, Lawrence Cohen, and Thomas M. DiLorenzo.  "Efficacy of Personal Response Systems ("Clickers") in Large, Introductory Psychology Classes."  Teaching of Psychology 35.1 (2008): 45-50.

Stowell, Jeffrey R., and Jason M. Nelson.  Teaching of Psychology 34.4 (2007): 253-58.

Thursday, May 24, 2012

Audience Response Systems in the Classroom

Heidi Adams and Laura Howard write: "Audience Response Systems, commonly known as clickers, are gaining recognition as a useful classroom tool" (54).  In their short article, "Clever Clickers: Using Audience Response Systems in the Classroom," they define Audience Response Systems (ARS), describe the two major types of systems (radio frequency and infrared), explain how the systems can be used to gather feedback, check for understanding, and assess student learning, and offer specific ideas for using ARS in the classroom. 

As a tool to promote student learning, ARS require each student to answer questions with a remote control, and results are shown right away.  Adams and Howard write: "Since the educators are able to see the results instantly, it permits them to evaluate student understanding at that very moment and provides an opportunity to adjust the lesson accordingly to improve student comprehension" (54).  In short, it helps instructors know whether students got it.  ARS can be adapted to meet the needs of each student and each class.

Of particular value, this article offers twenty ideas for using ARS in the classroom.  Here are just a few:
  • Comprehension Testing
  • Drill and Practice
  • Review Games
  • Questionnaires/Surveys
  • Voting
  • Checking for understanding during a lecture
  • Fact Finding or Pre- and Post-Tests (55)
Adams and Howard cite some of the literature in making their point that ARS are good for students because they increase engagement in the classroom; students also seem to enjoy class more.  They write: "With clickers, every student answers every question.  Additionally, the questions will spark more questions from students that will lead to further discussion and understanding regarding the material" (55).  The on-the-spot feedback also lets students know whether they got an answer right (56).  They do not have to guess, and this immediate confirmation seems to cement the learning. 

Naturally, the ARS do not solve all problems and have a few drawbacks.  Adams and Howard claim: "As with any other type of learning, if ARS is used too often, students tire of it" (56).  In other words, students like the newness of the technology, but with time will become less interested in it.  Still, they assert "that the advantages such as instant feedback and increased student engagement far outweigh the downsides" (56).  The potential of these systems does seem fairly expansive.
Qwizdom Clicker.  See "Spotlight on Education: Sandwood's S.A.IN.T Academy Hosts First Annual Media Day." Duval County Public Schools.
 A sidebar in the article lists a half dozen brand names of ARS.

Work Cited
Adams, Heidi, and Laura Howard.  "Clever Clickers: Using Audience Response Systems in the Classroom."  Library Media Connection 28.2 (October 2009): 54-56.

Wednesday, May 23, 2012

How does empowering versus compelling students influence participation attitudes?

A group of researchers at Brigham Young University collaborated on an article published in Active Learning in Higher Education in 2007.  It discussed a gap in the research on clickers, or Audience Response Systems (ARS).  Many studies have looked into participation factors: how clickers prompt discussions, how they uncover misconceptions in the learning, how they offer immediate feedback.  The authors write: "Studies that explored direct measures of student learning with audience response systems have shown mixed results [...] Although these studies showed mixed results, most studies that looked at indirect measures of student learning (including levels of participation and engagement) found positive results" (236-37). 

What about the groups of students who typically hold back and do not participate?  The authors cite studies showing that females tend to participate less because "they worry that they might appear to dominate discussion" (237).  Students from other cultures also participate less frequently, not wanting to give incorrect answers and "fearful of 'losing face'" (237).  These researchers call this group "reluctant participators" and note that other studies on ARS have not examined this demographic (237). 
Reluctant Participants.  Photo by Middle Age Biker.
They found that students perceive ARS to be less helpful when the systems are used for grading or marking attendance.  Technical difficulties came back as the number one negative aspect of the systems, followed by cost to the student, grading, and mandatory attendance.  Students had to pay $40 apiece, so "a significant number of students were critical of cost effectiveness" (240).  However, when the ARS were used to provide formative feedback, students perceived the technology as positive.

On the whole, however, "the overall reaction to the use of the ARS in the pilot was positive.  [...] For all of the measures except one, a strong majority of the students 'agreed' or 'somewhat agreed' that the ARS was helpful to them in the learning experience" (238, 240).  Nonetheless, reluctant participators tended to view it as less helpful than the rest of the group.  Moreover, "students in classes where the ARS was not used for grading viewed the ARS to be a more helpful tool" (242).  Keeping ARS separate from grading, then, seems to be a major factor in students viewing them favorably.

During the survey, students were given opportunities to comment on what they liked as well as what they did not like about ARS.  "One student wrote, 'It was nice to have an immediate response so I could know whether or not I was doing something right'" (248).  Other students appreciated knowing what their peers thought on issues or content related to the class. The authors included this positive comment as well: "'The best part is that we could see the results of the surveys and what not instantly, and they were applicable to us in the class, not some foreign group of individuals.  It brought the issues to life which we were discussing in class'" (248).  This emphasizes the potential power of these systems, bringing forth relevant issues that can energize discussions.

At this point it seems appropriate to go back to the article's introduction, which describes what happens when students have limited opportunities to participate and the method involves raising hands.  The authors write: "When classes have limited opportunities for students to respond, participation can be unbalanced in favor of the most knowledgeable students who are most willing to respond in front of their peers" (234).  Personally, I have experienced this countless times.  Students often have good things to share, but one or two students frequently dominate any sort of discussion.  The strength of the ARS is that everyone can participate anonymously.

While we are talking about active learning in general, it may be helpful to consider what the authors of this study wrote: "A diverse body of educational research has shown that academic achievement is positively influenced by the amount of active participation of students in the learning process" (233-34).  In the past, response cards were used, and they helped to increase student participation and performance in higher education.  Today many have adopted ARS to invite participation and performance.  Active learning makes a difference, but it helps to consider the subgroups who participate reluctantly. 

Again, it is better to avoid using these systems for grading and attendance purposes, considering how many variables obstruct the success of the system.  The authors included this student comment: "'I think it's an unfair grading system.  I think they're great, as long as grades aren't based on them.  There are too many variables like battery failure, problems with being on the correct channel and so forth that interfere.  These variables make it unfair to base grades on clicker systems'" (240).  Clickers can be a powerful tool; however, students seem to dislike the tool when it appears faulty and their grades depend on its performance.

Instructors should review the purpose of the tools or methodologies they use in the class.  The authors write: "The researchers in this study believe that the central concern of instructional technologists should be that of 'helping' students" (248).  Though the grading and attendance features may be helpful for instructors, if they create negative feelings in the students that inhibit potential learning, then perhaps they should not be used to grade student work.  In their concluding remarks, the authors wrote: "Students perceived strategies that provided formative feedback and empowered them to evaluate their own performance as more helpful than strategies oriented towards grading and compelling participation" (251).  This explains the title of their article rather succinctly.

Researchers may find it helpful to know what the authors suggest for later studies with ARS.  "Future research could investigate a wider range of pedagogical strategies and the environments in which they are successful.  A measure of the goodness of the strategies could be student and instructor perceptions of their helpfulness in the learning process" (250).

Work Cited
Graham, Charles R., Tonya R. Tripp, Larry Seawright, and George L. Joeckel III.  "Empowering or Compelling Reluctant Participators Using Audience Response Systems."  Active Learning in Higher Education 8.3 (2007): 233-58.  Print.

Friday, May 18, 2012

Technology to Put in the Hands of Students and Librarians

Several weeks ago I attended the ILA Region 5 & 6 Conference at Snake River Community Library.  Here are some notes that I took at one of my favorite sessions.

Tech Talk: Integrating 21st Century Tools into School Libraries with Gena Marker, the current president of ILA.  She is a school librarian at Centennial High School in Meridian, ID.

            This session was rather exciting for me because it showed how technology can be used to enhance learning and the educational experience.  She began by mentioning how many teachers get into a technology rut, using software adopted a decade ago.  Gena encouraged librarians to put new tools in students’ hands.  I believe one of Idaho’s educational goals is to increase student fluency with technology; someone referenced this in the session.

            Gena has put flip or pocket cameras (Sony Bloggie models, I believe) into her students’ hands.  Students need to learn new information and how to present it.  They will have to navigate the future, and new tools and technology can help them do so.

            Animoto: it’s great and it’s free.  As an educator you can upgrade when you log in, which gives more access.  Create a video with a few clicks.  The free version allows for a 30-second video, but educators can do more.  Embed photos and video clips.  It is quick to use once you have tried it out, as with other sites in the cloud.  With the flip cameras, it is necessary to be close to people to hear them.  Animoto lets you choose music from its music library.

            Photo Story 3: this is good.

            Zamzar: a third-party service that converts almost any file type (avi, flv, wav, mov, etc.).  It has an easy three-step process:

1. Upload the file
2. Specify the new file type you desire
3. Enter your email address, so they can send you the converted file with the new extension

Let students create videos.  It forces creativity.

            Any Video Converter: another third-party file converter.  This may cost money, but it also does more in the way of screen captures, ripping DVDs, and so forth.

            Prezi: it’s an interactive whiteboard that zooms, flips, and embeds photos.  Create an educator account.  It’s possible to download it to a USB device.  Check out flip cameras from the library.  Learn by trial and error.

            Glogster: she recommends that teachers and librarians use Glogster EDU, as it may be safer and can be managed in a private setting just for a class.  Use Glogster to show things.  It is an alternative to PowerPoint.  “Poster yourself.”  It’s a digital poster, but it allows you to embed audio and video.  Thumbnail photos can be enlarged.  It’s more about content than design creation; or perhaps she said to encourage students to focus on the content rather than on design creation.  Some students in an honors class created one that looked at Macbeth from a feminist perspective.  They created a video that could be viewed, and they could show the poster while they explained it to the class during a presentation.

            Xtranormal: students love this the most and can waste a lot of time here.  Type in the text, and a mechanical voice will come out of the cartoon character you have chosen.  Type in stage directions, like walk forward three steps, or point to the left.

            Audacity: free podcasting.  It’s a free download.  Use a microphone with a USB connector to attach it to the computer.  Record a book review and share it in your online catalog.  With Follett’s Destiny (her OPAC vendor), she can do this.

            Windows Live Movie Maker: it is good for making movies with photographs, recorded digital movies, and music.

            VoiceThread: collaborate to change a presentation or comment on it. 

            Wordle: create word clouds.  This is a fun design thing.
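Wordle's clouds are, at bottom, a visualization of word frequencies. As a rough sketch of the counting that underlies any word cloud, here is a minimal Python version; the stopword list and sample text are my own illustrative choices, not anything from the session:

```python
from collections import Counter
import re

def word_frequencies(text, stopwords=None, top_n=10):
    """Count word frequencies -- the raw data behind a word cloud,
    where higher counts are drawn in larger type."""
    stopwords = stopwords or {"the", "a", "an", "and", "of", "to", "in", "is"}
    words = re.findall(r"[a-z']+", text.lower())
    counts = Counter(w for w in words if w not in stopwords)
    return counts.most_common(top_n)

summary = ("Clickers invite participation. Clickers give instant feedback. "
           "Anonymous clickers help reluctant students participate.")
print(word_frequencies(summary, top_n=3))
```

Feeding a class's discussion posts through something like this would show at a glance which terms dominated the conversation, which is what makes the resulting cloud more than decoration.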

            Overall, this presentation interested me because of the way she tied the technology to student learning and creativity.  It gave me some ideas and made me want to try some of these things.


Tuesday, May 15, 2012

Clickers: An Engaging Tool for Library Instruction

In 2008 Anne C. Osterman published an article online for librarians about the potential of student response systems (SRS), or clickers, in the library instruction setting.  College & Undergraduate Libraries published it with the title "Student Response Systems: Keeping the Students Engaged."  (It appears the print version came out first in 2007.)  She introduces the topic by mentioning many factors that work against participation in the library instruction classroom: an unfamiliar setting, a short opportunity (one shot at teaching library skills), and content many would not consider exciting. 

Librarians do what they can to invite participation.  They will work to tie the instruction directly to an assignment, develop hands-on exercises, create handouts, and sometimes divide classes into groups to work together (50).  Osterman writes: "These tools do little, however, to help with one more inherent difficulty of library instruction: a wide variety of experience levels among students" (50).  Then she identifies "the two greatest fears of a library instructor [...]: (1) boring the students because they've seen it all before; and (2) losing the students because the territory is too foreign to their knowledge and experience.  Both lead students to tune out" (50).

"bored-students."  by cybrarian77 on Flickr.com.

This resonates with my own experience.  These are two of my greatest fears, and I have wondered how to deal with them.  The most obvious remedy is what Osterman calls "the last tool in the box: asking the students questions" (50).  Unfortunately, this does not always work, and Osterman recognizes that all the difficulties just mentioned make this effort less effective as well.  Encouragingly, she writes: "Never fear--there is another solution" (50).  The student response system can make a difference, increasing participation, engaging students of all personalities and abilities, and offering a mechanism that prompts the instructor to adjust to classroom needs and address deficiencies without belaboring subjects students have already mastered.
Osterman observes that instructors can ask students questions spontaneously or "on the fly" (51).  She suggests questions like:
  1. Have you ever used X (JSTOR, Academic Search Complete, CQ Researcher, etc.)?
  2. What kinds of materials do you think you would find in X (the library catalog, the Special Collections digital archives, the Primo search, etc.)?
  3. Should you cite Wikipedia in a research paper?  Should you do X?
Osterman explains that after the polling closes, students can see what everyone else has answered (51).  Often students who are embarrassed about answering incorrectly see that they are not the only ones who do not understand; their embarrassment decreases dramatically, and they focus more on the learning than on the embarrassment.
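The reveal Osterman describes amounts to a simple tally of anonymous responses. This is a hypothetical sketch of that tabulation, not any clicker vendor's actual software; the question and the response list are invented for illustration:

```python
from collections import Counter

def poll_summary(responses):
    """Tally anonymous clicker responses into the percentage display
    students see once the poll closes."""
    counts = Counter(responses)
    total = len(responses)
    return {choice: round(100 * n / total) for choice, n in counts.items()}

# e.g. answers to "Should you cite Wikipedia in a research paper?"
responses = ["No", "No", "Yes", "No", "Not sure", "No"]
print(poll_summary(responses))  # {'No': 67, 'Yes': 17, 'Not sure': 17}
```

Seeing in the aggregate display that they were not alone in a wrong answer is exactly what lessens the embarrassment Osterman describes.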

Osterman describes the two types of clicker systems: radio frequency and infrared.  Plus, she identifies some of the pros and cons of each (51).

In the next section of her article, she addresses the question: Why use clickers?  Citing the extant educational literature, she gives at least five reasons:
  1. Combat passive learning environment
  2. Promote active learning
  3. Help with participation problems
  4. Provide instant feedback
  5. Interrupt the lecture.
Additionally, she addresses the anonymous nature of the system: "Some instructors believe that anonymity makes students more comfortable and likely to participate, and this has been supported by research in students' opinions of these systems" (52).  As mentioned previously, because of the anonymity, fear of embarrassment is eliminated or at least lessened (52).  What really excites me is the potential for increasing the level of learning that takes place in the library classroom.  Osterman claims: "Also by encouraging students to make an actual decision about a question, the SRS makes them less likely to sit back and let the information wash over them unabsorbed.  Instead they evaluate a question and answer with engaged minds" (52).

"Law Students Use PRS."  by jonalltree on Flickr.com

As you can tell, this article really caught my interest; I can hardly keep from quoting it.  The next section talks about how library instructors can and ought to adjust their instruction when using an SRS tool.  She describes an average library workshop, then suggests that student questions and answers can determine which small parts of instruction should be taught once more, passed over entirely, or explained more thoroughly.  With some forethought, instructors could devise questions to generate discussions.  Likewise, sensitive questions could be asked and the answers compared to published data.  Along these lines Osterman suggests that students could be asked about their incomes, for example, and those figures could then be compared with U.S. Census Bureau data for their particular locale (53). 

The system could also be used to ask students to predict what might happen.  When they are required to answer, they become more committed and, thus, more engaged.  She offers a pair of questions related to Boolean operators, which invite the student to predict whether more or fewer results will be retrieved.  For the serious-about-learning types, she tells how some SRS systems collect the data so they can be analyzed, which would allow instructors to adjust their methods even more (54). 
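Her Boolean prediction questions can be grounded in a toy example. The miniature "index" below is invented for illustration; set intersection and union simply stand in for AND and OR, and this is not how any particular database implements retrieval:

```python
# A toy index: which document IDs contain each search term
docs_with = {
    "teaching": {1, 2, 3, 5},
    "clickers": {2, 3, 6},
}

# AND narrows: only documents containing both terms
and_results = docs_with["teaching"] & docs_with["clickers"]

# OR broadens: documents containing either term
or_results = docs_with["teaching"] | docs_with["clickers"]

print(len(and_results), len(or_results))  # 2 5
```

Asking students to predict the two counts before running the search is precisely the kind of commitment that, per Osterman, keeps them engaged.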

We often hear that technology should not replace teaching, that it is just a tool to enhance learning.  This is true, and we should remember it.  As with any technology, pros and cons exist.  Osterman warns that with this technology less content may be taught, it may "distract instructors from their teaching," and students may forget clickers, use them to cheat, or even walk away with them.  Fortunately, the benefits of learning "might easily outweigh" the con of less content being taught, and libraries that buy their own systems would not need to worry about students forgetting their clickers, though students could still walk out the door with them at the end of class if one wasn't careful (54).

The last section of the article discusses "The Experience of American University Library" where Anne Osterman works.  In it she talks a bit more about vendors, different systems, training library instructors, necessary adjustments, some sample questions, using the SRS in library training sessions, and questions to ask with the system.  Encouragement and support should be given to those using the system for the first time.  Making the system available for individuals to practice with is best (55). 

From the experiences of her colleagues as well as her own, Anne Osterman writes: "Just as many beginning library instructors try to teach too much in the short amount of time they have and gradually slim their material down to an amount that is digestible, some instructors found that their first attempts in creating questions for a class were too complex" (56).  She recommends that librarians use the same questions in a series of classes; this will help instructors know how one class is different from another.  Again, the question "Have you used X resource?" may be a great standby.  "Overall, the response from library instructors at American University Library who have used the system has been very positive" (56). 

In summary, Osterman repeats that the anonymity and novelty of the system generate an engagement with library instruction that increases learning.  If money is an issue, then a home-grown system or a "Web-based voting system" may work (56).  The short list of references looked helpful as well.

This article drove home the idea that polling students can really increase engagement, participation, and learning in the classroom.  Anonymity helps students participate more readily, and simple questions need to be the norm.  I really liked the sample questions she included.  This was quite helpful.

Work Cited
Osterman, Anne C.  "Student Response Systems: Keeping the Students Engaged."  College & Undergraduate Libraries 14.4 (2008): 49-57.  Print.

Monday, May 14, 2012

Information Literacy and Clickers

In my ongoing research related to audience response systems, or clickers, I discovered an article written in August 2009 by Patricia A. Deleo, Susan Eichenholtz, and Adrienne Andi Sosin.  Titled "Bridging the Information Literacy Gap with Clickers," the article explains how a graduate course in an Educational Leadership and Technology program received information-literacy instruction with the help of clickers.  The authors used the term Class Performance System (CPS), but other education researchers call these tools Audience Response Systems.

They set forth the terms "digital natives" and "digital immigrants" and discuss the differences between those who are more technologically savvy than others.  Mostly, they point out that "even technologically competent students overestimate their ability to effectively search for and access information" (439).  Likewise, "graduate students display overconfidence with regard to both their research and technology skills" (439).  But how does an instruction librarian make students aware of their lacking skills while promoting learning at the same time?  Who likes to hear that they are not as competent as they think they are?

The authors of the article rightly claim that "attention to the differential level of each student's information literacy capabilities is necessary in designing information literacy instruction" (439).  With students of different technology and information-literacy abilities in the classroom, how does a library instructor teach so that all can learn without feeling entirely lost or utterly bored?  Deleo and company write: "Information literacy classes where technology skill competence widely varies among students complicates the pedagogical situation" (440).  What can a librarian do to succeed in this environment of complexity?

Certainly, Deleo and her colleagues make an apt observation: "We have discovered that making assumptions about student technology or research skills is not effective, predictable, or advisable" (440).  Well, if one cannot rely on assumptions, what direction should be taken?  Clickers can enable librarians to hurdle some of these issues and do so gracefully.  "Clickers were initially adopted as a pre-lesson assessment tool to assist the librarian in setting an appropriate starting point at the students' levels" (440).
From "Accessibility in Education" by Lucy Greco.
One of the most valuable parts of this article may be the types of questions the authors asked with the clicker system.  An appendix to the article lists the questions they have required students to answer, but here is a taste of what they looked for: whether students could distinguish qualities of the Library of Congress Classification System and the Dewey Decimal System, find books in the catalog, document references in APA format, tell popular from scholarly publications, and define internet terminology. 
My favorite part of their article was the discussion.  It came alive and highlighted their positive experiences using the system, mentioning how it engaged students, enlivened discussions, created a sense of community, and increased interaction with the librarians.  They described how they promoted this engagement in conjunction with the technology: "After each student had clicked in their answers to a question they were instructed to turn to their nearest classmate and discuss that question and the answer they had chosen [...] As a result of inserting 'turn and talk' into the CPS procedure, the engagement level of the class rose significantly" (443).  I can see how this would generate even more interest. 
"Getting Interactive in the Classroom with Technology!" from eLearning @ Liverpool
It also takes some of the burden off the instructor, because some of the students will answer the questions correctly.  At any rate, this does promote critical thinking skills.  Students will defend their answers or accept their classmates' answers.  Deleo, Eichenholtz, and Sosin write: "The process generated a higher level of anticipation for feedback as well" (443).  The authors explain how they would like to make the one-shot instruction session become a two-shot class.  The areas in which students were deficient could be addressed more fully in a second class (443).

The authors conclude with comments about the future potential of clicker systems in library instruction.  Essentially, the continuation of this methodology, they argue, may rely on student behavior.  They write: "student willingness and the librarian's skill at conducting the clickers session will be the larger issue, not the technology" (444).  In summary, they recommend that librarians investigate this technology.

Work Cited
Deleo, Patricia A., Susan Eichenholtz, and Adrienne Andi Sosin. "Bridging The Information Literacy Gap With Clickers." Journal Of Academic Librarianship 35.5 (2009): 438-444. Library Literature & Information Science Full Text (H.W. Wilson). Web. 11 May 2012.

Friday, May 11, 2012

"Participatory Technologies" Article by Meredith Farkas

Meredith Farkas has written a good article for librarians involved with information literacy.  She celebrates the advantages of incorporating participatory technologies into the information-literacy classroom.  Using Web 2.0 tools such as blogs and wikis in the classroom can increase student learning and responsibility.  In fact, she claims that these tools "have the potential to create a more engaging learning environment.  Increased learner autonomy give[s] students a greater sense of responsibility for their learning and has been shown to improve student achievement" (84). 

One of the great advantages of blogs in relation to information-literacy learning is that blogs encourage reflective thinking, which can potentially guide students to think about their own research process.  Farkas notes that blogs can invite "reflection within an environment of peer interaction" (84).  Students will often listen to their peers before their instructors; they may be following the rule "Don't trust anyone over 30."

In theory, Farkas also extols the constructivist model, which naturally downgrades the traditional model of the teacher as the authority figure.  Students can learn and grow more when they interact with each other, challenging each other's ideas.  Of course, she explains this a bit more eloquently: "Constructivist pedagogy views students as active participants in learning who construct knowledge based on their existing understanding as well as interactions with peers and their instructor.  Unlike in behaviorism, the instructor is not seen as being wholly responsible for student learning" (86).  She ties this teaching theory to Web 2.0 and calls it Pedagogy 2.0, though I have not verified whether she coined the term.
Microsoft Office Clipart.
As an instruction librarian myself, her final section appealed to me the most: "Information Literacy and Pedagogy 2.0."  She emphasizes the importance of evaluation:
In a world where the nature of authority has come into question (Chang et al., 2008), students will need to evaluate information in more nuanced ways than they are currently being taught at most colleges and universities.  Information literacy needs to be increasingly focused on teaching evaluative skills to students; skills that go well beyond determining whether or not something is peer-reviewed.  (90)
Four or five years ago it seemed that librarians, myself included, still showed bogus websites to their students to raise awareness of the importance of evaluation.  In certain cases, this may still be appropriate and get students' attention.  However, it seems that college students need to evaluate information at a more sophisticated level.  Farkas' claim that "students will need to evaluate information in more nuanced ways" (90) makes sense.  Rather than looking at sources to see if they are black or white, legitimate or bogus, genuine or fake, students need to determine whether information is relevant to their research question and to understand whether it is credible, objective, current, accurate, and authoritative. 

The hardest thing for students to determine may be the accuracy of information, so looking at credibility, objectivity, and authority may give them the clues they need.  Most of all, they should be concerned with the relevance of sources, yet some students seem too quick on the trigger when discarding sources.  Part of college involves creatively understanding how a broader subject relates to a more specific paper topic.

Meredith Farkas addresses a new issue, or at least one that caught me off guard: "Those teaching information literacy will also need to focus on developing in students the dispositions needed to be a successful consumer and producer of knowledge" (90).  It seems easy to teach content and research strategies as a librarian, but developing new dispositions in students seems a tall order, which is not to say it is undesirable.  With one- or two-shot sessions, how much can library instructors really do? 

Undoubtedly, this work of influencing student attitudes in the direction of knowledge creation may seem daunting for library instruction, yet it may also insert some life into the instruction.  This goes beyond just showing the steps of how to use a database and taking advantage of the features that can be easily explored independent of the instructor.  Therefore, I agree with Farkas, though it may require some stretching for most library instructors.  "It is important for librarians to consider how we can help students develop the attitudes that will make them critical and effective information seekers through learning activities" (Farkas 90).  Indeed, librarians should take the time to reflect on how to inspire students in this direction, but it may start with librarians becoming more passionate and confident about their own information-seeking abilities.

How does this translate to the library instruction classroom?  It means that librarians need to get students actively engaged in the process.  Farkas writes:
Librarians still offering lecture-based information literacy instruction need to explore ways to make their instruction more engaging and student-centered through collaborative, problem-based learning.  The Library literature is replete with case studies suggesting creative active techniques for enhancing student learning.  (90)
She goes on to encourage more questions, dialogue, and group work.  Rather than creating a set outline, librarians should conduct formative assessment to understand the constituents of their classes (91).  Each class coming to the library consists of a different group of individuals with different experiences.  Bending the instruction focus to meet students' needs seems to be more effective.  Going a step beyond this, it seems that success in the library instruction room may increase if the formative assessment is sent out and completed prior to attending the library workshop.  This allows the librarian time to think about where adjustments should be made.  Not all librarians like to adjust on the fly.
I agree with Farkas when she says that students do not reflect: "Students rarely reflect on their research process, which can result in the need to re-learn skills they used in their last assignment" (91).  Working with the course instructor, a librarian may be able to leverage a reflective requirement, and they could do so with a blog (91).  I have always liked this idea, and Farkas explains why blogging and wiki creation are such good ideas.  "Blogs could also be used to have students investigate the social origins of information and identify bias within writing.  Students can engage with the peer review process through reviewing the work of their classmates on blogs and wikis" (92). 

Participatory technology, then, engages students in the peer-review process, invites them to critically assess the research process, provides "teachable moments" for the instructor, and increases student learning (92).  "These activities can generate an understanding of peer-review at a level far beyond simply checking a box in a database search interface" (92).  They can also increase the sense of community, enliven the classroom, allow the instructor to offer guidance and feedback, and lead to positive student learning outcomes.  Moreover, they may even increase writing and communication skills.

This article was very well written and offered a number of insights into the value of participatory technology as it relates to library instruction in higher education.

I do recognize her from American Libraries as the author of numerous technology columns.  She is a librarian at Portland State University and writes the blog titled Information Wants to be Free.

Work Cited
Farkas, Meredith.  "Participatory Technologies, Pedagogy 2.0 and Information Literacy."  Library Hi Tech 30.1 (2012): 82-94.