Each of these posts originally appeared someplace else. I gathered them together here to provide a fuller view of who I am and how I think about librarianship, technology, and higher education.
My full list of publications is also available.
Note: This was originally published at ACRLog.
ACRLog welcomes a guest post from Kelly Blanchat, Electronic Resources Librarian at Queens College, CUNY, and Megan Brooks, Director of Research Services at Wellesley College.
This blog post is the culmination of a Twitter conversation between librarians talking about their experiences playing a phone game. The game is called Nekoatsume and it involves taking care of digital cats in a virtual backyard. Nekoatsume is entirely in Japanese, a key fact that actually started the Twitter conversation (and not the fact that the game involves cats, as might be expected).
In short: a librarian started playing a game, wrote some enticing tweets, and many more librarians joined in — and still are to this day.
While this was happening, Kelly wrote on her personal blog about the joy & ease of understanding the game despite its language barriers, and how it would be nice if students felt the same way about using databases and library resources. Library databases should be just as user-friendly as a game in a foreign language, but too often they’re not. Our students do use recreational technology, and Nekoatsume isn’t the first app in Japanese or Chinese to gain popularity in the U.S. in the last month. And it’s not that recreational technology is always user-friendly, either. Torrenting platforms, such as the Pirate Bay, are notoriously convoluted – especially with regard to persistent content – but anecdotal evidence suggests that our students are able to navigate these platforms with relative ease.1
As more and more librarians join in to play Nekoatsume, there’s a common experience that happens early on: the digital cats have disappeared — maybe they died, or ran away — and we believe that we’ve played the game wrong.
Megan even initially deleted the app out of frustration. The experience of not understanding phone cats, even when “everyone else” seemed to, left her in a position many of our students might find themselves in: lost, feeling stupid, and unwilling to engage any further. Sadly, library resources do not contain cute digital cats to lure users back after a bad experience. Still, after the Twitter conversation Megan was willing to give Nekoatsume another shot, and she also found a walk-through for the game online.
The satisfaction from playing Nekoatsume comes from getting more & more cats, and more & more points. For library resources the outcome is often much less immediate: find resources, analyze evidence, fill a resource quota for a bibliography. The research process can also be very solitary, and being able to draw on a similar or shared experience can counteract that isolation as well as other obstacles with online library resources. That is to say, a related experience can help the process feel seamless, less daunting. In the case of Nekoatsume, the language barrier subsides once the basic movements of the game are understood, whether through trial and error, consultation of the Twitter hive-mind, or reading online tutorials. Such resources are comparable to “cheat codes” in the gaming world, elements that facilitate getting to the next achievement level. In the library world, they are often referred to as “threshold concepts”. And while most online library resources do contain the same basic functionalities, such as a button for “Search” and a link for “Full-Text”, differences from platform to platform in placement and style get in the way of that seamless usability.
Libraries do make a great effort to provide users with workshops, tutorials, and LibGuides to facilitate user understanding and research methods. However, such content can require a lot of explanation whether with words, pictures, live demonstrations, or a mix of all three. Sometimes it can feel like tutorials need their own tutorials! Discovery layers, such as Summon and Primo, begin to address the usability issue by providing a single destination for discovery, but with that libraries still need to address issues of demonstrating research purpose, enthusiasm, and information synthesis. With so many variables in acquiring research — design, functionality, search queries, tutorials — the outcome of research can be overshadowed by the multitude of platform interfaces, both within the library and on the open Web.
The hype for Nekoatsume may eventually subside (or not), but another app will likely take its place and we librarians will still be asking ourselves how to facilitate the next steps of scholarly research for our students. If we can find a way to foster essential research skills by relating them to similar experiences — like with social media, searching on the open Web, downloading torrents, and playing games with digital cats — perhaps the process to knowledge can feel less daunting.
…but maybe we should just embed cute cats into all things digital.
Note: This was originally published on my blog, librarygrrrl.net. I reproduce it here because it makes me laugh.
Every semester during reading period and finals, there is a protracted online discussion in which students bemoan their current state of affairs in haiku format. Yesterday, I was working at the reference desk and thought I’d jump into the fray in an attempt to drum up some business. What follows is a series of haikus I sent out over the course of the 4 hours I was working.
writing a paper?
librarians have mad skills.
consult with megan.
here till 5pm
i can answer your questions
research pressure mounts
pubmed lion econlit
move beyond google
footnotes and endnotes galore
save time talk to me
what is your style?
help with citations
i’ll help you today
someone is here tomorrow
use our great knowledge
(Reference Librarian Haiku by Megan Brooks is licensed under a Creative Commons Attribution 3.0 United States License.)
After posting those to my Facebook account, my friends Eric, Steve, and Mac all supplemented my offerings with ones of their own, which I offer here:
Find a citation easy
Document your sources right
Science Eye Brain Neuron
what in the hell does he know?
go right to the source.
She knows where to find
the answers contained herein.
Your fault if you fail.
Find the best sources.
The librarian knows where
they are all hiding.
Paper is due soon
Your Zotero is empty.
You should ask for help.
Do you understand
what the reference desk does?
You should ask, buddy.
The cursor blinks.
You will need more evidence
to support your claims.
You chose your topic,
but are not sure which journals
might aid your research.
I write these haiku
Hoping to charm you into
Asking me questions.
I am an expert
in tracking down resources.
I earned a degree.
Please don’t walk by me
another time looking lost.
I am here to help.
(Nine Library Haiku by Eric Behrens is licensed under a Creative Commons Attribution 3.0 United States License.)
I am highly entertained.
The number of citations an article receives is often used as an indicator of impact, but there are many instances where this metric fails to capture the wider results of how ideas are distributed. Adele Wolfson and Megan Brooks look particularly at education-related publications and how alternative metrics could be utilised to capture impact that is currently undervalued.
When those involved in traditional research publish papers, the main audience is other scholars in the same or related fields. Researchers hope the published work will be verified or refuted and built upon, and that new publications will result. The end product is the creation of new knowledge. So, it makes sense that the impact of the original publication would be measured as citations. (Even that conclusion is increasingly in doubt, but more on that, below.)
When those involved in educational research publish papers, the audience includes other researchers, but the work is more often directed at educators who teach the subject matter in their classrooms. Pedagogical researchers hope their published work will be picked up in curricula, methods, and instructional approaches, and that more effective teaching will result. The end product is better student learning. Citations do not capture any measure of the use of the original publication, much less of the desired increase in student learning.
You can see the citation problem very starkly just by looking at impact factors for journals. In my own field of chemistry, one of the top U.S. venues for publication in all subfields is the Journal of the American Chemical Society, which has an impact factor of 10.677. The education journal published by the same society, the Journal of Chemical Education, has an impact factor of 0.817. The journals published by the Royal Society of Chemistry show the same trends: Chemical Society Reviews has an impact factor of 24.892, whereas Chemistry Education Research and Practice has one of 1.075. These patterns are repeated for traditional research vs. educational research in other fields.
I started thinking very naively about this issue and wondered what metrics other than citations might be used for education-related publications, not to mention results and ideas distributed in ways other than publication. Luckily I was directed to colleagues in Library and Technology Services, who introduced me to the relatively new field of altmetrics. Even for traditional scholarship, journal impact factors and the h-index for individuals have become too limiting. Research goes beyond the academy, appearing in articles in the popular press, testimony to government and non-government committees, patents, social media, etc. Altmetrics has arisen as a way to capture this impact, which is often observed more quickly and more widely than scholarly citation (summarized in this blog by Pat Loria in October 2013). There are drawbacks to its methods, too, since many of these impacts are subject to manipulation and lack peer review, but it does provide a much broader view of impact, one not so dependent on publication venue as older measures. The field of altmetrics is becoming sufficiently mature that there are already committees and workshops to establish standards and recommended practices for the industry.
As we considered our original questions about impact of education research, we realized that even altmetrics as currently defined did not fully address our concerns. My colleagues and I wrote a commentary about this topic that described three approaches to raising the profile of education research – overall and for individual contributions – in hopes of generating discussion and inspiring new measures of impact.
One of these approaches is simple and easily accomplished; the other two are much more complicated. The simple one is just to get education researchers and others involved in this type of pedagogical scholarship to use the full range of tools currently available to keep track of how their work is being used. In our commentary we listed a number of these and encouraged those in the education community to select at least one of these tools to get a sense of their own influence and standing. To measure impact using article-level citation counts, journal impact factor, and/or h-index, we encourage the use of Web of Science, Scopus, or Google Scholar My Citations; for measuring article-level impact using altmetrics, leaders include ImpactStory, Plum Analytics, and Altmetric.com.
The next approach is more difficult in that it requires that more members of the education research community engage with social media. Since altmetrics algorithms take into account blog mentions, saves/shares in Mendeley and CiteULike, tweets, Facebook postings, and so on, we need to see more discussion of education-related topics in these venues so that they are “counted” more in determining impact. Furthermore, we ask that instructors post the sources of their materials on syllabi on websites so that they are accessible to the public, not just on handouts and portals limited to their own classes and institutions.
The final approach is the hardest since it means getting the attention of altmetrics companies and convincing them to expand the range of sites they monitor for inclusion. We would like to see a list created of venues where educational materials appear and to work with altmetrics developers to incorporate these as appropriate. What sorts of venues are we talking about? Textbooks, presentations at workshops, sites for test preparation materials, test banks and syllabi repositories, funded grant proposals, and others might be included. But the first step would be convening some meetings, probably centered around education research in specific areas (science, writing, mathematics, etc.), that would include researchers, practitioners, and altmetrics representatives to define what would be most useful to everyone. If we don’t do this now, when altmetrics is still developing, we will soon be in the same position as we are now, where education research is undervalued because its impact is undercounted.
Note: This article gives the views of the author, and not the position of the Impact of Social Science blog, nor of the London School of Economics. Please review our Comments Policy if you have any concerns on posting a comment below.
Adele Wolfson is the Nan Walsh Schow ’54 and Howard B. Schow Professor in the Natural & Physical Sciences and Professor of Chemistry at Wellesley College. She received her A.B. in chemistry from Brandeis University and her Ph.D. in biochemistry from Columbia University and has worked or studied in Israel, France, and Australia. Her scientific research is in the area of protein biochemistry, particularly the role of neuropeptidases in reproduction and cancer. She also conducts research on educational and pedagogical topics, including concept inventories for biochemistry, and the ways that students connect learning between science and non-science courses.
Megan Brooks is the Director of Research Services for Wellesley College’s Library and Technology Services, where she has the pleasure of working with and managing a team of creative and thoughtful research librarians, and supporting faculty and highly motivated and intelligent students. She earned her B.A. in psychology from the College of St. Benedict and her M.L.S. from Syracuse University.
Long considered the primary way to measure the impact of research publications, the impact factor is being both challenged and supplemented by the use of altmetrics. Faculty members who want to explore altmetrics can work with LTS staff to use web-based tools to measure the impact of their research and scholarship beyond simple citation tracking.
What are altmetrics?
With the explosion of “big data,” we can find out more about articles and other research output, and we can do it on a faster timeline than in the past. Altmetrics have the potential to do the following: “provide a broader, more diverse view of the impact of any piece of scholarship, to reflect the impact of the article itself, not its publication venue, to track impact outside the academy, to track the impact of influential but uncited work, and to track impact from sources that aren’t peer-reviewed.” (source: http://altmetrics.org/manifesto)
Altmetrics services generally track impact by looking at and quantifying online engagement with scholarly work, such as mentions in blogs and news outlets, saves and shares in reference managers, and social media activity.
Depending on the service, we can identify altmetrics for a large range of scholarly objects, including articles, blog posts, book chapters, books, cases, clinical trials, conference papers, datasets, figures, grants, interviews, letters, media, patents, posters, presentations, source code, theses and dissertations, videos, and web pages.
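As a toy illustration only, such counts can be rolled into a single composite number. The field names and weights below are invented for this sketch; they are not any real service’s formula, and actual providers weight and normalize their sources in proprietary ways.

```python
# Toy composite altmetrics score. Field names and weights are
# hypothetical, chosen only to illustrate the idea of weighting
# different kinds of engagement differently.

WEIGHTS = {
    "news_mentions": 8.0,    # press coverage weighted most heavily
    "blog_mentions": 5.0,
    "tweets": 1.0,
    "mendeley_saves": 0.5,   # reference-manager saves weighted least
}

def composite_score(counts: dict) -> float:
    """Weighted sum of engagement counts; unrecognized keys are ignored."""
    return sum(WEIGHTS.get(kind, 0.0) * n for kind, n in counts.items())

article = {"news_mentions": 2, "blog_mentions": 3, "tweets": 40, "mendeley_saves": 12}
print(composite_score(article))  # 8*2 + 5*3 + 1*40 + 0.5*12 = 77.0
```

Even this toy version shows why such scores invite manipulation, as noted above: inflating the cheapest input (tweets) moves the score without any change in scholarly use.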
LTS Contact: Megan Brooks
What happens when you’re teaching and need to be away from campus during one or more class sessions? Cancel the class? Have a guest lecturer? Use Zoom or Skype to be a talking head on-screen in the classroom? What if your course schedule is so full that you need those sessions to accomplish everything on the syllabus?
Professor Mark Lauer ran into this exact situation in spring 2015, when he needed to be away from campus for two meetings of his Elementary German 102 course in mid-March. Rather than using traditional videoconferencing tools like Zoom or Skype to bring him to the classroom, Lauer partnered with Library, Information and Technology Services (LITS) to teach his classes with an experimental telepresence robot from Double Robotics. The robot consists of an iPad attached to a rolling base (like a Segway). The driver of the robot logs in using a web browser or iPad, takes control of the robot, and drives it around, speaking and interacting with people as though in the room.
The robot was introduced to the German class a few weeks ahead of time, so the students wouldn’t be taken off guard on the days the professor was using it. From just down the hall, Lauer logged in and drove the robot – and himself! – into the classroom for a few minutes at the beginning of class. The students laughed, said hello to him, and then he arrived in person to teach class as normal. The next time the students encountered robotic Professor Lauer, he was across the country, logging in with his iPad and collaborating with his student teaching assistant on-site in South Hadley (she helped with projecting his PowerPoint slides and writing on the whiteboard). In Lauer’s words, “After the first laughs, use of the robot turned into a ‘non-event,’ which was our aim. The robot became ‘natural’ – and students focused on the class and the material.” A post-class survey indicated that the students were enthusiastic about their professor’s use of the robot for this kind of situation.