INT230: Introduction to Libraries and Archives

Along with Mark Armstrong, former archivist and records manager, and Kate Boylan, former director of archives and digital initiatives, I developed this required course for the Galleries, Libraries, Archives, & Museums program at Wheaton College MA. It first ran in spring 2022 as an experimental course (INT 298). Subsequently it was approved as a permanent course scheduled to run every other spring.

The course focuses on the fields of library science and archives. Topics include the history of information, the history of US libraries and archives, organizational structures and missions, community engagement, and contemporary challenges and opportunities in libraries and archives. It places particular emphasis on social justice, diversity, equity, and inclusion.

Assignments include:

  • a reflective personal essay
  • a persuasive privacy project
  • a presentation on a library core value
  • a supervised experiential project in a library or archive
  • a weekly reading log

Guest lecturers, field trips, and panels with alumni and MLS graduate students expose students to libraries and archives – and the workers in them – beyond Wheaton College.

ePortfolio as a tool for curating a professional online persona in the 21st century

Note: This story originally appeared on the Mount Holyoke College LITS web site, on the LITS-faculty collaboration page. I love seeing all the things my colleagues are doing with faculty and students.


“What do you find when you Google yourself?”
“What experiences are you most proud of?”
“Who do you want to find you when they search for you online?”

These are some of the questions students grappled with during an eportfolio creation sprint in fall 2016. Library, Information and Technology Services (LITS) staff Nick Baker, Megan Brooks, and Caro Pinto partnered with Nexus Director and Professor of Sociology Eleanor Townsley to offer a 3-session crash course in eportfolio development for Nexus students taking College 211: Reflecting Back: Connecting Internship and Research to Your Liberal Arts Career.

Students in Nexus are particularly well-positioned to create eportfolios. Not only do these students have a major, which requires them to delve into a discipline, but they also have a Nexus, which requires them to explore a pre-professional track broadly across the curriculum. Nexus students complete an internship, present publicly at the LEAP Symposium, and often participate in other co-curricular, athletic, activist, and social activities. The eportfolio provides them with a location to bring those disparate experiences together into a cohesive whole.

Students in the eportfolio creation sprint engaged in a lively discussion with one another and LITS staff around questions of online identity. Several hadn’t searched for themselves online before, and the results alternately surprised, dismayed, and relieved them. This exercise forced them to think about how they wanted to craft their online identity going forward, whether in Mount Holyoke’s ePortfolio platform, on other platforms like LinkedIn, or on a personal website. Megan Irgens ’17 said, “Creating an eportfolio made me more aware of what I was putting online on my social media and how I wanted people to view my professional side.”

After grappling with questions of online identity, students wrote personal narratives to provide the foundation for their eportfolios. Saumya Sudhir ’17 noted that this “enabled us to find our voice and showcase it in a professional setting.” They gathered documentation of the experiences they wanted to highlight – photos from internship experiences, written papers, videos of performances, and more – and used those as evidence in their eportfolios. Each student personalized her eportfolio so it made the most sense for her personal story and online presence. After giving the eportfolios a couple of weeks to marinate, the students gathered with LITS staff, Townsley, and Nexus Coordinator Katie Walker to present their draft eportfolios to one another and receive praise and constructive criticism. Jocelyn Mosman ’17 reflected, “…I thought the process was great! Very insightful! I loved how much emphasis was put on making it accessible.”

Each student came away with an eportfolio, content to use on other platforms like LinkedIn, and an understanding of the practice of curating a professional online persona in the 21st century. If your students would benefit from a similar experience, please contact your LITS Liaison for more information.

Introducing first-year writing students to scholarly, trade, & popular sources

Note: This was originally published in the Research & Instruction Showcase on the Wellesley College LTS web site. Students in this session reported feeling more confident about their ability to contextualize online sources after examining physical sources.


Composite image of covers of Sierra Magazine, Environment Today, Environmental Ethics, Christianity Today, and WIRED.
Students examined popular, trade, and scholarly magazines and journals in order to understand why they would want to use each one.

First-year students often do not know how to differentiate among scholarly journal articles, popular magazine and news articles, and trade journal or magazine publications when doing research. In an increasingly online research environment, this challenge is magnified because articles of all sorts are “unbundled” from their tables of contents and from other articles in the same issue.

Students in two first-year writing courses, Environmental Ethics in Christian Traditions and Religion and New Media, had the opportunity to get hands-on experience differentiating between scholarly, popular, and trade sources, and to translate that hands-on experience into the online environment.

Before the instruction session, the librarian identified scholarly, popular, and trade sources for each course, and brought several copies of each source to class. Students were paired and given a guide to help them differentiate between scholarly, popular, and trade sources. They spent 15 minutes working through the guide to make claims about which source was scholarly, which was popular, and which was trade. They then discussed their findings as a group and tried to figure out the salient clues to help them identify types of sources found online.

LTS Staff: Emily Belanger, Megan Brooks

Learning to identify sources: a short lesson plan

Note: This was originally published on my now-defunct website, librarygrrrl.net. I repost it here as a companion to the post describing it for a faculty audience.


A couple of days ago, on a list I follow, someone posted a link to “Just what am I looking at?” from the Distant Librarian. The person was wondering if anyone had any ideas about how to “unflatten” the online world for students. It reminded me of an article Barbara Fister wrote called Rebundling the Unbundled Article. A recent Project Information Literacy report – How Freshmen Conduct Research Once They Enter College – finds that “many freshmen were unfamiliar with the formats of scholarly publications before entering college.” When you add in the challenge of everything appearing on a computer screen, things get complicated.

I don't get it

In response to the message, I shared the following (my version here is significantly edited, because blog post = editing, while email reply = less editing). This is by no means a unique or novel exercise, but sometimes reading someone else’s take on a common activity will spur you to try something a little differently next time. With that, enjoy!


One of my colleagues demonstrated this activity in her interview. I have adapted it for several 100-level first-year writing courses, where students struggle to understand how various sources differ.

Identifying Types of Sources – A Hands-On Exercise

Materials needed:

  • 4-5 copies each of a scholarly journal, a trade publication, and a popular magazine* related to the course topic (depending on class size, you may need more)
  • handout describing the different types of sources (see any of these web pages for inspiration)
  • in-class worksheets
  • access to computers
  • whiteboard

Students brainstorm on the whiteboard about the kinds of sources they’ve used in the past; the list usually includes web pages, books, scholarly articles, movies, and magazines. I explain that in the realm of “articles” there’s a lot more diversity than meets the eye, and that learning to discern what’s what isn’t something we spring forth from the womb knowing how to do.

The students break into groups of 3. Each group gets one of each type of print source and a worksheet. Each student gets a copy of the handout describing different types of sources. The students examine each item and look for clues to help them identify its type. We typically work for about 15 minutes on this part. I encourage the students to fill out the worksheet.

After 15 minutes, each group reports out on one item. They tell the class what they thought it was, and why. The rest of the class agrees or disagrees with them, and provides reasons why. Each of the report-outs takes about 2 minutes. Students tend to do pretty well identifying the scholarly journal and popular magazine, but struggle a lot more with the trade publications. They tend to have a lot of good conversations during the reporting-out.

At that point, the students always appear to have a lot of confidence about what they’ve just done (with good reason!), so I throw them the following questions:

  1. Where do you normally find materials for your papers? (Cue the chorus of “online!”)
  2. How do you figure out what kinds of materials you’re working with online?

There’s usually a moment of hesitation (unbearably long, no matter how short it might actually be), and then someone will say “could we try using the clues we just used for the print sources?”

And so we do! In small groups, the students use our discovery layer to identify an item on a topic of interest. With an item on-screen, they answer the following question:

Is this a scholarly, popular, or trade source? List at least 3 items from your handout which help you make that claim. Are there clues not on the handout which also help you determine what kind of source this is? List those too.

We come back together as a group to report out again, and it’s AMAZING how quickly and easily the students are able to accurately describe what their online items are. The clues from the handout are helpful, but I’m always surprised at how many students also use the search facets in our discovery layer to limit items to different types of sources, without any prompting from me or their professor. In classes where we haven’t done this kind of exercise, the students rarely notice the discovery layer facets (even when they might be really useful).


Outcomes

I did this exercise twice in classes last spring. I ended up meeting with many of the students from those classes after our instruction sessions as they tackled their research projects. All of those students were able to describe what they’d already found in the language they learned in class, and were able to say, “I’m looking for more popular sources” or “The scholarly sources I found were too complex for me to understand with my current level of knowledge. How can I find scholarly sources that aren’t so complex?” I count both of these reactions as anecdotal evidence that the students learned in our sessions. The professor also reported back to me that the students asked her fewer source-type questions AND wrote papers that fulfilled her requirements around types of sources, which is the other piece of evidence I point to when recommending this as a useful exercise for classes where students struggle to contextualize the sources they find online.

Thumbs up!

* This can be really hard in some academic libraries! For one class I actually went to a bookstore and purchased 3 popular environmental sciences/spirituality magazines last spring because we just didn’t have anything like that in our collection – or our staff lounge! – in print.

Coordinated instruction for econometrics

Note: This was originally published in the Research & Instruction Showcase on the Wellesley College LTS web site. This was what came out of 2014’s course workshopping.


Chart showing deaths in the US and Europe in 1918 and 1919 due to the influenza pandemic
“Spanish flu death chart” by the National Museum of Health and Medicine, from Nicholls H, “Pandemic Influenza: The Inside Story,” PLoS Biology 4(2): e50, 2006, http://dx.doi.org/10.1371/journal.pbio.0040050. Public domain, via Wikimedia Commons.

All students majoring in economics at Wellesley are required to take ECON 203: Econometrics, with 3 sections offered every semester. A research and instruction librarian and an instructional technologist have long been involved in supporting this course. In Fall 2014, with enthusiastic agreement from the faculty, we changed our in-class instruction to better prepare the students to tackle their semester-long group projects. In the library session, students would be asked to consider an existing study and figure out how and why the author constructed it in a particular way. In the instructional technology sessions, students would participate in small group exercises.

The faculty suggested that we build our instruction sessions on a paper about the economic outcomes of people born during the 1918 influenza epidemic in the United States. We selected this article because it relies on IPUMS data, a commonly used data set in this course; it outlines a common type of econometric study; and it demonstrates to students that econometric methods can be used to examine aspects of public health and other social concerns.

During the research and instruction librarian’s class visit, students read a popular news magazine article or blog post about the paper and reviewed the microdata section of the paper, then discussed how they might answer the questions the author posed using econometric methods. For example, students might want to know if someone’s mother had the flu, but they couldn’t get at that data easily. Instead, students had to learn to ask different questions:

  • Was a person in utero during the flu epidemic?
  • Were they living in a state with a high incidence rate of flu?
  • What were their life outcomes compared to people born in states where the flu illness rate was lower?
  • What were their life outcomes compared to people born right before or right after the pandemic?

Following that discussion, the librarian and students talked about how to get into the IPUMS dataset and how to translate plain English questions into identifiable variables in IPUMS.

The instructional technologist visited each class twice: once to teach the students how to use Stata, and once to work with them on IPUMS data. In both sessions, students not only received basic Stata training but also were required to do hands-on work in small groups. All the examples in both sessions used data identified in the influenza article, maintaining a common thread that supported student comprehension.

As a result of this new approach, we saw enhanced student engagement during class. Coordinating our efforts allowed us to give the students a cohesive message about approaching questions quantitatively, while making the learning process more interactive.

LTS Staff: Carolin Ferwerda, Megan Brooks

Course workshopping: library instruction for econometrics

Note: This was originally published on my now-defunct website, librarygrrrl.net. What follows is my personal reflection on the course workshopping session I described in another post. We pitched this example of coordinated instruction to faculty as described in this post.


Last week I wrote about the course workshopping my team did over the summer, to revamp how we teach particular library instruction sessions. In that post, I promised to write about how that course workshopping changed what I was doing with ECON 203: Econometrics, a mid-level, required course for all economics majors at Wellesley.


Here’s what the old instruction session looked like:

I went to each section of ECON 203 for 15 minutes. I was a total talking head. “Here are the three aspects of the group projects I can help with. Here’s the research guide. Here’s how to sign up for a group meeting.” There was literally nothing about this instruction session that was pedagogically effective.

  • There were no learning objectives.
  • There was no opportunity for interaction between me and the students in the class session.
  • When the students came to see me after class, they were ridiculously confused about what they needed.

Gag.


Workshopping prep message to my colleagues

Background: this is a required economics course. Most majors take it in their second or third year. There are typically 3 sections each semester. The instructional technologist meets with them to teach them how to use Stata and, depending on the class, IPUMS. I meet with most sections each semester for a 15-minute intro to finding data and literature, in their classroom. There are computers in the classroom, but the students generally don’t have them fired up and ready to go when class starts, and I’m hesitant to change that class culture. Mostly, we review the guide.

Typical post-session questions from students:
  • Literature: does whether gambling is legal in the state when one turns 18 affect an individual’s decision to go to college? JSTOR and EconLit results are mostly about athletic gambling or gambling among college students.
  • Policy: finding state-by-state marginal income tax rates by year – hoping to find something like this.
  • Policy: history of standardized testing across states by date
  • Data: crime rates by state; compulsory education by state; how to find other panel data by state
  • Policy, literature, data: looking at the effect of US immigration policies on the real wage of Americans before and after the 1965 lifting of the national origins quota system; wants suggestions for other recent immigration policy changes or incidents similar to the Mariel Boatlift’s effect on the Miami labor market, in which there was a significant increase in immigrants of a specific origin in a specific region, in order to do a cross-comparison among different states.
What I need help with:
 
I’m wondering if there is something more useful I could do in that time.
  • Would it be useful to take a sample question and work through it in that time? If so, what question might be good?
  • Where do I focus my time most effectively in this class: finding data, finding policy, or finding literature?
  • Would it be useful to give the students a pre-meeting assignment? If so, what would be on it?

Results of the Workshopping Session (in photo form)

Notebook pages with handwritten notes from workshopping session.
Notes from author’s workshopping session for econometrics library instruction.

My instructional technology colleague also workshopped her two sessions with this course. Then we set up a meeting with the faculty members teaching this fall to see if we could try something different in all three of our sessions. One of the ideas we pitched was that the students would all get an article ahead of the class session and read it, then in class, they would think-pair-share about:

  • what data set is used?
  • what variables are used from the data set?
  • what is the policy change or moment in time when something changes?

One of the faculty suggested we use this awesome article by Douglas Almond about the fetal origins hypothesis and the 1918 influenza pandemic in the U.S.

Not only were the faculty members on board with the idea, they also agreed that 15 minutes wouldn’t be long enough, so they shifted my visit to their lab session.


Here’s what the new instruction session looked like:

Both instructors invited me to attend their lab session instead of a class session; the students were in a different frame of mind to work in that session.

In the first faculty member’s lab, we agreed to send the students this synopsis of the article ahead of time. Then in class, she led all of us through an in-depth discussion about how you would construct a study to answer the question: if your mother had the flu while you were in utero, would your life outcomes be worse than those of people whose mothers didn’t have the flu? Then we handed out a printout of the data section of the Almond article (pp. 683-686) to the students, and I led them in a discussion of the questions we’d identified: what data sets are in use, and what variables does the author use?

In the second faculty member’s lab, I flew solo (the instructor didn’t attend – eek!). The students read this blog post ahead of time and read the data section of the article in lab. In pairs, the students answered the following questions:

  • what is the research question in plain English?
  • what data sets does Almond use for his analysis?
  • what are the x (independent) and y (dependent) variables?
  • why didn’t Almond just find out whether the moms of the babies had the flu?
  • what’s the strength of the his two methods of analysis?

As a large group, we closely examined the data section of the article, identifying names of variables, years when those variables were available, etc. We went into IPUMS-USA and selected the samples and variables Almond used. Then we walked through how to create an extract, change it to Stata format, and download it.
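
For readers who want a concrete picture of that last step, here is a minimal sketch of the kind of first commands students might run in Stata once an IPUMS-USA extract has been downloaded in Stata format. The file name is hypothetical, and the variable choices (birthyr, bpl, occscore) and the simple birth-year cohort flag are illustrative assumptions rather than the exact steps from our lab; Almond’s analysis uses much finer cohort timing than this.

    * Minimal sketch with an assumed file name and variables, not our exact lab script.
    * Load an IPUMS-USA extract that was downloaded in Stata format.
    use "ipums_usa_extract.dta", clear

    * Inspect the variables that came with the extract.
    describe birthyr bpl occscore

    * Flag people born in 1919, i.e., roughly in utero during the fall 1918 wave.
    generate in_utero_1918 = (birthyr == 1919)

    * Compare a life-outcome proxy (occupational income score) for that cohort
    * against cohorts born a couple of years before and after.
    summarize occscore if in_utero_1918 == 1
    summarize occscore if inlist(birthyr, 1917, 1921)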


Outcomes

What a difference doing actual instruction makes! I’ve met with more than 75% of the small groups so far this semester, and every group has come to me significantly more prepared to work with the concepts in their project.

  • They understand the difference between the data they are finding in IPUMS and the data they are finding that reflects a policy change.
  • They are conversant with how to look at, read, and understand the variables and samples they identify in IPUMS.
  • They are still asking me questions about how to build their queries in Stata (I happily pass them to my instructional technology counterpart for that) but we can have really good conversations about how they might want to logically consider their question.

In short, I’m thrilled with the changes I see in the students after changing this instruction session up. I can’t wait until the end of the semester to talk with the faculty members and hear their impressions of the group projects. I hope to use that feedback when talking with the faculty member who’s teaching the class in the spring, to see if she’ll be on board as well.

When tried and true doesn’t work any longer: workshopping library instruction

Note: This was originally published on my now-defunct website, librarygrrrl.net. I enjoy puzzling through tricky teaching conundrums in small groups. What may seem obvious to someone else can be a breath of fresh air to me. This session included 5 research & instruction librarians and an instructional technologist who had a high level of comfort and trust with one another. I remain thrilled with the outcome of this workshopping.


There comes a time in many instruction librarians’ careers when all of a sudden, it’s really obvious that what they’ve been doing in a class just isn’t cutting it any longer.

  • Maybe there’s a new faculty member teaching the class.
  • Maybe the professor changes what they’re teaching in the class.
  • Or maybe, just maybe, everyone is bored with the same old, same old (this is particularly true for those mid-level, required courses in a discipline – I’m looking at YOU, econometrics.)

This past summer, the team I’m on decided to tackle this problem in a head-on, low-risk, manageable way. Everyone picked one course they regularly taught that they wanted to revamp in some fashion, and the rest of the team helped them think through how to change it.

The courses we workshopped each week this summer ranged across the disciplines and across course levels:

  • 100-level first-year writing course focusing on Wellesley College
  • 200-level required econometrics course (librarian)
  • 200-level required econometrics course (instructional technologist)
  • 300-level French course on La Belle Epoque (in French)
  • 300-level history seminar on ethnic and religious violence

You’ll notice we had no science courses, for a good reason: our science librarian was on leave for the summer.

Each session followed the same format. Each person sent out material ahead of time, including:

  • course description
  • background on the course
  • description of how they have been involved in the course, learning outcomes, etc.
  • assignments, handouts, guides, or other supporting materials
  • description of “what I need help with” – which ranged from figuring out how to talk with an instructor about changing an assignment, to dealing with a class where the students might be complete novices or subject experts, to using extremely limited class time differently, to basing an instruction session solely on discussion

Each workshopping meeting lasted an hour. The person who was “on” that week gave a brief overview of what was in the material she sent out and provided more context than would fit in an email. The conversation went from there. The “on” person always had free rein to say, “this conversation is straying and I need us to focus over here for a while.” (Because we are a chatty, idea-filled group, this happened more than once.) Typically, the “on” person and one other person in the room took notes or wrote on the whiteboard, so there was a record of all the wonderfully smart, intelligent, and 100% feasible ways to revamp the instruction.

In one of my later posts, I’m going to write about how I revamped instruction for econometrics, focusing on how this workshopping meeting gave me a different kind of confidence when I spoke with the faculty members teaching the course and asked if we could please please PLEASE do things differently because I was being swallowed alive by supporting the students after our session.

(Interestingly enough, the teaching and learning center at MPOW is hosting a series of similar lunchtime meetings for faculty this year, based on the Knotty Problem Roundtable developed by Mary Rigsby and Suzanne Sumner at the University of Mary Washington. I’m eager to try that method out with this team in the spring, focusing on different courses. The barrier to entry seems lower than what we did last summer – although the outcomes might be less dramatic as well.)

If you’re in New England, it looks like NELIG’s December meeting is going to be exactly this kind of thing! If you don’t have a group of colleagues at your workplace who you can do this kind of workshopping with, consider applying to present/get feedback from the wonderful NELIG community! Deadline for that is tomorrow, November 7, so get on it!