These slides (pdf) were part of a December 2015 presentation at the Vermont Alliance for the Social Studies Conference in Manchester, VT.
Session description: Empower students to do the work of historians by using online archives and online tools such as Google Maps, graphing sites, text-analysis utilities, and free website hosting platforms. Learn how one AP U.S. History class created an original digital history project consisting of an interactive map and data visualizations. The workshop will explain how the class collaborated and proceeded, step by step, through text mining, brainstorming, researching, data processing, mapping, writing, and using spreadsheets to develop graphics. The class’s final product, Federalists and Antifederalists in New Hampshire (apush.omeka.net), will be presented. How the project fits into the digital humanities (DH) will also be discussed.
On the last day of school this year, my AP U.S. History students put the finishing touches on a digital history project they had started in May, following their AP Exam. The collaborative project, to which students contributed individual parts while sharing research tasks and working with online tools, was my first classroom foray into the digital humanities. The whole process, which lasted about three weeks from start to finish, had two primary goals: to contribute something new to the historical community and to create and share the work digitally. After explaining to students that new scholarship is typically derived from primary sources, I led the class through a series of small lessons and tasks that gradually resulted in original research findings. Though small, the project introduced students to the digital humanities and gave them the opportunity to practice, collectively, the work of historians. Their efforts resulted in Federalists and Antifederalists in New Hampshire.
Here’s an overview of how the project was accomplished:
Introduction to Text Analysis and Digital History
To begin, students were made aware of text mining as a methodology that can be used to generate research ideas. Together, the class read a short intro to text mining and watched a few related YouTube videos, including Big Data + Old History, a snappy overview of how computer science and distant reading have changed historical analysis. Students later experimented with the concept of text analysis by pasting full web pages into the playful academic search tool Serendip-o-matic, which searches image libraries like Flickr by extracting keywords from large blocks of text.
In preparation for the next part of the process, I generated three separate word-frequency text analyses based on three volumes of the State and Provincial Papers of New Hampshire, and had students estimate the time period associated with each word list. (I simply used an online text analyzer for the lists.) Once several students made accurate estimates, and the class could see events of the past reflected in the word lists, I had students run their own practice analyses on several milestone documents in American history, including the Proclamation of 1763, the Marshall Plan, and the Federal Highway Act. When students explained to the class the extent to which their analyses produced expected results, word frequency seemed like a viable way to “read” a large collection of primary sources without really reading a large collection of primary sources.
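The class used an online text analyzer for these word lists, but the underlying word-frequency technique is simple. A minimal sketch of it in Python (the sample passage below is invented for illustration, not drawn from the State and Provincial Papers) might look like:

```python
import re
from collections import Counter

def word_frequencies(text, top_n=10, min_len=4):
    """Return the top_n most common words in text, ignoring case
    and skipping very short words (articles, prepositions, etc.)."""
    words = re.findall(r"[a-z']+", text.lower())
    counts = Counter(w for w in words if len(w) >= min_len)
    return counts.most_common(top_n)

# Illustrative sample text (not an actual excerpt)
sample = (
    "Whereas the Congress of the United States did pass an act "
    "to regulate commerce, and the several states did ratify the "
    "Constitution of the United States"
)
for word, n in word_frequencies(sample, top_n=5):
    print(word, n)
```

Run against a whole volume of period documents, the top of such a list (names, places, recurring concerns) is what lets students estimate the era and spot candidate research topics.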
Investigating Research Topics in Early NH History
Twenty-nine volumes of the State and Provincial Papers, covering the years 1630-1800 and consisting of approximately 25,000 pages, were divided among the class and each student ran at least two text analyses. After time was spent evaluating the results, and after students also explored other books like Jeremy Belknap’s 18th-century History of New Hampshire, each student was asked to use several pieces of scrap paper to propose potential research topics to be shared in a classroom gallery.
All proposals were considered, the most interesting were discussed, and in the end I steered the class toward three possibilities: Indian problems in different regions of New Hampshire; border disputes among different towns; and the state’s role in ratifying the Constitution. Regarding the Constitution idea, the student’s actual proposal had addressed New Hampshire’s “hesitancy” in the ratification vote. When the idea was investigated online, including searches of Google Books and Google Scholar, analyses turned up, but a town-by-town accounting of federalist and antifederalist sympathies seemed to be unavailable. Something like this ratification map of Massachusetts, in other words, was apparently not yet on the web. It was at this point that the final direction of the research project became clear.
Writing About the Map and Making Data Visualizations
With the map in place, students next worked on language for the website (which I was putting together at omeka.net) and then visualizations that would help clarify research findings. Before writing, students were asked to refresh their knowledge of federalist and antifederalist voting habits by reading overview articles from the library’s electronic reference collection. The readings were briefly discussed and then, in groups, students wrote copy and submitted their work in Google Classroom. The submissions were combined into one document, given one voice, and then collectively edited for inclusion on the website. (Yes, this took a while.)
Data visualizations were completed individually by students using Plot.ly, an online graphing tool that allows for uploading spreadsheet data. Each student submitted at least one visualization, and the best were included on the project website. While Plot.ly nicely presented data, DistanceFromTo.net greatly facilitated data collection. An online distance calculator, DistanceFromTo enabled students to easily measure the linear distance between key waterways and New Hampshire towns, and proved vital in proposing linkages between geography, economics, and the ultimate decision of New Hampshire residents to vote for or against the Constitution.
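Students used DistanceFromTo.net for these measurements, but the same linear (“as the crow flies”) distance can be computed from coordinates with the haversine formula. A minimal sketch, with approximate town coordinates I looked up for illustration (not figures from the project’s own dataset):

```python
from math import radians, sin, cos, asin, sqrt

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance in miles between two lat/lon points."""
    r = 3958.8  # mean Earth radius in miles
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlmb = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlmb / 2) ** 2
    return 2 * r * asin(sqrt(a))

# Approximate coordinates: Concord, NH and Portsmouth, NH
print(round(haversine_miles(43.208, -71.538, 43.072, -70.763), 1))
```

Filling a spreadsheet column with such town-to-waterway distances is what made it possible to graph distance against ratification votes and look for a geographic pattern.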
About a year ago I sat down to research and write an article for Knowledge Quest, the Journal of the American Association of School Librarians. I was asked by guest editor Pam Harland to contribute a piece on library spaces. Before agreeing, I thought about finding an under-documented angle on the topic. That brings me to the first two sentences of the article, which appear below, along with the full article (but only in a proof version because ALA locks down their content):
Much has been written about the advantages associated with the learning commons model of library design. Less has been written about its drawbacks. The open, technology-rich, and collaborative atmosphere of a commons nicely supports teaching, group work, and digital communication. Yet, for some tasks and for some students, the commons atmosphere might also limit the library’s usefulness.
For tasks requiring concentration, such as reading and problem solving, the social and active nature of a learning commons could be distracting. And for the more introverted student, a loud and busy room might seem uninviting. For these reasons, library design should take into consideration different types of work and different personality types. In a learning commons, therefore, space and resources should be organized and managed in ways that meet 21st-century learning needs but also ensure fairness and ….
Last week, Molly Wood, the NYT’s tech blogger, asked if a Chromebook is all you need. Last year, I asked myself the same question. My library was due for computer replacements and I wanted to use the opportunity to make the room’s workspace more spacious, flexible, and efficient. Chromebooks seemed perfect. I thought that by removing desktops and bringing in a cart of ultra-thin laptops, table space would emerge, as would the ability for students to work in comfortable chairs or even on the floor. Further, it’d be easy to collaborate in small, ad-hoc groupings. And it could be done quickly. The Chromebook’s solid-state structure, with its fast boot-up time, seemed like an excellent way to be “always on.” I wondered, though, how the place would function with a whole new OS and a whole new approach to computing. Would browser-based cloud computing meet the academic requirements of my school?
After a full year of trying it, it’s clear that the answer to this question is yes. The desktops, along with their slow network connection time, weren’t missed. Initially, more limited printing seemed like a significant loss, but it didn’t take long for teachers and students to become experts at sharing work through Drive. Similarly, my school’s adoption of Google Apps for Education seemed to be accelerated by the Chromebooks’ single sign-on for multiple integrated services, like mail, calendar, Docs, and add-ons like EasyBib. Now, many students can’t even imagine working with clunky software on a big, bulky machine. In short, as a result of abandoning desktop (and laptop) computing for a mobile approach to schoolwork, my library has become a more modern, more collaborative, and more productive workspace.
We also chose to deliver this new computing throughout the building. By using cushioned camera cases for five Chromebooks, we created what we call “Chromepacks” — kind of like field packs, but for computers. The library has a total of 41 Chromebooks, 25 of which are in Chromepacks available for use in the library or for classroom check-out. At the end of the day the camera cases are simply opened and placed alongside a DIY charging cart. The charging system, like the use of the Chromebooks themselves, is fast and easy.
While I haven’t yet used this rubric, I anticipate applying it to forum discussions over the summer for my AP U.S. History students. My plan is for the APUSH class to begin online in mid-August so that in-class coverage in September can begin with the later colonial period and follow a pace that matches the College Board’s exam schedule. Before making the rubric, which is based on one for student blogs, I searched the web for useful, ready-to-go samples. Although I like option #4 on this page from Simmons College, I thought that I could adapt the blog rubric to accomplish as much in fewer words.
Excellent. Student’s responses appropriately address initial and/or follow-up questions and replies. The student neither monopolizes the discussion nor simply joins in and drops out. Responses are focused and coherently integrate examples with insight and analysis. The student’s answers reflect an awareness of issue complexities and, when appropriate, consider alternative perspectives. Responses reflect thorough preparation and in-depth engagement with the topic.
Good. Responses are focused and appropriate in number, but analysis is limited as fewer connections are made between ideas. Although new insights are offered, they are not fully developed. The student’s participation reflects moderate engagement with the topic.
Fair. Responses are mostly description or summary and show only partial awareness of complexities or alternative perspectives. Few connections are made between ideas. The student’s participation is underdeveloped and reflects passing engagement with the topic.
Poor. Responses are unfocused or simply rehash previous comments. Participation is limited and displays little evidence of student engagement with the topic.
No Credit. No participation, or responses consist of disconnected sentences and show no preparation and no engagement with the topic.