Teaching History in the Digital Age
The title of this post is purely rhetorical because no one has asked me to teach a MOOC. In fact, I have not been involved with MOOCs at all, except as an observer from afar. Instead, the title is the result of me wondering why anyone would teach a course with tens of thousands of students enrolled (maybe more), students you would never meet, and one that demands an enormous amount of start-up effort (designing the course, filming the lectures, figuring out the grading algorithms, etc., etc.).
I understand why universities want to get MOOCs out there with their most prominent professors teaching them. Having a big name professor offer a MOOC brings many, many eyeballs to your campus logo (and, even better, to your website) and helps burnish your image in a global market for higher education. In short, MOOCs are marketing dollars well spent, even if they aren’t yet showing any sign they are good for the bottom line, given the terms that companies like Coursera are offering colleges and universities.
But why would a professor, especially a prominent (and presumably busy) professor, bother to spend all the time and effort necessary to bring a MOOC to market and then, one assumes, have some connection to its implementation? After all, designing a new course or redesigning an old one takes a lot of time in the analog world. When you consider the time required to film lectures, work with an editor to polish up that film and add in B-roll, design online assignments and assessments, and think through how students are going to progress through the various online materials, a MOOC represents a lot of time and effort.
After puzzling over this question, I can think of two answers.
The first is what we might call educational altruism. MOOCs offer faculty members a chance to make their courses available, for free, to the widest possible audience. As scholars we are supposed to be engaged in the circulation of knowledge, and being able to circulate one’s knowledge of a particular subject to 70,000 or 100,000 students, even if only a tiny fraction of them complete the course, is a potentially wonderful thing. I’m not sure that those students learn anywhere near what they would learn in a well-designed face-to-face class, given that MOOCs largely replicate the lecture/listen binary model that is so ubiquitous in large American universities. That model has been demonstrated in countless studies by cognitive scientists to yield only minimal learning gains, even when taught by famous or brilliant lecturers. But if the purpose of teaching a MOOC is to make one’s expertise in a given subject available, for free, to as many people as possible, that’s a laudable act. I’m not sure how much of this educational altruism there is out there, but I’m willing to admit that it might really exist.
The second reason is more mercenary and involves the sale of books and/or other collateral products. In particular, I wondered whether MOOCs offer faculty members an opportunity to make some serious money on the teaching and learning products they have created.
To test my idea that book sales might just be part of the reason why some faculty members would teach a MOOC, I randomly selected eight courses across the disciplines and from various universities on the Coursera website. I tried to do the same thing at the Udacity site, but one cannot read the course syllabi there. What I found was that on all eight syllabi, the only readings students were expected to do were from free and open-source/open-access materials. However, five of the eight professors recommended or suggested as optional books that they had written, ranging in price from $8 to $110. Of the remaining three, one recommended only open-source works, and the other two recommended books published by others for either $44 or $142.
If we assume for a minute that some fraction of the tens of thousands of students taking part in a given MOOC go ahead and purchase the “recommended” or “optional” book written by the professor teaching the course, the potential for significant earnings via book sales is very real. For the sake of argument, let’s say that I taught a MOOC that drew 50,000 students and I recommended as optional the ebook version of my new book ($19.95). And, for the sake of this same argument, let’s say that 10% of the students purchased a copy. Under the terms of my contract with the press, I would make just under $7,000 in royalties from the sale of those books. While $7,000 is not enough for the down payment on that beach house I’ve been wanting, it’s still $7,000 in additional income.
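For what it’s worth, the back-of-the-envelope arithmetic here is easy to sketch. The 7% royalty rate below is my assumption, chosen only because it matches the “just under $7,000” figure for my contract; actual contract terms vary widely by press and format.

```python
def royalty_estimate(enrollment, purchase_rate, price, royalty_rate):
    """Rough earnings estimate: copies sold times price times royalty rate."""
    copies_sold = enrollment * purchase_rate
    gross_sales = copies_sold * price
    return gross_sales * royalty_rate

# 50,000 students, 10% buy the $19.95 ebook, assumed 7% royalty rate
earnings = royalty_estimate(50_000, 0.10, 19.95, 0.07)
print(f"${earnings:,.2f}")  # roughly $6,982.50 -- "just under $7,000"
```

Even halving the purchase rate to 5% still leaves about $3,500 in additional income for recommending one’s own book.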
Different states and different institutions have widely varying rules (and even laws) governing whether faculty members can require students to purchase a book from which the faculty member receives income. But those rules were made with the standard course for credit model in mind. MOOCs disrupt that model by not offering credit and in the cases I looked at, by having all textbooks be “recommended” or “optional.” Once MOOCs move to the credit bearing/tuition charging mode, it will be interesting to see whether there is any change in this approach. I suspect there won’t be, if only because the openness of a MOOC begins to break down once it starts to get expensive for students.
Over the weekend my friend and colleague Peter Haber passed away after an extended illness. I was only fortunate enough to know Peter for the past four years, but I benefitted greatly from his friendship, his collegiality, his ideas, and his good humor.
Like my former colleague Roy Rosenzweig, Peter was a “connector” — one of those people who brought others together for the benefit of everyone. Through Peter I have met and begun to work with a number of colleagues in Switzerland and Austria, colleagues I never would have met otherwise. More importantly, though, my understanding of digital history and digital humanities is so much richer for having read Digital Past. Geschichtswissenschaft im digitalen Zeitalter (2011). What Peter brought to the study of digital history was a scientific rigor, a style of analysis, that is so often lacking in English language scholarship on our field. If I could quibble with one thing about the edition of the book that I own, it is the photograph of Peter on the back cover. In that photo, he seems dark and mysterious. Those who knew him well, know he was anything but dark or mysterious.
Perhaps the most tangible evidence of Peter the Connector is his co-authored volume (with Martin Gasteiner), Digitale Arbeitstechniken (2010). When I read these essays I came away with a much better sense of the kinds of work being done by my German-speaking colleagues in digital history — work I would likely not know if Peter and Martin had not collected it. More importantly, though, I began to think about several issues near and dear to me in new and different ways. That is what the best scholarship does for us.
But really, Peter’s greatest academic contribution, in many ways, has been Hist.net, perhaps the longest-lived digital history blog in any language. With his close friend and collaborator Jan Hodel, Peter spent more than a decade making all things digital and historical available and accessible to a wide audience. I knew of the blog before I knew Peter and Jan, and one of my happiest professional moments was the day I received an email from the two of them inviting me to speak at a conference in Basel. For my own family health reasons, I couldn’t attend that meeting and so I was very pleased (and relieved) when they kindly invited me back the following year to speak in Basel. That meeting was the starting point of our three way friendship and collaboration on Global Perspectives on Digital History, a project that kept us connected until he became too sick to continue.
One of the most enjoyable days I’ve spent in the past several years was with Peter, when he was still feeling fine, touring the Fondation Beyeler, then returning to Basel for a coffee. That is the Peter I will remember. But I will also remember the Peter who, when you said something he didn’t entirely agree with, would cock an eyebrow, pause, and then ask a probing question that politely disagreed, while trying to find a way that the two of us could agree. I will miss both of those Peters very much.
As a follow-up to my previous post about history’s gender problem, I now want to offer some possible solutions for our discipline. Before I do, however, a bit more context on the gender problem History faces here at George Mason seems warranted. Of the undergraduate programs in our college with more than 100 declared majors, only three have enrollments where fewer than two-thirds of those declared majors are female — History (40%), Government (41%), and Economics (34%). Every other substantially enrolled major in our college is more female than the university average of 62%.
Further, our MA enrollments are similarly skewed. Overall MA enrollments in the College of Humanities and Social Sciences are 60% female, but in History, MA enrollments are only 42% female. Thus, the problem I identified in my previous post extends beyond the undergraduate years into the MA. Given what Rob Townsend has written for the American Historical Association, I suspect we are very typical of history departments nationwide.
What then can be done to deal with history’s gender problem (and not just at George Mason)?
Too often, the standard answers to this sort of gender problem in an academic discipline are to increase the number of female faculty and/or to teach more courses that will appeal to female students. To my mind, the first of these is pretty obvious and needs constant attention. Even in a department that is changing rapidly, only 40% of the tenure track faculty in History here at Mason are female, so achieving a full gender balance is something we’ll need to keep working on. But it’s the second of those proposed solutions that I think is off the mark.
First of all, such phrasing assumes that male and female students can’t or won’t be interested in the same things about history, and second, it tends to turn on simplistic notions about preferences, such as that male students want military history (and women don’t) and/or that female students want women’s history (and men don’t). While I think information about student preferences for course content is important, the problem is more complex than simply offering a few more courses of one type or a few fewer of another.
Instead, I think the problem seems to lie in the way history is taught and in the ways we conceive of and describe to students what they might do with their degrees in history. One of the most important reasons I say “seems to” here is that there is very little in the way of solid data on the role that gender plays in the choice of major in college, and what little data exist tend to be focused on the much greater gender gap in the STEM fields.
Nevertheless, it is possible to glean some useful information from some of the STEM-focused studies. For instance, a 2009 report by Basit Zafar, an economist at the Federal Reserve Bank of New York (“College Major Choice and the Gender Gap“), offers some very interesting data on the role gender plays in the choice of major. Zafar’s study was limited to students at Northwestern University and so does not pretend to be broadly predictive. However, it does offer a very rigorous analysis of the data. Zafar concludes that gender differences in major choice are not based on expectations of future income, nor are they explained by differential levels of confidence in one’s academic abilities, nor (for those with US-born parents) do beliefs about the status of a future job resulting from a major play an important role in the choice of major.
Instead, Zafar concludes that for those with US born parents the most important factor in the choice of major is the degree to which one expects to enjoy the coursework and the degree to which one expects to enjoy a future career tied to that major, with female students having a much greater concern for these two factors than male students (pages 25-28). For those with foreign born parents, whether male or female, perceptions of the status of the major and the status of jobs that might result from that major play a more important role for both male and female students, but especially for male students (20).
Assuming for a minute that Zafar’s data could be replicated across a much broader sample of students, then we need to think very carefully about the ways we teach about the past. Ask a group of graduating history majors how much diversity there was in the teaching methodologies they experienced in their history courses and I think it’s a safe bet that they will say, “not much.” The vast majority of history classes follow a general lecture-plus model in which professors mostly lecture with some discussion time thrown in daily or weekly. At some point this style of teaching has to become boring, no matter how good the professor is at delivering it.
We also need to think very carefully about the ways we talk about careers our students might pursue after graduation. As the digital economy rolls over us, the work our students will be doing after graduation is increasingly very different from the work they might have done five or ten years ago, but by and large our descriptions of that work remain the same, rooted in a series of generalized notions about what one might do with a liberal arts degree. It’s time for us to get much more specific about the jobs our students are getting/will get in the new economic reality they’ll be living in.
Which brings me to my final point — these two considerations do not exist in isolation from one another. Instead, they are inextricably linked. One way to increase the levels of enjoyment our students experience (or expect to experience) is to begin creating courses that break the lecture-plus model and incorporate project work, service learning, and other forms of “doing history.” Rather than continuing to talk to them or with them about the past, it’s time to develop courses that get them into the field, into the archives, and into workplaces like museums or historic sites; in short, to give them a chance to exercise their creative energies. One more great lecture or one more well thought out five-page essay assignment just isn’t going to do that.
Examples of what I’m talking about exist all over the country, but they are the exceptional courses in history curricula. If we are going to take seriously the notion that our gender problem — which is very real — needs to be addressed, then it’s time for a national conversation about how changing our curriculum is the way to address that problem.
In the April issue of Perspectives, Rob Townsend offers what is perhaps his last analytical article for the American Historical Association’s monthly newsletter (Rob has moved on from the AHA to a new job): “Data Show a Decline in History Majors.”
From the title of this post, you might be inclined to think that I’m worried that a decline in history majors is the looming disaster for history departments around the country. If only it were that simple. You see, undergraduate history programs don’t have an enrollment problem. We have a gender problem.
According to the National Center for Education Statistics, in 2010 just under 57% of all undergraduate students at 4-year non-profit institutions of higher education were female, and the data for degrees conferred are similar. According to Rob’s article, fewer than 41% of the BA degree recipients in history departments were female in 2011. Our data here at George Mason are even worse. Female history majors represent only 40% of our total at an institution where 62% of our undergraduate students are female.
That yawning gap between overall undergraduate enrollments and history enrollments is the size of our gender problem.
The problem is bad enough on its own to require us to take action as a profession. In addition to the obvious need to do something about the relatively low popularity of history as a discipline among undergraduate women, we also need to fix this problem for pragmatic reasons. As has been reported widely over the past several years, institutions of higher education are increasingly enrollment driven. This isn’t news to private institutions, which have been living and dying by their enrollment numbers for years. But it is a new experience for many public institutions, which only in the past decade or so have been learning what it’s like to live or die by the same data. In this fiscal environment, if we don’t fix our gender problem soon, history departments all across the country should expect to see tenure lines and other important resources shifting to departments with more robust enrollments — enrollments that will only be robust with large numbers of female students.
What is to be done? None of the answers are simple or obvious and there is certainly no silver bullet that could solve our gender problem in undergraduate history education. Instead, I think it is high time we embark on a sustained conversation about change in undergraduate history education — including changes that will make our discipline just as appealing as other majors are to the largest segment of the undergraduate enrollment on our campuses.
The alternative is to decide that history is doomed to be an ever smaller part of the undergraduate enterprise. I believe that if we really commit ourselves to doing something about our gender problem, we can and will find ways to change for the better. But we need to commit. And soon.
My previous post about digital historical text generated some very interesting comments, both here and on Twitter. I met with my students again last night and we had an extended discussion about those discussions, so thanks to everyone who chimed in. What follows is a summary, more or less, of our conversation last night.
We were particularly taken by Steve Ramsay’s critique of my post, especially the following paragraph:
If so, your problem clearly necessitates access to the original work. But if you are concerned merely to read it, it seems to me very hard to argue against a digital copy. And the truth is that even digital copies can rival the originals for problems that apparently involve the “thingness” of the thing. Scans of the Beowulf manuscript — which no responsible scholar should ever touch — are of such density that one can see the hills and valleys of the vellum. I’m unable to imagine what it is about scans of the War Papers that make the original “disappear from view” or resistant to prioritization as historical sources. Are you prepared to argue that Spencerian handwriting moves documents up and down the hierarchy of importance?
None of us was arguing that digitizing texts was, in and of itself, bad. We all agreed that access to the content of those texts was an unqualified good. And I’ve gone back into the original post and clarified my language about the War Department project, because the way I wrote one sentence made it sound as though I was unhappy with the scans of the documents (which are copies of the originals due to a fire that destroyed the originals — see the project page for more on this issue).
Nevertheless, we all agreed that as historians, we care about the “thingness” of the source, and we care a lot about that. Not because of some “thinly veiled nostalgia” for the thing itself, but because texts are both texts and historical artifacts, and so students of the past need access to that thingness if they are to understand both aspects of the source — its content and its materiality.
The importance of the text itself is pretty obvious and so doesn’t need clarification. But the materiality does. We discussed, for instance, the problems posed in teaching with historical newspapers via a database like ProQuest Historical Newspapers. The ProQuest search delivers the requested story abstracted from the page it appeared on. The full page is available as well, but unless students are taught what a newspaper is — how the arrangement of content on the page and its placement in a section is the result of a dynamic process involving editors, writers, and layout staff — they will have no sense of why the placement of a story sometimes matters as much as its content. “Above the fold” and “below the fold” become meaningless when a database serves up only the story.
ProQuest at least returns a PDF of the original story, so students can see the typeface and (often but not always) the images that went along with the story. And they can examine the headline and consider why a headline might be more sensational than the content of the story warrants — again, as a result of that dynamic process involving several actors I just described.
As for the hierarchy we assign to sources, we also agreed that sometimes we might just assign a different importance to a source based on things other than the words in the text — that all sorts of other factors, most of them material, might convince us that this or that source was of greater import. Knowing everything about the source — not just its words, but the marginalia, its placement in a collection, or where it was found — can all shed potentially important light on what the source means and meant to others at the time it was created or later.
Given all of that, we wanted some sort of best practices for digitizers that would include common standards on such things as images of the original to accompany the plain text on a white screen. As Sarah Werner wrote in her comment, creating such standards will require historians, bibliographers, archivists, and technologists to get together and discuss, among other things, what they (and our students) aren’t seeing when all we get is black pixels on a white screen.
Several years ago I took a group of Mason students to Prague, Vienna, and Budapest. Among the things I’d planned for them was a visit to the Klementinum in Prague where the Codex Gigas (the “Devil’s Bible“) was on display. Needless to say, when I told them we were going to a library to look at a book, they were decidedly underwhelmed. Until they saw it up close and personal.
At 90 cm x 50 cm and weighing in at 75 pounds, it’s quite a book and was unlike anything they had seen or expected. More intriguing to them, though, was the legend surrounding the work. Created sometime between 1200 and 1230 in a monastery in Bohemia, the story that goes with the bible is that the devil himself helped a monk create it in just one night. In exchange, the monk included an image of the devil as part of the text decoration. Despite their earlier reluctance to go look at a book, the students pronounced the whole thing kind of cool.
I was reminded of that trip the other night during a tutorial I’m leading with four of our most talented doctoral students. One of those four, Jeri Wieringa, asked one of those questions that students ask with some regularity and that make us think really hard. I’ll paraphrase what she asked: “If we digitize texts and present them to students as just so many pixels, are they losing an essential connection to the text as a historical artifact?”
This question led to an energetic discussion around our table. On the one hand, there are obvious advantages to digitizing texts. At the most basic level, the texts, especially those from before the age of the typewriter, become much more legible and therefore accessible to a wide audience. Anyone who has taught pre-typewriter texts knows just how reluctant students can be when it comes to trying to make sense of handwriting from back in the day. Even excellent tutorials like the one on decoding Martha Ballard’s diary can reinforce the notion that such handwriting is essentially unreadable except by experts or code breakers.
A second obvious advantage is that the text becomes fully searchable in ways that it can’t be when it is just an image of a document. Our Papers of the War Department project here at RRCHNM is a great example of the advantages of having transcribed texts to sort through and analyze using the text analysis algorithm of your choice.
Finally, making the text available in this way opens up any digitized collection to crawling by the various search engines, thereby opening up the collection to a much larger audience.
But, and this was the but that we got stuck on in our discussion, the artifact itself can disappear from the view of the researcher if, as in the case of the War Department project, an image of the original is not also available to the researcher.
To put it another way, the coolness of the text as artifact disappears when all the researcher/student sees is black pixels on a white screen. Yes, it’s much more readable and accessible. But there is a bigger potential problem–and this is the one that really troubled Jeri. An essential task of the historian is to assign greater or lesser value to a particular historical source based on his/her growing expertise in a given subject. Some documents are just more important to a given problem or interpretation than others and it’s up to us to help others see that.
But if all documents are reduced to black pixels on a white screen, they start to seem all the same. Given that students/novice historians often have a difficult time placing sources in a hierarchy of importance that they are developing, if all texts look the same, are we making it more difficult for them to develop this skill of prioritizing some sources over others?
We arrived at no answer in our conversation and despite two weeks of ruminating on the issue, I still don’t have one. I’m just going to have to worry about this one for a while longer.
Regular readers of this blog know that in 2008 I created a course called “Lying About the Past” in which my students studied how, over the past several centuries, a variety of people have created false versions of the past, for fun or profit. The goal of the course was to teach my students much greater skepticism about historical sources, especially online historical sources, and I feel very confident in saying that the course, which I taught a second time in 2012, achieved that goal with flying colors.
What made this course controversial, to a small degree in 2008 and to a much wider degree in 2012, was that in each iteration of the course the students created a historical hoax and turned it loose online for ten days to see if they could fool anyone. Because we were not in the business of creating what a colleague calls “zombie facts,” the students exposed their hoaxes after the allotted ten days and then assessed what had and hadn’t worked in their project and why.
Those who disagreed with the notion that my students should turn their (very innocuous) hoaxes loose for a few days felt that I was teaching my students to behave in very unethical ways, that we were somehow polluting the web, or that we had violated something one critic called the implied “academic trust network” that exists online. Of course, my students and I completely understood these criticisms–they were all issues we discussed in great detail in the course. You had to be there each semester to see the care my students took thinking through these and other ethical issues to understand just how central ethical discussions were to the entire course. In fact, I think it’s fair to say that my students spent more time discussing the ethics of the historical profession in this course than in any other history course they have taken or will take.
In 2012 I proposed to my department that Lying About the Past be made a part of the regular curriculum of the department, by which I mean the course would receive its own number and be added to the university catalog as one optional course among dozens that we offer. The undergraduate committee in my department decided that the proposal could go forward only if I agreed to change the central component of the course — making the hoaxes purely classroom presentations rather than turning them loose online. Because placing the hoaxes in front of an unknown audience is what gave the course its energy and excitement, and made it fun, changing the format in this way would have turned the class project into yet another abstract, classroom-only exercise and would have sucked the life out of the course. I therefore declined to make the change, and the undergraduate committee subsequently rejected my proposal.
What this means is that I won’t be teaching Lying About the Past any longer at George Mason, which I’m sure will make my critics happy, especially Jimmy Wales, who pronounced himself “annoyed” about the whole thing.
But I also think it’s worth considering what the decision of the undergraduate committee means in terms of how we regulate teaching as opposed to research. In essence, my colleagues (who, by the way, I respect very much) decided that it was acceptable to tell a faculty member that he could not teach a course because they disagreed with the teaching methodology. Can you imagine the furor that would ensue if the word “research” were substituted for “teaching” in the previous sentence?
I asked several of my colleagues who had been at Mason for more than 20 years if they could remember a time when a professor had been denied the right to teach a course as he/she saw fit and none could. It’s an interesting and potentially disturbing precedent my colleagues have set, because it says that teaching methods can be regulated in ways we would never allow when it comes to our research.
I have another course up my sleeve that will be almost, but not quite as disruptive as Lying About the Past was. As soon as it is in the schedule of classes, I’ll be sure to post an advance notice here.
[NB: I'm posting this on March 31, not April 1 so that it's clear the entire above message is not a hoax. Trust me, it's not.]
It won’t be long (one month, actually) before Teaching History in the Digital Age is available. But the cover has now appeared on the Michigan Press website and I’m very pleased with the result.
Of the many different courses I teach, the one I’ve made the fewest changes in over the past decade is my survey of modern Eastern Europe. Every other course I teach has been reconfigured in various ways as a result of my research into the scholarship of teaching and learning, but for some reason, I’ve never gotten around to altering this course. I’m ashamed to say that when I taught it last semester, it was really not that much different from the way I taught it for the first time way back in 1999.
I could offer various excuses for why that course seems so similar to its original incarnation, but really the only reason is inertia. I’ve rewritten four other courses and have created five others from scratch in the past six or seven years and because my East European survey worked reasonably well, it was last in line for renovation.
The good news for future students is that I’ve taught it that way for the last time.
Like all upper division survey courses, HIST 312 poses a particular set of challenges. Because we have no meaningful prerequisites in our department (except for the Senior Seminar, which requires students to pass Historical Methods), students can show up in my class having taken no history courses at the college level. And even if they have, the coverage of the region we used to call Eastern Europe is so thin in other courses that it is as though they had never taken another course anyway. That means I always spent a fair amount of time explaining just where we are talking about, who the people are who live there, and so on, before we got to the real meat and potatoes of the semester.
And then there is the fact that this course spans a century and eight countries (and then five more once Yugoslavia breaks up); it’s a pretty complex story.
To help students make sense of that complexity, over the years I’ve narrowed the focus of the course substantially, following Randy Bass’s advice to me many years ago: “The less you teach, the more they learn.” We focus on three main themes across all this complexity, and by the end of the semester, most of the students seem to have a pretty good grasp of the main points I wanted to make. Or at least they reiterate those points to me on exams and final papers. And it’s worth noting that they like the course. I just got my end of semester evaluations from last semester, and the students in that class rated it a 5.0 on a 5 point scale, while rating my teaching 4.94.
What I don’t know is whether they actually learned anything.
This semester I’m part of a reading group that is working its way through How Learning Works, and this past week we discussed the research on how students’ prior knowledge influences their thinking about whatever they encounter in their courses. This chapter reminded me a lot of an essay by Sam Wineburg on how the film Forrest Gump has played such a large role in students’ learning about the Vietnam War. Drawing on the work of cognitive psychologists and their own research, Ambrose et al. and Wineburg come to the same conclusion, namely, that it is really, really difficult for students (or us) to let go of prior knowledge, no matter how idiosyncratically acquired, when trying to make sense of the past (or any other intellectual problem).
The research they describe seems pretty compelling to me, especially because much of it comes from lab studies rather than water cooler anecdotes about student learning. Because it’s so compelling, I’ve decided to rewrite my course around the notion of working from my students’ prior knowledge. Getting students from where they are when they walk in the room on the first day of the semester to where I want them to be at the final exam is the challenge that will animate me throughout the term.
My plan right now (and it’s a tentative plan because I won’t teach the course again for a couple of semesters) is to begin the semester with three short in class writing assignments on the three big questions/themes that run through the course. I want to know where my students are with those three before I try to teach them anything. Once I know where they are, then I can rejigger my plans for the semester to meet them where they are rather than where I might like them to be. And then as we complete various segments of the course I’ll have them repeat this exercise so I can see whether they are, as I hope, building some sort of sequential understanding of the material. By the end of the semester I ought to be able to track progress in learning (at least I hope I will), which is an altogether different thing than hoping to see evidence of the “correct answer compromise.”
What do we really know about how our students generate answers to historical questions? Thanks to Sam Wineburg, Peter Seixas, Bob Bain, Stephane Levesque, and others in their orbits, we know a good bit about how K-12 history students reach their conclusions about the past, but when it comes to higher education, we know far too little. In fact, we’re often puzzled by the answers our students arrive at. Why did they assign great importance to a particular piece of evidence when our view is that this piece of evidence was just a run-of-the-mill source, not particularly worthy of extra attention? Why is it so hard to shake them from their belief that, say, people in the past wanted the same things that people today want?
To date, too many of our answers to these and other such questions have been based on folk wisdom about “kids today” or an over-reliance on what we observe in our classrooms as representative of “all students.” Real research, based on real data, would surely take us much farther down the road toward understanding how our students think.
Fortunately, scholars in disciplines other than history have done some hard thinking about these issues and, just as fortunately, have done that real research, generating real data.
It’s not every day that a historian reads an article with a title like “The Role of Intuitive Heuristics in Students’ Thinking: Ranking Chemical Substances,” but read it you should. [Science Education, 94/6, November 2010: 963-84] The authors, Jenine Maeyer and Vicente Talanquer, proceed from the assumption that the better we understand how our students think, the better our curricula can be. This is an entirely different approach from one that asks, “What should students who graduate with a degree in chemistry/history/sociology know?” That question needs to be answered in every discipline, but if learning is the goal of our teaching, then we must understand how that learning occurs as we design those curricula. To do otherwise is to waste our time and our students’.
Maeyer and Talanquer begin with a question: What are the cognitive constraints that impede their students’ ability to engage in the kind of careful and complex analysis they want to induce in their courses? Drawing on 30 years’ worth of research from cognitive science as well as classroom research in the sciences, they describe two constraints and four reasoning strategies arising from those constraints. While they are writing about the analysis of chemical substances, a history teacher could very easily substitute “primary sources” and “history” for “substances” and “chemistry” and learn a lot from their results.
The two cognitive constraints they describe are implicit assumptions and heuristics (short cut reasoning procedures). In history, an implicit assumption would be that during the era of the women’s suffrage movement, all women wanted the vote, because of course women would want the vote. These implicit assumptions are very powerful and difficult to break down, in large part because they are so rooted in a learner’s view of how the world is.
Heuristics are the root of many problems in education in whatever discipline, but the authors argue that if students can learn how these heuristics govern their analytical strategies, they can then begin to learn differently. And once that happens, they are more likely to examine their implicit assumptions about the world.
All of us are beneficiaries and victims of our own heuristics. For example, the quick thinking that results from years of driving experience helps us recognize, without even thinking about it, that the car in front of us is about to do something stupid, so we slow down and give the driver room to do whatever he is about to do. The short cut reasoning procedures we develop as drivers lead us to reasonable conclusions at lightning speed.
But our short cut reasoning can also lead us into errors of analysis. Maeyer and Talanquer identify four heuristics that get in the way of the kinds of learning we want to induce: the representativeness heuristic, the recognition heuristic, one-reason decision making, and the arbitrary trend heuristic.
The representativeness heuristic is one in which we judge things as being similar based on how much they resemble one another at first glance. We see this often in our history classrooms as, for instance, when a student leaps to the conclusion that two works of art separated by both temporal and cultural boundaries must be similar because they kind of look alike.
The recognition heuristic is what happens when we look at a number of pieces of historical evidence, but recognize only one of them, and so assign a higher value to the one we recognize for no reason other than that we recognize it. In the history classroom, this happens when a student is confronted with four or five texts, one of which is familiar, and so focuses all of her attention on that text, to the point of deciding that this text is the most important in the group, even if it is not.
One-reason decision making happens when students make their decisions about evidence based on a single differentiating characteristic of that evidence. So, for instance, in that group of four or five texts, our student might decide that because only one of them actually mentioned something of importance that she is studying, it is somehow more important than the other four when trying to figure out what happened back when the texts were written.
The arbitrary trend heuristic is one we see not only in our students, but in the works of our colleagues. Because several historical sources were generated within a few miles of one another, or within a few weeks of one another, we assume that they must, somehow, be connected to one another, without any evidence to support this hypothesis.
All of these heuristics occur at various moments throughout the semester in our classrooms, regardless of the discipline we teach. Not all students utilize these short cut strategies all the time, but most of them deploy one or the other at some point in the semester. Knowing that this is the case, we can then design our courses to address these thinking strategies.
I wish someone had assigned me this article 20 years ago. Of course, it hadn’t been written yet, so that wouldn’t have been possible. But if it had, and I’d read it back before I started teaching history, my life would have been so much easier and my students’ learning would have been so much richer.