It’s redundant work for everyone, except for the “Outcomes Assessment” administrators who are paid to make $hitwork up for faculty and students who would prefer to be left alone to get on with the business of studying physical anthropology, or engineering, or zoology, or Romantic literature, or something else that has actual interest and value to people other than “Outcomes Assessment” administrators.

Historiann, and the blogger she cites, Clio Bluestocking, and some of the commenters, have the following complaints about Outcomes Assessment: it's a bunch of non-experts coming in from outside and dictating how content experts should teach; it's a process that attempts to standardize content and teaching across departments, colleges, and locations; it's time- and labor-intensive, but departments are usually not granted any extra money or resources to comply with it; it homogenizes learning (through testing) and trains students to be uncreative and timid; and, most important, it assumes that the humanities and the outcome of a humanities class or education can be measured quantitatively. (Did I leave anything out?)
Reading this made me think of my MLA summer newsletter, which made me vaguely worried and uncomfortable when I read it a little while ago, though I was unable to put my finger on why at the time. (I thought maybe it was because it reminded me that the job list was coming out soon. Eeek!) If you haven't gotten around to reading it yet, here's an excerpt from Rosemary G. Feal's column:
Leave the Kleenex. Take the Data.

Humanities Advocacy Day, 2009. I am sitting in the audience with the MLA's president, Catherine Porter, and vice president, Sidonie Smith, at a panel called Making the Case for the Humanities. A university president anchors his talk with this little quip from his days as a dean: when faculty members from the sciences came to see him, he took out the checkbook; when faculty members from the humanities visited, he took out the Kleenex. Leaving aside the gendered attitudes and other biases encoded here, I wonder what made him view the humanities faculty as a bunch of whiners without a cause. Is it in part because we are without something the scientists have when they visit the dean: the data?

The scientific community enjoys the benefits of a federally funded data collection project, Science and Engineering Indicators, prepared by the National Science Foundation's Division of Science Resources Statistics, with guidance from the National Science Board (www.nsf.gov/statistics/seind08/start.htm). Our turn is coming, though. Thanks to the efforts of many ... the Humanities Indicators, a project of the American Academy of Arts and Sciences, has recently been launched in prototype form (www.humanitiesindicators.org).

While I am glad that the MLA is actually doing something (Historiann's earlier post about a professor who ignored controversial emails attributed to him makes a good case for the dangers of ignoring distasteful developments), I worry that our willingness to jump on the quantitative-data bandwagon will produce some short-term benefits but be hugely detrimental in the long term.
I know it's ironic, coming from someone who is currently compiling a table of incidences and who just posted a picture to her blog to document the quantity of her recent reading (not the quality --- whoo no, there is no quality in that pile of boredom), but I would argue that the essence of the study of the humanities is qualitative. In fact, in the past, the "human" part of the humanities has meant that part of life which cannot be quantified.
I worry that, in relying on quantitative data to justify such a deeply qualitative field of study to our administrators, legislators, donors, and the general public, we end up commodifying humanistic study --- that we train those aforementioned people not to value the very subjects and methods we are advocating for, and that they will become even less likely to understand the humanities work we do and why it is important. Will relying more and more on tables and charts and graphs of humanities "outcomes" and "excellence" end up devaluing, or transforming, the qualitative types of work we do in the humanities classroom?
After all, what's so great about quantitative data anyway? Well, let's see: it is a fast and efficient way of conveying information --- it will take you a lot less time to read my charts and timelines of who-was-where-when than it would for you to read all those documents and for us to have a deep conversation about them. (Let's save that for the novel, which is actually worth considering closely.)
We also currently have a strong cultural inclination to treat quantitative data as somehow more real or more rigorous (here I could link to Lyotard or Evelyn Fox Keller or Donna Haraway or whoever made this point first, but I'm lazy and don't want to hunt it up). I'd say this trend dates back at least to the invention of the stock market ticker and the mystical belief that somehow throwing around numbers necessarily produces more money (I could link to any number of recent articles about the stock bubble, the housing bubble, or Jon Stewart's attacks on Jim Cramer, if you'd like).
So is outcomes assessment just another part of the factory-university speedup? The whole point of humanistic study is to read and think in depth and then to talk about it, and to train our students to read and think in depth as well --- and then to communicate what they have discovered through writing and speech.* That's it. It's a model that doesn't lend itself well to Taylorization, rationalization, and efficiency. It's hard to turn it into a commodity with the attendant cycles of innovation and obsolescence of various bells and whistles --- at least hard to do that and have it remain recognizably the same. There's no product, no profits or dividends; ideally these activities --- reading, thinking, communicating, in a humanistic manner --- should be carried out in all aspects of life and won't be easily correlated with getting a job or making X amount of money. In fact, humanistic study is supposed to be so much more all-encompassing, and to affect so much more of your life than your earnings, that measuring such a limited outcome as wealth, rather than making a holistic assessment, is just silly.
Of course, the idea that there are whole swaths of life and society that are not about money and measurements and profits makes some people veeery upset. As does the idea that maybe profits and making money don't have to be an important part of your life, or of how your life is measured. As for me, I think that when a university's mission changes from educating students to proving to people that it is educating students, things have gotten really out of whack.
*I know, I know, craziness: the Marxist scholar endorsing humanism? Well, remember that Marx himself advocated that everyone work for 4 hours a day so that they all had time to study and create and entertain. Hell, I'd put in my 4 hours at the communal garden or garbage dump if I knew I had the freedom to think and read and talk about things that interested me in complete security.