

Cædmon Citation Network - Week 9

Posted: Jul 18, 2016 09:07;
Last Modified: Jul 18, 2016 09:07

---

Hi all!

I finally get to start reading this week! While I have not yet finished sourcing all of the books and articles, it looks as though I will definitely be able to start reading by Wednesday, if not earlier.

I also have a bunch of books from inter-library loans that I need to scan portions of. That will be part of my job today.

The database will be ready this week as well. Garret says there are still a few improvements he wants to make, but I will be able to start using it this week. All the information I collect will remain available as the database is upgraded.

You may have noticed that I have switched to blogging at the beginning of the week rather than the end. At this point I find it more useful to post at the start of the week, outlining some goals, and then add an update post mid-week. I am going to continue this model for the next while.

Until next time!

Colleen

----  

A "Thought Piece" on Digital Space as Simulation and the Loss of the Original

Posted: Feb 11, 2015 11:02;
Last Modified: Mar 04, 2015 05:03

---

A “Thought-Piece” on Digital Space as Simulation and the Loss of the Original: Final Paper for Dr. O’Donnell’s English 4400: Digital Humanities, Fall 2014

          In beginning to think about how I could integrate theory into my final project, I recalled Kim Brown and the DH Maker-Bus, and how she described the way her workshops with children prompt kids to ask “big questions”. It occurred to me that the way humanists approach their own work often depends heavily on how humanity and culture are defined. It also occurred to me that now, more than ever, humanity and technology are converging. In this paper I want to explore the ways technology and the digital are seen as “copies” of an “original”. Drawing on theories of post-humanism and post-modernism, I will discuss technology and the internet as simulation. This paper examines technophobia in the humanities and looks to Jean Baudrillard’s theories of simulacra, simulation, and the hyperreal in an attempt to explain resistance to the digital and to technology in scholarship, while also examining the larger implications of the copy replacing the original. I will attempt to deconstruct the lamentation of the loss of an original to the simulations made possible by technology, and to consider how this affects understandings of things like research, the humanities, and humanity itself.

          To begin to deconstruct the lamentation of the loss of the original, and the resistance to the simulated, or to technology, in the humanities, I think it is important to discuss Baudrillard’s theoretical notion of simulacra. The internet can be seen as a hub of simulation. Sites like Facebook and Twitter, email, and Skype simulate physical forms of communication, and online shopping websites simulate the physical shopping experience. People have virtual relationships and pets, and can gamble, send money, publish, and donate to charity online. If one “goes shopping” online, did they really go shopping? The idea that “shopping” means anything other than physically going to a store is relatively new. With online shopping, the consumer is very much detached from any product, and uses simulations of money (debit, credit, or PayPal) in the privacy of their own home. The physicality is removed, and the process becomes much more abstract. However, the lack of physicality does not make it any less “valid”. Rather, the way shopping has traditionally been defined must be re-examined in the context of a hyperreal digital era. Researching online is no less valid than researching in a library. The idea I have heard purported by some of my professors, that online research is easier or equates to less zealous or engaged students, is supported only by the elevation of the original, the original in this sense being the physical book in the physical library. Whether information is in print or online, the idea that knowledge is easier to learn, or less valuable, when digital seems to suggest an obvious hierarchy in the value of medium. The present (although not necessarily pervasive) fear of or resistance to digital spaces in the humanities can perhaps be explained by the notion that because there is so much virtual content, so many simulations, in a digital environment, the truth is elusive. I think this stems from the idea that the “real” truth exists as something physical, which has been authenticated simply by existing in a physical form, while simulations (being further detached from “Reality”) distort and become further removed from truth. The internet can be understood in many ways as the epitome of simulation and the hyperreal. Baudrillard recognized the virtual world as a fourth level of simulacra, building off of his previous three levels: the counterfeit, production, and the code-governed phase. “The counterfeit” (Baudrillard 50) is the level closest to the original; “production” is the reproduction of the original; and the code-governed phase is a much more abstract assemblage, rooted in signs and completely detached from the original. This third, “code-governed” phase refers primarily to language as code. For Baudrillard (and Derrida, Saussure, and others), language creates a distance from reality; in many ways, language is a tool used to simulate reality. In hyperreality, a space comprised primarily of “copies”, and, for the purpose of this paper, of virtual, digital spaces at the third or fourth level of simulacra, the simulation often becomes the original. Digital hyperreality allows for interaction with the thing that is not present, the lost or displaced “original”.

          If we apply this understanding of simulation and hyperreality to online scholarship, research, reading, teaching, and interaction, the “original” is the physical. That is to say, texts contained in physical books, in physical spaces, are privileged as closer to nature (although not equated to nature, since text itself is simulation, or code). Digital spaces are consistently dismissed as a viable research option in the humanities. Digital text disrupts the lasting finality of print, and seems to threaten the sanctity of Truth in the nature of its detachment from the physical or original. The act of moving our bodies from a home space to a study space authenticates research: I conducted research because what I interacted with was real, in the molecular sense. Reading “From Modernism to Post-Modernism” while holding the book in my hands means I read the book. Did I still read the book if I read it online? Did I still “talk” to my professor if I sent an email? Am I still a person, a human in my entirety, if I have a digital eye?

          Many post-modern theorists, such as Fredric Jameson, saw simulation in terms of its artificiality, and that in itself carried the connotation of inferiority. For example, the consumption of non-artificial, non-simulated foods is praised and sought after. There is a desire to purify foods. More and more often, marketers carefully craft product information to distance their products from anything that is not natural, or not “originally” present in nature. Marketers include words such as “all natural”, “organic”, “home-made”, “home-grown”, and “authentic”, and many products (food specifically) are advertised as “GMO-free” or as containing “no artificial colors or flavors”. In a more abstract sense, consuming food which has deviated from an “original” is seen as inferior, despite the fact that studies such as A. L. Van Eenennaam and A. E. Young’s “Prevalence and impacts of genetically engineered feedstuffs on livestock populations” have concluded that “[n]umerous experimental studies have consistently revealed that the performance and health of GE-fed animals are comparable with those fed isogenic non-GE crop lines” (Van Eenennaam and Young). The mere fact that certain foods use technology threatens the sanctity of the original. In a similar way, technology is often demonized as a violation of biology. There are exceptions, and certainly the average person would not reprimand (in any explicit way) an elderly person with a pacemaker, or someone with prosthetic limbs. Figures like Donna Haraway posit that the definition of “human” is largely based on biological, anatomical qualities, such as DNA and naturally occurring physical features. Anatomically, an amputee with prosthetics does not qualify as a human under Wikipedia’s bio-centric definition of a human. Is a person with prosthetics 81% human and 19% machine? Where is this line drawn? At what point is a person too far removed from the “original” to be considered a human? If I am anatomically 75% machine, am I still a human? No, not based on the current definition. “In the post-human, there are no essential differences or absolute demarcations between bodily existence and computer simulation, cybernetic mechanism and biological organism, robot teleology and human goals” (Lenoir 204). The copy, or simulation, of body parts removes the cyborg from the current category of what it means to be human. However, this loss of the original allows for the production of emancipatory copies (Baudrillard). This can be seen in terms of the cyborg: a failing heart allows for its copy, a pacemaker. We can look at the integration of technology into the body as a step toward the eradication of the distinction between original and copy. This has serious implications for humanity itself, in addition to the humanities as a discipline.

          In privileging only the original, natural, biological, and physical, we leave no space for “the copy”, the simulation, or the hyperreal, where the original fails or inconveniences. Baudrillard says: “the extinction of the original reference alone facilitates the general law of equivalences, that is to say, the very possibility of production” (52). This is particularly applicable to things like web-based journalism, scholarship, and communication. Baudrillard sees the loss of the original as emancipatory. He continues: “Through reproduction from one medium into another, the real becomes volatile, it becomes the allegory of death, but it also draws strength from its own destruction, becoming the real for its own sake, a fetishism of the lost object which is no longer the object of representation, but the ecstasy of denigration and its own ritual extermination: the hyperreal” (72). If we apply this to the notion of online research in English literature, or the consumption of e-books, the real, that is, the physical, does draw strength from its destruction. Simulation through virtual mediums allows people to engage with content despite physical limitations. Murray McGillivray illustrated this perfectly in his talk. He discussed the nostalgia for the original that Baudrillard alludes to, and commented on how extraordinary the original manuscripts of medieval texts are in their physical state. But he also recognized that, for the average student, accessing these originals is simply not possible, for a number of reasons. The simulation of such a text allows other people to view its content. In turn, The Cotton Nero A.x Project, a simulation of medieval manuscripts, literally replaces the original for a reader such as myself. That is powerful, given that I will likely never see the physical manuscript in my lifetime.

          Digitized text and content may be an element of hyperreality insofar as a website containing a book is not the same thing as a book. However, digital simulations can be seen as emancipatory for a number of reasons. Provided the information is open access, or free to view, any person with a device and access to Wi-Fi is able to view that document, regardless of time and place. (This does not take into account disadvantaged groups, or third-world countries with little or no internet accessibility.) Multiple people can view the same thing simultaneously, freeing the content from the confines of a physical object, which can only be consulted in one place in its “original” form. Its copies can transcend space. While the internet and digital content are often blamed for distraction and poor performance, digital content has also proved to help people become more efficient. The physical book is nostalgic; it is comforting and personal, and often carries with it a sense of attachment, due to its physicality. I have heard professors and fellow students observe how easily students become distracted, how it is more difficult to sit down and read a book with unwavering concentration. Of course it becomes more difficult, since so much of how we operate within the world is now digital, instantaneous, and simulated. However, the emancipation Baudrillard alluded to can also be applied to the consumption of digital text. Instead of seeing digital content as a formidable “copy” and lamenting the loss of the original, we can look to technology which relies on our detachment from the physical as its selling point. Take “Spritz”, for example. I use the “ReadMe” application, which is partnered with a developer called “Spritz”. Spritz’s whole software concept is to take the fast-paced, ever-changing, video-centric character of the internet and use it to help people read quickly. I downloaded the application “ReadMe”, which separates the words of a text and displays them individually, one by one, in order. The words are displayed individually, as fragments of digital text, but, as the website points out, it is not practical for the average reader to read 500 words per minute. This format, however, allowed me to read 25% of a 200-page book in about half an hour, with arguably better comprehension than using the “original” method. Accessing books this way is even farther removed from their originals. But that is its precise advantage: its medium is advantageous, yes, but it makes reading less about the book and more about the words. As an English major I found this technology invaluable. “When reading, only around 20% of your time is spent processing content. The remaining 80% is spent physically moving your eyes from word to word and scanning for the next ORP. With Spritz we help you get all that time back” (Spritzinc.com). The original, in this case, can be seen as a hindrance because it is simply not as efficient, in my own experience. The copy in this instance is not a book at all. Baudrillard explains how digital environments, in a way, erode the thing(s) simulated in these digital spaces: “at this (virtual) level, the question of signs and their rational destinations, their ‘real’ and their ‘imaginary’, their repression, reversal, the illusions they form of what they silence or their parallel significations is completely effaced” (Baudrillard 57).
What Baudrillard is saying here is that the signifier, in this case virtual content, is so far removed from the original that the definition of the book itself is completely eroded. The e-book is no longer a book, the e-transfer is no longer a transfer in terms of its physical definition, and therefore reading a book online is not really “reading a book”. Reading an e-book is reading a simulation of a book: a copy of an original.
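The mechanics of this word-by-word display (a technique generally known as rapid serial visual presentation) are easy to illustrate. The short sketch below is purely hypothetical: it is not Spritz’s or ReadMe’s actual code, and the function name, pacing logic, and words-per-minute default are my own assumptions. It simply flashes each word of a text in the same screen position at a fixed rate.

    # A minimal, hypothetical sketch of word-by-word display ("spritzing"),
    # not Spritz's or ReadMe's real implementation.
    import sys
    import time

    def spritz(text, wpm=300):
        """Flash each word of `text` in place at roughly `wpm` words per minute."""
        delay = 60.0 / wpm                   # seconds each word stays on screen
        for word in text.split():
            # "\r" rewrites the same line, so the reader's eyes never move.
            sys.stdout.write("\r" + word.center(20))
            sys.stdout.flush()
            time.sleep(delay)
        sys.stdout.write("\n")

    spritz("The words are displayed individually, one by one, in order.")

Even this toy version makes the trade-off visible: the “book” disappears entirely, and what remains is a timed stream of words.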

          The assertion that the copy does something to the original (Baudrillard 425) is true. In many cases the “copy”, or simulation, is superior to the original. The humanities are based on the study of humanity. History, philosophy, anthropology, and sociology all rely on a bio-centric understanding of humanity. A line has been drawn as to what is considered human: the “original”, the natural. This biological understanding of humanity fared well for centuries. Although there is much speculation among scholars as to what qualifies as a cyborg, the modern digital landscape has transformed most of the western world into cyborgs. The integration of technology onto and into our bodies is now more possible than ever. Baudrillard speculates: “Even today, there is a thriving nostalgia for the natural referent of the sign” (51). There is a sense of comfort in the “real”, following the historical assumption that for there to be real things there has to be a reliable, knowable system of production, and in a digital space this does not exist. In a digital age, I think it is necessary to re-assess how humanity itself is classified: to stretch the definition beyond the biological, and to recognize that the natural, or biological, is not always superior.

          The idea that doing something in a virtual setting or digital space is almost as though it never happened is another theme Baudrillard explores closely in The Gulf War Did Not Take Place. Digital environments as simulations do something to the original. In the simulation’s instantaneousness, multiplicity, accessibility, and artificiality, the original becomes sacred, and unknown, in an overwhelming sea of homogeneous simulations. Baudrillard does see the loss of the original as potentially freeing, but he also recognizes the effect that simulations have on the original. The Gulf War as people came to understand it through its simulation, or virtuality, on film is not the same as the original war. The simulation cannot be the thing it copies. It can replace the original in the sense that people only access an original through a copy, but it does not equate to the original in its totality. It might be crude to compare war and research, but the theoretical assertion is the same: online, virtual research is not the same as researching or reading in its original bodily, physical sense. However, for those who viewed the Gulf War through television, that simulation became the original, in the viewer’s inability to access the original. The experiences are different and the same. Online research is not the dictionary definition of research, but in one’s strict engagement with the simulation of texts in a digital or virtual space, that new simulated experience becomes the original. The result is that one may not go to the library unless the simulation is not available, at which point one tries to access its original.

          The idea that research can occur as a purely visceral, mental experience (simulation) fundamentally changes the definition of research. The simulatory nature of anything digital or technological fundamentally changes the definition of the thing it simulates. I think that in demonizing the simulation, we resist progress. Very broadly, resisting technology and the digital on the grounds that they are inferior to the physical or original means that, in many cases, progress or efficiency is delayed. Discrediting the power of virtual technology as a means to communicate because it does not carry the same nostalgia as face-to-face communication means that valuable virtual conversations, with Alex Gil for instance, would never have occurred. Similarly, professors requiring students to seek out an expected number of print resources can mean missing out on valuable virtual or digital research. I always find myself coming back to the concept of “big questions”. The topic of technophobia and resistance to the digital in some humanities spaces can be explored as a discussion of theory. Baudrillard’s work on simulacra and simulation has allowed me to explore the sometimes subordinate status of simulations and copies. My paper has focused mostly on the loss of the original and the ways in which this can be seen as emancipatory, especially when we begin to consider the implications the digital and the simulated have for how humanities research is conducted, and how the discipline itself is defined. These theories can be applied to a greater understanding of humanity. The merging of technology and humanity has led to massively complicated questions, not only about simulation and original in terms of research and scholarship, but about humanity in general. Where does machine begin and human end? Are cyborgs the new species? I don’t think I would have pushed myself to try to understand the boundaries of humanity and machinery in a post-human sense without this course. In this merging of technology into classrooms and bodies, it is clear that the definition of the original must be expanded to include its copies and simulations.

Works Cited

Baudrillard, Jean. “Symbolic Exchange and Death.” From Modernism to Postmodernism. Malden: Blackwell Publishing, 2003. 421-434. Print.

Baudrillard, Jean. Symbolic Exchange and Death. Theory, Culture & Society. Sage Publications Inc., 1993. Web. 2 Dec. 2014.

Baudrillard, Jean. The Gulf War Did Not Take Place. Bloomington, Indiana: Indiana University Press, 1991. Print.

Haraway, Donna. From Modernism to Postmodernism. Malden: Blackwell Publishing, 2003. 460-484. Print.

Horn, Eva. “Editor’s Introduction: ‘There Are No Media’.” (Abstract). Grey Room 29 (2007): 6-13. Web. 4 Dec. 2014.

Lenoir, Timothy. “Makeover: Writing the Body into the Posthuman Technoscape: Part One: Embracing the Posthuman.” (Excerpt). Configurations 10.2: 203–220. Web. 6 Dec. 2014.

Van Eenennaam, A. L., and A. E. Young. “Prevalence and Impacts of Genetically Engineered Feedstuffs on Livestock Populations.” American Society of Animal Science (2014): 1–61. Web. 7 Dec. 2014.



----  

The People’s Field: The Ethos of a Humanities-Centred Social Network

Posted: Feb 05, 2015 10:02;
Last Modified: Mar 04, 2015 05:03

---

Hello readers of Daniel Paul O’Donnell’s blog. My name is Megan and I am a former student of his, having completed (among others) his 2014 seminar on the Digital Humanities. The following is a paper I wrote for that class, which Dan has kindly offered to feature on his blog.

The inspiration for this essay comes from my experience as a musician, specifically a guitarist. It has always been possible — indeed, far more common, I would think — to learn to play outside of a classroom setting. But the Web has given us something spectacular: huge social networking websites aiming to encompass all aspects of playing guitar, whether learning, teaching, critiquing, or making music with others. The education is there, and the community too, similar to the post-secondary experience. If non-academic music education can thrive online, why not the humanities?

I’m sure many of you are humanities people, and so I’m also sure you’ve thought about the State of the Humanities — their financial viability, their usefulness, their place in academia and in general public life. This essay is not so much an argument as it is a reflection, an appeal to all of us humanists to broaden the venue and audience of what we do. It is an appeal to think bigger: not what the humanities should look like just in academia, but what they could look like in the wider world. Humanists hate clichés, but I think one rings true in this case: if we love what we do, we have to set it free.

The People’s Field: The Ethos of a Humanities-Centred Social Network

In a 2010 article for the Chronicle of Higher Education, Frank Donoghue attempts to finally answer why the academic importance of the humanities is seemingly in permanent dispute:

The shift in the material base of the university leaves the humanities entirely out in the cold. Corporations don’t earmark donations for the humanities because our research culture is both self-contained and absurd. Essentially, we give the copyrights of our scholarly articles and monographs to university presses, and then buy them back, or demand that our libraries buy them back, at exorbitant markups. And then no one reads them. The current tenure system obliges us all to be producers of those things, but there are no consumers.

The public simply does not need humanities research the way it needs scientific or medical research – incest in Hamlet or the meaning of Finnegans Wake are still great questions worth pursuing, but no one’s life hinges on the resolution of Hamlet’s fraught relationship with his mother; people will and do, however, die of cancer and diabetes, and James Joyce cannot fry our brains if climate change does it first. For a field whose very name suggests a focus on all humankind, the humanities’ products are remarkably individualistic in scope, the pet projects of bookworms. Yet this is not to say these products have no value, nor is the decline of the humanities as an academic field indicative of a culture that no longer cares for literature, history, languages, or philosophy. Donoghue notes that

Intelligent popular novels continue to be written; the nonfiction of humanists who defy disciplinary affiliation . . . will still make best-seller lists; and brilliant independent films . . . will occasionally capture large public audiences. The survival of the humanities in academe, however, is a different story. The humanities will have a home somewhere in 2110, but it won’t be in universities. We need at least to entertain the possibility that the humanities don’t need academic institutions to survive, but actually do quite well on their own.

People will always enjoy creating and consuming literature; it is the demand for literary (and other humanities-centered) criticism that is constantly being called into question in modern academia. But if not in universities, whither that criticism? The answer lies in one of the most ubiquitous – and perhaps most important – technological developments in recent history: social media. The aim of this paper is twofold: first, to prove that social media, specifically social networking websites, are a viable way to build a consumer base for literary criticism; second, to provide an outline of the features of a theoretical humanities-centred social network and how it would operate. For simplicity’s sake, my project will focus primarily on only one aspect of the humanities, namely literary criticism, and it will admittedly be North American-centric in its analysis of the state of the humanities and assumptions of available technology.

First let us take a more in-depth look at what is seemingly Wrong with the humanities. Little academic research has gone into this topic (though Stuart Hall formally explores the disconnect between the less-than-concrete goals of the humanities and their potential for informing social activism in his pre-Web 2.0 “Emergence of Cultural Studies and the Crisis of the Humanities”). However, the last five years have provided a plethora of popular articles devoted to parsing out this perennial problem. Two key themes endure: first, the typical argument that the humanities do not make employable graduates, thus turning the field into more of an economic burden than an aid (Sinclair); second, the more interesting idea of a disconnect between the public and humanist academia, causing the hoi polloi to distrust the humanities and therefore not value them. With regards to the first argument, Stefan Sinclair claims that “the attacks on the humanities are bolstered by the underlying assumption that in this model [the ‘knowledge-based’ economy] every department must rely solely on their own market revenues. Whether or not humanities departments would actually be viable in this model is up for debate, but commentators often assume this would not be the case.” David Lea attributes these assumptions to a shift from collegial to managerial principles in university governance; administrative and technology-centred spending has thus increased at the expense of cuts to the humanities (261). Sinclair obviously finds these assumptions and their resulting cuts unfair and unimaginative, and he remains somewhat justified in that no one seems to have expended much thought on how to make the humanities a more profitable academic field. Yet that point brings us right back to Donoghue: the humanities are inherently insular, and their societal effects are markedly indirect compared to the immediate benefits of the natural sciences, technology, and medicine.

This self-contained nature brings us to the second key explanation of the humanities’ decline. Surprisingly, much of the popular criticism involves not the economics-centred points above, but the argument that the humanities have become inaccessible to the wider population. Mark Bauerlein recounts the myriad points made during the 2011 symposium “The Future of the Humanities,” and summarizes the perceived problem as the “neglect or inability or lack of desire . . . [of humanists] to speak directly to the public in a public language” (Bauerlein). One can easily object to this argument: scientific papers are just as – if not more – incomprehensible to the average citizen. Academia is fundamentally esoteric. But the humanities differ greatly from the sciences in one key aspect, laid out by Steven Knapp:

An investment in their [art and literature’s] particularity and therefore in their history is what most deeply and importantly separates the objects and events studied by the humanities from the phenomena studied by the natural and even the social sciences. In science, what matters is not the irreplaceable particularity, the irreplaceable origin, of the phenomenon in question but instead its generalizability and therefore precisely the replaceability of its particular history. (Knapp)

In other words, the sciences are necessarily future-oriented; they are always looking for answers to improve upon current human knowledge, to make generalizations such as “climate change is caused by greenhouse gas emissions” and thus replace the old understandings. The humanities do not operate in this way. They are concerned with and motivated by “the pleasure human beings take in the particularity of lived experience . . . the pleasure human beings take in preserving and enjoying particular things” (Knapp). The subject matter of the humanities therefore belongs to the public in a way that of the sciences does not: the vast majority of us cannot learn to explain the physical world in Newton’s laws without at least some instruction, but most of us can read Hamlet and get something out of it, regardless of formal instruction in literary criticism. Knapp elaborates: “What matters to the public is Shakespeare, not the logic of theatrical representation. What matters is the story of America, not the ideological structure of American essentialism” (Knapp). North American academe’s current love affair with (in his opinion) deconstruction, Marxism, feminism, and post-colonialism has alienated the public: “Humanities professors disrespected great works, so naturally the public turned around and disrespected them” (Bauerlein). Of course, we can hardly blame humanities scholars for examining literature through these ideological lenses; to expect professors never to question the messages of canonical texts and dominant cultural narratives is tyrannical. But we must respect the fact that art and literature belong to the people, and that traditional readings therefore remain valid as well.

Yet the pleasure-motivated, “particularity”-centred nature of literary criticism is also a point against the humanities in academia. Laurie Fendrich claims “the only way to justify studying the humanities is to abandon modern utilitarian arguments in favor of much older arguments about the end, or purpose of man. Yet Darwin, in firmly swatting down the idea that man has an end, makes returning Aristotle . . . difficult for most modern thinkers” (Fendrich). There is a kind of nobility in studying literature, but as reasoned above, the humanities simply do not provide the kind of progress-fuelling information that the STEM fields do. Fendrich further recalls the highly elite nature of the early university, where well-to-do young men would go to learn the classics, philosophy, and languages, becoming “knowledgeable” but ultimately “useless” – a place where they could spend their time before inheriting their prospective family wealth (Fendrich). Now that universities have opened up to the “common” people, the study of literature has opened as well; however, this democratization of education means that post-secondary institutions must prepare students whose various social classes necessitate they will spend their lives in the workforce, not luxury drawing rooms.

None of these commentators propose any real solutions to the humanities problem; in fact, they all admit that the purpose of the humanities will always be called into question in utilitarian modern (i.e. Western) society. So what can we do with them? The answer is difficult – perhaps ultimately irresolvable – and this paper’s scope can only propose a partial remedy. The humanities are so deeply entrenched in academia that it would be unreasonable to simply get rid of them altogether, at least in the foreseeable future. We must perhaps admit that studying literature in university is a privilege for those who need not worry about work once they graduate, and those of us from the middle and working classes who take that route must deal with the consequences. But by examining the problems, we can at least parse out a partial remedy: the humanities are the people’s pleasure, and we must give them back.

The answer lies in social media. David Lea, in addition to the shift from collegial to managerial values, blames the decline of the humanities in part on online learning, despite what he admits might be its “obvious financial advantages” (261). Providing the humanities with a physical space is expensive; moving them online would cut costs to university departments, though of course there remains the desire to teach the humanities in actual classrooms. Lea is therefore right to worry about the threat online learning poses to the state of humanities education. But his observation also reveals a willingness among the public to transfer the education process to an online environment, and this willingness could be the saving grace of the humanities, the opportunity to bring them back to the people. My outline of a theoretical humanities-centred social network provides the crux of my argument.

First of all, social media can be split into five or six broad types. For our purposes, we will use Tim Grahl’s categories: 1. Social networking sites, where users create profiles to connect with others (e.g. Facebook); 2. Bookmarking sites, where users “save, organize, and manage links” (e.g. StumbleUpon); 3. Social news, where users share links with others and rate them (e.g. Reddit); 4. Media sharing, where users upload their own content, often accompanied by “additional social features, such as profiles, commenting, etc.” (e.g. YouTube); 5. Microblogging, “services that focus on short updates that are pushed out to anyone subscribed to receive the updates” (e.g. Twitter); and finally, 6. Blog comments and forums. Grahl further notes that many social media platforms incorporate features from multiple categories (Grahl). A humanities social network would primarily combine elements from categories 1 and 4, with elements of 2, 3, and 6.

The raison d’être of a humanities social network would be to provide students, hobbyists, and professional academics with a space to upload and share their writing on various works of literature. Much like the media networking sites YouTube or DeviantArt, users would create a profile in which they would list their literary and critical interests: which authors and styles they admire, which critical theories they like to employ. All content they upload would be labelled with tags indicating the topic and types of criticism used; these tags would make the content searchable by other users, who could look for writing on a particular topic, read other people’s work, and provide feedback or invite the authors to be “friends”, in a similar vein to Facebook. Users could then follow their friends’ content, discover the writing of their friends’ friends, and in turn build a community of people whose primary connection to each other is their passion for literature. The perceived hierarchy between professor and student, or academic and layman, would not exist, encouraging a perception of literary criticism as a hobby accessible to anyone with a favourite book and ideas they can support. Another important feature reinforcing this bond would be discussion forums, allowing users to have meaningful conversations about works of literature and critical theories outside of the context of a particular paper’s comment page. Again, these discussion forums would blur hierarchical lines and make literary criticism accessible.
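To make the outline above a little more concrete, here is a minimal, hypothetical sketch of the data such a site would need to store. None of this describes an existing system; the class and function names are my own illustration of profiles with interests, uploaded papers carrying topic tags, a “friends” relation, and a simple tag search that would support the discovery and feedback loop just described.

    # A hypothetical, minimal data model for the humanities social network
    # sketched above: profiles, tagged papers, friends, and tag-based search.
    from dataclasses import dataclass, field
    from typing import List, Set

    @dataclass
    class Paper:
        title: str
        body: str
        tags: Set[str]                    # e.g. {"hamlet", "feminism"}
        comments: List[str] = field(default_factory=list)

    @dataclass
    class User:
        name: str
        interests: Set[str]               # favourite authors, styles, theories
        papers: List[Paper] = field(default_factory=list)
        friends: List["User"] = field(default_factory=list)

    def search_by_tag(users, tag):
        """Return every uploaded paper labelled with the given topic tag."""
        return [p for u in users for p in u.papers if tag in p.tags]

    # Example: one user finds, comments on, and befriends the author of a Hamlet paper.
    alice = User("Alice", {"Shakespeare", "feminism"})
    bob = User("Bob", {"Joyce", "post-colonialism"})
    bob.papers.append(Paper("Gertrude Reconsidered", "...", {"hamlet", "feminism"}))
    for paper in search_by_tag([alice, bob], "hamlet"):
        paper.comments.append("Alice: I enjoyed your reading of Act III.")
        alice.friends.append(bob)

The point of the sketch is simply that nothing in the proposal requires exotic technology: tags, profiles, and a friends list are enough to let criticism circulate outside any institutional hierarchy.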

The success of social networks-with-a-purpose such as LinkedIn and Academia.edu sets a precedent for the branching out of a humanities social network. The website could easily split into a free version and a premium version where established academics wishing to publish professionally could do so, creating a database similar to JSTOR or Project Muse. Like those two databases, postsecondary institutions could pay for the premium version to give their students access to these articles, and students would still be able to share their opinions on an article with other users. Integrating this social network into university online services could reduce the cost of paying for physical copies of journals; furthermore, it would serve to get students interacting with their peers both inside and outside their institution, as well as with people not enrolled in academia. Finally, incorporating elements of social media categories 2 and 3, students could save papers (from the academic premium version only, as saving papers from the free version would invite too many complications regarding plagiarism) and rate them for students writing on similar topics. The noble intentions of the academic humanities and the pleasure of the people would both be served.

In the face of a changing academic landscape, the humanities are increasingly perceived as too costly for institutions and too pretentious for everyday people. A social network focused on the humanities would help remedy that, fostering the perception that literary criticism is accessible, while providing a database with the potential to cut costs for libraries.

Works Cited

Bauerlein, Mark. “Oh the Humanities!” Weekly Standard. 16 May 2011.

Donoghue, Frank. “Can the Humanities Survive the 21st Century?” Chronicle of Higher Education. 05 Sept. 2010.

Fendrich, Laurie. “The Humanities Have No Purpose.” Chronicle of Higher Education. 20 Mar 2009.

Grahl, Tim. “The 6 Types of Social Media.” Out:think. Out:think Group.

Hall, Stuart. “The Emergence of Cultural Studies and the Crisis of the Humanities.” Humanities as Social Technology 53 (1990): 11-23.

Knapp, Steven. “The Enduring Dilemma of the Humanities.” Phi Beta Kappa Society. Phi Beta Kappa Society, 29 Mar 2011.

Lea, David. “The Future of the Humanities in Today’s Financial Markets.” Educational Theory. 64.3 (2014): 261-83.

Sinclair, Stefan. “Confronting the Criticisms: A Survey of Attacks on the Humanities.” 4Humanities.org. The Digital Humanities Community, 09 Oct 2012.

----  

Academic Suicide

Posted: Mar 12, 2014 16:03;
Last Modified: Mar 04, 2015 06:03

---

The so-called “college paper” has been a debated topic practically since its inception. A recent class statement brought the debate to the forefront of my mind. Professor O’Donnell stated, in a tone of bemusement, that his students tend to perform better on the blog assignments than on their actual papers. It does seem odd that a discrepancy exists between two writing exercises. However, the answer formed almost immediately in my thoughts and has expanded through the discussion of prescriptive versus descriptive rules. The reason students are so terrible at writing the “college paper” boils down to the differences between prescriptive rules and descriptive rules. With that, I commit myself to academic suicide by breaking the general guidelines and prescriptive rules of academic writing, adhering only to grammatical prescriptive rules and a more formal dialect, in order to explain why students are incapable of writing the traditional North American college paper.

In terms of grammar, students are already limited in the way they can communicate their ideas in a paper by having to adopt a more prescriptive-based, formal dialect. I am NOT arguing that students can get by in the world, and more specifically in their university career, without an academic and more formal dialect. Just like the young student with only a formal, prescriptive dialect who is beaten up on the schoolyard for not having a more descriptive-based dialect that allows him or her to fit in (Wallace 51), the university student will be figuratively beaten up in the classroom if they do not possess a formal, more prescriptive-based dialect. It is necessary for university students (and anyone who wants to be successful in the English-speaking world) to adopt a second (or third, or fourth) dialect that allows them to fit into their surroundings. There are various situations in which prescriptive rules should be relied on more heavily than descriptive rules, and vice versa.

Professors, however, ignorant of the fact that students are already restricted by a dialect that may not be second nature, further impede students’ ability to communicate their ideas effectively by creating their own set of stylistic prescriptive rules. In the Humanities (and the Sciences) it is a major faux pas to use first-person pronouns; the only time ‘I’ may be acceptable in a paper is when it is used to distinguish the student’s argument from a secondary source. ‘Helpful’ topic ideas only serve as an agent of restriction, tightening the figurative noose around students’ ideas. There are few things more disheartening in the post-secondary experience than completing an essay that has veered so far from the original topic that it almost seems pointless to hand it in. Whether or not it is well written, whether or not its ideas have been communicated appropriately and interestingly in an academic dialogue, does not matter to professors who set guidelines. The paper that succeeds in communicating ideas may receive a lower grade if it does not meet the guidelines. Professors need to realize that the more prescriptive rules they place on their papers, the worse students’ papers will be. The more rules, the more confining the box that students need to fit their ideas into. This is why students perform so much better on blogs. A blog has no rules aside from one: it must be “within shouting distance of the course” (O’Donnell). Students are therefore free to express themselves and communicate the ideas they find interesting in compelling and captivating ways. After reading several blogs, despite the lack of rules, it becomes evident that there is a second, unwritten rule that comes naturally to almost all university students: they use a more prescriptive-based, formal dialect than their typical descriptive dialect would permit.

Another guideline, or prescriptive rule, set out by professors is the limitation of secondary sources to scholarly articles. Although it is understandable that the use of websites like Wikipedia should be kept to a minimum, this is another guideline that prevents the development of strong, relevant ideas that support the argument. With social media permeating our everyday lives, professors need to accept changing times. Why should a student be restricted from using the blog posts of highly educated people in respectable positions? Why does a professor’s journal article garner more merit than a post on their blog site? It shouldn’t; and even one of the ‘scholarly articles’ cited for this paper (Steven Pinker’s “Grammar Puss”) can be found on a blog. Clearly, this prescriptive rule about how research material for papers ought to be gathered is about as outdated as the grammatical prescriptive rules that, as Steven Pinker points out, are based on Latin and eighteenth-century fads (20).

The problem, however, is that prescriptive rules are extremely difficult to abolish. They have become so ingrained in our minds that we don’t challenge them. This fear of driving change is also perpetuated by “the worry that readers will think [the author] is ignorant of the rules” (Pinker 20). As a result we limit our thoughts and ideas and force them into tiny, prescriptive boxes. We avoid engaging in a dangerous game of Russian roulette with our grades by playing it safe and coughing up redundant, highly repetitive, excruciatingly painful to read, and, bluntly put, shit. The failure of the college paper is not due to the students, as Rebecca Schuman so strongly states in her blog post “The End of the College Essay”, but to the professors who are not willing to wake up to the 21st century and rethink their own set of restrictive, prescriptive rules.

I personally used to love writing. I enjoyed it. I didn’t even mind writing essays, and I wrote good ones. Over 50% of my class failed the first essay in my English 1900 course. I received a grade over 90% and embarrassingly had to tell the girl beside me, who had received an abysmal 4%, that I had done very well, and leave it at that. Somewhere along the way, however, something changed. Professors implemented more guidelines, and maybe even my own standards rose. Whatever the cause, the outcome is the same: I no longer feel capable of writing essays. How can I, when students are repeatedly informed that they do not know how to write essays and are incapable of producing good ones? I am now so filled with anxiety over whether my essay will be long enough (or too long), whether it will sound academic enough, and whether it will actually relate to the topic, that I have become immobilized. I do half-assed, last-minute essay writing to avoid the stress of completing something that I may be simultaneously proud and doubtful of because it is well written but does not fit a professor’s prescriptive rules. I would rather accept a lower grade on something I slapped together the night before it was due than have hard work torn apart by a “SNOOT” professor (Wallace, “Tense Present”).

The reason students are incapable of writing college papers is not some kind of innate inability, nor is it that students do not possess an academic dialect. The students are not to blame for their poor attempts at the “college paper”; the professors, who need to realize the defeating, restricting effects of their prescriptive rules, are. Just like the machine that is capable of duplicating human language in Pinker’s “Grammar Puss”, students are given a number of prescriptive rules to follow, and, just like the machine, we sit there, immobilized, unable to communicate our ideas (19). Prescriptive rules should be relied on more heavily when it comes to the college paper and academic writing, but without descriptive rules students, like the machine, don’t know how to say what they want to say. Perhaps if allowed to break the rules, students could actually write the infamous “college paper”.

Works Consulted

O’Donnell, Dan. English 2810. 8 Jan. 2014.
Pinker, Steven. “Grammar Puss” (Cover Story). New Republic 210.5 (1994): 19-26. Business Source Complete. Web. 14 Jan. 2014.
Schuman, Rebecca. “The End of the College Essay.” Web log post. Slate. 13 Dec. 2013. Web. 16 Jan. 2014.
Wallace, David Foster. “Tense Present: Democracy, English, and the Wars Over Usage.” Harper’s Magazine Apr. 2001: 39-58. ProQuest. Web. 14 Jan. 2014.

----  
