tsunamifury | 12 years ago

I spend my days trying to structure books into relational databases so that they can be mapped onto more interesting non-linear behaviors (UX and UI).
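Roughly the kind of thing I mean, as a toy sketch -- the table names and granularity here are purely illustrative, not an actual production schema:

    # Toy sketch: break a book's linear text into relational pieces so an
    # interface can traverse it non-linearly (e.g. "show me every passage
    # that mentions X"). Purely illustrative.
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.executescript("""
        CREATE TABLE book    (id INTEGER PRIMARY KEY, title TEXT, author TEXT);
        CREATE TABLE section (id INTEGER PRIMARY KEY, book_id INTEGER REFERENCES book(id),
                              position INTEGER, heading TEXT);
        CREATE TABLE passage (id INTEGER PRIMARY KEY, section_id INTEGER REFERENCES section(id),
                              position INTEGER, text TEXT);
        CREATE TABLE entity  (id INTEGER PRIMARY KEY, name TEXT);  -- person, place, concept
        CREATE TABLE mention (passage_id INTEGER REFERENCES passage(id),
                              entity_id  INTEGER REFERENCES entity(id));
    """)

    # A non-linear "view" of the book: every passage mentioning a given entity,
    # independent of the order a reader would normally encounter them in.
    rows = conn.execute("""
        SELECT s.heading, p.position, p.text
        FROM mention m
        JOIN passage p ON p.id = m.passage_id
        JOIN section s ON s.id = p.section_id
        JOIN entity  e ON e.id = m.entity_id
        WHERE e.name = ?
        ORDER BY s.position, p.position
    """, ("Ishmael",)).fetchall()

The point isn't the particular SQL; it's that once the text is decomposed this way, the interface -- not the binding -- gets to decide the reading order.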
Most of the humanities have been relegated to linear data storage (books). Unlocking that content with new interfaces is the most obvious way for the humanities to become genuinely digital.

I don't really see it as a very heavy question -- there is still a need for writers, painters, photographers, etc. -- there are just more spots for people to design the frames for that content in new and exciting ways. The professors won't get a handle on this until it's already defined and part of the past, because the humanities aren't really research into the future so much as a study of the past.
robin2 | 12 years ago

I can't quite make up my mind whether it's irony, or the other half of the same phenomenon, that the humanities are feeling pressure to become more business-like whilst businesses are being drawn towards methods from the humanities.

An example of the latter would be the work of ReD Associates. (See, e.g., http://www.theatlantic.com/magazine/archive/2013/03/anthropo..., http://rdn32.com/2014/04/03/the-moment-of-clarity/)
ppod | 12 years ago

Isn't it sadly typical of the humanities to view data that confirms a hypothesis as redundant?

bakhy | 12 years ago

yes, of course. because it is completely atypical for the humanities to have data that either confirms or disproves a hypothesis. that's more of a "hard science" thing ;) the humanities simply cannot operate like that. no data can prove why Shakespeare was good.
i think the author was simply pointing out that no significant new discovery was, or perhaps even could be, made that way. he was just attacking the hype -- the story of new breakthroughs right around the corner, of the entire field of the humanities being shaken to its core... not by this.

and also, this Ngram discovery is a weak one. to reach a proper scientific standard, they should not restrict it to checking a couple of examples with known outcomes; it really should produce something new. otherwise, isn't it in danger of simply being a manifestation of confirmation bias?
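for what it's worth, here's roughly what i'd find more convincing - a toy sketch in python, where "ngram_counts.csv" and its columns are entirely hypothetical, and the point is committing to a prediction before peeking at the data:

    # toy sketch: check a directional prediction against corpus frequencies,
    # rather than re-confirming examples whose outcome we already know.
    # "ngram_counts.csv" is a hypothetical file with one row per word per year:
    #   year, word, count, total_tokens
    import csv

    def rel_freq_by_year(path, word):
        # returns {year: word_count / total_tokens_that_year}
        freq = {}
        with open(path, newline="") as f:
            for row in csv.DictReader(f):
                if row["word"] == word:
                    freq[int(row["year"])] = int(row["count"]) / int(row["total_tokens"])
        return dict(sorted(freq.items()))

    # state the hypothesis up front (e.g. "the relative frequency of 'telegraph'
    # declines after 1920"), then evaluate it on words and periods that were
    # *not* used to come up with the hypothesis in the first place.
    freqs = rel_freq_by_year("ngram_counts.csv", "telegraph")
    before = [f for y, f in freqs.items() if y < 1920]
    after = [f for y, f in freqs.items() if y >= 1920]
    if before and after:
        print("mean before 1920:", sum(before) / len(before))
        print("mean after 1920: ", sum(after) / len(after))

nothing fancy - the discipline is in stating the hypothesis first and testing it on cases that didn't generate it.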
edit - ok, i may be using the term "humanities" too broadly :) i guess some fields can get a lot from data, but some really cannot.
alialkhatib | 12 years ago

It's interesting reading this from a background in Anthropology, having experienced some of the challenges the article describes. In particular, the argument that people need to know how to code (Ramsay), and the backlash it evoked, are things I've run into on numerous occasions. I'm interested in digital cultures and quantitative methods, but Anthropology largely abandoned quantitative work in the 70s, and now it seems like anyone who mentions statistics and quantitative methods non-sarcastically gets shunted into other fields (e.g. Sociology, Informatics, CS).
I didn't wither away in Anthro, but now I'm leaving the field because, essentially, my methodology and skill set align me more closely with a CS program than with an Anthro program. That my research interests are in line with cultural Anthropology is secondary; methods determined my field, which, when you think about it, is kind of bizarre.

Dan Jurafsky (at Stanford) and I had a brief conversation about this, and he seemed to have had a similar experience. He described himself as a linguist and seemed to feel that that's where he is intellectually, but the field of Linguistics sees him as more of a computer scientist because of his tools, and so he's got an appointment in CS.

I liken Anthropology in digital cultures to being an anthropologist of some South American culture: there are languages, customs, and mindsets that you need to learn - and indeed be fluent in - before you can even claim to be an expert. Anything less and you're seriously limiting yourself. There will be people who study digital cultures who can't code, who eschew quantitative work (including "Big Data" et al.), and who generally stick to orthodox convention. Their expertise will someday be limited by not being able to "speak the language", both literally and figuratively (i.e. having an emic understanding of the culture they're studying).
I don't think "someday" is today. Maybe that's because my undergrad advisor is an Anthropologist with no programming background, yet unequivocally one of the leading researchers in digital cultures. I think there's still enough basic stuff that we need to figure out to orient ourselves with digital cultures that people have to be able to code and bring domain-specialized skills to the field. Or maybe we're so far along that getting entrenched in digital culture has become abstracted from being savvy to the point of being a programmer.
The article ends on the prescription to critique technology in the spirit of "intellectual responsibility", but I get the feeling that it doesn't see the "identity crisis" described earlier as the very critique the article calls for. Hacking away at each other, the intellectuals vying for control of the digital humanities are ultimately advocating for what the field ought to be. That is what we (social scientists and humanists) badly need.

If the digital humanities can't coalesce into something meaningful (and perhaps soon), people like me will ultimately go into CS and related fields, where we'll learn the tools we need and become fluent in technical skills rather than honing qualitative skills and studying sociocultural theory. If that continues to happen, the digital humanities could lose an entire generation and ultimately fail to keep Anthropology and the Humanities modern and relevant. I don't know what will happen to these disciplines if they get left behind.
/rant.