delazeur | 8 years ago
My experience with academic archaeologists has been that they are pretty myopic on this. They like to dismiss past archaeologists as treasure hunters, but they are also eager to get their hands on all the data they can without much regard for the locals or for preservation. They usually justify this by pointing out that they aren't looking for treasure, in the traditional sense.
mncharity | 8 years ago
I wonder if anyone has done a survey of views? There's some consensus around "do no harm": exploit non-destructive techniques and reserve excavation for rescue archaeology. But there seems to be a variety of other standards of care for site conservation and consumption.
Not my field, but one thing I've anecdotally seen underappreciated is the rate of change in molecular biology, and thus eventually in molecular archaeology (aka bioarchaeology outside the US). Over a mere decade or two, "can't imagine doing X" can become "doing X costs years and millions", then "a new trainee can casually do X in an afternoon", and even "X is a field-deployable box".

The potential impact is real. For example, it turns out tooth calculus preserves DNA, so from millennia-old teeth one might sequence the owner, their mouth microbiome (with health information), and the species they've been eating. Grabbing a few soil samples might be OK if you're thinking pollen, but rather less so if a single tiny rat incisor might tell you so much. And proteomics is just getting started. Beyond biology, there's improving imaging, inorganic materials analysis, bulk data management, robotics, and so on.

Aggressive excavation with "DSLR, eyeball, sample bagging, brush, sift, and float", largely unchanged for decades, seems rather more problematic if one looks ahead?