otakucode | 6 years ago
One thing we could do, which would not solve the problem but might illuminate a better way forward, would be to work on communications technology. Not simply technology for data transfer, but for actual transfer of information and understanding. Technologies and systems which facilitate good behavior without facilitating bad behavior nearly as well. By that I mean things such as systems which present large volumes of nuanced information in ways accessible to more people, and systems which enable constructing, sharing, and refining complex arguments.
As an increasingly large portion of humanity conducts itself online, we must keep in mind that much of human behavior is not admirable. The solutions to that behavior are not technological, except in the most dystopian and (philosophically, to me at least) disgusting scenarios. There has never been, nor will there ever be, a happy, prosperous police state. Autonomy and free expression are not luxuries; they are necessary for human health. We must also remain vigilant that the systems we implement are flexible and permit society to change both within and through them.

Consider a thought experiment I came up with. Imagine that tomorrow morning 90% of the population of planet Earth awoke to the realization that agitation over nudity is ludicrous. Would our existing systems be able to accommodate this change, or would they actively thwart every attempt by individuals to live their lives according to this newly adopted principle? Would it result in a global relaxation of pointless anxieties, or in increased anxiety as people felt isolated in their realization, 'judged' by technology that would filter them, block them, and reject them at every stage?
At no point in history has any society, so far as we know, hit upon "The Correct Ideas" which represent unvarnished truth, eternal and unchanging. And we should take care that our current social ideas are not unwittingly treated as such, ossifying human culture.
In his book "The New Digital Age," Eric Schmidt speaks about wishing to play a very active role in exactly this kind of cultural ossification, expressing an extremely elitist view: because Google is rich, its people are Better and should therefore take steps to actively guide and mold society in the ways Eric Schmidt believes are best. Those just happen to be the social values of the late 1990s, when Google was introduced, the values which facilitated its wealth-building. That is, to my mind, a dangerous game. History suggests that attempts at "social engineering" which do not rely completely upon broad social consensus, with society reaching its own conclusions and enforcing its own ideals, tend to backfire in spectacularly catastrophic and inevitably violent ways.