In 1997, Steven Johnson wrote Interface Culture, an insightful book in which he points to the rise of media like Beavis and Butt-head, Mystery Science Theater and Talk Soup, shows that provide little if any original content. Instead, they are what he calls "parasitic media": filters for an increasingly digitized world of information, helping to explain the behemoth machine of information that drives society. These forms, he argues, are similar in function to the Victorian novel, which helped the populace make sense of industrialization - a shift that must have been overwhelming, a machine that drove society but was too large to make immediate sense to the human mind. In a sense, this has always been the function of myth: to explain and filter the machines (loosely defined) of nature and society in terms more understandable to the human mind.
Since 1997, we have entered a new era of information filtering, in which terms such as "folksonomy" and "wiki" suggest that the effect of mass human filtering is greater than the sum of its parts. Mathematicians and complexity theorists seem to suggest that it is the structure of these systems that does the causal work in this sort of filtering - what seems to me a highly empirical/structuralist myth. Humanists seem to suggest that it is the intentions of the actors in these systems that make these emergent interactions effective - the self-corrective mechanisms arising out of the statistically significant goodness of the majority of Wikipedia's writers.
These are, of course, myths we're devising to explain a far more complex system than we can understand at the moment. I wonder what myths will supersede them?
Thursday, September 07, 2006