Media Theory for the 21st Century

March 3, 2008

Back to Narrative/Database for just a second…

Filed under: Discussion — nickroth @ 7:30 pm

This post is both late and not a genuine official post.  I was just thinking back to our discussion of narrative and database in the novel, specifically in reference to the possibility of novels that privilege or favor database over narrative, and Nabokov’s Pale Fire came to mind.  It seems like one could take the poem as a database and the commentary maybe as a set of algorithms.  I don’t have any definite ah-ha! points to make about it, so I’m just wondering if anyone has any productive thoughts about it – after all, it’s insistently “A Novel” according to the cover of every edition I’ve ever seen, but pretty much everyone who’s ever read it has encountered serious difficulty with mapping its narrative(s).
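One crude way to make the analogy concrete (my own hypothetical sketch, not anything Nabokov or Manovich propose): treat the poem’s lines as a database of records, and each commentary entry as a little procedure that selects records and spins them into narrative.

```python
# Hypothetical sketch: the poem as a database of line records,
# commentary entries as "algorithms" that query and narrativize them.

# The database: line number -> text (the poem's actual opening couplet).
poem = {
    1: "I was the shadow of the waxwing slain",
    2: "By the false azure in the windowpane",
}

def commentary_on(line_number, poem_db):
    """A commentary entry: select a record, embed it in a narrative frame."""
    line = poem_db[line_number]
    return f"Line {line_number} ('{line}...') recalls, of course, Zembla."

narrative = commentary_on(1, poem)
```

The point of the toy is only that the “novel” emerges from the traversal order Kinbote imposes, not from the records themselves.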


February 28, 2008


Filed under: Discussion — sergiomf @ 10:14 am

[SpaceFighter banner]

Last week I concluded my presentation on Narrative and Databases with SpaceFighter, the latest software tool devised by MVRDV, expected to be released and publicly accessible online in the coming months. In this post, I would like to elaborate on this topic, namely by framing the theory and attempting to articulate it better with the topics we have discussed over the past few weeks.

In its opening statement, Winy Maas, the director of MVRDV involved in this project, argues for an “inevitable and total surrender” to a process-oriented approach, which he hopes will lead to a city that can reformulate itself, a city that is conscious of its gained knowledge (a concept already identified in previous publications from this office, namely KM3 – Excursions on Capacity).

I would argue that this software has at its very core the notion that we must readjust the way we see urban planning and the city, toward a new model that reflects the changes in our contemporary society as well as in the complex urban synergies that planners and architects have only now started to grasp in detail. This assertion seems to correspond with the fundamental idea in the Schivelbusch piece (which we read a few weeks back), where he argued that a change of technology, in his case the shift in transportation systems from coach to railroad, should be accompanied by an adaptation of our “traditional perception apparatus” to the new condition. Thus SpaceFighter aims at understanding “the size and complexity of the urban reality” by trying to develop new methods beyond the exhausted models of scenario creation (which have dominated urban analysis for quite some time now), more precisely original projective methods. In practical terms, this would mean a shift from the common and exclusive tools of mapping and diagramming to the innovative and inclusive tools of gaming. Gaming is argued to be better suited because, in contrast to the limited variation of scenarios, the interactive model can generate outcomes previously unimaginable: it absorbs new knowledge from the agents playing the game, but also from the constant updating of the several databanks to which it is connected.

In this regard, SpaceFighter expects to gain new insights into the complexity of urban systems through the combination of different datasets, stimulating the planning aspects of all (possible) databanks, “encouraging them to move from static to progressive data,” and therefore producing more data (quantitatively, as a result of new technologies) but also better data (since it becomes available and accessible to a larger audience). I would argue this specific process of data-crossing to be more productive, and less scary, than the one described by Professor Hayles when explaining the future ubiquity of RFID chips and their possible uses in personal data-mining. I also believe we have already touched on issues relevant to this point: in our discussion of Google Earth (after Tim’s presentation) as a tool that can be used professionally by a small group but also by a larger audience as an entertainment device, and in Professor Hayles’s comments on the close reading of Fuller, where he writes about how the systems of work and fun have become intermeshed and are no longer restricted to their original use.
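MVRDV’s actual implementation is not public, but the contrast between scenario creation and the interactive model can be sketched roughly like this (all names and rules here are invented for illustration, not drawn from SpaceFighter itself):

```python
# Hypothetical sketch of the contrast above: a fixed scenario study
# enumerates a closed set of outcomes in advance, while an interactive
# model lets player moves and databank updates feed back into the state.

def fixed_scenarios(initial_density):
    """Scenario planning: a small, closed set of precomputed outcomes."""
    return [initial_density * factor for factor in (0.9, 1.0, 1.1)]

def interactive_model(initial_density, player_moves, databank_updates):
    """Gaming: each round absorbs a player's move and freshly updated data."""
    density = initial_density
    history = [density]
    for move, update in zip(player_moves, databank_updates):
        # Feedback loop, not a fixed branch: the next state depends on
        # what the agents did and on what the databanks now report.
        density = density * (1 + move) + update
        history.append(density)
    return history

scenarios = fixed_scenarios(100.0)   # always the same three outcomes
trajectory = interactive_model(100.0, player_moves=[0.05, -0.02],
                               databank_updates=[1.0, 2.0])
```

The fixed list never changes; the trajectory is different for every sequence of moves and updates, which is the sense in which the game can generate “outcomes previously unimaginable.”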

I would like to point out something that seems to be implicit, and undisputed, in the whole SpaceFighter project: that in current society the pervasiveness of digital technology implies that everything (or at least every complex urban system) is in some way or another captured in databases, which is why I thought this endeavor relevant in the context of last week’s discussion of databases and narratives.

Finally, since the theoretical implications of SpaceFighter are not exhausted by the (limited) approach I used in this post, and so that you can have a broader idea of what this project is based upon, here are other recurrent concepts in the theory of SpaceFighter which I have not addressed but believe to be worth noting:

Entropy, or how regions are comparable to surviving systems where energy losses, accumulations and uses are observable

Evolution, in the Darwinian sense, as a parallel is drawn between regional survival and the development of species, namely the perfecting of ratio-based calculations that evolution has incorporated into living entities

Complexity, specifically the idea that contemporary life is based on complexity, which can be approached by simplifying and understanding the elements that compose it on different levels

Game Theory, or how games can be used to gain insight into complex systems. In SpaceFighter, concepts such as player, strategy, payoff, complexity and predictability are applied to architecture and urban planning

Multiple Scenarios, which allow planners and decision makers to realize that there is no one absolute truth, but several different possibilities or scenarios


PS: If you are interested in SpaceFighter, you can download a more comprehensive presentation I prepared (based on the publication) as PDF slides:

SpaceFighter Presentation Long

February 24, 2008

seeing the future in mouse spinal cords

Filed under: Discussion,Readings — johnbcarpenter @ 2:19 am

reading folsom and professor hayles’ comments on overwhelming quantities of information in database systems (“the tempestuous relationship of narrative and data”), i immediately thought of scott fraser (a professor at caltech and director of the brain and biological imaging centers there) and his regular comment that “scientists today collect more data than humans can perceive.”

while technological developments are making it possible to generate and store large datasets of information, researchers are also looking for new ways to interpret them so that they can be understood on human terms. an inspiring example of such a narrative is the painting-motivated diffusion tensor representation technique developed in professor fraser’s lab in 1998.

i briefly mentioned this in class a couple of weeks ago, but i thought i’d post a link to the full paper, “visualizing diffusion tensor images of the mouse spinal cord” (laidlaw et al.). as outlined in the paper, the nine-dimensional mri diffusion tensor data is typically shown like this…

but using the laidlaw technique, developed in collaboration with the caltech conceptual artist davidkremers, the group created visualizations that looked like this…

“Our second method applies concepts from oil painting to display diffusion tensor images. We used multiple layers of brush strokes to represent the tensor image and the associated anatomical scalar image. The brush strokes reflect the geometric nature of values derived from the tensors and of the relationships among the values. Also, the use of underpainting and saturated complementary colors evokes a sense of depth. Together, these painting concepts help create a visual representation for the data that encodes all of the data in a manner that allows us to explore the data for a more holistic understanding.”

diffusion data = visualization
anatomical image = underpainting lightness
voxel size = checkerboard spacing
ratio of largest to smallest eigenvalue = stroke length/width ratio and transparency
principal direction (1) = stroke direction
principal direction (2) = stroke red saturation
magnitude of diffusion rate = stroke texture frequency

this visualization presents the information in an intuitive way that doesn’t require years of training to understand some of the basic features of the dataset. for instance, deterioration of the spinal cord on the right (patchiness) is obvious compared to the healthy organism on the left.

the brilliance (in my opinion) of this work is that it condenses a nine-dimensional data set into a single 2D image, and in doing so allows the viewer to intuitively “look into the future”… the organism on the right has no physical symptoms of the disease, but based on the visualization, it’s possible to predict where and when it will develop spinal cord damage.

February 19, 2008

Read / Write / Access / Error: Query on human/machine memory

Filed under: Discussion — P.J. @ 7:23 pm

The debate that arises as early as Plato’s argument between Theuth and Thamus – which Renee brought up in an earlier post – has long fascinated me for its definition of memory as that which must instantiate in a consciousness located within the human body. Because I tend toward a model of memory that includes the extended “unconscious” information humans store in other media for retrieval, it is difficult for me to parse that clear defining line between biological and technical information resources, a position not unlike Andy Clark’s in Natural Born Cyborgs. At the same time, the proliferation of our externalized information resources (read that “databases”) demands new interrogations, not just because of the implications for the New Media work we’re discussing this week, but because of the profound political implications that inhere in a system that can’t seem to get away from the “communication-and-control” of cybernetics; here Zach’s discussion of the status of abjected information in shadow databases comes into play.

In short, I am interested in exploring some very fundamental issues: What is memory? What is its relationship to database – mutually exclusive, related, equivalent? I wonder especially about recent cybernetic/computational models of memory that, while they obviate the mystique attending models from 200 years ago, threaten to substitute for it a cybernetic mystique that fits into its own neat black box.

Further, what is it to re-member, or to re-mind, as we engage in this exercise in re-cognition? What members have we dispersed that demand reattachment, and does the act of remembering – whether it involves human recollection, paratextual referencing, or machinic recall of distributed data points – sufficiently fulfill this seeming desire of the members to rejoin? Does the re-rendered/re-membered/re-called figure function as a reasonable facsimile, or do we always create monsters anew, akin to the roughly-stitched Patchwork Girl of Shelley Jackson?

Seaman’s model of recombinant poetics, which he details in Vesna (124), seems to offer a model that anticipates this more emergent conception. Surprisingly, it is far more aligned with my understanding of oral composition, informed by Albert Lord’s study of the bardic tradition in the early 20th-century Balkans in The Singer of Tales. The bards who, as Sharon Daniel notes, “[perform] for the community narratives belonging to the community” (144), learn their craft not from the rote memorization of scripts, but from years of intense devotion to understanding the story elements and the verbal formulae in a way that enables masters to tell stories with any number of elements changed or modified according to the circumstances of the performance and audience feedback. (This is nothing like the spurious “substitutions” Steve Dietz offers from Ong’s interpretation of Homer in Vesna 117). Yet Lord notes (I can find that quote for you!) that the bards eschew any model of memory as exact reproduction by insisting that each instance of a given story is a reproduction of the same story, and moreover, is a truthful account of the same story.
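A toy illustration of Lord’s point (my own sketch, not Lord’s or Seaman’s; the formulae are invented placeholders): the bard stores slots and verbal formulae rather than a script, so every performance recombines the same elements into “the same story.”

```python
import random

# Toy sketch of formulaic composition: the "memory" is a stock of
# story slots and formulae, not a fixed text; each telling fills the
# slots anew, so no two performances need be verbatim identical.

EPITHETS = {
    "hero": ["swift-footed", "resourceful", "god-favored"],
    "sea": ["wine-dark", "loud-roaring"],
}
TEMPLATE = "The {h} hero set out across the {s} sea."

def perform(rng):
    """One telling: fill the template from the formula stock."""
    return TEMPLATE.format(h=rng.choice(EPITHETS["hero"]),
                           s=rng.choice(EPITHETS["sea"]))

telling_a = perform(random.Random(1))
telling_b = perform(random.Random(2))
# Both are, in the bard's terms, truthful reproductions of the same
# story, even where their surface wording differs.
```

The contrast with a database is instructive: the “record” here is not any one telling but the generative stock itself.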

So, immersed in computational models galore, how do we reconcile this understanding of human memory with our definitions of textual authenticity and data integrity? What function does forgetting serve? And, momentarily setting aside Daniel’s optimism regarding the intervention of dialogism, what are the political and aesthetic stakes in a system that falls short of ubiquitous capture and categorization? Even with an idealized ubiquity of storage and retrieval, what firewalls can we establish to prevent the human being – noted in several essays as analogous to a database – from being reduced to a datum?

I haven’t even started on the database/narrative dynamics in the Bible and the huge apparatus of scholarship that attends it. 🙂 But that would be fun to discuss too.

February 18, 2008

Pattern Extrapolation

Filed under: Discussion — jjpulizzi @ 11:58 pm

Many of the readings we’ll be looking at tomorrow respond to Manovich’s association of database and narrative (whether affirmatively or negatively), though we may also wish to think about the role that databases, particularly the interfaces, play in aiding the human brain at pattern recognition.

Manovich’s pronouncement that “database and narrative are natural enemies” (44) suggests that the two are vying for control over how human beings imagine relationships—whether spatially (database) or temporally (narrative). Professor Hayles instead suggests that despite their differences, database and narrative, like human and computer, exist symbiotically as necessary extensions of one another in a society flooded with information needing organization. We should also remember that the brain’s ability to separate relevant from irrelevant information (pattern from noise) quickly is a necessary component of narrative, and also an ability that narrative refines. If we have a “story” or context for a given situation, then we’ll be better able to mark pattern from noise.

This perspective also allows us to understand visual and auditory interpretations of databases as instances of the same tendency to search for patterns that appears in narrative. We can take as an example the failure of databases and their attendant algorithms to catalog common sense. Much common sense knowledge or know-how is extremely dependent on context (i.e. what came just before and what might come next), which leads to a bewildering proliferation of exceptions and special cases (if anyone is interested I can give references to works that make this argument). Even if these indefinitely proliferating exceptions and cases could be cataloged, the problem of how to efficiently search the records remains.

How then could the human brain possibly learn with ease what cannot be represented explicitly? Terrence W. Deacon in The Symbolic Species attributes the human ability to learn and use a complex natural language, whose grammar and syntactical variations could never be entirely cataloged or put in a database, to the brain’s facility at first discerning high-level patterns and then gradually refining that structure with relevant details, which prevents one from being overwhelmed by obscure exceptions or noise. The fuzzy picture must precede the sharp focus—so children with immature brains, which are easily distracted, learn languages with much greater facility than adults do. Refining the fuzzy picture also requires actually living in and experiencing the language and the society in which it is used.
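Deacon’s fuzzy-picture-first strategy can be caricatured in a few lines (a hypothetical sketch of the idea, not Deacon’s model): learn a coarse rule first, then store only the exceptions to it, rather than trying to catalog every case up front.

```python
# Hypothetical sketch of coarse-to-fine learning: fit the broad
# pattern first, then refine with details, instead of exhaustively
# cataloging every exception in advance.

def coarse_then_fine(samples):
    """Two passes over (item, label) pairs: coarse rule, then exceptions."""
    # Pass 1: the coarse rule is simply the majority label.
    labels = [label for _, label in samples]
    coarse = max(set(labels), key=labels.count)
    # Pass 2: record only the items the coarse rule gets wrong.
    exceptions = {item: label for item, label in samples if label != coarse}

    def predict(item):
        # The sharp focus (exceptions) overrides the fuzzy picture (coarse).
        return exceptions.get(item, coarse)
    return predict

predict = coarse_then_fine([("dog", "pet"), ("cat", "pet"), ("wolf", "wild")])
```

Notice that unseen items fall back on the coarse rule, which is one way of reading Deacon’s claim that the fuzzy picture keeps the learner from drowning in obscure exceptions.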

Computers permit one to explore various data for patterns, whether with a narrative method or by visualization. Again it is a case of human and computer collaborating, each drawing upon the strengths of the other: excellent pattern recognition and sorting on the one hand, rapid calculation and manipulation on the other.
