Narrative Velocity & Big Data
I’ve been reading Ciaran Carson’s Exchange Place. The most compelling aspect of this latest novel from such a unique novelist is the structure of the narrative, whose interlinking, interrupting strands wind around and among a set of repeating images, moments, perspectives, and phrases: a tweed coat appears, in various degraded and evolving forms, over and over again, as do a notebook, a manuscript, and various other tropes.
You’d think the effect would be an overburdening of the narrative structure with these noisy details, but what occurs instead is a sense of hugeness and depth in the text, whereby the flow of time passes around the story’s tropes like water moving around rocks in a stream. It is difficult to pin down precisely when anything happened, or even precisely to whom, but the tropes are consistent: ballasts of memory in a continuous modern media flux, a tweed coat standing in for order in a time-lapsing present.
For most writers, memory is like a series of vectors, taking us through a virtual geography; for Carson a memory—represented by an object or a particular phrase to which he shows clear attachment—demonstrates not movement in time but the fact that everything is moving around it. In that sense he presents experience in time as we actually experience it.
Literature changes when it becomes a moving target; narrative at speed is turbulent and unpredictable, imposing structure from below. Peter Boxall argues in Twenty-First Century Fiction that “the fiction of the new century fashions its own emergent mode of historiography—what we might call the reality of history,” a hybrid of fiction and history.[i] Perhaps there's an extra dose of reality in how history is being written through narrative, as deep-structured and chaotic, recorded at velocity.
Certainly this reflects the way we are coming to understand information in the age of big data: not necessarily as cause and effect, but as two developments bound up in the same matrix of social and scientific awareness, roiling about in culture together. Rather than freeze-framing data (think of a bug pinned on a slide), computing technologies now have us analysing data dynamically, in something close to real time.
Rob Kitchin writes in The Data Revolution that “Velocity occurs because repeated observations are continuously made over time and/or space with many systems operating in perpetual, always-on mode.”[ii] This is the generational difference between small data and big data, and understanding information and its analysis in this way changes how we think about narrative, too—which is, after all, just an exchange of information.
[i] Peter Boxall, Twenty-First Century Fiction: A Critical Introduction (Cambridge: Cambridge University Press, 2013), pp. 45–46.
[ii] Rob Kitchin, The Data Revolution: Big Data, Open Data, Data Infrastructures and Their Consequences (London: Sage, 2014), pp. 76–77.