The post-digital age

August 12, 2014

The way we store and share information has changed fundamentally over the centuries. Phillip Long, Executive Director of Innovation and Analytics at the University of Queensland, explains how we came to reach the post-digital age.

It’s about time we got there. We started with a model of learners working either independently or in a close relationship with mentors, instructors or teachers, acquiring from them knowledge built upon scarce information that only the teachers possessed. Knowledge and its power were directly related to what you could remember and recall when interacting in face-to-face debate.

But the growth of facts began to challenge even the most capacious human minds. We took to recording them, laboriously copied by scribes onto an external storage medium. The labour involved made these scarce objects costly, adding economic value to the mix. Among the knowledge elite, this precious external store was prized and guarded.

Practitioners of this transmission of knowledge were seen at times silently mouthing words and sentences in a mystical and, to some, very frightening practice: ‘quiet reading’. They translated the coded representations of knowledge on the fly, their eyes dancing across the storage medium, and found themselves able to express things that those around them knew to be beyond their experience – scaring the unlearned with the “power” of these new devices… books.

The revolution of the printing press democratised access to information. Knowledge was no longer the exclusive property of the rich and powerful. The transition took several hundred years, but it laid part of the foundation for the explosion of knowledge that characterised the Renaissance. It also changed the way we think. What had once been knowledge by virtue of memory and recall could now be stored outside the little grey cells in your cranium. We needed indexing systems to track all of that externally stored information, and mechanisms for its efficient retrieval.

Various mechanisms emerged: descriptions of the location of the physical objects (the library of the Persian city of Shiraz, 10th century), locations coded by numbers (the library at Amiens Cathedral in France), or Thomas Hyde’s printed catalogue of the books in the Bodleian Library at Oxford University. All were attempts to organise, and make more accessible to humans, the increasingly vast body of information accumulating in the world of knowledge creation.

We have always, as tool-making creatures, used our ability to build things to improve our existence. Initially this focused on survival, but as we became more capable it quickly spread to other ways of making life easier, better and more fulfilling. To organise our knowledge we built repositories in the form of library collections and indexed them, at first somewhat arbitrarily, and later through classification schemas (e.g., the Dewey Decimal System).

The advent of representing information more abstractly, as binary codings of human-readable characters, launched the digital revolution. The initial physical manifestations derived from the cutting-edge technology of the period: gears, levers and pulleys gave way to a re-appropriation of the loom, with punched holes representing ones and their absence zeros. The “Jacquard head” fitted to looms provided the link between pattern in the abstract and the weaving that resulted. Jacquard’s key idea – using hole-punched cards to represent a sequence of operations – led to Charles Babbage’s Analytical Engine and later to Herman Hollerith’s card tabulating machine, used to perform the 1890 US Census.
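To make the presence/absence idea concrete, here is a minimal Python sketch. It uses modern ASCII character codes, which the historical cards did not – the function name and the card metaphor are for illustration only:

```python
# A minimal sketch of binary coding of human-readable characters:
# each character becomes a row of "holes" (1) and "no holes" (0),
# the same presence/absence idea a punched card encodes.

def to_card_row(ch: str, width: int = 8) -> str:
    """Render a character's binary code as a punched-card-style row."""
    bits = format(ord(ch), f"0{width}b")   # e.g. 'L' -> '01001100'
    return "".join("●" if b == "1" else "·" for b in bits)

for ch in "LOOM":
    print(ch, to_card_row(ch))
# L ·●··●●··
# O ·●··●●●●
# ...
```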

Tying all this together is the use of technology to augment the human intellect. Fast forward to the end of World War II: the same concern about the proliferation of information, and about ways to find and use it rather than continue a cycle of rediscovery, was expressed by Vannevar Bush in his classic article “As We May Think”, published in The Atlantic in July 1945. In it he proposed, based on the state of the art of his day, the Memex, a machine to record knowledge by mimicking the human search process through what he termed associative trails. He wanted to record not just the artefacts but the way humans thought through the steps that led to those artefacts. Further, he wanted to make these trails shareable, so that others could see not just the result but the process by which it was derived.
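Bush never built the Memex, but the spirit of an associative trail can be sketched in a few lines of code. A toy Python sketch – the `Trail` class and its contents are invented for illustration, not Bush’s design:

```python
# A toy sketch of an "associative trail": a named, shareable sequence
# of links between documents, recording the path a reader took rather
# than just the documents themselves.

from dataclasses import dataclass, field

@dataclass
class Trail:
    name: str
    steps: list = field(default_factory=list)  # (document, note) pairs

    def add(self, document: str, note: str) -> None:
        """Record a document and why it was followed at this point."""
        self.steps.append((document, note))

    def replay(self) -> None:
        """Share the trail: show not just results, but the path taken."""
        for i, (doc, note) in enumerate(self.steps, 1):
            print(f"{i}. {doc}  --  {note}")

trail = Trail("origins-of-computing")
trail.add("Jacquard loom notes", "punched cards encode a sequence of operations")
trail.add("Babbage's Analytical Engine", "cards generalised to programmable computation")
trail.add("Hollerith tabulator", "cards applied to the 1890 US Census")
trail.replay()
```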

It took another 23 years before the technology of the day could at least attempt a physical implementation of this idea. It was presented to the world in a breathtaking live demonstration at the Fall Joint Computer Conference in San Francisco, California by Doug Engelbart: the “Mother of All Demos”. The demo was an attempt to show, rather than talk about, what Engelbart had written six years earlier in the landmark paper Augmenting Human Intellect: A Conceptual Framework. In one 100-minute ‘show and tell’, Doug introduced the computer mouse, video conferencing, teleconferencing, hypertext, word processing, hypermedia, object addressing and dynamic file linking, bootstrapping, and a collaborative real-time editor. Our world was forever changed.

Today we routinely offload memory into silicon. Some call this Google dumbing down our intellect, but we’ve been offloading memory for centuries – we just do it better and more efficiently today. And while it appears we’re mired in our devices – walking head-down into street signs as we text away, or sitting at the dinner table with friends while ‘friending’ people who aren’t present through the devices formerly known as phones – we are moving past that era.

What marks this shift? The post-digital age is like prior radical transitions – it’s marked by the fact that we no longer recognise it as different. Think back to when your parents had an icebox, replenished with block ice at least daily. And then something happened: refrigeration. In less than a generation we went from astonishment at this miracle to forgetting that the world had ever been different.

Look at children playing today with their parents’ smartphones, or perhaps their own tablet computers. When they walk up to pictures now, they naturally try to manipulate them with the ubiquitous thumb-and-forefinger spread to zoom the image. We walk around with digital sensors measuring our gait, altitude and velocity, and glance at our ‘phones’ (we need a new term for these) to see the dashboard of our activity.

More importantly, we are starting to see and think in ways we couldn’t before, because our devices are shaping what we can conceive as questions. In 2011, researchers began to make movies by recording activity from voxels in the brain and using it to reconstruct imagery from films their subjects had watched. We are on the verge of communicating rich media from neural storage to the sensors that pre-process it in others. It won’t be long before we have the capability to transfer these memories, bypassing their biological encoding. Will they be ‘memories’ at all without this step? Will we perceive them the same way as those created by our own neural infrastructure? We don’t know yet, but we soon will.
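One way to make that idea concrete is to treat it as a decoding problem: learn a mapping from voxel activity back to the stimulus that produced it. A toy Python/NumPy sketch on simulated data – nothing like the real fMRI pipeline, which is vastly more involved:

```python
# Toy illustration of "decoding from voxels": simulate voxel activity
# as a noisy linear function of stimulus features, then learn the
# inverse mapping and reconstruct the features from brain data alone.
# (Simulated data only; real fMRI decoding is far more complex.)

import numpy as np

rng = np.random.default_rng(0)
n_samples, n_features, n_voxels = 200, 5, 50

features = rng.normal(size=(n_samples, n_features))   # stimulus features
encoding = rng.normal(size=(n_features, n_voxels))    # feature -> voxel map
voxels = features @ encoding + 0.1 * rng.normal(size=(n_samples, n_voxels))

# Ridge-regression decoder: voxels -> features
lam = 1.0
W = np.linalg.solve(voxels.T @ voxels + lam * np.eye(n_voxels),
                    voxels.T @ features)

reconstructed = voxels @ W
corr = np.corrcoef(features.ravel(), reconstructed.ravel())[0, 1]
print(f"reconstruction correlation: {corr:.2f}")      # close to 1.0
```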

As embedded, digitally enabled devices extend the Internet of Things into a network interconnecting objects with ourselves, we silently enter the post-digital age.

As David Foster Wallace said in 2005, “the most obvious, ubiquitous, important realities are often the ones that are the hardest to see and talk about”. Fish don’t see water, but we must.

Welcome to the post-digital age.

Global Mindset presented Dr Long’s ideas – along with digital thinking, leadership thinking, global thinking, lateral thinking and an education technology start-up pitch – at the conference ‘Innovations in Learning’ on 13 August in Sydney.
