In his 2007 article on the Attention Crash, Steve Rubel predicts an imminent bursting of the web 2.0 information bubble, since our attention does not scale in proportion to our inputs:
"We are reaching a point where the number of inputs we have as individuals is beginning to exceed what we are capable as humans of managing. The demands for our attention are becoming so great, and the problem so widespread, that it will cause people to crash and curtail these drains. Human attention does not obey Moore's Law."
I assume Rubel is referring to the generic notion of scheduled exponential growth that Moore's Law has come to symbolize in the microprocessor industry and, more generally, in the computing industry as a whole. The metaphor is perhaps more apt than the author originally intended.
Moore's Lore
Moore's Law, formulated in 1965, states that transistor density will double every one to two years. By 1970, Moore's Law was being interpreted as computer processing capability doubling every few years, a subtle shift in prediction from exponentially shrinking components to exponentially increasing processing power. The latter knows no bounds, while the former operates in the physical world, dominated by fundamental constraints and hard limits.
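To make the compounding concrete, here is a toy sketch of the doubling law in Python. The two-year period and the 1971 baseline of roughly 2,300 transistors (the Intel 4004) are illustrative assumptions, not figures taken from Moore's paper.

    # Toy model of a doubling law: count(t) = base * 2^((t - t0) / period)
    def transistors(year, base_year=1971, base_count=2300, doubling_years=2):
        return base_count * 2 ** ((year - base_year) / doubling_years)

    for year in (1971, 1981, 1991, 2001):
        # under a two-year doubling, each decade multiplies the count by 32
        print(year, f"{transistors(year):,.0f}")

Ten doublings later the count has grown a thousandfold; that is the schedule the industry signed up to keep.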
Moore's Law celebrated its 40th anniversary, with much fanfare, in 2005. We can divide this history into three epochs, which we will call the Narrative, Prophecy and Legacy epochs. During the Narrative epoch, Moore's Law provided an accurate description of advances in the microchip industry: graphing the actual data reproduced his predicted curve. During the Prophecy epoch, Moore's Law became less a narrative than an industry mission statement. The industry actively planned and invested to fulfil Moore's Law, and it came to symbolize the advancement and promise of the industry as a whole. Ensuring the next doubling of processor speed became mission critical.
In the final epoch of Legacy, our epoch, the assumptions of Moore's Law are in decline. The reason is that chip makers are approaching the fundamental limits of increasing processing speed by crafting ever smaller components. The hardest limit is simply that component size bottoms out at the molecular or atomic scale: nothing smaller obeys the same physics. Further, issues with power consumption, chip cooling and production costs will invalidate the assumption that smaller components are the most cost-effective strategy for increasing processing capability. In fact, Mr. Moore's Intel is already delivering dual-core and quad-core processors, which increase computing power by providing more chips per device rather than more transistors per chip. The future, then, lies with more chips of a given complexity, not with a single chip of increased complexity. So Moore's Law may actually be maintained, but not for the reasons that Moore predicted.
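A back-of-the-envelope sketch of that strategy, with made-up numbers: hold per-chip complexity fixed, double the number of chips per device, and the aggregate transistor count keeps doubling anyway.

    # Hypothetical figures: per-chip complexity is held fixed while
    # the number of chips per device doubles.
    per_chip = 300_000_000            # transistors per chip, held fixed
    for chips in (1, 2, 4, 8):
        total = chips * per_chip      # doubling chips doubles the device total
        print(f"{chips} chip(s): {total:,} transistors")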
Fundamental Attention Limits
When we start out connecting, subscribing or otherwise attaching ourselves to new web information sources, we are able to absorb a great deal of new information. We may even begin participating in the production of additional web content for distribution and consumption. We feel more informed, more empowered, and more enamoured with the promise of the omnipotent web. The web 2.0 narrative has worked its magic, and we tacitly commit to a seemingly virtuous circle of information inflation.
After a honeymoon period, we start to feel the burden of keeping up appearances on the web. We still subscribe to new feeds, extend our blogroll, sign up for new services like Twitter, embed ourselves in social networks, fret about our followers and labour over posts. In our peripheral vision the virtuous circle is acquiring a vicious hue. The web 2.0 narrative becomes more belief and prophecy than tangible benefit. But belief and prophecy can be strong motivations, and we continue connecting, subscribing and participating in new web information sources.
After passing through the Narrative and Prophecy epochs, we finally reach the Legacy epoch, where our initial assumptions are invalidated. Just as Moore's assumption that computing components could shrink exponentially for the foreseeable future eventually fails, so does the assumption that our time and attention can be fragmented into exponentially smaller yet still meaningful units. While this took 40 years in the case of Moore's Law, we live on compressed and accelerating Internet time, and the lesson can be learnt from one birthday to the next.
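The arithmetic of that collapse is easy to sketch. Assuming, purely for illustration, a fixed eight-hour daily attention budget and a collection of inputs that doubles each year:

    # Attention is fixed; inputs grow exponentially, so attention per
    # input halves each year. All numbers are purely illustrative.
    attention_minutes = 8 * 60        # a fixed daily attention budget
    inputs = 10                       # feeds, networks, services, ...
    for year in range(6):
        print(f"year {year}: {inputs:4d} inputs, "
              f"{attention_minutes / inputs:6.2f} minutes each")
        inputs *= 2                   # inputs double; attention does not

Five years in, each input gets a meaningless minute and a half a day.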
Unfortunately, as humans we don't have the luxury of switching to dual- or quad-core brains. We are stuck with one, and with its inherent limits on the rate and granularity of input it can process. A Sub-Time Crisis is upon us, and we need a bailout, or at least the presence of mind to opt out.