If We'd Had Connections First

The following is based on a guest post originally invited by the editors of SlashCloud; thanks to them for permission to share those ideas here.

Is it only an accident of history that the microprocessor came before the Internet? And if the sequence had been reversed, would we have different ideas today about what is ‘normal’ and what is ‘disruptive’ in the ways we live and work?

It takes a real effort of will to reset our thinking, and re-invent whole ecosystems (such as health care and education) for a massively connected, sensor-rich, API-infused world – but the effort will pay off with huge improvements in the sustainable performance of these functions.

Intel’s 4004 was first advertised in November 1971, while the word ‘Internet’ wasn’t coined until three years later; the TCP/IP stack was not standardized until 1982.

That gap provided a long, long time—seven Moore's-Law cycles—for isolated nodes of computation to swarm across the planet, with connectivity considered an extra-cost option and at most an intermittent convenience. We're still carrying the resulting baggage, in the form of ziggurats of middleware that simulate what we might have much more simply built for real – if we'd had ubiquitous connection before we'd gotten cheap CPUs.

From a purely technical point of view, we can understand and rage at the perversity. If you want to enable simultaneous editing of a document by many contributors, for example, the obvious way is to have a shared data structure and a simple means of concurrent access and conflict resolution. The worst possible way is to give every editor a separate copy of the document, and to build a complex mechanism of replication and combination that tries to approximate what could be more simply made real. The former is something like Google Docs, the latter something like Microsoft Office. In a connected-first world, wouldn't Google Docs have long been considered the norm, rather than hailed as an innovation?
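To make the contrast concrete, here is a toy sketch of the "shared data structure with simple conflict resolution" approach. All names and the compare-and-set API are invented for illustration; real collaborative editors use far richer machinery, but the shape is the same: one authoritative copy, and a cheap way to detect and resolve concurrent edits.

```python
import threading

class SharedDocument:
    """One authoritative copy; every editor works against the same state."""

    def __init__(self, text=""):
        self.text = text
        self.version = 0
        self._lock = threading.Lock()

    def read(self):
        """Return the current text together with its version number."""
        with self._lock:
            return self.text, self.version

    def append(self, addition, base_version):
        """Compare-and-set: apply the edit only if the caller saw the
        latest version. Returns (ok, current_version); on a conflict the
        caller simply re-reads and retries against the new state."""
        with self._lock:
            if base_version != self.version:
                return False, self.version  # someone else edited first
            self.text += addition
            self.version += 1
            return True, self.version

def edit(doc, addition):
    """An editor's retry loop: read, try to apply, rebase on conflict."""
    while True:
        _, ver = doc.read()
        ok, _ = doc.append(addition, ver)
        if ok:
            return

doc = SharedDocument("Draft: ")
edit(doc, "alpha ")
edit(doc, "beta")
print(doc.read())  # ('Draft: alpha beta', 2)

# A stale edit is rejected, not silently merged:
_, ver = doc.read()
edit(doc, "!")              # a concurrent edit lands after our read
ok, _ = doc.append("?", ver)
print(ok)                   # False -- our base version is stale
```

The whole "merge" problem reduces to one version check, because there is only one document. The replicated-copies model has to reconstruct that single source of truth after the fact, which is where the complexity lives.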

Technical elegance, though, is much too narrow a frame of reference. Let's think big. Let's think about global, crucial, currently unsustainable institutions like the way we practice medicine. Too many people crowd into too few doctors' offices to take too many tests at too much cost to deliver too many null results – that say, in effect, "OK, you didn't need to come into the office today after all."

In a world of 'connection first,' we'd put radio nodes in medicine bottles to confirm that the patient is taking pills on schedule. We'd put network-connected sensors in toilets to do basic urinalysis. We'd let people opt-in to connect their grocery store frequent-buyer accounts to their primary care providers, so information on what we're eating and drinking could be accurately recorded instead of optimistically self-reported.

Likewise higher education: instead of diverting huge fractions of tuition and fees to housing and recreation, we'd make campus a place that's occasionally visited for seminars and feedback sessions – while most learning takes place in apprentice- or intern-style engagements, and faculty members tailor the sequence of content to match each student's assignments in those practical settings.

This is the way we need to think about the cloud: not merely as new connective tissue for the institutions and processes that we already have, but as a new environment in which we can re-think what those processes should be and how those institutions should do their jobs. That's the difference between accident and design – and the difference between being a leader, and being a victim of disruption.

Licensed under Creative Commons Attribution-No Derivative Works 3.0.