Science, abstraction and magic

I was in a pub a while back having a pint with a British friend when the topic turned briefly to computing.  “What I don’t understand,” he said, “is how programmers make something from nothing?”

I thought for a moment, then gave a brief explanation of levels of abstraction, starting from transistors and moving through machine language to higher-level languages.  I figured this might show him how, by controlling certain electrical properties at the micro-scale, we’re able to create a panoply of macro-effects.  Well, this obviously wasn’t the right approach, since he began to glaze over after only a few minutes.  To be fair, we were a few pints in, but he had asked, after all.  He was a university grad, so I’d assumed this was the kind of answer he’d been looking for.  Thinking back, I’m not entirely sure he wasn’t winding me up (pulling my leg) – a favorite English pastime – but I suspect that wasn’t the case.  I think he genuinely considered programming to involve the creation of something from nothing.
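The layering I was trying to describe at the bar can be sketched in a few lines of code.  This is only an illustrative toy, assuming nothing beyond plain Python: the same addition performed the way the hardware does it, using only the bitwise operations that logic gates provide, and then the way a programmer normally writes it.

```python
def add_like_the_hardware(a, b):
    """Add two non-negative integers using only the bitwise operations
    (AND, XOR, shift) that a circuit of logic gates implements."""
    while b:
        carry = a & b        # positions where both operands have a 1 bit
        a = a ^ b            # sum of the bits, ignoring the carries
        b = carry << 1       # carries move one position to the left
    return a

# One level of abstraction up, all of that hides behind a single '+':
assert add_like_the_hardware(19, 23) == 19 + 23
```

Each layer – gates, machine instructions, the ‘+’ operator – hides the one beneath it, which is exactly why nothing is ever really being made from nothing.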

This got me thinking about the oft-quoted Arthur C. Clarke observation that “any sufficiently advanced technology is indistinguishable from magic.”  It’s been written about frequently enough, but I’ve tended to think of it in terms of eras and generations.  The space probes, imaging technologies and wireless communications of today would seem like magic to a citizen of the 19th century.  The locomotives, magic lanterns and telegraphs of the 19th century would probably be quite magical to someone from the 15th-century Renaissance.  And the engines of da Vinci, the perspective paintings and the printing presses of that era would be likewise, if you could show them to someone of a sufficiently earlier time.

But just as the rate of technological advance is increasing, the time frames in these examples are shrinking.  I don’t think it’s necessarily a direct inverse relationship, but it does look like a fairly solid trend.  The fact is that a growing proportion of our technology is moving beyond the realm of common knowledge.

When we flick a switch to turn on a light, we’re using an interface that separates us from the relatively complex processes involved.  Those processes are abstracted away in order to make them more accessible.  I think it’s fair to say that most people don’t understand the fundamental principles that cause a light bulb to glow when they flick that switch, and this is even more true of more advanced devices.  As devices become increasingly sophisticated, more levels of abstraction will have to be created between what they do and how we use them – between what we perceive as cause and effect – in order for us to be able to use them at all.  Because of this, the world is going to appear ever more “magical”.

Consider a lighting system.  We can already walk into some homes and activate the lights with our motion or a verbal command.  In a few years, we’ll be able to operate much more complex systems using simple gestures or eye tracking – and perhaps a decade or so beyond that, by just thinking about it.  The simple interfaces will be there, the underlying technology will be there, but the conceptual distance between them will be so great that for many of us our understanding of the processes will become distorted.

This creates the potential for some very negative consequences.  A number of people, including futurologist Ian Pearson (see my recent interview with him in FutureNovo), have suggested that we may be moving toward a technological dark age in which, among other things, superstitions will arise around certain devices and processes because of just such a lack of understanding.  The ramifications of such a disconnect are alarming, since it creates an environment begging for backlash and exploitation.  A greater gulf will open between those with very specialized, esoteric knowledge and the end-users who reap the benefits.  If only a rare few understand how to build, operate or repair a technology, all kinds of abuse become possible.  And should something happen to those few, we could easily find ourselves with no access to the technology at all.

And then we’ll really have to build something out of nothing.

Empowering the people

Change frightens us. The uncertainty of the new and the potential for disruption are among the reasons our species seeks to anticipate the future – so we can avoid, and hopefully survive, the worst it has to throw at us.

Self-replicated RepRap parts

But change also has a huge potential to improve our lives and empower us. The recent accomplishments of the RepRap project are a case in point. Headed by Dr. Adrian Bowyer of the University of Bath’s Centre for Biomimetics, the project takes its name from “replicating rapid prototyping” machine. In use by industry for about a quarter century now, prototypers are essentially 3D inkjet printers that create parts by laying down thin layers of resin following a computer-driven design. Almost any basic object can be created this way, from dinnerware to engine parts. What makes RepRap different is that it’s the first prototyper capable of copying itself. And it’s open source.
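The layer-by-layer process a prototyper follows can be sketched with a toy example. This is a hypothetical slicer of my own, not RepRap’s actual software, assuming the simplest possible part – a solid sphere – and computing the circular cross-section the print head would trace for each thin layer of resin.

```python
import math

def slice_sphere(radius, layer_height):
    """Toy slicer: return (z, cross-section radius) for each horizontal
    layer of a solid sphere of the given radius (units are arbitrary)."""
    layers = []
    z = -radius + layer_height / 2          # center of the first layer
    while z < radius:
        r = math.sqrt(max(radius**2 - z**2, 0.0))
        layers.append((round(z, 3), round(r, 3)))
        z += layer_height
    return layers

# A 10 mm sphere printed at 0.5 mm per layer takes 40 passes:
assert len(slice_sphere(10.0, 0.5)) == 40
```

A real slicer does the same thing to an arbitrary computer-driven design: cut the model into thin horizontal slices and deposit material one slice at a time.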

While there are still a handful of its own parts that RepRap can’t copy, and all of it still has to be hand-assembled, the potential for a self-replicating replicator is enormous. Distributed to people in the developing world, such a technology could quickly raise their standard of living, providing necessities many of the rest of us have long taken for granted. Of course, such a technology would be tremendously disruptive to industry, but that can hardly be justification for billions to continue living and dying in unnecessary poverty.

Like RepRap, many other new and developing technologies have the potential to heal, to enable, to lift up vast numbers of people. DEKA’s water purification system, the Kurzweil-National Federation of the Blind Reader, solar energy solutions and worldwide immunization programs are but a few of the recent implementations of technology that have the ability to change the lives of millions for the better.

A changing world can be a frightening place, but it can be a very hopeful place as well.

Vanishing computers

Computers are disappearing. 

Now before you panic (or in a few cases, jump for joy), what I mean to say is that computers are disappearing from view.  They’ll still be here, more powerful and in greater numbers than ever.  We just won’t be seeing a growing proportion of them.


Intel 80 core research chip
  • Last year Intel unveiled a postage-stamp-sized 80-core research chip as powerful as a 1996 supercomputer that at the time took up 2,000 square feet.  The new chip requires about 1/10,000th as much power as that supercomputer did.
  • Wireless technology is available in more of our environment at continually increasing transmission speeds.  The recent auction of 700 MHz spectrum will allow for the delivery of a wide range of new software services via wireless.
  • GPS and other positioning technologies are being developed with greater degrees of accuracy and granularity at ever-lower cost.
  • RFID is becoming increasingly capable.  Identification, sensor integration, data storage, firewalled access and encrypted communication are just some of its current features.  Chips so small they qualify as powder can be embedded in just about anything imaginable.  Even under your skin.
  • Cloud computing is taking off.  With its growth, more and more of our processing needs can be off-loaded to distant, unseen servers, which will provide processing-on-demand and greatly reduce wasted processing cycles.
  • Display technology is shrinking.  Texas Instruments recently demonstrated a prototype DLP pico-projector which is small enough to fit in a cell phone.  Wearable displays and retinal projection technology will become increasingly available in the near future. 
  • Several companies have recently demonstrated the ability to translate thoughts into commands that can be used to control games and other applications.  Emotiv Systems plans to ship its first-gen neuroheadsets in late 2008.

All of these technologies are becoming increasingly capable even as their cost is plummeting.  This is how technology works.  Many of us can remember when a not very sophisticated calculator cost as much as a current-day PS3.  And that’s not even in adjusted dollars.

So how does this change the way we’ll use computers?  Well, for one, they’ll soon be with us everywhere, all the time.  If you have enough computing power in your pocket or woven into your clothes or embedded under your skin to control basic I/O functions, various forms of wireless, GPS and cloud computing can do the rest.  Clunky old keyboards, mice and monitors will be a thing of the past.  RFID in clothing, jewelry or even under your fingertips will make gesture recognition input possible.  Wearable displays have the potential to provide heads-up information anywhere you go, augmenting your environment with different layers and levels of information.  Contextual overlays will be driven by a mix of geographic data and proximity detection, while being controlled and modified by personal preference filters.
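Those contextual overlays can be sketched in miniature today.  The data and names below are all hypothetical – a few made-up points of interest and a crude city-scale distance formula – but they show the mix just described: geographic data plus proximity detection, filtered by personal preference.

```python
import math

# Hypothetical points of interest: (name, category, latitude, longitude)
POIS = [
    ("Corner Cafe", "food", 51.3811, -2.3590),
    ("City Museum", "culture", 51.3815, -2.3610),
    ("Gadget Shop", "retail", 51.3900, -2.3700),
]

def distance_m(lat1, lon1, lat2, lon2):
    """Rough equirectangular distance in metres -- fine at city scale."""
    dlat = math.radians(lat2 - lat1)
    dlon = math.radians(lon2 - lon1) * math.cos(math.radians(lat1))
    return 6371000 * math.sqrt(dlat**2 + dlon**2)

def overlays(lat, lon, preferences, radius_m=500):
    """Return the labels a heads-up display would draw: nearby POIs
    (proximity detection) filtered by the wearer's preferences."""
    return [name for name, cat, plat, plon in POIS
            if cat in preferences
            and distance_m(lat, lon, plat, plon) <= radius_m]

# Standing near the cafe, interested only in food and culture:
print(overlays(51.3810, -2.3595, {"food", "culture"}))
```

Swap the hard-coded list for live geographic data and the preference set for a personal filter, and you have the skeleton of an augmented view of your surroundings.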

And brainwave I/O is only just getting started.

A new era is coming.  Get ready to say goodbye to your computer.