
I came across this today in the Casey Daily Dispatch. It confirms what you may have already suspected: that you are truly amazing. You can sign up for their free newsletter here. I get nothing from this, as there is no commission payable here; it's just good, solid information.

If you've ever felt inadequate when faced with the latest technology, you can relax, because your brain is still several times faster than the world's best supercomputer... and will remain so for the foreseeable future.

Round One of The Fight of the Millennium

Wetware is a term applied to biologically based information processors. There aren't any commercial devices of this sort on the market - you can't go and pick one up at Best Buy - yet the world has a couple trillion of them running around. We're talking about brains, the amazing computers that power every animal on the planet.  

And, of course, as evidenced by the fact that this article is both written and being read, we know there is no more advanced piece of wetware walking the earth than the human brain. But even calling the brain by such a term implies that it is some kind of supercomputer whose components can be analyzed as you would your Mac... and that an understanding of the interplay of hard- and software on our desktops allows us to model that most mysterious of organs.

But is what's inside our heads really comparable to product offerings from Apple, Intel, and Microsoft?

Well, actually, in some ways it is.

After all, both are electrical at their core, and both are based on binary logic. But when it comes to relative computing power, there is simply no contest.

The building block of electronic computers is the logic gate, through which all information processing flows. It takes two or more input impulses and translates them into an output impulse according to the simple on/off, zeroes and ones, true/false binary structure with which most everyone is at least vaguely familiar.

Two simple ones are AND gates and OR gates.

(If you'd prefer to see a logic gate in action, here's an entertaining video that uses dominoes to demonstrate the principle.)
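If you'd rather see gates in code than in dominoes, here is a minimal sketch in Python (ours, not the article's), with 1 standing for on/true and 0 for off/false:

    # Each gate maps two binary inputs to one binary output.
    def AND(a, b):
        return 1 if (a == 1 and b == 1) else 0

    def OR(a, b):
        return 1 if (a == 1 or b == 1) else 0

    # Print the full truth table for both gates.
    for a in (0, 1):
        for b in (0, 1):
            print(f"{a} AND {b} = {AND(a, b)}   |   {a} OR {b} = {OR(a, b)}")

Everything a CPU does, however elaborate, is ultimately built up from gates as simple as these.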

Traditionally, the logic gate employs transistors. Sure, there are other options, such as optical and molecular. And out on the fringe, researchers are tinkering with crazy ideas like spintronics and quantum gates.

But for now we mostly have transistors, with the binary nature of their output determined by whether the current passing through them is "strong" or "weak." The number of them that can be embedded in a computer chip has grown exponentially for the past half-century, more or less in accordance with "Moore's Law." This most famous law of information technology states that the number of transistors on a chip will double about every two years, for the same unit cost. Thus, in 1971, we could fit only 2,300 transistors on a chip. In 2011, we can squeeze in something like 2.3 billion.
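As a sanity check on those two figures, the doubling arithmetic is easy to run; here is a sketch in Python (the 2,300-transistor count and the two-year doubling period come straight from the paragraph above):

    # Moore's Law: transistor counts double roughly every two years.
    transistors_1971 = 2_300
    doublings = (2011 - 1971) // 2        # 20 doublings in 40 years
    transistors_2011 = transistors_1971 * 2 ** doublings
    print(f"{transistors_2011:,}")        # 2,411,724,800 -- about 2.4 billion

Twenty doublings take 2,300 to roughly 2.4 billion, right in line with the 2.3 billion figure.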

That's a lot of decision-making logic gates, and it puts an enormous amount of computing power at our fingertips.

By contrast, our brains must seem puny. Right? 

In fact, that is far from the case. We may not be able to solve advanced math problems in our heads in microseconds, but that doesn't mean we don't each own our personal advanced supercomputer. We're just tuned for very different tasks than your average computer, which doesn't have to find food or watch out for predators.

The human brain is truly unique. To begin understanding its complexity, you have to look at it on the cellular level.

Although this certainly isn't the whole story, the brain can be broken down very roughly into two different kinds of cells: neurons and glial cells. Neurons do the heavy lifting, i.e., they conduct electrical impulses. Glial cells do not; they're the sidekicks to the big guys, irreplaceable yet usually uncredited. They surround neurons and provide support for them and insulation between them (i.e., prevent crossed wires). Bidirectional communication exists between glial cells and neurons, and between glial cells and vascular cells. Until recently, it was believed that glial cells outnumbered neurons by a factor of five to ten, but the latest research indicates that their numbers are actually approximately equal.

The staggering thing is how many of these cells there are. Exactly how many, no one knows. There are just too many, and they are just too small, to actually count. There are only really rough ballpark guesses. If you search the data, you will find estimates ranging from 50 billion to a trillion, with 100 billion a nice round number that a lot of people tend to agree on.

A 2009 article in the Journal of Comparative Neurology attempts to pin it down more precisely and comes up with a similar figure: "... despite the widespread quotes that the human brain contains 100 billion neurons and ten times more glial cells, the absolute number of neurons and glial cells in the human brain remains unknown. Here we determine these numbers by using the isotropic fractionator and compare them with the expected values for a human-sized primate. We find that the adult male human brain contains on average 86.1 ± 8.1 billion NeuN-positive cells ('neurons') and 84.6 ± 9.8 billion NeuN-negative ('nonneuronal' or glial) cells." (An isotropic fractionator is a technique for breaking down highly complex brain structures into just their nuclei, making them easier to count in a lab.)

Of the neurons, there seems to be a fairly general agreement that about 22 billion of them reside in the cerebral cortex alone, the 2- to 4-millimeter-thick layer on the outer region of the mammalian brain often dubbed "gray matter" after its appearance once preserved. The rest of the mass of the brain appears to be mostly made up of wiring in the form of axons to connect the brain's specifically programmed regions to each other and the rest of the nervous system.

Whatever the case, it might be tempting to see a neuron as the functional equivalent of the computer's transistor. That, however, would be an error. It's way more complicated than that.

Here, in highly simplified terms, is what a garden-variety neuron looks like.

Every neuron has an axon (usually only one). The axon is an "output" fiber that sends impulses to other neurons. Each neuron also has a proliferation of dendrites - short, hair-like "input" fibers that receive impulses from adjacent neurons. When a dendrite is stimulated in a particular way, the neuron to which it is attached suddenly changes its electrical polarity and may fire, sending a signal out along its single axon where it may be picked up by the dendrites of other neurons.
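One loose caricature of that fire-or-don't behavior is the classic threshold unit; here is a sketch in Python (a toy illustration with made-up numbers, not a biological model):

    # A toy threshold neuron: sum the stimulation arriving on the
    # dendrites and "fire" (output 1 on the axon) only if the total
    # crosses a threshold.
    def neuron_fires(dendrite_inputs, weights, threshold=1.0):
        stimulation = sum(i * w for i, w in zip(dendrite_inputs, weights))
        return 1 if stimulation >= threshold else 0

    print(neuron_fires([1, 1, 0], [0.6, 0.5, 0.4]))  # 1 -- strong input, it fires
    print(neuron_fires([1, 0, 0], [0.6, 0.5, 0.4]))  # 0 -- weak input, no spike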

The connections are made via synapses - conductive links between abutting neurons. In an electrical synapse, the link forms at a narrow space between the sending and receiving neurons known as a gap junction. One gap junction channel is composed of two connexons (or hemichannels), each of which is made up of six connexins that can move together to open and close the connexon, much like a camera's iris. The two connexons bond across the intercellular space, allowing electrical or chemical signals to pass from one cell to another.

The brain features both chemical and electrical synapses, with the latter most often used to trigger actions that require a quick response time, as in the "fight or flight" reflex. Electrical synapses are characterized by a microscopic gap junction of just 2-4 nanometers. Chemical synapses' gaps are still tiny, but about ten times larger.

These things are fast. Signals are transmitted across a chemical synapse in about 2 milliseconds (ms), and an electrical synapse in about 0.2 ms.

But the real eye-opener is how many there are. Babies are born with about 2,500 synapses per neuron, on average. By the time the adult human brain is fully formed, that number has ballooned to 10,000-15,000.

Synapses are the closest true analogue to transistors. They are similarly binary: open or closed, letting a signal pass through or blocking it. So our biocomputer has - taking a median estimate of 12,500 synapses per neuron, and taking the consensus estimate of 22 billion cortical neurons - the equivalent of something on the order of 275 trillion transistors. In other words, our cerebral cortex alone contains the implied equivalent of about 120,000 of our most advanced chips.

As to processor speed, let's assume a very conservative average firing rate for a neuron of 200 times per second. If each signal is passed to 12,500 synapses, then 22 billion neurons are capable of performing on the order of 55 petaflops (a petaflop = one quadrillion operations per second).

The world's fastest supercomputer, a monster from Japan unveiled by Fujitsu at a conference this past June, has a configuration of 864 racks comprising a total of 88,128 interconnected CPUs. It tested out at 8 petaflops (which only five months later was upped to 10.51 petaflops). Our brains are still more than five times faster.
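The back-of-envelope arithmetic behind the last few paragraphs is easy to check. Here is a sketch in Python using only the estimates quoted above (the variable names are ours):

    # Reproducing the article's back-of-envelope estimates.
    cortical_neurons     = 22e9      # neurons in the cerebral cortex
    synapses_per_neuron  = 12_500    # median of the 10,000-15,000 range
    firing_rate_hz       = 200       # conservative firings per second
    transistors_per_chip = 2.3e9     # a state-of-the-art 2011 chip

    synapses = cortical_neurons * synapses_per_neuron
    print(f"{synapses:.3g} synapses")                       # 2.75e+14, i.e., 275 trillion
    print(f"{synapses / transistors_per_chip:,.0f} chips")  # 119,565 -- call it 120,000

    ops_per_sec = synapses * firing_rate_hz
    print(f"{ops_per_sec / 1e15:.0f} petaflops")            # 55
    print(f"{ops_per_sec / 10.51e15:.1f}x the K computer")  # 5.2x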

But that's not even half the story. Unlike transistors locked into place on their silicon wafers, synaptic connections can and do move over time, creating an ever-shifting environment where the possible hookups are, for all practical purposes, limitless. Furthermore, there are another 64 billion or so neurons (taking the 86-billion figure above, less the 22 billion in the cortex) outside of the cortex, hard at work on other complex functions.

The wiring complexity of our brains alone means that, in the crude terms in which we understand computers today, our brains are far more complex than anything we've engineered - and still faster than even the most expensive supercomputer ever built.

On top of that, we are only beginning to understand the complexity of that wiring. Instead of one-to-one connections, some theorists postulate that there are potentially thousands of different types of inter-neuronal connections, upping the ante. Moreover, recent evidence points to the idea that there is actually subcellular computing going on within neurons, moving our brains from the paradigm of a single computer to something more like a self-contained Internet, with billions of simpler nodes all working together in a massive parallel network. All of this may mean that the types of computing we are capable of are only just being dreamt of by computer scientists. 

Will our electronic creations ever exceed our innate capabilities? Almost certainly. Futurist Ray Kurzweil predicts that there will be cheap computers with the same capabilities as the brain by 2023. To us, that seems incredibly unlikely. But on a slightly longer time frame, given the exponential advances of the field, it is quite possible that there are humans alive today who will live to see the day. 

The main stumbling block right now is that, as ever more powerful computers are built, power, management, and structural issues expand right along with them. But the Defense Advanced Research Projects Agency (DARPA) is putting its money on the line, betting that the problems can be overcome. And soon.

In late 2010, DARPA awarded the first grants to the firms it hopes will build so-called exascale computers, i.e., machines capable of performing a quintillion computations per second. DARPA expects the first prototypes to be working by 2018.
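For scale (our arithmetic, using the 55-petaflop cortex estimate from above): an exaflop is 1,000 petaflops, so such a machine would be roughly 18 times faster than the cortex's estimated raw throughput.

    # An exascale machine vs. the 55-petaflop cortex estimate above.
    print(f"{1_000 / 55:.1f}x")  # 18.2x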

At that point, they'll be faster than us, but the software will still be far behind. Even there, though, things march forward rapidly, with advances in artificial intelligence.

For the moment, at least, wetware reigns supreme.

Yet, instead of being built from exotic materials, involving hundreds of engineers, and plugging into a worldwide electrical grid, our brain both builds and powers itself with cheeseburgers and blueberries. And then uses what's left over to help us dream up machines that may one day be as smart as we are.
