In a unique laboratory in Sussex, England, a computer carefully scrutinized every member of a large and diverse set of candidates. Each was evaluated dispassionately, and assigned a numeric score according to a strict set of criteria. This machine's task was to single out the best possible pairings from the group, then force the selected couples to mate so that it might extract the resulting offspring and repeat the process with the following generation. As predicted, with each breeding cycle the offspring evolved slightly, nudging the population incrementally closer to the computer's pre-programmed definition of the perfect individual.
The candidates in question were not the stuff of blood, guts, and chromosomes that is normally associated with evolution; rather, they were clumps of ones and zeros residing within a specialized computer chip. As these primitive bodies of data bumped together in their silicon logic cells, Adrian Thompson-- the machine's master-- observed with curiosity and enthusiasm.
Dr. Adrian Thompson is a researcher operating from the Department of Informatics at the University of Sussex, and his experimentation in the mid-1990s represented some of science's first practical attempts to penetrate the virgin domain of hardware evolution. The concept is roughly analogous to Charles Darwin's elegant principle of natural selection, which describes how individuals with the most advantageous traits are more likely to survive and reproduce. This process tends to preserve favorable characteristics by passing them to the survivors' descendants, while simultaneously suppressing the spread of less-useful traits.
Dr. Thompson dabbled with computer circuits in order to determine whether survival-of-the-fittest principles might provide hints for improved microchip designs. As a test bed, he procured a special type of chip called a Field-Programmable Gate Array (FPGA) whose internal logic can be completely rewritten as opposed to the fixed design of normal chips. This flexibility results in a circuit whose operation is hot and slow compared to conventional counterparts, but it allows a single chip to become a modem, a voice-recognition unit, an audio processor, or just about any other computer component. All one must do is load the appropriate configuration.
The informatics researcher began his experiment by selecting a straightforward task for the chip to complete: he decided that it must reliably differentiate between two particular audio tones. A traditional sound processor with its hundreds of thousands of pre-programmed logic blocks would have no trouble filling such a request, but Thompson wanted to ensure that his hardware evolved a novel solution. To that end, he employed a chip only ten cells wide and ten cells tall-- a mere 100 logic gates. He also strayed from convention by omitting the system clock, thereby stripping the chip of its ability to synchronize its digital resources in the traditional way.
He cooked up a batch of primordial data-soup by generating fifty random blobs of ones and zeros. One by one his computer loaded these digital genomes into the FPGA chip, played the two distinct audio tones, and rated each genome's fitness according to how closely its output satisfied pre-set criteria. Unsurprisingly, none of the initial randomized configuration programs came anywhere close. Even the top performers were so profoundly inadequate that the computer had to choose its favorites based on tiny nuances. The genetic algorithm eliminated the worst of the bunch, and the best were allowed to mingle their virtual DNA by swapping fragments of source code with their partners. Occasional mutations were introduced into the fruit of their digital loins when the control program randomly changed a one or a zero here and there.
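The breeding cycle described above-- random starting genomes, fitness scoring, culling the worst, crossover between survivors, and occasional mutation-- is the skeleton of any genetic algorithm. The sketch below is a toy illustration of that loop, not Thompson's actual code: the genome length, mutation rate, and fitness function are all invented stand-ins (his fitness score came from loading each genome into the FPGA and listening to its output).

```python
import random

GENOME_BITS = 64        # illustrative; the real FPGA bitstrings were far longer
POPULATION_SIZE = 50    # matches the article's fifty random genomes
MUTATION_RATE = 0.01    # hypothetical rate; the article only says "occasional"

def fitness(genome):
    # Stand-in scorer. The real experiment loaded the genome into the
    # chip, played the two tones, and rated the output; here we simply
    # count matches against a toy target pattern so the loop is runnable.
    target = [i % 2 for i in range(GENOME_BITS)]
    return sum(1 for g, t in zip(genome, target) if g == t)

def crossover(a, b):
    # Partners "swap fragments of source code": the child takes a prefix
    # from one parent and the remainder from the other.
    point = random.randrange(1, GENOME_BITS)
    return a[:point] + b[point:]

def mutate(genome):
    # Randomly change a one or a zero here and there.
    return [1 - g if random.random() < MUTATION_RATE else g
            for g in genome]

def evolve(generations=200):
    # Primordial data-soup: a population of random bit-blobs.
    population = [[random.randint(0, 1) for _ in range(GENOME_BITS)]
                  for _ in range(POPULATION_SIZE)]
    for _ in range(generations):
        population.sort(key=fitness, reverse=True)
        survivors = population[:POPULATION_SIZE // 2]  # eliminate the worst
        children = [mutate(crossover(random.choice(survivors),
                                     random.choice(survivors)))
                    for _ in range(POPULATION_SIZE - len(survivors))]
        population = survivors + children
    return max(population, key=fitness)

best = evolve()
```

Because the fittest half survives each round unchanged, the best score can never regress-- which is why, as in Thompson's chip, progress arrives in small but relentless increments.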
For the first hundred generations or so, there were few indications that the circuit-spawn were any improvement over their random-blob ancestors. But soon the chip began to show some encouraging twitches. By generation #220 the FPGA was essentially mimicking the input it received, a reaction which was a far cry from the desired result but evidence of progress nonetheless. The chip's performance improved in minuscule increments as the non-stop electronic orgy produced a parade of increasingly competent offspring. Around generation #650, the chip had developed some sensitivity to the 1kHz waveform, and by generation #1,400 its success rate in identifying either tone had increased to more than 50%.
Finally, after just over 4,000 generations, the test system settled upon the best program. When Dr. Thompson played the 1kHz tone, the microchip unfailingly reacted by decreasing its power output to zero volts. When he played the 10kHz tone, the output jumped up to five volts. He pushed the chip even further by requiring it to react to vocal "stop" and "go" commands, a task it met with a few hundred more generations of evolution. As predicted, the principle of natural selection could successfully produce specialized circuits using a fraction of the resources a human would have required. And no one had the foggiest notion how it worked.
Dr. Thompson peered inside his perfect offspring to gain insight into its methods, but what he found inside was baffling. The plucky chip was utilizing only thirty-seven of its one hundred logic gates, and most of them were arranged in a curious collection of feedback loops. Five individual logic cells were functionally disconnected from the rest-- with no pathways that would allow them to influence the output-- yet when the researcher disabled any one of them the chip lost its ability to discriminate the tones. Furthermore, the final program did not work reliably when it was loaded onto other FPGAs of the same type.
It seems that evolution had not merely selected the best code for the task, it had also advocated those programs which took advantage of the electromagnetic quirks of that specific microchip environment. The five separate logic cells were clearly crucial to the chip's operation, but they were interacting with the main circuitry through some unorthodox method-- most likely via the subtle magnetic fields that are created when electrons flow through circuitry, an effect known as magnetic flux. There was also evidence that the circuit was not relying solely on the transistors' absolute ON and OFF positions like a typical chip; it was capitalizing upon analogue shades of gray along with the digital black and white.
Today, researchers are just beginning to explore the real-world potential of evolving circuitry. Engineers are experimenting with rudimentary adaptive hardware systems which marry evolvable chips to conventional equipment. Such hybrids quickly adapt to new demands by constantly evolving and adjusting their control code. The space exploration industry is intrigued by the technology-- an evolving system could dynamically reprogram itself to avoid any circuits damaged by radiation, reducing the need for heavy shielding and redundant systems. Similarly, researchers speculate that robots might one day use evolution-inspired systems to quickly adapt to unforeseen obstacles in their environment.
Modern supercomputers are also contributing to artificial evolution, applying their massive processing power to develop simulated prototypes. The initial designs are thoroughly tested within carefully crafted virtual environments, and the best candidates are used to breed successive batches until a satisfactory solution has evolved. These last-generation designs are then fabricated and tested in the real world. NASA recently used this approach to produce the antenna for a spacegoing vessel, resulting in flamboyant-yet-effective shapes that vaguely resemble organic lifeforms-- unlike anything an engineer would design without the benefit of mood-altering drugs. Scientists hope to eventually use genetic algorithms to improve complex devices such as motors and rockets, but progress is dependent upon the development of extremely accurate simulations.
These evolutionary computer systems may almost appear to demonstrate a kind of sentience as they dispense graceful solutions to complex problems. But this apparent intelligence is an illusion caused by the fact that the overwhelming majority of design variations tested by the system-- most of them appallingly unfit for the task-- are never revealed. According to current understanding, even the most advanced microchips fall far short of the resources necessary to host legitimate intelligence. On the other hand, at one time many engineers might have insisted that it's impossible to train an unclocked 10x10 FPGA to distinguish between two distinct audio tones.
There is also an ethical conundrum regarding the notion that human lives may one day depend upon these incomprehensible systems. There is concern that a dormant "gene" in a medical system or flight control program might express itself without warning, sending the mutant software on an unpredictable rampage. Similarly, poorly defined criteria might allow a self-adapting system to explore dangerous options in its single-minded thrust towards efficiency, placing human lives in peril. Only time and testing will determine whether these risks can be mitigated.
If evolvable hardware passes muster, the Sussex circuits may pave the way for a new kind of computing. Given a sufficiently well-endowed Field-Programmable Gate Array and a few thousand exchanges of genetic material, there are few computational roles that these young and flexible microchips will be unable to satisfy. While today's computers politely use programmed instructions to solve predictable problems, these adaptable alternatives may one day strip away such limits and lay bare the elegant solutions that the human mind is reluctant-- or powerless-- to conceive on its own.