The fuzzy sound of a distorted
electric guitar is, for rock fans, a thing of beauty. It has been
a staple of music since the early 1950s, when guitarists strained
the vacuum tubes inside their amplifiers to match the raw voices of
blues singers. Later generations relied on solid-state devices: “effects
pedals” with circuits built from silicon diodes and transistors.
But last year, a completely new source of distortion hit the market—one
that uses an electronic junction made from organic molecules.
Billed as the world’s first commercial device to
rely on this form of “molecular electronics”, the Heisenberg
Molecular Overdrive uses aromatic azo compounds strung between
two electrodes just a few nanometers apart. The resulting molecular junction
allows current to flow only once a threshold voltage is reached. Thanks to
the way electrons tunnel from one electrode to the other, the device clips
the peaks and troughs of the audio signal to produce a warm growl, with
fewer harsh high-frequency harmonics in the distortion than a conventional
silicon circuit delivers, a sound closer to the tube amps of old than to
silicon-based effects pedals. It is an unlikely pioneer in
a field that once hoped to reinvent the computer.
The Heisenberg Molecular Overdrive adds distortion
to electric guitar sounds. It gets its warm sound from molecular
junctions, built into chips on the effects pedal’s circuit
board. Credit: Adam Bergren.
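The clipping behavior described above can be sketched in a few lines of code. This is only an illustration of threshold-based soft clipping, not Nanolog’s actual junction response; the `soft_clip` function and its parameters are invented for the example.

```python
import numpy as np

def soft_clip(signal, threshold=0.5):
    """Illustrative soft clipper: signals below the threshold pass through
    unchanged; anything above it is squeezed smoothly toward a ceiling,
    rounding off peaks and troughs the way an overdriven tube amp does."""
    signal = np.asarray(signal, dtype=float)
    excess = np.abs(signal) - threshold
    # tanh gives a gradual "soft knee"; a hard min/max clamp would produce
    # the harsher high-frequency harmonics typical of simple diode clippers.
    squashed = threshold + (1 - threshold) * np.tanh(excess / (1 - threshold))
    return np.where(np.abs(signal) < threshold,
                    signal,
                    np.sign(signal) * squashed)

t = np.linspace(0, 0.01, 441)          # 10 ms of audio at 44.1 kHz
clean = np.sin(2 * np.pi * 440 * t)    # a pure 440 Hz tone
dirty = soft_clip(clean)               # peaks rounded off, staying below 1.0
```

The gradual knee is the point: gentler clipping concentrates the added energy in lower harmonics, which the ear hears as warmth rather than fizz.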
More than four
decades ago, researchers suggested that individual molecules could
act as components in electronic circuits, potentially offering the
ultimate in miniaturization. By the turn of the century, this vision
was seemingly becoming a reality: Science magazine
heralded the first molecular-scale circuits as its “Breakthrough of the Year” in
2001. And since then, researchers have created a bewildering array of
organic molecules and nanotubes that can act as diodes, transistors,
and memory devices.
And not a moment too soon, proponents said.
Our computers’ capabilities have blossomed thanks to the inexorable
shrinking of silicon devices, which has reliably doubled the number
of transistors that can fit on a chip every couple of years (a trend
known as Moore’s law, after Gordon Moore, the cofounder of
Intel). But transistors, a kind of electrical switch, were becoming
so tiny that electrons could leak through them even when they were
off, screwing up their performance and threatening to stall this remarkable
rise in computing power. Perhaps single molecules could take silicon’s
place and save the day?
Today, most researchers admit that is
unlikely to happen. The difficulties of building a practical computer
using molecular electronics—along with the microelectronics
industry’s success in carving ever-smaller features into silicon
devices and working around the electron leakage problem—have
forced the field to adjust its goals, says Mark A. Ratner of Northwestern University,
who first proposed a molecular diode with his colleague Ari Aviram back in 1974. This
may mean targeting niche applications like sensors or the Heisenberg
pedal—or harnessing single molecules as tools to explore the
fundamentals of electron behavior, which may yield unexpected strategies
for improving computing. Still, the idea of computing at a molecular
scale is not dead. Others are eyeing hybrid technologies that combine
silicon with molecules like carbon nanotubes, and using molecules
to perform logic and memory operations in ways that take advantage
of their chemical properties, rather than trying to bend their electronic
behavior to imitate silicon. “The field has had to reinvent
itself because of the continuing miniaturization of silicon”,
says Kevin F.
Kelly, a nanoimaging researcher at Rice University.
Focus on Fundamentals
Films of organic compounds are already used to make commercial
electronic devices such as light-emitting diodes, but these depend
on the behavior of the bulk material. In contrast, molecular electronics
aims to use the properties of individual molecules to control the
flow of charge between electrodes that are just nanometers apart.
That potentially offers a way to construct very densely packed circuits,
something that looked attractive when silicon transistors were still
thousands of times larger than a molecule. But today’s chips
use silicon transistors with features as small as 14 nm, and 10 nm
technology is just around the corner, considerably narrowing any benefit
molecular electronics might offer.
Moreover, few molecular electronic
devices can beat silicon on both energy efficiency and speed of charge
transport, a necessity for developing practical circuits that are
good enough to usurp a well-established industry: “So many
billions of dollars have been invested in silicon fabrication, nobody
wants to throw that away”, says Subhasish Mitra of
Stanford University.
Still, researchers have achieved some impressive
milestones. Last year, for example, Latha Venkataraman and Luis M. Campos of Columbia
University and Jeffrey B. Neaton
of Lawrence Berkeley National Laboratory unveiled what is arguably
the best single-molecule diode ever made. Diodes allow current to flow freely in one
direction but not in
the opposite direction, and the team’s thiophene-based device had a very high rectification
ratio of more than 200—meaning 200-fold more electrons flowed
in one direction than the other. The diode relies on a polar solvent,
which aligns the molecule’s orbitals so they produce the desired
electrical response.
Crucially, Venkataraman’s work has been consistently
reproducible, unlike a lot of previous single-molecule work, and has
revealed important details about electronic behavior. “She’s
one of the top three people in the world at doing these measurements”,
says Ratner.
Nongjian Tao of Arizona State University has made similar
fundamental discoveries that have addressed central problems in molecular
computing. In some molecular electronic devices, electrons can travel
from one electrode to the other by quantum tunneling, a kind of subatomic
disappearing act that allows them to dive right through a normally
insurmountable energy barrier. For longer molecules, though, the electron
can actually hop along the molecule in several steps, akin to conventional
conduction along a wire. Last year, Tao’s team found that switching
the order of base pairs in a DNA helix flipped the charge transport
mechanism from tunneling to hopping, and that in some circumstances the base-pair
swap produced an intermediate regime combining tunneling and hopping. Understanding exactly how electron
transport works in single molecules—and
how chemical structure, mechanical stress, temperature, and other
factors affect it—will be crucial for chemists trying to
design molecules with very specific electronic properties,
he says.
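As a rough guide (a standard textbook relation, not a result from Tao’s experiments), the two regimes scale very differently with molecular length L:

```latex
% Tunneling: conductance decays exponentially with molecular length
G_{\mathrm{tunnel}} \approx G_{0}\, e^{-\beta L}
% Hopping: conductance falls off only weakly, like a classical resistor
G_{\mathrm{hop}} \propto \frac{1}{L}
```

Measuring conductance as a function of length therefore reveals which mechanism dominates: an exponential drop signals tunneling, while a gentle decline signals hopping.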
And molecular electronics has led to computing breakthroughs,
at least indirectly. For example, fundamental studies of electron
behavior in molecules helped to inspire a new generation of memory
chip called resistive random access memory (RRAM). Researchers realized
that, in certain molecular electronic devices, current was not actually flowing through
the molecules. Instead it moved along nanoscale metal filaments that unexpectedly
grew across the gap between electrodes. Santa Clara-based company Crossbar has exploited
this effect to develop a type of RRAM that stores data using the presence
or absence of these filaments as the ones and zeros of binary code.
The company says that its RRAM operates faster and at lower voltages
than traditional flash memory, and began licensing the technology
to chipmakers earlier this year. “Developments in molecular
electronics definitely helped to develop resistive memory”,
says Victor
Zhirnov, chief scientist at the Semiconductor Research
Corp., an industry-funded research consortium.
Building a Better Circuit
Making smaller components is not the only way to improve on silicon
electronics, though. The semiconductor industry is desperate for technologies
that reduce the energy consumption of chips—for mobile devices
with limited battery lives, and for companies running vast clusters
of computers. Improving energy efficiency also means that components
generate less waste heat, so they can be packed more closely together
to make smaller devices.
Chipmakers would also like to increase
their circuits’ clock speeds—essentially the number
of instructions they can carry out per second—which have barely
risen over the past decade. That is partly due to the relatively poor
mobility of electrons within thin slivers of silicon, which limits how quickly a pulse
of charge moves through a transistor, delaying the stream of instructions.
Unfortunately, most materials either transmit charges quickly or
have low energy demands, but not both. Carbon nanotubes (CNTs) offer
a way out of this Catch-22 situation, because they are very thin, just one nanometer
or so across, and have an intrinsically
high charge mobility that can also be switched with a relatively small
voltage. Imagine that the transistor is a water hose, says Mitra: You would have to
stand on a fat hose with all your weight to staunch the flow, whereas a thin pipe
is much easier to pinch off.
In 2013, Mitra and colleagues including Max M. Shulaker, now at Massachusetts Institute
of Technology, unveiled the first CNT computer, built from 178 CNT transistors and
able to run simple programs.
Several techniques lay behind this success, including a circuit design strategy that
worked around problems with mispositioned CNTs, and a better method for
aligning and packing CNTs tightly so they carried a decent current.
“All too often they looked like a bowl of noodles”,
Mitra says of previous attempts. Instead, his team grew CNTs on a
quartz substrate before transferring them to silicon. By 2014, they could neatly pack
around 100 CNTs per micrometer inside their transistors, the first
such devices to match the overall performance of ones made of silicon.
Researchers have harnessed carbon nanotubes to build computer
circuits. This 100 mm wafer is patterned with circuit elements made of individual
carbon nanotubes. Layering nanotubes over silicon elements could improve chip
performance. Credit: Max M. Shulaker.
A decade ago, the
microelectronics industry had all but given up on CNTs. But experiments
such as Mitra’s are showing that problems with alignment and
nanotube imperfections can be overcome, and industry interest is growing
again. Last year, for example, a team from IBM unveiled a better way to bind CNTs
to metal electrodes, avoiding longstanding problems with
high resistance at the contact point.
The key is not to think CNTs will act alone, Mitra says.
“We’re not talking about throwing away silicon; we’re
talking about enhancing it.” A source of delay in circuits
is the time it takes to pass signals between components. One way to
reduce a signal’s transit time is to simply place those components
closer together. Stacking circuits in 3-D would be a big help. But
building a layer of silicon components takes high temperatures, enough
to melt parts of the layers beneath it. Mitra’s CNT circuits,
in contrast, are constructed at much lower temperatures, so they can
be safely stacked on top of silicon circuits, he says.
Memory Molecules
Molecules might not replace silicon processors, but they could
be the way forward for memory applications, some say. Silicon technology
is ideal for storing data that is needed in short order, because it
reads and writes data using speedy electrical currents. But silicon
memory is an expensive and bulky way to archive the exabytes of data
(10¹⁸ bytes) the world generates every day. One of the
most promising alternatives is to encode data in DNA by using patterns of base pairs
to
represent digital ones and zeros. Bits can be stored at such a high
density that one kilogram of DNA could meet the world’s entire
storage needs, Zhirnov estimates. “It’s the only solution
I can think of to support the data explosion”, he says. Stored
correctly, DNA is also extremely durable; and because it is the basis
of our own genome, the technology needed to read it—a DNA sequencer—is
unlikely to become obsolete in the future.
Over the past few
years, researchers have stored Shakespeare sonnets and entire
books in DNA. In July 2016, researchers at Microsoft and the University of Washington
set a new record when they stored 200 megabytes of data—including 100 books
and a high- definition music video—in a tiny smear of DNA that
contained more than 1.5 billion base pairs. Although that is half
the size of the human genome, data-storing DNA is built in short strands
that are easier to synthesize. The researchers add identifying sequences
to the ends of each 150-base-pair segment so that they can find specific
chunks of data. Reading back the data simply involves sequencing the
DNA, and glitches can be caught by error-correcting software, making
data recovery more reliable than sequencing a human genome with the
same methods.
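The basic encoding idea can be sketched as follows: two bits per base, with a short address tagged onto each segment. This is a simplified illustration only; the real Microsoft/University of Washington scheme is more elaborate (it avoids error-prone base repeats and builds in error correction), and the mappings and sizes here are invented for the example.

```python
# Two bits per base: four bases encode one byte.
BASE_FOR_BITS = {'00': 'A', '01': 'C', '10': 'G', '11': 'T'}
BITS_FOR_BASE = {base: bits for bits, base in BASE_FOR_BITS.items()}

def encode(data: bytes) -> str:
    """Turn bytes into a DNA strand, four bases per byte."""
    bits = ''.join(f'{byte:08b}' for byte in data)
    return ''.join(BASE_FOR_BITS[bits[i:i + 2]] for i in range(0, len(bits), 2))

def decode(strand: str) -> bytes:
    """Read a strand back into bytes, as a sequencer would."""
    bits = ''.join(BITS_FOR_BASE[base] for base in strand)
    return bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))

def to_segments(strand: str, payload: int = 150) -> list:
    """Split a long strand into short, synthesizable segments, each tagged
    with an 8-base address (an encoded 2-byte index) for reassembly."""
    n = -(-len(strand) // payload)  # ceiling division
    return [encode(i.to_bytes(2, 'big')) + strand[i * payload:(i + 1) * payload]
            for i in range(n)]

strand = encode(b'To be, or not to be')
assert decode(strand) == b'To be, or not to be'
```

The addresses matter because synthesized strands come back from the sequencer as an unordered soup; without an index on each fragment, the data could not be reassembled.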
This pink smear of DNA holds a record
200 megabytes of
data—including 100 books and a high-definition music video—encoded
in its base-pair sequence. Credit: Tara Brown Photography/University
of Washington.
The big challenges to deploying
DNA data storage are cost and speed. For archival storage, magnetic
tape is still the most cost-effective data storage medium, and although
slower than flash memory, it can still transfer hundreds of
megabytes per second. That would be equivalent to reading and writing
1 billion DNA bases per second, something that could only be achieved
with new chemistry in a massively parallel system.
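The equivalence quoted above follows from simple arithmetic, assuming tape moves about 250 MB per second (a round figure behind “hundreds of megabytes per second”) and each base stores two bits:

```python
# Back-of-envelope check: how many DNA bases per second would match
# the throughput of archival magnetic tape?
tape_bytes_per_s = 250e6     # assumed: ~250 MB/s tape transfer rate
bits_per_base = 2            # four bases, two bits each
bases_per_s = tape_bytes_per_s * 8 / bits_per_base
print(f'{bases_per_s:.0e} bases per second')  # 1e+09
```

A billion bases per second is several orders of magnitude beyond today’s synthesis and sequencing chemistry, which is why massive parallelism would be essential.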
Luis Ceze, part of the University of Washington team, does not claim that
DNA will replace the memory inside your computer. “But I’m
optimistic that we can replace archival media like magnetic tape”,
he says. Zhirnov predicts that researchers will build prototype DNA
memory devices in the next five years, and potentially create practical
devices by 2030. “DNA storage is not as far out as it might
seem”, he says.
A Way Forward
To prove the practical value of any molecular computing device, says Mitra,
researchers must also learn how to connect the components into functional
systems, a task that will require a lot more cross-disciplinary research
with computer system engineers.
Richard L. McCreery, who invented the molecular junctions at the heart of the Heisenberg
Molecular Overdrive, says that fundamental science such as Tao’s
and Venkataraman’s is still needed to provide a better understanding
of electronic behavior, something that will be crucial to discovering
a killer app for molecular electronics.
But in the meantime,
the University of Alberta nanotechnologist also hopes that his guitar
pedal, while clearly a niche application, will lend credibility to
the field. He and his colleague Adam Bergren last year formed Nanolog
Audio, a company that is now in talks with musical instrument maker
Roland to supply the molecular junctions for other effects circuits.
“Molecular electronics has to coexist with silicon, and like
any technology it’s got to establish that it’s got some
advantages”, McCreery says. “I’m sure it can
do that, but it’s going to take some very good research.”
Mark Peplow is a freelance contributor to Chemical & Engineering News, the weekly newsmagazine of the American Chemical Society.