IEEE, Bell Labs Honor Seven Groundbreaking Innovations

Bell Labs is already one of the most decorated research institutions in the world, but in its centennial year the organization hoped to add more honors to burnish its reputation as one of the leading centers of technical innovation.

On 21 October, IEEE representatives, Nokia Bell Labs leaders, and alumni of the storied institution gathered to celebrate seven technological achievements recognized as IEEE Milestones: molecular beam epitaxy, the charge-coupled device, super-resolved fluorescence microscopy, the fractional quantum Hall effect, convolutional neural networks, the Echo project, and the Bellmac-32 microprocessor.

The large number of milestones granted at once stems from an extraordinary effort to secure the recognitions during Bell Labs’ 100th anniversary year, which IEEE Fellow Peter Vetter, president of Nokia Bell Labs core research, told attendees was always intended as a full 12 months of celebration.

Speakers emphasized that celebrating such history inspires today’s—and tomorrow’s—engineers.

“History gives us context,” IEEE President Kathleen Kramer said. “It reminds us why we do what we do.”

Theodore Sizer, Nokia Bell Labs executive vice president, said of the recognition, “We are also here to celebrate the 100 years ahead of us.”

Presenters at the event acknowledged the outsize role Bell Labs has played in the development of many technologies, noting that it helped make IEEE Region 1—the Eastern United States—a powerhouse of innovation. Seventy of the 279 IEEE Milestones granted to date honor technologies developed in Region 1, noted its director, Bala Prasanna, an IEEE life senior member.

“Bell Labs stands at the heart of that legacy,” Prasanna said.

IEEE Life Member Emad Farag, chair of the IEEE North Jersey Section, said, “This section has given birth to technology that is at the heart of modern life.”

Molecular beam epitaxy

The high-purity crystal growth process called molecular beam epitaxy (MBE) was developed by IEEE Fellow Alfred Y. Cho in the 1960s. Used to grow thin films of crystal atop one another, the process makes possible high-electron mobility transistors, vertical-cavity surface-emitting lasers (VCSELs), and other technologies.

With MBE, ultrapure elements such as gallium and arsenic are heated within the side compartments of a vacuum chamber. Inside the chamber sits a heated target semiconductor wafer. The heated elements evaporate or sublimate, forming beams of atoms and molecules that travel to the target wafer, where they attach and combine, slowly growing into a layer of crystal.

“It sounds straightforward, but it’s difficult to get it right,” said IEEE Fellow David Nielson, group leader for optical transmission at Bell Labs. “The thermodynamics going on at the surface in MBE is incredibly complex.”

VCSELs are dependent on MBE, Nielson noted. They rely on multiple vertical semiconductor layers to form internal reflectors and other structures. VCSELs are key to the facial recognition systems used to unlock smartphones today. The tiny array of lasers paints your face with a pattern of dots to create a 3D map.

Because MBE happens one atomic layer at a time and under clean-room conditions, it gives scientists unprecedented control over the thickness, composition, and purity of each layer—similar to 3D printing but at the nanometer scale, according to the University of Iowa physics and astronomy department’s MBE Lab.

Building up enough layers to make a useful device—a process that happens at the glacial pace of 1 micrometer (or less) per hour—was a test of Bell Labs scientists’ patience and determination, Nielson said.
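The arithmetic behind that glacial pace can be sketched with illustrative numbers (the monolayer thickness below is a typical textbook figure for gallium arsenide, not taken from the article):

```python
# Rough arithmetic behind MBE's pace: a GaAs monolayer is roughly 0.28 nm
# thick, so growing 1 micrometer of material per hour means laying down
# on the order of one atomic layer per second.
monolayer_nm = 0.28          # approximate GaAs monolayer thickness (assumed)
growth_nm_per_hour = 1000.0  # 1 micrometer per hour, as stated in the text

layers_per_hour = growth_nm_per_hour / monolayer_nm
print(f"~{layers_per_hour:.0f} monolayers per hour "
      f"(~{layers_per_hour / 3600:.1f} per second)")
```

At that rate, a device stack a few micrometers thick ties up the machine for an entire working day, which is where the evenings come in.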

“In the lab, we used to say MBE didn’t just stand for molecular beam epitaxy; it also meant many boring evenings,” he joked.

The scientists’ steadfast attention and patience paid off.

“It unlocked all sorts of new materials,” Nielson said. “It allows you to build materials that don’t naturally exist. Some of the impacts in the scientific domain include fractional quantum Hall effects—another Bell Labs innovation being celebrated today.”

As Cho recounted in a 2010 interview for the Computer History Museum’s oral history series, he began working at the Murray Hill facility in 1968. His colleague John R. Arthur Jr. soon proposed a new approach to fine-tuning the semiconductor formulations: Evaporate pure elements such as gallium and arsenic in an ultrahigh vacuum, then let the resulting molecular beams travel unimpeded, allowing them to condense on a heated crystalline substrate. Cho said in the oral history that Arthur’s idea inspired him to connect insights gleaned from research papers, lectures, and his own graduate work.

When asked how he invented what became known as MBE, he described it as “linking ideas from one field to another to create something entirely new.”

Cho understood how early effusion cells (the heated crucibles in which source elements are warmed until they evaporate or sublime into beams of molecules or atoms) and cesium ion emitters (which improve the orderliness of the atomic layering) worked in an ultrahigh vacuum.

He applied that knowledge, along with his background in surface physics—the understanding of how to monitor and assess the quality of the atomic layers through electron diffraction and how to remove oxides to clean surfaces—to the growth of semiconductor materials. By connecting surface physics, ion engines, and crystal growth, he helped create a new field, he said in the oral history.

“History gives us context. It reminds us why we do what we do.” —IEEE President Kathleen Kramer

By the end of 1968, he and Arthur had built the first experimental MBE system. Their 1969 Bell Labs technical memo and follow-up Applied Physics Letters paper documented the first high-quality gallium arsenide layers with atomically sharp interfaces—something no other technique could achieve. What astonished their colleagues was the repeatability: By controlling shutter timing, temperature, and beam flux—the rate at which elements evaporate and their atoms flow toward the target wafer—they could reproduce identical structures repeatedly.

The invention had all the hallmarks of the Bell Labs tradition: a simple question pursued with rigor, a culture that valued exploration over deadlines, and an audacious belief that even the smallest layer of matter could be engineered to perfection.

The IEEE Milestone plaque honoring MBE reads:

“In 1968–1970, molecular beam epitaxy (MBE) techniques using reflection high-energy electron diffraction for growing epitaxial compound semiconductor films were introduced. MBE deposits single-crystal structures one atomic layer at a time, creating materials that cannot be duplicated through other known techniques. This precise crystal growth method revolutionized the fabrication of semiconductor devices, quantum structures, and electronic devices, including lasers for reading and writing optical disc media.”

Charge-coupled device

In 1969 two Bell Labs physicists and IEEE Life Fellows—Willard S. Boyle and George E. Smith—scribbled an idea on a blackboard that would quietly reshape the way the world records light. Their concept, sketched during an hour-long conversation, would become the charge-coupled device, or CCD—a breakthrough that, as Scientific American noted in its February 1974 issue, seemed poised to improve TV cameras and astronomical imaging. It eventually ushered in the digital photography revolution and changed how scientists see the universe.

At the time, Bell Labs was in one of its most fertile phases, having already given the world the transistor, the laser, and information theory. The company was turning its attention to solid-state imaging and memory—technologies it hoped might one day support the burgeoning field of electronic communications. Boyle, then head of the device concepts department, and Smith, a physicist known for his intuitive design skills, were exploring how to create a new kind of semiconductor memory.

The spark came partly from internal competition. As Smith recalled during his Nobel lecture, Bell Labs’ electronics division had two groups: Boyle’s semiconductor department and another department, which handled all other technologies. Under pressure to advance magnetic bubble memory, vice president Jack Morton urged Boyle’s group to develop a competing semiconductor device or see resources shift to the other group.

“To address this demand, on October 17, 1969, Bill and I got together in his office,” Smith later explained. “In a discussion lasting not much more than an hour, the basic structure of the CCD was sketched out on the blackboard, the principles of operation defined, and some preliminary ideas concerning applications were developed,” he said.

According to Bell Labs’ internal technical reports, the essence of their idea was a grid of capacitors that could hold and shift electrical charges, one to the next, in a controlled sequence. The charge-coupled device would store data.
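The charge-shifting scheme can be pictured as a "bucket brigade." The sketch below is a minimal toy model of that idea, not Bell Labs' actual circuit: each clock cycle moves every packet of charge one well along the register toward a readout node.

```python
# Toy model of a CCD shift register: a list of capacitor "wells," each
# holding a packet of photo-generated charge. One clock cycle shifts every
# packet one well toward the output, where it is read off.
def shift_and_read(wells):
    """Shift all charge packets one well to the right; return the packet
    reaching the readout node plus the new register contents."""
    readout = wells[-1]          # charge arriving at the output node
    shifted = [0] + wells[:-1]   # every packet moves one well along
    return readout, shifted

wells = [5, 0, 9, 2]             # charge per pixel, proportional to light
samples = []
for _ in range(len(wells)):      # clock the register once per pixel
    out, wells = shift_and_read(wells)
    samples.append(out)

print(samples)  # → [2, 9, 0, 5]: charges emerge in order, last pixel first
```

Reading the whole array out through a single node this way is what made the device attractive first as a memory and then, once light was found to fill the wells, as an image sensor.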

The CCD’s image-capture capability was an accidental discovery, Sizer said during his presentation at the Milestone ceremony.

Boyle and Smith were testing the CCD for use as a memory circuit “when they noticed that light in the room flipped bits in the device,” Sizer said. “That accident connected light and information—and turned a memory circuit into an imaging sensor.”

“Today the essence of that blackboard sketch lives in every smartphone camera,” Sizer said. “The CCD turned light into data. It taught machines to see.”

Within weeks, Boyle and Smith had a working prototype, which under laboratory lamps produced a faint but discernible pattern—a “ghostly image,” as Smith later described it.

Bell Labs quickly organized teams to refine the fabrication process, improve signal-to-noise ratio, and explore an array of uses including in video cameras and data storage arrays.

Management appeared to recognize the potential almost immediately, though commercial products were still years away. As noted at the time by former Bell Labs president Mervin J. Kelly, the CCD fit squarely within the institution’s mission: pushing the frontiers of solid-state electronics to strengthen communication systems.

“AT&T’s Bell Labs News wrote that it could be used in a small color TV camera for future videophones—a remarkably clairvoyant prediction,” Sizer said.

By the mid-1970s, companies including Fairchild Semiconductor, RCA, and Sony had taken the concept further, producing the first CCD video cameras and astronomical imagers, according to the Digital Camera Museum.

The device soon found its way into camcorders, telescopes, fax machines, and medical instruments. By the 1990s, CCDs had become the gold standard for digital imaging.

When Boyle and Smith received the Nobel Prize in Physics in 2009, they credited the company’s culture for their success.

“Bell Labs gave us the freedom to think in any direction,” Smith said in an interview about the Nobel Prize. “That was its genius.”

The IEEE Milestone plaque honoring the CCD reads:

“The charge-coupled device (CCD), originally conceived for digital memory applications, was later shown to offer a compact, sensitive, and efficient way to convert light into digital signals by storing light-generated charges in a series of tiny capacitors. Invented and developed by Bell Labs scientists Willard Boyle, George Smith, and Michael Tompsett, CCDs found wide use in astronomical instruments, medical imaging, and consumer electronics.”

Super-resolution fluorescence microscopy

By the early 1990s, according to accounts from the Bell Labs archives and interviews published by the Nobel Foundation, Eric Betzig’s corner of the Bell Labs facility was alive with the hum of possibility. The work he began there would eventually lead to a 2014 Nobel Prize in Chemistry.

Fluorescence microscopy—a biologist’s window into the cell—had hit the diffraction limit of about 200 nanometers (or roughly half the wavelength of visible light), which had been postulated a century earlier by physicist Ernst Abbe. But Betzig suspected there was a way around it. His idea was radical for its time: If a single fluorescent molecule could be detected, he theorized, then perhaps an image could be built one molecule at a time, with each point localized far more precisely than the laws of optics previously seemed to allow.
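A quick back-of-envelope check shows where the ~200-nanometer figure comes from. Abbe's limit is roughly the wavelength divided by twice the numerical aperture of the objective; the wavelength and aperture below are typical illustrative values, not numbers from the article:

```python
# Abbe's diffraction limit: d = wavelength / (2 * NA).
wavelength_nm = 520        # green light, assumed for illustration
numerical_aperture = 1.4   # typical high-end oil-immersion objective (assumed)

d_nm = wavelength_nm / (2 * numerical_aperture)
print(f"Abbe limit: {d_nm:.0f} nm")  # → Abbe limit: 186 nm
```

Any two fluorescent molecules closer together than that blur into a single spot, which is exactly the barrier Betzig set out to sidestep.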

Bell Labs continued to evolve through the 1990s, yet remained one of the world’s great research institutions. The breakup of AT&T ushered in a more commercially aware era. As a result, researchers were asked to balance blue-sky curiosity with a clearer line of sight to practical applications.

For Betzig and other researchers, whose passion leaned toward fundamental physics rather than communications or materials science, that balance was hard to strike, according to a 2012 Time magazine article written by Jon Gertner, adapted from his book The Idea Factory: Bell Labs and the Great Age of American Innovation.

The lab did not become hostile to discovery. Far from it. But management steered toward projects that promised tangible short-term returns in telecommunications and optoelectronics, Gertner said.

Betzig’s work on single-molecule fluorescence, while elegant, was difficult to justify within the emerging priorities. Over time, he felt his path diverging from that of the company.

“It wasn’t that they were wrong,” he said in a 2014 Nobel interview with the Royal Swedish Academy of Sciences. “Just that my interests no longer fit.”

After demonstrating single-molecule imaging in 1993, as documented in his paper in Optics Letters that year, Betzig found himself at a crossroads. Rather than retool his research to fit Bell Labs’ shifting agenda, he chose to step away. He left in 1995 to work at his father’s machine shop in Michigan—a move described in a September 2015 New York Times profile.

“In a discussion lasting not much more than an hour, the basic structure of the CCD was sketched out on the blackboard, the principles of operation defined, and some preliminary ideas concerning applications were developed.” —George E. Smith, 2009 Physics Nobel laureate

The story might have ended there if not for another promising physicist determined to break through Abbe’s theoretical boundary. Stefan W. Hell, an IEEE member, began publishing papers describing his stimulated emission depletion (STED) microscopy technique. It used one laser to make fluorescent molecules glow and a second, donut-shaped laser to suppress fluorescence everywhere except a nanometer-scale central point, letting microscopes resolve features much smaller than half a wavelength.

Hell’s technique was among several advances in microscopy that spurred Betzig to resume his career in science. He joined the Howard Hughes Medical Institute’s Janelia Research Campus, in Ashburn, Va., where he continued his research.

Together with Harald Hess, another Bell Labs alumnus, Betzig developed a working prototype demonstrating the feasibility of his microscopy method, which he called photoactivated localization microscopy, or PALM. It broke through the diffraction limit by precisely mapping thousands of blinking molecules to reconstruct nanometer-scale images.
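The statistical trick behind localization can be sketched in a few lines. Assuming a Gaussian point-spread function and photon shot noise only (a standard textbook simplification, not PALM's full analysis), the center of a single molecule's blurred spot can be pinned down with precision of roughly sigma divided by the square root of the photon count:

```python
# A single molecule's image is a ~250 nm blur, but averaging the positions
# of N detected photons locates its center to ~ sigma / sqrt(N).
import random
import statistics

random.seed(42)
sigma_nm = 125.0          # PSF standard deviation (~250 nm spot), assumed
true_center_nm = 1000.0   # hypothetical molecule position
n_photons = 10_000

photons = [random.gauss(true_center_nm, sigma_nm) for _ in range(n_photons)]
estimate = statistics.fmean(photons)      # centroid localization
precision = sigma_nm / n_photons ** 0.5   # ~1.25 nm for 10,000 photons

print(f"estimated center: {estimate:.1f} nm (precision ~{precision:.2f} nm)")
```

Repeat that for thousands of molecules switched on a few at a time, and the accumulated positions form an image roughly a hundred times sharper than the diffraction limit allows.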

Betzig shared the 2014 Nobel Prize in Chemistry for that work with Hell and IEEE Life Senior Member William E. Moerner. In 1989, while working at IBM’s Almaden Research Center in Silicon Valley, Moerner achieved the first optical detection of a single molecule.

For Betzig, the award was a reflection of Bell Labs’ enduring legacy—and the kind of deep, foundational curiosity it instilled in generations of scientists.

“Bell Labs taught me how to think,” he said in his Nobel Foundation biography and in interviews with The Washington Post. “Even after I left, I was still one of theirs.”

The IEEE Milestone plaque honoring super-resolution fluorescence microscopy reads:

“The first super-resolution image of a biological sample was obtained in 1992 by exciting and collecting light diffracted in the near field of the sample. This breakthrough achievement, called super-resolved fluorescence microscopy, exploited the properties of evanescent waves and made single-molecule microscopy possible. Its successful use in imaging single fluorophores inspired applications in cell biology, microbiology, and neurobiology.”

Fractional quantum Hall effect

In early 1982, in a low-temperature laboratory at Bell Labs, physicist Horst L. Störmer watched a set of electrical traces appear on an oscilloscope that defied every expectation. The measurements were taken from a wafer of gallium arsenide cooled to a few thousandths of a degree above absolute zero and placed in a powerful magnetic field. The pattern that emerged showed “beautiful, clean plateaus in Hall resistance, but at fractional values of e²/h,” according to Störmer’s 1998 Nobel lecture. In that expression, e is the electron’s charge and h is Planck’s constant, the size of the smallest possible discrete packet of energy at atomic and subatomic scales.

To Störmer and his colleague Daniel C. Tsui, it was a moment both thrilling and disorienting. The electrons should have behaved like independent particles. Instead they were somehow acting as if they had split into smaller, correlated entities: quasiparticles with fractional charge. The phenomenon had no place in classical theory—at least not yet.

The discovery of the fractional quantum Hall effect (FQHE) led to “the development of new theoretical concepts of significance in many branches of modern physics,” as stated by the Royal Swedish Academy of Sciences in the news release announcing that Störmer and Tsui had been named Nobel laureates. As chronicled in the Bell Labs Technical Journal and the Nobel Foundation’s background material about the technology, FQHE emerged from the collaborative environment at Bell Labs.

Störmer joined the company in 1977 to study high-mobility two-dimensional electron systems—structures made possible by molecular beam epitaxy. The exquisitely pure gallium arsenide/aluminum–gallium arsenide heterostructures allowed electrons to move almost without scattering, making them ideal playgrounds for exploring quantum phenomena.

Working with Tsui, who had a well-honed feel for experimentation, Störmer began studying how the two-dimensional electron gases behaved under magnetic fields of several teslas. In 1980 Klaus von Klitzing at the Max Planck Institute for Solid State Research, in Stuttgart, Germany, discovered the integer quantum Hall effect. Von Klitzing showed that the Hall conductance, instead of varying smoothly with the magnetic field, forms plateaus at precise, quantized values: integer multiples of e²/h. The discovery earned him the 1985 Nobel Prize in Physics.

Störmer and Tsui reported in a 1982 Physical Review Letters paper, “Two-Dimensional Magnetotransport in the Extreme Quantum Limit,” that the plateaus appeared not just at integers but at simple fractions such as one-third. Something entirely new was happening.
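The quantized values themselves follow directly from the fundamental constants. This quick check uses the exact SI values of h and e; the one-third state sits at a Hall resistance three times the integer plateau:

```python
# Hall resistance plateaus sit at h / (nu * e^2) for filling factor nu.
h = 6.62607015e-34   # Planck constant, J*s (exact SI value)
e = 1.602176634e-19  # elementary charge, C (exact SI value)

r_k = h / e**2                 # nu = 1: the von Klitzing constant
r_third = h / ((1 / 3) * e**2) # nu = 1/3: the fractional state Tsui and
                               # Störmer observed, three times larger

print(f"h/e^2     = {r_k:.1f} ohms")      # → ~25812.8 ohms
print(f"h/(e^2/3) = {r_third:.1f} ohms")  # → ~77438.4 ohms
```

The plateau at one-third filling is what signaled quasiparticles carrying a third of the electron's charge.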

At first, neither Störmer nor Tsui could believe the measurements; the results did not conform to existing theories, according to the news release announcing that they had been named Nobel laureates. Yet repeated experiments confirmed them.

Within weeks, the pair had a preprint ready for Physical Review Letters. It was published in November 1982.

The theoretical explanation came soon after, from Robert B. Laughlin, then a young theorist at Lawrence Livermore National Laboratory. In a landmark 1983 Physical Review Letters paper, Laughlin explained what the Bell Labs researchers were seeing in their experiments. He proposed that under extreme magnetic fields and low temperatures, electrons could condense into a new collective quantum state, a “liquidlike state of matter,” supporting quasiparticles that carry a fraction of the electron’s charge. Laughlin’s elegant wave function not only explained the 1/3 state but also predicted an entire family of fractional states, all later confirmed experimentally.

The work exemplified the Bell Labs ecosystem at its best: precision materials from Cho’s MBE group, cryogenic measurement expertise from the low-temperature labs, and an atmosphere that encouraged cross-disciplinary risk-taking.

“We were never told to stop,” Störmer recalled in a Physics World interview.

Störmer, Tsui, and Laughlin shared the 1998 Nobel Prize in Physics for their discovery and theoretical explanation of the FQHE.

The IEEE Milestone plaque honoring the discovery of the FQHE reads:

“In 1982 Bell Labs researchers revealed a new phase of matter, an incompressible quantum fluid that supports fractional charges. Daniel Tsui and Horst Störmer experimentally observed this result in two-dimensional electron systems confined within gallium arsenide heterostructures engineered by Arthur Gossard. This discovery, named the fractional quantum Hall effect (FQHE), transformed key concepts in physics while opening new directions in quantum computation and other potential applications.”

Convolutional neural networks

In the late 1980s, when most of the artificial intelligence community had grown disenchanted with neural networks, a small group of researchers at the Bell Labs facility in Holmdel, N.J., would not let the idea die. Their goal was deceptively simple: Teach computers to see as humans do by recognizing patterns in raw pixels.

The U.S. Postal Service was looking for a faster, more accurate way to read handwritten ZIP codes. Yann LeCun’s Bell Labs team trained a neural network on thousands of digit samples with varying slants and handwriting pressure. By the early 1990s, the team had built a prototype that matched human-level digit-reading accuracy.

The technology behind it—convolutional neural networks (CNNs)—was inspired by the human visual cortex. As LeCun explained in his 1998 Proceedings of the IEEE paper, “Gradient-Based Learning Applied to Document Recognition,” CNNs learn their filters directly from images through the mathematical operation of convolution. The idea drew on earlier work by researcher Kunihiko Fukushima, whose 1980 “neocognitron” model proposed a similar layered structure. LeCun frequently credited Fukushima as an influence, but his Bell Labs team made the concept practical.
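The convolution operation itself is compact enough to sketch. The toy example below (a hand-picked edge-detecting filter, not a learned LeNet filter) slides one small kernel across an image, the key economy of CNNs: one detector, reused at every position.

```python
# Minimal 2D "valid" convolution: slide the kernel over the image and take
# a weighted sum at each position. CNNs learn the kernel weights; here we
# hand-pick a vertical-edge detector for illustration.
def conv2d(image, kernel):
    kh, kw = len(kernel), len(kernel[0])
    out_h = len(image) - kh + 1
    out_w = len(image[0]) - kw + 1
    return [[sum(image[i + di][j + dj] * kernel[di][dj]
                 for di in range(kh) for dj in range(kw))
             for j in range(out_w)]
            for i in range(out_h)]

image = [[0, 0, 1, 1, 1]] * 3    # a dark-to-bright vertical step
kernel = [[-1, 0, 1]] * 3        # responds where intensity rises left-to-right

print(conv2d(image, kernel))     # → [[3, 3, 0]]: strong response at the
                                 #   edge, zero in the flat region
```

Stacking such filtered maps, with learned kernels and nonlinearities between layers, is what lets a network like LeNet turn raw pixels into digit classifications.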

Working with colleagues including Yoshua Bengio, LeCun implemented multilayer CNNs on state-of-the-art workstations and trained them using backpropagation, a technique formalized in a 1986 Nature paper coauthored by Geoffrey Hinton—the Nobel laureate under whom LeCun served as a postdoctoral researcher at the University of Toronto before joining Bell Labs.

By 1993, Bell Labs’ parent company, AT&T, had deployed CNN technology commercially in its check-sorting and mail-reading systems. Millions of envelopes were processed daily by CNN-enabled machines, according to Klover.ai’s history of the technology.

Despite that success, neural networks soon fell out of favor. As Communications of the ACM reported, limited data and computing power made newer methods, such as support vector machines, appear more effective. After Bell Labs’ 1996 spinoff into Lucent Technologies, research priorities shifted to short-term, market-driven goals.

Yet the intellectual groundwork endured. LeCun’s 1998 publication of LeNet-5 became a cornerstone for the next generation of AI researchers. When deep learning reemerged in the 2010s—fueled by powerful GPUs and vast image datasets—CNNs became the foundation of modern computer vision, enabling self-driving cars, advanced medical imaging, and smartphone cameras.

In 2018 LeCun, Bengio, and Hinton received the Turing Award—referred to as the “Nobel Prize of computing”—from the Association for Computing Machinery for their contributions to deep learning. By then, LeCun was a professor at New York University and director of Meta AI research—the Facebook parent company’s AI lab. He often credits Bell Labs as the place where the modern neural network learned to see.

The IEEE Milestone plaque honoring convolutional neural networks reads:

“In 1989 research on computational technologies at Bell Laboratories helped establish deep learning as a branch of artificial intelligence. Key efforts led by Yann LeCun developed the theory and practice of convolutional neural networks, which included methods of backpropagation, pruning, regularization, and self-supervised learning. Named LeNet, this deep neural network architecture advanced developments in computer vision, handwriting recognition, and pattern recognition.”

Previously publicized breakthroughs

Two additional innovations, the Echo project and the Bellmac-32 microprocessor, were honored with IEEE Milestone plaques at the October gathering. Stories of those inventions were detailed and celebrated this year in The Institute.

IEEE Life Fellow Sung-Mo “Steve” Kang, one of the lead developers of the Bellmac-32 microprocessor honored as an IEEE Milestone, gave a talk and answered questions about the 1980s-era chip. Photo: Ben Lowe

IEEE Life Fellow Sung-Mo “Steve” Kang, one of the lead engineers who worked on the development of the Bellmac-32—which pioneered CMOS chip architecture and featured several other firsts—spoke at the Milestone event.

The Bellmac-32 had 150,000 transistors—“massive for 1981,” Kang said. “Today, a student could do that in a semester with CAD tools, but at that time, it took a hundred engineers.”

Plaques recognizing the seven IEEE Milestones are displayed in the lobby at the Nokia Bell Labs facility in Murray Hill, N.J. The IEEE North Jersey Section sponsored the nominations.

Administered by the IEEE History Center and supported by donors, the Milestone program recognizes outstanding technical developments worldwide that are at least 25 years old.

IEEE.tv covered the Milestone dedication event and recorded the ceremony.
