Shannon centennial celebrants recall U-M grad’s advances, societal impact

The greater mathematical community hails University of Michigan alumnus Claude Shannon as the “Father of Information Theory” for revealing the potential of digital communications and inspiring the advances that followed.

But even Shannon himself couldn’t recall exactly where the theory originated.

“I wanted to work on information, the measurement of information,” said Shannon, who died in 2001. “I had already read (Ralph) Hartley’s paper … it had been an important influence in my life. I think I read it at the University of Michigan.”

This portrait of Claude Shannon was taken in 1957. (Photo courtesy of MIT Museum)

That was in the 1930s. After earning bachelor’s degrees in electrical engineering and mathematics at U-M in 1936, Shannon went on to graduate studies at the Massachusetts Institute of Technology and a fellowship at the Institute for Advanced Study in Princeton, New Jersey. There, he discussed his ideas with influential scientists and mathematicians and worked across disciplines before joining Bell Labs.

In 1948 he wrote his own influential paper, “A Mathematical Theory of Communication.”

Shannon showed that all information could be reduced to a binary code of zeros and ones, the insight that makes cell phones, email and the Internet possible.

This would change the world.

“Claude Shannon is the founder of the theory of information and communication. These contributions were singularly important in that they led directly to the digital revolution that powers our electronic world,” says Alfred Hero, the R. Jamison and Betty Williams Professor of Engineering at U-M and co-director of the Michigan Institute for Data Science.

To mark the centennial of his birth on April 30, institutions worldwide are organizing celebrations in Shannon’s honor. They include the Boston Museum of Science, Technische Universität Berlin, University of South Australia, University of Toronto, Chinese University of Hong Kong, Cairo University, Telecom ParisTech, National Technical University of Athens, Indian Institute of Technology Bombay, Nanyang Technological University, École Polytechnique Fédérale de Lausanne, University of Maryland, University of Illinois at Chicago, MIT, and the University of California, Los Angeles.

“At U of M we are planning a workshop in the fall on campus,” says Hero, who also is professor of electrical engineering and computer science, biomedical engineering, and statistics.

David Neuhoff (right), Joseph E. and Anne P. Rowe Professor of Electrical Engineering, led the effort to recognize Claude Shannon’s contributions to information theory by working with the IEEE Information Theory Society to commission statues in his honor, including one outside the Electrical Engineering and Computer Science Building at U-M. Alfred Hero (left), R. Jamison and Betty Williams Professor of Engineering and co-director of the Michigan Institute for Data Science, is organizing a Shannon centennial celebration at U-M in the fall. (Photo by Daryl Marshke, Michigan Photography)

Shannon also is recognized for his work on cryptography and the sampling theorem, and for connecting Boolean algebra to logic circuit design.
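Shannon made that last connection in his 1937 master’s thesis, which showed that networks of switches and relays obey the rules of Boolean algebra. The sketch below, in Python and with an entirely hypothetical choice of circuit, illustrates the idea: two switch arrangements can be proven equivalent by Boolean manipulation rather than by wiring them up.

```python
# A minimal sketch of the idea Shannon formalized in his 1937 master's thesis:
# switching circuits can be modeled, and simplified, with Boolean algebra.
# The specific circuit here is hypothetical, chosen only for illustration.
from itertools import product

def series(a, b):    # switches in series act like Boolean AND
    return a and b

def parallel(a, b):  # switches in parallel act like Boolean OR
    return a or b

def original_circuit(x, y, z):
    # (x AND y) OR (x AND z): two series branches wired in parallel, using x twice
    return parallel(series(x, y), series(x, z))

def simplified_circuit(x, y, z):
    # Boolean algebra gives xy + xz = x(y + z), so a single x switch suffices
    return series(x, parallel(y, z))

# Exhaustively check every input combination to confirm the circuits are equivalent.
for x, y, z in product([False, True], repeat=3):
    assert original_circuit(x, y, z) == simplified_circuit(x, y, z)
print("Both circuits implement the same switching function.")
```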

Shannon died Feb. 24, 2001, of Alzheimer’s disease at age 84. The New York Times called him a “playful genius who invented the bit, separated the medium from the message, and laid the foundations for all digital communications. (He) single-handedly laid down the general rules of modern information theory, creating the mathematical foundations for a technical revolution.”

Without his clarity of thought and sustained ability to work his way through intractable problems, such advances as email and the World Wide Web would not have been possible, the Times stated.

Praised for his work ethic, Shannon was also known to juggle while riding his unicycle in the halls of Bell Labs. An expert chess player and gambler, he designed machines that could play his favorite games, and even one that could juggle. He spoke of them in his 1985 speech accepting the Kyoto Prize for contributions to his field and to humanity, an equivalent of the Nobel Prize, Hero says. Shannon also won the National Medal of Science in 1966.

Claude who?

Born in 1916 in Petoskey, Michigan, Shannon grew up in nearby Gaylord where he played with radio sets and math puzzles, and looked up to Thomas Edison.

 “Today, the average person does not know what information theory is, or who Claude Shannon was. However, the impact of his work in our everyday life is very significant,” says Christina Fragouli, professor in the School of Electrical Engineering, UCLA. She also is an Institute of Electrical and Electronics Engineers (IEEE) fellow and co-chair of the IEEE’s Information Theory Society Claude Shannon centennial committee.

The society presents the annual Claude E. Shannon Award to honor consistent and profound contributions to the field of information theory. Each winner presents a Shannon Lecture at the IEEE International Symposium on Information Theory.

Shannon showed how information could be defined and quantified with precision by encoding it in binary digits, or bits, Graham P. Collins wrote in Scientific American. Shannon also analyzed the ability to send information through a communications channel. He found that every channel has a maximum transmission rate, its capacity, Collins wrote. Even over a noisy channel, nearly error-free communication is possible by keeping the transmission rate below the channel’s capacity and by using error-correcting schemes.
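As a concrete illustration of that result (not drawn from the article), the textbook binary symmetric channel flips each transmitted bit with probability p, and Shannon’s formula gives its capacity as C = 1 − H(p) bits per use, where H is the binary entropy. A short sketch:

```python
# Illustrative sketch, not from the article: Shannon's capacity formula for the
# textbook binary symmetric channel, C = 1 - H2(p), where H2 is the binary
# entropy and p is the probability that noise flips a transmitted bit.
import math

def binary_entropy(p):
    """Binary entropy H2(p) in bits; H2(0) = H2(1) = 0 by convention."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p):
    """Capacity in bits per channel use of a binary symmetric channel."""
    return 1.0 - binary_entropy(p)

# A noiseless channel carries a full bit per use; a 10 percent bit-flip channel
# carries less, but Shannon's theorem says any rate below that capacity can be
# made essentially error-free with suitable error-correcting codes.
for p in (0.0, 0.01, 0.1, 0.5):
    print(f"crossover p = {p:.2f}  ->  capacity = {bsc_capacity(p):.3f} bits/use")
```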

Claude Shannon appears on camera during a CBS telecast of “The Thinking Machine.” (Photo courtesy MIT Museum)

Joining the greats

Shannon was one of the greatest of the giants who created the information age, wrote M. Mitchell Waldrop in the MIT Technology Review a year after Shannon’s death.

“John von Neumann, Alan Turing (who met Shannon during World War II) and many other visionaries gave us computers that could process information. But it was Claude Shannon who gave us the modern concept of information — an intellectual leap that earns him a place on whatever high-tech equivalent of Mount Rushmore is one day established,” Waldrop wrote.

Today, information theory encompasses the design of compression, coding, signaling, detection and classification techniques that underlie contemporary information transmission, storage and processing technologies.

But the impact of Shannon’s work reaches beyond information theory to mathematics, physics, statistics, computing and cryptology, and even economics, biology, linguistics and other fields in the natural and social sciences, according to the Information Theory Society Centennial Committee.

Hero agrees. “His impact goes well beyond the technology of communication. The Shannon entropy is used in fields as disparate as computational linguistics, data mining, medical imaging and pattern recognition,” he says.
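For readers curious what that entropy measures, the short sketch below (an illustration, not taken from the article) estimates the Shannon entropy of a piece of text from its character frequencies, the kind of calculation that appears in computational linguistics and data mining.

```python
# A small illustration, not from the article: the Shannon entropy of a source,
# here estimated from character frequencies in a sample string. It measures the
# average information, in bits, carried by each symbol.
from collections import Counter
import math

def shannon_entropy(text):
    counts = Counter(text)
    total = len(text)
    # H = -sum(p * log2(p)) over the empirical symbol probabilities
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

sample = "information theory quantifies uncertainty"
print(f"Estimated entropy: {shannon_entropy(sample):.2f} bits per character")
```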

Shannon met his wife, Betty, when she was a numerical analyst at Bell Labs. They married in 1949 and had three children.

The IEEE Information Theory Society reports that during this centennial year, the Republic of Macedonia is planning a Shannon commemorative stamp. Also underway are a petition drive for a U.S. Postal Service stamp honoring Shannon and a documentary film about Shannon and the impact of information theory.
