The Race for The Ultimate Supercomputer

TECHNOLOGY

NO WRITER ATTRIBUTED

IN JUST OVER thirty-two hours the future of computer development may have moved from strategic blueprints on the desks of high-tech executives to a nationwide trend mapped in stone. A sixty-nine digit number--the last in a century-old list of seemingly unfactorable numbers composed by a famous French mathematician--was broken down by a Cray supercomputer. The implications are revolutionary. While the factorization of the number, more simply known as 2^251 - 1, relied only on a sleek algorithm rather than any fundamental breakthrough, it signaled the ever-growing importance of ultra-sophisticated computers.

No longer is it sufficient that we have computers which can replace a horde of mathematicians in both speed and accuracy, or robots which render human laborers "impediments" to progress. The "new generation of supercomputers," as they have been dubbed, will make even the supercomputer which solved the sixty-nine digit puzzle look like a pocket calculator.

A handful of nations around the world have entered a high-stakes race to develop faster and more efficient supercomputers. Ten billion dollars has been committed to such research by industries around the world. The investment, however, is only a fraction of the envisioned spoils; the winner will corner a five-hundred-billion-dollar-a-year market in information-age business. The ultimate prize is even greater: the development of artificial intelligence.

The future of computers will be determined by the degree of success scientists enjoy with the concept of "parallel processing." Instead of directing all the computer's labor through a central unit, the work will be divided among many data processing units. The departure from the "von Neumann architecture," named for its originator, will enable programmers to avoid a "bottleneck." The rewards could be substantial, since the "von Neumann bottleneck" has acted as a traffic jam, severely restricting the flow of information through the single processor.
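To make the contrast concrete, here is a minimal sketch in modern Python (purely illustrative; the machines discussed in this article predate the language, and the data and function are invented for the example): the same computation is done once by a single unit and once by several worker processes, each handling its own slice of the data.

```python
# Illustrative only: one "central unit" versus work divided among several
# processing units. The data and function are made up for the example.
from multiprocessing import Pool

def partial_sum(chunk):
    # Each processing unit sums its own slice of the data.
    return sum(x * x for x in chunk)

if __name__ == "__main__":
    data = list(range(1_000_000))

    # The "von Neumann" path: every value funnels through one unit.
    serial_total = partial_sum(data)

    # The parallel path: divide the data among several units and combine.
    n_workers = 4
    chunks = [data[i::n_workers] for i in range(n_workers)]
    with Pool(n_workers) as pool:
        parallel_total = sum(pool.map(partial_sum, chunks))

    print(serial_total == parallel_total)  # same answer, work shared four ways
```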

Parallel processing, however, is not without drawbacks. Without proper programming, the separate units can grind to a halt in a "deadly embrace," each waiting for information still held by another unit. Experts in parallelism have nevertheless claimed to be approaching a processing speed that exceeds the fastest theoretical rate possible with a "von Neumann" processor by a factor of ten.
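The "deadly embrace" is what is now called deadlock. The toy sketch below (again modern Python, invented purely for illustration) shows two units that each hold one resource and wait on the other's; timeouts are used only so the demonstration terminates instead of hanging.

```python
# Two units each acquire one lock, then wait for the other's lock: a deadly
# embrace. The timeout exists only so this demonstration finishes.
import threading
import time

lock_a, lock_b = threading.Lock(), threading.Lock()

def unit(first, second, name):
    with first:
        time.sleep(0.1)                  # let both units grab their first lock
        if second.acquire(timeout=1):    # would wait forever without the timeout
            second.release()
            print(f"{name}: finished")
        else:
            print(f"{name}: stuck waiting -- deadly embrace")

t1 = threading.Thread(target=unit, args=(lock_a, lock_b, "unit 1"))
t2 = threading.Thread(target=unit, args=(lock_b, lock_a, "unit 2"))
t1.start(); t2.start()
t1.join(); t2.join()
```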

Critics fear the new trend will culminate in the development of an all-encompassing brain, and such worries may not be pure speculation. Nobel laureate Herbert Simon, professor of computer science and psychology at Carnegie-Mellon, sees no restrictions on the science and believes that human intelligence will one day be recreated. Yet if an understanding is what onlookers seek, they had best concentrate on the reasoning that spawned such efforts rather than on the possible realization of science fiction folklore.

Before an electronic brain could be perfected, it would need the capacity to distinguish between the most minute nuances of language and psychology. A common example: to translate the phrase "Mary had a little lamb," a computer must distinguish between twenty-eight meanings such as "Mary owned the lamb," "Mary ate the lamb," "Mary gave birth to a lamb," or "Mary engaged in sexual activity with a little lamb." Because no researcher has yet solved this problem, society, at least for a time, is still safe from an electronic "Big Brother."
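A toy sketch (modern Python, invented solely for illustration) of why the sentence stumps a literal-minded machine: a program can enumerate candidate readings of "had," but nothing in the sentence itself says which one applies.

```python
# The verb "had" maps to many senses; the sentence alone cannot choose one.
# This sense list is a small, made-up sample of the readings mentioned above.
senses_of_had = {
    "owned":      "Mary owned the lamb",
    "ate":        "Mary ate the lamb",
    "gave birth": "Mary gave birth to a lamb",
}

sentence = "Mary had a little lamb"
print(f'"{sentence}" admits at least {len(senses_of_had)} readings:')
for reading in senses_of_had.values():
    print(" -", reading)
# Choosing among the readings requires context and world knowledge that the
# sentence does not supply -- which is the unsolved problem the article cites.
```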

THE CURRENT THREAT, though, is not that our government will develop such technology, but that it will not. The Japanese "National Superspeed Computer Project" and "Fifth Generation Computer Project" threaten to erase the supremacy of the United States in almost all areas of research. Already the Japanese boast a computer that is a thousand times faster than the speediest American model.

Supercomputers today are responsible for the derivation of drugs from quantum theory rather than trial and error, the simulation of auto and aircraft performance in place of test tracks and wind tunnels, and the exploration of oil deposits as an alternative to extensive surveys. And the issue is no longer one of monetary profit alone; the risk is national security.

American hopes are directed at a vast array of government and private projects investigating parallelism and the supercomputers of the future, but nowhere is the effort more visible than where it is most crucial. The Pentagon's Defense Advanced Research Projects Agency (DARPA) has been designed to stimulate research that will ensure national security and perhaps the "Star Wars" defense advocated by President Reagan.

Consider the problem of the sixty-nine digit number. The Defense Department long ago created a coding system founded on eighty-digit numbers. If those numbers can be broken down into their factors, the code can be broken. A new generation of superfast computers could break such a code in a matter of hours.
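A minimal sketch of the attack being described, in modern Python and on a deliberately tiny example (the toy number and the use of Pollard's rho are illustrative choices, not details from the article): recover the prime factors of a composite "key" number. The eighty-digit numbers in question were chosen precisely because factoring them was expected to take impractically long.

```python
# Factoring a composite number recovers the secret structure a factoring-based
# code relies on. Pollard's rho handles this toy example instantly; the real
# eighty-digit numbers were chosen to make the same step impractically slow.
from math import gcd

def pollard_rho(n, c=1):
    """Return a nontrivial factor of a composite n."""
    if n % 2 == 0:
        return 2
    x = y = 2
    d = 1
    while d == 1:
        x = (x * x + c) % n          # "tortoise" takes one step
        y = (y * y + c) % n          # "hare" takes two steps
        y = (y * y + c) % n
        d = gcd(abs(x - y), n)
    if d == n:                       # rare failure: retry with a different constant
        return pollard_rho(n, c + 1)
    return d

n = 1_000_000_007 * 1_234_567_891   # a toy stand-in for an eighty-digit key number
p = pollard_rho(n)
print(p, n // p)                     # the two prime factors
```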

The race to build the most rapid and efficient microprocessor is on, and the winner will soon pull far ahead of the pack of nations scrambling for the outdated pieces of technology left behind.
