Technology

The Future Of Computing

Everywhere but unseen

You are most likely reading this on a computer, and you probably take that fact for granted, even though the machine in front of you would have amazed computer experts just a few decades ago and would have seemed like pure magic much earlier. It contains billions of tiny computing elements running millions of lines of software written by countless people around the world. The result: you click, tap, type, or talk, and the outcome appears on the screen, seemingly effortlessly.

Computers once filled entire rooms. Now they are everywhere, hidden in watches, car engines, cameras, televisions, and toys. They manage electrical grids, analyze scientific data, and forecast the weather. The modern world would be unimaginable without them, and our reliance on them for health, prosperity, and entertainment will only grow.

One mill of the Analytical Engine

Charles Babbage proposed a programmable machine in 1833 with a “store” for holding numbers, a “mill” for operating on them (pictured), an instruction reader, and a printer.
© THE MUSEUM OF SCIENCE’S BOARD OF TRUSTEES

Scientists aspire to build even faster computers, smarter programs, and technology that is used ethically. But before looking ahead, let’s glance back at where we’ve come from.

In 1833, the English mathematician Charles Babbage devised a programmable machine that foreshadowed today’s computing architecture, with a “store” for holding numbers, a “mill” for operating on them, an instruction reader, and a printer. This Analytical Engine also had logical features such as branching (if X, then Y). Babbage built only a portion of the machine, but his friend Ada Lovelace recognized that the numbers it manipulated could represent anything, even music, making it far more versatile than a calculator. “A new, a vast, and a powerful language is developed for the future use of analysis,” she wrote. She became an expert in the proposed machine’s operation and is often called the “first programmer.”

Colossus machine

Colossus, the world’s first reliable electronic digital computer, was completed in 1943 and used by British intelligence to break codes during World War II.

In 1936, the English mathematician Alan Turing proposed the concept of a computer that could rewrite its own instructions, making it endlessly programmable. His mathematical abstraction could mimic any other machine using only a tiny repertoire of operations, earning it the name “universal Turing machine.”
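To make that “tiny repertoire of operations” concrete, here is a minimal sketch of a Turing machine in Python. The machine, its transition table, and the bit-flipping task are illustrative choices, not Turing’s own construction: the point is that reading a symbol, writing a symbol, moving the head, and switching state are the only operations it needs.

```python
# A minimal Turing machine sketch (illustrative, not Turing's original notation).
# The transition table maps (state, symbol) -> (symbol to write, move, next state).
# This sketch only grows the tape to the right, which is all the example needs.

def run_turing_machine(tape, transitions, state="scan", blank="_", max_steps=1000):
    tape = list(tape)
    head = 0
    for _ in range(max_steps):
        if state == "halt":
            return "".join(tape).rstrip(blank)
        if head >= len(tape):
            tape.append(blank)
        symbol = tape[head]
        write, move, state = transitions[(state, symbol)]
        tape[head] = write
        head += 1 if move == "R" else -1
    raise RuntimeError("machine did not halt within max_steps")

# One tiny machine: flip every bit, then halt when the blank past the input is reached.
FLIP = {
    ("scan", "0"): ("1", "R", "scan"),
    ("scan", "1"): ("0", "R", "scan"),
    ("scan", "_"): ("_", "R", "halt"),
}

print(run_turing_machine("10110", FLIP))  # prints 01001
```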

Colossus, the first reliable electronic digital computer, was completed in 1943 to help Britain decipher wartime codes. Instead of moving mechanical parts like the Analytical Engine’s cogwheels, it used vacuum tubes, devices that direct the flow of electrons.

This made Colossus fast, but engineers had to rewire it by hand whenever they wanted it to perform a new task. Possibly influenced by Turing’s vision of a more easily reprogrammable computer, the team behind the ENIAC, the United States’ first electronic digital computer, drew up a new architecture for its successor, the EDVAC. In 1945, the mathematician John von Neumann described a design in which programs could be stored in memory alongside data and modified, a structure now known as the von Neumann architecture. Nearly every computer today follows that paradigm.
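The stored-program idea can be sketched with a toy machine in Python; the instruction set and the little program below are invented for illustration, not the EDVAC’s actual design. Because the program is just entries in the same memory that holds the data, such a machine could in principle overwrite its own instructions.

```python
# Toy stored-program machine (illustrative instruction set, not the EDVAC's).
# Instructions and data share one memory; each instruction is (opcode, operand).

def run(memory):
    acc = 0          # accumulator register
    pc = 0           # program counter: index of the next instruction in memory
    while True:
        op, arg = memory[pc]
        pc += 1
        if op == "LOAD":     # copy a memory cell into the accumulator
            acc = memory[arg]
        elif op == "ADD":    # add a memory cell to the accumulator
            acc += memory[arg]
        elif op == "STORE":  # write the accumulator back into memory
            memory[arg] = acc
        elif op == "HALT":
            return memory

# Program occupies cells 0-3; data occupies cells 4-6.
memory = [
    ("LOAD", 4),   # 0: acc = memory[4]
    ("ADD", 5),    # 1: acc += memory[5]
    ("STORE", 6),  # 2: memory[6] = acc
    ("HALT", 0),   # 3: stop
    2, 3, 0,       # 4-6: data
]
print(run(memory)[6])  # prints 5
```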

ENIAC

In 1946, the US Army unveiled the ENIAC, the first all-electronic general-purpose digital computer. The ENIAC’s calculations contributed to the design of the hydrogen bomb, though the machine arrived too late to help in World War II.

In 1947, researchers at Bell Telephone Laboratories invented the transistor, a bit of circuitry in which applying a voltage (electrical pressure) or current controls the flow of electrons between two points. Transistors came to replace the slower, less efficient vacuum tubes. Then, in 1958 and 1959, researchers at Texas Instruments and Fairchild Semiconductor independently invented integrated circuits, in which transistors and their accompanying circuitry are fabricated on a chip in a single process.

For a long time, only specialists could program computers. Then, in 1957, IBM released FORTRAN, a much more user-friendly programming language that is still in use today.

In 1981, IBM introduced the IBM PC and Microsoft debuted its MS-DOS operating system, helping bring computers into homes and offices. Apple made computers even more personal with its Lisa and Macintosh operating systems, released in 1983 and 1984. Both popularized graphical user interfaces, or GUIs, which let users point and click with a mouse rather than type at a command line.

Arpanet map
The ARPANET (seen on an early map) was established in 1969 and would later join with other networks to form the internet.

Meanwhile, researchers were laying the groundwork for connecting all this cutting-edge hardware and software. In 1948, the mathematician Claude Shannon published “A Mathematical Theory of Communication,” which popularized the term bit (for binary digit) and laid the foundation for information theory. His ideas have shaped computation ever since, particularly the sharing of data over wires and through the air. In 1969, the US Advanced Research Projects Agency built ARPANET, a computer network that later joined with other networks to form the internet. And in 1990, CERN, a European laboratory near Geneva, Switzerland, developed the data-transmission protocols that became the foundation of the World Wide Web.

Better technology, software, and communications have now connected most of the world’s population. But how much faster can processors become? How intelligent can algorithms get? What benefits and risks should we anticipate as computing advances? Stuart Russell, a computer scientist at the University of California, Berkeley, and coauthor of a popular textbook on artificial intelligence, believes computers have enormous potential for “expanding artistic creativity, accelerating science, serving as diligent personal assistants, driving cars, and — hopefully — not killing us.”
Hutson, Matthew

Jobs and Mac

Apple cofounder Steve Jobs, photographed in 1984 with the first Macintosh computers. These machines helped popularize graphical user interfaces, or GUIs, which let users click and drag icons rather than type commands at a command line.

Chasing after speed

Computers, for the most part, operate in binary. Whether the data is music, a program, or a password, they store it as strings of 1s and 0s, and they process it in binary as well, flipping transistors between on and off states. The more transistors a computer has, the faster it can process data, enabling everything from more realistic video games to better air traffic control.
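As a small illustration, using Python’s built-in number formatting and an arbitrary bit of sample text, here is how a short message and a number look as the bit strings a computer would store:

```python
# Turn a short piece of text into the 1s and 0s a computer would store.
message = "Hi"
bits = " ".join(format(byte, "08b") for byte in message.encode("utf-8"))
print(bits)  # 01001000 01101001  (one 8-bit pattern per character)

# Numbers work the same way: the integer 42 stored in 8 bits.
print(format(42, "08b"))  # 00101010
```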

Transistors are combined to build logic gates, the basic components of circuits. An AND gate, for example, is active when both of its inputs are active, while an OR gate is active when at least one input is active. Strung together, logic gates choreograph the complex traffic pattern of electrons that is the physical expression of computation. A single computer chip can contain millions of logic gates.
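A brief Python sketch of that composition, using a textbook half adder as the illustrative circuit (the example is chosen for clarity, not drawn from any particular chip):

```python
# Basic logic gates expressed as truth-value functions.
def AND(a, b): return a and b
def OR(a, b):  return a or b
def XOR(a, b): return a != b

# Gates compose into circuits: a half adder sums two bits,
# producing a sum bit and a carry bit.
def half_adder(a, b):
    return XOR(a, b), AND(a, b)

for a in (False, True):
    for b in (False, True):
        s, carry = half_adder(a, b)
        print(f"{int(a)} + {int(b)} -> sum={int(s)}, carry={int(carry)}")
# 1 + 1 -> sum=0, carry=1, i.e. binary 10
```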

As a result, the more logic gates, and hence transistors, a chip has, the more powerful the computer. In 1965, Gordon Moore, a cofounder of Fairchild Semiconductor and later of Intel, published a paper on the future of chips titled “Cramming More Components onto Integrated Circuits.” He graphed the number of components (mostly transistors) on five integrated circuits, or chips, built between 1959 and 1965 and extrapolated the trend: the number of transistors per chip had doubled every year, and he expected that pace to continue.

Original Moore graph

In 1965, Gordon Moore graphed the rise in the number of transistors per computer chip over time (solid line). The trend, he said, would continue.

In a 1975 talk, Moore cited three factors driving this exponential growth: smaller transistors, larger chips, and “device and circuit cleverness,” such as less wasted space. He predicted that the doubling would now occur every two years. It did, and it kept doing so for decades. This pattern came to be called Moore’s law.
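A quick back-of-the-envelope calculation shows how powerful that doubling is; the starting count of 2,000 transistors in 1971 is an illustrative assumption, not real chip data:

```python
# Compound doubling: starting from a hypothetical 2,000 transistors in 1971,
# doubling every two years for 50 years multiplies the count by 2**25.
start_year, start_count = 1971, 2_000   # illustrative numbers, not real chip data
for year in range(start_year, 2022, 10):
    elapsed = year - start_year
    count = start_count * 2 ** (elapsed / 2)   # one doubling every two years
    print(f"{year}: ~{count:,.0f} transistors")
# After 50 years the count has grown by a factor of about 33 million.
```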

Moore’s law, unlike Newton’s law of universal gravitation, is not a law of physics; it was meant as an observation about economics. There will always be incentives to make computers faster and cheaper, but eventually physics intervenes. Because it keeps getting harder to make transistors smaller, chip development can no longer keep pace with Moore’s law. And according to what’s known as Moore’s second law, the cost of chip fabrication plants, or “fabs,” doubles every few years. The semiconductor manufacturer TSMC has proposed building a $25 billion fab.

Today Moore’s law no longer strictly holds; the doubling is taking longer. Chipmakers continue to cram more transistors into each new generation of processors, but the generations arrive less often. To keep computing power growing, researchers are exploring better transistors, more specialized chips, novel chip designs, and software tricks.
