MULTIZ321
TUG Member
- Joined: Jun 6, 2005
- Messages: 32,898
- Reaction score: 9,451
- Location: Ft. Lauderdale, FL
- Resorts Owned: Bluewater by Spinnaker HHI; Royal Holiday Club RHC (Points)
Smaller, Faster, Cheaper, Over: The Future of Computer Chips - by John Markoff / Technology / International New York Times / The New York Times / nytimes.com
"At the inaugural International Solid-State Circuits Conference held on the campus of the University of Pennsylvania in Philadelphia in 1960, a young computer engineer named Douglas Engelbart introduced the electronics industry to the remarkably simple but groundbreaking concept of “scaling.”
Dr. Engelbart, who would later help develop the computer mouse and other personal computing technologies, theorized that as electronic circuits were made smaller, their components would get faster, require less power and become cheaper to produce — all at an accelerating pace.
Sitting in the audience that day was Gordon Moore, who went on to help found the Intel Corporation, the world’s largest chip maker. In 1965, Dr. Moore quantified the scaling principle and laid out what would have the impact of a computer-age Magna Carta. He predicted that the number of transistors that could be etched on a chip would double annually for at least a decade, leading to astronomical increases in computer power.
His prediction appeared in Electronics magazine in April 1965 and was later called Moore’s Law. It was never a law of physics, but rather an observation about the economics of a young industry that ended up holding true for a half-century..."
Gordon Moore, a founder of the Intel Corporation, in a photograph from the late 1960s. In 1965, in what came to be called Moore’s Law, Dr. Moore laid out the principle that the number of transistors that could be etched on a chip would double annually for at least a decade. Credit Intel
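As a quick illustration of my own (not from the article) of what that prediction implies: doubling the transistor count every year for a decade works out to a factor of 2^10, roughly a thousand-fold increase.

```python
# Back-of-the-envelope sketch of Moore's 1965 prediction,
# assuming transistor counts double once per year for ten years.
transistors = 1  # normalized starting count
for year in range(10):
    transistors *= 2
print(transistors)  # 1024 -> roughly a thousand-fold increase over the decade
```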
Richard
"At the inaugural International Solid-State Circuits Conference held on the campus of the University of Pennsylvania in Philadelphia in 1960, a young computer engineer named Douglas Engelbart introduced the electronics industry to the remarkably simple but groundbreaking concept of “scaling.”
Dr. Engelbart, who would later help develop the computer mouse and other personal computing technologies, theorized that as electronic circuits were made smaller, their components would get faster, require less power and become cheaper to produce — all at an accelerating pace.
Sitting in the audience that day was Gordon Moore, who went on to help found the Intel Corporation, the world’s largest chip maker. In 1965, Dr. Moore quantified the scaling principle and laid out what would have the impact of a computer-age Magna Carta. He predicted that the number of transistors that could be etched on a chip would double annually for at least a decade, leading to astronomical increases in computer power.
His prediction appeared in Electronics magazine in April 1965 and was later called Moore’s Law. It was never a law of physics, but rather an observation about the economics of a young industry that ended up holding true for a half-century..."

Gordon Moore, a founder of the Intel Corporation, in a photograph from the late 1960s. In 1965, in what came to be called Moore’s Law, Dr. Moore laid out the principle that the number of transistors that could be etched on a chip would double annually for at least a decade. Credit Intel
Richard