A computer consists of various devices referred to as hardware, such as the keyboard,
screen, mouse, hard disks, memory, DVD drives and processing units. For decades, the capacities of computer hardware have approximately doubled every year or two, with little or no increase in price. This remarkable trend is often called Moore's Law, named for Gordon Moore, the person who identified it and a co-founder of Intel, the leading manufacturer of the processors in today's computers and embedded systems, such as smartphones, appliances, game controllers, cable set-top boxes and automobiles.
Moore's Law and related observations (see the sketch after this list) apply especially to
• the amount of memory that computers have for running programs and processing data
• the amount of secondary storage (such as hard disk storage) they have to hold programs and data over longer periods of time
• their processor speeds (the speeds at which computers execute their programs, i.e., do their work)
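To make the doubling concrete, here is a minimal Python sketch of the growth Moore's Law describes. It assumes a fixed two-year doubling period (formulations of the law vary between roughly 18 and 24 months); the function name and the starting value are hypothetical, chosen only for illustration.

    # Sketch of Moore's Law-style growth: capacity doubles every
    # DOUBLING_PERIOD_YEARS (assumed to be 2 here; formulations vary).
    DOUBLING_PERIOD_YEARS = 2

    def projected_capacity(initial_capacity, years):
        """Capacity after `years` years, doubling every DOUBLING_PERIOD_YEARS."""
        return initial_capacity * 2 ** (years / DOUBLING_PERIOD_YEARS)

    # Example: 4 GB of memory today, projected 10 years out:
    # 4 * 2**(10/2) = 4 * 32 = 128 GB.
    print(projected_capacity(4, 10))   # 128.0

The key point the sketch captures is that the growth is exponential, not linear: five doubling periods multiply capacity by 32, not by 5.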
Similar growth has occurred in the communications field, in which costs have plummeted as enormous demand for communications bandwidth (i.e., information-carrying capacity) has attracted intense competition. We know of no other fields in which technology improves so quickly and costs fall so rapidly. Such phenomenal improvement is truly fostering the Information Revolution and creating significant career opportunities.
As a result of this continuing stream of technological advances, computers can already perform calculations and make logical decisions phenomenally faster than human beings can. Many of today's personal computers can perform billions of calculations in one second, more than a human can perform in a lifetime. Supercomputers are already performing thousands of trillions (quadrillions) of instructions per second! The world's fastest supercomputer, the Cray Titan, can perform over 17 quadrillion calculations per second (17.59 petaflops), more than 2 million calculations per second for every person on the planet! And these "upper limits" are expanding quickly!
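Both figures are easy to check with back-of-the-envelope arithmetic, as in the short Python sketch below. The world-population value (about 7 billion, roughly the figure when Titan held the record) and the 80-year lifetime at one calculation per second are assumptions chosen for illustration.

    # Check the per-person claim: 17.59 petaflops spread across ~7 billion people.
    TITAN_FLOPS = 17.59e15        # 17.59 quadrillion calculations per second
    WORLD_POPULATION = 7.0e9      # assumption: about 7 billion people

    per_person = TITAN_FLOPS / WORLD_POPULATION
    print(f"{per_person:,.0f} calculations/second per person")   # ~2,512,857

    # Check the "lifetime" claim: one calculation per second for 80 years.
    SECONDS_PER_YEAR = 60 * 60 * 24 * 365
    lifetime_calcs = 80 * SECONDS_PER_YEAR
    print(f"{lifetime_calcs:,} calculations in a lifetime")      # ~2.5 billion

So a personal computer performing a few billion calculations per second does in roughly one second what a person computing nonstop would need a lifetime to accomplish, consistent with the claims above.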