Wiener and Cybernetics

From experiments with anti-aircraft systems that interpreted radar images to detect enemy planes, Norbert Wiener coined the term cybernetics from the Greek word for "steersman." He published "Cybernetics" in 1948, which influenced artificial intelligence. Wiener also compared computation, computing machinery, memory devices, and other cognitive similarities with his analysis of brain waves. The first actual computer bug was a moth, stuck between the relays of the Harvard Mark II. [1] While the invention of the term "bug" is often, but erroneously, attributed to Grace Hopper, a future rear admiral in the U.S. Navy, who supposedly logged the "bug" on September 9, 1945, most other accounts conflict with at least these details. According to these accounts, the actual date was September 9, 1947, when operators filed this "incident" along with the insect and the notation "First actual case of bug being found."

Shannon and information theory

Up to and during the 1930s, electrical engineers were able to build electronic circuits to solve mathematical and logic problems, but most did so in an ad hoc manner, lacking any theoretical rigor. This changed with Claude Elwood Shannon's publication of his 1937 master's thesis, A Symbolic Analysis of Relay and Switching Circuits. While taking an undergraduate philosophy class, Shannon had been exposed to Boole's work and recognized that it could be used to arrange electromechanical relays (then used in telephone routing switches) to solve logic problems. This concept, of utilizing the properties of electrical switches to do logic, is the basic concept that underlies all electronic digital computers, and his thesis became the foundation of practical digital circuit design when it became widely known among the electrical engineering community during and after World War II. Shannon went on to found the field of information theory with his 1948 paper A Mathematical Theory of Communication, which applied probability theory to the problem of how best to encode the information a sender wants to transmit.
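
To make Shannon's observation concrete, here is a minimal sketch (illustrative, not from the source) of the switch-to-logic correspondence his thesis formalized: relay contacts wired in series conduct only when all are closed (AND), contacts wired in parallel conduct when any is closed (OR), and a normally-closed contact inverts its input (NOT). The function names are my own.

```python
# Shannon's key observation, sketched in code: series wiring acts as AND,
# parallel wiring acts as OR, a normally-closed contact acts as NOT.

def series(*switches: bool) -> bool:
    """Current flows only if every switch in the chain is closed (AND)."""
    return all(switches)

def parallel(*switches: bool) -> bool:
    """Current flows if any branch has a closed switch (OR)."""
    return any(switches)

def normally_closed(switch: bool) -> bool:
    """A normally-closed contact opens when energized (NOT)."""
    return not switch

# Example: a two-way light switch (XOR), as one might wire it with relays.
def two_way_light(a: bool, b: bool) -> bool:
    return parallel(series(a, normally_closed(b)),
                    series(normally_closed(a), b))

for a in (False, True):
    for b in (False, True):
        print(a, b, "->", two_way_light(a, b))
```

Since AND, OR, and NOT together can express any Boolean function, these three wiring patterns suffice to build arbitrary logic, which is why relay networks, and later transistor networks, can carry out digital computation.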

Emergence of a discipline

The mathematical foundations of modern computer science began to be laid by Kurt Gödel with his incompleteness theorem (1931). In this theorem, he showed that there were limits to what could be proved and disproved within a formal system. This led to work by Gödel and others to define and describe these formal systems, including concepts such as mu-recursive functions and lambda-definable functions. 1936 was a key year for computer science. Alan Turing and Alonzo Church independently, and also together, introduced the formalization of an algorithm, with limits on what can be computed, and a "purely mechanical" model for computing. These topics are covered by what is now called the Church–Turing thesis, a hypothesis about the nature of mechanical calculation devices, such as electronic computers. The thesis claims that any calculation that is possible can be performed by an algorithm running on a computer, provided that sufficient time and storage space are available.
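
As a rough illustration of Turing's "purely mechanical" model, the sketch below (my own, assuming a dictionary of (state, symbol) to (write, move, next-state) rules as the machine's finite control) simulates a tiny Turing machine that increments a binary number on its tape.

```python
# A minimal Turing machine simulator: a finite rule table drives a
# read/write head over an unbounded tape of symbols.

from collections import defaultdict

def run(rules, tape, state="start"):
    """rules maps (state, symbol) -> (symbol_to_write, move, next_state)."""
    cells = defaultdict(lambda: "_", enumerate(tape))  # "_" marks a blank
    head = len(tape) - 1  # start at the least-significant bit
    while state != "halt":
        write, move, state = rules[(state, cells[head])]
        cells[head] = write
        head += {"L": -1, "R": +1}[move]
    lo, hi = min(cells), max(cells)
    return "".join(cells[i] for i in range(lo, hi + 1)).strip("_")

# Increment: walk left, turning 1s into 0s until the carry is absorbed.
increment = {
    ("start", "1"): ("0", "L", "start"),  # 1 + carry = 0, carry continues
    ("start", "0"): ("1", "L", "halt"),   # absorb the carry
    ("start", "_"): ("1", "L", "halt"),   # carry past the leftmost bit
}

print(run(increment, "1011"))  # -> 1100 (11 + 1 = 12)
```

Every step consults only the current state and the symbol under the head, with no ingenuity or insight required, which is the sense in which the model is purely mechanical.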

Birth of computer science

Before the 1920s, computers (sometimes computors) were human clerks who performed computations. They were usually under the lead of a physicist. Many thousands of computers were employed in commerce, government, and research establishments. Most of these computers were women, many of whom had a degree in calculus. Some performed astronomical calculations for calendars. After the 1920s, the expression computing machine referred to any machine that performed the work of a human computer, especially one working in accordance with the effective methods of the Church–Turing thesis. The thesis states that a mathematical method is effective if it can be set out as a list of instructions able to be followed by a human clerk with paper and pencil, for as long as necessary, and without ingenuity or insight. Machines that computed with continuous values became known as the analog kind. They used machinery that represented continuous numeric quantities, like the angle of a shaft rotation or a difference in electrical potential.

Binary logic

In 1703, Gottfried Leibniz developed logic in a formal, mathematical sense with his writings on the binary numeral system. In his system, the ones and zeros also represent true and false values or on and off states. But it took more than a century before George Boole published his Boolean algebra in 1854, a complete system that allowed computational processes to be mathematically modeled. By this time, the first mechanical devices driven by a binary pattern had been invented. The industrial revolution had driven forward the mechanization of many tasks, and this included weaving. Punched cards controlled Joseph Marie Jacquard's loom in 1801, where a hole punched in the card indicated a binary one and an unpunched spot indicated a binary zero. Jacquard's loom was far from being a computer, but it did illustrate that machines could be driven by binary systems.
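
A small worked example (illustrative, not from the source) ties the two ideas together: Leibniz's binary notation writes any whole number using only the symbols 0 and 1, and Boole's algebra reinterprets those same two symbols as truth values under AND, OR, and NOT.

```python
# Leibniz's binary notation: repeatedly divide by 2, reading off remainders.
def to_binary(n: int) -> str:
    digits = ""
    while n > 0:
        digits = str(n % 2) + digits
        n //= 2
    return digits or "0"

print(to_binary(1703))  # -> 11010100111

# Boole's algebra on the same two symbols: 0 and 1 as false and true.
AND = lambda x, y: x & y   # 1 only when both inputs are 1
OR  = lambda x, y: x | y   # 1 when either input is 1
NOT = lambda x: 1 - x      # swaps the two truth values

print(AND(1, 0), OR(1, 0), NOT(1))  # -> 0 1 0
```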

History of Computer Science

Early history

The earliest known tool for use in computation was the abacus, developed in the period 2700–2300 BC in Sumer. The Sumerians' abacus consisted of a table of successive columns which delimited the successive orders of magnitude of their sexagesimal number system. [2] Its original style of usage was by lines drawn in sand with pebbles. Abaci of a more modern design are still used as calculation tools today. The Antikythera mechanism is believed to be the earliest known mechanical analog computer. [3] It was designed to calculate astronomical positions. It was discovered in 1901 in the Antikythera wreck off the Greek island of Antikythera, between Kythera and Crete, and has been dated to c. 100 BC. Technological artifacts of similar complexity did not reappear until the 14th century, when mechanical astronomical clocks appeared in Europe. [4] Mechanical analog computing devices appeared again a thousand years later in the medieval Islamic world.
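
For a sense of how the abacus's columns worked, here is a brief sketch (illustrative, not from the source) that splits a number into base-60 digits, one per column, just as each Sumerian column held one order of magnitude of a sexagesimal number.

```python
# Positional base-60 representation: each "column" holds one digit,
# the next column counting sixties, the next 3600s, and so on.
def to_sexagesimal(n: int) -> list[int]:
    """Split n into base-60 digits, most significant first."""
    digits = []
    while n > 0:
        digits.append(n % 60)
        n //= 60
    return list(reversed(digits)) or [0]

# 7325 seconds = 2 hours, 2 minutes, 5 seconds -- the same base-60
# positional scheme survives in our units of time and angle.
print(to_sexagesimal(7325))  # -> [2, 2, 5]
```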

The year the computer was introduced in India

In the early days of computing, only large corporations and businesses could afford a mainframe computer, and these machines were used for tasks like issuing payrolls and analyzing huge data sets (like a population census). The Indian Statistical Institute in Calcutta acquired India's first computer in 1955. Additional computers were purchased in India, mainly from IBM. By 1972, there were 172 computers in India, and three-fourths of these were made by IBM. In 1977, the Indian government refused to allow more than 50 percent foreign ownership of any company operating in the nation. IBM refused to sell majority ownership of its Indian operations and was thus forced to leave India. Later, after 1984, Prime Minister Rajiv Gandhi changed government policies to encourage an indigenous microcomputer industry. Imports were liberalized, and international standards were followed by Indian computer manufacturers so that their products could compete more effectively in the global market.