The Cyber-Cave

Reflections on the political, technological, cultural and economic trends of the world

Computer science

4th or 7th century BC: Panini (a Hindu linguist born in today’s Pakistan) writes the ‘Astadhyayi’, which has been compared to Euclid’s ‘Elements’ for its construction of a formal-logical framework. In Panini’s case, however, the framework consists of meta-rules for the Sanskrit grammar. Some historians claim that the development of ancient Indian algebra may owe something to Panini’s work (such as the use of letters to stand for numbers). Some have also compared Panini’s grammatical rules (recursions, meta-rules, etc.) to a Turing machine in terms of computing power.

1136-1206: Al-Jazari’s automata inventions

1930s: Shannon applies Boolean algebra to the design of electrical switching circuits.
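
As a rough illustration of the idea (a sketch in Python, not Shannon’s own notation): two switches wired in series conduct exactly when the Boolean AND of their states is true, and two wired in parallel correspond to OR, so circuits can be analysed and simplified algebraically.

    # Two switches in series conduct like AND; in parallel, like OR.
    def series(a, b):      # conducts only if both switches are closed
        return a and b

    def parallel(a, b):    # conducts if at least one switch is closed
        return a or b

    # Truth table for a small composite circuit: (a AND b) OR c
    for a in (False, True):
        for b in (False, True):
            for c in (False, True):
                print(a, b, c, parallel(series(a, b), c))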

1936: Turing and Church show that a solution to the Entscheidungsproblem (a decision problem posed by Hilbert) is impossible; for instance, the Halting problem is undecidable: “Given a program and an input to the program, determine whether the program will eventually stop when it is given that input.”
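
The standard way to see why no general decision procedure can exist is Turing’s diagonal argument. The sketch below is a thought experiment rather than working code: it assumes a hypothetical halts() oracle and derives a contradiction from it.

    # Thought-experiment sketch of the diagonal argument (Python).
    def halts(program, data):
        # Hypothetical oracle: True iff program(data) eventually stops.
        # The construction below shows no such function can actually be written.
        raise NotImplementedError

    def paradox(program):
        if halts(program, program):
            while True:        # loop forever exactly when the oracle says "halts"
                pass
        else:
            return             # halt exactly when the oracle says "loops forever"

    # What should halts(paradox, paradox) return? Either answer contradicts
    # paradox's own behaviour, so a total, correct halts() cannot exist.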

1943: the Colossus is built (the first programmable electronic digital computer)

1945: Vannevar Bush’s essay ‘As We May Think’: “The Encyclopoedia Britannica could be reduced to the volume of a matchbox” … “Consider a future device for individual use, which is a sort of mechanized private file and library. It needs a name, and, to coin one at random, “memex” will do. A memex is a device in which an individual stores all his books, records, and communications, and which is mechanized so that it may be consulted with exceeding speed and flexibility. It is an enlarged intimate supplement to his memory.”
“We know that when the eye sees, all the consequent information is transmitted to the brain by means of electrical vibrations in the channel of the optic nerve. This is an exact analogy with the electrical vibrations which occur in the cable of a television set: they convey the picture from the photocells which see it to the radio transmitter from which it is broadcast. We know further that if we can approach that cable with the proper instruments, we do not need to touch it; we can pick up those vibrations by electrical induction and thus discover and reproduce the scene which is being transmitted, just as a telephone wire may be tapped for its message.” [1]

1945: Von Neumann designs the ‘stored-program architecture’ of a computer (comprising the main memory, arithmetic and logic unit…)

1947: AT&T Bell Telephone Laboratories’ engineers invent the transistor

1948: Shannon founds information theory with “A Mathematical Theory of Communication” (information entropy, the bit)
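
The paper’s central quantity is the entropy of a source, H = -Σ p·log2(p), measured in bits. A small illustrative computation in Python (my example, not one taken from the paper):

    import math

    def entropy(probabilities):
        # Shannon entropy in bits: -sum(p * log2(p)) over the non-zero probabilities.
        return -sum(p * math.log2(p) for p in probabilities if p > 0)

    print(entropy([0.5, 0.5]))    # a fair coin carries 1 bit per toss
    print(entropy([0.9, 0.1]))    # a biased coin carries about 0.47 bits
    print(entropy([0.25] * 4))    # four equally likely symbols: 2 bits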

1954: Wesley Clark and Belmont Farley publish a paper on artificial neural networks

1956: John McCarthy coins the term ‘artificial intelligence’ for the Dartmouth Conference

1960: Frank Rosenblatt develops the Perceptron (an electronic device capable of learning, modelled on the human neural network; it had important consequences for the fields of machine learning, deep learning and artificial neural networks).
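
To give a sense of what the device computed, here is a minimal software version of the perceptron learning rule (a modern sketch, not Rosenblatt’s hardware): the weights are nudged towards the correct answer whenever a prediction is wrong.

    # Minimal perceptron sketch: learn a linear threshold function from examples.
    def train_perceptron(samples, epochs=10, lr=0.1):
        # samples: list of (inputs, label) pairs with label 0 or 1
        n = len(samples[0][0])
        weights, bias = [0.0] * n, 0.0
        for _ in range(epochs):
            for inputs, label in samples:
                activation = sum(w * x for w, x in zip(weights, inputs)) + bias
                prediction = 1 if activation > 0 else 0
                error = label - prediction
                weights = [w + lr * error * x for w, x in zip(weights, inputs)]
                bias += lr * error
        return weights, bias

    # Learning the logical AND function from four examples.
    data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
    print(train_perceptron(data))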

1962: John Tukey’s “The Future of Data Analysis” anticipates data science

1969: Edgar Codd, while at IBM, proposes the relational model of databases

1969-1990: the ARPANET program

1982: Judea Pearl’s paper on ‘Bayesian networks’
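
In brief, a Bayesian network factorises a joint probability distribution into conditional probabilities along a directed graph, which makes inference tractable. A toy Python example (the textbook rain/sprinkler/wet-grass network with made-up numbers, not anything taken from Pearl’s paper):

    # Toy Bayesian network: Rain -> WetGrass <- Sprinkler.
    # The joint distribution factorises as P(R) * P(S) * P(W | R, S).
    P_rain = {True: 0.2, False: 0.8}
    P_sprinkler = {True: 0.1, False: 0.9}
    P_wet = {  # P(WetGrass = True | Rain, Sprinkler)
        (True, True): 0.99, (True, False): 0.9,
        (False, True): 0.8, (False, False): 0.05,
    }

    # P(Rain = True | WetGrass = True) by brute-force enumeration.
    num = sum(P_rain[True] * P_sprinkler[s] * P_wet[(True, s)] for s in (True, False))
    den = sum(P_rain[r] * P_sprinkler[s] * P_wet[(r, s)]
              for r in (True, False) for s in (True, False))
    print(num / den)   # posterior probability of rain, given that the grass is wet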

1990: Tim Berners-Lee creates the first website (the World Wide Web)

1998: Google is founded (Larry Page and Sergey Brin create the PageRank algorithm)
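
Roughly, PageRank scores a page by the scores of the pages that link to it, and the scores are computed by iterating until they stabilise. A simplified power-iteration sketch on a three-page toy graph (an illustration, not Google’s actual implementation):

    # Simplified PageRank by power iteration on a tiny toy web graph.
    links = {"A": ["B", "C"], "B": ["C"], "C": ["A"]}   # page -> pages it links to
    damping, pages = 0.85, list(links)
    rank = {p: 1.0 / len(pages) for p in pages}

    for _ in range(50):
        new_rank = {}
        for p in pages:
            # A page's rank is a base amount plus a share of the rank
            # of every page that links to it.
            incoming = sum(rank[q] / len(links[q]) for q in pages if p in links[q])
            new_rank[p] = (1 - damping) / len(pages) + damping * incoming
        rank = new_rank

    print(rank)   # C edges out A; B ranks lowest, since only A links to it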

2010s: Google applies deep neural networks to pattern recognition (its systems famously learned to recognise cats in images), and its DeepMind division’s AlphaGo program has beaten the world Go champion

SOURCES:
[1] https://www.theatlantic.com/magazine/archive/1945/07/as-we-may-think/303881/
