By Deera Wijesundarra
Joseph Marie Jacquard, a French merchant and inventor, invents a loom that uses punched wooden cards to automatically weave fabric designs.
Early computers would subsequently use similar punch cards to input instructions.
English mathematician Charles Babbage conceives of a steam-driven calculating machine that would be able to compute tables of numbers.
The idea gives rise to a project funded by the British government, called the "Difference Engine," which unfortunately fails due to a lack of the necessary technology at the time.
Ada Lovelace, an English mathematician and the daughter of poet Lord Byron, writes the world's first computer program while translating a paper on Babbage's Analytical Engine from French into English. Because of her extensive comments and annotations, her translation turns out to be three times longer than the original paper.
Siffert wrote in an article for the Max Planck Society: "Lovelace also adds a step-by-step description for computation of Bernoulli numbers with Babbage's machine — basically an algorithm — which, in effect, makes her the world's first computer programmer."
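Lovelace worked her example out by hand for a machine that was never built. As a rough modern illustration only (not her original Note G procedure), the same Bernoulli-number computation can be sketched in Python using the Akiyama-Tanigawa algorithm:

```python
# Illustrative sketch only: computes Bernoulli numbers with the
# Akiyama-Tanigawa algorithm, not Lovelace's original Note G program.
from fractions import Fraction

def bernoulli_numbers(n):
    """Return B_0 .. B_{n-1} as exact fractions (B_1 = +1/2 convention)."""
    a = [Fraction(0)] * n
    result = []
    for m in range(n):
        a[m] = Fraction(1, m + 1)
        for j in range(m, 0, -1):
            a[j - 1] = j * (a[j - 1] - a[j])
        result.append(a[0])  # after processing row m, a[0] holds B_m
    return result

print(bernoulli_numbers(8))  # 1, 1/2, 1/6, 0, -1/30, 0, 1/42, 0
```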
Swedish inventor Per Georg Scheutz and his son Edvard design the world's first printing calculator.
The machine is significant for being the first to "compute tabular differences and print the results," as documented in Uta C. Merzbach's book "Georg Scheutz and the First Printing Calculator" (Smithsonian Institution Press, 1977).
Herman Hollerith designs a punch-card system to help calculate the 1890 U.S. Census.
The machine saves the government several years of calculations and U.S. taxpayers approximately $5 million.
Hollerith later establishes a company that will eventually become International Business Machines Corporation (IBM).
At the Massachusetts Institute of Technology (MIT), Vannevar Bush invents and builds the Differential Analyzer, the first large-scale automatic general-purpose mechanical analog computer.
Alan Turing, a British scientist and mathematician, presents the principle of a universal machine, later called the Turing machine, in a paper called "On Computable Numbers…"
Turing machines are theoretically capable of computing anything that is computable. The central concept of the modern computer is based on his ideas.
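As a toy illustration of the idea (a simple table-driven simulator, not Turing's own formalism), such a machine can be modeled in a few lines of Python:

```python
# Toy Turing machine simulator (illustrative only): a transition table maps
# (state, symbol) to (new state, symbol to write, head movement).
def run_turing_machine(tape, rules, state="start", blank="_"):
    tape, head = list(tape), 0
    while state != "halt":
        symbol = tape[head] if 0 <= head < len(tape) else blank
        state, write, move = rules[(state, symbol)]
        if 0 <= head < len(tape):
            tape[head] = write
        head += move
    return "".join(tape)

# Example machine: walk right, inverting each bit, and halt on the blank cell.
rules = {
    ("start", "0"): ("start", "1", 1),
    ("start", "1"): ("start", "0", 1),
    ("start", "_"): ("halt", "_", 0),
}
print(run_turing_machine("10110", rules))  # -> 01001
```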
Turing is later involved in the development of the Turing-Welchman Bombe, an electro-mechanical device designed to decipher Nazi codes during World War II.
John Vincent Atanasoff, a professor of physics and mathematics at Iowa State University, submits a grant proposal to build the first electric-only computer, without using gears, cams, belts or shafts.
David Packard and Bill Hewlett found the Hewlett Packard Company in Palo Alto, California, with its first headquarters in Packard's garage.
Atanasoff and his graduate student, Clifford Berry, design the first digital electronic computer in the U.S., called the Atanasoff-Berry Computer (ABC).
This marks the first time a computer is able to store information in its main memory; the machine is capable of performing one operation every 15 seconds.
German inventor and engineer Konrad Zuse completes his Z3 machine, the world's earliest digital computer. The machine is unfortunately destroyed during a bombing raid on Berlin during World War II. Zuse survives the war and later releases the world's first commercial digital computer, the Z4, in 1950.
Two professors at the University of Pennsylvania, John Mauchly and J. Presper Eckert, design and build the Electronic Numerical Integrator and Calculator (ENIAC).
The machine is the first "automatic, general-purpose, electronic, decimal, digital computer."
Mauchly and Eckert leave the University of Pennsylvania and receive funding from the Census Bureau to build the UNIVAC, the "Universal Automatic Computer," which becomes the first commercial computer for business and government applications.
William Shockley, John Bardeen and Walter Brattain of Bell Laboratories invent the transistor. Their discovery makes it possible to manufacture an electric switch from solid materials, without the need for a vacuum.
A team at the University of Cambridge develops the Electronic Delay Storage Automatic Calculator (EDSAC), "the first practical stored-program computer." EDSAC runs its first program in May 1949, calculating a table of squares and a list of prime numbers.
Scientists with the Council of Scientific and Industrial Research (CSIR), now called CSIRO, build Australia's first digital computer, the Council for Scientific and Industrial Research Automatic Computer (CSIRAC). CSIRAC is the first digital computer in the world to play music.
Grace Hopper develops the first computer language, which eventually becomes known as COBOL, an acronym for COmmon Business-Oriented Language.
Hopper is later dubbed the "First Lady of Software" in her posthumous Presidential Medal of Freedom citation.
Thomas Johnson Watson Jr., son of IBM CEO Thomas Johnson Watson Sr., conceives the IBM 701 EDPM to help the United Nations keep tabs on Korea during the war.
John Backus and his team of programmers at IBM publish a paper describing their newly created FORTRAN programming language, an acronym for FORmula TRANslation.
Jack Kilby and Robert Noyce unveil the integrated circuit, known as the computer chip.
Kilby is later awarded the Nobel Prize in Physics for his work.
Douglas Engelbart reveals a prototype of the modern computer at the Fall Joint Computer Conference, San Francisco.
His presentation, called "A Research Center for Augmenting Human Intellect," includes a live demonstration of his computer, complete with a mouse and a graphical user interface (GUI). This marks the evolution of the computer from a specialized machine for academics into a technology that is more accessible to the general public.
Ken Thompson, Dennis Ritchie and a group of other developers at Bell Labs produce UNIX, an operating system that makes "large-scale networking of diverse computing systems — and the internet — practical," according to Bell Labs. The team behind UNIX continues to develop the operating system using the C programming language, which they also optimize.
The newly formed Intel unveils the Intel 1103, the first Dynamic Random-Access Memory (DRAM) chip.
A team of IBM engineers led by Alan Shugart invents the "floppy disk," enabling data to be shared among different computers.
Ralph Baer, a German-American engineer, releases the Magnavox Odyssey, the world's first home game console, in September 1972. Months later, entrepreneur Nolan Bushnell and engineer Al Alcorn of Atari release Pong, the world's first commercially successful video game.
Robert Metcalfe, a member of the research staff for Xerox, develops Ethernet for connecting multiple computers and other hardware.
The cover of the January issue of "Popular Electronics" highlights the Altair 8800 as the "world's first minicomputer kit to rival commercial models."
After seeing the magazine issue, two "computer geeks," Paul Allen and Bill Gates, offer to write software for the Altair, using the new BASIC language.
On April 4, after the success of this first endeavor, the two childhood friends form their own software company, Microsoft.
Steve Jobs and Steve Wozniak co-found Apple Computer and unveil the Apple I, the first computer with a single-circuit board and ROM.
Radio Shack begins its initial production run of 3,000 TRS-80 Model 1 computers — disparagingly known as the "Trash 80" — priced at $599. Within a year, the company takes 250,000 orders for the computer, according to the book "How TRS-80 Enthusiasts Helped Spark the PC Revolution."
The Commodore Personal Electronic Transactor (PET) is released onto the home computer market, featuring an MOS Technology 8-bit 6502 microprocessor, a screen, a keyboard and a cassette player. The PET is especially successful in the education market.
The first West Coast Computer Faire is held in San Francisco. Jobs and Wozniak present the Apple II computer, which includes color graphics and an audio cassette drive for storage.
VisiCalc, the first computerized spreadsheet program, is introduced; it is written for the Apple II computer.
MicroPro International, founded by software engineer Seymour Rubenstein, releases WordStar, the world's first commercially successful word processor.
WordStar is programmed by Rob Barnaby and includes 137,000 lines of code, according to Matthew G. Kirschenbaum's book "Track Changes: A Literary History of Word Processing."
"Acorn," IBM's first personal computer, is released onto the market at a price point of $1,565Acorn uses the MS-DOS operating system from Windows.Optional features include a display, printer, two diskette drives, extra memory, a game adapter and more.
The Apple Lisa, standing for "Local Integrated Software Architecture" (and also the name of Steve Jobs' daughter), is the first personal computer to feature a GUI. The machine also includes a drop-down menu and icons, but it is considered a commercial failure due to its high price.
The Gavilan SC is released; it is the first portable computer with a flip-form design and the very first to be sold as a "laptop."
The Apple Macintosh is announced to the world during a Super Bowl advertisement and launches with a retail price of $2,500. It is the first successful mass-market all-in-one desktop personal computer to feature a graphical user interface, a built-in screen and a mouse.
In response to Apple's arguably better GUI, Microsoft releases Windows in November 1985, with a GUI heavily "inspired" by Apple's.
Commodore announces the Amiga 1000.
Tim Berners-Lee, a British researcher at the European Organization for Nuclear Research (CERN), submits his proposal for what would become the World Wide Web.
His paper details his ideas for Hyper Text Markup Language (HTML), the building blocks of the Web.
The Pentium microprocessor advances the use of graphics and music on PCs.
Sergey Brin and Larry Page develop the Google search engine at Stanford University. It is the first search engine to rank sites by analyzing the backlinks pointing to them.
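The core idea, scoring a page by the links pointing at it, can be illustrated with a small power-iteration sketch in Python (a simplified PageRank-style calculation, not Google's actual production algorithm):

```python
# Simplified, illustrative link-based ranking (PageRank-style power iteration);
# not Google's production algorithm.
def rank(links, damping=0.85, iterations=50):
    """links maps each page to the list of pages it links to."""
    pages = list(links)
    score = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new = {p: (1 - damping) / len(pages) for p in pages}
        for page, outlinks in links.items():
            targets = outlinks or pages  # a page with no outlinks spreads its score evenly
            for target in targets:
                new[target] += damping * score[page] / len(targets)
        score = new
    return score

links = {"a": ["b", "c"], "b": ["c"], "c": ["a"]}
print(rank(links))  # "c", with the most backlinks, scores highest
```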
Microsoft invests $150 million in Apple, which at the time is struggling financially.
This investment ends an ongoing court case in which Apple accused Microsoft of copying its operating system.
Wi-Fi, the abbreviated term for "wireless fidelity," is developed, initially covering a distance of up to 300 feet (91 meters).
Mac OS X, later renamed OS X and then simply macOS, is released by Apple as the successor to its standard Mac Operating System. OS X goes through 16 different versions, each with "10" as its title, and the first nine iterations are nicknamed after big cats, the first being codenamed "Cheetah."
AMD's Athlon 64, the first 64-bit processor for personal computers, is released to customers.
The Mozilla Corporation launches Mozilla Firefox 1.0. The Web browser is one of the first major challenges to Internet Explorer, owned by Microsoft.
During its first five years, Firefox exceeded a billion downloads by users, according to the Web Design Museum.
Google buys Android, a Linux-based mobile phone operating system.
The MacBook Pro from Apple hits the shelves. The Pro is the company's first Intel-based, dual-core mobile computer.
Microsoft launches Windows 7 on July 22. The new operating system features the ability to pin applications to the taskbar, easy-to-access jump lists, easier previews of tiles and more.
The iPad, Apple's flagship handheld tablet, is unveiled.
Google releases the Chromebook, which runs on Google Chrome OS.
Apple releases the Apple Watch.
Microsoft releases Windows 10.
The first reprogrammable quantum computer is created. Unlike earlier quantum-computing platforms, which are typically each tailored to attack a particular algorithm, the new system offers the capability to program new algorithms, according to lead author Shantanu Debnath, a quantum physicist and optical engineer at the University of Maryland, College Park.
The Defense Advanced Research Projects Agency (DARPA) is developing a new "Molecular Informatics" program that uses molecules as computers.
Millions of molecules exist, and each molecule has a unique three-dimensional atomic structure as well as variables such as shape, size, or even color. This richness provides a vast design space for exploring novel and multi-value ways to encode and process data beyond the 0s and 1s of current logic-based, digital architectures.