The world today is living in a digital era, with almost everything running on digital technology: from online business and e-learning to banking on your mobile phone.
Today, nearly 7 billion people use
digital technology, but surprisingly only one per cent of them understand it.
Allow me to take you through the history
and evolution of digital technology; I am sure that by the end of it, you
will be a smarter user.
The first computer algorithm was created
in 1843 by Ada Lovelace, a British mathematician who realized computers could
be more than big calculators.
Forty-five years later, in 1888, Herman
Hollerith invented the tabulating machine to help process data for the 1890 U.S.
Census. It was an electromechanical machine that could summarize information
stored on punched cards. Hollerith went on to start a company that later became
IBM.
In 1936, Alan Turing conceived of the
Logical Computing Machine. The Turing machine was a mathematical model of the
modern computers we all use today.
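To make the idea concrete, here is a minimal sketch of a Turing machine: a finite set of rules reading and writing symbols on a tape, one cell at a time. The rule table and state names below are hypothetical, chosen to illustrate one classic example, incrementing a binary number.

```python
def run_turing_machine(tape, rules, state="start", head=0):
    """Run a Turing machine until it reaches the 'halt' state."""
    cells = dict(enumerate(tape))  # sparse tape: position -> symbol
    while state != "halt":
        symbol = cells.get(head, "_")              # '_' is the blank symbol
        write, move, state = rules[(state, symbol)]
        cells[head] = write
        head += 1 if move == "R" else -1
    # Read the tape back in position order, trimming blanks.
    return "".join(cells[i] for i in sorted(cells)).strip("_")

# Rules for binary increment: scan right to the end, then carry leftward.
rules = {
    ("start", "0"): ("0", "R", "start"),
    ("start", "1"): ("1", "R", "start"),
    ("start", "_"): ("_", "L", "carry"),
    ("carry", "1"): ("0", "L", "carry"),  # 1 + carry -> 0, keep carrying
    ("carry", "0"): ("1", "L", "done"),   # 0 + carry -> 1, stop carrying
    ("carry", "_"): ("1", "L", "done"),   # carry past the left edge
    ("done", "0"): ("0", "L", "done"),
    ("done", "1"): ("1", "L", "done"),
    ("done", "_"): ("_", "R", "halt"),
}

print(run_turing_machine("1011", rules))  # prints "1100": 1011 (11) + 1 = 1100 (12)
```

Turing's insight was that a machine this simple can, given the right rule table, compute anything a modern computer can.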
But all through that period, the world
had no computer – the first one came in 1945 with ENIAC (Electronic
Numerical Integrator and Computer), the first programmable, electronic,
general-purpose digital computer.
It was designed to calculate artillery
firing tables for the United States Army. In the same year, Grace Hopper wrote
a 500-page Manual of Operations for the Automatic Sequence Controlled
Calculator, outlining the fundamental operating principles of computing
machines.
The First Transistor
Two years later, Bell Laboratories
invented the first transistor, a semiconductor device used to amplify or switch
electrical signals and power. It was one of the basic building blocks of modern
electronics and allowed for more advanced digital computers.
After the transistor came the microchip
in 1959. Smaller than your fingernail, it contained computer circuitry
called an integrated circuit.
The integrated circuit is one of the
most important innovations of mankind. Almost all modern products use chip
technology.
Innovation did not stop there: in 1964, BASIC (Beginners' All-purpose Symbolic Instruction Code) was
created by John G. Kemeny and Thomas E. Kurtz. It is a general-purpose,
high-level programming language that enabled students in non-scientific fields
to use computers.
The Microprocessor
In 1968, Intel was founded by Gordon
Moore and Robert Noyce. In 1971, the company created the world's first commercial
microprocessor chip, the Intel 4004. The chip and its successors revolutionized
electronics and made Intel a household name around the world.
The first personal computer came in 1975, when Ed Roberts created the Altair 8800. The Altair used Intel's
8080 microprocessor, which was revolutionary for the time. The first units had a
maximum of 8 kilobytes of memory, a 2 MHz CPU clock speed and initially no disk
storage.
The same year, Bill Gates and Paul Allen
founded Microsoft. They released a BASIC interpreter for the Altair 8800, giving it
its first programming language. Suddenly, anyone with an Altair could
write their own programs, which was a real breakthrough.
Microsoft BASIC became the
foundational software product of the Microsoft Company. It evolved into a line of
BASIC interpreters and compilers adapted for many different microcomputers.
A year later – in 1976 – Steve Jobs and Steve Wozniak founded the Apple Computer Company and released the Apple I. It was an 8-bit desktop computer designed by Wozniak and was the first Apple product. The Apple I went on sale in July 1976 at a price of US$666.66.
Four years later, Microsoft bought QDOS
(Quick and Dirty Operating System) from Tim Paterson for $50,000. QDOS was tweaked
to become MS-DOS, which was licensed to IBM and became the dominant operating
system of the 1980s.
In 1981, Xerox released the Xerox Star,
which used a GUI, or graphical user interface. It consisted of graphical
elements such as windows, menus, radio buttons, and checkboxes. Apple, IBM and
Microsoft later borrowed many of Xerox's ideas for their own products (source: DigiBarn Computer Museum).
Microsoft shipped the first
version of Windows in 1985. By 2000, Windows was the operating system for 95 per cent of the
market.
The Open Source Movement
In that same year, 1985, Richard Stallman founded the
Free Software Foundation to support the free software movement. He believed
software should be distributed under copyleft ("share alike") terms. Later,
in 1991, Linux was released by Linus Torvalds. Linux became one of the most prominent
examples of free and open-source software collaboration: its source code can be
used, modified and distributed, commercially or non-commercially, by anyone.
Then came the internet era. Between 1969
and 1990, the internet was being developed, but only for government and academic
use. From 1991 to 1993, three congressional bills made it widely available to
the general public.
Mosaic, the first widely downloaded
Internet browser, was launched in 1993. In 1998, the Google search engine went
live, revolutionizing how people find information online. Then, in 2001, Wikipedia
was launched, paving the way for collective web content generation and
democratizing information.
Social Media
Social media was born in 1997 with Six Degrees, followed by, among others, MySpace (2003),
Facebook (2004), YouTube (2005), Twitter (2006), Instagram (2010) and, most recently,
TikTok.
Today, social media has at least 4.7 billion users worldwide.
From social media came the smartphone era
in 2007, when the iPhone put a supercomputer in the hands of the masses. Today,
no fewer than 6.6 billion people are walking around with supercomputers
(mobile phones) in their pockets.
So what is the future of digital technology?
As we speak, the Digital Revolution
is complete. We are now entering the Fourth Industrial Revolution – an era of
rapid change to technology, industries, and societal patterns driven by increasing
interconnectivity and smart automation.
This is a period that will be marked by
breakthroughs in emerging technologies such as artificial intelligence,
virtual reality, gene editing, quantum computing, the internet of things, blockchain
technology and web3.
Some people call this approaching
period the Imagination Age, where creativity and imagination become the primary
creators of economic value.
The rise of immersive virtual reality
will automatically raise the value of the "imagination work" done by
designers, artists, and creatives.
As we look ahead and anticipate what the future of digital technology holds, you had better start positioning yourself to be among the first beneficiaries of these emerging and changing trends.
Baoriat Agencies is committed to helping you find the best
place for you to settle in Eldoret town. We walk you through the entire process
of acquiring your own property in Eldoret until it has been transferred into
your hands.
To learn more about buying a property in Eldoret,
Call 0721-554937
WhatsApp https://wa.me/0721-554937
Email evekibet@gmail.com or
Visit us at Juma Hajee Building room number 16, Eldoret town
Follow our Facebook page Baoriat Agencies for the latest deals