The history of computers

By Zakeriya Ali
17 November, 2023

Over the last few decades, computers have become such an intrinsic part of our lives that we barely even stop to think about them anymore. But the machine has such a fascinating history....

COVER STORY

The term “computer” refers to an electronic device that processes data according to the input provided by a user. Computers have become an integral part of our lives and are of paramount importance in this dynamic information age.

It is easy to erroneously equate the computer with a physical desktop or laptop, though compact devices such as smartwatches, smartphones, and tablets also fall under the ambit of computing. A computer is an amalgamation of physical and virtual components that are seamlessly synchronized with one another and ultimately function as a single unit.

From numbers to machines

The origins of the concepts underpinning the modern computer can be traced back to the Islamic golden age, an era during which thousands of Muslim scholars bestowed their knowledge upon subsequent generations and altered the trajectory of the work undertaken by later experts and pioneers.

The Abbasid polymath Muhammad ibn Musa al-Khwarizmi is held in high esteem by visionary entrepreneurs and industry experts alike for his contributions to mathematics, which paved the way for subsequent generations of scientists to build upon his acumen and introduce efficient, powerful computers. Al-Khwarizmi is credited with laying out theories of algebra and the algorithm, concepts regarded as foundation stones of modern computing. He also laboured to incorporate the concept of zero into mathematics. Little did he know that this would come to underpin binary notation, the language through which commands and information are conveyed to computers.
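As a simple present-day illustration of that binary language (an example of the principle only, not anything from al-Khwarizmi’s own work), every number and character a computer handles is ultimately stored as a pattern of zeros and ones:

```python
# A present-day illustration of binary notation: numbers and characters
# are ultimately stored as patterns of zeros and ones.
for n in (5, 42, 255):
    print(f"{n:>3} -> {n:08b}")      # e.g. 42 -> 00101010
print(f"'A' -> {ord('A'):08b}")      # the character 'A' -> 01000001
```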

In the mid-17th century, the French mathematician Blaise Pascal toiled over the idea of a primitive calculator capable of undertaking rudimentary calculations, in a bid to assist his father, a tax supervisor, in his day-to-day operations. The machine was an attempt to automate recurring, cumbersome operations. In total, about 50 prototypes were assembled and marketed, though the device failed to gain traction due to its bulk and its inability to execute more involved mathematical operations. These limited features, coupled with the manufacturer’s tendency to target a conservative customer base, contributed to its short-lived commercial success. Some experts dismiss Pascal’s primitive prototype as merely an intuitive experiment to investigate the possible functions and impact of such a device.

Gottfried Wilhelm Leibniz, a German philosopher, tweaked and improved the mechanism behind the Pascaline and introduced his own digital mechanical calculator. The ensuing decades were characterized by advances in myriad fields, including large-scale manufacturing, that prompted industry leaders to forsake antiquated practices in favour of novel ones, primarily for the monetary gains associated with the latter.

In 1820, the French inventor Charles Xavier Thomas de Colmar devised the arithmometer, an instrument that extended the rudimentary functions performed by earlier prototypes. The arithmometer was an instant hit and resonated with the masses due to its distinct features and practicality, remaining a bestselling product for around 90 years. It could perform all four fundamental arithmetic operations, a task once considered insurmountable.

Automation and versatility

Though the concept of automation, to one extent or another, remained confined to arithmetic and calculation-based operations, the invention of such machines was a stroke of ingenuity. Numerous other inventors tried their luck in this sophisticated field but failed to leave a significant footprint. However, Charles Babbage, an individual of supreme intellect, embarked upon a journey to upend the status quo. Babbage capitalized upon monetary grants accorded by the British government and the validation of the eminent Royal Society. He envisioned a device proficient enough to automate the mind-numbing calculations used in astronomy and navigation, aiming to spare the government’s kitty by introducing a machine that could assist sailors on their journeys and avert the disastrous ramifications of inaccurate navigation tables. Babbage incorporated the concept of built-in memory – a feature that is synonymous with modern-day computing devices – and his machine could tabulate polynomial values, a capability well ahead of its time. It could operate upon multiple datasets and inputs, albeit with only a single operation applied to them.

Unlike later machines that operate on binary values, Babbage’s design worked with discrete decimal digits. During the manufacturing process, Babbage hit multiple deadlocks: he locked horns with Joseph Clement, the technician tasked with assembling the machine, and the government grew reluctant to keep up its monetary grants. Consequently, no fully functional prototype was assembled during his lifetime and he abandoned his brainchild. However, in 1991 a team at London’s Science Museum leveraged Babbage’s writings and manifested his dream into reality by assembling a functional prototype of the machine, dubbed the difference engine.
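The principle the difference engine mechanized can be sketched in a few lines of modern code. The Python snippet below is a loose illustration rather than a description of Babbage’s actual mechanism, and the quadratic it tabulates is an arbitrary example: once the leading differences of a polynomial are known, every further value in its table can be produced using additions alone – exactly the kind of repetitive work that gears and levers could take over.

```python
# A loose illustration of the finite-difference principle the difference
# engine mechanized; the quadratic below is an arbitrary example, not one
# of Babbage's actual navigation or astronomy tables.

def leading_differences(seed):
    """Reduce the seed values to their column of leading differences."""
    table = [list(seed)]
    while len(table[-1]) > 1:
        prev = table[-1]
        table.append([b - a for a, b in zip(prev, prev[1:])])
    return [column[0] for column in table]

def tabulate(diffs, steps):
    """Extend the table using additions only, as the engine's mechanism did."""
    diffs = list(diffs)
    values = []
    for _ in range(steps):
        values.append(diffs[0])
        for i in range(len(diffs) - 1):
            diffs[i] += diffs[i + 1]   # each column absorbs the one below it
    return values

poly = lambda x: 2 * x**2 + 3 * x + 1          # hypothetical example polynomial
seed = [poly(x) for x in range(3)]             # a quadratic needs three seed values
print(tabulate(leading_differences(seed), 6))  # [1, 6, 15, 28, 45, 66]
```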

In the 1830s, Babbage embarked upon another arduous journey, this time to build a device that could perform versatile operations upon multitudinous inputs – a stark departure from the difference engine. Babbage adopted punch cards, which served as the input, the memory unit, and the output unit. Ada Lovelace, regarded as the world’s first computer programmer, befriended Babbage and assisted him in his quest to realize the analytical engine. However, the highly complex design and operational demands of the device ultimately crippled Babbage’s vision of a multifaceted machine tailored to the customized requirements of its users.

Inventors based in the US began to catch up with the innovative products envisioned by their European counterparts, and in the 1880s Herman Hollerith was tapped by the US Census Office to automate the tabulation of census data by developing a machine that ran on the medium Babbage had championed: punch cards.

Inroads to more complexity

The dawn of the 20th century ushered in a new era of computing. The industry giant IBM – the International Business Machines Corporation – traces its origins to 1911, when its predecessor, the Computing-Tabulating-Recording Company (CTR), was formed; the company took the IBM name after a restructuring in 1924. In 1930, Vannevar Bush, an engineer at the Massachusetts Institute of Technology, developed the differential analyser, a device used to solve the complex differential equations that dismayed and frustrated engineers and scientists due to their impenetrable nature. The device was far from stellar, owing to its colossal size and the complex operations required to run it.

The commencement of the Second World War paved the path for the sprawling military-industrial complex to make great strides in technology. Numerous computing products were introduced during this era, including the Harvard Mark I, a pioneering electromechanical digital computer that could process both positive and negative values as well as trigonometric functions, and required minimal human intervention during processing.

In 1936, Alan Turing introduced the Turing machine, a hypothetical device that reads and writes symbols on a tape according to a table of rules embedded in it. Turing imagined the machine as a versatile device possessing unbounded storage. The Turing machine remains a foundation stone upon which modern-day computing and programming rest.
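By way of illustration, here is a minimal Python sketch of the idea – a hypothetical toy rule table, not Turing’s original formulation – in which a tiny machine scans a tape of bits and flips each one until it reaches a blank cell:

```python
# A toy Turing machine simulator (a hypothetical illustration, not
# Turing's own formulation). The rule table flips every bit on the
# tape and halts at the first blank cell.
from collections import defaultdict

def run(tape, rules, state="start", blank=" "):
    """Apply rules of the form (state, symbol) -> (write, move, next_state)."""
    cells = defaultdict(lambda: blank, enumerate(tape))  # unbounded tape
    head = 0
    while state != "halt":
        write, move, state = rules[(state, cells[head])]
        cells[head] = write
        head += 1 if move == "R" else -1
    return "".join(cells[i] for i in sorted(cells)).strip()

flip_bits = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
    ("start", " "): (" ", "R", "halt"),
}

print(run("1011", flip_bits))   # -> 0100
```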

The services and products envisaged by numerous distinguished intellectuals were utilized to overpower the technological might enjoyed by the Nazi forces. Alan Turing and his team played a pivotal role in devising a mechanism that could break the notorious Enigma code, which the German forces used to cloak their communications and evade detection. The Colossus computer was built during the Second World War, under the so-called Ultra effort, to counteract the threat posed by the encrypted messages generated by the German Lorenz cipher machines. Meanwhile, with the invention of machines such as the ABC and the Z3 in the late 1930s and early 1940s, a shift towards digital computing began to emerge; the core operational concept of these devices was tied to Boolean logic, a principle sketched below. Like their predecessors, these devices could perform only limited operations, and the hunt for a digital computer formidable enough to perform diversified operations led to the development of ENIAC. ENIAC was built to assist the US armed forces during the war and was extensively utilized by scientists working on the post-war nuclear program.
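To give a flavour of what “tied to Boolean logic” means, the snippet below is a present-day sketch of the principle rather than the actual circuitry of the ABC or the Z3: an AND gate and an XOR gate are enough to add two binary digits, and early digital machines built their arithmetic out of chains of just such gates.

```python
# A half adder built from Boolean AND and XOR gates (a present-day
# sketch of the principle, not the actual circuitry of the ABC or Z3).

def half_adder(a: int, b: int) -> tuple[int, int]:
    """Add two bits and return (sum_bit, carry_bit)."""
    return a ^ b, a & b   # XOR yields the sum bit, AND yields the carry

for a in (0, 1):
    for b in (0, 1):
        s, carry = half_adder(a, b)
        print(f"{a} + {b} -> sum {s}, carry {carry}")
```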

Entering the modern era

By 1947, every computer then in operation occupied large swathes of space and comprised many thousands of components. Machines were prone to sudden operational hindrances because of their reliance on vacuum and cathode ray tubes. That year, a novel electronic component called the transistor was assembled at Bell Laboratories by a trio of scientists. The transistor led to the development of integrated circuits and microprocessors that elevated the efficiency of computers manifold. The San Francisco Bay Area, later dubbed Silicon Valley, became a breeding ground for a host of startups, including HP, Apple, and Intel, that evolved into industry giants.

In the 1960s, IBM introduced the System/360, a landmark family of computers built around transistorized circuitry rather than vacuum tubes. But hardware is only one aspect of computing; software, including the operating system, is another. The first modern programming language was sketched out by Konrad Zuse, laying the groundwork for those that followed, and in the decades to come the FORTRAN, COBOL, ALGOL, and BASIC programming languages were developed. In the 1960s, IBM also released OS/360, one of the first commercial operating systems. In 1971, Intel introduced the Intel 4004 microprocessor, which altered the dynamics of computer architecture; the other contenders in this field were Motorola and MOS Technology, and all three companies commanded a noteworthy market presence. The arrival of the microprocessor accelerated the growth of the personal computer market, but while IBM and Altair introduced their versions of personal computers, none of them was configured for the layman user.

In 1976, Steve Jobs and Steve Wozniak founded Apple Computer, a company that sprouted from Jobs’s family garage and leapfrogged ahead in the personal computer industry when it introduced the Apple II in 1977, challenging the dominance of industry giants like IBM and Xerox. In 1979, Xerox invested in Apple and, in return, Jobs negotiated access to the Xerox PARC research and development facility. Xerox had developed graphical user interface software – a departure from the arcane command-line interface – but had failed to commercialize it. Apple capitalized upon this user-friendly graphical interface and incorporated it into the Lisa, a high-end workstation aimed at industry professionals. In the years to follow, Apple ironed out the quirks the Lisa displayed and churned out the Macintosh, a user-friendly computer that permeated all classes of society.

In 1975, Bill Gates and Paul Allen founded Microsoft, a company that thrived on the software business. Microsoft wrote application software, such as Word and Excel, for tech giants like Apple and IBM, while also developing its own operating systems, MS-DOS and later Windows.

The World Wide Web was opened to commercial use in 1991 and refashioned the information age by opening new avenues for the general public. In the years that followed, search engines such as Yahoo and Google materialized, letting users trawl millions of pages on the web, along with browsers from the likes of Mozilla for the convenience of users. Numerous businesses established their presence on the Internet via e-commerce sites, eBay and Amazon being two notable examples. Cyberspace truly put the world one click away, and the term “global village” took on new currency in the same era.

Apple, after surviving near-bankruptcy, embarked upon a journey to redeem itself and, in order to turn its ambitions into reality, introduced innovative products and services such as the iMac, iPod, iTunes, and the popular iPhone, which transformed and disrupted industries ranging from music to telecommunications. To counteract the influence of the iOS operating system, Google developed the Android operating system for handheld smart devices, which is now used by numerous smartphone manufacturers.

The road ahead

Development in the dynamic field of computing is ongoing. The introduction of 5G technology, quantum computing, and supercomputers has truly revolutionized the field, and the future holds bright prospects for those who wish to capitalize on the knowledge and skills associated with it.

I would like to end with a quote from the writer Arthur C. Clarke: “Any sufficiently advanced technology is indistinguishable from magic.”