Cassius Downs

Computing: A History of Progress for AI

Updated: Jun 17

Why: Artificial Intelligence (AI) is a term for application programs that can perform tasks which were, or still are, done manually by humans; the name refers to this capability to take over human tasks. The definition of AI has evolved over the years and will continue to evolve as the capabilities and uses of AI change.


Before there were electronic machines like today's computers, people performed calculations with something to write with and something to write on. In ancient times, that meant a pointed stylus and soft clay. Over time, the abacus was invented to speed up manual arithmetic. More recently, people wrote with paper and ink, pencils, and chalk on chalkboards, and new tools were invented: the mechanical calculator for simple math and the slide rule for higher math. Exceptional people who could perform complex calculations in their heads without tools were called 'computers'.

Abacus drawing (released into the public domain by the copyright holder)
A mechanical slide rule for math calculations (licensed under the GNU Free Documentation License)
Collection of mechanical calculators (image courtesy of https://en.wikipedia.org/wiki/User:Ezrdr)

The advent of electronic machines marked a transformative moment in the history of computing. The first such machines, designed to replace human 'computers', were built in the UK (1943) and the USA (1945). These early electronic marvels were the precursors of the large mainframe machines (1950s to the present) that we are familiar with today. Their capabilities expanded exponentially, revolutionizing the way we compute, but that power came at a significant cost, making them expensive to buy and operate.

Room of mainframe computers (source: https://www.ibm.com/blog/blending-mainframe-power-into-the-cloud-computing-landscape/)

Alongside the advancement of the machines, the way we program them progressed from intricate combinations of numbers and letters (machine language) to higher-level languages built from words and numbers (more than 300 programming languages exist today). As these higher-level languages emerged and were adopted, they demanded more processing power, time, and resources from the mainframe, which meant high usage costs and long execution times. This constrained what applications could do within time and budget limits. As processing power improved, those constraints eased.
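
To make the gap between those levels concrete, here is a minimal sketch in Python (the post names no particular language, so Python is just an illustrative choice): the standard-library `dis` module prints the low-level, numbered instructions that sit beneath a one-line, human-readable function.

```python
import dis

# A high-level line of code: readable words and symbols.
def add(a, b):
    return a + b

# dis prints the lower-level instructions the Python interpreter
# actually executes: numbered offsets, opcodes, and operands,
# much closer to the "numbers and letters" early programmers wrote.
dis.dis(add)
```

Running it shows opcodes such as LOAD_FAST for each argument followed by an addition instruction, a small taste of the machine-level detail that every program ultimately reduces to.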


The introduction of cloud computing greatly expanded the processing power and data storage available, reducing the time required to execute new and more complex applications. Moore's Law, based on the computer architecture of its day, forecast that processing power would roughly double every two years. Newer computer architectures have multiplied performance further, by a factor of thirty or more, and continue to scale. These improvements compound over time, allowing us to build massive, complex applications, some of which are called artificial intelligence.
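
As a rough illustration of that compounding, here is a minimal Python sketch assuming a clean doubling every two years, as Moore's Law describes (real hardware gains are less tidy):

```python
# Illustrative model only: assume processing power doubles every
# two years, per Moore's Law as described above.
def relative_power(years, doubling_period=2):
    """Processing power relative to today's baseline after `years` years."""
    return 2 ** (years / doubling_period)

for years in (2, 10, 20, 40):
    print(f"After {years:2d} years: {relative_power(years):,.0f}x baseline")
```

Under this simple model, forty years of doubling yields roughly a million-fold gain (2 to the 20th power), which is why improvements that look incremental year to year become transformative over decades.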
