Humble Origins
Computing is generally thought of as a 20th-century revolution, but basic computers were being invented as early as the 1820s. A British mathematician named Charles Babbage designed the “Difference Engine,” which used a system of number wheels and gears to calculate polynomial functions. Think of it as an early mechanical calculator. Software as we know it, programs stored in the memory of digital computers, did not exist prior to the 1940s.
World War II and Beyond
Modern software development is commonly thought to have begun during World War II as a means of handling the complex tactical and engineering problems of the war effort; however, it wasn’t until 1941, partway through the war, that Konrad Zuse completed the Z3, a programmable digital computer capable of doing just that.
In 1943, the Colossus computer was developed for the British government to help decrypt German teleprinter messages, becoming one of the first machines designed to break encrypted communications.
The ’50s and ’60s
Standardized programming languages emerged in the 1950s, when it became necessary to run existing software on newer hardware. By the 1960s, software that had previously been bundled free with new computers was being produced and sold to end users in its own right. Then 1971 brought the first commercial microprocessor, the Intel 4004, which allowed computers to become smaller without sacrificing processing power.
Stay tuned for next week, when we will finish our look at the history of software development. In the meantime, contact Galalee Software Solutions in Port Orchard for all of your software development and cloud hosting needs.