Research: The Evolution of Software Development Tools and the Changing Role of the Software Engineer, revised by Claude
This material reflects my opinions and not those of my employers.
Overview: A Recurring Promise
For more than six decades, each new generation of development tools has arrived with a bold promise: this time, we’ve made coding so simple that you won’t need skilled engineers anymore. It has never been true. What has happened instead is more interesting: each wave of abstraction raised the floor of what was possible, which raised the ceiling of what was expected, which in turn sustained the demand for skilled practitioners. The skills those practitioners needed, however, kept evolving.
This research traces that pattern across eight significant eras, from machine code to AI-assisted development. I’ve had the unique advantage of experiencing all of them over the past three decades.
Era 1: Machine Code & Assembly Language (1940s–Early 1980s)
What it was:
The earliest programmers worked directly with binary machine code — sequences of 0s and 1s that corresponded to specific CPU instructions. Assembly language arrived as a thin layer of abstraction, replacing binary opcodes with human-readable mnemonics (e.g., MOV AX, 1 instead of 10110000 00000001).
Tools and environment:
- Programming was done on punch cards or paper tape, submitted to operators, then retrieved hours or days later
- Programs ran on mainframes (IBM System/360) or minicomputers (PDP-8, PDP-11)
- Debuggers were primitive; errors were found by reading printouts
- Memory management, register allocation, and hardware specifics were entirely the programmer’s responsibility
Role of the Software Engineer: The “programmer” of this era was a rare, highly specialized operator who understood both the problem domain and the inner workings of the machine. The role was so technical it was considered closer to electrical engineering than to any kind of application design. A deep understanding of hardware was non-negotiable.
The promise that never came: There was no mainstream promise that this era would “end programming” — it was understood as deeply expert work. That narrative would come later.
Era 2: High-Level Compiled Languages (1950s–1980s)
The shift: The invention of the compiler changed everything. Instead of writing machine-specific instructions, a programmer could write in a higher-level language and have a separate program — the compiler — translate it into machine code.
Key milestones:
- 1954–57: FORTRAN (FORmula TRANslation) — Developed by John Backus at IBM, FORTRAN was the first commercially viable high-level language. Designed for scientific and mathematical computation. Introduced loops, conditionals, and subroutines.
- 1959: COBOL — Championed by Grace Hopper and designed for business data processing. Introduced English-like syntax with the explicit goal of making programs readable by non-technical managers. The promise: business people could read, maybe even write, programs.
- 1964: BASIC — Designed at Dartmouth specifically to be learnable by non-science students. A direct, early attempt at democratizing programming.
- 1969–72: C — Developed at Bell Labs by Dennis Ritchie. Became the dominant systems programming language, bridging high-level expressiveness with low-level control. Virtually all modern operating systems trace their lineage to C.
- 1970s: Pascal — Designed for teaching structured programming; became a staple in university curricula through the 1980s.
Role of the Software Engineer: High-level languages made programmers significantly more productive and widened the talent pool. Code could now be written and maintained by people who didn’t need to know every detail of the underlying hardware. However, performance-critical systems (OS kernels, device drivers, real-time systems) still required assembly language expertise alongside higher-level skills.
The SE role began to formalize. The term “software engineering” itself emerged from a 1968 NATO conference addressing what was called the software crisis — the recognition that software projects were chronically late, over budget, and unreliable.
The promise: COBOL’s English-like syntax was partly sold on the premise that business managers could read and eventually write it themselves — reducing dependence on programmers. This did not happen. The logic of programs remained complex regardless of how readable the syntax was.
Era 3: 4th Generation Languages (4GLs) & CASE Tools (Late 1970s–Early 1990s)
The shift: If 3rd generation languages (3GLs) abstracted the hardware, 4th generation languages attempted to abstract the logic — letting users describe what they wanted rather than how to get it.
Key milestones:
- 1981: James Martin’s Application Development Without Programmers — The foundational text of the 4GL movement, explicitly arguing that these tools would eliminate the need for professional programmers.
- Key 4GL products: FOCUS, NOMAD, RAMIS, NATURAL, PowerBuilder, and eventually Microsoft Access and FileMaker.
- SQL (1974, widespread 1980s): A 4GL for database queries — ask what you want, not how to retrieve it. One of the most enduring and genuinely successful 4GL-style languages.
- CASE Tools (mid-1980s to early 1990s): Computer-Aided Software Engineering tools promised to generate code automatically from visual diagrams and models. The U.S. government invested heavily in CASE. A 1993 GAO assessment found “little evidence that CASE tools can improve software quality or productivity.”
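SQL’s “what, not how” distinction is easy to see side by side. Here is a minimal sketch using Python’s built-in sqlite3 module; the orders table, its columns, and its data are hypothetical, invented purely for illustration:

```python
import sqlite3

# In-memory database with a hypothetical "orders" table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?)",
    [("east", 100.0), ("west", 250.0), ("east", 50.0)],
)

# Declarative (4GL-style): state WHAT result you want.
total = conn.execute(
    "SELECT SUM(amount) FROM orders WHERE region = 'east'"
).fetchone()[0]

# Imperative (3GL-style): spell out HOW to compute the same result.
total_by_hand = 0.0
for region, amount in conn.execute("SELECT region, amount FROM orders"):
    if region == "east":
        total_by_hand += amount

print(total, total_by_hand)  # → 150.0 150.0
```

The first query delegates filtering and aggregation to the database engine; the second reimplements both by hand — exactly the division of labor the 4GL movement tried to generalize beyond databases.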
Role of the Software Engineer: 4GLs did democratize certain categories of work — business analysts could build reports, query databases, and create simple data-entry forms without traditional programming skills. But as soon as applications required complex business logic, performance, or integration with other systems, the 4GL-generated code proved brittle, inefficient, and hard to maintain.
The net effect: 4GLs expanded the total amount of software being written, which created more demand for professional SEs to build and maintain the complex systems that non-programmers couldn’t handle. A new class of specialist — the database developer — emerged.
The promise that failed: “Application development without programmers” became something closer to “application development with a different kind of programmer.”
Era 4: Object-Oriented Programming (Late 1980s–Mid 1990s)
The shift: Object-Oriented Programming (OOP) reorganized how programs were structured — around objects (data + behavior bundled together) rather than procedures. The ideas traced back to Simula (1967) and Smalltalk (1972), but went mainstream in the late 1980s and dominated the 1990s.
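The bundling of data and behavior, plus polymorphism across a shared interface, can be sketched in a few lines of Python; the Shape, Rectangle, and Circle names are illustrative, not drawn from any system discussed here:

```python
import math

class Shape:
    """Base class defining a shared interface (data + behavior bundled)."""
    def area(self) -> float:
        raise NotImplementedError

class Rectangle(Shape):          # inheritance: reuses the Shape interface
    def __init__(self, w: float, h: float):
        self._w, self._h = w, h  # encapsulation: state behind the interface
    def area(self) -> float:
        return self._w * self._h

class Circle(Shape):
    def __init__(self, r: float):
        self._r = r
    def area(self) -> float:
        return math.pi * self._r ** 2

# Polymorphism: one call site handles any Shape subtype uniformly.
shapes = [Rectangle(2, 3), Circle(1)]
total = sum(s.area() for s in shapes)
print(round(total, 2))  # → 9.14
```

The caller never inspects which concrete type it holds; adding a new Shape subtype requires no change to the summing code, which is the reuse argument the paradigm was sold on.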
Key milestones:
- 1985: C++ released — Bjarne Stroustrup’s extension of C with OOP features became the first widely adopted commercial OOP language. Used heavily in finance, games, operating systems, and embedded systems.
- 1986: First OOPSLA conference — Attended by 1,000 people, signaling OOP’s arrival as a major paradigm.
- 1991: Visual Basic 1.0 — Microsoft’s event-driven, GUI-based RAD tool, combining OOP concepts with a drag-and-drop form designer. Brought Windows application development to a much wider audience.
- 1995: Java — James Gosling at Sun Microsystems. “Write once, run anywhere” via the Java Virtual Machine (JVM). The platform-independence promise was enormously significant in the emerging internet age. Became the dominant enterprise and web backend language of the late 1990s–2000s.
- 1995: JavaScript — Brendan Eich at Netscape, designed in 10 days. Became the language of the browser and, much later, the most widely deployed programming language in the world.
- 1991/1994: Python — Guido van Rossum released Python 0.9.0 in 1991; Python 1.0 followed in January 1994. Its emphasis on readability and simplicity attracted developers from scientific, academic, and non-traditional programming backgrounds.
- 1995: Delphi — Borland’s RAD tool for Pascal, competing directly with Visual Basic. Highly productive for Windows GUI development.
Role of the Software Engineer: OOP raised the conceptual bar. SEs now needed to understand design patterns, inheritance hierarchies, polymorphism, and encapsulation. The complexity of systems grew substantially. In response, the discipline formalized: Rochester Institute of Technology introduced the first BS degree in Software Engineering in 1996, and formal methodologies (Rational Unified Process, UML diagramming) gained wide adoption.
The era also saw the emergence of specialized roles: database administrator, front-end developer, systems architect, and QA engineer became distinct positions within software teams.
The promise: OOP was sold as enabling radical reusability — write a class once, use it forever. In practice, reuse was far harder to achieve than the theory suggested. The software industry got large-scale component libraries and frameworks, but the “reuse crisis” that OOP was supposed to solve largely persisted.
Era 5: The Web, Scripting Languages & the Internet Boom (Mid 1990s–Early 2000s)
The shift: The World Wide Web (Tim Berners-Lee, 1991) transformed software from a product you shipped on disk to a service delivered over a network. This created an entirely new class of development work overnight.
Key milestones:
- 1993–94: Mosaic/Netscape — Web browsers made the internet accessible to non-technical users and created the first “web developer” role.
- HTML/CSS — Markup and styling languages that empowered designers (not just programmers) to build web pages. Another democratizing force.
- Perl (1987, web adoption mid-1990s) — Larry Wall released Perl 1.0 on December 18, 1987. It initially served as a Unix reporting and text-processing tool, but gained its reputation as a web language in the mid-1990s as the dominant CGI scripting language.
- PHP (1994–95) — Rasmus Lerdorf’s server-side scripting language designed specifically for web development. Together with Perl, PHP powered the first generation of dynamic websites. Both languages are interpreted (not compiled), fast to write, and forgiving of errors — but not always kind to maintainability.
- ASP, ColdFusion — Microsoft and Allaire’s platforms brought web development to enterprise teams and corporate developers already familiar with Visual Basic and SQL.
- The Dot-Com Boom (1995–2000) — Demand for web developers exploded. The definition of “software developer” expanded rapidly to include people who had never written a compiled program.
Role of the Software Engineer: This era massively expanded the population of people calling themselves developers. The skill barrier to entry dropped: an HTML/CSS/JavaScript developer needed no formal CS education. Web development shops hired designers who learned PHP, marketers who learned basic scripting, and business analysts who built their own tools.
At the same time, the back-end of internet infrastructure (databases, networks, security, scalability) became exponentially more complex, requiring advanced expertise. The role bifurcated: a large, accessible “web tier” and a demanding, specialized “infrastructure tier.”
The dot-com bust (2000–2002) sharply contracted the industry, but the underlying infrastructure and skills base survived and were reused in the subsequent decade’s more sustainable build-out.
Era 6: Frameworks, Agile, and Open Source (2000s)
The shift: The 2000s were defined less by new languages and more by how software was built. Frameworks, methodologies, and the open-source movement fundamentally changed the SE’s daily workflow.
Key milestones:
- 2001: The Agile Manifesto — 17 practitioners signed a document rejecting heavyweight, waterfall-style processes in favor of iterative development, collaboration, and responsiveness to change. Agile became the dominant methodology of the decade.
- 2004: Ruby on Rails — David Heinemeier Hansson’s framework introduced convention over configuration, allowing developers to build database-backed web applications dramatically faster. Frameworks like Spring (Java), Django (Python), and .NET MVC followed the same philosophy.
- 2005: Git — Linus Torvalds created Git for Linux kernel development. Distributed version control replaced centralized systems (CVS, SVN) and, with GitHub’s launch in 2008, became the universal standard for collaborative software development.
- .NET Framework and managed code — Microsoft’s platform (launched 2002) brought garbage collection, type safety, and a rich standard library to Windows development, reducing a whole category of memory-management bugs that had plagued C/C++ developers.
- Open source explosion — SourceForge (1999), then GitHub (2008) normalized the sharing and reuse of code at a global scale. The concept of building on open-source libraries and frameworks rather than writing everything from scratch became standard practice.
Role of the Software Engineer: The Agile era redefined the SE’s role from a solitary craftsperson to a collaborative team member. Daily stand-ups, sprint planning, retrospectives, pair programming, and test-driven development (TDD) became common practices. The “full-stack developer” concept emerged — expected to work across front-end, back-end, and database layers.
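Test-driven development, mentioned above, inverts the traditional order: tests are written first, then code is implemented until they pass. A minimal sketch with Python’s unittest module — the slugify function and its tests are hypothetical examples:

```python
import unittest

def slugify(title: str) -> str:
    """Function under test: lowercase a title and join words with hyphens.
    In TDD, the tests below would be written before this body exists."""
    return "-".join(title.lower().split())

class TestSlugify(unittest.TestCase):
    def test_lowercases(self):
        self.assertEqual(slugify("Hello"), "hello")

    def test_joins_words_with_hyphens(self):
        self.assertEqual(slugify("Hello World"), "hello-world")

# Run the suite programmatically so the script keeps executing afterward.
suite = unittest.TestLoader().loadTestsFromTestCase(TestSlugify)
result = unittest.TextTestRunner(verbosity=0).run(suite)
print(result.testsRun, len(result.failures))  # → 2 0
```

The failing tests act as an executable specification; the implementation is grown until the suite is green, then refactored with the tests as a safety net.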
The open-source shift also changed the work: a larger fraction of SE time shifted from writing low-level code to integrating, configuring, and extending existing libraries and frameworks. Deep mastery of a specific platform became as valuable as general algorithmic skill.
Era 7: Cloud, DevOps & Mobile (2010s)
The shift: Infrastructure became software. The availability of cloud platforms (AWS launched 2006, Azure 2010, GCP 2011) and the smartphone revolution (iPhone 2007) created a new layer of complexity and possibility.
Key milestones:
- AWS and cloud computing — Developers could provision servers, databases, storage, and networking through code and APIs, without touching physical hardware. This gave small teams capabilities previously requiring large IT departments.
- DevOps movement — The boundary between “developer” and “operations” eroded. Continuous integration (CI), continuous delivery (CD), and infrastructure-as-code (Terraform, Ansible) became SE responsibilities.
- Docker (2013) and Kubernetes (2014) — Containerization abstracted deployment environments, enabling consistent behavior across development, testing, and production.
- iOS and Android platforms — Created entirely new development disciplines. Swift (2014) and Kotlin (2011) joined Objective-C and Java as primary mobile languages.
- npm, pip, Maven, Cargo — Package managers normalized the use of thousands of third-party libraries. A modern JavaScript project might pull in hundreds of dependencies.
- Low-code/no-code platforms (Salesforce, Appian, OutSystems) — The 2010s iteration of the 4GL promise: business users building apps without developers. Genuine productivity gains for structured enterprise use cases; limitations became apparent at scale or complexity.
- GitHub reaches 100M repos (2018) — Open source and community-driven development became a foundational assumption of the profession.
Role of the Software Engineer: The 2010s SE wore many hats: developer, tester, deployment engineer, security reviewer, data analyst, and product collaborator. “You build it, you run it” became a DevOps mantra. The expected skill surface of a professional SE expanded enormously.
The low-code/no-code wave again displaced some routine application development from professional SEs to business analysts and power users — and again, this freed senior SEs for more complex work rather than replacing them.
Era 8: AI-Assisted Development (2020s–Present)
The shift: Large Language Models (LLMs), trained on billions of lines of code, introduced a qualitatively new kind of coding tool — one that generates, explains, and refactors code in natural language dialogue.
Key milestones:
- 2021 (preview) / 2022 (GA): GitHub Copilot — Built on OpenAI’s Codex model, integrated directly into VS Code. Launched as a technical preview in June 2021; generally available June 2022. The first widely adopted AI pair programmer.
- 2022: ChatGPT — OpenAI’s conversational LLM launched November 30, 2022. Developers immediately began using it to answer architecture questions, debug code, write tests, and explain documentation at a level previously requiring a senior colleague.
- 2023: GPT-4 — Released March 14, 2023. A substantial capability leap over prior models, cementing the use of general-purpose LLMs as a routine part of the SE workflow.
- 2023–25: Proliferation of AI coding tools — Cursor, Windsurf, Amazon CodeWhisperer, Tabnine, Cody, Claude Code, and others. IDEs began integrating AI chat natively.
- 2025 adoption data (Stack Overflow Developer Survey, 49,000+ respondents): 84% of developers are using or planning to use AI coding tools. Over half of professional developers use them daily.
- GitHub Copilot reached 20 million cumulative users by July 2025, with adoption by 90% of Fortune 100 companies.
- AI writes ~41–46% of code accepted by Copilot users in active projects (GitHub, 2025). Note: this figure measures accepted Copilot suggestions as a share of total code committed in active Copilot sessions — not AI-originated code across all development broadly. Other studies cite 30–40% in similar contexts; the precise figure depends on how “AI-written” is defined and measured.
Productivity research findings: Caveat: several of the studies below are vendor-commissioned or conducted by companies with a commercial interest in positive results (Microsoft, Google, GitHub). They are widely cited in industry media but should be read with that context in mind. Primary source links are provided in the Sources section.
- Multi-company study (Microsoft, Accenture): 26% average productivity increase for developers using Copilot.
- Google internal RCT (~100 engineers): tasks completed ~21% faster with AI assistance (96 min vs. 114 min).
- Less experienced developers saw the largest gains: 35–39% speedup. Senior developers saw 8–16%.
- A 2024 MIT field experiment confirmed productivity gains for coding tasks — but noted gains were smaller on complex, novel problems.
Code quality concerns:
- GitClear analysis (211M changed lines, 2020–2024): refactoring share fell from 25% to under 10%; copy-pasted (“cloned”) code rose from 8.3% to 12.3%.
- Developer trust (Stack Overflow 2025): 29% of respondents say they trust AI outputs to be accurate (down from 40% in 2024). A separate question found 33% expressing trust and 46% expressing active distrust — these measure differently-framed questions in the same survey and should not be treated as complementary halves. The consistent direction across all framings: trust is falling. Positive sentiment toward AI tools dropped from ~70% (2023) to ~60% (2025).
Role of the Software Engineer: AI tools are accelerating an ongoing shift: SEs spend less time on boilerplate, syntax recall, and routine implementation, and more time on problem framing, architecture, code review, integration, testing strategy, and business logic. The human role is moving toward direction, judgment, and verification rather than raw code production.
As with every prior era, the tools lower the barrier for simple tasks — but simultaneously raise the ceiling of what production-grade systems require, and create new categories of complexity (AI reliability, prompt engineering, model evaluation, ethical considerations in automated systems).
The recurring pattern holds. The question is no longer whether AI will eliminate SEs, but how the SE role will evolve in response — just as it did when compilers replaced assembly, when OOP replaced procedural code, and when the cloud replaced on-premises infrastructure.
Cross-Cutting Analysis: The Pattern Across Six Decades
| Era | Tool / Paradigm | The “End of Programmers” Promise | What Actually Happened |
|---|---|---|---|
| 1950s–60s | COBOL, FORTRAN | Business managers could read/write programs | New specialists needed; programming formalized |
| 1981 | 4GLs (FOCUS, NOMAD) | Application Development Without Programmers | Created new programmer categories; complex logic still required experts |
| Mid-1980s | CASE Tools | Automated code generation from diagrams | GAO 1993: “Little evidence of improvement” |
| Early–mid 1990s | RAD tools (VB, Delphi) | Drag-and-drop replaces coding | Widened developer pool; professional SEs moved up the stack |
| Mid-1990s | Internet / HTML / PHP | Designers and analysts can build web apps | Created massive new demand for all tiers of development |
| 2000s | Frameworks (Rails, Spring) | Convention over configuration speeds everything | Developers became more productive; expectations rose proportionally |
| 2010s | Low-code / No-code | Business users build their own apps | Valid for structured use cases; SEs still needed for everything complex |
| 2020s | AI coding assistants | Natural language replaces code | Productivity gains confirmed; demand for senior SE judgment increasing |
Key observation: Each abstraction layer democratized a prior tier of complexity, which elevated what was expected of professional SEs. The profession has never contracted in response to new tools — it has consistently re-defined itself around the new opportunities those tools created.
Notable Quotes & Data Points
“Little evidence yet exists that CASE tools can improve software quality or productivity.” — U.S. General Accounting Office (GAO), 1993
“Application Development Without Programmers” — Title of James Martin’s 1981 book on 4GLs
“Write once, run anywhere.” — Sun Microsystems tagline for Java, 1995
“84% of developers are using or planning to use AI tools.” — Stack Overflow Developer Survey, 2025 (49,000+ respondents)
“AI writes approximately 41–46% of code in Copilot-assisted projects.” — GitHub, 2025
“Newer developers saw 35–39% speedup with AI; senior developers saw 8–16%.” — Multi-company GitHub Copilot research, 2024–25
“Only 29% of respondents say they trust AI outputs to be accurate, down from 40% in 2024.” — Stack Overflow Developer Survey, 2025 (Note: related questions in the same survey show 33% trust / 46% active distrust on differently-framed items; all point in the same direction.)
Sources & Further Reading
- History of software engineering — Wikipedia
- Evolution of Software Development — GeeksforGeeks
- The Evolution of Software Developer Roles Across Decades — Medium
- The Evolution of Application Development: 50 Years of Innovation (1975–2025)
- Fourth-generation programming language — Wikipedia
- The Eternal Promise: A History of Attempts to Eliminate Programmers
- Is AI the 4GL we’ve been waiting for? — InfoWorld
- Object-Oriented Programming — Wikipedia
- Visual Basic: The Language That Brought Programming to the Masses — Build5Nines
- Rapid Application Development — Wikipedia
- The Impact of AI on Developer Productivity: Evidence from GitHub Copilot — arXiv
- AI Copilot Code Quality: 2025 Data — GitClear
- Top 100 Developer Productivity Statistics with AI Tools 2026 — Index.dev
- History of programming languages — Wikipedia
- Higher Level Languages — Computer History Museum