The evolution of computing and software engineering can be traced through the changing focus of the field's research. The collected article titles reveal distinct periods, each grappling with new challenges and building on prior advances.
Mid-1960s: Laying the Foundations for Language and Parallelism
The mid-1960s saw researchers intensely focused on the fundamentals of programming languages and the emerging concept of parallelism. Titles from 1966, such as "State table analysis of programs in an ALGOL-like language" and "On facilitating parallel and multiprocessing in ALGOL," indicate a deep engagement with formal language analysis and the practical challenge of enabling multiple computations to run concurrently within the popular ALGOL language. The scope broadened slightly with "A language for describing the functions of synchronous systems," suggesting an early interest in defining and managing system behavior beyond purely sequential programs. By 1969, this interest in parallelism had evolved into simulating complex interactions, as seen in "On simulating networks of parallel processes in which simultaneous events may occur." Less technical, the 1968 title "Letters to the editor: on improving the quality of our technical meetings" offers a glimpse into the nascent community's early efforts to organize itself and refine its knowledge exchange.
Early to Mid-1970s: Structuring Software and Managing Concurrency
Moving into the early to mid-1970s, the field shifted its attention from simply enabling parallel operations to systematically managing their interactions and designing software more robustly. A key theme became modularity and decomposition, highlighted by "On the Criteria To Be Used in Decomposing Systems into Modules" (1972) and "A Technique for Software Module Specification with Examples" (1972). This represented a notable shift towards defining clear boundaries and interfaces within complex systems. Concurrency control remained a vital area, evolving into more specific problems like those addressed in "Concurrent Control with 'Readers' and 'Writers'" (1971) and "Comment on Deadlock Prevention Method" (1972), underscoring the practical challenges of sharing resources and preventing deadlock. By 1975, the focus on system design was further cemented with "Use of the Concept of Transparency in the Design of Hierarchically Structured Systems," emphasizing clarity and comprehensibility in complex architectures. The theoretical underpinnings remained relevant, as shown by "On a Solution to the Cigarette Smoker's Problem (Without Conditional Statements)" and "Significant Event Simulation," which explore foundational problems in concurrent programming and system modeling.
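To make the readers-and-writers problem concrete: any number of threads may read a shared resource simultaneously, but a writer requires exclusive access. The sketch below is a minimal Python rendering of the readers-preference solution described in the 1971 paper, using locks where the original used semaphores; the class and method names are our own invention, not from the source.

```python
import threading

class ReadersWriterLock:
    """Readers-preference readers-writer lock: any number of
    concurrent readers, but writers get exclusive access."""

    def __init__(self):
        self._readers = 0                        # count of active readers
        self._counter_lock = threading.Lock()    # guards the reader count
        self._resource_lock = threading.Lock()   # held by a writer, or by the readers as a group

    def acquire_read(self):
        with self._counter_lock:
            self._readers += 1
            if self._readers == 1:               # first reader locks writers out
                self._resource_lock.acquire()

    def release_read(self):
        with self._counter_lock:
            self._readers -= 1
            if self._readers == 0:               # last reader lets writers back in
                self._resource_lock.release()

    def acquire_write(self):
        self._resource_lock.acquire()            # exclusive: no readers, no other writers

    def release_write(self):
        self._resource_lock.release()
```

Because readers take priority here, a steady stream of readers can starve a waiting writer; the original paper acknowledges exactly this trade-off and presents a second solution that favors writers instead.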
Mid-1980s to Early 1990s: Software Engineering for Critical Systems
The mid-1980s ushered in a period in which software was increasingly applied to high-stakes domains, leading to a heightened emphasis on reliability and safety. The 1983 reprint of "A Technique for Software Module Specification with Examples" suggests the enduring relevance of the modularity principles of the 1970s, now perhaps seen as crucial for building robust software. The landscape shifted dramatically with titles like "Software Aspects of Strategic Defense Systems" (1985), indicating software's central role in critical national defense systems. This new context spurred research into rigorous requirements for performance and correctness, evidenced by "On Synchronization in Hard-Real-Time Systems" (1988), where timing constraints are paramount. The ultimate concern in such applications was safety, culminating in "Evaluation of Safety-Critical Software" (1990), which underscored the need for verifiable, dependable software in environments where failure could have severe consequences.
Late 1990s to Early 2000s: Professionalization and Societal Impact
As computing permeated more aspects of daily life, the late 1990s and early 2000s saw a period of introspection for the software engineering discipline, coupled with increasing public awareness and concern. The title "Software Engineering: An Unconsummated Marriage" (1997) reflects a candid assessment of the field's maturity and perhaps its struggle to consistently deliver on its promises. The widespread "Y2K" issue brought software reliability into mainstream consciousness, as seen in "Ten Myths About Y2K Inspections" (1999), signaling a public and industry-wide focus on software quality. The broader societal impact of technology became a topic of discussion with "Computers: boon or bane?" (2001), indicating a growing awareness of both the positive and negative implications of computing. Concurrently, the desire for professional standards emerged, highlighted by "Licensing software engineers in Canada" (2002), showcasing efforts to formalize and regulate the profession.
Mid-2000s to Late 2010s: Navigating Risks and Ethical Dimensions
The most recent period, spanning the mid-2000s to the late 2010s, is characterized by a strong focus on risk management, security, and the ethical implications of pervasive technology. Early concerns included the veracity of digital information, as reflected in "Wikipedia risks" (2005). The conversation broadened to systemic risks within the computing ecosystem, such as "Which is riskier: OS diversity or OS monopoly?" (2007), and potentially to the pitfalls of quantitative metrics in "Stop the numbers game" (2007). The importance of disciplined development practices was reiterated in "Risks of undisciplined development" (2010), while "The risks of stopping too soon" (2011) implied concerns about incomplete projects or inadequate testing. This emphasis on risk culminated in a direct assessment of a burgeoning technology, "The real risks of artificial intelligence" (2017). The period closes with "An interview with Dave Parnas" (2018), which, given Parnas's seminal contributions to software engineering and his long-standing advocacy for professional responsibility, likely offers a reflective, experienced perspective on these evolving challenges of risk, quality, and the ethical deployment of technology.