Software engineering has evolved steadily from its founding days in the 1940s to the present day in the 2000s, and applications have evolved along with it. The ongoing effort to improve technologies and practices seeks to raise the productivity of practitioners and the quality of the applications delivered to users.
Overview
There are a number of areas where the evolution of software engineering is notable:
Emergence as a profession: By the early 1980s, software engineering had emerged as a bona fide profession, standing alongside computer science and traditional engineering. See also software engineering professionalism.
Role of women: In the 1940s, 1950s, and 1960s, men often filled the more prestigious and better-paying hardware engineering roles and often delegated the writing of software to women. Grace Hopper, Jamie Fenton, and many other unsung women filled many programming jobs during the first several decades of software engineering. Today, far fewer women work in software engineering than in other professions; the cause is not clearly identified and is often attributed to sexual discrimination, cyberculture, or bias in education. Many academic and professional organizations consider this imbalance a problem and are working to correct it.
Processes: Processes have become a central part of software engineering. They are hailed for their potential to improve software, but sharply criticized for their potential to constrict programmers.
Cost of hardware: The relative cost of software versus hardware has changed substantially over the last 50 years. When mainframes were expensive and required large support staffs, the few organizations buying them also had the resources to fund large, expensive custom software engineering projects. Computers are now much more numerous and much more powerful, which has several effects on software. The larger market can support large projects to create commercial off the shelf software, as done by companies such as Microsoft. The cheap machines allow each programmer to have a terminal capable of fairly rapid compilation. The programs in question can use techniques such as garbage collection, which make them easier and faster for the programmer to write. On the other hand, many fewer organizations are interested in employing programmers for large custom software projects, instead using commercial off the shelf software as much as possible.
The Pioneering Era
The most important development was that new computers were coming out almost every year or two, rendering existing ones obsolete. Software people had to rewrite all their programs to run on these new machines. Programmers did not have computers on their desks and had to go to the "machine room". Jobs were run by signing up for machine time or by operational staff. Jobs were run by putting punched cards for input into the machine's card reader and waiting for results to come back on the printer.
The field was so new that the idea of management by schedule was non-existent. Making predictions of a project's completion date was almost impossible. Computer hardware was application-specific. Scientific and business tasks needed different machines. Due to the need to frequently translate old software to meet the needs of new machines, high-order languages like FORTRAN, COBOL, and ALGOL were developed. Hardware vendors gave away systems software for free as hardware could not be sold without software. A few companies sold the service of building custom software but no software companies were selling packaged software.
The notion of reuse flourished. As software was free, user organizations commonly gave it away. Groups like IBM's scientific user group SHARE offered catalogs of reusable components. Academia did not yet teach the principles of computer science. Modular programming and data abstraction were already being used in programming.
1945 to 1965: The origins
The term software engineering first appeared in the late 1950s and early 1960s. Programmers have always known about civil, electrical, and computer engineering and debated what engineering might mean for software.
The NATO Science Committee sponsored two conferences on software engineering, in 1968 (Garmisch, Germany) and 1969 (Rome, Italy), which gave the field its initial boost. Many believe these conferences marked the official start of the profession of software engineering.
1965 to 1985: The software crisis
Software engineering was spurred by the so-called software crisis of the 1960s, 1970s, and 1980s, which identified many of the problems of software development. Many software projects ran over budget and schedule, some caused property damage, and a few caused loss of life. The software crisis was originally defined in terms of productivity, but evolved to emphasize quality. Some used the term software crisis to refer to their inability to hire enough qualified programmers.
Cost and Budget Overruns: The OS/360 operating system was a classic example. This decade-long project from the 1960s eventually produced one of the most complex software systems of its time. OS/360 was one of the first large software projects, with roughly 1,000 programmers. Fred Brooks claims in The Mythical Man-Month that he made a multi-million dollar mistake by not developing a coherent architecture before starting development.
Property Damage: Software defects can cause property damage. Poor software security allows hackers to steal identities, costing time, money, and reputations.
Life and Death: Software defects can kill. Some embedded systems used in radiotherapy machines failed so catastrophically that they administered lethal doses of radiation to patients. The most famous of these failures is the Therac-25 incident.
Peter G. Neumann has kept a contemporary list of software problems and disasters. The term software crisis has slowly faded from use, partly because it is unrealistic to remain in crisis mode for more than 20 years. Software engineers are accepting that the problems of the field are genuinely difficult and that only sustained work over many decades can solve them.
1985 to 1989: No silver bullet
For decades, solving the software crisis was paramount to researchers and to companies producing software tools. Seemingly every new technology and practice from the 1970s to the 1990s was trumpeted as a silver bullet that would solve the crisis. Tools, discipline, formal methods, process, and professionalism were all touted as silver bullets:
Tools: Tools received particular emphasis: structured programming, object-oriented programming, CASE tools, Ada, Java, documentation, standards, and the Unified Modeling Language were all promoted as silver bullets.
Discipline: Some pundits argued that the software crisis was due to the lack of discipline of programmers.
Formal methods: Some believed that if formal engineering methodologies were applied to software development, then producing software would become as predictable as other branches of engineering. They advocated proving all programs correct.
Process: Many advocated the use of defined processes and methodologies like the Capability Maturity Model.
Professionalism: This led to work on a code of ethics, licenses, and professionalism.
In 1986, Fred Brooks published the No Silver Bullet article, arguing that no individual technology or practice would ever make a 10-fold improvement in productivity within 10 years.
Debate about silver bullets raged over the following decade. Advocates for Ada, components, and processes continued arguing for years that their favorite technology would be a silver bullet. Skeptics disagreed. Eventually, almost everyone accepted that no silver bullet would ever be found. Yet, claims about silver bullets pop up now and again, even today.
Some interpret no silver bullet to mean that software engineering failed. The search for a single key to success never worked. All known technologies and practices have only made incremental improvements to productivity and quality. Yet, there are no silver bullets for any other profession, either. Others interpret no silver bullet as proof that software engineering has finally matured and recognized that projects succeed due to hard work.
However, it could also be said that a range of partial silver bullets exists today, including lightweight methodologies (see "Project management"), spreadsheet calculators, customized browsers, in-site search engines, database report generators, integrated design-test coding editors with memory/differences/undo, and specialty shops that generate niche software, such as information websites, at a fraction of the cost of fully customized website development. Nevertheless, the field of software engineering appears too complex and diverse for any single "silver bullet" to improve most issues, and each issue accounts for only a small portion of all software problems.
1990 to 1999: Prominence of the Internet
The rise of the Internet led to very rapid growth in the demand for international information display and e-mail systems on the World Wide Web. Programmers were required to handle illustrations, maps, photographs, and other images, plus simple animation, at a rate never before seen, with few well-known methods to optimize image display and storage (such as the use of thumbnail images).
The growth of browsers, built around HTML, changed the way information display and retrieval were organized. Widespread network connections led to the growth of international computer viruses on MS Windows computers, and to efforts to prevent them, while the vast proliferation of spam e-mail became a major design issue for e-mail systems, flooding communication channels and requiring semi-automated pre-screening. Keyword-search systems evolved into web-based search engines, and many software systems had to be redesigned for international searching, depending on search engine optimization (SEO) techniques. Human natural-language translation systems were needed to handle information flowing in multiple foreign languages, and many software systems were designed for multi-language usage based on design concepts from human translators. Typical user bases went from hundreds or thousands of users to, often, many millions of international users.
2000 to Present: Lightweight Methodologies
With the expanding demand for software in many smaller organizations, the need for inexpensive solutions led to the growth of simpler, faster methodologies that could take running software from requirements to deployment more quickly and easily. The use of rapid prototyping evolved into entire lightweight methodologies, such as Extreme Programming (XP), which attempted to simplify many areas of software engineering, including requirements gathering and reliability testing, for the growing number of small software systems. Very large software systems still used heavily documented methodologies with many volumes in the documentation set, but smaller systems gained a simpler, faster alternative approach to managing the development and maintenance of software calculations and algorithms, information storage/retrieval, and display.
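As a small illustration of the lightweight, test-centred style these methodologies encourage, here is a minimal Python sketch of an XP-style, test-first unit test. The apply_discount function and its behaviour are hypothetical and not taken from any particular project; in XP the tests would be written before the code they exercise.

    import unittest

    def apply_discount(price, percent):
        # Hypothetical production code, written to make the tests below pass.
        return round(price * (1 - percent / 100.0), 2)

    class DiscountTest(unittest.TestCase):
        # Small automated tests like these double as lightweight requirements.
        def test_ten_percent_off(self):
            self.assertEqual(apply_discount(200.0, 10), 180.0)

        def test_no_discount(self):
            self.assertEqual(apply_discount(99.99, 0), 99.99)

    if __name__ == "__main__":
        unittest.main()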
Current trends in software engineering
Software engineering is a young discipline, and is still developing. The directions in which software engineering is developing include:
1- Aspects
Aspects help software engineers deal with quality attributes by providing tools to add or remove boilerplate code from many areas in the source code. Aspects describe how all objects or functions should behave in particular circumstances. For example, aspects can add debugging, logging, or locking control into all objects of particular types. Researchers are currently working to understand how to use aspects to design general-purpose code. Related concepts include generative programming and templates.
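To illustrate the idea only (not any particular aspect-oriented tool such as AspectJ), here is a minimal Python sketch in which a cross-cutting logging concern is woven around an ordinary function by a decorator; the transfer_funds function and its fee calculation are hypothetical.

    import functools
    import logging

    logging.basicConfig(level=logging.INFO)

    def logged(func):
        # The "aspect": cross-cutting logging added around any wrapped function.
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            logging.info("entering %s", func.__name__)
            result = func(*args, **kwargs)
            logging.info("leaving %s", func.__name__)
            return result
        return wrapper

    @logged
    def transfer_funds(amount):
        # Core business logic stays free of logging boilerplate.
        return amount * 0.99  # hypothetical fee calculation

    transfer_funds(100)

The core function never mentions logging; the concern can be added to, or removed from, many functions in one place, which is the point of the aspect approach.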
2- Agile
Agile software development guides software development projects that evolve rapidly with changing expectations and competitive markets. Proponents of this method believe that heavy, document-driven processes (like TickIT, CMM and ISO 9000) are fading in importance. Some people believe that companies and agencies export many of the jobs that can be guided by heavy-weight processes. Related concepts include Extreme Programming, Scrum, and Lean software development.
3- Experimental
Experimental software engineering is a branch of software engineering interested in devising experiments on software, collecting data from those experiments, and deriving laws and theories from the data. Proponents argue that the nature of software is such that our knowledge of it can be advanced only through experiments.
4- Model-driven
Model Driven Design develops textual and graphical models as primary design artifacts. Development tools are available that use model transformation and code generation to generate well-organized code fragments that serve as a basis for producing complete applications.
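As a minimal sketch of the model-to-code idea (not the output of any real model-driven development tool), the following Python fragment treats a small dictionary as the "model" and generates a class skeleton from it; the Customer entity and its fields are hypothetical.

    # The "model": a declarative description of an entity, not handwritten code.
    model = {"name": "Customer", "fields": ["id", "name", "email"]}

    def generate_class(model):
        # A toy model transformation: emit a Python class skeleton as text.
        lines = ["class {0}:".format(model["name"])]
        params = ", ".join(model["fields"])
        lines.append("    def __init__(self, {0}):".format(params))
        for field in model["fields"]:
            lines.append("        self.{0} = {0}".format(field))
        return "\n".join(lines)

    print(generate_class(model))

Real tools work from richer models (for example UML) and generate far more than skeletons, but the division of labour is the same: the model is the primary artifact, and the code is derived from it.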
5- Software Product Lines
Software Product Lines is a systematic way to produce families of software systems, instead of creating a succession of completely individual products. This method emphasizes extensive, systematic, formal code reuse, to try to industrialize the software development process.
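A minimal Python sketch of the product-line idea, assuming a hypothetical family of text-editor products: shared assets are reused across the family, and each product is simply a selection of features rather than a separate codebase.

    # Shared assets reused across the whole product family.
    def core_editing(text):
        return text.strip()

    def spell_check(text):
        return text  # placeholder for a real shared spell-checking asset

    def pdf_export(text):
        return "[PDF] " + text

    # Each product is a configuration of shared features.
    PRODUCTS = {
        "basic": [core_editing],
        "professional": [core_editing, spell_check, pdf_export],
    }

    def build_output(product, text):
        for feature in PRODUCTS[product]:
            text = feature(text)
        return text

    print(build_output("professional", "  hello world  "))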
The Future of Software Engineering (FOSE) track, held at ICSE 2000, documented the state of the art of software engineering at the time and listed many problems to be solved over the following decade. The FOSE tracks at ICSE 2000 and ICSE 2007 also help identify the state of the art in software engineering.
Software engineering today
The profession is still trying to define its boundary and content. The Software Engineering Body of Knowledge (SWEBOK) was published as an ISO technical report, ISO/IEC TR 19759, in 2005.
In 2006, Money Magazine and Salary.com rated software engineering as the best job in America in terms of growth, pay, stress levels, flexibility in hours and working environment, creativity, and how easy it is to enter and advance in the field.