A View of 20th and 21st Century Software Engineering
ABSTRACT
George Santayana's statement, "Those who cannot remember the past are condemned to repeat it," is only half true. The past also includes successful histories. If you haven't been made aware of them, you're often condemned not to repeat their successes. In a rapidly expanding field such as software engineering, this happens a lot. Extensive studies of many software projects such as the Standish Reports offer convincing evidence that many projects fail to repeat past successes. This paper tries to identify at least some of the major past software experiences that were well worth repeating, and some that were not. It also tries to identify underlying phenomena influencing the evolution of software engineering practices that have at least helped the author appreciate how our field has gotten to where it has been and where it is. A counterpart Santayana-like statement about the past and future might say, "In an era of rapid change, those who repeat the past are condemned to a bleak future." (Think about the dinosaurs, and think carefully about software engineering maturity models that emphasize repeatability.) This paper also tries to identify some of the major sources of change that will affect software engineering practices in the next couple of decades, and identifies some strategies for assessing and adapting to these sources of change. It also makes some first steps towards distinguishing relatively timeless software engineering principles that are risky not to repeat, and conditions of change under which aging practices will become increasingly risky to repeat.
1. INTRODUCTION
One has to be a bit presumptuous to try to characterize both the past and future of software engineering in a few pages. For one thing, there are many types of software engineering: large or small; commodity or custom; embedded or user-intensive; greenfield or legacy/COTS/reuse-driven; homebrew, outsourced, or both; casual-use or mission-critical. For another thing, unlike the engineering of electrons, materials, or chemicals, the basic software elements we engineer tend to change significantly from one decade to the next.
Fortunately, I’ve been able to work on many types and generations of software engineering since starting as a programmer in 1955. I’ve made a good many mistakes in developing, managing, and acquiring software, and hopefully learned from them. I’ve been able to learn from many insightful and experienced software engineers, and to interact with many thoughtful people who have analyzed trends and practices in software engineering. These learning experiences have helped me a good deal in trying to understand how software engineering got to where it is and where it is likely to go. They have also helped me in trying to distinguish between timeless principles and obsolete practices for developing successful software-intensive systems. In this regard, I am adapting the definition of “engineering” in [147] to define engineering as “the application of science and mathematics by which the properties of software are made useful to people.” The phrase “useful to people” implies that the relevant sciences include the behavioral sciences, management sciences, and economics, as well as computer science.
In this paper, I’ll begin with a simple hypothesis: software people don’t like to see software engineering done unsuccessfully, and try to make things better. I’ll try to elaborate this into a high-level decade-by-decade explanation of software engineering’s past. I’ll then identify some trends affecting future software engineering practices, and summarize some implications for future software engineering researchers, practitioners, and educators.
2. A Hegelian View of Software Engineering’s Past
The philosopher Hegel hypothesized that increased human understanding follows a path of thesis (this is why things happen the way they do); antithesis (the thesis fails in some important ways; here is a better explanation); and synthesis (the antithesis rejected too much of the original thesis; here is a hybrid that captures the best of both while avoiding their defects). Below I’ll try to apply this hypothesis to explaining the evolution of software engineering from the 1950’s to the present.
2.1 1950’s Thesis: Software Engineering Is Like Hardware Engineering
When I entered the software field in 1955 at General Dynamics, the prevailing thesis was, “Engineer software like you engineer hardware.” Everyone in the GD software organization was either a hardware engineer or a mathematician, and the software being developed was supporting aircraft or rocket engineering. People kept engineering notebooks and practiced such hardware precepts as “measure twice, cut once,” before running their code on the computer. This behavior was also consistent with 1950’s computing economics. On my first day on the job, my supervisor showed me the GD ERA 1103 computer, which filled a large room. He said, “Now listen. We are paying $600 an hour for this computer and $2 an hour for you, and I want you to act accordingly.” This instilled in me a number of good practices such as desk checking, buddy checking, and manually executing my programs before running them. But it also left me with a bias toward saving microseconds when the economic balance started going the other way.
The most ambitious information processing project of the 1950’s was the development of the Semi-Automated Ground Environment (SAGE) for U.S. and Canadian air defense. It brought together leading radar engineers, communications engineers, computer engineers, and nascent software engineers to develop a system that would detect, track, and prevent enemy aircraft from bombing the U.S. and Canadian homelands. Figure 1 shows the software development process developed by the hardware engineers for use in SAGE [1]. It shows that sequential waterfall-type models have been used in software development for a long time. Further, if one arranges the steps in a V form with Coding at the bottom, this 1956 process is equivalent to the V-model for software development. SAGE also developed the Lincoln Labs Utility System to aid the thousands of programmers participating in SAGE software development. It included an assembler, a library and build management system, a number of utility programs, and aids to testing and debugging. The resulting SAGE system successfully met its specifications with about a one-year schedule slip. Benington’s bottom-line comment on the success was “It is easy for me to single out the one factor that led to our relative success: we were all engineers and had been trained to organize our efforts along engineering lines.”
Another indication of the hardware engineering orientation of the 1950’s is in the names of the leading professional societies for software professionals: the Association for Computing Machinery and the IEEE Computer Society.
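To see why those habits made sense then, and why the bias they instilled later became counterproductive, it helps to work the arithmetic of that $600-per-hour machine and $2-per-hour programmer. The sketch below is a back-of-the-envelope illustration only; the 1955 rates come from the anecdote above, while the one-hour review time and the "modern" rates are assumptions made here for comparison, not data from the paper.

```python
# Back-of-the-envelope check on machine-time vs. people-time economics.
# The 1955 rates are the ones quoted above; the modern rates and the
# one-hour review time are illustrative assumptions.

def breakeven_machine_seconds(machine_rate_per_hr: float,
                              human_rate_per_hr: float,
                              review_hours: float = 1.0) -> float:
    """Machine seconds a review must save to pay for its own labor cost."""
    review_cost = human_rate_per_hr * review_hours        # dollars spent on checking
    return (review_cost / machine_rate_per_hr) * 3600.0   # machine time worth that much

print(breakeven_machine_seconds(600.0, 2.0))  # 1955: ~12 seconds of saved machine time
print(breakeven_machine_seconds(1.0, 50.0))   # assumed modern ratio: 180000 s = 50 hours
```

At the 1955 ratio, an hour of desk checking pays for itself if it avoids roughly twelve seconds of wasted machine time; once the ratio inverts, the same hour has to save on the order of fifty machine-hours, which is why the economic balance eventually pushed effort away from saving microseconds and toward saving people's time.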
2.2 1960’s Antithesis: Software Crafting
By the 1960’s, however, people were finding out that software phenomenology differed from hardware phenomenology in significant ways. First, software was much easier to modify than was hardware, and it did not require expensive production lines to make product copies. One changed the program once, and then reloaded the same bit pattern onto another computer, rather than having to individually change the configuration of each copy of the hardware. This ease of modification led many people and organizations to adopt a “code and fix” approach to software development, as compared to the exhaustive Critical Design Reviews that hardware engineers performed before committing to production lines and bending metal (measure twice, cut once). Many software applications became more people-intensive than hardware-intensive; even SAGE became more dominated by psychologists addressing human-computer interaction issues than by radar engineers.
Figure 1. The SAGE Software Development Process (1956): Operational Plan → Machine Specifications → Operational Specifications → Program Specifications → Coding Specifications → Coding → Parameter Testing (Specifications) → Assembly Testing (Specifications) → Shakedown → System Evaluation.
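The V-model equivalence claimed in Section 2.1 can be made concrete by folding these 1956 steps into a V with Coding at the vertex. The pairing sketched below is my reading of Figure 1, not something Benington states; Machine Specifications, being a hardware artifact, is left off the software V.

```python
# The 1956 SAGE steps folded into a V: each specification step on the left leg
# is paired with the test or evaluation step on the right leg that verifies it,
# with Coding at the vertex. The pairings are an illustrative interpretation of
# Figure 1, not part of the original source.
sage_v_model = [
    ("Operational Plan",           "System Evaluation"),
    ("Operational Specifications", "Shakedown"),
    ("Program Specifications",     "Assembly Testing"),
    ("Coding Specifications",      "Parameter Testing"),
    ("Coding",                     "Coding"),  # vertex of the V
]

for left_leg, right_leg in sage_v_model:
    print(f"{left_leg:27} <-> {right_leg}")
```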
Another software difference was that software did not wear out. Thus, software reliability could only imperfectly be estimated by hardware reliability models, and “software maintenance” was a much different activity than hardware maintenance. Software was invisible; it didn’t weigh anything, but it cost a lot. It was hard to tell whether it was on schedule or not, and if you added more people to bring it back on schedule, it just got later, as Fred Brooks explained in The Mythical Man-Month [42]. Software generally had many more states, modes, and paths to test, making its specifications much more difficult. Winston Royce, in his classic 1970 paper, said, “In order to procure a $5 million hardware device, I would expect a 30-page specification would provide adequate detail to control the procurement. In order to procure $5 million worth of software, a 1500-page specification is about right in order to achieve comparable control.” [132]
Another problem with the hardware engineering approach was that the rapid expansion of demand for software outstripped the supply of engineers and mathematicians. The SAGE program began hiring and training humanities, social sciences, foreign language, and fine arts majors to develop software. Similar non-engineering people flooded into software development positions for business, government, and services data processing. These people were much more comfortable with the code-and-fix approach. They were often very creative, but their fixes often led to heavily patched spaghetti code. Many of them were heavily influenced by 1960’s “question authority” attitudes and tended to march to their own drummers rather than those of the organization employing them. A significant sub