
Technological Evolution

Emerging Technological Risk

Abstract

Social interactions shape technology, allowing us to investigate how technology pervades work practices and, hence, to understand communities of practice with respect to technology. On the one hand, the social shaping of technology highlights the risk perception of technological evolution. On the other hand, technological evolution is a potential hazard that can disrupt work practices. However, it is possible to analyse and capture technology trajectories in order to understand, with respect to technological evolution, the design decisions and activities enabling engineering knowledge to grow. The review of different case studies uncovers multidisciplinary aspects of technology innovation. It highlights complex interactions affecting our ability to support technological evolution and, hence, technology innovation.


Notes

  1.

    Software engineering processes, for instance, involve the activities of software specification, design, implementation, validation and evolution [42]. Process models (e.g. waterfall, spiral) organise and relate these phases differently, according to underlying assumptions and application-domain constraints.
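As an illustration only (the encodings below are assumptions made for exposition, not taken from [42]), the same five activities can be related linearly, as in the waterfall model, or cyclically, as in risk-driven spiral iterations:

```python
# Illustrative sketch: the five software engineering activities of [42],
# related differently by two process models. The encodings are assumptions.

ACTIVITIES = ['specification', 'design', 'implementation',
              'validation', 'evolution']

# Waterfall: one linear pass through the activities.
waterfall = list(ACTIVITIES)

def spiral(cycles):
    """Spiral: repeated risk-driven cycles over the first four activities,
    with evolution emerging across iterations."""
    return [a for _ in range(cycles) for a in ACTIVITIES[:4]] + ['evolution']

print(spiral(2)[0])  # each cycle revisits 'specification' first
```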

  2.

    The acquisition, deployment and use of COTS systems are somewhat problematic due to limited accounts of, and guidance for assessing, human factors [12].

  3.

    It is necessary to enrich the semantic interpretation of the accessibility relation (i.e. dependency) between functional requirements by associating weights with each pair of related possible worlds, i.e. functional requirements. Intuitively, technical solutions (matching requirements) are accessible possibilities, or possible worlds, in the solution spaces available in the production environment. A solution space, therefore, is simply a collection of solutions, which represent the organisational engineering knowledge [44] resulting from the social shaping of technology [30]. The definition in [15] intentionally recalls the notion of possible worlds underlying Kripke models. Thus, solutions are Kripke models, whereas problems are formulas of (propositional) modal logic. Collections of problems (i.e. problem spaces) are suspected issues arising during system production. Kripke models (i.e. solutions) provide the semantics for interpreting the validity of (propositional) modalities (i.e. problems). Propositional modal logic allows us to express modalities. Formulas of propositional modal logic capture system properties like safety and liveness. For instance, consider the formula \({\square}\,P\,{\rightarrow}\,P\). It means that if a property \(P\) is valid at every accessible possible world, then it is also valid at the real world. It represents a simple safety property stating that 'nothing bad ever happens'. Another example is the formula \({\square}\,P \,{\rightarrow}\, {\diamondsuit}\, P\). It means that if the property \(P\) is valid at every accessible possible world, then it will eventually be valid. It represents a simple liveness property stating that 'something good eventually happens'.
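The possible-worlds semantics recalled above can be made concrete with a minimal evaluator. This is an illustrative sketch only: the names (`KripkeModel`, `holds`) and the example model are assumptions made for exposition, not taken from [15]. A Kripke model is a set of worlds, an accessibility relation and a valuation; the box and diamond modalities quantify over accessible worlds.

```python
# A minimal sketch of Kripke semantics for propositional modal logic.
# All names and the example model are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class KripkeModel:
    worlds: set       # possible worlds (e.g. technical solutions)
    access: dict      # world -> set of accessible worlds (dependencies)
    valuation: dict   # world -> set of atomic propositions true there

    def holds(self, formula, w):
        """Evaluate a formula at world w. Formulas are tuples:
        ('atom', p), ('not', f), ('implies', f, g), ('box', f), ('diamond', f)."""
        op = formula[0]
        if op == 'atom':
            return formula[1] in self.valuation[w]
        if op == 'not':
            return not self.holds(formula[1], w)
        if op == 'implies':
            return (not self.holds(formula[1], w)) or self.holds(formula[2], w)
        if op == 'box':      # valid at every accessible world
            return all(self.holds(formula[1], v) for v in self.access[w])
        if op == 'diamond':  # valid at some accessible world
            return any(self.holds(formula[1], v) for v in self.access[w])
        raise ValueError(f'unknown operator: {op}')

# A reflexive example frame (every world accesses itself), on which
# the safety-style formula []P -> P holds at every world.
m = KripkeModel(
    worlds={'w0', 'w1'},
    access={'w0': {'w0', 'w1'}, 'w1': {'w1'}},
    valuation={'w0': {'P'}, 'w1': {'P'}},
)
safety = ('implies', ('box', ('atom', 'P')), ('atom', 'P'))
print(all(m.holds(safety, w) for w in m.worlds))  # True on this reflexive frame
```

On this frame the diamond formula is also satisfiable, since each world can reach a world where \(P\) holds.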

  4.

    Hazard and Operability Analysis (HAZOP) [26, 43] involves “identifying the interconnections between components within the system and determining the corresponding interactions” [43]. Component interactions identify flows, referred to as “entities”, having certain properties, or “attributes”, which define the system’s operation. Any deviation from these properties (i.e. attributes) highlights concerns with respect to the system’s operation [26, 43].
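The entity/attribute view above can be illustrated with a small, hypothetical sketch. The guidewords are the classic HAZOP ones, but the coolant-flow data, thresholds and function names below are invented for illustration and are not taken from [26, 43]:

```python
# Hypothetical sketch of guideword-driven deviation analysis: a flow
# ("entity") has "attributes", and deviations from the design intent
# flag operational concerns. Data and names are illustrative only.

GUIDEWORDS = ['NO', 'MORE', 'LESS', 'REVERSE']  # classic HAZOP guidewords

def deviations(entity, attributes, design_intent):
    """List candidate (entity, attribute, guideword) deviations."""
    found = []
    for attr, value in attributes.items():
        intent = design_intent[attr]
        if value == 0 and intent != 0:
            found.append((entity, attr, 'NO'))        # complete absence of flow
        elif value < 0 <= intent:
            found.append((entity, attr, 'REVERSE'))   # flow in the wrong direction
        elif value > intent:
            found.append((entity, attr, 'MORE'))      # quantitative excess
        elif value < intent:
            found.append((entity, attr, 'LESS'))      # quantitative deficit
    return found

# Illustrative flow between two components: coolant with flow rate and pressure.
print(deviations('coolant', {'flow_rate': 0, 'pressure': 12},
                 {'flow_rate': 5, 'pressure': 10}))
# [('coolant', 'flow_rate', 'NO'), ('coolant', 'pressure', 'MORE')]
```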

References

  1. Ackerman MS, Halverson CA (2000) Reexamining organizational memory. Commun ACM 43(1):59–64. doi:10.1145/323830.323845


  2. Ackerman MS, Halverson CA (2004) Organizational memory as objects, processes, and trajectories: an examination of organizational memory in use. Computer Supported Cooperative Work 13(2):155–189. doi:10.1023/B:COSU.0000045805.77534.2a


  3. Anderson S, Felici M (2001) Requirements evolution—from process to product oriented management. In: Bomarius F, Komi-Sirviö S (eds) Proceedings of the third international conference on product focused software process improvement, PROFES 2001, Springer, no. 2188 in LNCS, pp 27–41. doi:10.1007/3-540-44813-6_6

  4. Anderson S, Felici M (2002) Quantitative aspects of requirements evolution. In: Proceedings of the 26th annual international conference on computer software and applications conference, COMPSAC 2002, IEEE Computer Society, pp 27–32. doi:10.1109/CMPSAC.2002.1044529

  5. Avižienis A, Laprie J-C, Randell B, Landwehr C (2004) Basic concepts and taxonomy of dependable and secure computing. IEEE Trans Dependable Secur Comput 1(1):11–33. doi:10.1109/TDSC.2004.2


  6. Bishop P, Bloomfield R, Clement C, Guerra S (2003) Software criticality analysis of COTS/SOUP. Reliab Eng Syst Saf 81(3):291–301. doi:10.1016/S0951-8320(03)00093-0

  7. Bishop P, Bloomfield R, Clement T, Guerra S (2002) Software criticality analysis of COTS/SOUP. In: Anderson S, Bologna S, Felici M (eds) Proceedings of the 21st international conference on computer safety, reliability and security, SAFECOMP 2002, Springer, no. 2434 in LNCS, pp 198–211. doi:10.1007/3-540-45732-1_20

  8. Bishop P, Bloomfield R, Clement T, Guerra S, Jones C (2003) Integrity static analysis of COTS/SOUP. In: Anderson S, Felici M, Littlewood B (eds) Proceedings of the 22nd international conference on computer safety, Reliability and Security, SAFECOMP 2003, Springer, no. 2788 in LNCS, pp 63–76. doi:10.1007/978-3-540-39878-3_6

  9. Bloomfield R, Littlewood B (2003) Multi-legged arguments: the impact of diversity upon confidence in dependability arguments. In: Proceedings of the 2003 international conference on dependable systems and networks, DSN’03, IEEE Computer Society, pp 25–34. doi:10.1109/DSN.2003.1209913

  10. Bloomfield R, Littlewood B (2006) On the use of diverse arguments to increase confidence in dependability claims. In: Besnard D, Gacek C, Jones CB (eds) Structure for dependability: computer-based systems from an interdisciplinary perspective, Springer, Chap 13, pp 254–268. doi:10.1007/1-84628-111-3_13

  11. Bowker GC, Star SL (1999) Sorting things out: classification and its consequences. The MIT Press, Cambridge


  12. Bruseberg A (2006) The design of complete systems: providing human factors guidance for COTS acquisition. Reliab Eng Syst Saf 91(12):1554–1565. doi:10.1016/j.ress.2006.01.016

  13. Büscher M, Shapiro D, Hartswood M, Procter R, Slack R, Voß A, Mogensen P (2002) Promises, premises and risks: sharing responsibilities, working up trust and sustaining commitment in participatory design projects. In: Binder T, Gregory J, Wagner I (eds) Proceedings of the participatory design conference, PDC 2002, pp 183–192


  14. Douglas M, Wildavsky A (1982) Risk and culture: an essay on the selection of technological and environmental dangers. University of California Press, California


  15. Felici M (2004) Observational models of requirements evolution. PhD thesis, School of Informatics, University of Edinburgh, Edinburgh


  16. Felici M (2005) Evolutionary safety analysis: motivations from the air traffic management domain. In: Winther R, Gran BA, Dahll G (eds) Proceedings of the 24th international conference on computer safety, reliability and security, SAFECOMP 2005, Springer, no. 3688 in LNCS, pp 208–221. doi:10.1007/11563228_16

  17. Felici M (2006a) Capturing emerging complex interactions: safety analysis in air traffic management. Reliab Eng Syst Saf 91(12):1482–1493. doi:10.1016/j.ress.2006.01.010

  18. Felici M (2006b) Modeling safety case evolution—examples from the air traffic management domain. In: Guelfi N, Savidis A (eds) Proceedings of the 2nd international workshop on rapid integration of software engineering techniques, RISE 2005, Springer no. 3943 in LNCS, pp 81–96. doi:10.1007/11751113_7

  19. Felici M (2006c) Structuring evolution: on the evolution of socio-technical systems. In: Besnard D, Gacek C, Jones C (eds) Structure for dependability: computer-based systems from an interdisciplinary perspective, chap 3, Springer, pp 49–73. doi:10.1007/1-84628-111-3_3

  20. Halverson CA (2002) Activity theory and distributed cognition: or what does CSCW need to do with theories? Computer Supported Cooperative Work 11(1–2):243–267. doi:10.1023/A:1015298005381

  21. Hartswood M, Procter P, Slack R, Voß A, Büscher M, Rouncefield M, Rouchy P (2002) Co-realisation: towards a principled synthesis of ethnomethodology and participatory design. Scand J Inf Syst 14(2):9–30


  22. Hartswood M, Procter R, Slack R, Soutter J, Voß A, Rouncefield M (2002) The benefits of a long engagement: from contextual design to the co-realisation of work affording artefacts. In: Proceedings of NordiCHI, ACM, pp 283–286. doi:10.1145/572020.572066

  23. Hollnagel E (1993) Human reliability analysis: context and control. Academic Press, London


  24. Hughes AC, Hughes TP (eds) (2000) Systems, experts, and computers: the systems approach in management and engineering, World War II and after. The MIT Press, Cambridge


  25. Johnson CW (2006) What are emergent properties and how do they affect the engineering of complex systems? Reliab Eng Syst Saf 91(12):1475–1481. doi:10.1016/j.ress.2006.01.008

  26. Leveson NG (1995) SAFEWARE: system safety and computers. Addison-Wesley, London


  27. Littlewood B, Popov P, Strigini L (2001) Modeling software design diversity: a review. ACM Comput Surv 33(2):177–208. doi:10.1145/384192.384195


  28. Littlewood B, Wright D (2007) The use of multi-legged arguments to increase confidence in safety claims for software-based systems: a study based on a BBN analysis of an idealised example. IEEE Trans Softw Eng 33(5):347–365. doi:10.1109/TSE.2007.1002

  29. MacKenzie D (2001) Mechanizing proof: computing, risk, and trust. The MIT Press, Cambridge


  30. MacKenzie D, Wajcman J (eds) (1999) The social shaping of technology, 2nd edn. Open University Press, Buckingham


  31. MacKenzie DA (1990) Inventing accuracy: a historical sociology of nuclear missile guidance. The MIT Press, Cambridge


  32. MacKenzie DA (1996) Knowing machines: essays on technical change. The MIT Press, Cambridge


  33. Neumann PG (1995) Computer related risks. The ACM Press, New York


  34. Norman DA (1993) Things that make us smart: defining human attributes in the age of the machine. Perseus Books, Cambridge


  35. Perrow C (1999) Normal accidents: living with high-risk technologies. Princeton University Press, New Jersey


  36. Petroski H (1982) To engineer is human: the role of failure in successful design. Vintage Books, New York


  37. Petroski H (1994) Design paradigms: case histories of error and judgment in engineering. Cambridge University Press, Cambridge


  38. Popov P (2002) Reliability assessment of legacy safety-critical systems upgraded with off-the-shelf components. In: Anderson S, Bologna S, Felici M (eds) Proceedings of the 21st international conference on computer safety, reliability and security, SAFECOMP 2002, Springer, no. 2434 in LNCS, pp 139–150, doi:10.1007/3-540-45732-1_15

  39. Popov P, Littlewood B (2004) The effect of testing on reliability of fault-tolerant software. In: Proceedings of the 2004 international conference on dependable systems and networks, DSN’04, IEEE Comput Soc, pp 265–274. doi:10.1109/DSN.2004.1311896

  40. Smith SP, Harrison MD (2003) Reuse in hazard analysis: identification and support. In: Anderson S, Felici M, Littlewood B (eds) Proceedings of the 22nd international conference on computer safety, reliability and security, SAFECOMP 2003, Springer, no. 2788 in LNCS, pp 382–395. doi:10.1007/978-3-540-39878-3_30

  41. Smith SP, Harrison MD (2005) Measuring reuse in hazard analysis. Reliab Eng Syst Saf 89(1):93–104. doi:10.1016/j.ress.2004.08.010


  42. Sommerville I (2007) Software engineering, eighth edn. Addison-Wesley, Harlow


  43. Storey N (1996) Safety-critical computer systems. Addison-Wesley, Harlow


  44. Vincenti WG (1990) What engineers know and how they know it: analytical studies from aeronautical history. The Johns Hopkins University Press, Baltimore


  45. Voß A, Procter R, Slack R, Hartswood M, Williams R, Rouncefield M (2002) Accomplishing ‘just-in-time’ production. In: Johnson C (ed) Human decision making and control, GIST technical report G2002-1, pp 209–211


  46. Voß A, Slack R, Procter R, Williams R, Hartswood M, Rouncefield M (2002) Dependability as ordinary action. In: Anderson S, Bologna S, Felici M (eds) Proceedings of the 21st international conference on computer safety, reliability and security, SAFECOMP 2002, Springer, no. 2434 in LNCS, pp 32–43, doi:10.1007/3-540-45732-1_5

  47. Wallace DR, Kuhn DR (1999) Lessons from 342 medical device failures. In: Proceedings of the 4th IEEE international symposium on high-assurance systems engineering, HASE, IEEE Computer Society, pp 123–131


  48. Williams R, Stewart J, Slack R (2005) Social learning in technological innovation: experimenting with information and communication technologies. Edward Elgar, Cheltenham



Author information

Correspondence to Massimo Felici.


Copyright information

© 2012 Springer-Verlag London Limited

About this chapter

Cite this chapter

Anderson, S., Felici, M. (2012). Technological Evolution. In: Emerging Technological Risk. Springer, London. https://doi.org/10.1007/978-1-4471-2143-5_3

  • DOI: https://doi.org/10.1007/978-1-4471-2143-5_3

  • Publisher Name: Springer, London

  • Print ISBN: 978-1-4471-2142-8

  • Online ISBN: 978-1-4471-2143-5

  • eBook Packages: Engineering, Engineering (R0)
