
Cyber Counterdeception: How to Detect Denial & Deception (D&D)

Cyber Warfare

Part of the book series: Advances in Information Security ((ADIS,volume 56))

Abstract

In this chapter we explore cyber-counterdeception (cyber-CD): what it is, how it works, and how to incorporate it into cyber defenses. We review existing theories and techniques of counterdeception and relate counterdeception to the concepts of cyber attack kill chains and intrusion campaigns. We adapt theories and techniques of counterdeception to the concepts of cyber defenders’ deception chains and deception campaigns. We describe the utility of conducting cyber wargames and exercises to develop the techniques of cyber-denial & deception (cyber-D&D) and cyber-CD. Our goal is to suggest how cyber defenders can use cyber-CD, in conjunction with defensive cyber-D&D campaigns, to detect and counter cyber attackers.

Approved for Public Release; Distribution Unlimited. Case Number 14–2003. ©2014 The MITRE Corporation. ALL RIGHTS RESERVED. The views, opinions and/or findings contained in this report are those of The MITRE Corporation and should not be construed as an official government position, policy, or decision, unless designated by other documentation.


Notes

  1. Boush et al. (2009). Market research shows consumer resistance or susceptibility to persuasion can be driven by processes that operate entirely outside the conscious awareness of the consumer; e.g., Laran et al. (2011).

  2. McNair (1991).

  3. Whaley (2006) further wrote: “Counterdeception is … now standard jargon among specialists in military deception. This useful term was coined in 1968 by Dr. William R. Harris during a brainstorming session with me in Cambridge, Massachusetts.” Harris’s papers, while widely influencing other scholars of deception and counterdeception, are hard to come by. Epstein (1991) cites Harris (1968). Other relevant Harris counterdeception papers Epstein cited include Harris (1972) and Harris (1985).

  4. Boush et al. (2009), in Deception in the Marketplace: The Psychology of Deceptive Persuasion and Consumer Self Protection, advocate “deception protection” for consumers (Chap. 1) to help them “detect, neutralize and resist the varied types of deception” in the marketplace.

  5. Bodmer et al. (2012) noted Chinese cyber deception in cyber wargaming (p. 82): “reports of the People’s Liberation Army (PLA) advancing their cyber-deception capabilities through a coordinated computer network attack and electronic warfare integrated exercise.” We found no references explicitly to cyber exercises of cyber-counterdeception.

  6. Although Rowe used the term counterdeception, we believe he meant what we term here counter-deception; Rowe (2004). Rowe (2003) proposed a counterplanning approach to planning and managing what we term counter-deception operations. A recent description of counter-deception, “a multi-layer deception system that provides an in depth defense against … sophisticated targeted attacks,” is Wang et al. (2013).

  7. For a general analysis of denial techniques in cyber-counter-deception (cyber-C-D), see Yuill et al. (2006).

  8. STIX and the STIX logo are trademarks of The MITRE Corporation. The STIX license states: The MITRE Corporation (MITRE) hereby grants you a non-exclusive, royalty-free license to use Structured Threat Information Expression (STIX™) for research, development, and commercial purposes. Any copy you make for such purposes is authorized provided you reproduce MITRE’s copyright designation and this license in any such copy (see http://stix.mitre.org/).

  9. TAXII and the TAXII logo are trademarks of The MITRE Corporation. The TAXII license states: The MITRE Corporation (MITRE) hereby grants you a non-exclusive, royalty-free license to use Trusted Automated eXchange of Indicator Information (TAXII™) for research, development, and commercial purposes. Any copy you make for such purposes is authorized provided you reproduce MITRE’s copyright designation and this license in any such copy (see http://taxii.mitre.org/).

  10. Other than a few references to detecting deception in social engineering situations, we found no research on cyber-counterdeception, per se, in general searching of the scholarly literature.

  11. Some (e.g., Bennett and Waltz 2007) would credit “incongruity analysis” to R. V. Jones, and his theory of spoofing and counter-spoofing. See Jones (2009), pp. 285–291: “the perception of incongruity—which my ponderings have led me to believe is the basic requirement for a sense of humour—[concluding] … the object of a practical joke [is] the creation of an incongruity.”

  12. For example, Heuer (1981). Whether or not deception is detected, assessing hypotheses regarding the adversary’s possible courses of action against the evidence provides useful insights into adversary intentions. Heuser (1996) wrote: “The [counterdeception] cell would be tasked to … [look] at the data from the enemy’s point of view. They would need to place themselves in the mind of the enemy, determine how they would develop a deception plan and see if evidence supports it…. The enemy may not be employing a deception plan, but the process will aid in exploring different enemy courses of action that may have been overlooked.” Bruce and Bennett (2008) wrote: “the failure to generate hypotheses increases vulnerability to deception… One key to Why Bad Things Happen to Good Analysts has been conflicting organizational signals regarding promotion of overconfidence (‘making the call’) versus promotion of more rigorous consideration of alternative hypotheses and the quality of information;” in George and Bruce (2008).

  13. See, for example, Lachow (2011), Sanger (2012), Langner (2013), and Lindsay (2013).

  14. Although originally referred to as the “Intrusion Kill Chain” by the authors of the related seminal paper, the concept is now more generally referred to as “Cyber Kill Chain.” See http://www.lockheedmartin.com/us/what-we-do/information-technology/cyber-security/cyber-kill-chain.html.

  15. Croom (2010). The author, Lieutenant General Charles Croom (Ret.), is Vice President of Lockheed Martin Information Systems and Global Solutions.

  16. Gilovich et al. (2002) and Dawes (2001).

  17. Heuer (1981) and Elsässer and Stech (2007).

  18. See Fischhoff (1982).

  19. See Stech and Elsässer (2007).

  20. “2nd Bureau of the People’s Liberation Army (PLA) General Staff Department’s (GSD) 3rd Department, which is most commonly known by its Military Unit Cover Designator (MUCD) as Unit 61398.” Unit 61398 functions as “the Third Department’s premier entity targeting the United States and Canada, most likely focusing on political, economic, and military-related intelligence,” Stokes et al. (2011).

  21. There is nothing new in proposing deception versus counterdeception in RED versus BLUE wargames or applying these wargames to the cyber domain; see, for example, Feer (1989) and Cohen et al. (2001).

  22. Some recent wargame examples are described in Alberts et al. (2010). Joint Chiefs of Staff (2006) recommends wargaming plans and courses of action for information operations. Wargaming by DOD may be less frequent than suggested by doctrine and history, e.g., regarding the 2008 Russian invasion of Georgia a U.S. Army War College analysis concluded “U.S. intelligence-gathering and analysis regarding the Russian threat to Georgia failed. … No scenarios of a Russian invasion were envisaged, wargamed, or seriously exercised;” p. 72, Cohen and Hamilton (2011).

  23. Wheaton (2011) recommends a game-based approach to teaching strategic intelligence analysis to increase learning, improve student performance, and increase student satisfaction.

  24. For example, Pérez (2004) wrote: “The risks vs. benefits of committing [military rescue] forces must be weighed and war-gamed carefully between all civilian/military leaders prior to committing the military arm [to a hostage rescue mission].”

  25. Even when wargames are designed to explore future concepts and capabilities, they must provide a realistic basis for these innovations and a realistic environment for the exploration. See, for example, Rosen (1991); Murray and Millett (1996, 2000); Scales (2000); Knox and Murray (2001); and Fisher (2005).

  26. On the other hand, counterdeception analysis can be performed synchronously in live exercises or asynchronously on “cold cases.” That is, an asynchronous counterdeception reconstruction of D&D operations from forensic materials, historical records, and other materials can be readily performed. Such asynchronous analyses are common in counter-fraud, cheating detection, art forgery analysis, malingering, etc. For example, see Stech and Elsässer (2007).

  27. Bennett and Waltz (2007) cite the National Intelligence Council’s (2000, p. 9) conclusion regarding the future use of deception: “most adversaries will recognize the information advantage and military superiority of the United States in 2015. Rather than acquiesce to any potential US military domination, they will try to circumvent or minimize US strengths and exploit perceived weaknesses [though] greater access … to sophisticated deception-and-denial techniques…”

  28. See, for example, Bodmer et al. (2012): “There is a very well-developed legal framework to deal with intruders, and as one of the ‘good guys’ your responses are bound by that framework. You can expect to work closely with legal counsel to ensure that your operation is legal… But this does not mean that there are no ethical consequences to your actions. Even if your specific [cyber-D&D] operation does not push ethical boundaries, taken collectively, the actions of you and your colleagues just may.”

  29. What Whaley (2007e) found (as might be expected) is subtle: Some cultures are clearly more deceptive than others, but only during given slices of time. No single culture has excelled in deceptiveness throughout its history; while the Chinese since the 1940s have shown high levels of military-political deceptiveness, this is not true throughout Chinese history. In a given culture and time, levels of deceptiveness can be quite different across the major disciplines of military, domestic politics, foreign diplomacy, and commercial business. Sometimes ends justify means; practical considerations of greed and survival sometimes override religious, moral, or ethical objections to deception. High, medium, and low levels of deceptive behavior were found in every culture at different times, regardless of its level of technology. We found no comparisons of counterdeception capabilities across cultures comparable to Whaley’s analysis of cultural deceptiveness.

  30. Not all observers agree with Jones’s and Whaley’s concept of a “counterdeception analyst’s advantage” over deceivers, and some tend to see the deception-counterdeception contest in terms of (to use Handel’s (2005) characterization) “the persistent difficulty involved in attempting to expose deception,” i.e., more along the lines of Fig. 6.5, above (Matrix of Attacker’s Denial & Deception Moves versus Defender’s Counterdenial and Counterdeception Moves). For example, Scot Macdonald (2007) sees the deceivers as generally holding the advantages over deception detectives as new tools, technologies, channels, and environments become available.

  31. An ad hoc search on “growth of the global cyber security industry” yielded a 2014 estimate of about $77 billion and a 2019 estimate of about $156 billion, i.e., more than doubling in 5 years, or roughly seven times faster growth than estimates of the growth of the global economy over the 2014–2019 time frame. See http://www.asdnews.com/news-53610/Global_Cyber_Security_Market_to_be_Worth_$76.68bn_in_2014.htm; http://www.marketsandmarkets.com/PressReleases/cyber-security.asp; and http://www.conference-board.org/pdf_free/GEO2014_Methodology.pdf.
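As a rough sanity check of the figures cited in this note, the $77 billion and $156 billion estimates imply a growth multiple of about 2x over five years (about 15% per year compounded). The comparison to global-economy growth can then be sketched as below; the ~3%/yr global growth rate used here is an assumed placeholder, not a figure from the chapter:

```python
# Back-of-the-envelope check of the cyber security market growth cited above.
CYBER_2014 = 77e9    # ~$77 billion (2014 estimate cited in the note)
CYBER_2019 = 156e9   # ~$156 billion (2019 estimate cited in the note)
YEARS = 5

growth_multiple = CYBER_2019 / CYBER_2014      # ~2.03x: "more than doubling"
cagr = growth_multiple ** (1 / YEARS) - 1      # ~15.2% compound annual growth

# Cumulative growth versus an ASSUMED ~3%/yr global economy (placeholder rate)
cyber_cumulative = growth_multiple - 1         # ~103% total growth
global_cumulative = 1.03 ** YEARS - 1          # ~16% total growth
ratio = cyber_cumulative / global_cumulative   # roughly six to seven times faster

print(f"multiple={growth_multiple:.2f}x, CAGR={cagr:.1%}, ratio={ratio:.1f}x")
```

Under that assumed global growth rate, the cumulative-growth comparison lands in the "roughly seven times faster" range stated in the note.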

  32. See, for example, Caverni and Gonzalez (1990) and Yetiv (2013).

  33. One 2009 report suggested the Chinese will employ integrated network electronic warfare which includes “using techniques such as electronic jamming, electronic deception and suppression to disrupt information acquisition and information transfer, launching a virus attack or hacking to sabotage information processing and information utilization, and using anti-radiation and other weapons based on new mechanisms to destroy enemy information platforms and information facilities.” Krekel (2009).

References

  • Alberts, David S., Reiner K. Huber, & James Moffat (2010) NATO NEC C2 Maturity Model. Washington, DC: DoD Command and Control Research Program.

  • Anderson, Robert H. et al. (1999) Securing the U.S. Defense Information Infrastructure: A Proposed Approach. Santa Monica, CA: RAND.

  • Arquilla, John & Douglas A. Borer, eds. (2007) Information Strategy and Warfare: A Guide to Theory and Practice. New York: Routledge.

  • Bennett, M., & E. Waltz (2007) Counterdeception Principles and Applications for National Security. Norwood, MA: Artech House.

  • Bloom, Richard (2013) Foundations of Psychological Profiling: Terrorism, Espionage, and Deception. Boca Raton, FL: Taylor & Francis Group.

  • Bodmer, Sean, Max Kilger, Gregory Carpenter, & Jade Jones (2012) Reverse Deception: Organized Cyber Threat Counter-Exploitation. New York: McGraw-Hill.

  • Boush, David M., Marian Friestad, & Peter Wright (2009) Deception in the Marketplace: The Psychology of Deceptive Persuasion and Consumer Self Protection. New York: Routledge Taylor & Francis.

  • Bruce, James B. & Michael Bennett (2008) “Foreign Denial and Deception: Analytical Imperatives,” in George, Roger Z. & James B. Bruce, eds. (2008) Analyzing Intelligence: Origins, Obstacles, and Innovations. Washington, DC: Georgetown University Press.

  • Calbert, Gregory (2007) “Learning to Strategize,” in Kott, Alexander & William M. McEneaney, eds. (2007) Adversarial Reasoning: Computational Approaches to Reading the Opponent’s Mind. Boca Raton, FL: Chapman & Hall/CRC.

  • Caverni, Jean-Paul, Jean-Marc Fabre, & Michel Gonzalez, eds. (1990) Cognitive Biases. New York: Elsevier.

  • Cohen, Ariel & Robert E. Hamilton (2011) The Russian Military and the Georgia War: Lessons and Implications. ERAP Monograph, June 2011. Carlisle Barracks, PA: Strategic Studies Institute, U.S. Army War College.

  • Cohen, Fred, Irwin Marin, Jeanne Sappington, Corbin Stewart, & Eric Thomas (2001) Red Teaming Experiments with Deception Technologies. Fred Cohen & Associates, November 12, 2001. http://all.net/journal/deception/experiments/experiments.html.

  • Croom, Charles (2010) “The Defender's ‘Kill Chain’,” Military Information Technology, v. 14, n. 10, 2010. http://www.kmimediagroup.com/files/MIT_14-10_final.pdf.

  • Dacier, Marc, Corrado Leita, Olivier Thonnard, Van-Hau Pham, & Engin Kirda (2010) “Assessing Cybercrime Through the Eyes of the WOMBAT,” in Jajodia, Sushil, Peng Liu, Vipin Swarup, & Cliff Wang, eds. (2010) Cyber Situational Awareness: Issues and Research. New York: Springer.

  • Dawes, R. M. (2001) Everyday Irrationality: How Pseudo Scientists, Lunatics, and the Rest of Us Systematically Fail to Think Rationally. Boulder, CO: Westview Press.

  • Defense Science Board (2012) Task Force Report: Resilient Military Systems and the Advanced Cyber Threat. Washington, DC: Department of Defense.

  • Elsässer, Christopher & Frank J. Stech (2007) “Detecting Deception,” in Kott, Alexander & William M. McEneaney, eds. (2007) Adversarial Reasoning: Computational Approaches to Reading the Opponent’s Mind. Boca Raton, FL: Taylor & Francis Group.

  • Epstein, Edward Jay (1991) Deception: The Invisible War Between the KGB and the CIA. New York: Random House.

  • Feer, Fred S. (1989) Thinking-Red-in-Wargaming Workshop: Opportunities for Deception and Counterdeception in the Red Planning Process. Santa Monica, CA: RAND, May 1989.

  • Fischhoff, B. (1982) “Debiasing,” in Kahneman, D., P. Slovic, & A. Tversky, eds. (1982) Judgment under Uncertainty: Heuristics and Biases. Cambridge, UK: Cambridge University Press, pp. 422–444.

  • Fisher, David E. (2005) A Summer Bright and Terrible: Winston Churchill, Lord Dowding, Radar, and the Impossible Triumph of the Battle of Britain. Berkeley, CA: Shoemaker & Hoard.

  • Fredericks, Brian (1997) “Information Warfare: The Organizational Dimension,” in Robert E. Neilson, ed. (1997) Sun Tzu and Information Warfare. Washington, DC: National Defense University Press.

  • Gerwehr, Scott & Russell W. Glenn (2002) Unweaving the Web: Deception and Adaptation in Future Urban Operations. Santa Monica, CA: RAND.

  • Gilovich, T., D. Griffin, & D. Kahneman (2002) Heuristics and Biases. Cambridge, UK: Cambridge University Press.

  • Gowlett, Phillip (2011) Moving Forward with Computational Red Teaming. DSTO-GD-0630, March 2011. Joint Operations Division, Defence Science and Technology Organisation, Canberra, Australia.

  • Handel, Michael (2005) Masters of War: Classical Strategic Thought. London: Frank Cass–Taylor & Francis.

  • Harris, W. R. (1968) “Intelligence and National Security: A Bibliography with Selected Annotations.” Cambridge, MA: Center for International Affairs, Harvard University.

  • Harris, W. R. (1972) “Counter-deception Planning.” Cambridge, MA: Harvard University.

  • Harris, W. R. (1985) “Soviet Maskirovka and Arms Control Verification,” mimeo. Monterey, CA: U.S. Navy Postgraduate School, September 1985.

  • Heckman, K. E., M. J. Walsh, F. J. Stech, T. A. O’Boyle, S. R. Dicato, & A. F. Herber (2013) “Active Cyber Defense with Denial and Deception: A Cyber-wargame Experiment,” Computers and Security, v. 37, pp. 72–77. doi: 10.1016/j.cose.2013.03.015.

  • Heuer, Jr., Richards J. (1981) “Strategic Deception and Counterdeception: A Cognitive Process Approach,” International Studies Quarterly, v. 25, n. 2, June 1981, pp. 294–327.

  • Heuer, Jr., Richards J. (1999) “Chapter 8: Analysis of Competing Hypotheses,” Psychology of Intelligence Analysis. Washington, DC: Central Intelligence Agency. https://www.cia.gov/library/center-for-the-study-of-intelligence/csi-publications/books-and-monographs/psychology-of-intelligence-analysis/.

  • Heuser, Stephen J. (1996) Operational Deception and Counter Deception. Newport, RI: Naval War College, 14 June 1996.

  • Hobbs, C. L. (2010) Methods for Improving IAEA Information Analysis by Reducing Cognitive Biases. IAEA Paper Number IAEA-CN-184/276. http://www.iaea.org/safeguards/Symposium/2010/Documents/PapersRepository/276.pdf.

  • Hutchins, Eric M., Michael J. Cloppert, & Rohan M. Amin (2011) “Intelligence-Driven Computer Network Defense Informed by Analysis of Adversary Campaigns and Intrusion Kill Chains,” 6th Annual International Conference on Information Warfare and Security, Washington, DC, 2011. http://www.lockheedmartin.com/content/dam/lockheed/data/corporate/documents/LM-White-Paper-Intel-Driven-Defense.pdf.

  • Jajodia, Sushil, Peng Liu, Vipin Swarup, & Cliff Wang, eds. (2010) Cyber Situational Awareness: Issues and Research. New York: Springer.

  • Johnson, Paul E., S. Grazioli, K. Jamal, & R. G. Berryman (2001) “Detecting Deception: Adversarial Problem Solving in a Low Base-rate World,” Cognitive Science, v. 25, n. 3, May–June 2001.

  • Joint Chiefs of Staff (2006) Joint Publication 3-13, Information Operations. Washington, DC: Department of Defense.

  • Jones, R. V. (1995) “Enduring Principles: Some Lessons in Intelligence,” CIA Studies in Intelligence, v. 38, n. 5, pp. 37–42.

  • Jones, R. V. (2009) Most Secret War. London: Penguin.

  • Knox, MacGregor & Williamson Murray (2001) The Dynamics of Military Revolution, 1300–2050. Cambridge, UK: Cambridge University Press.

  • Kott, Alexander & Gary Citrenbaum, eds. (2010) Estimating Impact: A Handbook of Computational Methods and Models for Anticipating Economic, Social, Political and Security Effects in International Interventions. New York: Springer.

  • Krekel, Bryan (2009) Capability of the People’s Republic of China to Conduct Cyber Warfare and Computer Network Exploitation. McLean, VA: Northrop Grumman Corporation.

  • Lachow, Irving (2011) “The Stuxnet Enigma: Implications for the Future of Cybersecurity,” Georgetown Journal of International Affairs, 118 (2010–2011). http://heinonline.org/HOL/Page?handle=hein.journals/geojaf11&div=52&g_sent=1&collection=journals#442.

  • Langner, Ralph (2013) To Kill a Centrifuge: A Technical Analysis of What Stuxnet’s Creators Tried to Achieve. Hamburg: The Langner Group, November 2013. http://www.langner.com/en/wp-content/uploads/2013/11/To-kill-a-centrifuge.pdf.

  • Laran, Juliano, Amy N. Dalton, & Eduardo B. Andrade (2011) “The Curious Case of Behavioral Backlash: Why Brands Produce Priming Effects and Slogans Produce Reverse Priming Effects,” Journal of Consumer Research, v. 37, April 2011.

  • Lindsay, Jon R. (2013) Stuxnet and the Limits of Cyber Warfare. University of California: Institute on Global Conflict and Cooperation, January 2013. http://www.scribd.com/doc/159991102/Stuxnet-and-the-Limits-of-Cyber-Warfare (a version published in Security Studies, v. 22, n. 3, 2013. https://78462f86-a-6168c89f-s-sites.googlegroups.com/a/jonrlindsay.com/www/research/papers/StuxnetJRLSS.pdf).

  • Macdonald, Scot (2007) Propaganda and Information Warfare in the Twenty-first Century: Altered Images and Deception Operations. New York: Routledge.

  • Mandiant (2010) M-Trends: The Advanced Persistent Threat. https://www.mandiant.com/resources/mandiant-reports/.

  • Mandiant (2011) M-Trends 2011. http://www.mandiant.com/resources/m-trends/.

  • Mandiant (2013) APT1: Exposing One of China’s Cyber Espionage Units. http://intelreport.mandiant.com/Mandiant_APT1_Report.pdf and Appendices.

  • McNair, Philip A. (1991) Counterdeception and the Operational Commander. Newport, RI: Naval War College.

  • McPherson, Denver E. (2010) Deception Recognition: Rethinking the Operational Commander’s Approach. Newport, RI: Joint Military Operations Department, Naval War College.

  • Murray, Williamson & Allan R. Millett (1996) Military Innovation in the Interwar Period. Cambridge, UK: Cambridge University Press.

  • Murray, Williamson & Allan R. Millett (2000) A War To Be Won: Fighting the Second World War. Cambridge, MA: Harvard University Press.

  • National Intelligence Council (2000) Global Trends 2015: A Dialogue About the Future with Nongovernment Experts. Washington, DC: National Intelligence Council, NIC 2000-02, December 2000.

  • Pérez, Carlos M. (2004) Anatomy of a Rescue: What Makes Hostage Rescue Operations Successful? Thesis, Naval Postgraduate School, Monterey, CA, September 2004.

  • Rosen, Stephen Peter (1991) Winning the Next War: Innovation and the Modern Military. Ithaca, NY: Cornell University Press.

  • Rowe, N. C. (2003) “Counterplanning Deceptions To Foil Cyber-Attack Plans,” Proceedings of the 2003 IEEE Workshop on Information Assurance, West Point, NY: United States Military Academy, June 2003.

  • Rowe, N. C. (2004) “A Model of Deception During Cyber-Attacks on Information Systems,” 2004 IEEE First Symposium on Multi-Agent Security and Survivability, 30–31 August 2004, pp. 21–30.

  • Rowe, N. C. (2006) “A Taxonomy of Deception in Cyberspace,” International Conference on Information Warfare and Security, Princess Anne, MD.

  • Sanger, David E. (2012) Confront and Conceal: Obama's Secret Wars and Surprising Use of American Power. New York: Crown.

  • Scales, Jr., Robert H. (2000) Future Warfare: Anthology, Revised Edition. Carlisle Barracks, PA: U.S. Army War College.

  • Stech, F. & C. Elsässer (2007) “Midway Revisited: Detecting Deception by Analysis of Competing Hypothesis,” Military Operations Research, v. 12, n. 1, pp. 35–55.

  • Stokes, Mark A., Jenny Lin, & L. C. Russell Hsiao (2011) “The Chinese People’s Liberation Army Signals Intelligence and Cyber Reconnaissance Infrastructure,” Project 2049 Institute, 2011: 8. http://project2049.net/documents/pla_third_department_sigint_cyber_stokes_lin_hsiao.pdf.

  • The Economist (2014) “Banks and Fraud: Hacking Back–Bankers Go Undercover to Catch Bad Guys,” The Economist, April 5th 2014. http://www.economist.com/news/finance-and-economics/21600148-bankers-go-undercover-catch-bad-guys-hacking-back.

  • Wang, Wei, Jeffrey Bickford, Ilona Murynets, Ramesh Subbaraman, Andrea G. Forte, & Gokul Singaraju (2013) “Detecting Targeted Attacks by Multilayer Deception,” Journal of Cyber Security and Mobility, v. 2, pp. 175–199. http://riverpublishers.com/journal/journal_articles/RP_Journal_2245-1439_224.pdf.

  • Whaley, Barton (2006) Detecting Deception: A Bibliography of Counterdeception Across Time, Cultures, and Disciplines, 2nd Ed. Washington, DC: Foreign Denial & Deception Committee, March 2006.

  • Whaley, B. (2007a) The Encyclopedic Dictionary of Magic 1584–2007. Lybrary.com.

  • Whaley, B. (2007b) Stratagem: Deception and Surprise in War. Norwood, MA: Artech House.

  • Whaley, B. (2007c) “Toward a General Theory of Deception,” in J. Gooch & A. Perlmutter, eds. Military Deception and Strategic Surprise. New York: Routledge.

  • Whaley, B. (2007d) Textbook of Political-Military Counterdeception: Basic Principles & Methods. Washington, DC: Foreign Denial & Deception Committee, August 2007.

  • Whaley, B. (2007e) The Prevalence of Guile: Deception Through Time and Across Cultures. Washington, DC: Foreign Denial & Deception Committee, August 2007.

  • Whaley, B. (2007f) “The One Percent Solution: Costs and Benefits of Military Deception,” in Arquilla, John & Douglas A. Borer, eds. (2007) Information Strategy and Warfare: A Guide to Theory and Practice. New York: Routledge.

  • Whaley, B. (2012) The Beginner’s Guide to Detecting Deception: Essay Series #1. Washington, DC: Foreign Denial & Deception Committee, Office of the Director of National Intelligence. Unpublished manuscript.

  • Whaley, Barton & Jeff Busby (2002) “Detecting Deception: Practice, Practitioners, and Theory,” in Godson, R. & J. Wirtz, eds. (2002) Strategic Denial and Deception: The Twenty-First Century Challenge. New Brunswick, NJ: Transaction Publishers.

  • Wheaton, Kristan J. (2011) “Teaching Strategic Intelligence Through Games,” International Journal of Intelligence and CounterIntelligence, v. 24, n. 2, pp. 367–382.

  • Wick, Adam (2012) “Deceiving the Deceivers: Active Counterdeception for Software Protection,” DOD SBIR Award O113-IA2-1059, Contract FA8650-12-M-1396. http://www.sbir.gov/sbirsearch/detail/393779.

  • Woolley, A. W. (2009) “Which Side Are You On? How Offensive and Defensive Strategic Orientation Impact Task Focus and Information Search in Teams,” Working Paper 548, May 2009, Carnegie Mellon University, Tepper School of Business, Pittsburgh, PA. http://repository.cmu.edu/tepper/548.

  • Woolley, A. W. (2010) “Is it Really Easier to be the Bad Guys? The Effects of Strategic Orientation on Team Process in Competitive Environments,” Working Paper, June 2010, Carnegie Mellon University, Tepper School of Business, Pittsburgh, PA. https://student-3k.tepper.cmu.edu/gsiadoc/wp/2009-E26.pdf.

  • Woolley, A. W. (2011) “Playing Offense vs. Defense: The Effects of Team Strategic Orientation on Team Process in Competitive Environments,” Organization Science, v. 22, n. 6, Nov–Dec 2011, pp. 1384–1398.

  • Yetiv, S. (2013) National Security Through a Cockeyed Lens: How Cognitive Bias Impacts U.S. Foreign Policy. Baltimore, MD: Johns Hopkins University Press.

  • Yuill, Jim, Dorothy Denning, & Fred Feer (2006) “Using Deception to Hide Things from Hackers: Processes, Principles, and Techniques,” Journal of Information Warfare, v. 5, n. 3, pp. 26–40.


Author information


Corresponding author

Correspondence to Kristin E. Heckman.


Copyright information

© 2015 Springer International Publishing Switzerland

About this chapter

Cite this chapter

Heckman, K., Stech, F. (2015). Cyber Counterdeception: How to Detect Denial & Deception (D&D). In: Jajodia, S., Shakarian, P., Subrahmanian, V., Swarup, V., Wang, C. (eds) Cyber Warfare. Advances in Information Security, vol 56. Springer, Cham. https://doi.org/10.1007/978-3-319-14039-1_6

  • DOI: https://doi.org/10.1007/978-3-319-14039-1_6

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-14038-4

  • Online ISBN: 978-3-319-14039-1

  • eBook Packages: Computer Science; Computer Science (R0)
