Abstract
Over the years, intrusion detection has matured into a field replete with anomaly detectors of various types, tasked with detecting computer-based attacks, insider threats, worms and more. Their abundance naturally prompts the question: is anomaly detection improving in efficacy and reliability? Current evaluation strategies may provide answers, but they suffer from two shortcomings: they produce results that are valid only within the evaluation data set, and they provide little diagnostic information for tuning detector performance in a principled manner.
This paper studies the problem of acquiring reliable performance results for an anomaly detector. Aspects of a data environment that will affect detector performance, such as the frequency distribution of data elements, are identified, characterized and used to construct a synthetic data environment in which to assess a frequency-based anomaly detector. In a series of experiments that systematically maps out the detector's performance, areas of detection weakness are exposed, and strengths are identified. Finally, the extensibility of the lessons learned in the synthetic environment is examined using real-world data.
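The abstract does not give the detector's internals, but the core idea of a frequency-based anomaly detector can be sketched generically: estimate the relative frequency of each event type from training data, then score test events by their rarity, so that rare or previously unseen events score as most anomalous. The event names and the `floor` probability for unseen events below are illustrative assumptions, not details from the paper.

```python
from collections import Counter

def train_frequencies(events):
    """Estimate the relative frequency of each event type from training data."""
    counts = Counter(events)
    total = len(events)
    return {e: c / total for e, c in counts.items()}

def anomaly_scores(freqs, events, floor=1e-6):
    """Score each event by rarity: rare or unseen events score close to 1.0."""
    return [1.0 - freqs.get(e, floor) for e in events]

# Hypothetical training stream of system-call-like events.
train = ["open", "read", "read", "write", "close"] * 20
freqs = train_frequencies(train)

# "read" is common in training; "mmap" was never observed, so it scores higher.
scores = anomaly_scores(freqs, ["read", "mmap"])
```

A detector of this kind flags an event (or window of events) whose score exceeds a threshold; sweeping that threshold is what produces the hit/false-alarm trade-off that evaluation strategies such as ROC analysis measure.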
Copyright information
© 2006 Springer-Verlag Berlin Heidelberg
About this paper
Cite this paper
Hansen, J.P., Tan, K.M.C., Maxion, R.A. (2006). Anomaly Detector Performance Evaluation Using a Parameterized Environment. In: Zamboni, D., Kruegel, C. (eds) Recent Advances in Intrusion Detection. RAID 2006. Lecture Notes in Computer Science, vol 4219. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11856214_6
DOI: https://doi.org/10.1007/11856214_6
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-39723-6
Online ISBN: 978-3-540-39725-0