
Part of the book series: Lecture Notes in Computer Science (LNTCS, volume 11760)


Abstract

We propose a categorical model for information flows of correlated secrets in programs. We show how programs act as transformers of such correlations, and that they can be seen as natural transformations between probabilistic constructors. We also study some basic properties of the construction.


Notes

  1. Recall the Ariane disaster, which occurred when the software was executed within an environment for which it was not originally designed.

  2. We do not assume that \(\mathcal{X}\) and \(\mathcal{Y}\) are finite. We do, however, assume that they are discrete and countable.

  3. Here \(f.x\) denotes the application of a function \(f\) to the argument \(x\).

  4. In \({\mathbb D}(\mathcal{X}\times \mathcal{Z})\), the Kantorovich metric is equivalent to the standard Euclidean distance.

  5. Rigorously, the sum on the right-hand side is an integral when \(\varDelta \) is not a discrete distribution. In that case, the left-hand side should be applied to a measurable subset rather than to a singleton.

References

  1. Adámek, J., Herrlich, H., Strecker, G.: Abstract and Concrete Categories. Wiley, New York (1990)


  2. Alvim, M., Andrés, M., Palamidessi, C.: Probabilistic information flow. In: Proceedings of the 25th IEEE Symposium on Logic in Computer Science, pp. 314–321 (2010)


  3. Alvim, M., Chatzikokolakis, K., McIver, A., Morgan, C., Palamidessi, C., Smith, G.: Additive and multiplicative notions of leakage, and their capacities. In: Proceedings of the IEEE 27th Computer Security Foundations Symposium, pp. 308–322 (2014)


  4. Alvim, M., Chatzikokolakis, K., Palamidessi, C., Smith, G.: Measuring information leakage using generalized gain functions. In: Proceedings of the 25th IEEE Computer Security Foundations Symposium, pp. 265–279, June 2012


  5. Braun, C., Chatzikokolakis, K., Palamidessi, C.: Quantitative notions of leakage for one-try attacks. In: Proceedings of the 25th International Conference on Mathematical Foundations of Programming Semantics, pp. 75–91 (2009)


  6. Giry, M.: A categorical approach to probability theory. In: Banaschewski, B. (ed.) Categorical Aspects of Topology and Analysis. LNM, vol. 915, pp. 68–85. Springer, Heidelberg (1982). https://doi.org/10.1007/BFb0092872


  7. Karlof, C., Wagner, D.: Hidden Markov model cryptanalysis. In: Walter, C.D., Koç, Ç.K., Paar, C. (eds.) CHES 2003. LNCS, vol. 2779, pp. 17–34. Springer, Heidelberg (2003). https://doi.org/10.1007/978-3-540-45238-6_3


  8. McIver, A., Meinicke, L., Morgan, C.: Compositional closure for Bayes risk in probabilistic noninterference. In: Abramsky, S., Gavoille, C., Kirchner, C., Meyer auf der Heide, F., Spirakis, P.G. (eds.) ICALP 2010. LNCS, vol. 6199, pp. 223–235. Springer, Heidelberg (2010). https://doi.org/10.1007/978-3-642-14162-1_19


  9. McIver, A., Morgan, C., Rabehaja, T.: Abstract hidden Markov models: a monadic account of quantitative information flow. In: Proceedings of the 30th Annual ACM/IEEE Symposium on Logic in Computer Science, pp. 597–608 (2015)


  10. McIver, A.K., Morgan, C.C., Rabehaja, T.: Algebra for quantitative information flow. In: Höfner, P., Pous, D., Struth, G. (eds.) RAMICS 2017. LNCS, vol. 10226, pp. 3–23. Springer, Cham (2017). https://doi.org/10.1007/978-3-319-57418-9_1


  11. Bordenabe, N., McIver, A., Morgan, C., Rabehaja, T.: Reasoning about distributed secrets. In: Bouajjani, A., Silva, A. (eds.) FORTE 2017. LNCS, vol. 10321, pp. 156–170. Springer, Cham (2017). https://doi.org/10.1007/978-3-319-60225-7_11


  12. McIver, A., Morgan, C., Smith, G., Espinoza, B., Meinicke, L.: Abstract channels and their robust information-leakage ordering. In: Abadi, M., Kremer, S. (eds.) POST 2014. LNCS, vol. 8414, pp. 83–102. Springer, Heidelberg (2014). https://doi.org/10.1007/978-3-642-54792-8_5


  13. Parthasarathy, K.R.: Probability Measures on Metric Spaces. Academic Press, Cambridge (1967)


  14. Smith, G.: On the foundations of quantitative information flow. In: de Alfaro, L. (ed.) FoSSaCS 2009. LNCS, vol. 5504, pp. 288–302. Springer, Heidelberg (2009). https://doi.org/10.1007/978-3-642-00596-1_21


  15. Smith, G.: Quantifying information flow using min-entropy. In: Proceedings of the 8th International Conference on Quantitative Evaluation of SysTems, pp. 159–167 (2011)


  16. van Breugel, F.: The metric monad for probabilistic nondeterminism (2005). Draft available at http://www.cse.yorku.ca/~franck/research/drafts/monad.pdf


Author information

Correspondence to Tahiry Rabehaja.


A Compositional sequential composition

The sequential composition (8) assumes that the mutable secrets of H and K are represented by exactly the same variables ranging over \(\mathcal{X}\). A slightly more general version of that result assumes that H and K share a hidden variable ranging over \(\mathcal{X}\) but also have further hidden variables specific to each HMM. In this case, if H contains another hidden secret of type \(\mathcal{X}_1\), then this secret must be treated as a collateral secret in K: K leaks collateral information about it but does not update it. More precisely, we have the following lemma.

Lemma 4

Let \(H\) and \(K\) be two HMM matrices whose hidden variables range over \(\mathcal{X}\times \mathcal{X}_1\) and \(\mathcal{X}\times \mathcal{X}_2\) respectively. We have

$$\begin{aligned}{}[\![H{;}K]\!]^\mathcal{Z}= [\![H]\!]^{\mathcal{X}_2\times \mathcal{Z}}{;}[\![K]\!]^{\mathcal{X}_1\times \mathcal{Z}}. \end{aligned}$$
(10)

where the composition \(H{;}K\) is given by Eq. (2) and the composition operator (;) on the right takes the isomorphism between \(\mathcal{X}_1\times \mathcal{X}_2\) and \(\mathcal{X}_2\times \mathcal{X}_1\) into account.

Proof

Firstly, we have the following types

  • \([\![H{;}K]\!]^\mathcal{Z}{{:}\,}{\mathbb D}(\mathcal{X}\times \mathcal{X}_1\times \mathcal{X}_2\times \mathcal{Z}){\rightarrow }{\mathbb D}^2(\mathcal{X}\times \mathcal{X}_1\times \mathcal{X}_2\times \mathcal{Z})\)

  • \([\![H]\!]^{\mathcal{X}_2\times \mathcal{Z}}{{:}\,}{\mathbb D}(\mathcal{X}\times \mathcal{X}_1\times \mathcal{X}_2\times \mathcal{Z}){\rightarrow }{\mathbb D}^2(\mathcal{X}\times \mathcal{X}_1\times \mathcal{X}_2\times \mathcal{Z})\)

  • \([\![K]\!]^{\mathcal{X}_1\times \mathcal{Z}}{{:}\,}{\mathbb D}(\mathcal{X}\times \mathcal{X}_2\times \mathcal{X}_1\times \mathcal{Z}){\rightarrow }{\mathbb D}^2(\mathcal{X}\times \mathcal{X}_2\times \mathcal{X}_1\times \mathcal{Z})\)

Let us assume that we have applied the isomorphism that permutes the positions of \(\mathcal{X}_1\) and \(\mathcal{X}_2\) in the domain of \([\![K]\!]^{\mathcal{X}_1\times \mathcal{Z}}\), so that the right-hand side composition is well defined and is given by the sequential composition in the Giry monad [6]. This type-matching function is implicitly embedded in the right-hand side composition in (10).

Let \(\varPi {{:}\,}{\mathbb D}(\mathcal{X}\times \mathcal{X}_1\times \mathcal{X}_2\times \mathcal{Z})\), Eqs. (1) and (2) give

$$\begin{aligned} (\varPi {\rangle }(H;K))_{y_1y_2,x''x_1'x_2'z} = \sum _{xx_1x_2}\varPi _{xx_1x_2z}\sum _{x'}H_{xx_1y_1x'x_1'}K_{x'x_2y_2x''x_2'} \end{aligned}$$
(11)

On the other hand, \([\![H]\!]^{\mathcal{X}_2\times \mathcal{Z}}{;}[\![K]\!]^{\mathcal{X}_1\times \mathcal{Z}} = \mathsf{avg}\circ {\mathbb D}[\![K]\!]^{\mathcal{X}_1\times \mathcal{Z}}\circ [\![H]\!]^{\mathcal{X}_2\times \mathcal{Z}}\) uses the Kleisli lifting. Let us consider an inner \(\delta \) of \([\![H]\!]^{\mathcal{X}_2\times \mathcal{Z}}{;}[\![K]\!]^{\mathcal{X}_1\times \mathcal{Z}}.\varPi \). There exists an inner \(\alpha \) of \([\![H]\!]^{\mathcal{X}_2\times \mathcal{Z}}.\varPi \) such that \(\delta \) is an inner of \([\![K]\!]^{\mathcal{X}_1\times \mathcal{Z}}.\alpha \). That is, there exist two observations \(y_1\) and \(y_2\) such that

$$ \alpha _{x'x_1'x_2z}^{y_1} = \frac{\sum _{xx_1}\varPi _{xx_1x_2z}H_{xx_1y_1x'x'_1}}{\overline{\alpha }^{y_1}} $$

where \(\overline{\alpha }^{y_1}= \sum _{x'x_1'x_2z}\left( \sum _{xx_1}\varPi _{xx_1x_2z}H_{xx_1y_1x'x'_1}\right) \), and

$$ \delta _{x''x'_1x'_2z}^{y_1y_2} = \frac{\sum _{x'x_2}\alpha _{x'x'_1x_2z}^{y_1}K_{x'x_2y_2x''x_2'}}{\overline{\delta }^{y_1y_2}} $$

where

$$ \overline{\delta }^{y_1y_2} = \sum _{x''x_1'x_2'z}\left( \sum _{x'x_2}\alpha _{x'x'_1x_2z}^{y_1}K_{x'x_2y_2x''x_2'}\right) $$

By substituting \(\alpha \) into the expression of \(\delta \) and simplifying \(\overline{\alpha }^{y_1}\), we have

$$\begin{aligned} \delta _{x''x'_1x'_2z}^{y_1y_2}= & {} \frac{ \sum _{x'x_2}\left( \sum _{xx_1}\varPi _{xx_1x_2z}H_{xx_1y_1x'x'_1}\right) K_{x'x_2y_2x''x'_2} }{ \sum _{x''x_1'x_2'z}\left( \sum _{x'x_2}\left( \sum _{xx_1}\varPi _{xx_1x_2z}H_{xx_1y_1x'x'_1}\right) K_{x'x_2y_2x''x_2'}\right) }\\= & {} \frac{ \sum _{xx_1x_2}\varPi _{xx_1x_2z}\sum _{x'}H_{xx_1y_1x'x'_1}K_{x'x_2y_2x''x'_2} }{ \sum _{x''x_1'x_2'z}\left( \sum _{xx_1x_2}\varPi _{xx_1x_2z}\sum _{x'}H_{xx_1y_1x'x'_1}K_{x'x_2y_2x''x_2'}\right) }\\ \end{aligned}$$

The inner \(\delta ^{y_1y_2}\) corresponds exactly to a normalized \((y_1,y_2)\)-column of \(\varPi {\rangle }(H;K)\) as per Eq. (11).

Lemma 4 is a straightforward generalisation of our sequential composition for systems independent of any collateral type [8, 9], as well as of our compositional, but single-secret, semantics [10, 11]. That is, the closed sequential composition is obtained by setting \(\mathcal{X}_1 = \mathcal{X}_2 = \mathcal{Z}= \{*\}\), while the context-aware version is obtained by setting \(\mathcal{X}_1 = \mathcal{X}_2 = \{*\}\). Lemma 4 is slightly more general than the composition we define in [10] because it allows the declaration of new variables "on the go". For instance, if we set \(\mathcal{X}_1 = \{*\}\), then a new secret variable of type \(\mathcal{X}_2\) declared in K does not change in H. The lifted map \([\![H]\!]^{\mathcal{X}_2}\) is aware of this upcoming new secret and accounts for the correlation between the new variable and the current secret of type \(\mathcal{X}\).
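The identity in Lemma 4 can be checked numerically: Eq. (11) computes the joint distribution over observations and final states in one step, while the right-hand side of Eq. (10) runs H first and then K on the resulting (unnormalised) joint. The following sketch verifies that both sides agree on random data; all dimensions, variable names, and the use of NumPy tensors are illustrative assumptions, not notation fixed by the chapter.

```python
import numpy as np

# Illustrative sizes for X, X1, X2, Z and the observation spaces Y1, Y2.
rng = np.random.default_rng(0)
nX, nX1, nX2, nZ, nY1, nY2 = 2, 3, 2, 2, 2, 3

# Prior Pi in D(X x X1 x X2 x Z).
Pi = rng.random((nX, nX1, nX2, nZ))
Pi /= Pi.sum()

# H_{x x1 y1 x' x1'}: each row (x, x1) is a distribution over (y1, x', x1').
H = rng.random((nX, nX1, nY1, nX, nX1))
H /= H.sum(axis=(2, 3, 4), keepdims=True)
# K_{x' x2 y2 x'' x2'}: each row (x', x2) is a distribution over (y2, x'', x2').
K = rng.random((nX, nX2, nY2, nX, nX2))
K /= K.sum(axis=(2, 3, 4), keepdims=True)

# Left-hand side, Eq. (11): joint over (y1, y2) and final state (x'', x1', x2', z),
# computed directly from the composed matrix H;K.
lhs = np.einsum('ijkl,ijmno,nkpqr->mpqorl', Pi, H, K)

# Right-hand side of Eq. (10): run H first (lifted over X2 x Z), keeping the
# unnormalised joint alpha over y1 and (x', x1', x2, z), then run K.
alpha = np.einsum('ijkl,ijmno->mnokl', Pi, H)
rhs = np.einsum('mnokl,nkpqr->mpqorl', alpha, K)

assert np.allclose(lhs, rhs)
# Normalising a (y1, y2)-column of either side yields the inner delta^{y1 y2}.
```

The unnormalised formulation makes the equality exact: both sides are the same triple sum, grouped differently, so the normalisations by \(\overline{\alpha}^{y_1}\) and \(\overline{\delta}^{y_1y_2}\) in the proof cancel as claimed.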


Copyright information

© 2019 Springer Nature Switzerland AG

About this chapter


Cite this chapter

Rabehaja, T., McIver, A., Morgan, C., Struth, G. (2019). Categorical Information Flow. In: Alvim, M., Chatzikokolakis, K., Olarte, C., Valencia, F. (eds) The Art of Modelling Computational Systems: A Journey from Logic and Concurrency to Security and Privacy. Lecture Notes in Computer Science, vol 11760. Springer, Cham. https://doi.org/10.1007/978-3-030-31175-9_19

  • Print ISBN: 978-3-030-31174-2

  • Online ISBN: 978-3-030-31175-9