Berthold, Stefan
Publications (10 of 14)
Fischer-Hübner, S. & Berthold, S. (2017). Privacy Enhancing Technologies (3rd ed.). In: John Vacca (Ed.), Computer and Information Security Handbook (pp. 759-778). Morgan Kaufmann/Elsevier
Privacy Enhancing Technologies
2017 (English) In: Computer and Information Security Handbook / [ed] John Vacca, Morgan Kaufmann/Elsevier, 2017, 3, p. 759-778. Chapter in book (Refereed)
Abstract [en]

In our modern information age, recent technical developments and trends, such as mobile and pervasive computing, big data, cloud computing, and Web 2.0 applications, increasingly pose privacy dilemmas. Due to the low costs and technical advances of storage technologies, masses of personal data can easily be stored. Once disclosed, these data may be retained forever, often without the knowledge of the individuals concerned, and can be removed only with difficulty. Hence, it has become hard for individuals to manage and control their personal spheres. Both legal and technical means are needed to protect privacy and to (re-)establish the individuals' control. This chapter provides an overview of the area of Privacy-Enhancing Technologies (PETs), which help to protect privacy by technically enforcing legal privacy principles. It starts by defining the legal foundations of PETs and presents a classification of PETs, a definition of the traditional privacy properties that PETs address, and metrics for measuring the level of privacy that PETs provide. Finally, a selection of the most relevant PETs is presented.

Place, publisher, year, edition, pages
Morgan Kaufmann/Elsevier, 2017, Edition: 3
Keywords
Data minimization; Data subjects; Legal privacy; Legitimacy; Personal privacy; Privacy; Privacy-enhancing technologies; Purpose limitation; Purpose specification; Transparency
National Category
Computer Sciences
Research subject
Computer Science
Identifiers
urn:nbn:se:kau:diva-65135 (URN); 10.1016/B978-0-12-803843-7.00053-3 (DOI); 9780128039298 (ISBN); 9780128038437 (ISBN)
Available from: 2017-11-09. Created: 2017-11-09. Last updated: 2018-06-26. Bibliographically approved
Angulo, J., Berthold, S., Elkhiyaoui, K., Fernandez Gago, M. C., Fischer-Hübner, S., David, N., . . . Önen, M. (2015). D:D-5.3 User-Centric Transparency Tools V2.
D:D-5.3 User-Centric Transparency Tools V2
2015 (English) Report (Refereed)
Publisher
p. 57
National Category
Computer and Information Sciences
Research subject
Computer Science
Identifiers
urn:nbn:se:kau:diva-38919 (URN)
Funder
EU, FP7, Seventh Framework Programme, FP7-ICT-2011-8-317550-A4CLOUD
Available from: 2015-12-18. Created: 2015-12-18. Last updated: 2018-06-04. Bibliographically approved
Berthold, S. (2014). Inter-temporal Privacy Metrics. (Doctoral dissertation). Karlstad: Karlstad University Press
Inter-temporal Privacy Metrics
2014 (English) Doctoral thesis, comprehensive summary (Other academic)
Abstract [en]

Informational privacy of individuals has gained significantly in importance since information technology became widely deployed. Data, once digitalised, can be copied, distributed, and stored long-term at negligible cost. This has dramatic consequences for individuals, who leave traces in the form of personal data whenever they interact with information technology, for instance computers and phones, or even when information technology records the personal data of aware or unaware individuals. The right of individuals to informational privacy, in particular to control the flow and use of their personal data, is easily undermined by those controlling the information technology.

The objective of this thesis is to study the measurement of informational privacy, with a particular focus on scenarios where an individual discloses personal data to a second party, which uses this data for re-identifying the individual within a set of other individuals. We contribute privacy metrics for several instances of this scenario in the publications included in this thesis, most notably one that adds a time dimension to the scenario to model the effects of the time passed between data disclosure and usage. The result is a new framework for inter-temporal privacy metrics.
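The re-identification scenario sketched in this abstract is commonly quantified with entropy-based anonymity metrics. The following Python sketch illustrates only that general idea, not the thesis's specific inter-temporal framework; the function names are ours:

```python
import math

def shannon_entropy(probs):
    """Uncertainty (in bits) of an adversary's probability distribution
    over the candidate individuals behind a data disclosure; higher
    entropy means re-identification is harder."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def effective_anonymity_set_size(probs):
    """2^H: the size of a uniform population that would leave the
    adversary with the same uncertainty."""
    return 2 ** shannon_entropy(probs)

# A uniform distribution over four candidates yields 2 bits of
# uncertainty; a skewed distribution (e.g. after more data has been
# disclosed, or time has passed) shrinks the effective anonymity set.
uniform = [0.25, 0.25, 0.25, 0.25]
skewed = [0.7, 0.1, 0.1, 0.1]
```

The time dimension studied in the thesis enters through how the adversary's distribution sharpens between disclosure and use; the metric itself stays the same.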

Place, publisher, year, edition, pages
Karlstad: Karlstad University Press, 2014. p. 20
Series
Karlstad University Studies, ISSN 1403-8099 ; 2014:63
Keywords
privacy, unlinkability, metrics, uncertainty, valuation process, domain-specific language, anonymous communication
National Category
Computer Systems; Communication Systems; Probability Theory and Statistics
Research subject
Computer Science
Identifiers
urn:nbn:se:kau:diva-33972 (URN); 978-91-7063-603-5 (ISBN)
Public defence
2014-12-16, Karlstad University, 21A342 (Eva Erikssonsalen), Universitetsgatan 2, 651 87 Karlstad, 08:15 (English)
Available from: 2014-11-25. Created: 2014-10-03. Last updated: 2014-11-25. Bibliographically approved
Berthold, S., Fischer-Hübner, S., Martucci, L. & Pulls, T. (2013). Crime and Punishment in the Cloud: Accountability, Transparency, and Privacy. Paper presented at the DIMACS/BIC/A4Cloud/CSA International Workshop on Trustworthiness, Accountability and Forensics in the Cloud (TAFC).
Crime and Punishment in the Cloud: Accountability, Transparency, and Privacy
2013 (English) Conference paper, Published paper (Refereed)
Abstract [en]

The goal of this work is to reason about the complexity of the relationship between three non-functional requirements in cloud computing: privacy, accountability, and transparency. We provide insights into the complexity of this relationship from the perspectives of end-users, cloud service providers, and third parties, such as auditors. We shed light on the real and perceived conflicts between privacy, transparency, and accountability, using a formal definition of transparency and an analysis of how well a privacy-preserving transparency-enhancing tool may assist in achieving accountability. Furthermore, we highlight the importance of the privacy impact assessment process for the realisation of both transparency and accountability.

Keywords
cloud accountability, cloud service provider, cloud computing, non-functional requirement, third party, privacy impact assessment process, formal definition, privacy-preserving transparency-enhancing tool
National Category
Computer Sciences
Research subject
Computer Science
Identifiers
urn:nbn:se:kau:diva-27844 (URN)
Conference
DIMACS/BIC/A4Cloud/CSA International Workshop on Trustworthiness, Accountability and Forensics in the Cloud (TAFC)
Projects
A4Cloud
Available from: 2013-06-14. Created: 2013-06-14. Last updated: 2018-06-04. Bibliographically approved
Fischer-Hübner, S. & Berthold, S. (2013). Privacy-Enhancing Technologies (2nd ed.). In: Computer and Information Security Handbook (pp. 755-772). Elsevier
Privacy-Enhancing Technologies
2013 (English) In: Computer and Information Security Handbook, Elsevier, 2013, 2, p. 755-772. Chapter in book (Other academic)
Abstract [en]

In our modern information age, recent technical developments and trends, such as mobile and pervasive computing, cloud computing, and Web 2.0 applications, increasingly pose privacy dilemmas. Due to the low costs and technical advances of storage technologies, masses of personal data can easily be stored. Once disclosed, these data may be retained forever, often without the knowledge of the individuals concerned, and can be removed only with difficulty. Hence, it has become hard for individuals to manage and control their personal spheres. Both legal and technical means are needed to protect privacy and to (re)establish the individuals’ control. This chapter provides an overview of the area of privacy-enhancing technologies (PETs), which help to protect privacy by technically enforcing legal privacy principles. It starts by defining the legal foundations of PETs and presents a classification of PETs, a definition of the traditional privacy properties that PETs address, and metrics for measuring the level of privacy that PETs provide. Finally, a selection of the most relevant PETs is presented.

Place, publisher, year, edition, pages
Elsevier, 2013, Edition: 2
Keywords
privacy; privacy-enhancing technologies; personal privacy; legal privacy; legitimacy; purpose specification; purpose limitation; data minimization; transparency; data
National Category
Computer Sciences
Research subject
Computer Science
Identifiers
urn:nbn:se:kau:diva-45212 (URN); 10.1016/B978-0-12-394397-2.00043-X (DOI); 2-s2.0-84883964402 (Scopus ID); 9780123943972 (ISBN)
Available from: 2016-08-18. Created: 2016-08-16. Last updated: 2018-06-04. Bibliographically approved
Berthold, S. (2013). The Privacy Option Language: Specification & Implementation.
The Privacy Option Language: Specification & Implementation
2013 (English) Report (Other academic)
Abstract [en]

The data protection laws in Europe require that data controllers provide privacy policies to inform individuals about the prospective processing of their personal data. The ever-growing expressiveness of privacy policy languages allows policies to be specified in ever greater detail. This, together with new options for policy negotiation, has transformed rather general privacy policies into specific privacy contracts between the data controller and the individual.

In this report, we specify a privacy contract language and call it the Privacy Option Language. It is modelled after the analogy between financial option contracts and data disclosures, which has been presented in previous work and led to the Privacy Option notion. The language specification provides privacy by design through its data minimisation provisions, i.e., all contracts are automatically reduced to their canonical form so that individual differences in contract formulation are inherently normalised. The language specification is extensible in two ways. First, hooks are specified in the core language and can be used to connect sublanguages. The freedom to choose any suitable sublanguage makes it possible to specify language details independently of the core language. Second, the Privacy Option Language itself can be used as a sublanguage within a more general-domain language. We give examples of both types of extension. Additionally, we provide tools for evaluating semantics such as human-readable presentations of Privacy Options and contract management. The definitions of the semantics are kept simple and serve as templates for more practical ones.

All functionality can be checked by interactive tests in a standard multi-purpose programming language interpreter, since the Privacy Option Language is specified as an embedded domain-specific language within Haskell. Hands-on examples are provided along with the language specification.
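The canonical-form normalisation the abstract describes can be illustrated with a toy contract algebra. The sketch below is a hypothetical miniature in Python, not the report's Haskell DSL; the constructor names (`Zero`, `Disclose`, `Both`) and the reduction rules are ours:

```python
from dataclasses import dataclass

# Toy contract algebra: a contract is the empty contract, a single
# disclosure of a named attribute, or a combination of two contracts.

@dataclass(frozen=True)
class Zero:
    pass

@dataclass(frozen=True)
class Disclose:
    attribute: str

@dataclass(frozen=True)
class Both:
    left: object
    right: object

def canonical(contract):
    """Reduce a contract to a canonical form: flatten nested
    combinations, drop empty contracts, deduplicate, and sort the
    disclosed attributes, so that differently formulated but
    equivalent contracts compare equal."""
    def leaves(c):
        if isinstance(c, Zero):
            return []
        if isinstance(c, Disclose):
            return [c.attribute]
        return leaves(c.left) + leaves(c.right)
    return tuple(sorted(set(leaves(contract))))
```

For example, `Both(Disclose("email"), Both(Zero(), Disclose("name")))` and `Both(Disclose("name"), Disclose("email"))` normalise to the same canonical form, which is the data-minimisation point made in the abstract: individual differences in formulation disappear.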

Publisher
p. 70
Series
Karlstad University Studies, ISSN 1403-8099 ; 2013:29
Keywords
privacy policy language, inter-temporal privacy
National Category
Computer Sciences
Research subject
Computer Science
Identifiers
urn:nbn:se:kau:diva-27396 (URN); 978-91-7063-507-6 (ISBN)
Projects
PETweb II
Available from: 2013-05-27. Created: 2013-05-27. Last updated: 2018-01-11. Bibliographically approved
Berthold, S. (2011). Towards a Formal Language for Privacy Options. In: Simone Fischer-Hübner, Penny Duquenoy, Marit Hansen, Ronald Leenes & Ge Zhang (Eds.), Privacy and Identity Management for Life. Paper presented at the 6th IFIP WG 9.2, 9.6/11.7, 11.4, 11.6/PrimeLife International Summer School, Helsingborg, Sweden, August 2-6, 2010 (pp. 27-40). Springer
Towards a Formal Language for Privacy Options
2011 (English) In: Privacy and Identity Management for Life / [ed] Simone Fischer-Hübner, Penny Duquenoy, Marit Hansen, Ronald Leenes & Ge Zhang, Springer, 2011, p. 27-40. Conference paper, Published paper (Refereed)
Place, publisher, year, edition, pages
Springer, 2011
Series
IFIP Advances in Information and Communication Technology ; 352
National Category
Computer Sciences
Research subject
Computer Science
Identifiers
urn:nbn:se:kau:diva-7454 (URN); 10.1007/978-3-642-20769-3_3 (DOI); 000300068100003 ()
Conference
6th IFIP WG 9.2, 9.6/11.7, 11.4, 11.6/PrimeLife International Summer School, Helsingborg, Sweden, August 2-6, 2010
Available from: 2011-05-25. Created: 2011-05-25. Last updated: 2018-01-12. Bibliographically approved
Berthold, S. (2011). Towards Inter-temporal Privacy Metrics. (Licentiate dissertation). Karlstad: Karlstad University
Towards Inter-temporal Privacy Metrics
2011 (English) Licentiate thesis, comprehensive summary (Other academic)
Abstract [en]

Informational privacy of individuals has gained significantly in importance since information technology became widely deployed. Data, once digitalised, can be copied and distributed at negligible cost. This has dramatic consequences for individuals, who leave traces in the form of personal data whenever they interact with information technology. The right of individuals to informational privacy, in particular to control the flow and use of their personal data, is easily undermined by those controlling the information technology.

The objective of this thesis is the measurement of informational privacy, with a particular focus on scenarios where an individual discloses personal data to a second party, the data controller, which uses this data for re-identifying the individual within a set of others, the population. Several instances of this scenario are discussed in the appended papers, most notably one that adds a time dimension to the scenario to model the effects of the time passed between data disclosure and usage. This extended scenario leads to a new framework for inter-temporal privacy metrics.

The common dilemma of all privacy metrics is their dependence on the information available to the data controller. The same information may or may not be available to the individual; as a consequence, the individual may be misguided in his decisions by his limited access to the data controller’s information when using privacy metrics. The goal of this thesis is thus not only the specification of new privacy metrics, but also the contribution of ideas for mitigating this dilemma. Any solution, however, will be a combination of technological, economic, and legal means rather than a purely technical one.

Place, publisher, year, edition, pages
Karlstad: Karlstad University, 2011. p. 17
Series
Karlstad University Studies, ISSN 1403-8099 ; 2011:25
Keywords
privacy, unlinkability, metrics, uncertainty, valuation process, domain-specific language, anonymous communication
National Category
Computer Sciences
Research subject
Computer Science
Identifiers
urn:nbn:se:kau:diva-7319 (URN); 978-91-7063-357-7 (ISBN)
Presentation
2011-05-23, Sjöströmsalen 1B309, Karlstads universitet, Karlstad, 13:00 (English)
Available from: 2011-05-26. Created: 2011-04-15. Last updated: 2018-01-12. Bibliographically approved
Zhang, G. & Berthold, S. (2010). Hidden VoIP Calling Records from Networking Intermediaries. Paper presented at Principles, System and Applications of IP Telecommunications (IPTCOMM2010). Munich, Germany: ACM
Hidden VoIP Calling Records from Networking Intermediaries
2010 (English) Conference paper, Published paper (Refereed)
Abstract

While confidentiality of telephone conversation contents has recently received considerable attention in Internet telephony (VoIP), the protection of the caller–callee relation is largely unexplored. From the privacy research community we learn that this relation can be protected by Chaum's mixes. In early proposals of mix networks, however, it was reasonable to assume that high latency is acceptable. While the general idea has been deployed for low latency networks as well, important security measures had to be dropped for achieving performance. The result is protection against a considerably weaker adversary model in exchange for usability. In this paper, we show that it is unjustified to conclude that low latency network applications imply weak protection. On the contrary, we argue that current Internet telephony protocols provide a range of promising preconditions for adopting anonymity services with security properties similar to those of high latency anonymity networks. We expect that implementing anonymity services becomes a major challenge as customer privacy becomes one of the most important secondary goals in any (commercial) Internet application.

Place, publisher, year, edition, pages
Munich, Germany: ACM, 2010
Keywords
anonymity, voip, mix networks
National Category
Computer Sciences
Research subject
Computer Science
Identifiers
urn:nbn:se:kau:diva-11443 (URN)
Conference
Principles, System and Applications of IP Telecommunications (IPTCOMM2010)
Available from: 2012-02-08. Created: 2012-02-08. Last updated: 2018-01-12. Bibliographically approved
Berthold, S. & Böhme, R. (2010). Valuating Privacy with Option Pricing Theory. In: Tyler Moore, David Pym, and Christos Ioannidis (Eds.), Economics of Information Security and Privacy (pp. 187-193). New York: Springer
Valuating Privacy with Option Pricing Theory
2010 (English) In: Economics of Information Security and Privacy / [ed] Tyler Moore, David Pym, and Christos Ioannidis, New York: Springer, 2010, p. 187-193. Chapter in book (Refereed)
Abstract

One of the key challenges in the information society is responsible handling of personal data. An often-cited reason why people fail to make rational decisions regarding their own informational privacy is the high uncertainty about future consequences of information disclosures today. This chapter builds an analogy to financial options and draws on principles of option pricing to account for this uncertainty in the valuation of privacy. For this purpose, the development of a data subject's personal attributes over time and the development of the attribute distribution in the population are modelled as two stochastic processes, which fit into the Binomial Option Pricing Model (BOPM). Possible applications of such valuation methods to guide decision support in future privacy-enhancing technologies (PETs) are sketched.
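The Binomial Option Pricing Model the abstract refers to values an option by risk-neutral backward induction over an up/down lattice. The following is a standard textbook sketch of a European call in that model, not the chapter's privacy-specific adaptation; the parameter names are ours:

```python
import math

def bopm_call_value(s0, strike, up, down, rate, steps):
    """European call value in an n-step binomial lattice via
    risk-neutral backward induction.
    s0: current value of the underlying; up/down: per-step factors
    applied to the underlying; rate: continuously compounded
    risk-free rate per step."""
    # Risk-neutral probability of an up-move.
    q = (math.exp(rate) - down) / (up - down)
    # Payoffs at the terminal nodes (j = number of up-moves).
    values = [max(s0 * up**j * down**(steps - j) - strike, 0.0)
              for j in range(steps + 1)]
    # Discounted risk-neutral expectation, one level back at a time.
    for _ in range(steps):
        values = [math.exp(-rate) * (q * values[j + 1] + (1 - q) * values[j])
                  for j in range(len(values) - 1)]
    return values[0]
```

In the chapter's analogy, the stochastic process being lattice-modelled is the development of a data subject's attributes and their distribution in the population, rather than a stock price; the backward-induction machinery is the same.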

Place, publisher, year, edition, pages
New York: Springer, 2010
Keywords
privacy, metrics, option pricing theory
National Category
Computer Sciences
Research subject
Computer Science
Identifiers
urn:nbn:se:kau:diva-10132 (URN); 9781441969668 (ISBN)
Available from: 2012-02-08. Created: 2012-02-08. Last updated: 2018-01-12. Bibliographically approved