Automated Log Audits for Privacy Compliance Validation: A Literature Survey
Karlstad University, Faculty of Health, Science and Technology (starting 2013), Department of Mathematics and Computer Science (from 2013), Karlstad University (PriSec). ORCID iD: 0000-0001-9535-6621
Karlstad University, Faculty of Health, Science and Technology (starting 2013), Department of Mathematics and Computer Science (from 2013), Karlstad University (PriSec). ORCID iD: 0000-0002-9980-3473
Karlstad University, Division for Information Technology (PriSec). ORCID iD: 0000-0002-6938-4466
2016 (English). In: Privacy and Identity Management. Time for a Revolution?: 10th IFIP WG 9.2, 9.5, 9.6/11.7, 11.4, 11.6/SIG 9.2.2 International Summer School, Edinburgh, UK, August 16-21, 2015, Revised Selected Papers, Springer, 2016, Vol. 476, p. 312-326. Conference paper, Published paper (Refereed)
Abstract [en]

Log audits are the technical means to retrospectively reconstruct and analyze system activities in order to determine whether system events are in accordance with the rules. In the case of privacy compliance, compliance-by-detection approaches are promoted for achieving data protection obligations such as accountability and transparency. However, significant challenges remain in fulfilling privacy requirements through these approaches. This paper presents a systematic literature review that reveals the theoretical foundations of the state-of-the-art detective approaches for privacy compliance. We developed a taxonomy based on the technical design, describing the contextual relationships among the existing solutions. The technical designs of the existing privacy detection solutions are primarily classified into privacy misuse detection and privacy anomaly detection. However, the design principle of these solutions is to validate need-to-know and access control obligations; hence the state-of-the-art privacy compliance validation mechanisms focus on usage limitations and accountability. The privacy compliance guarantee they provide is modest when compared to the requirements arising from privacy regulations and data protection obligations.
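The misuse/anomaly distinction in the taxonomy can be pictured with a small sketch. The following Python fragment is a hypothetical illustration, not code from the paper: misuse detection matches logged accesses against an explicit permitted-purpose policy, while anomaly detection flags accesses that deviate strongly from an assumed historical baseline. All names and values are made up.

```python
# Hypothetical log entries: (user, role, action, purpose, records_accessed)
LOG = [
    ("alice", "nurse",   "read", "treatment", 3),
    ("bob",   "billing", "read", "marketing", 1),    # purpose not permitted
    ("carol", "nurse",   "read", "treatment", 250),  # unusually large access
]

# Privacy misuse detection: explicit rules describing permitted (role, purpose) pairs.
PERMITTED = {("nurse", "treatment"), ("billing", "billing")}

def misuse_alerts(log):
    """Flag events whose (role, purpose) pair violates the stated policy."""
    return [e for e in log if (e[1], e[3]) not in PERMITTED]

# Privacy anomaly detection: flag events whose volume greatly exceeds a
# (made-up) historical per-role baseline.
BASELINE = {"nurse": 5, "billing": 2}

def anomaly_alerts(log, factor=10):
    """Flag events that access far more records than the role's baseline."""
    return [e for e in log if e[4] > factor * BASELINE.get(e[1], 1)]

print(misuse_alerts(LOG))   # -> bob's marketing access
print(anomaly_alerts(LOG))  # -> carol's bulk access
```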

Place, publisher, year, edition, pages
Springer, 2016. Vol. 476, p. 312-326
Series
IFIP Advances in Information and Communication Technology, ISSN 1868-4238 ; 476
Keywords [en]
Log audit, privacy violation detection, privacy compliance, accountability, transparency
National Category
Computer and Information Sciences
Research subject
Computer Science
Identifiers
URN: urn:nbn:se:kau:diva-38920. DOI: 10.1007/978-3-319-41763-9_21. ISBN: 978-3-319-41762-2 (print). ISBN: 978-3-319-41763-9 (print). OAI: oai:DiVA.org:kau-38920. DiVA, id: diva2:885485
Conference
The IFIP Summer School 2015, Edinburgh, 16-21 August 2015.
Funder
EU, FP7, Seventh Framework Programme, FP7-ICT-2011-8-317550-A4CLOUD
Note

The school has a two-phase review process for submitted papers. In the first phase submitted papers (short versions) are reviewed and selected for presentation at the school. After the school, these papers can be revised (so that they can benefit from the discussion that occurred at the school) and are then reviewed again for inclusion in the school’s proceedings which will be published by Springer.

Available from: 2015-12-18. Created: 2015-12-18. Last updated: 2020-09-28. Bibliographically approved.
In thesis
1. Privacy-aware Use of Accountability Evidence
2017 (English). Licentiate thesis, comprehensive summary (Other academic)
Abstract [en]

This thesis deals with the evidence that enables accountability, the privacy risks involved in using it, and a privacy-aware solution to the problem of unauthorized evidence disclosure.

Legal means to protect the privacy of an individual are anchored in the data protection perspective, i.e., in the responsible collection and use of personal data. Accountability plays a crucial role in such legal privacy frameworks for assuring an individual's privacy. In the European context, the accountability principle is pervasive in the measures mandated by the General Data Protection Regulation. In general, these measures are technically achieved through automated privacy audits. System traces that record system activities are the essential inputs to those automated audits. Nevertheless, the traces that enable accountability are themselves subject to privacy risks, because in most cases they inform about the processing of personal data. Therefore, ensuring the privacy of the accountability traces is as important as ensuring the privacy of the personal data itself. However, by and large, research involving accountability traces is concerned with storage, interoperability and analytics challenges rather than with the privacy implications of processing them.

This dissertation focuses both on the application of accountability evidence, such as in automated privacy audits, and on its privacy-aware use. The overall aim of the thesis is to provide a conceptual understanding of the privacy compliance research domain and to contribute to solutions that promote privacy-aware use of the traces that enable accountability. To address the first part of the objective, a systematic study of the existing body of knowledge on automated privacy compliance is conducted. As a result, the state of the art is conceptualized as taxonomies. The second part of the objective is accomplished through two results: first, a systematic understanding of the privacy challenges involved in processing system traces is obtained; second, a model for privacy-aware access restrictions is proposed and formalized in order to prevent illegitimate access to the system traces. Access to accountability traces such as provenance is required for the automatic fulfillment of accountability obligations, but these traces themselves contain personally identifiable information; hence, this thesis provides a solution to prevent unauthorized access to provenance traces.

Abstract [en]

This thesis deals with the evidence that enables accountability and the privacy risks involved in using it, and proposes a privacy-aware solution for preventing unauthorized evidence disclosure.

Accountability plays a crucial role in legal privacy frameworks for assuring individuals' privacy. In the European context, the accountability principle is pervasive in the measures mandated by the General Data Protection Regulation. In general, these measures are technically achieved through automated privacy audits. Traces that record system activities are the essential inputs to those audits. Nevertheless, such traces that enable accountability are themselves subject to privacy risks, because in most cases they inform about the processing of personal data. Therefore, ensuring the privacy of the traces is as important as ensuring the privacy of the personal data itself. The aim of the thesis is to provide a conceptual understanding of automated privacy compliance research and to contribute to solutions that promote privacy-aware use of the accountability traces. This is achieved in this dissertation through a systematic study of the existing body of knowledge on automated privacy compliance, a systematic analysis of the privacy challenges involved in processing the traces, and a proposal of a privacy-aware access control model for preventing illegitimate access to the traces.
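As a rough illustration of the kind of access restriction described in the abstracts above, the Python sketch below (an assumption for illustration, not the thesis's formal model) releases a provenance trace only when the requester is authorized for every data subject the trace refers to. The class and field names are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class ProvenanceTrace:
    trace_id: str
    activity: str
    data_subjects: frozenset  # personal data the trace refers to

@dataclass
class Requester:
    name: str
    authorized_subjects: frozenset = frozenset()

def release_trace(trace: ProvenanceTrace, requester: Requester) -> ProvenanceTrace:
    """Return the trace only if the requester may see all referenced subjects."""
    if trace.data_subjects <= requester.authorized_subjects:
        return trace
    raise PermissionError(
        f"{requester.name} is not authorized for all subjects in {trace.trace_id}"
    )

# Example: an auditor cleared only for subject 's1' cannot read a trace
# that also reveals processing of 's2'.
trace = ProvenanceTrace("t-001", "export", frozenset({"s1", "s2"}))
auditor = Requester("auditor", frozenset({"s1"}))
# release_trace(trace, auditor)  # -> raises PermissionError
```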

Place, publisher, year, edition, pages
Karlstads universitet, 2017. p. 79
Series
Karlstad University Studies, ISSN 1403-8099 ; 2017:24
Keywords
Privacy, accountability, audit, evidence, system traces, provenance, access control, privacy compliance, security
National Category
Computer Systems
Research subject
Computer Science
Identifiers
urn:nbn:se:kau:diva-48550 (URN); 978-91-7063-788-9 (ISBN); 978-91-7063-789-6 (ISBN)
Presentation
2017-06-12, 21A342, Eva Eriksson salen, Universitetsgatan 2, Karlstad, 13:15 (English)
Available from: 2017-05-22. Created: 2017-05-10. Last updated: 2019-06-17. Bibliographically approved.
2. Go the Extra Mile for Accountability: Privacy Protection Measures for Emerging Information Management Systems
2020 (English). Doctoral thesis, comprehensive summary (Other academic)
Abstract [en]

The thesis considers a systematic approach to design and develop techniques for preventing personal data exposure in next generation information management systems with the aim of ensuring accountability of data controllers (entities that process personal data).

With the rapid growth of communication technologies, heterogeneous computing environments that offer cost-effective data processing alternatives are emerging. Thus, the information flow of personal data spans beyond the information processing practices of data controllers, thereby involving other parties that process personal data. Moreover, in order to enable interoperability, data in such environments is given well-defined structure and meaning by means of graph-based data models. Graphs inherently emphasize connections between things, and when graphs are used to model personal data records, the connections and the network structure may reveal intimate details about our interconnected society.

In the European context, the General Data Protection Regulation (GDPR) provides a legal framework for personal data processing. The GDPR stipulates specific consequences for non-compliance with the data protection principles, with a view to ensuring the accountability of data controllers in their personal data processing practices. Widely recognized approaches to implementing the Privacy by Design (PbD) principle in the software application development process are broader in scope. Hence, processes to implement personal data protection techniques for specific systems are not the central aspect of the aforementioned approaches.

In order to influence the implementation of techniques for preventing personal data misuse associated with the sharing of data represented as graphs, a conceptual mechanism for building privacy techniques is developed. The conceptual mechanism consists of three elements: a risk analysis for Semantic Web information management systems using the Privacy Impact Assessment (PIA) approach, two privacy protection techniques for graphs enriched with semantics, and a model for evaluating adherence to the goals resulting from the risk analysis. The privacy protection techniques include an access control model that embodies the purpose limitation principle, an essential aspect of the GDPR, and adaptations of the differential privacy model for graphs with edge labels. The access control model takes into account the semantics of the graph elements when authorizing access to the graph data. In our differential privacy adaptations, we define and study, through experiments, four different approaches to adapting the differential privacy model to edge-labeled graph datasets.

Abstract [en]

The thesis considers a systematic approach to design and develop techniques for preventing personal data exposure in next generation information management systems with the aim of ensuring accountability of data controllers (entities that process personal data).

With the rapid growth of communication technologies, heterogeneous computing environments that offer cost-effective data processing alternatives are emerging. Thus, the information flow of personal data spans beyond the information processing practices of data controllers, thereby involving other parties that process personal data. Moreover, in order to enable interoperability, data in such environments is given well-defined structure and meaning by means of graph-based data models. Graphs inherently emphasize connections between things, and when graphs are used to model personal data records, the connections and the network structure may reveal intimate details about our interconnected society.

The GDPR stipulates specific consequences for non-compliance with the data protection principles, with a view to ensuring the accountability of data controllers in their personal data processing practices. Widely recognized approaches to implementing the Privacy by Design (PbD) principle in the software application development process are broader in scope. Hence, processes to implement privacy techniques for specific systems are not the central aspect of the aforementioned approaches.

In order to influence the implementation of techniques for preventing personal data misuse associated with the sharing of data represented as graphs, a conceptual mechanism for building privacy techniques is developed. The conceptual mechanism consists of three elements: a risk analysis for Semantic Web information management systems using the Privacy Impact Assessment (PIA) approach, two privacy protection techniques for graphs enriched with semantics, and a model for evaluating adherence to the goals resulting from the risk analysis.
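To make the purpose limitation idea from the detailed abstract concrete, the sketch below (a simplified assumption, not the access control model defined in the thesis) annotates RDF-like triples with the purposes for which they may be used and answers a query only with triples whose allowed purposes include the query's declared purpose. The data and names are invented.

```python
# Hypothetical annotated triples: (subject, predicate, object, allowed_purposes)
GRAPH = [
    ("patient:42", "hasDiagnosis", "J45",       {"treatment"}),
    ("patient:42", "hasAddress",   "Main St 1", {"treatment", "billing"}),
    ("patient:42", "hasEmail",     "p42@x.org", {"treatment"}),
]

def query(graph, predicate, purpose):
    """Return matching triples that may be used for the declared purpose."""
    return [
        (s, p, o)
        for (s, p, o, purposes) in graph
        if p == predicate and purpose in purposes
    ]

print(query(GRAPH, "hasAddress", "billing"))    # released: purpose is allowed
print(query(GRAPH, "hasDiagnosis", "billing"))  # withheld: purpose not allowed
```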

Place, publisher, year, edition, pages
Karlstad: Karlstads universitet, 2020. p. 30
Series
Karlstad University Studies, ISSN 1403-8099 ; 2020:32
Keywords
accountability, Privacy by Design (PbD), privacy risks, Privacy Impact Assessment (PIA), audits, privacy compliance, access control, differential privacy, graphs, edge-labeled graphs, Semantic Web
National Category
Computer Sciences
Research subject
Computer Science
Identifiers
urn:nbn:se:kau:diva-80531 (URN); 978-91-7867-153-3 (ISBN); 978-91-7867-157-1 (ISBN)
Public defence
2020-10-30, Frödingsalen, 1B364, Karlstads Universitet, Karlstad, 09:00 (English)
Note

Article 5 was included in the thesis as a manuscript; it has since been published.

Available from: 2020-10-13. Created: 2020-09-28. Last updated: 2021-03-19. Bibliographically approved.

Open Access in DiVA

No full text in DiVA

Other links

Publisher's full text

Authority records

Reuben, Jenni; Martucci, Leonardo A.; Fischer-Hübner, Simone
