Towards a Differential Privacy Theory for Edge-Labeled Directed Graphs
Karlstad University, Faculty of Health, Science and Technology (starting 2013), Department of Mathematics and Computer Science (from 2013)
ORCID iD: 0000-0001-9535-6621
2018 (English). In: SICHERHEIT 2018 / [ed] Hanno Langweg, Michael Meier, Bernhard Witt & Delphine Reinhardt, Gesellschaft für Informatik, 2018, p. 273-278. Conference paper, Published paper (Other academic)
Abstract [en]

Increasingly, information is represented as graphs, for example social network data, financial transactions, and semantic assertions in the Semantic Web context. Mining such data about people for useful insights has enormous social and commercial benefits. However, the privacy of the individuals in the datasets is a major concern. Hence, the challenge is to enable analyses over a dataset while preserving the privacy of the individuals in it. Differential privacy is a privacy model that offers a rigorous definition of privacy: from the released results of an analysis, it is 'difficult' to determine whether or not an individual contributed to the results. The differential privacy model has been extensively studied in the context of relational databases. Nevertheless, there has been growing interest in adapting differential privacy to graph data. Previous research on applying the differential privacy model to graphs focuses on unlabeled graphs. However, in many applications graphs consist of labeled edges, and analyses that take the labels into account can be more expressive. Thus, it is of interest to study the adaptation of differential privacy to edge-labeled directed graphs. In this paper, we present our foundational work towards that aim. First, we present three variant notions of an individual's information being/not being in the analyzed graph, which is the basis for formalizing the differential privacy guarantee. Next, we present our plan to study particular graph statistics under the differential privacy model, given the choice of the notion that represents the individual's information being/not being in the analyzed graph.
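The guarantee sketched in the abstract is commonly achieved by adding calibrated noise to a released statistic. The following is a minimal illustration of the standard Laplace mechanism applied to a per-label edge count of an edge-labeled directed graph; it is not the paper's own mechanism, and the triple-based graph representation and unit sensitivity are assumptions made for this example (under edge-level privacy, adding or removing one edge changes the count by at most 1).

```python
import math
import random

def laplace_noise(scale):
    # Inverse-CDF sampling from a Laplace(0, scale) distribution.
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1 - 2 * abs(u))

def noisy_edge_count(edges, label, epsilon, sensitivity=1.0):
    """Release the count of edges carrying `label` under epsilon-DP.

    `edges` is a list of (source, label, target) triples. The Laplace
    scale is sensitivity / epsilon: smaller epsilon means more noise
    and therefore stronger privacy.
    """
    true_count = sum(1 for (_, lbl, _) in edges if lbl == label)
    return true_count + laplace_noise(sensitivity / epsilon)

# Tiny edge-labeled directed graph as (source, label, target) triples.
g = [("alice", "knows", "bob"),
     ("bob", "knows", "carol"),
     ("alice", "worksFor", "acme")]
released = noisy_edge_count(g, "knows", epsilon=1.0)
```

With a large epsilon the released value stays close to the true count of 2; with a small epsilon the noise dominates, which is the privacy/utility trade-off the model formalizes.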

Place, publisher, year, edition, pages
Gesellschaft für Informatik, 2018. p. 273-278
National Category
Computer Systems
Research subject
Computer Science
Identifiers
URN: urn:nbn:se:kau:diva-80365
DOI: 10.18420/sicherheit2018_24
OAI: oai:DiVA.org:kau-80365
DiVA, id: diva2:1470449
Conference
SICHERHEIT 2018
Available from: 2020-09-24 Created: 2020-09-24 Last updated: 2020-12-21. Bibliographically approved
In thesis
1. Go the Extra Mile for Accountability: Privacy Protection Measures for Emerging Information Management Systems
2020 (English). Doctoral thesis, comprehensive summary (Other academic)
Abstract [en]

The thesis takes a systematic approach to designing and developing techniques for preventing personal data exposure in next-generation information management systems, with the aim of ensuring the accountability of data controllers (the entities that process personal data).

With the rapid growth of communication technologies, heterogeneous computing environments that offer cost-effective data processing alternatives are emerging. Thus, the information flow of personal data extends beyond the information processing practices of data controllers, involving other parties that process personal data. Moreover, to enable interoperability, data in such environments is given well-defined structure and meaning by means of graph-based data models. Graphs inherently emphasize connections between things, and when graphs are used to model personal data records, the connections and the network structure may reveal intimate details about our interconnected society.

In the European context, the General Data Protection Regulation (GDPR) provides a legal framework for personal data processing. The GDPR stipulates specific consequences for non-compliance with the data protection principles, with a view to ensuring the accountability of data controllers in their personal data processing practices. Widely recognized approaches to implementing the Privacy by Design (PbD) principle in the software application development process are broader in scope; hence, processes for implementing personal data protection techniques for specific systems are not their central aspect.

In order to influence the implementation of techniques for preventing the personal data misuse associated with sharing data represented as graphs, a conceptual mechanism for building privacy techniques is developed. The conceptual mechanism consists of three elements: a risk analysis for Semantic Web information management systems using the Privacy Impact Assessment (PIA) approach, two privacy protection techniques for graphs enriched with semantics, and a model for evaluating adherence to the goals resulting from the risk analysis. The privacy protection techniques include an access control model that embodies the purpose limitation principle, an essential aspect of the GDPR, and adaptations of the differential privacy model for graphs with edge labels. The access control model takes the semantics of the graph elements into account when authorizing access to the graph data. In our differential privacy adaptations, we define, and study through experiments, four different approaches to adapting the differential privacy model to edge-labeled graph datasets.
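The purpose limitation principle mentioned above can be made concrete with a small sketch. The policy structure and names below are illustrative assumptions, not the thesis's actual access control model: a per-edge-label consent policy is checked against the purpose a requester declares, so a query only ever sees the edges whose label permits that purpose.

```python
# Hypothetical consent policy: which processing purposes are permitted
# for edges carrying a given label. Labels and purposes are invented
# for illustration only.
POLICY = {
    "knows":    {"research"},
    "worksFor": {"research", "billing"},
}

def authorized_edges(edges, declared_purpose):
    """Filter (source, label, target) triples by the declared purpose.

    An edge is visible only if its label's policy entry lists the
    purpose; unknown labels are denied by default.
    """
    return [e for e in edges if declared_purpose in POLICY.get(e[1], set())]

g = [("alice", "knows", "bob"),
     ("alice", "worksFor", "acme")]
# A billing query may see employment edges but not social relations.
billing_view = authorized_edges(g, "billing")
```

Denying unknown labels by default keeps the sketch fail-safe: an edge whose label has no policy entry is never released, regardless of the declared purpose.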

Place, publisher, year, edition, pages
Karlstad: Karlstads universitet, 2020. p. 30
Series
Karlstad University Studies, ISSN 1403-8099 ; 2020:32
Keywords
accountability, Privacy By Design (PdD), privacy risks, Privacy Impact Assessment (PIA), audits, privacy compliance, access control, differential privacy, graphs, edge-labeled graphs, Semantic Web
National Category
Computer Sciences
Research subject
Computer Science
Identifiers
urn:nbn:se:kau:diva-80531 (URN)
978-91-7867-153-3 (ISBN)
978-91-7867-157-1 (ISBN)
Public defence
2020-10-30, Frödingsalen, 1B364, Karlstads Universitet, Karlstad, 09:00 (English)
Opponent
Supervisors
Note

Article 5 was included in the thesis as a manuscript; it has now been published.

Available from: 2020-10-13 Created: 2020-09-28 Last updated: 2021-03-19. Bibliographically approved

Open Access in DiVA

No full text in DiVA

Authority records

Reuben, Jenni
