Fischer-Hübner, Simone (ORCID iD: orcid.org/0000-0002-6938-4466)
Publications (10 of 201)
Morel, V. & Fischer-Hübner, S. (2023). Automating privacy decisions - where to draw the line? In: Proceedings - 8th IEEE European Symposium on Security and Privacy Workshops. Paper presented at 2023 IEEE European Symposium on Security and Privacy Workshops (EuroS&PW), Delft, Netherlands, July 3-7, 2023 (pp. 108-116). Institute of Electrical and Electronics Engineers (IEEE)
2023 (English). In: Proceedings - 8th IEEE European Symposium on Security and Privacy Workshops, Institute of Electrical and Electronics Engineers (IEEE), 2023, p. 108-116. Conference paper, Published paper (Refereed)
Abstract [en]

Users are often overwhelmed by privacy decisions to manage their personal data, which can happen on the web, in mobile, and in IoT environments. These decisions can take various forms, such as setting privacy permissions or privacy preferences, responding to consent requests, or intervening to 'reject' the processing of one's personal data, and each can have different legal impacts. In all cases and for all types of decisions, scholars and industry have proposed tools to better automate the process of privacy decisions at different levels, in order to enhance usability. In this paper, we provide an overview of the main challenges raised by the automation of privacy decisions, together with a classification scheme for the existing and envisioned work and proposals addressing the automation of privacy decisions.

Place, publisher, year, edition, pages
Institute of Electrical and Electronics Engineers (IEEE), 2023
Keywords
Data privacy, Classification scheme, Consent, GDPR, Permission, Privacy decision, Privacy preferences, Automation
National Category
Computer Sciences
Research subject
Computer Science
Identifiers
urn:nbn:se:kau:diva-96506 (URN); 10.1109/EuroSPW59978.2023.00017 (DOI); 2-s2.0-85168247258 (Scopus ID)
Conference
2023 IEEE European Symposium on Security and Privacy Workshops (EuroS&PW), Delft, Netherlands, July 3-7, 2023.
Funder
Knut and Alice Wallenberg Foundation
Available from: 2023-08-29. Created: 2023-08-29. Last updated: 2023-08-29. Bibliographically approved
De Cock, M., Zekeriya, E., Fischer-Hübner, S., Jensen, M., Klakow, D. & Teixeira, F. (2023). Privacy enhancing technologies. In: Simone Fischer-Hübner; Dietrich Klakow; Peggy Valcke; Emmanuel Vincent (Ed.), Privacy in Speech and Language Technology: Report from Dagstuhl Seminar 22342 (pp. 90-99). Schloss Dagstuhl, Leibniz-Zentrum für Informatik
2023 (English). In: Privacy in Speech and Language Technology: Report from Dagstuhl Seminar 22342 / [ed] Simone Fischer-Hübner; Dietrich Klakow; Peggy Valcke; Emmanuel Vincent, Schloss Dagstuhl, Leibniz-Zentrum für Informatik, 2023, p. 90-99. Chapter in book (Refereed)
Abstract [en]

Privacy-enhancing technologies (PETs) provide technical building blocks for achieving privacy by design and can be defined as technologies that embody fundamental data protection goals [13], including the goals of unlinkability, intervenability, transparency and the classical CIA (confidentiality, integrity, availability) security goals, by minimizing personal data collection and use, maximizing data security, and empowering individuals. The privacy-by-design principle of a positive sum for speech and language technologies should enable users to benefit from the rich functions of these technologies while protecting the users' privacy at the same time. The fundamental question is how to achieve privacy by design for speech and language technology without hampering the services. To achieve this goal, different PETs exist that can be utilized for this purpose. Below, we first discuss what types of personal data are accessible via speech and text and should be the target of protection by PETs. Then, we provide an overview of PETs that can provide protection and discuss their limitations and the challenges that arise when they are used for speech and language technologies.

Place, publisher, year, edition, pages
Schloss Dagstuhl, Leibniz-Zentrum für Informatik, 2023
National Category
Computer Sciences
Research subject
Computer Science
Identifiers
urn:nbn:se:kau:diva-94132 (URN); 10.4230/DagRep.12.8.60 (DOI)
Available from: 2023-04-03. Created: 2023-04-03. Last updated: 2023-04-03. Bibliographically approved
Fischer-Hübner, S., Hansen, M., Hoepman, J.-H. & Jensen, M. (2023). Privacy-Enhancing Technologies and Anonymisation in Light of GDPR and Machine Learning. In: Felix Bieker, Joachim Meyer, Sebastian Pape, Ina Schiering, Andreas Weich (Ed.), Privacy and Identity Management. Paper presented at IFIP International Summer School on Privacy and Identity Management, [Digital], August 30-September 2, 2022 (pp. 11-20). Springer, 671 IFIP
2023 (English). In: Privacy and Identity Management / [ed] Felix Bieker, Joachim Meyer, Sebastian Pape, Ina Schiering, Andreas Weich, Springer, 2023, Vol. 671 IFIP, p. 11-20. Conference paper, Published paper (Refereed)
Abstract [en]

The use of Privacy-Enhancing Technologies in the field of data anonymisation and pseudonymisation raises many questions with respect to legal compliance under the GDPR and current international data protection legislation. In particular, the use of innovative technologies based on machine learning may increase or decrease risks to data protection. A workshop held at the IFIP Summer School on Privacy and Identity Management showed the complexity of this field and the need for further interdisciplinary research on the basis of an improved joint understanding of legal and technical concepts.

Place, publisher, year, edition, pages
Springer, 2023
Series
IFIP Advances in Information and Communication Technology, ISSN 1868-4238, E-ISSN 1868-422X
Keywords
Machine learning, Anonymization, Data anonymization, Innovative technology, Legal compliance, Privacy enhancing technologies, Summer school, Data privacy
National Category
Computer Sciences
Research subject
Computer Science
Identifiers
urn:nbn:se:kau:diva-97456 (URN); 10.1007/978-3-031-31971-6_2 (DOI); 2-s2.0-85173556437 (Scopus ID)
Conference
IFIP International Summer School on Privacy and Identity Management, [Digital], August 30-September 2, 2022.
Available from: 2023-11-22. Created: 2023-11-22. Last updated: 2023-11-22. Bibliographically approved
Alaqra, A. S., Karegar, F. & Fischer-Hübner, S. (2023). Structural and functional explanations for informing lay and expert users: the case of functional encryption. Paper presented at 23rd Privacy Enhancing Technologies Symposium (PETS 2023), July 10–15, 2023, Lausanne, Switzerland and Online. Proceedings on Privacy Enhancing Technologies, 2023(4), 359-380
2023 (English). In: Proceedings on Privacy Enhancing Technologies, E-ISSN 2299-0984, Vol. 2023, no. 4, p. 359-380. Article in journal (Refereed) Published
Abstract [en]

Usable explanations of privacy-enhancing technologies (PETs) help users make more informed privacy decisions, but explanations of PETs are generally geared toward individuals with more technical knowledge. To explain functional encryption (FE) to experts and laypersons, we investigate structural and functional explanations and explore users' interests and preferences, as well as how these affect users' comprehension and decisions about sharing data. To this end, with an EU-based population, we conducted four focus groups, in combination with walk-throughs, with 13 participants in the first study, followed by an online survey with 347 experts and 370 laypersons. Both explanations were considered useful in fulfilling the different needs of participants interested in the privacy policy information. Participants, regardless of their expertise, trusted and were more satisfied with the structural explanation. However, functional explanations contributed more to all participants' comprehension. We therefore recommend combining both types of explanations for a usable privacy policy.

Place, publisher, year, edition, pages
Privacy Enhancing Technologies Board, 2023
Keywords
functional encryption, functional & structural explanation, transparency, privacy, usability, user comprehension, mental models
National Category
Computer Sciences Information Systems
Research subject
Information Systems; Computer Science
Identifiers
urn:nbn:se:kau:diva-96542 (URN); 10.56553/popets-2023-0115 (DOI)
Conference
23rd Privacy Enhancing Technologies Symposium (PETS 2023), July 10–15, 2023. Lausanne, Switzerland and Online.
Available from: 2023-08-31. Created: 2023-08-31. Last updated: 2023-09-04. Bibliographically approved
Romare, P., Morel, V., Karegar, F. & Fischer-Hübner, S. (2023). Tapping into Privacy: A Study of User Preferences and Concerns on Trigger-Action Platforms. In: 20th Annual International Conference on Privacy, Security and Trust (PST). Paper presented at 20th Annual International Conference on Privacy, Security and Trust, PST, Copenhagen, Denmark, August 21-23, 2023 (pp. 1-12). Institute of Electrical and Electronics Engineers (IEEE)
2023 (English). In: 20th Annual International Conference on Privacy, Security and Trust (PST), Institute of Electrical and Electronics Engineers (IEEE), 2023, p. 1-12. Conference paper, Published paper (Refereed)
Abstract [en]

Internet of Things (IoT) devices are rapidly increasing in popularity, with more individuals using Internet-connected devices that continuously monitor their activities. This work explores end-users' privacy concerns and expectations related to Trigger-Action Platforms (TAPs) in the context of the IoT. TAPs allow users to customize their smart environments by creating rules that trigger actions based on specific events or conditions. As personal data flows between different entities, there is a potential for privacy concerns. In this study, we aimed to identify the privacy factors that impact users' concerns and preferences for using IoT TAPs. To address this research objective, we conducted three focus groups with 15 participants and extracted nine themes related to privacy factors using thematic analysis. Our participants particularly prefer to have control and transparency over the automation and are concerned about unexpected data inferences, risks and unforeseen consequences for themselves and for bystanders caused by the automation. The identified privacy factors can help researchers derive predefined, selectable profiles of privacy permission settings for IoT TAPs that represent the privacy preferences of different types of users, as a basis for designing usable privacy controls for IoT TAPs.
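To make the trigger-action model concrete (an illustrative sketch, not code from the study; all names and event fields are invented): a TAP rule pairs a trigger predicate over an incoming event with an action that consumes the same event data, which is precisely where the personal-data flow the participants worried about occurs.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Rule:
    name: str
    trigger: Callable[[dict], bool]  # predicate over an incoming IoT event
    action: Callable[[dict], str]    # runs on the event when the trigger matches

def evaluate(rules: list, event: dict) -> list:
    """Fire every rule whose trigger matches the event.

    Note that the full event dict, including any personal data it carries,
    flows on to each matching action -- the data flow the study examines.
    """
    return [rule.action(event) for rule in rules if rule.trigger(event)]

# Example rule: motion in the hall turns the hall light on.
hall_light = Rule(
    name="hall-light-on",
    trigger=lambda e: e.get("type") == "motion" and e.get("room") == "hall",
    action=lambda e: f"light:{e['room']}:on",
)
```

Even this toy rule shows why bystanders matter: a motion event reveals someone's presence in a room regardless of whether that person authored the rule.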

Place, publisher, year, edition, pages
Institute of Electrical and Electronics Engineers (IEEE), 2023
Keywords
User profile; Condition; Dataflow; End-users; Focus groups; Privacy; Privacy concerns; Privacy preferences; Smart environment; Trigger-action platform; User’s preferences; Internet of things
National Category
Computer Sciences
Research subject
Computer Science
Identifiers
urn:nbn:se:kau:diva-97902 (URN); 10.1109/PST58708.2023.10320180 (DOI); 2-s2.0-85179547300 (Scopus ID); 979-8-3503-1387-1 (ISBN); 979-8-3503-1388-8 (ISBN)
Conference
20th Annual International Conference on Privacy, Security and Trust, PST, Copenhagen, Denmark, August 21-23, 2023.
Funder
Knowledge Foundation; Knut and Alice Wallenberg Foundation
Available from: 2024-01-03. Created: 2024-01-03. Last updated: 2024-01-03. Bibliographically approved
Matthias, B., Chatzopoulou, A. & Fischer-Hübner, S. (2023). Towards a Light-Weight Certification Scheme for Cybersecurity MOOCs. In: Leslie F. Sikos, Paul Haskell-Dowland (Ed.), Cybersecurity Teaching in Higher Education (pp. 103-125). Springer
2023 (English). In: Cybersecurity Teaching in Higher Education / [ed] Leslie F. Sikos, Paul Haskell-Dowland, Springer, 2023, p. 103-125. Chapter in book (Other academic)
Abstract [en]

Online education, including MOOCs (Massive Open Online Courses), has steadily gained importance during the COVID-19 pandemic. MOOCs also play an important role in enabling lifelong learning and addressing the cybersecurity skills gap. However, it is not always easy for learners or other stakeholders, including organisations interested in cybersecurity MOOCs as a means of competence development for their employees, to judge the quality of MOOCs. This article provides an overview of the research conducted by the EU H2020 projects CyberSec4Europe and CONCORDIA on eliciting quality criteria for different types of cybersecurity MOOCs provided in Europe, on defining a quality branding process that was validated through trial evaluations of selected cybersecurity MOOCs, and on conducting a survey with cybersecurity MOOC stakeholders and interviews with certification experts about the role and form of quality certification. Based on this research, the article concludes by proposing building blocks for a "lightweight" certification scheme for future quality branding of cybersecurity MOOCs.

Place, publisher, year, edition, pages
Springer, 2023
Keywords
Cybersecurity education, MOOCs, Quality evaluation, Certification
National Category
Computer Sciences Software Engineering
Research subject
Computer Science
Identifiers
urn:nbn:se:kau:diva-97460 (URN); 10.1007/978-3-031-24216-8_5 (DOI); 2-s2.0-85173385624 (Scopus ID); 978-3-031-24215-1 (ISBN); 978-3-031-24216-8 (ISBN)
Available from: 2023-11-22. Created: 2023-11-22. Last updated: 2023-11-22. Bibliographically approved
Alaqra, A. S., Fischer-Hübner, S. & Karegar, F. (2023). Transparency of Privacy Risks Using PIA Visualizations. In: Moallem Abbas (Ed.), HCI for Cybersecurity, Privacy and Trust: 5th International Conference, HCI-CPT 2023, Held as Part of the 25th HCI International Conference, HCII 2023, Copenhagen, Denmark, July 23–28, 2023, Proceedings. Paper presented at HCI for Cybersecurity, Privacy and Trust. 5th International Conference, HCI-CPT 2023, Held as Part of the 25th HCI International Conference, HCII 2023, Copenhagen, Denmark, July 23–28, 2023. (pp. 3-17). Cham: Springer
2023 (English). In: HCI for Cybersecurity, Privacy and Trust: 5th International Conference, HCI-CPT 2023, Held as Part of the 25th HCI International Conference, HCII 2023, Copenhagen, Denmark, July 23–28, 2023, Proceedings / [ed] Moallem Abbas, Cham: Springer, 2023, p. 3-17. Conference paper, Published paper (Refereed)
Abstract [en]

Privacy-enhancing technologies allow the minimization of risks to online data. However, the minimization process is often not transparent to all types of end users. A Privacy Impact Assessment (PIA) is a standardized tool that identifies and assesses the privacy risks associated with the use of a system. In this work, we used the results of the PIA conducted in our use case to visualize privacy risks to end users in the form of User Interface (UI) mock-ups. We tested and evaluated the UI mock-ups via walkthroughs to investigate users' interests by observing their clicking behavior, followed by four focus group workshops. There were 13 participants in total (two expert groups and two lay-user groups). Results reveal a general interest in the transparency provided by showing the risk reductions. Although participants appreciated having detailed information about risk reductions and the types of risks, the visualization and usability of the PIA UIs require further development; specifically, they should be tailored to the target group's mental models and background knowledge.

Place, publisher, year, edition, pages
Cham: Springer, 2023
Series
Lecture Notes in Computer Science, ISSN 0302-9743, E-ISSN 1611-3349 ; 14045
Keywords
Privacy Impact Assessment, User Interface, Usability, Transparency, Privacy-Enhancing Technologies
National Category
Computer Sciences Information Systems
Research subject
Information Systems; Computer Science
Identifiers
urn:nbn:se:kau:diva-96543 (URN); 10.1007/978-3-031-35822-7_1 (DOI); 2-s2.0-85171450746 (Scopus ID); 978-3-031-35821-0 (ISBN); 978-3-031-35822-7 (ISBN)
Conference
HCI for Cybersecurity, Privacy and Trust. 5th International Conference, HCI-CPT 2023, Held as Part of the 25th HCI International Conference, HCII 2023, Copenhagen, Denmark, July 23–28, 2023.
Available from: 2023-08-31. Created: 2023-08-31. Last updated: 2023-11-02. Bibliographically approved
Johansen, J., Pedersen, T., Fischer-Hübner, S., Johansen, C., Schneider, G., Roosendaal, A., ... Noll, J. (2022). A multidisciplinary definition of privacy labels. Information and Computer Security, 30(3), 452-469
2022 (English). In: Information and Computer Security, E-ISSN 2056-4961, Vol. 30, no. 3, p. 452-469. Article in journal (Refereed) Published
Abstract [en]

Purpose: This paper aims to present arguments about how a complex concept of privacy labeling can be a solution to the current state of privacy. Design/methodology/approach: The authors give a precise definition of Privacy Labeling (PL), painting a panoptic portrait from seven different perspectives: business, legal, regulatory, usability and human factors, educative, technological and multidisciplinary. They describe a common vision, proposing several important "traits of character" of PL as well as identifying "undeveloped potentialities", i.e. open problems on which the community can focus. Findings: This position paper identifies the stakeholders of PL and their needs with regard to privacy, describing how PL should look and behave to address these needs. The main aspects considered are PL's educational power to change people's knowledge of privacy, tools useful for constructing PL, and the possible visual appearances of PL. The authors also identify how the present landscape of privacy certifications could be improved by PL. Originality/value: The authors adopt a multidisciplinary approach to defining PL and give guidelines in the form of goals, characteristics, open problems, starting points and a roadmap for creating the ideal PL.

Place, publisher, year, edition, pages
Emerald Group Publishing Limited, 2022
National Category
Computer Sciences
Research subject
Computer Science
Identifiers
urn:nbn:se:kau:diva-89655 (URN); 10.1108/ICS-06-2021-0080 (DOI); 000783065400001 (); 2-s2.0-85129317908 (Scopus ID)
Available from: 2022-04-28. Created: 2022-04-28. Last updated: 2022-06-28. Bibliographically approved
Islami, L., Fischer-Hübner, S. & Papadimitratos, P. (2022). Capturing drivers’ privacy preferences for intelligent transportation systems: An intercultural perspective. Computers & security (Print), 123, Article ID 102913.
2022 (English). In: Computers & security (Print), ISSN 0167-4048, E-ISSN 1872-6208, Vol. 123, article id 102913. Article in journal (Refereed) Published
Abstract [en]

While recent research on intelligent transportation systems including vehicular communication systems has focused on technical aspects, little research work has been conducted on drivers’ privacy perceptions and preferences. Understanding the driver’s privacy perceptions and preferences will allow researchers to design usable privacy and identity management systems offering user privacy choices and controls for intelligent transportation systems. We conducted in-depth semi-structured interviews with 17 Swedish drivers to analyse their privacy perceptions and preferences for intelligent transportation systems, particularly for user control and for privacy trade-offs with cost, safety and usability. We also compare our results from the interviews with Swedish drivers with results from interviews that we conducted previously with South African drivers. Our cross-cultural comparison shows that perceived privacy implications, the drivers’ willingness to share location information under certain conditions with other parties, as well as their appreciation of Privacy Enhancing Technologies differ significantly across drivers with different cultural backgrounds. We further discuss the cultural impact on privacy preferences, including those for privacy trade-offs, and the implications of our results for usable privacy-enhancing Identity Management for future vehicular communication systems. In particular, we provide recommendations for suitable pre-defined privacy options to be offered to users with different cultural backgrounds enabling them to easily make privacy-related control choices.

Place, publisher, year, edition, pages
Elsevier, 2022
Keywords
Intelligent transportation, Vehicular communication, Privacy preferences, Privacy perceptions, Intercultural comparison, Privacy-enhancing technologies (PETs)
National Category
Computer and Information Sciences
Research subject
Computer Science
Identifiers
urn:nbn:se:kau:diva-92109 (URN); 10.1016/j.cose.2022.102913 (DOI); 000863330300004 (); 2-s2.0-85138414531 (Scopus ID)
Funder
Swedish Foundation for Strategic Research
Available from: 2022-10-04. Created: 2022-10-04. Last updated: 2022-10-27. Bibliographically approved
Karegar, F., Alaqra, A. S. & Fischer-Hübner, S. (2022). Exploring User-Suitable Metaphors for Differentially Private Data Analyses. In: Proceedings of the 18th Symposium on Usable Privacy and Security, SOUPS 2022. Paper presented at 18th Symposium on Usable Privacy and Security (SOUPS), Boston, United States, August 7–9, 2022 (pp. 175-193).
2022 (English). In: Proceedings of the 18th Symposium on Usable Privacy and Security, SOUPS 2022, 2022, p. 175-193. Conference paper, Published paper (Refereed)
Abstract [en]

Despite recent enhancements in the deployment of differential privacy (DP), little has been done to address the human aspects of DP-enabled systems. Comprehending the complex concept of DP and the privacy protection it provides can be challenging for lay users, who must make informed decisions when sharing their data. Metaphors may be suitable for conveying the key protection functionalities of DP to them. Based on a three-phase framework, we extracted and generated metaphors for differentially private data analysis models (local and central). We analytically evaluated the metaphors based on experts' feedback and then empirically evaluated them in online interviews with 30 participants. Our results show that the metaphorical explanations can successfully convey that perturbation protects privacy and that there is a privacy-accuracy trade-off. Nonetheless, conveying information at a high level leads to incorrect expectations that negatively affect users' understanding and limits their ability to apply the concept to different contexts. We present the plausible suitability of metaphors and discuss the challenges of using them to facilitate informed decisions on sharing data with DP-enabled systems.
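The central-model perturbation and privacy-accuracy trade-off that the metaphors convey can be sketched with the standard Laplace mechanism (an illustrative sketch under the usual DP definitions, not code from the paper; the function names are invented): a counting query has sensitivity 1, and the noise scale sensitivity/ε means a smaller ε gives stronger privacy but noisier answers.

```python
import math
import random

def laplace_scale(sensitivity: float, epsilon: float) -> float:
    """Noise scale b for the Laplace mechanism: b = sensitivity / epsilon."""
    return sensitivity / epsilon

def dp_count(values, epsilon: float, rng: random.Random) -> float:
    """Differentially private count (central model).

    A counting query has sensitivity 1: adding or removing one person's
    record changes the true count by at most 1, so Laplace(0, 1/epsilon)
    noise suffices for epsilon-DP.
    """
    b = laplace_scale(1.0, epsilon)
    # Inverse-CDF sampling of Laplace(0, b) from a uniform u in (-0.5, 0.5).
    u = rng.random() - 0.5
    noise = -b * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))
    return len(values) + noise
```

Halving ε doubles the noise scale, which is the privacy-accuracy trade-off the study's metaphors tried to convey; in the local model, the same perturbation would instead be applied to each individual's record before it ever reaches the analyst.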

Keywords
Data analysis models; Differential privacy; Expert feedback; Human aspects; Informed decision; Key protections; Privacy protection; Private data analysis; Three-phase framework; Conveying
National Category
Computer Sciences
Research subject
Computer Science; Information Systems
Identifiers
urn:nbn:se:kau:diva-92509 (URN); 2-s2.0-85140926705 (Scopus ID); 9781939133304 (ISBN)
Conference
18th Symposium on Usable Privacy and Security (SOUPS), Boston, United States, August 7–9, 2022.
Available from: 2022-11-16. Created: 2022-11-16. Last updated: 2022-11-24. Bibliographically approved