Fischer-Hübner, Simone (ORCID: orcid.org/0000-0002-6938-4466)
Publications (10 of 203)
Morel, V. & Fischer-Hübner, S. (2023). Automating privacy decisions – where to draw the line? In: Proceedings – 8th IEEE European Symposium on Security and Privacy Workshops. Paper presented at 2023 IEEE European Symposium on Security and Privacy Workshops (EuroS&PW), Delft, Netherlands, July 3-7, 2023 (pp. 108-116). Institute of Electrical and Electronics Engineers (IEEE)
2023 (English). In: Proceedings – 8th IEEE European Symposium on Security and Privacy Workshops, Institute of Electrical and Electronics Engineers (IEEE), 2023, p. 108-116. Conference paper, Published paper (Refereed)
Abstract [en]

Users are often overwhelmed by privacy decisions for managing their personal data, whether on the web, on mobile, or in IoT environments. These decisions take various forms – setting privacy permissions or privacy preferences, responding to consent requests, or intervening to ’reject’ the processing of one’s personal data – and each can have different legal impacts. For all types of decisions, scholars and industry have proposed tools that automate privacy decisions at different levels in order to enhance usability. In this paper, we provide an overview of the main challenges raised by the automation of privacy decisions, together with a classification scheme for existing and envisioned work and proposals addressing such automation.

Place, publisher, year, edition, pages
Institute of Electrical and Electronics Engineers (IEEE), 2023
Keywords
Data privacy, Classification scheme, Consent, GDPR, Permission, Privacy decision, Privacy preferences, Automation
National Category
Computer Sciences
Research subject
Computer Science
Identifiers
urn:nbn:se:kau:diva-96506 (URN); 10.1109/EuroSPW59978.2023.00017 (DOI); 2-s2.0-85168247258 (Scopus ID)
Conference
2023 IEEE European Symposium on Security and Privacy Workshops (EuroS&PW), Delft, Netherlands, July 3-7, 2023.
Funder
Knut and Alice Wallenberg Foundation
Available from: 2023-08-29. Created: 2023-08-29. Last updated: 2023-08-29. Bibliographically approved.
Johansen, J. & Fischer-Hübner, S. (2023). Expert Opinions as a Method of Validating Ideas: Applied to Making GDPR Usable. In: Nina Gerber, Alina Stöver, Karola Marky (Eds.), Human Factors in Privacy Research (pp. 137-152). Springer
2023 (English). In: Human Factors in Privacy Research / [ed] Nina Gerber, Alina Stöver, Karola Marky, Springer, 2023, p. 137-152. Chapter in book (Refereed)
Abstract [en]

This chapter presents two contributions: the first is a method, and the second is the results of applying this method to usable privacy. First, we introduce a general method for validating ideas based on expert opinions. We adapt techniques that are normally used for validating data and apply them instead to analyze the expert opinions on the ideas under study. Since expert opinions are usually varied, example-rich, and forward-looking, applying our method of idea validation has the side effect of also identifying open problems for which the original ideas function as a foundation for further developments. Second, we employ critical qualitative research, using theory triangulation to analyze the opinions of three groups of experts, categorized as “certifications,” “law,” and “usability.” These experts took part in a study in which we thoroughly applied the method presented here in order to validate five different types of ideas previously published under the collective title “Making GDPR Usable.” We thus show how to validate ideas that come in the form of a model, a definition, a prescriptive list, a set of criteria, and the kind of rather general research idea that usually appears in position papers, namely “the need for evaluations and measuring of usability of privacy.”

Place, publisher, year, edition, pages
Springer, 2023
Keywords
Usable privacy, Expert opinions, Validation, GDPR, Certification, Qualitative research
National Category
Computer Sciences
Research subject
Computer Science
Identifiers
urn:nbn:se:kau:diva-98989 (URN); 10.1007/978-3-031-28643-8_7 (DOI); 978-3-031-28642-1 (ISBN); 978-3-031-28643-8 (ISBN)
Available from: 2024-03-22. Created: 2024-03-22. Last updated: 2024-03-22. Bibliographically approved.
De Cock, M., Erkin, Z., Fischer-Hübner, S., Jensen, M., Klakow, D. & Teixeira, F. (2023). Privacy enhancing technologies. In: Simone Fischer-Hübner; Dietrich Klakow; Peggy Valcke; Emmanuel Vincent (Eds.), Privacy in Speech and Language Technology: Report from Dagstuhl Seminar 22342 (pp. 90-99). Schloss Dagstuhl, Leibniz-Zentrum für Informatik
2023 (English). In: Privacy in Speech and Language Technology: Report from Dagstuhl Seminar 22342 / [ed] Simone Fischer-Hübner; Dietrich Klakow; Peggy Valcke; Emmanuel Vincent, Schloss Dagstuhl, Leibniz-Zentrum für Informatik, 2023, p. 90-99. Chapter in book (Refereed)
Abstract [en]

Privacy-enhancing technologies (PETs) provide technical building blocks for achieving privacy by design. They can be defined as technologies that embody fundamental data protection goals [13], including the goals of unlinkability, intervenability, and transparency as well as the classical CIA (confidentiality, integrity, availability) security goals, by minimizing personal data collection and use, maximizing data security, and empowering individuals.

The privacy-by-design principle of a positive sum should enable users to benefit from the rich functions of speech and language technologies while protecting their privacy at the same time. The fundamental question is how to achieve privacy by design for speech and language technology without hampering the services. Different PETs exist that can be utilized for this purpose. Below, we first discuss what types of personal data are accessible via speech and text and should be the target of protection by PETs. We then provide an overview of PETs that can provide such protection and discuss their limitations and the challenges that arise when they are used for speech and language technologies.

Place, publisher, year, edition, pages
Schloss Dagstuhl, Leibniz-Zentrum für Informatik, 2023
National Category
Computer Sciences
Research subject
Computer Science
Identifiers
urn:nbn:se:kau:diva-94132 (URN); 10.4230/DagRep.12.8.60 (DOI)
Available from: 2023-04-03. Created: 2023-04-03. Last updated: 2023-04-03. Bibliographically approved.
Fischer-Hübner, S., Klakow, D., Valcke, P. & Vincent, E. (Eds.). (2023). Privacy in Speech and Language Technology – Dagstuhl Reports. Germany: Dagstuhl Publishing
2023 (English). Collection (editor) (Other academic)
Abstract [en]

This report documents the outcomes of Dagstuhl Seminar 22342 “Privacy in Speech and Language Technology”. The seminar brought together 27 attendees from 9 countries (Australia, Belgium, France, Germany, the Netherlands, Norway, Portugal, Sweden, and the USA) and 6 distinct disciplines (Speech Processing, Natural Language Processing, Privacy Enhancing Technologies, Machine Learning, Human Factors, and Law) in order to achieve a common understanding of the privacy threats raised by speech and language technology, as well as the existing solutions and the remaining issues in each discipline, and to draft an interdisciplinary roadmap towards solving those issues in the short or medium term.

To achieve these goals, the first day and the morning of the second day were devoted to 3-minute self-introductions by all participants, intertwined with 6 tutorials to introduce the terminology, the problems faced, and the solutions brought in each of the 6 disciplines. We also made a list of use cases and identified 6 cross-disciplinary topics to be discussed. The remaining days involved working groups to discuss these 6 topics, collaborative writing sessions to report on the findings of the working groups, and wrap-up sessions to discuss these findings with each other. A hike was organized in the afternoon of the third day.

The seminar was a success: all participants actively participated in the working groups and the discussions, and went home with new ideas and new collaborators. This report gathers the abstracts of the 6 tutorials and the reports of the working groups, which we consider valuable contributions towards a full-fledged roadmap.

Place, publisher, year, edition, pages
Germany: Dagstuhl Publishing, 2023. p. 42
Keywords
Privacy, Speech and Language Technology, Privacy Enhancing Technologies, Dagstuhl Seminar
National Category
Computer and Information Sciences
Research subject
Computer Science
Identifiers
urn:nbn:se:kau:diva-98990 (URN); 10.4230/DagRep.12.8.60 (DOI)
Note

Report from Dagstuhl Seminar 22342

Seminar August 21–26, 2022 – https://www.dagstuhl.de/22342

Available from: 2024-03-22. Created: 2024-03-22. Last updated: 2024-03-22.
Fischer-Hübner, S., Hansen, M., Hoepman, J.-H. & Jensen, M. (2023). Privacy-Enhancing Technologies and Anonymisation in Light of GDPR and Machine Learning. In: Felix Bieker, Joachim Meyer, Sebastian Pape, Ina Schiering, Andreas Weich (Eds.), Privacy and Identity Management. Paper presented at IFIP International Summer School on Privacy and Identity Management, [Digital], August 30 – September 2, 2022 (pp. 11-20). Springer, Vol. 671 IFIP
2023 (English). In: Privacy and Identity Management / [ed] Felix Bieker, Joachim Meyer, Sebastian Pape, Ina Schiering, Andreas Weich, Springer, 2023, Vol. 671 IFIP, p. 11-20. Conference paper, Published paper (Refereed)
Abstract [en]

The use of privacy-enhancing technologies in the field of data anonymisation and pseudonymisation raises many questions with respect to legal compliance under the GDPR and current international data protection legislation. In particular, the use of innovative technologies based on machine learning may increase or decrease risks to data protection. A workshop held at the IFIP Summer School on Privacy and Identity Management showed the complexity of this field and the need for further interdisciplinary research on the basis of an improved joint understanding of legal and technical concepts.

Place, publisher, year, edition, pages
Springer, 2023
Series
IFIP Advances in Information and Communication Technology, ISSN 1868-4238, E-ISSN 1868-422X
Keywords
Machine learning, Anonymization, Data anonymization, Innovative technology, Legal compliance, Machine-learning, On-machines, Privacy enhancing technologies, Summer school, Technology-based, Data privacy
National Category
Computer Sciences
Research subject
Computer Science
Identifiers
urn:nbn:se:kau:diva-97456 (URN); 10.1007/978-3-031-31971-6_2 (DOI); 2-s2.0-85173556437 (Scopus ID)
Conference
IFIP International Summer School on Privacy and Identity Management, [Digital], August 30 – September 2, 2022.
Available from: 2023-11-22. Created: 2023-11-22. Last updated: 2023-11-22. Bibliographically approved.
Alaqra, A. S., Karegar, F. & Fischer-Hübner, S. (2023). Structural and functional explanations for informing lay and expert users: the case of functional encryption. Paper presented at the 23rd Privacy Enhancing Technologies Symposium (PETS 2023), July 10–15, 2023, Lausanne, Switzerland and online. Proceedings on Privacy Enhancing Technologies, 2023(4), 359-380
2023 (English). In: Proceedings on Privacy Enhancing Technologies, E-ISSN 2299-0984, Vol. 2023, no 4, p. 359-380. Article in journal (Refereed) Published
Abstract [en]

Usable explanations of privacy-enhancing technologies (PETs) help users make more informed privacy decisions, but explanations of PETs are generally geared toward individuals with more technical knowledge. To explain functional encryption (FE) to experts and laypersons, we investigate structural and functional explanations and explore users' interests and preferences, as well as how these affect users' comprehension and decisions about sharing data. To this end, with an EU-based population, we first conducted four focus groups, in combination with walk-throughs, with 13 participants, followed by an online survey with 347 experts and 370 laypersons. Both explanations were considered useful in fulfilling the different needs of participants interested in the privacy policy information. Participants, regardless of their expertise, trusted and were more satisfied with the structural explanation. However, the functional explanation contributed more to all participants' comprehension. We therefore recommend combining both types of explanations for a usable privacy policy.

Place, publisher, year, edition, pages
Privacy Enhancing Technologies Board, 2023
Keywords
functional encryption, functional & structural explanation, transparency, privacy, usability, user comprehension, mental models
National Category
Computer Sciences; Information Systems
Research subject
Information Systems; Computer Science
Identifiers
urn:nbn:se:kau:diva-96542 (URN); 10.56553/popets-2023-0115 (DOI)
Conference
23rd Privacy Enhancing Technologies Symposium (PETS 2023), July 10–15, 2023. Lausanne, Switzerland and Online.
Available from: 2023-08-31. Created: 2023-08-31. Last updated: 2023-09-04. Bibliographically approved.
Romare, P., Morel, V., Karegar, F. & Fischer-Hübner, S. (2023). Tapping into Privacy: A Study of User Preferences and Concerns on Trigger-Action Platforms. In: 20th Annual International Conference on Privacy, Security and Trust (PST). Paper presented at the 20th Annual International Conference on Privacy, Security and Trust, PST, Copenhagen, Denmark, August 21-23, 2023 (pp. 1-12). Institute of Electrical and Electronics Engineers (IEEE)
2023 (English). In: 20th Annual International Conference on Privacy, Security and Trust (PST), Institute of Electrical and Electronics Engineers (IEEE), 2023, p. 1-12. Conference paper, Published paper (Refereed)
Abstract [en]

Internet of Things (IoT) devices are rapidly growing in popularity, with more individuals using Internet-connected devices that continuously monitor their activities. This work explores end-users' privacy concerns and expectations related to Trigger-Action Platforms (TAPs) in the IoT context. TAPs allow users to customize their smart environments by creating rules that trigger actions based on specific events or conditions. As personal data flows between different entities, privacy concerns can arise. In this study, we aimed to identify the privacy factors that impact users' concerns and preferences regarding IoT TAPs. To address this research objective, we conducted three focus groups with 15 participants and extracted nine themes related to privacy factors using thematic analysis. Our participants particularly prefer to have control and transparency over the automation, and they are concerned about unexpected data inferences, risks, and unforeseen consequences for themselves and for bystanders caused by the automation. The identified privacy factors can help researchers derive predefined, selectable profiles of privacy permission settings for IoT TAPs that represent the privacy preferences of different types of users, as a basis for designing usable privacy controls for IoT TAPs.

Place, publisher, year, edition, pages
Institute of Electrical and Electronics Engineers (IEEE), 2023
Keywords
User profile; Condition; Dataflow; End-users; Focus groups; Privacy; Privacy concerns; Privacy preferences; Smart environment; Trigger-action platform; User’s preferences; Internet of things
National Category
Computer Sciences
Research subject
Computer Science
Identifiers
urn:nbn:se:kau:diva-97902 (URN); 10.1109/PST58708.2023.10320180 (DOI); 2-s2.0-85179547300 (Scopus ID); 979-8-3503-1387-1 (ISBN); 979-8-3503-1388-8 (ISBN)
Conference
20th Annual International Conference on Privacy, Security and Trust, PST, Copenhagen, Denmark, August 21-23, 2023.
Funder
Knowledge Foundation; Knut and Alice Wallenberg Foundation
Available from: 2024-01-03. Created: 2024-01-03. Last updated: 2024-01-03. Bibliographically approved.
Beckerle, M., Chatzopoulou, A. & Fischer-Hübner, S. (2023). Towards a Light-Weight Certification Scheme for Cybersecurity MOOCs. In: Leslie F. Sikos, Paul Haskell-Dowland (Eds.), Cybersecurity Teaching in Higher Education (pp. 103-125). Springer
2023 (English). In: Cybersecurity Teaching in Higher Education / [ed] Leslie F. Sikos, Paul Haskell-Dowland, Springer, 2023, p. 103-125. Chapter in book (Other academic)
Abstract [en]

Online education, including MOOCs (Massive Open Online Courses), has steadily gained importance during the COVID-19 pandemic. MOOCs also play an important role in enabling lifelong learning and addressing the cybersecurity skills gap. However, it is not always easy for learners or other stakeholders – including organisations interested in cybersecurity MOOCs as a means of competence development for their employees – to judge the quality of MOOCs. This article provides an overview of the research conducted by the EU H2020 projects CyberSec4Europe and CONCORDIA on eliciting quality criteria for different types of cybersecurity MOOCs provided in Europe, on defining a quality-branding process that was validated through trial evaluations of selected cybersecurity MOOCs, and on conducting a survey with cybersecurity MOOC stakeholders as well as interviews with certification experts about the role and form of quality certification. Based on this research, the article concludes by proposing building blocks for a “lightweight” certification scheme for future quality branding of cybersecurity MOOCs.

Place, publisher, year, edition, pages
Springer, 2023
Keywords
Cybersecurity education, MOOCs, Quality evaluation, Certification
National Category
Computer Sciences; Software Engineering
Research subject
Computer Science
Identifiers
urn:nbn:se:kau:diva-97460 (URN); 10.1007/978-3-031-24216-8_5 (DOI); 2-s2.0-85173385624 (Scopus ID); 978-3-031-24215-1 (ISBN); 978-3-031-24216-8 (ISBN)
Available from: 2023-11-22. Created: 2023-11-22. Last updated: 2023-11-22. Bibliographically approved.
Alaqra, A. S., Fischer-Hübner, S. & Karegar, F. (2023). Transparency of Privacy Risks Using PIA Visualizations. In: Moallem Abbas (Ed.), HCI for Cybersecurity, Privacy and Trust: 5th International Conference, HCI-CPT 2023, Held as Part of the 25th HCI International Conference, HCII 2023, Copenhagen, Denmark, July 23–28, 2023, Proceedings. Paper presented at HCI for Cybersecurity, Privacy and Trust. 5th International Conference, HCI-CPT 2023, Held as Part of the 25th HCI International Conference, HCII 2023, Copenhagen, Denmark, July 23–28, 2023. (pp. 3-17). Cham: Springer
2023 (English). In: HCI for Cybersecurity, Privacy and Trust: 5th International Conference, HCI-CPT 2023, Held as Part of the 25th HCI International Conference, HCII 2023, Copenhagen, Denmark, July 23–28, 2023, Proceedings / [ed] Moallem Abbas, Cham: Springer, 2023, p. 3-17. Conference paper, Published paper (Refereed)
Abstract [en]

Privacy-enhancing technologies allow the minimization of risks to online data. However, the transparency of the minimization process is not clear to all types of end users. A Privacy Impact Assessment (PIA) is a standardized tool that identifies and assesses the privacy risks associated with the use of a system. In this work, we used the results of the PIA conducted in our use case to visualize privacy risks to end users in the form of User Interface (UI) mock-ups. We tested and evaluated the UI mock-ups via walkthroughs to investigate users' interests by observing their clicking behavior, followed by four focus group workshops with 13 participants in total (two expert groups and two lay-user groups). Results reveal a general interest in the transparency provided by showing the risk reductions. Although participants appreciate the concept of having detailed information about risk reductions and the types of risks, the visualization and usability of the PIA UIs require further development. Specifically, they should be tailored to the target group's mental models and background knowledge.

Place, publisher, year, edition, pages
Cham: Springer, 2023
Series
Lecture Notes in Computer Science, ISSN 0302-9743, E-ISSN 1611-3349 ; 14045
Keywords
Privacy Impact Assessment, User Interface, Usability, Transparency, Privacy-Enhancing Technologies
National Category
Computer Sciences; Information Systems
Research subject
Information Systems; Computer Science
Identifiers
urn:nbn:se:kau:diva-96543 (URN); 10.1007/978-3-031-35822-7_1 (DOI); 2-s2.0-85171450746 (Scopus ID); 978-3-031-35821-0 (ISBN); 978-3-031-35822-7 (ISBN)
Conference
HCI for Cybersecurity, Privacy and Trust. 5th International Conference, HCI-CPT 2023, Held as Part of the 25th HCI International Conference, HCII 2023, Copenhagen, Denmark, July 23–28, 2023.
Available from: 2023-08-31. Created: 2023-08-31. Last updated: 2023-11-02. Bibliographically approved.
Johansen, J., Pedersen, T., Fischer-Hübner, S., Johansen, C., Schneider, G., Roosendaal, A., . . . Noll, J. (2022). A multidisciplinary definition of privacy labels. Information and Computer Security, 30(3), 452-469
2022 (English). In: Information and Computer Security, E-ISSN 2056-4961, Vol. 30, no 3, p. 452-469. Article in journal (Refereed) Published
Abstract [en]

Purpose – This paper aims to present arguments about how a complex concept of privacy labeling can be a solution to the current state of privacy.

Design/methodology/approach – The authors give a precise definition of Privacy Labeling (PL), painting a panoptic portrait from seven different perspectives: Business, Legal, Regulatory, Usability and Human Factors, Educative, Technological and Multidisciplinary. They describe a common vision, proposing several important "traits of character" of PL as well as identifying "undeveloped potentialities", i.e. open problems on which the community can focus.

Findings – This position paper identifies the stakeholders of PL and their needs with regard to privacy, describing what PL should be and look like to address these needs. The main aspects considered are PL's educational power to change people's knowledge of privacy, tools useful for constructing PL, and the possible visual appearances of PL. The authors also identify how the present landscape of privacy certifications could be improved by PL.

Originality/value – The authors adopt a multidisciplinary approach to defining PL and give guidelines in the form of goals, characteristics, open problems, starting points and a roadmap for creating the ideal PL.

Place, publisher, year, edition, pages
Emerald Group Publishing Limited, 2022
National Category
Computer Sciences
Research subject
Computer Science
Identifiers
urn:nbn:se:kau:diva-89655 (URN); 10.1108/ICS-06-2021-0080 (DOI); 000783065400001 (); 2-s2.0-85129317908 (Scopus ID)
Available from: 2022-04-28. Created: 2022-04-28. Last updated: 2022-06-28. Bibliographically approved.