A mobile ad hoc network is a collection of wireless nodes that dynamically organize themselves to form a network without the need for any fixed infrastructure or centralized administration. The network topology changes frequently and unpredictably, since the nodes are free to move. Support for multicasting is essential in such an environment, as it is considered an efficient way to deliver information from source nodes to many client nodes. A problem with multicast routing algorithms is their efficiency: their forwarding structure determines the overall network resource consumption and makes them significantly less efficient than unicast routing algorithms. In this research, we improve the performance of the popular ODMRP multicast routing protocol by restricting the flooding domain of join query packets that have been lost. This is achieved by augmenting the join query packets with a minimum of extra information (one field), which records the number of nodes visited since the previous forwarding group. Simulation results show that our mechanism significantly reduces the control traffic and thus the overall latency and power consumption in the network.
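To make the mechanism concrete, the sketch below shows how a join query could carry the single extra counter and how a node might use it to limit rebroadcasting. The field names, the reset-at-forwarding-group behaviour, and the threshold of three hops are illustrative assumptions, not ODMRP's actual packet format or the values used in the paper.

```python
# Minimal sketch of a join query augmented with one counter field.
# Names (visited_since_fg, MAX_VISITED) and the threshold are assumptions
# made for illustration only.
from dataclasses import dataclass
from typing import Optional

@dataclass
class JoinQuery:
    source: str
    sequence_no: int
    visited_since_fg: int = 0   # nodes visited since the previous forwarding group

MAX_VISITED = 3                 # assumed limit on how far a query may travel

def handle_join_query(pkt: JoinQuery, is_forwarding_group_member: bool) -> Optional[JoinQuery]:
    """Return the packet to rebroadcast, or None to drop it at this node."""
    if is_forwarding_group_member:
        # A forwarding-group member resets the counter before rebroadcasting.
        return JoinQuery(pkt.source, pkt.sequence_no, 0)
    if pkt.visited_since_fg + 1 > MAX_VISITED:
        # Restrict the flooding domain: stop queries that have strayed too far
        # from the last forwarding-group node.
        return None
    return JoinQuery(pkt.source, pkt.sequence_no, pkt.visited_since_fg + 1)
```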
Multipath TCP (MPTCP) is a proposed extension to TCP that enables a number of performance advantages that have not been offered before. While the protocol specification is close to being finalized, there remain some unaddressed challenges regarding the deployment and security implications of the protocol. This work attempts to tackle some of these concerns by proposing and implementing MPTCP-aware security services and deploying them inside a proof-of-concept MPTCP proxy. The aim is to enable hosts, even those without native MPTCP support, to securely benefit from the MPTCP performance advantages. Our evaluations show that the implemented security services enable proper intrusion detection and prevention to thwart potential attacks, as well as threshold rules to prevent denial-of-service (DoS) attacks.
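As a rough illustration of what a threshold rule in such a proxy might look like, the sketch below rate-limits new subflow requests per source host over a sliding window. The window length, limit, and function names are assumptions for illustration and do not reflect the services actually implemented in this work.

```python
# Illustrative threshold rule: limit how many new MPTCP subflow requests a
# single host may open per time window. Window size and limit are assumed values.
import time
from collections import defaultdict, deque

WINDOW_SECONDS = 10
MAX_SUBFLOWS_PER_WINDOW = 20

_recent = defaultdict(deque)   # source IP -> timestamps of recent subflow requests

def allow_subflow(src_ip: str, now: float = None) -> bool:
    """Return True if a new subflow from src_ip stays under the threshold."""
    now = time.time() if now is None else now
    q = _recent[src_ip]
    while q and now - q[0] > WINDOW_SECONDS:
        q.popleft()            # discard requests that fell outside the window
    if len(q) >= MAX_SUBFLOWS_PER_WINDOW:
        return False           # threshold exceeded: treat as a potential DoS attempt
    q.append(now)
    return True
```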
With the advance of fifth generation (5G) networks, network density needs to grow significantly in order to meet the required capacity demands. A massive deployment of small cells may lead to a high cost for providing fiber connectivity to each node. Consequently, many small cells are expected to be connected through wireless links to the umbrella eNodeB, leading to a mesh backhaul topology. This backhaul solution will most probably be composed of high-capacity point-to-point links, typically operating in the millimeter wave (mmWave) frequency band due to its massive bandwidth availability. In this paper, we propose a mathematical model that jointly solves the user association and backhaul routing problem in the aforementioned context, aiming at maximizing the energy efficiency of the network. Our study considers the energy consumption of both the access and backhaul links, while taking into account the capacity constraints of all the nodes as well as the fulfillment of the service-level agreements (SLAs). Due to the high complexity of the optimal solution, we also propose an energy-efficient heuristic algorithm (Joint), which solves the discussed joint problem while inducing low complexity in the system. We numerically evaluate the algorithm's performance by comparing it not only with the optimal solution but also with reference approaches under different traffic load scenarios and backhaul parameters. Our results demonstrate that Joint outperforms the state of the art, while being able to find good solutions, close to the optimal, in a short time.
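A strongly simplified formulation of the kind of joint problem described above is sketched below; the symbols (association variables x_{ub}, user rates r_{ub}, backhaul flows f_{ij}, capacities C, SLA rates R^min) and the exact constraint set are illustrative assumptions, not the paper's model.

```latex
% Simplified sketch of a joint user-association / backhaul-routing
% energy-efficiency problem; symbols and constraints are illustrative only.
\begin{align*}
  \max_{x,\,f}\quad
    & \frac{\sum_{u,b} x_{ub}\, r_{ub}}
           {P_{\mathrm{access}}(x) + P_{\mathrm{backhaul}}(f)}
    && \text{(bits per Joule)} \\
  \text{s.t.}\quad
    & \sum_{b} x_{ub} = 1,\quad x_{ub} \in \{0,1\}
    && \forall u \quad \text{(single association)} \\
    & \sum_{u} x_{ub}\, r_{ub} \le C_b
    && \forall b \quad \text{(access capacity)} \\
    & f_{ij} \le C_{ij}
    && \forall (i,j) \quad \text{(backhaul link capacity)} \\
    & \sum_{i} f_{ib} - \sum_{j} f_{bj} = \sum_{u} x_{ub}\, r_{ub}
    && \forall b \quad \text{(downlink flow conservation)} \\
    & x_{ub}\, r_{ub} \ge x_{ub}\, R^{\min}_{u}
    && \forall u,b \quad \text{(SLA rate guarantee)}
\end{align*}
```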
This deliverable provides a description of the framework identified for the collaborative activities between the different partners in the context of NEWCOM department 7 on QoS provision in heterogeneous wireless networks. The considered models, assumptions and expected results are pointed out for each activity. The deliverable also includes a report on the means to achieve integration between the different partners.
This deliverable constitutes the final report of all the activities that were carried out in the framework of NEWCOM department 7. It contains a description of the main technical achievements of each of the activities in which the department has been organised, together with a list of indicators reflecting the degree of integration that has been achieved among the different partners.
Information-centric networking (ICN) has been introduced as a potential future networking architecture. ICN promises an architecture that makes information independent of location, application, storage, and transportation. Still, it is not without challenges. Notably, there are several outstanding issues regarding congestion control: since ICN is more or less oblivious to the location of information, it allows a single application flow to have several sources, which blurs the notion of transport flows and makes it very difficult to employ traditional end-to-end congestion control schemes in these networks. Instead, ICN networks often make use of hop-by-hop congestion control schemes. However, these schemes also have problems; for example, several of the proposed ICN congestion controls assume fixed link capacities that are known beforehand. Since this is seldom the case, this paper evaluates the consequences, in terms of latency, throughput, and link usage, that variable link capacities have on a hop-by-hop congestion control scheme such as the one employed by the Multipath-aware ICN Rate-based Congestion Control (MIRCC). The evaluation was carried out in the OMNeT++ simulator and demonstrates how seemingly small variations in link capacity significantly deteriorate both latency and throughput, and often result in inefficient network link usage.
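For intuition, the toy calculation below shows a hop sharing an assumed, fixed link capacity among the flows traversing it, which is exactly the kind of assumption the evaluation questions. It is an illustrative stand-in, not MIRCC's actual rate computation.

```python
# Illustrative sketch (not MIRCC's algorithm) of a hop-by-hop, rate-based
# scheme that divides a configured, fixed link capacity among active flows.
# If the real capacity varies while the configured value stays fixed, the
# advertised rates over- or under-shoot, which is the effect evaluated above.
def advertised_rate(configured_capacity_mbps: float,
                    active_flows: int,
                    headroom: float = 0.95) -> float:
    """Per-flow rate a hop advertises, given an assumed fixed capacity."""
    if active_flows == 0:
        return configured_capacity_mbps * headroom
    return configured_capacity_mbps * headroom / active_flows

# Example: the hop believes the link offers 100 Mbit/s shared by 4 flows and
# advertises 23.75 Mbit/s per flow; if the true capacity has dropped to
# 60 Mbit/s, only about 15 Mbit/s per flow is actually sustainable.
print(advertised_rate(100.0, 4))   # 23.75
```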
Future network environments will be heterogeneous, and mobile terminals will have the opportunity to dynamically select among many different access technologies. Therefore, it is important to provide service continuity in case of vertical handovers, when terminals change the access technology. Two important wireless access technologies are WLANs (Wireless Local Area Networks) and WMANs (Wireless Metropolitan Area Networks). In this paper, we address several challenges related to a seamless integration of these technologies. We highlight important aspects for designing a WLAN/WMAN interworking architecture, and we address important Quality of Service (QoS) issues for such interworked systems, such as the degree of QoS support provided by the technologies, QoS mapping, and signalling for vertical handover. By formulating several interworking scenarios, where WLAN users with ongoing voice, video and data sessions hand over to WMAN, we study QoS and performance issues and analyse the feasibility of seamless session continuity through simulations.
Research at universities is today often conducted in the form of projects. This is especially true in the engineering, natural science, medicine, and social science disciplines. Research projects are typically carried out by different categories of employees, such as professors, associate professors, assistant professors, and PhD students. These projects are typically managed by the person who applied for the project funding, or the person who is the most experienced researcher at the department, which is often a professor or associate professor. From such leading persons, miracles are expected. Besides acting as project managers, they are also engaged in many other parallel activities, e.g., supervision of PhD students, undergraduate and graduate education, conference organization and administration, project application writing, and representing the department internally as well as externally. In this report, a survey of project management competence within research projects at Karlstad University is presented. Empirical data have been gathered through two questionnaires and six complementary interviews. Professors and associate professors as well as PhD students have participated in the study. The survey shows that the active project managers have learned to lead projects based on experience, and very few project managers have a formal leadership education. This implies that long-established project management methods and tools are seldom used. Based on the outcome of the survey and our own observations, four concrete activities to improve project management skills are proposed in the report. The first activity is to provide a suitable and well-balanced course in project management methods that is offered to both active and future project managers. The second activity is to establish experience networks among active project managers at Karlstad University. The third activity is to create a mentor program for new project managers. The fourth activity is to establish a group of experienced project managers that can assist in and give support to ongoing and planned projects.
The mobile Internet is a fast-growing technology that introduces new privacy risks. We argue that, since privacy legislation alone is not sufficient to protect the users' privacy, technical solutions to enhance the informational privacy of individuals are also needed. This paper introduces mCrowds, a privacy-enhancing technology that combines the concept of a crowd system in a mobile Internet setting with a filtering functionality to enable anonymity towards the content providers.
Properties and behavior of cellular automata are considered. Cellular automata can simply be described as lattices of cells, where the cells can be in a finite number of states. By using simple rules, the states of the cells are updated at discrete time steps. The evolution of cellular automata can be used for computations. Some cellular automata display universality, meaning that there is no limit to the sophistication of the computations they can perform.
In this paper, properties and behavior of cellular automata are considered. Cellular automata can simply be described as lattices of cells, where the cells can be in a finite number of states. By using simple rules, the states of the cells are updated in parallel at discrete time steps. Depending on the rule and, to a certain degree, the initial states of the cells, the evolution of a cellular automaton is restricted to a small number of ways. Some cellular automata evolve uniformly, meaning that all cells end up in the same state, while others evolve randomly, meaning that the states of the cells appear to be totally randomized during evolution. Intermediate behavior, displaying repetitiveness or nesting, also occurs. Properties of cellular automata that are discussed in the paper are, for instance, sensitivity to initial conditions, randomness, reversibility, entropy, and conservation. These properties also appear in the physical world, and cellular automata provide good examples for the understanding of these properties. The evolution of cellular automata can be used for computations. Some cellular automata even display the property of universality, a term well known from the universal Turing machine, meaning that there is no limit to the sophistication of the computations they can perform.
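The parallel update of simple local rules described above can be illustrated with an elementary one-dimensional, two-state automaton. Rule 30 is chosen here only as an example of a rule whose evolution appears random; the lattice size and initial condition are arbitrary.

```python
# Minimal sketch of the synchronous (parallel) update scheme of an elementary
# cellular automaton, using Wolfram's rule numbering and periodic boundaries.
def step(cells, rule=30):
    """Apply the rule to all cells simultaneously and return the new lattice."""
    n = len(cells)
    new = []
    for i in range(n):
        left, center, right = cells[(i - 1) % n], cells[i], cells[(i + 1) % n]
        neighborhood = (left << 2) | (center << 1) | right   # value 0..7
        new.append((rule >> neighborhood) & 1)               # look up the rule bit
    return new

cells = [0] * 31
cells[15] = 1                       # single live cell as initial condition
for _ in range(15):
    print("".join("#" if c else "." for c in cells))
    cells = step(cells)
```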
Cellular automata have a widespread use in the description of complex phenomena in disciplines as disparate as, for example, physics and economics. They are described by a lattice of cells, the states of the cells, and rules for updating the states of the cells. One characteristic of a cellular automaton is the simplicity of the rules that determine how the cellular automaton evolves in time. These rules are local and are applied in parallel to all the cells, and despite their simplicity they may give rise to a complex macroscopic behaviour. In this paper this is illustrated by examples from hydrodynamics, and it is shown that cellular automata may provide powerful alternatives to partial differential equations.
Cellular automata have a widespread use in the description of complex phenomena in disciplines as disparate as, for example, physics and economics. A cellular automaton is a lattice of cells, and the cells can be in a finite number of states. By using simple local rules, the states of the cells are updated in parallel at discrete time steps. In short, a cellular automaton can be characterised by three words: simple, local, and parallel. These three words are one of the reasons for the attractiveness of cellular automata. They are simple to implement and they are well suited for parallel computers (computations). Another reason for using cellular automata is their spatio-temporal properties. The lattice may represent space, and the updating of the cells gives a dimension of time. In spite of the simplicity of cellular automata, they may give rise to a complex macroscopic behaviour. This is illustrated, in this thesis, by a hydrodynamic example, namely the creation of vortices in the flow behind a cylinder. Although cellular automata have the ability to describe complex phenomena, it is sometimes hard to find the proper rules for a given macroscopic behaviour. One approach which has been successfully employed is to let cellular automaton rules evolve (for example, through genetic algorithms) towards the desired properties. In this thesis this is demonstrated for two-dimensional cellular automata with two possible states of the cells. A genetic algorithm is used to find rules that evolve a given initial configuration of the cells into another given configuration.
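A compact, mutation-only sketch of the idea of evolving two-dimensional, two-state rules with a genetic algorithm is given below. The outer-totalistic rule encoding, lattice size, population size, and target configuration are illustrative assumptions rather than the setup used in the thesis.

```python
# Sketch: evolve a 2D, two-state CA rule so that a given start configuration
# evolves towards a given target configuration. All parameters are illustrative.
import random

N, STEPS = 8, 5                                   # lattice size and evolution length

def step(grid, rule):
    """One synchronous update; rule[(state, live_neighbours)] gives the new state."""
    out = [[0] * N for _ in range(N)]
    for i in range(N):
        for j in range(N):
            live = sum(grid[(i + di) % N][(j + dj) % N]
                       for di in (-1, 0, 1) for dj in (-1, 0, 1)
                       if (di, dj) != (0, 0))
            out[i][j] = rule[(grid[i][j], live)]
    return out

def fitness(rule, start, target):
    """Number of cells matching the target after STEPS updates."""
    g = start
    for _ in range(STEPS):
        g = step(g, rule)
    return sum(g[i][j] == target[i][j] for i in range(N) for j in range(N))

def random_rule():
    return {(s, k): random.randint(0, 1) for s in (0, 1) for k in range(9)}

def mutate(rule, p=0.05):
    return {k: (1 - v if random.random() < p else v) for k, v in rule.items()}

start = [[random.randint(0, 1) for _ in range(N)] for _ in range(N)]
target = [[0] * N for _ in range(N)]              # example target: the quiescent state

population = [random_rule() for _ in range(30)]
for generation in range(50):
    population.sort(key=lambda r: fitness(r, start, target), reverse=True)
    best = population[:10]                        # simple truncation selection
    population = best + [mutate(random.choice(best)) for _ in range(20)]
print("best fitness:", fitness(population[0], start, target), "of", N * N)
```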
The aim of the thesis is to analyze a method which describes the electron correlation in atoms and molecules. The method is based on Rayleigh-Schrödinger perturbation theory with a partitioning of the Hamiltonian into a fairly simple zeroth-order operator and a perturbation operator. The zeroth-order Hamiltonian is founded on a one-electron Fock-type operator, and two different operators have been tested. The zeroth-order wave function is constructed from a complete active space self-consistent field (CASSCF) calculation. This means that the zeroth-order wave function for open-shell systems and systems with strong configurational mixing (near degeneracy) can be obtained on an equal footing with closed-shell (single-determinant) states. The theory is formulated in such a way that Møller-Plesset perturbation theory is obtained for the closed-shell (single-determinant) state. The flexibility of the CASSCF method makes it possible, in principle, to construct the zeroth-order wave function (and the zeroth-order Hamiltonian) to any desired accuracy. The perturbation expansion of the energy is therefore expected to converge fast, and only contributions up to second order have been implemented, leading to fairly fast and accurate calculations. The aim of the computer implementation is to describe the electron correlation in small and medium-sized molecules (up to 20 atoms) accurately. The application of the perturbation method to a number of problems in chemistry is demonstrated in the thesis: (1) the calculation of electronic properties and harmonic vibrational frequencies of the ozone molecule; (2) the calculation of electric dipole polarizabilities of excited valence states of several first- and second-row atoms; (3) the calculation of excited states of the nickel atom, the benzene molecule, and the azabenzenes pyridine, pyrazine, pyridazine, and s-triazine.
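For reference, the generic Rayleigh-Schrödinger expressions through second order for such a partitioning are given below. The actual multiconfigurational (CASSCF-based) formulation involves contracted excited functions and is more involved, so this is only the familiar single-reference limit mentioned above.

```latex
% Generic Rayleigh-Schrödinger perturbation expressions through second order;
% this is the single-reference (Møller-Plesset-like) limit, not the full
% CASSCF-based formulation used in the thesis.
\begin{align*}
  \hat{H} &= \hat{H}_0 + \hat{V}, \qquad \hat{H}_0\,|\Psi_k\rangle = E_k\,|\Psi_k\rangle, \\
  E &\approx E_0 + E^{(1)} + E^{(2)}, \\
  E^{(1)} &= \langle \Psi_0 | \hat{V} | \Psi_0 \rangle, \\
  E^{(2)} &= \sum_{k \neq 0}
      \frac{\left|\langle \Psi_k | \hat{V} | \Psi_0 \rangle\right|^2}{E_0 - E_k}.
\end{align*}
```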
The electronic spectrum of VCr has been studied using the complete active space self-consistent field (CASSCF) / complete active space second-order perturbation theory (CASPT2) approach. Potential energy curves for 12 electronic states have been computed. Transition energies with respect to the X^2Delta ground state for some of the calculated electronic states are (with possible experimental values within parentheses): 0.53 eV (0.56) for A^2Sigma+, 1.03 eV (1.14) for A^4Delta, 1.20 eV (1.14) for B^2Delta, 1.45 eV (1.51) for B^4Delta, 1.60 eV (1.51, 1.78) for C^2Delta, and 1.61 eV (1.63) for A^4Sigma^-.
People engage with multiple online services and carry out a range of different digital transactions with these services. Registering an account, sharing content in social networks, or requesting products or services online are a few examples of such digital transactions. With every transaction, people take decisions and make disclosures of personal data. Despite the possible benefits of collecting data about a person or a group of people, massive collection and aggregation of personal data carries a series of privacy and security implications which can ultimately result in a threat to people's dignity, their finances, and many other aspects of their lives. For this reason, privacy and transparency enhancing technologies are being developed to help people protect their privacy and personal data online. However, some of these technologies are usually hard to understand, difficult to use, and get in the way of people's momentary goals.
The objective of this thesis is to explore, and iteratively improve, the usability and user experience provided by novel privacy and transparency technologies. To this end, it compiles a series of case studies that address identified issues of usable privacy and transparency at four stages of a digital transaction, namely the information, agreement, fulfilment and after-sales stages. These studies contribute to a better understanding of the human factors and design requirements that are necessary for creating user-friendly tools that can help people protect their privacy and control their personal information on the Internet.
The amount of personally identifiable information that people distribute over different online services has grown rapidly and considerably over the last decades. This has led to increased probabilities of identity theft, profiling and linkability attacks, which in turn can pose a threat not only to people's personal dignity, finances, and many other aspects of their lives, but also to societies in general. Methods and tools for securing people's online activities and protecting their privacy on the Internet, so-called Privacy Enhancing Technologies (PETs), are being designed and developed. However, these technologies are often seen by ordinary users as complicated and as disruptive of their primary tasks.
In this licentiate thesis, I investigate the usability aspects of three main privacy- and security-enhancing mechanisms. These mechanisms have the goal of helping and encouraging users to protect their privacy on the Internet as they engage in some of the steps necessary to complete a digital transaction. The three mechanisms, which have been investigated within the scope of different research projects, comprise (1) graphical visualizations of service providers' privacy policies and user-friendly management and matching of users' privacy preferences "on the fly", (2) methods for helping users create appropriate mental models of the data minimization property of anonymous credentials, and (3) touch-screen biometrics as a method to authenticate users to mobile devices and verify their identities during a digital transaction.
Results from these investigations suggest that these mechanisms can make digital transactions privacy-friendly and secure while at the same time delivering convenience and usability for ordinary users.
We explore how concepts from the field of network science can be employed to inform Internet users about the way their personally identifiable information (PII) is being used and shared by online services. We argue that presenting users with graphical interfaces that display information about the network structures formed by PII exchanges can have an impact on the decisions users take online, such as the services they choose to interact with and the information they decide to release.
Frequent contact with online businesses requires Internet users to distribute large amounts of personal information. This spreading of users' information through different websites can eventually lead to increased probabilities of identity theft, profiling and linkability attacks, as well as other harmful consequences. Methods and tools for securing people's online activities and protecting their privacy on the Internet, called Privacy Enhancing Technologies (PETs), are being designed and developed. However, these technologies are often perceived as complicated and obtrusive by users who are not privacy aware or are not computer or technology savvy. This chapter explores the way in which users' involvement has been considered during the development process of PETs and argues that more democratic approaches to user involvement and data handling practices are needed. It advocates an approach in which people are not only seen as consumers of privacy and security technologies, but in which they can also play a role as producers of ideas and sources of inspiration for the development of usable PETs that meet their actual privacy needs and concerns.
The PrimeLife Policy Language (PPL) has the objective of making the data handling practices of data controllers more transparent to end users, allowing them to make well-informed decisions about the release of personal data in exchange for services. In this chapter, we present our work on user interfaces for the PPL policy engine, which aims at displaying the core elements of a data controller's privacy policy in an easily understandable way as well as showing how far it corresponds with the user's privacy preferences. We also show how privacy preference management can be simplified for end users.
This paper discusses the approach taken within the PrimeLife project for providing user-friendly privacy policy interfaces for the PrimeLife Policy Language (PPL). We present the requirements, design process and usability testing of the “Send Data?” prototype, a browser extension designed and developed to deal with the powerful features provided by PPL. Our interface introduces the novel features of “on the fly” privacy management, predefined levels of privacy settings, and simplified selection of anonymous credentials. Results from usability tests showed that users understand and appreciate these features and perceive them as privacy-friendly, and they are therefore suggested as a good approach towards usable privacy policy display and management. Additionally, we present the lessons learnt in the design process of privacy policy interfaces.
We present a prototype of the user interface of a transparency tool that gives users an overview of their data disclosures to different online service providers and allows them to access the data collected and stored about them on the services' side. We explore one particular visualization method consisting of tracing lines that connect a user's disclosed personal attributes to the service to which these attributes have been disclosed. We report on the ongoing iterative design process of this visualization, the challenges encountered, and the possibilities for future improvements.
There are moments in which users might find themselves experiencing feelings of panic upon realizing that their privacy or personal information on the Internet might be at risk. We present an exploratory study on common experiences of online privacy-related panic and on users' reactions to frequently occurring privacy incidents. Using the metaphor of a privacy panic button, we also gather users' expectations about the type of help that they would like to obtain in such situations. Through user interviews (n = 16) and a survey (n = 549), we identify 18 scenarios of privacy panic situations. We rank these scenarios according to their frequency of occurrence and to users' concerns about becoming victims of these incidents. We explore users' underlying worries about falling prey to these incidents and other contextual factors common to privacy panic experiences. Based on our findings, we present implications for the design of a help system for users experiencing privacy panic situations.
The use of mobile smart devices for storing sensitive information and accessing online services is increasing. At the same time, methods for authenticating users to their devices and online services that are not only secure but also privacy- and user-friendly are needed. In this paper, we present our initial explorations of the use of lock pattern dynamics as a secure and user-friendly two-factor authentication method. We developed an application for the Android mobile platform to collect data on the way individuals draw lock patterns on a touchscreen. Using a Random Forest machine learning classifier, this method achieves an average Equal Error Rate (EER) of approximately 10.39%, meaning that lock pattern biometrics can be used for identifying users towards their device, but could also pose a threat to privacy if the users' biometric information is handled outside their control.
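A sketch of such an evaluation pipeline is shown below: a Random Forest is trained on per-pattern features and the Equal Error Rate is read off where the false-acceptance and false-rejection rates meet. The synthetic data and feature layout are placeholders; the study's real features come from how users draw lock patterns on a touchscreen.

```python
# Sketch: Random Forest classification of genuine vs. impostor lock-pattern
# samples, with an approximate EER estimate. Data and features are synthetic
# placeholders for illustration only.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_curve
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
# Hypothetical per-pattern features: stroke duration, pressure, finger size, speed, ...
X_genuine  = rng.normal(loc=0.0, scale=1.0, size=(200, 6))
X_impostor = rng.normal(loc=0.7, scale=1.2, size=(200, 6))
X = np.vstack([X_genuine, X_impostor])
y = np.array([1] * 200 + [0] * 200)          # 1 = genuine user, 0 = impostor

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)

scores = clf.predict_proba(X_te)[:, 1]       # probability of "genuine"
fpr, tpr, _ = roc_curve(y_te, scores)
fnr = 1 - tpr
eer = fpr[np.argmin(np.abs(fpr - fnr))]      # point where FPR ≈ FNR
print(f"Approximate EER: {eer:.2%}")
```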