Two Schemes of Privacy-Preserving Trust Evaluation

Zheng Yan*
State Key Lab of Integrated Services Networks, School of Cyber Engineering, Xidian University, China
Department of Communications and Networking, Aalto University, Espoo, Finland
[email protected]; [email protected]

Wenxiu Ding
State Key Lab of Integrated Services Networks, Xidian University, China
[email protected]

Valtteri Niemi
State Key Lab of Integrated Services Networks, Xidian University, China
Department of Computer Science, University of Helsinki, Finland
[email protected]

Athanasios V. Vasilakos
University of Western Macedonia, Kozani, Greece
Lulea University of Technology, Sweden
[email protected]

* Corresponding author: Zheng Yan (email: [email protected])

Abstract

Trust evaluation computes trust values by collecting and processing trust evidence. It plays an important role in trust management, which automatically ensures trust relationships among system entities and enhances system security. However, trust evidence collection and processing may cause privacy leakage, which makes involved entities reluctant to provide the personal evidence that is essential for trust evaluation. Current literature pays little attention to Privacy-Preserving Trust Evaluation (PPTE), and existing work still has many limitations, especially regarding generality, efficiency and reliability. In this paper, we propose two practical schemes based on additive homomorphic encryption to guard the privacy of trust evidence providers, supporting a traditional class of trust evaluation that contains evidence summation. The first scheme achieves better computational efficiency, while the second one provides greater security at the expense of a higher computational cost. Accordingly, two trust evaluation algorithms are further proposed to flexibly support different application cases. Specifically, these algorithms can overcome, to some extent, attacks raised by internal malicious evidence providers even though the trust evaluation is partially performed in an encrypted form. Extensive analysis and performance evaluation show the security and effectiveness of our schemes, their potential for practical application, and their efficiency in supporting big data processing.

Keywords: Trust evaluation; homomorphic encryption; privacy preservation; secure multiparty computation; big data.

Highlights
a) Two security schemes for Privacy-Preserving Trust Evaluation (PPTE).
b) Trust evaluation algorithms cooperating with the PPTE schemes to resist internal attacks.
c) Security and performance proof of the two PPTE schemes through analysis and implementation.
d) Feasibility to support various scenarios with either small or big evidence data.

1. INTRODUCTION

Trust evaluation plays an important role in trust management. It is a technical approach to representing trust for digital processing, in which the factors influencing trust are evaluated on the basis of evidence data to obtain a continuous or discrete number, referred to as a trust value. In the literature, several theories, including Bayesian inference, weighted average models, subjective logic, Dempster-Shafer theory, fuzzy logic and entropy-based models, are applied to model and evaluate trust and reputation (i.e., public trust) [25]. All of the above methods generate trust values by analyzing and computing evidence data collected from a number of evidence providers, and many of them contain evidence summation in the process of trust evaluation. Nowadays, trust evaluation is widely applied in various fields of computer, communication and information systems. It assists in automatically ensuring trust relationships among system entities and enhancing system security.

1.1 Motivation

Trust is normally evaluated based on evidence that reflects a trustor's belief in a trustee. Typical examples of evidence include feedback on the performance and quality of the trustee, observed performance or behavior of the trustee, and recommendations on the trustee. In many scenarios, a trustor entity conducts trust evaluation according to personal experiences and the evidence collected from other entities. For example, a reputation server collects individual feedback or votes from many users about a trustee entity (e.g., a mobile application, a movie, a service, or a networking node) for reputation generation. In social networking, social trust can be assessed based on social interaction experiences and feedback collected from a sufficient number of credible individuals.

However, trust evaluation could impact the privacy of involved entities. Obviously, processing and analyzing collected evidence data could reveal sensitive information about providers, such as personal preferences, opinions and interests. The evaluation may also intrude on the privacy of the entity being evaluated. Privacy leakage makes evidence providers hesitant to share personal trust evidence. On the other hand, a lack of sufficient evidence impairs the accuracy of trust evaluation. Therefore, it is important to guard privacy in trust evaluation in order to guarantee the healthy development of trust management.

Existing work related to Privacy-Preserving Trust Evaluation (PPTE) is rare and imperfect. Most existing methods of trust evaluation do not consider privacy preservation [1-10]; they generally aggregate plain trust evidence to calculate a trust value directly. In order to preserve the privacy of evidence providers, it is preferable that the collected evidence is encrypted and processed in an encrypted manner and that the final evaluation result can only be accessed by authorized parties. But in this kind of method it is hard to detect malicious evidence providers, and thus to filter their contributions during trust evaluation, since the evidence is encrypted. Therefore, the accuracy of trust evaluation becomes difficult to ensure. Recent advances in privacy-preserving aggregation have mainly been made in the areas of wireless sensor systems and smart metering [30-37]. These methods cannot be directly applied to trust evaluation due to differences in system or security models.
Most existing work focused on resisting outsider-only attacks, while internal attacks raised by malicious evidence providers were not seriously considered. Recent research started to pay attention to the privacy issue of trust evaluation [11-19], but with limitations on generality, efficiency and reliability. This makes practical deployment of these solutions very difficult, and further study is therefore highly needed. Moreover, most privacy-preserving aggregation schemes [30-37] focus on aggregating collected data for a single designated requesting party. Due to high computation and communication costs, they cannot be applied in a scenario where the aggregated data is requested by a number of different parties. How to share the aggregated evidence among authorized requesters in a secure and effective way is still an open issue.

We still face a number of challenges in PPTE. First, many traditional and existing trust evaluation methods cannot be applied if privacy preservation must be supported. Second, the accuracy and reliability of trust evaluation could be lost when privacy preservation has to be supported; it becomes very difficult to overcome internal attacks raised by malicious evidence providers. Third, reducing computation complexity becomes a challenge, especially when cryptographic technologies are applied; PPTE over big data is hard to support with sound efficiency. Fourth, flexibly and efficiently controlling access to evaluation results for multiple authorized parties has not been well solved. Finally, generality has not been seriously investigated for the purpose of supporting various trust evaluation theories while at the same time preserving privacy for all involved entities.

1.2 Main Contributions

In this paper, we propose two schemes to preserve privacy in trust evaluation. To reduce communication and computation costs, we introduce two servers to realize privacy preservation and evaluation result sharing among various requestors. We consider a scenario with two independent service parties that do not collude with each other due to their business incentives. One is an Authorized Proxy (AP) that is responsible for access control and management of aggregated evidence, in order to enhance the privacy of entities being evaluated. The other is an Evaluation Party (EP) (e.g., offered by a cloud service provider) that processes the data collected from a number of trust evidence providers. The EP processes the collected data in an encrypted form and produces an encrypted trust pre-evaluation result. When a user requests the pre-evaluation result from the EP, the EP first checks the user's access eligibility with the AP. If the check is positive, the AP re-encrypts the pre-evaluation result so that it can be decrypted by the requester (Scheme 1), or there is an additional step involving the EP that prevents the AP from obtaining the plain pre-evaluation result while still allowing decryption of the pre-evaluation result by the requester (Scheme 2). In either case, the requester then finishes the trust evaluation by itself: it decrypts the pre-evaluation results and aggregates and processes them together with the evidence statistics recorded by the EP and, potentially, the evidence accumulated locally. A homomorphic encryption technology, concretely additive homomorphism, is applied to realize trust pre-evaluation at the EP based on encrypted data collected from a number of trust evidence providers.
Considering the current technical limitations of fully homomorphic encryption and its high computational complexity, our schemes attempt to achieve both efficiency and privacy preservation by conducting partial trust evaluation at a third party (e.g., the EP) that cannot be fully trusted and is curious about privacy. Meanwhile, we propose two trust evaluation algorithms to illustrate how the proposed schemes can support PPTE in different trust evaluation cases. The illustrated cases contain evidence summation and achieve reliability in trust evaluation by minimizing the impact of attacks raised by malicious evidence providers. Specifically, the contributions of this paper can be summarized as follows:

• We show how to preserve the privacy of trust evidence providers by proposing two security schemes for PPTE, in order to encourage trust evidence provision.
• We propose two trust evaluation algorithms that can flexibly support the implementation of the two PPTE schemes in different situations. Both algorithms exhibit the characteristics of trust and resist several typical internal attacks, which is verified through simulations.
• We prove the security and justify the performance of the two PPTE schemes through analysis and implementation. Through comparison, we show the pros and cons of the schemes and their proper application scenarios.
• We show that our schemes are suitable in the world of big data. They can be applied in various scenarios with either a small or a big number of evidence providers.

The rest of the paper is organized as follows. Section 2 gives a brief overview of related work. Section 3 introduces the system and threat models and our design goals, followed by detailed descriptions of the two schemes and the trust evaluation algorithms in Section 4. Section 5 gives a security analysis and performance evaluation. Application scenarios are presented in Section 6. Finally, we conclude the paper in the last section.

2. RELATED WORK

2.1 Trust Evaluation

Trust evaluation has been widely studied in various fields of computer, communication and information systems, such as Mobile Ad Hoc Networks (MANET) [1], Peer-to-Peer (P2P) systems [2], cloud computing [3], pervasive computing [4], software systems [5], grid computing [6], e-commerce [7], social networking [8], and wireless sensor networks [9, 10]. Many mechanisms have been developed to support trusted communications and collaborations among computing nodes [25]. However, early work on trust evaluation seldom considered privacy preservation during evidence collection and processing. Privacy leakage retards evidence provision and collection, and thus greatly impacts the accuracy of trust evaluation and the effectiveness of trust management.

2.2 Privacy-Preserving Aggregation

The literature contains a number of studies on privacy-preserving aggregation, mainly in the areas of wireless sensor networks (WSNs) and smart metering [31-36]. Castelluccia et al. proposed a simple and provably secure encryption scheme that allows efficient additive aggregation of encrypted data [31]. Its security is based on the indistinguishability property of a pseudorandom function (PRF). Li et al. employed additive homomorphic encryption and a novel key management technique in an efficient protocol to obtain the Sum aggregate [35]. This scheme can also obtain the Min aggregate of time-series data and deal with dynamic joins and leaves of mobile nodes. Low aggregation error can also be achieved by leveraging a ring-based interleaved grouping technology [36]. The above schemes have a major drawback: they are not tolerant of user absence or failure. Thus, they are not applicable to PPTE, where the number of evidence providers is not fixed and a provider may not contribute evidence every time. Some constructions are resilient to user failure and compromise [32]. However, all of the above schemes only resist outsider attacks. Data pollution attacks are outside the scope of these studies, so they would be vulnerable to compromised or malicious insiders.

Some existing schemes can serve as a foundational building block of PPTE, but further studies are still needed to overcome the challenges of PPTE. Joye and Libert presented a scheme allowing an untrusted aggregator to evaluate the Sum of users' private inputs with no restriction on the message space or the number of users [34]. In this scheme, different input users apply different secret keys for encryption, while the aggregator applies a secret key derived from the above secret keys for decryption.
Applying this scheme to PPTE needs additional investigation since trust evaluation is more than additive aggregation. Wang et al. investigated how two non-colluding servers can leverage proxy re-encryption to jointly compute arithmetic functions over the ciphertexts of multiple users without learning the inputs or the intermediate or final results [37]. But this scheme only supported additive and multiplicative homomorphic encryption. This limitation makes it hard to design a proper trust evaluation algorithm that can resist internal attacks. Moreover, using proxy re-encryption to convert encrypted individual data into a form suitable for homomorphic computation is not very suitable for trust evaluation because it introduces an additional computational load, which makes it hard to support big data processing.

2.3 Privacy-Preserving Trust Evaluation

Two classes of methods are applied to protect the privacy of evidence providers in trust evaluation: 1) applying anonymity by masking the real identities of providers; 2) protecting raw evidence by making use of advances in Secure Multiparty Computation (SMC).

Anonymity

Maintaining unlinkability between information and its providers preserves privacy, and some work adopts anonymity to enhance the privacy of information providers [28, 29]. TrustMe [11] realized mutual anonymity for both trust hosts and trust-querying peers. It introduced a trust-holding agent peer for each querying peer to manage trust reports collected from other peers. However, it did not consider how to determine a trustworthy peer in the trust evaluation, and is thus hard-pressed to overcome some traditional attacks. LotS [12] maintained the anonymity of information for reputation generation. It realized unlinkability between the identities of users and their reports, while at the same time attaching a reputation score to each user. But it did not discuss how to resist malicious attacks in trust evaluation when the identity privacy of evidence providers is preserved. ARTSense [13-14] solved the problem of "trust without identity" in mobile sensing. By applying blinded identifiers, the trust/reputation evaluator is unable to associate a sensing report with a particular participant. ARTSense can defend against a report flooding attack, but for other attacks the investigation was limited. Obviously, detecting malicious trust evidence providers is a big challenge if their identities are hidden and frequently changed.

Secure Multiparty Computation

SMC is a paradigm that keeps the data of individual parties secret while still computing results from the secret data to satisfy their common interest. The outcome of the computation is made available to eligible parties. SMC enables parties with private data to collaboratively compute a global function of their data without revealing that input data to other parties, thus minimizing the threat of disclosure. SMC provides plausible solutions for problems like privacy-preserving database query, privacy-preserving scientific computation, privacy-preserving intrusion detection and privacy-preserving data mining [27]. PPTE is related to privacy-preserving data mining and privacy-preserving scientific computation, focusing on mining and computing trust values from a set of trust evidence.

Quite a number of studies have been conducted in the area of SMC, but few are related to trust and reputation evaluation. Some work studied how to obtain the additive aggregation without eroding the privacy of data providers. For example, Pavlov et al. proposed several protocols depending on additive aggregation to achieve partial privacy in reputation systems [16]. But one protocol is vulnerable to collusion, while another suffers from high computation complexity of order O(n³), where n is the number of witnesses. Additionally, the generation of many random values burdens the computation and communication processes. Erkin et al.
presented a privacy-enhanced recommender system [15] in a social trust network. It introduced random numbers to protect the true values and finally removed the masks by subtracting these random numbers in order to obtain the sum of ratings. Homomorphic encryption and SMC techniques were applied to generate recommendations. But this work did not consider how to identify trustworthy ratings for recommendations.

Other efforts focused on how to obtain privacy-preserving weighted trust values. Yao et al. presented a scheme to obtain trust values from private recommendations by applying a private distributed scalar product [17], where one vector represents the private data of the trustor and the other is the private data from providers. It multiplied the provided data by its weight based on homomorphic encryption. Each provider chose one random value to protect its own raw data. But all providers have to contribute to the sum of all random values by using private multiparty summation, which introduces a security hole exploitable by malicious providers and also increases the computation cost. In order to reduce computation cost and achieve better security, Melchor et al. proposed a new scheme that can also overcome a collusion attack [18]. Furthermore, Nithyanand and Raman proposed a privacy-preserving P2P reputation scheme (3PRep) to count the number of votes for computing a global reputation value [19]. This scheme can verify a subset of votes and resist vote spoofing and modification. But it performs n² homomorphic encryptions and n² homomorphic decryptions, where n is the number of votes. Due to the many homomorphic operations, the computation burden of this method is heavy, and the need for pre-encryption requires additional storage.

The existing solutions for PPTE have some weaknesses. First, the methods that adopt a scalar product as the trust value cannot flexibly support other trust theories [17, 18]. Nor do they reflect the basic characteristics of trust, i.e., that trust changes dynamically with time and is a subjective opinion [25]. Second, PPTE introduces additional challenges for the detection of malicious evidence providers, which makes it difficult to ensure the accuracy and reliability of trust evaluation. Third, the computation complexity of PPTE requires further improvement, considering the possibly big volume of evidence data in many application scenarios.

3. PROBLEM STATEMENT

3.1 System Model

We consider a trust management system with trust evaluation functionality that involves three different kinds of entities: the nodes, which interact and need to evaluate trust in each other (e.g., in an application scenario such as social networking) and can provide trust evidence based on interaction experiences; the EP, which is served by a cloud service provider for privacy-preserving computation; and the AP, which is responsible for evidence access control and management. A node evaluates the trust of other nodes with the support of the EP and the AP.

Fig. 1. A system model

An example system structure is shown in Figure 1. At each node, there is an application module that contains diverse applications demanding trust evaluation, such as a Social Communication Module. A Trust Manager is applied to collect, protect and disseminate trust evidence to the EP. A Pre-Evaluation Result Processor can decrypt the trust pre-evaluation result. A Trust Evaluator calculates the trust value of a node by aggregating collected information. The Trust Manager also manages the cryptographic keys of the node. In the EP, a Data Computation Module processes the encrypted data collected from the nodes (i.e., the evidence providers). A Data Statistic Module counts the statistics of trust evidence, e.g., the number of evidence pieces in a specific time slot. The AP contains an Access Policy Manager that is applied to check the access rights of an evidence requester and to perform encryption or decryption in order to make the pre-evaluation result accessible to an authorized node. A Key Manager is applied to manage the cryptographic keys of the AP.

3.2 Threat Model

Our research refers to the following threat model. A system node cares about its own privacy during evidence provision and wants to evaluate the trust of other nodes. It is curious about the trust evidence of other nodes, but does not want to share its own evidence with other system entities. We assume that the EP always fulfills its computation duties according to the system design due to business incentives. But it cannot be fully trusted: it may be curious about the private information of the nodes and could disclose any of its findings. The AP is either fully trusted, in Scheme 1, or semi-trusted, in Scheme 2 (e.g., curious about the trust pre-evaluation result, which it could disclose). The EP and the AP are assumed not to collude with each other. Neither the EP nor the AP intends to corrupt evidence data or the computation process to prevent nodes from utilizing data correctly. The EP and the AP communicate in a secure way. The AP and the EP do not share the identity information of requesting nodes and evaluated nodes with other nodes. We also assume that the communication between a node and the EP is over a secure channel, and therefore the AP cannot obtain individual encrypted evidence pieces by eavesdropping on that channel.

We note that existing trust evaluation methods and algorithms are diverse, but a most common computation block is the summation of evidence. We therefore do not need computation-heavy fully homomorphic encryption; instead, we only make use of additive homomorphic encryption in our design and assume that it is secure.

3.3 Design Goals

To achieve PPTE and encourage nodes to provide evidence, our design aims to achieve the following security and performance goals:
(1) Privacy Preservation: the plain content of any individual piece of trust evidence should not be disclosed during evidence collection and processing; the privacy of evidence providers should not be intruded upon due to data provision.
(2) Trust Evaluation: the trust evaluation function should exhibit the basic characteristics of trust and consider the impact of evidence collection time and the number of evidence pieces used for evaluation. Meanwhile, trust evaluation should resist, to some extent, attacks raised by malicious evidence providers.
(3) Low Computation Cost: the computation cost should be low in order to support big data processing. Low computation expense is preferred at the node for both evidence provision and trust evaluation.

4. THE PROPOSED SCHEME

4.1 Preliminaries and Notations

Homomorphic Encryption

We utilize the following characteristics of homomorphic encryption, e.g., Paillier's cryptosystem [20, 21]:

HE(∑_{i=1}^{n} a_i) = ∏_{i=1}^{n} HE(a_i),   (1)
HE(a × c) = HE(a)^c,   (2)

where HE is an encryption function with an encryption key HPK, a_i is the data to be encrypted, and HE(a_i) ≠ 0. If HSK is the corresponding secret key of HPK and HD is the decryption function with decryption key HSK, we have

HD(∏_{i=1}^{n} HE(a_i)) = ∑_{i=1}^{n} a_i,   (3)
HD(HE(a)^c) = a × c.   (4)

Thereby, we can obtain the sum of a number of values without knowledge of any individual one.
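To make properties (1)-(4) concrete, the following minimal Python sketch implements textbook Paillier encryption with toy primes. The parameter sizes and helper names are our illustrative assumptions; they do not reflect the paper's actual implementation, which uses OpenSSL and the Paillier Library (see Section 5.3).

```python
import random
from math import gcd

# Toy primes for illustration only; a real deployment needs >= 1024-bit primes.
p, q = 1117, 1129
n, n2 = p * q, (p * q) ** 2
g = n + 1                                      # standard simplified generator
lam = (p - 1) * (q - 1) // gcd(p - 1, q - 1)   # lambda(n) = lcm(p-1, q-1)
mu = pow(lam, -1, n)                           # valid because g = n + 1

def he_encrypt(m):
    """HE{HPK, m}: randomized Paillier encryption of m."""
    r = random.randrange(2, n)
    while gcd(r, n) != 1:
        r = random.randrange(2, n)
    return pow(g, m % n, n2) * pow(r, n, n2) % n2

def he_decrypt(c):
    """HD{HSK, c}: Paillier decryption."""
    return (pow(c, lam, n2) - 1) // n * mu % n

# Property (1)/(3): a product of ciphertexts decrypts to the sum of plaintexts.
votes = [7, 3, 5]
agg = 1
for v in votes:
    agg = agg * he_encrypt(v) % n2
assert he_decrypt(agg) == sum(votes)           # 15

# Property (2)/(4): HE(a)^c decrypts to a*c.
assert he_decrypt(pow(he_encrypt(4), 3, n2)) == 12
```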

Notations

Table 1 summarizes the notations used in this paper.

TABLE 1. NOTATIONS

SigK_x: the signing key of x
VK_x: the key for verifying x's signature
PK_x: the public key of x for public-key cryptographic operations
SK_x: the secret key of x for public-key cryptographic operations
HPK_x: the public key of x for homomorphic encryption
HSK_x: the secret key of x for homomorphic decryption
SetupHom(k): the function of homomorphic key generation
SetupAsym(k): the function of asymmetric key generation
SetupSig(k): the function of public/private key pair generation for signing
HE{HPK_x, m}: the function of homomorphic encryption of m with HPK_x
HD{HSK_x, c}: the function of homomorphic decryption of c with HSK_x
Sign(SigK_x, m): the signature of x on content m
Verf(VK_x, Sign(SigK_x, m), m): the function of verifying x's signature on m with VK_x
E(PK_x, m): the function of public-key encryption of m with PK_x
D(SK_x, c): the function of public-key decryption of c with SK_x
t_j: the jth time slot
er_j(y): the pre-evaluation result about node y in time slot t_j
er(y): the set of pre-evaluation results about node y
te(z_i, y): the trust evidence provided by node z_i about node y, e.g., the vote of z_i on y
te_j(y): the aggregated evidence about node y in t_j
s_j(y): the statistics of trust evidence of y in t_j, e.g., the number of collected trust evidence pieces
te(y): the set of aggregated trust evidence of y
s(y): the set of statistics of trust evidence of y
W: the maximum number of time slots taken into consideration
J: the total number of time slots
N_j: the number of nodes that provide trust evidence in t_j
N: the total number of trust evidence pieces obtained in all time slots
Tv(y): the trust value of y
Tv'(y): the previous or local trust value of y
t_e: the time when trust is evaluated
F or F': the trust evaluation function

4.2 Schemes

Designing schemes for PPTE by applying additive homomorphism with proxy-based re-encryption has a number of advantages.

First, privacy can be preserved in trust evaluation. Using homomorphic encryption, we hide the opinion/experience of an individual trust evidence provider regarding a trustee entity, which greatly preserves its privacy, since the aggregated pre-evaluation result calculated by the EP does not disclose individual information. In addition, only authorized parties can access the encrypted pre-evaluation result, thanks to proxy-based re-encryption, which further improves system security and privacy. The collected trust evidence data and the pre-evaluation result are always in encrypted form at the EP; thus the EP has no way to learn their plaintexts and cannot violate the privacy of evidence providers.

Second, although homomorphic encryption is applied to compute on encrypted data, we can still achieve trustworthy trust evaluation by overcoming some potential attacks on the trust evaluation mechanism, such as bad mouthing attacks and on-off attacks. The evidence provided by trusted nodes can be well considered in the final trust evaluation and calculation, especially at a node that has direct interaction experiences with the trustee; refer to Section 5.4.

Third, it is possible to propose a generic solution that can be applied in any application scenario with the following common system model: system nodes report encrypted evidence about other entities to the EP; the EP pre-processes the collected encrypted data; if a system node requests the encrypted result in order to perform the final evaluation of a target entity locally, re-encryption is performed with the approval of an authorized proxy that manages access policies. We can design schemes that flexibly support evaluating trust at either a system node or a centralized party, regardless of whether it has direct experience with the target entity.

Fourth, it is feasible to adopt the latest advances in homomorphic encryption technologies. The EP can handle more computations regarding trust evidence aggregation if more operations, beyond addition, can be technically implemented at the EP.

Scheme 1

In this scheme, we assume collusion does not exist between the EP and the AP. The AP is a fully trusted party that does not disclose pre-evaluation results to any unauthorized parties. Scheme 1 contains a number of operations: System Setup, Registration, Pre-evaluation Request, Evidence Provision, Trust Pre-evaluation, Encryption Transform, and Trust Evaluation. Figure 2 illustrates the procedure of Scheme 1.

System Setup: The AP chooses a security parameter 1^k and outputs HPK_AP and HSK_AP by calling SetupHom(k). It also generates a key pair (SigK_AP, VK_AP) for generating and verifying signatures by calling SetupSig(k). It shares HPK_AP and VK_AP in the system. Meanwhile, node x generates its own key pair (PK_x, SK_x) for asymmetric encryption and decryption by calling SetupAsym(k).

Registration: Node x registers itself at the AP by providing its public key PK_x through a secure channel. The AP confirms a successful registration by issuing its public key HPK_AP and the signed PK_x: Sign(SigK_AP, PK_x).

Pre-evaluation Request: Node x requests trust pre-evaluation on node y from the EP by sending (Sign(SigK_AP, PK_x), PK_x, y). The EP verifies the AP's signature by calling Verf(VK_AP, Sign(SigK_AP, PK_x), PK_x) and, if the verification is positive, collects the trust evidence about y to perform trust pre-evaluation. Note that if we use the hash code of PK_x, h(PK_x), instead of PK_x in the signature and its verification in this scheme, we can hide the plain PK_x of the requesting node from the EP.
Evidence Provision: The nodes that have interacted with y provide their encrypted evidence to the EP using the AP's public key, based on homomorphic encryption. Concretely, node z_i (z_i ∈ Z) provides the encrypted evidence HE{HPK_AP, te(z_i, y)} to the EP, where te(z_i, y) denotes the evidence provided by z_i about y.

Trust Pre-evaluation: The EP collects all encrypted evidence and multiplies the pieces one by one, which results in a pre-evaluation result

HE{HPK_AP, te_j(y)} = ∏_{i=1}^{N_j} HE{HPK_AP, te(z_i, y)}.

Here N_j is the number of evidence pieces in t_j; it is also the number of evidence-providing nodes in case each node provides only one piece of evidence in t_j. Parameter te_j(y) denotes the aggregated evidence about y in t_j. In addition, the EP also records the statistics of the gathered evidence s_j(y), e.g., N_j. Next, the EP forwards HE{HPK_AP, te_j(y)}, s_j(y) and PK_x to the AP. Note that the data package from the EP to the AP is signed by the EP in order to ensure non-repudiation. For simplicity of presentation, this signature is not shown in Figure 2.

Encryption Transform: After obtaining the encrypted aggregated evidence in t_j, the AP first verifies the validity of PK_x or h(PK_x) and checks the access rights of node x. If the verification is positive, the AP decrypts the aggregated result and re-encrypts te_j(y) with PK_x. Then it sends E{PK_x, te_j(y)} and s_j(y) back to node x.

Trust Evaluation: After receiving the encrypted evidence, node x decrypts the data to obtain te_j(y) and then evaluates the trust of y with a trust evaluation algorithm that will be introduced in detail in Section 4.3.
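The following Python sketch walks through the Scheme 1 data flow end to end, reusing the toy Paillier construction from Section 4.1 and a textbook RSA key pair standing in for (PK_x, SK_x). All concrete parameters and variable names are our illustrative assumptions; registration, signatures and access-control checks are omitted.

```python
import random
from math import gcd

# Toy Paillier key pair (HPK_AP, HSK_AP) of the AP, as in Section 4.1.
p, q = 1117, 1129
n, n2 = p * q, (p * q) ** 2
lam = (p - 1) * (q - 1) // gcd(p - 1, q - 1)
mu = pow(lam, -1, n)

def he_enc(m):                               # HE{HPK_AP, m}
    r = random.randrange(2, n)
    while gcd(r, n) != 1:
        r = random.randrange(2, n)
    return pow(n + 1, m % n, n2) * pow(r, n, n2) % n2

def he_dec(c):                               # HD{HSK_AP, c}
    return (pow(c, lam, n2) - 1) // n * mu % n

# Toy RSA key pair (PK_x, SK_x) of the requesting node x.
px, qx = 1151, 1153
nx, e = px * qx, 65537
d = pow(e, -1, (px - 1) * (qx - 1))

# Evidence Provision: each node z_i sends HE{HPK_AP, te(z_i, y)} to the EP.
votes = [1, 0, 1, 1]                         # hypothetical binary votes on y
ciphertexts = [he_enc(v) for v in votes]

# Trust Pre-evaluation at the EP: multiply ciphertexts, record s_j(y) = N_j.
pre = 1
for c in ciphertexts:
    pre = pre * c % n2
s_j = len(ciphertexts)

# Encryption Transform at the AP: decrypt te_j(y), re-encrypt under PK_x.
te_j = he_dec(pre)
er_j = pow(te_j, e, nx)                      # E{PK_x, te_j(y)}, textbook RSA

# Trust Evaluation at node x: decrypt with SK_x to recover the sum of votes.
assert pow(er_j, d, nx) == sum(votes) == 3
```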

Fig. 2. The procedure of PPTE in Scheme 1

Scheme 2

In this scheme we again assume that the EP and the AP do not collude, but, unlike in the previous scheme, the AP is not fully trusted: it could disclose the pre-evaluation results to an unauthorized party. The operations of Scheme 2 are similar to those of Scheme 1. The main difference is an additionally introduced operation named Evidence Recovery. The detailed procedure is presented in Figure 3.

Fig. 3. The procedure of PPTE in Scheme 2

System Setup: Differently from Scheme 1, node x calls SetupHom(k) rather than SetupAsym(k) to generate its own key pair (HPK_x, HSK_x) for homomorphic encryption and decryption.

Registration: This step is the same as in Scheme 1 except that the public key HPK_x is sent to the AP to get the AP's certification Sign(SigK_AP, HPK_x).

Pre-evaluation Request: Node x requests trust pre-evaluation on node y from the EP by sending (Sign(SigK_AP, HPK_x), HPK_x, y). The EP verifies the AP's signature with VK_AP and, if the verification is positive, collects the trust evidence about y to perform trust pre-evaluation.

Evidence Provision: Node z_i offers encrypted evidence to the EP: HE{HPK_AP, te(z_i, y)}.

Trust Pre-evaluation: The first part of this step is the same as in Scheme 1: the EP records s_j(y) and computes the product of all encrypted evidence, which results in HE{HPK_AP, te_j(y)} = ∏_{i=1}^{N_j} HE{HPK_AP, te(z_i, y)}.

Differently from Scheme 1, the EP then selects a random number r, encrypts r with HPK_AP and computes HE{HPK_AP, te_j(y)} × HE{HPK_AP, r} to get HE{HPK_AP, te_j(y) + r}. Next, the EP sends HE{HPK_AP, te_j(y) + r} and HPK_x to the AP.

Encryption Transform: The AP verifies the validity of HPK_x and checks the access rights of node x. If the verification is positive, the AP decrypts the aggregated result to obtain te_j(y) + r and re-encrypts it with HPK_x. Then it sends HE{HPK_x, te_j(y) + r} back to the EP. Unlike in Scheme 1, the AP has no way to know the plaintext of te_j(y) in this step.

In the new operation Evidence Recovery, the EP knows HPK_x from the trust evaluation request. It recovers the encryption of the true aggregated evidence HE{HPK_x, te_j(y)} by multiplying HE{HPK_x, te_j(y) + r} with HE{HPK_x, −r}, without learning te_j(y). Then it sends HE{HPK_x, te_j(y)} and s_j(y) to node x.

Trust Evaluation: The requesting node x decrypts HE{HPK_x, te_j(y)} to obtain te_j(y) and evaluates the trust value of y with the trust evaluation algorithm.
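A minimal sketch of the Scheme 2 blinding and recovery steps, again with toy Paillier parameters of our own choosing (signatures and access-control checks omitted):

```python
import random
from math import gcd

class Paillier:
    """Textbook Paillier with g = n + 1; toy primes, for illustration only."""
    def __init__(self, p, q):
        self.n, self.n2 = p * q, (p * q) ** 2
        self.lam = (p - 1) * (q - 1) // gcd(p - 1, q - 1)
        self.mu = pow(self.lam, -1, self.n)
    def enc(self, m):
        r = random.randrange(2, self.n)
        while gcd(r, self.n) != 1:
            r = random.randrange(2, self.n)
        return pow(self.n + 1, m % self.n, self.n2) * pow(r, self.n, self.n2) % self.n2
    def dec(self, c):
        return (pow(c, self.lam, self.n2) - 1) // self.n * self.mu % self.n

ap = Paillier(1117, 1129)        # (HPK_AP, HSK_AP)
x = Paillier(1151, 1153)         # (HPK_x, HSK_x) of the requesting node

votes = [1, 1, 0, 1, 1]          # hypothetical te(z_i, y)

# Trust Pre-evaluation at the EP: aggregate, then blind with a random r.
pre = 1
for v in votes:
    pre = pre * ap.enc(v) % ap.n2
r = random.randrange(1, 1000)
blinded = pre * ap.enc(r) % ap.n2        # HE{HPK_AP, te_j(y) + r}

# Encryption Transform at the AP: the AP only ever sees the blinded sum.
masked = ap.dec(blinded)                 # te_j(y) + r, not te_j(y)
to_ep = x.enc(masked)                    # HE{HPK_x, te_j(y) + r}

# Evidence Recovery at the EP: strip the mask under x's key.
recovered = to_ep * x.enc(-r) % x.n2     # HE{HPK_x, te_j(y)}

# Trust Evaluation at node x.
assert x.dec(recovered) == sum(votes) == 4
```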

4.3 Trust Evaluation

In order to illustrate that our schemes can support PPTE in different cases that contain evidence summation and resist internal attacks raised by malicious evidence providers, we further propose two concrete trust evaluation algorithms that can cooperate with the proposed schemes to achieve PPTE.

Suppose that a trust-evaluating node can obtain J pieces of aggregated evidence over a total of J time slots. In each time slot, an evidence-providing node offers only one piece of evidence, and one piece of aggregated evidence can be obtained. In time slot t_j, the EP collects encrypted trust evidence from N_j nodes (z_1, z_2, ..., z_{N_j}). It computes the encrypted sum of the trust evidence based on additive homomorphism as below:

HE(HPK_AP, ∑_{i=1}^{N_j} te(z_i, y)) = ∏_{i=1}^{N_j} HE(HPK_AP, te(z_i, y)).   (5)

If node x requests and is allowed by the AP to access the above result, it gets the encrypted aggregated evidence

er_j(y) = E(PK_x, ∑_{i=1}^{N_j} te(z_i, y))   (6)

in t_j in Scheme 1, or

er_j(y) = HE(HPK_x, ∑_{i=1}^{N_j} te(z_i, y))   (7)

in t_j in Scheme 2. In either scheme, node x obtains the aggregated evidence te_j(y) through one decryption. The final trust evaluation is conducted at node x, which obtains a set of aggregated evidence over different time slots. Herein, we illustrate two cases.

Case 1

In Case 1, node x's personal trust in node y is available, denoted Tv'(y). For instance, node x is a social networking node that has social interaction experiences with node y. We apply Algorithm 1 to conduct trust evaluation at node x. For easy illustration, we assume that trust evidence is a vote expressed by a concrete value that indicates like or dislike, N = ∑_{j=1}^{J} N_j and N_j = s_j(y).

Algorithm 1: Trust evaluation at node x on node y when Tv'(y) is available
Input: er(y) = {er_1(y), er_2(y), ..., er_J(y)}, t = {t_1, t_2, ..., t_J}, s(y) = {s_1(y), s_2(y), ..., s_J(y)}, Tv'(y), and t_e;
Output: Tv(y).
- Node x decrypts each er_j(y) with its private key to get te(y) = {te_j(y)}, where te_j(y) = ∑_{i=1}^{N_j} te_j(z_i, y), j = 1, ..., J.
- Calculate node y's trust value at node x: Tv(y) = F(te(y), s(y), Tv'(y), t_e).

Function F is designed to evaluate trust by considering the statistics s(y) of the trust evidence, the evaluation time t_e, all collected evidence te(y) in time slots t = {t_1, t_2, ..., t_J}, as well as Tv'(y). Since trust is influenced by time, newly collected evidence should contribute more than old evidence to the trust evaluation. For this reason, we apply the parameter W as the maximum number of latest time slots taken into consideration in the final trust evaluation. If J > W, we set J = W.

A number of factors are considered in the design of function F. First, we consider the impact of the number of trust evidence providers. We use the Rayleigh cumulative distribution function θ(I) = 1 − exp(−I²/2σ²) to model the impact of an integer number I (e.g., s_j(y)), where σ > 0 is a parameter that inversely controls how fast the number I impacts the increase of θ(I). Parameter σ can be set from 0 to, theoretically, ∞, to capture the characteristics of different scenarios. Second, we apply time decay to avoid potential attacks on trust evaluation (such as the on-off attack and the conflict behavior attack). We use exp(−|t_e − t_j|²/τ) to control the time decay, which is adjusted by the parameter τ. Meanwhile, we apply the deviation 1 − |te_j(y)/s_j(y) − Tv'(y)| in F to resist bad mouthing and collaborative bad mouthing attacks. The bigger the deviation of the evidence from the local trust value, the less it contributes to the trust evaluation. In addition, we use this deviation to adjust the contribution of aggregated evidence in the final trust evaluation in order to further resist the above-mentioned attacks. We aim to achieve trustworthy trust evaluation and at the same time preserve the privacy of trust evidence providers, since individual opinions on a node are hidden by homomorphic encryption. Based on the above adjustments, function F is designed as below:

Tv(y) = α × Tv'(y) + (β/S) × ∑_{j=1}^{J} θ(s_j(y)) × (te_j(y)/s_j(y)) × (1 − |te_j(y)/s_j(y) − Tv'(y)|) × e^{−|t_e − t_j|²/τ},   (8)

where S = ∑_{j=1}^{J} θ(s_j(y)) × (1 − |te_j(y)/s_j(y) − Tv'(y)|) × e^{−|t_e − t_j|²/τ}; α and β are general weight parameters, β = (1/J) × ∑_{j=1}^{J} (1 − |te_j(y)/s_j(y) − Tv'(y)|)^η and α = 1 − β. Parameter η is applied to further adjust the contribution of aggregated evidence.
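A direct Python transcription of Algorithm 1 and Eq. (8) may help. Note that the exact placement of the exponent η inside β follows our reading of the formula above, and the parameter defaults follow Table 7; the function and variable names are ours.

```python
import math

def theta(I, sigma=20000.0):
    """Rayleigh CDF modeling the impact of the evidence count I."""
    return 1.0 - math.exp(-I**2 / (2.0 * sigma**2))

def trust_case1(te, s, t, t_e, tv_prev, tau=1.0, eta=0.15, W=10):
    """Function F of Eq. (8). te[j]: aggregated evidence te_j(y); s[j]:
    evidence count s_j(y); t[j]: slot time; tv_prev: local trust Tv'(y)."""
    te, s, t = te[-W:], s[-W:], t[-W:]      # keep at most the W latest slots
    J = len(te)
    num = S = dev_sum = 0.0
    for j in range(J):
        avg = te[j] / s[j]                  # average vote in slot j
        dev = 1.0 - abs(avg - tv_prev)      # closeness to the local opinion
        w = theta(s[j]) * dev * math.exp(-abs(t_e - t[j])**2 / tau)
        num += w * avg                      # numerator of the second term
        S += w                              # normalizer S
        dev_sum += dev ** eta
    beta = dev_sum / J                      # weight of the aggregated evidence
    alpha = 1.0 - beta                      # weight of the local trust value
    return alpha * tv_prev + beta * num / S
```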
Case 2

In Case 2, node x has no opinion on node y, i.e., Tv'(y) is not available. For instance, node x is a social networking reputation center that has no social interaction experiences with node y, but a number of aggregated evidence pieces accumulated in different time slots can be obtained. We apply Algorithm 2 to conduct trust evaluation at node x.

Algorithm 2: Trust evaluation at node x on node y when Tv'(y) is not available
Input: er(y) = {er_1(y), er_2(y), ..., er_J(y)}, t = {t_1, t_2, ..., t_J}, s(y) = {s_1(y), s_2(y), ..., s_J(y)}, and t_e;
Output: Tv(y).
- Node x decrypts each er_j(y) with its private key to get te(y) = {te_j(y)}, where te_j(y) = ∑_{i=1}^{N_j} te_j(z_i, y), j = 1, ..., J.
- Calculate node y's trust value at node x: Tv(y) = F'(te(y), s(y), t_e).

We design function F' to perform trust evaluation in Case 2 by considering the statistics of the trust evidence, the trust evaluation time t_e, and the aggregated trust evidence. Since Tv'(y) is not available in this case, we cannot rely on the deviation 1 − |te_j(y)/s_j(y) − Tv'(y)| to resist the bad mouthing attack. Instead, we adopt a trimmed mean method to filter the evidence data. We order the values

T_j = θ(s_j(y)) × (te_j(y)/s_j(y)) × e^{−|t_e − t_j|²/τ}   (j = 1, 2, 3, ..., J)

and trim those T_j (j < [εJ] + 1 or j > J − [εJ]) that are too big or too small. Herein, [εJ] is the biggest integer that is smaller than or equal to εJ, and ε is a parameter designated according to practical demand, e.g., ε ∈ [0.1, 0.2]. By applying the trimmed mean method, we can remove the evidence that deviates greatly from the average trust evidence, and thus prevent a potential bad mouthing attack. Hence, function F' is designed as below:

Tv(y) = (1/S') × ∑_{j=[εJ]+1}^{J−[εJ]} T_j,   (9)

where S' = ∑_{j=[εJ]+1}^{J−[εJ]} θ(s_j(y)) × e^{−|t_e − t_j|²/τ}.
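And a matching sketch of Algorithm 2 / Eq. (9), with the same caveats: theta() and the Table 7 defaults are as above, and the names are ours.

```python
import math

def theta(I, sigma=20000.0):
    return 1.0 - math.exp(-I**2 / (2.0 * sigma**2))

def trust_case2(te, s, t, t_e, tau=1.0, eps=0.1):
    """Function F' of Eq. (9): trimmed mean over the weighted slot values T_j."""
    J = len(te)
    items = []
    for j in range(J):
        w = theta(s[j]) * math.exp(-abs(t_e - t[j])**2 / tau)
        items.append((w * te[j] / s[j], w))   # (T_j, its weight part for S')
    items.sort(key=lambda item: item[0])      # order the T_j by value
    k = math.floor(eps * J)                   # [eps * J]
    kept = items[k:J - k]                     # drop the k smallest and k largest
    S_prime = sum(w for _, w in kept)
    return sum(T for T, _ in kept) / S_prime
```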

5. SECURITY ANALYSIS & PERFORMANCE EVALUATION

5.1 Security Analysis

The data confidentiality of our schemes is achieved by homomorphic encryption and traditional Public-Key Cryptography (PKC). Assuming that we have secure algorithms for both of them, the security of our schemes relies on the soundness of the proposed scheme design. Based on the assumptions and threat model of our research, the EP never shares the collected trust evidence with the AP, so the AP cannot get the encrypted trust evidence HE{HPK_AP, te(z_i, y)} from the EP, even though it could decrypt it. The AP cannot get the individual encrypted evidence pieces by eavesdropping on the communication channels between the nodes z_i and the EP. Hiding the plain content of evidence preserves the privacy of evidence providers, and there is no need for the evidence providers to reveal their identities during evidence provision. The security of our schemes can be proved based on the following propositions.

Proposition 1. The EP can get neither the raw evidence te(z_i, y) of any provider z_i about any node y, nor the aggregated evidence te_j(y) at any time slot t_j about y.

In Scheme 1, the EP collects the encrypted evidence HE{HPK_AP, te(z_i, y)} from the nodes. Without HSK_AP, the EP cannot decrypt HE{HPK_AP, te(z_i, y)} to obtain te(z_i, y). Neither can the EP obtain the plaintext of the aggregated evidence er_j(y), because er_j(y) is also encrypted with HPK_AP. Therefore, the EP can obtain neither the plaintext of individual evidence te(z_i, y) nor the plaintext of the aggregated evidence te_j(y). In Scheme 2, the EP collects the encrypted evidence HE{HPK_AP, te(z_i, y)} from the nodes and gets one new ciphertext HE{HPK_x, te_j(y) + r} encrypted with HPK_x from the AP. The EP can learn nothing from HE{HPK_AP, te(z_i, y)} and HE{HPK_x, te_j(y) + r} without knowing the corresponding secret keys HSK_AP and HSK_x, and will never obtain te(z_i, y) or te_j(y).

Proposition 2. The AP cannot obtain the raw evidence te(z_i, y) of any provider z_i.

In Scheme 1, the AP is fully trusted and does not collude with anyone, in particular not with the EP. The AP can get the encrypted aggregated trust evidence HE{HPK_AP, te_j(y)}, and thus can get te_j(y). But it has no information about the evidence provided by an individual node, i.e., te(z_i, y). Since the AP does not disclose te_j(y) to any other parties, Scheme 1 realizes the goal of protecting the privacy of evidence providers. Compared with Scheme 1, the data received by the AP in Scheme 2 is blinded by the EP as HE{HPK_AP, te_j(y) + r}, which hides the aggregated trust evidence even from the AP. The AP cannot infer and thus cannot know te(z_i, y). Scheme 2 achieves better security than Scheme 1, because it protects the aggregated evidence te_j(y) from both the EP and the AP.

Proposition 3. Our schemes can resist the collusion of up to N_j − 1 (j = 1, ..., J) nodes regarding time slot t_j, where N_j is the number of evidence providers in t_j.

As the EP cannot be fully trusted, two possible collusions exist: the requesting node x colludes with evidence providers z_i (z_i ∈ Z), or an evidence-providing node z_i (z_i ∈ Z) colludes with the EP. In case the requesting node x colludes with a number N' of evidence-providing nodes in order to obtain a specific node's trust evidence in t_j, they can only obtain the aggregated trust evidence of those nodes that do not participate in the collusion, provided that N' < N_j − 1 and te(z_i, y) is not null. This is because all the trust evidence submitted to the EP is encrypted with HPK_AP. In both schemes, node x can only obtain the aggregated trust evidence te_j(y) = ∑_{i=1}^{N_j} te_j(z_i, y). If the requesting node x wants to disclose the private data of a specified node z_g, it needs to know ∑_{i=1, i≠g}^{N_j} te_j(z_i, y). In this case, it needs to collude with all N_j − 1 nodes other than z_g. These N_j − 1 nodes may be curious about the private data of node z_g, but they would first have to disclose their own private data to x. Normally, they do not want to obtain information at the expense of sacrificing their own privacy. What is more, the nodes choose to provide their evidence independently and optionally. In our schemes, there is no need for z_i to provide its identity to the EP during evidence provision; finding the other N_j − 1 nodes among all nodes in a community is difficult in practice, which further obstructs this attack. In general, the proposed two schemes ensure the confidentiality of te_j(z_i, y) by resisting the collusion of up to N_j − 1 evidence providers in time slot t_j.

In both schemes, the EP cannot decrypt HE{HPK_AP, te(z_i, y)} without HSK_AP. Even though a node z_i can obtain te_j(y) from the EP, it cannot obtain the te(z_g, y) provided by a specific node z_g, because of the homomorphic encryption, unless node z_i colludes with the other N_j − 1 evidence-providing nodes. The EP likewise needs to collude with N_j − 1 evidence-providing nodes in order to obtain an individual node's trust evidence in t_j. Obviously, the bigger N_j is, the more difficult it is to collude and obtain te(z_i, y).

Proposition 4. The requesting node x cannot reveal the evidence of a specific node by analyzing the aggregated evidence te_j(y) (j = 1, ..., J).

The requesting node can obtain many pieces of aggregated evidence (te_j(y), j = 1, ..., J) that relate to different evidence providers. Referring to Proposition 3, it is difficult to obtain the plain evidence of a specific node from one piece of aggregated evidence. Note that the sets of evidence-providing nodes can be uncertain and different across different pieces of aggregated evidence; the aggregated trust evidence in different time slots is thus essentially independent. In particular, the requesting node has no idea of the real identities of the evidence providers, whose identities need not be shared with the EP or any other party during evidence collection. The uncertainty of provider identities makes it difficult to reveal the evidence of a specific node even when a lot of aggregated evidence can be obtained.

In practice, the EP, run by a cloud service provider, should follow the design of the proposed schemes due to business incentives. This greatly helps to ensure that trust pre-evaluation is securely conducted at the EP. Even if the trust pre-evaluation is not accurate due to malicious behaviors of evidence providers, the trust evaluation functions can still help overcome some attacks to some extent, which will be discussed and tested in Section 5.4.

5.2 Computation Complexity

Each proposed scheme involves four kinds of system entities: evidence-providing nodes, requesting nodes, the EP and the AP. We analyze the computational complexity of each entity in each scheme, respectively. To present the computation complexity in detail, we adopt RSA for PKC and digital signatures and apply the homomorphic encryption proposed by Paillier [20]. We first analyze the computational complexity of the main operations, such as SetupHom(k), SetupSig(k) and HE{HPK, m}. The basic operation SetupHom(k) generates a key pair for homomorphic encryption; the key generation mainly consists of one modular exponentiation and one Greatest Common Divisor (GCD) computation. Since the computation of a GCD with the Euclidean algorithm is much simpler than a modular exponentiation, we only take modular exponentiations (and later modular multiplications) into consideration. The operation SetupSig(k) additionally needs to do a modular inverse computation, and the same is true for SetupAsym(k); they have the same computation complexity. The operations related to RSA only contain modular exponentiations.
Each of the basic operations of RSA, i.e., encryption E{}, decryption D{}, signing Sign{} and signature verification Verf{}, conducts one modular exponentiation. Homomorphic encryption HE{HPK, m} conducts one modular exponentiation, while homomorphic decryption HD{HSK, c} performs three modular exponentiations. Table 2 summarizes the computational operations carried out by the different entities in the two schemes.

TABLE 2. COMPUTATIONAL OPERATIONS OF THE TWO SCHEMES

Entity | Procedure | Scheme 1 | Scheme 2
AP | System Setup | 2*ModExp | 2*ModExp
AP | Encryption Transform | 5*J*ModExp | 5*J*ModExp
EP | Trust Pre-evaluation | (N − J)*ModMult | N*ModMult + J*ModExp
EP | Evidence Recovery | - | J*ModMult + J*ModExp
Each Evidence Provider | Evidence Provision | 1*ModExp | 1*ModExp
Requesting Node | System Setup | 1*ModExp | 1*ModExp
Requesting Node | Trust Evaluation | J*ModExp | 3*J*ModExp

Notes: ModExp: modular exponentiation; ModMult: modular multiplication; N_j: the number of nodes providing evidence in time slot t_j; N = ∑_j N_j.

TABLE 3. COMPUTATION COMPLEXITY

Entity | Scheme 1 | Scheme 2
AP | O(J) | O(J)
EP | O(N − J) | O(N + J)
Evidence Provider | O(1) | O(1)
Requesting Node | O(J) | O(J)

The computation complexity of the two schemes is presented in Table 3. The differences between the two schemes lie in the operations at the EP. Because homomorphic encryption and decryption occupy most of the computing resources, the efficiency of our schemes mainly relies on the performance of the homomorphic algorithms. But with recent research advances on homomorphic encryption [22-24], its computational expense is becoming low.

In Table 4, we further compare our schemes with the privacy-preserving P2P reputation scheme 3PRep [19] in terms of computation complexity, communication cost and storage cost. In our two schemes, evidence-providing nodes send their encrypted trust evidence to the EP; the communication cost between the EP and the evidence-providing nodes is O(N). The communication cost between the EP and the AP is O(J) in both schemes. The communication cost between the AP and the requesting node is O(J) in Scheme 1, and the communication cost between the EP and the requesting node is O(J) in Scheme 2. The requesting node needs to store J pieces of aggregated evidence for trust evaluation in each scheme. The EP needs to store N pieces of encrypted evidence in Scheme 1, while in Scheme 2 the EP additionally stores J random numbers. Hence, the storage cost of the EP is O(N) in Scheme 1 and O(N + J) in Scheme 2.

From Table 4, we can observe that the performance of our two schemes is much better than that of 3PRep. Both of them place most computations at the EP, which is a powerful server with sufficient storage and computation resources. Notably, N should be much bigger than J (J ≪ N) in practice. Thus, our schemes greatly reduce the computation and communication burdens of a requesting node when compared with 3PRep. In addition, the computation load of each evidence provider is low, since only one homomorphic encryption is conducted to provide one piece of evidence. In particular, Scheme 1 performs more efficiently than Scheme 2, although Scheme 2 achieves better security by supporting a stricter security model.

TABLE 4. COMPARISON OF OUR SCHEMES WITH 3PREP [19]

Item | Scheme 1 | Scheme 2 | 3PRep
Computation Complexity at a Requesting Node | O(J) | O(J) | O(N²)
Communication Cost | O(N + J) | O(N + J) | O(N²)
Storage Cost at a Requesting Node | O(J) | O(J) | O(N²)
Storage Cost at EP | O(N) | O(N + J) | -

5.3 Implementation and Performance Evaluation

We implemented the proposed schemes by adopting Paillier's cryptosystem for homomorphic encryption/decryption and RSA for PKC, signature generation and verification. The implementation is based on OpenSSL and the Paillier Library [26]. We conducted a number of experiments to test the performance of the proposed schemes. The experiments were conducted on a laptop with an Intel Core i5-3337U CPU and 4 GB of RAM, running Ubuntu 13.04, which virtually executes the functions of the evidence-providing nodes, the evidence-requesting nodes, the AP and the EP.

Efficiency of Schemes

TABLE 5. EXECUTION TIME OF EACH OPERATION (UNIT: MILLISECOND)

Operation | Key Generation | Encryption | Homomorphic Multiplication | Decryption | Signing | Signature Verification
RSA | 63.9 | 0.14 | - | 1.50 | - | -
Sign | 58.8 | - | - | - | 1.44 | 0.11
Paillier | 237.4 | 15.6 | 2769.5 (10^5 pieces) | 11.5 | - | -

Table 5 shows the execution time of each operation. Key generation is the most time-consuming, but it is executed only once at system setup. As the modular exponentiation in Paillier's cryptosystem is more complex than that in RSA, it spends more time on homomorphic encryption and decryption. In this test, the aggregated evidence was a fusion of 10^5 evidence pieces.

TABLE 6. OPERATING TIME OF THE PROPOSED TWO SCHEMES (UNIT: MILLISECOND)

Operation | Scheme 1 | Scheme 2
Node x Registration (1) | 63.9 | 237.4
AP System Setup (2) | 296.2 | 296.2
Evidence Provision | 15.6 | 15.6
Trust Pre-evaluation (10^5 pieces) | 2769.5 | 2785.1
Encryption Transform | 11.75 | 27.21
Evidence Recovery | - | 12.6
Trust Evaluation without Running the Evaluation Algorithm | 1.5 | 11.5
Total Time without (1)(2) | 2798.35 | 2852.01
Total Time | 3158.45 | 3385.61

We tested the operating time of each scheme without running the trust evaluation function at the requesting node (since it takes less than 0.1 millisecond (ms) when J < 100); we report the performance test results of the two trust evaluation functions in Section 5.4. The operating times of the two schemes are compared in Table 6. We observe that one of the most time-consuming operations is trust pre-evaluation. As the EP has to deal with as many as 10^5 pieces of evidence, a processing time of less than 3 seconds is reasonable. We also observe that system setup and node registration take considerable time, but they are executed only once. Comparing the two schemes, Scheme 2 is more time-consuming in terms of node registration, trust pre-evaluation, encryption transform, evidence recovery and final trust evaluation. But Scheme 2 can ensure that no entity (including the AP) except the requesting node can obtain the aggregated trust evidence. Each scheme has its own advantage. If we do not count system setup and node registration, the operating times of the two schemes are quite similar, especially in the case of big data (evidence) processing. Scheme 1 is more applicable than Scheme 2 for small-scale evidence processing in cases where the security requirement is not strict. Scheme 2 is suitable for big evidence processing, especially when a stricter security requirement applies. We can adopt Scheme 1 or Scheme 2 in practice according to the concrete requirements.

Scalability of Schemes

Fig. 4 (a) Homomorphic multiplication time of EP with different numbers of evidence pieces. (The x-axis and y-axis are shown in a logarithmic scale.); (b) Computation time of EP with different numbers of aggregated evidence pieces (𝑁! = 10! ) in two schemes; (c) Computation time of AP with different numbers of aggregated evidence pieces in two schemes; (d) Computation time of a requesting node with different numbers of aggregated evidence pieces in two schemes. The number of evidence pieces and the total number of aggregated evidence (i.e., the number of time slots) affect the efficiency of our two schemes. In practice, there could be either big or small evidence data collected for the purpose of trust evaluation. In order to show the feasibility of our schemes in different application scenarios, it is important to test their performance with regard to different number of evidence providers and different size of trust evidence sets. Figure 4(a) shows the homomorphic multiplication time of EP with different numbers of evidence providers. We observe that the computation time is about 27s even when the number of evidence providers in one time slot reaches 106. The computation time is acceptable for EP, which is a cloud computing service provider that has sufficient resources for computation. We tested the performance of EP regarding trust pre-evaluation if different numbers of aggregated evidence  (𝐽 = 10, … , 100) should be computed when 𝑁! = 10! . Figure 4(b) shows the computation time at EP of two schemes. We observe that the computation time of two schemes are very similar in terms of big evidence process. Scheme 2 achieves better security with a bit more computation cost than Scheme 1. When 𝑁! = 10! and 𝐽 = 100, the computation time of trust pre-evaluation at EP is less than 300 seconds. Figure 4(c) shows the computation time of AP in two schemes with different numbers of aggregated evidence   𝐽 = 10, … ,100 . We observe that the computation time of AP increases lineally with the number of aggregated evidence. The computation time at AP in Scheme 2 is about double of that in Scheme 1. The computation cost should be acceptable for AP since it is a computing server. Choice of Scheme 1 or Scheme 2 relies on the trade off between the computation cost and security requirements. Figure 4(d) shows the computation time of a requesting node in two schemes with different numbers of

We did not include the time of running the trust evaluation algorithms in this test, because both algorithms take less than 0.1 millisecond (ms) when J < 100. Meanwhile, we did not apply W to select the latest aggregated evidence, in order to compare the two schemes fairly. The computation time at the requesting node increases almost linearly with the number of aggregated evidence. The computation time of Scheme 1 is very acceptable for the requesting node. In Scheme 2, the requesting node spends about 300 milliseconds when J reaches 20. This is reasonable, since W is generally not big when the influence of time on trust is considered. Note that the evidence collection time varies across application scenarios: the longer the time slot, the more evidence is collected and processed. A suitable duration for each time slot and a proper W can be set to adapt to different scenarios and relieve the computation burden of the requesting node. Our schemes are flexible enough to support various practical cases. For instance, EP can automatically collect evidence over multiple time slots based on the request of a particular node, and the evaluation is iterated with time. The previous trust evaluation result can serve as the local value of a new round of evaluation. This design is feasible for big data processing and greatly helps reduce the computation load at the requesting node.

5.4 Performance of Trust Evaluation

We further evaluated the performance of the trust evaluation functions with regard to accuracy, efficiency and robustness. To simplify the evaluation, we used a value in the range [0, 1] to represent trust evidence; the bigger the value, the more trust held by the trust evidence provider. In our tests, each node (either honest or dishonest) provides one piece of evidence in each time slot, and all collected evidence is aggregated once per time slot. Table 7 lists the default parameter settings of the tests.

TABLE 7. PARAMETER SETTINGS
Parameter                                          Value
N_j (j = 1, 2, ..., J)                             100000
σ                                                  20000
W                                                  10
Duration of a time slot (i.e., t_j - t_{j-1})      0.01 ms
τ                                                  1.0
ε                                                  0.1
η                                                  0.15

Accuracy

We designed an experiment to test the accuracy of trust evaluation. For testing Algorithm 1, we assume all nodes provide their evidence honestly and the requesting node holds its own trust opinion. Nine cases were tested (a toy replay of Case 1.4 is sketched after this list):
Case 1.1: each node always provides trust evidence 0.9 and the requesting node holds a trust value of 0.9;
Case 1.2: each node always provides trust evidence 0.1 and the requesting node holds a trust value of 0.1;
Case 1.3: each node always provides trust evidence 0.5 and the requesting node holds a trust value of 0.5;
Case 1.4: each node provides trust evidence increasing from 0.1 to 0.9 over time (from time slot 1 to 9) and the requesting node holds an initial trust value of 0.1;
Case 1.5: each node provides trust evidence decreasing from 0.9 to 0.1 over time and the requesting node holds an initial trust value of 0.9;
Case 1.6: each node provides trust evidence increasing from 0.1 to 0.9 over time and the requesting node always holds a trust value of 0.1;
Case 1.7: each node provides trust evidence decreasing from 0.9 to 0.1 over time and the requesting node always holds a trust value of 0.9;
Case 1.8: each node provides trust evidence increasing from 0.1 to 0.9 over time and the requesting node's trust value changes in the same way;
Case 1.9: each node provides trust evidence decreasing from 0.9 to 0.1 over time and the requesting node's trust value changes in the same way.

In Cases 1.4 and 1.5, tv′(y) is set to the trust evaluation result achieved in the previous time slot, since trust changes dynamically and is influenced by other nodes.
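As a hedged illustration of this test setup, the following sketch replays a Case 1.4 style evidence stream. The update rule and weight shown are a simple stand-in of our own (a weighted mix of the local trust value and the slot average), not the paper's Algorithm 1, which is defined earlier in the paper.

```python
def replay_case(slots, tv_local, w_local=0.5):
    """Stand-in update: mix the local trust value with each time slot's
    averaged evidence; w_local is an assumed (illustrative) weight."""
    trace = []
    for slot in slots:
        slot_avg = sum(slot) / len(slot)
        tv_local = w_local * tv_local + (1 - w_local) * slot_avg
        trace.append(round(tv_local, 3))
    return trace

# Case 1.4 style: evidence rises from 0.1 to 0.9 over time slots 1..9,
# each slot holding many identical pieces; initial local trust value 0.1.
rising = [[round(0.1 * t, 1)] * 1000 for t in range(1, 10)]
print(replay_case(rising, tv_local=0.1))   # trust value climbs toward 0.9
```

Under this stand-in, the local value dominates the early slots and new evidence shifts the result gradually, which mirrors only the qualitative behavior reported in Figure 5(a).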

Fig. 5 (a) Accuracy of trust evaluation based on Algorithm 1; (b) Accuracy of trust evaluation based on Algorithm 2.

Figure 5(a) shows the result of this test. Algorithm 1 evaluates trust accurately, as expected. The trust value changes according to the newly collected evidence, and the local trust value (if any) plays a key role in the final evaluation. This result conforms to our everyday experience: trust is a subjective opinion and changes dynamically with new knowledge and experiences. We then tested the accuracy of Algorithm 2 when the requesting node holds no initial or ongoing trust value, in the following five cases:
Case 2.1: each node always provides trust evidence 0.9;
Case 2.2: each node always provides trust evidence 0.1;
Case 2.3: each node always provides trust evidence 0.5;
Case 2.4: each node provides trust evidence increasing from 0.1 to 0.9 over time (from time slot 1 to 9);
Case 2.5: each node provides trust evidence decreasing from 0.9 to 0.1 over time.
Figure 5(b) shows the experimental result. Algorithm 2 also evaluates trust accurately, as expected.

Efficiency

The efficiency of the two trust evaluation functions is highly affected by the number of aggregated evidence. Figure 6 shows the computation time of the two functions with different J (J = 10, ..., 10^4). The computation time of both functions stays below 1000 milliseconds even when J is as high as 10^4. This indicates that both trust evaluation functions are efficient enough to run on a mobile node with limited processing capability, considering that J is normally not very big. We note that F is less efficient than F′ when J is smaller than 100, but becomes more efficient than F′ as J grows, because sorting T_j to perform the trimmed mean method takes time.
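This crossover is consistent with asymptotic cost: plain averaging is O(J), while a trimmed mean must sort in O(J log J). A rough micro-benchmark sketch (ours, with arbitrary parameters) makes the gap visible:

```python
import random
import timeit

values = [random.random() for _ in range(10_000)]   # J = 10^4 aggregated values
k = 1000                                            # floor(0.1 * J) trimmed per end

plain = timeit.timeit(lambda: sum(values) / len(values), number=100)
trimmed = timeit.timeit(
    lambda: sum(sorted(values)[k:-k]) / (len(values) - 2 * k), number=100)
print(f"plain mean: {plain:.4f} s, trimmed mean: {trimmed:.4f} s (100 runs each)")
```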

Fig. 6. The computation time of two functions with different numbers of aggregated evidence pieces (the x-axis is shown in a logarithmic scale).

Robustness

We tested the robustness of the two trust evaluation algorithms under two system attacks: the bad-mouthing attack and the on-off attack.
1) Bad-mouthing Attack: dishonest nodes intentionally inflate the trust of a bad target node or deflate that of a good one. Although the proposed schemes cannot detect and remove dishonest evidence providers, our trust evaluation functions can resist this attack to some extent. We simulated dishonest evidence providers occupying 0%, 5%, 10%, 15% or 20% of all system nodes (M = 10^5). We tested two cases: a) honest evidence is 0.9, dishonest evidence is 0.1, and Tv′(y) = 0.9; b) honest evidence is 0.1, dishonest evidence is 0.9, and Tv′(y) = 0.1.
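A minimal sketch of this simulation setting (ours; it only computes the per-slot aggregated mean for Case a, before any algorithm-side resistance is applied):

```python
M = 100_000                                   # total evidence providers
for frac in (0.00, 0.05, 0.10, 0.15, 0.20):   # fraction of dishonest nodes
    dishonest = int(M * frac)
    honest = M - dishonest
    # Case a): honest nodes report 0.9, dishonest nodes report 0.1.
    slot_mean = (honest * 0.9 + dishonest * 0.1) / M
    print(f"{frac:.0%} dishonest -> aggregated evidence mean {slot_mean:.2f}")
# With 20% dishonest providers the mean drops to 0.74, the value cited
# below for Case a); the algorithm must pull the result back toward 0.9.
```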

Fig. 7. Performance of Algorithm 1 under the bad-mouthing attack: a) honest evidence is 0.9, dishonest evidence is 0.1 and Tv′(y) = 0.9; b) honest evidence is 0.1, dishonest evidence is 0.9 and Tv′(y) = 0.1.

Figure 7 shows the performance of Algorithm 1 under the bad-mouthing attack in the above two cases. The dishonest evidence has some influence on the trust evaluation, but the influence does not grow more serious over time. With 20% dishonest providers, the average evidence value is 0.74 while the evaluation result based on Algorithm 1 is about 0.88 in Case a); the average evidence value is 0.16 while the evaluation result is 0.12 in Case b).

This implies that Algorithm 1 resists the bad-mouthing attack very well. Figure 8 shows the performance of Algorithm 2 under the bad-mouthing attack in the same two cases, except that Tv′(y) is not available. The dishonest evidence has some influence on the trust evaluation, but again the influence does not grow more serious over time. Algorithm 2 cannot resist the bad-mouthing attack as well as Algorithm 1; the main reason is that it lacks the knowledge needed to judge the credibility of evidence.

Fig. 8. Performance of Algorithm 2 under the bad-mouthing attack: a) honest evidence is 0.9 and dishonest evidence is 0.1; b) honest evidence is 0.1 and dishonest evidence is 0.9.

Fig. 9. Performance of Algorithm 1 under the on-off attack: a) honest evidence is 0.9, dishonest evidence is 0.1 and Tv′(y) = 0.9; b) honest evidence is 0.1, dishonest evidence is 0.9 and Tv′(y) = 0.1.

2) On-off Attack: dishonest nodes alternately provide honest and dishonest evidence. Our simulation setting is the same as above, except that the attack occurs in the last time slot before trust evaluation; that is, only the last aggregated evidence is dishonest. Figure 9 shows the performance of Algorithm 1 under the on-off attack. The dishonest evidence has some influence on the trust evaluation when the number of aggregated evidence is small.

When this number is big enough, however, the influence of dishonest evidence becomes negligible in both cases. Figure 10 shows the performance of Algorithm 2 under the on-off attack. We set ε = 0.1 so that the trimming takes effect when J ≥ 10. Again, the dishonest evidence influences the trust evaluation when the number of aggregated evidence is small, but the influence becomes negligible in both cases as this number grows. In particular, the trimmed mean method starts to work and resists the attack completely once J ≥ 10. Thus, the trimmed mean method can effectively resist the on-off attack when a sufficient number of aggregated evidence pieces is available.

Fig. 10. Performance of Algorithm 2 under the on-off attack: a) honest evidence is 0.9 and dishonest evidence is 0.1; b) honest evidence is 0.1 and dishonest evidence is 0.9.

Performance of the Trimmed Mean Method

We further tested the performance of the trimmed mean method by comparing the results with and without trimming. Since the trimming takes effect when J ≥ 10 (ε = 0.1), we tested five cases with different values of J: 1) J = 9 with trimming; 2) J = 10 with trimming; 3) J = 10 without trimming; 4) J = 11 with trimming; 5) J = 11 without trimming. In all cases, the last aggregated evidence contains evidence provided by different percentages of dishonest providers (from 0% to 30%). In the test, we assumed that honest evidence providers offer evidence of 0.9 in case a) and 0.1 in case b), while dishonest providers offer 0.1 in case a) and 0.9 in case b). From Figure 11, we can see that the trimmed mean method filters out inaccurate aggregated evidence and ensures the accuracy of the final trust evaluation once trimming takes effect.
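A minimal sketch of a symmetric trimmed mean consistent with this behavior (our formulation, assuming ⌊εJ⌋ values are dropped from each end of the sorted aggregated-evidence list):

```python
import math

def trimmed_mean(values, eps=0.1):
    """Sort, drop floor(eps * J) values from each end, then average."""
    k = math.floor(eps * len(values))
    kept = sorted(values)[k:len(values) - k]
    return sum(kept) / len(kept)

# With eps = 0.1, k = 0 for J = 9 (no trimming) and k = 1 for J >= 10,
# matching the observation that trimming starts to work at J = 10.
honest_slots = [0.9] * 9
print(trimmed_mean(honest_slots + [0.1]))      # J = 10: bad slot trimmed -> 0.9
print(trimmed_mean(honest_slots[:8] + [0.1]))  # J = 9: no trimming -> about 0.81
```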

Fig. 11. Effect of the trimmed mean method in resisting attacks: a) honest evidence is 0.9 and dishonest evidence is 0.1; b) honest evidence is 0.1 and dishonest evidence is 0.9.

6. APPLICATION SCENARIOS

Trust evaluation based on data aggregation is widely applied in many technical fields, such as E-Commerce, the Internet of Things (IoT), sensor networks, social networking and cloud computing. Each of the two proposed schemes has its own characteristics and suits different applications. Scheme 1 has a lower computational cost at EP and the requesting nodes, although it is less secure than Scheme 2 and requires AP to be fully trusted. It can therefore be applied in scenarios that demand fast processing at EP and the requesting nodes. For example, it suits evaluating the trust of mobile applications [38]: the requestors are often mobile devices with limited computation capability, while AP can be served by a mobile app store that offers general application recommendations. There is no harm in AP knowing the aggregated trust evidence of a mobile app, since it reflects the app's popularity. Similarly, Scheme 1 can be applied to evaluate the trust of various products, with an e-shopping center serving as AP. Furthermore, Scheme 1 can be applied in mobile social networking to evaluate the trust of social networking nodes with privacy preservation; in this case, the social networking server can serve as AP and EP can be deployed by a cloud service provider.
Scheme 2 achieves better security at the cost of higher computation. In practice, it is more suitable for evaluating the trust of enterprises, mobile operators or organizational units, where processing the collected evidence with advanced privacy preservation is of great significance. The evaluated targets are normally business competitors, and the requesting parties cannot trust any other party, so applying Scheme 2 can easily gain customer acceptance in this scenario. Two semi-trusted cloud service providers can be adopted to deploy AP and EP; with their enormous computing power, they can handle large volumes of data while realizing authentication and access control on the aggregated trust evidence. Moreover, Scheme 2 can be widely used for data aggregation in various IoT scenarios, e.g., eHealth services, where medical and clinical researchers collect and process the health statistics of patients. To ensure patients' privacy, their sensitive data should be well protected at every step of data processing. It is essential to hide the provided plain data and the plain aggregated data from all processing parties (i.e., EP and AP) and to share the aggregated data with unspecified requestors only through strict access control.

7. CONCLUSION

In this paper, we proposed two feasible schemes to preserve the privacy of trust evidence providers in trust evaluation. The first scheme applies PKC and additive homomorphism to keep individual evidence confidential from other involved entities. The second scheme relies only on additive homomorphism to achieve privacy preservation without disclosing either the individual evidence or the aggregated trust evidence to any unauthorized entity. It ensures privacy in a stronger way at the expense of a higher computation cost (especially at the trust-evaluating node), although this cost is not significant for big evidence data processing at EP. We further presented two trust evaluation algorithms that cooperate with the proposed schemes in different application scenarios. Extensive analysis and performance evaluation show that our schemes are feasible with regard to security, computation complexity, communication cost, storage cost and efficiency, especially for supporting big data processing. Both schemes are secure enough for PPTE while keeping the computation cost low at evidence-providing nodes and evidence-requesting/evaluating nodes. In addition, the experimental tests showed that the proposed trust evaluation algorithms are accurate, efficient and robust against typical attacks raised by malicious evidence providers. Our schemes address practical demands and benefit from the advance of additive homomorphic encryption. They can be applied to support PPTE that contains trust evidence summation; thus, they are feasible to adopt in practice and can be deployed in many situations. If fully homomorphic encryption becomes practical, more computations could be performed at EP. However, how to achieve verifiable computing at EP and overcome internal attacks without breaching privacy remains an open issue, which is a topic of our future work.

ACKNOWLEDGEMENTS This work is sponsored by the Program of Introducing Talents of Discipline to Universities under Grant B08038, the PhD grant (JY0300130104) of Chinese Educational Ministry, the initial grant of Chinese Educational Ministry for researchers from abroad (JY0600132901), the grant of Shaanxi Province for excellent researchers from abroad (680F1303) and Aalto University.

REFERENCES
[1] Y. L. Sun, W. Yu, Z. Han, and K. Liu, "Information theoretic framework of trust modeling and evaluation for ad hoc networks," IEEE Journal on Selected Areas in Communications, vol. 24, no. 2, pp. 305-317, 2006.
[2] X. Li, F. Zhou, and X. Yang, "A multi-dimensional trust evaluation model for large-scale P2P computing," Journal of Parallel and Distributed Computing, vol. 71, no. 6, pp. 837-847, 2011.
[3] X. N. Wu, R. L. Zhang, B. Zeng, and S. Y. Zhou, "A trust evaluation model for cloud computing," Procedia Computer Science, vol. 17, pp. 1170-1177, 2013.
[4] R. He, J. W. Niu, M. Yuan, and J. P. Hu, "A novel cloud-based trust model for pervasive computing," in Proceedings of the Fourth International Conference on Computer and Information Technology, pp. 693-700, 2004.
[5] Z. Yan and C. Prehofer, "Autonomic trust management for a component based software system," IEEE Transactions on Dependable and Secure Computing, vol. 8, no. 6, pp. 810-823, 2011.
[6] C. Lin, V. Varadharajan, Y. Wang, and P. Vineet, "Enhancing grid security with trust management," in Proceedings of the IEEE International Conference on Services Computing, pp. 303-310, 2004.
[7] P. Resnick and R. Zeckhauser, "Trust among strangers in internet transactions: empirical analysis of eBay's reputation system," Advances in Applied Microeconomics, vol. 11, pp. 127-157, 2002.
[8] F. Hao, G. Y. Min, M. Lin, C. Q. Luo, and L. T. Yang, "MobiFuzzyTrust: an efficient fuzzy trust inference mechanism in mobile social networks," IEEE Transactions on Parallel and Distributed Systems, vol. 25, no. 11, pp. 2944-2955, 2014.
[9] D. J. He, C. Chen, S. Chan, J. J. Bu, and A. V. Vasilakos, "ReTrust: attack-resistant and lightweight trust management for medical sensor networks," IEEE Transactions on Information Technology in Biomedicine, vol. 16, no. 4, pp. 623-632, 2012.
[10] D. He, C. Chen, S. Chan, J. J. Bu, and A. V. Vasilakos, "A distributed trust evaluation model and its application scenarios for medical sensor networks," IEEE Transactions on Information Technology in Biomedicine, vol. 16, no. 6, pp. 1164-1175, 2012.
[11] A. Singh and L. Liu, "TrustMe: anonymous management of trust relationships in decentralized P2P systems," in Proceedings of the Third IEEE International Conference on Peer-to-Peer Computing, pp. 142-149, 2003.
[12] A. Michalas and N. Komninos, "The lord of the sense: a privacy preserving reputation system for participatory sensing applications," in Proceedings of the IEEE Symposium on Computers and Communication, pp. 1-6, 2014.
[13] X. Wang, W. Cheng, P. Mohapatra, and T. Abdelzaher, "ARTSense: anonymous reputation and trust in participatory sensing," in Proceedings of INFOCOM, pp. 2517-2525, 2013.
[14] X. Wang, W. Cheng, P. Mohapatra, and T. Abdelzaher, "Enabling reputation and trust in privacy-preserving mobile sensing," IEEE Transactions on Mobile Computing, vol. 13, no. 12, pp. 2777-2790, 2014.
[15] Z. Erkin, T. Veugen, and R. L. Lagendijk, "Generating private recommendations in a social trust network," in Proceedings of the International Conference on Computational Aspects of Social Networks, pp. 82-87, 2011.
[16] E. Pavlov, J. S. Rosenschein, and Z. Topol, "Supporting privacy in decentralized additive reputation systems," in Proceedings of the Second International Conference on Trust Management, pp. 108-119, 2004.
[17] D. Yao, R. Tamassia, and S. Proctor, "Private distributed scalar product protocol with application to privacy-preserving computation of trust," in Proceedings of the International Federation for Information Processing, vol. 238, Trust Management, pp. 1-16, 2007.
[18] C. A. Melchor, B. Ait-Salem, and P. Gaborit, "A collusion-resistant distributed scalar product protocol with application to privacy-preserving computation of trust," in Proceedings of the Eighth IEEE International Symposium on Network Computing and Applications, pp. 140-147, 2009.
[19] R. Nithyanand and K. Raman, "Fuzzy privacy preserving peer-to-peer reputation management," IACR Cryptology ePrint Archive, p. 442, 2009.
[20] P. Paillier, "Public-key cryptosystems based on composite degree residuosity classes," in Proceedings of Advances in Cryptology - EUROCRYPT'99, Springer Berlin Heidelberg, pp. 223-238, 1999.
[21] T. ElGamal, "A public key cryptosystem and a signature scheme based on discrete logarithms," in Proceedings of Advances in Cryptology, pp. 10-18, 1985.
[22] I. Damgård and M. Jurik, "A generalisation, a simplification and some applications of Paillier's probabilistic public-key system," in Proceedings of Public Key Cryptography, pp. 119-136, 2001.
[23] D. Catalano, R. Gennaro, N. H. Graham, and P. Q. Nguyen, "Paillier's cryptosystem revisited," in Proceedings of the 8th ACM Conference on Computer and Communications Security, pp. 206-214, 2001.

[24] M. J. Jurik, "Extensions to the Paillier cryptosystem with applications to cryptological protocols," BRICS, 2003.
[25] Z. Yan, Trust Management in Mobile Environments - Usable and Autonomic Models, IGI Global, 2013.
[26] J. Bethencourt, "Paillier Library," available at: http://hms.isi.jhu.edu/acsc/libpaillier/#documentation, Jan. 2010.
[27] Z. Yan, P. Zhang, and A. V. Vasilakos, "A survey on trust management for Internet of Things," Journal of Network and Computer Applications, vol. 42, pp. 120-134, 2014.
[28] Z. Yan, Y. Chen, and Y. Shen, "PerContRep: a practical reputation system for pervasive content services," Supercomputing, vol. 70, no. 3, pp. 1051-1074, 2014.
[29] Z. Yan, Y. Chen, and Y. Shen, "A practical reputation system for pervasive social chatting," Computer and System Sciences, vol. 79, no. 5, pp. 556-572, 2013.
[30] E. Ayday, J. L. Raisaro, J. Hubaux, and J. Rougemont, "Protecting and evaluating genomic privacy in medical tests and personalized medicine," in Proceedings of WPES, ACM, pp. 95-106, 2013.
[31] C. Castelluccia, A. C.-F. Chan, E. Mykletun, and G. Tsudik, "Efficient and provably secure aggregation of encrypted data in wireless sensor networks," ACM Transactions on Sensor Networks, vol. 5, no. 3, 2009.
[32] T.-H. H. Chan, E. Shi, and D. Song, "Privacy-preserving stream aggregation with fault tolerance," in Proceedings of Financial Cryptography, volume 7397 of LNCS, Springer, pp. 200-214, 2012.
[33] Z. Erkin and G. Tsudik, "Private computation of spatial and temporal power consumption with smart meters," in ACNS, volume 7341 of LNCS, Springer, pp. 561-577, 2012.
[34] M. Joye and B. Libert, "A scalable scheme for privacy-preserving aggregation of time-series data," in Financial Cryptography, volume 7859 of LNCS, Springer, pp. 111-125, 2013.
[35] Q. Li and G. Cao, "Efficient privacy-preserving stream aggregation in mobile sensing with low aggregation error," in Proceedings of Privacy Enhancing Technologies, volume 7981 of LNCS, Springer, pp. 60-81, 2013.
[36] Q. Li, G. Cao, and T. F. L. Porta, "Efficient and privacy-aware data aggregation in mobile sensing," IEEE Transactions on Dependable and Secure Computing, vol. 11, no. 2, pp. 115-129, 2014.
[37] B. Wang, M. Li, S. S. M. Chow, and H. Li, "A tale of two clouds: computing on data encrypted under multiple keys," in Proceedings of IEEE CNS 2014, pp. 337-345, 2014.
[38] T. Dang, Z. Yan, F. Tong, W. D. Zhang, and P. Zhang, "Implementation of a trust-behavior based reputation system for mobile applications," in Proceedings of BWCCA 2014, pp. 221-228, 2014.

Brief Biographies of Authors:

Zheng Yan is a computer scientist with interests in trust, security and privacy. She is currently a professor at Xidian University, China, and a visiting professor at Aalto University, Finland. Before joining academia in 2011, she had worked as a senior researcher at the Nokia Research Center, Helsinki, since 2000. She received her Ph.D. in Electrical Engineering from Helsinki University of Technology. She has authored more than 100 publications and two books. She is the inventor of 10 patents and 29 patent applications. She serves as an editor or guest editor for many reputable journals and as an organization committee member for numerous international conferences. She is a senior member of IEEE.

Contact her at: [email protected].

Wenxiu Ding received her B.Eng. degree in information security from Xidian University, Xi'an, China, in 2012. She is currently pursuing her Ph.D. degree in information security in the School of Telecommunications Engineering at Xidian University. Her research interests include verifiable computation, privacy preservation, data mining and trust management. Miss Ding can be reached at [email protected].

Valtteri Niemi is currently a professor at the University of Helsinki, doing research in cryptology and its applications. Professor Niemi has also served as a high-end foreign expert at Xidian University since 2014. Dr. Niemi participated in the 3GPP SA3 (security) standardization group from its beginning, and during 2003-2009 he was the chairman of the group. Before 3GPP, he took part in ETSI SMG 10 for GSM security specification work. He has published more than 70 scientific articles and is a co-author of four books and more than 20 patent families.

Athanasios V. Vasilakos is currently a professor at Lulea University of Technology, Sweden. He has served or is serving as an Editor and/or Guest Editor for many technical journals, such as IEEE Transactions on Network and Service Management, IEEE Transactions on Cloud Computing, IEEE Transactions on Information Forensics and Security, IEEE Transactions on Cybernetics, IEEE Transactions on Information Technology in Biomedicine, ACM Transactions on Autonomous and Adaptive Systems, and IEEE Journal on Selected Areas in Communications. He is also a General Chair of the European Alliances for Innovation (www.eai.eu).