Performance Testing of Semantic Publish/Subscribe Systems

Martin Murth¹, Dietmar Winkler², Stefan Biffl², Eva Kühn¹, and Thomas Moser²

¹ Institute of Computer Languages, Vienna University of Technology
² Christian Doppler Laboratory "Software Engineering Integration for Flexible Automation Systems", Vienna University of Technology
{mm,eva}@complang.tuwien.ac.at, {dietmar.winkler,stefan.biffl,thomas.moser}@tuwien.ac.at

Abstract. Publish/subscribe mechanisms support clients in observing knowledge represented in semantic repositories and responding to knowledge changes. Currently available implementations of semantic publish/subscribe systems differ significantly with respect to performance and functionality. In this paper we present an evaluation framework for systematically evaluating publish/subscribe systems and its application to identify performance bottlenecks and optimization approaches.

1 Introduction and Motivation

Semantic repositories enable the management of highly dynamic knowledge bases [4]. Semantic publish/subscribe mechanisms support the systematic notification of knowledge changes [3]: queries (e.g., formulated in SPARQL) are registered on a repository as subscriptions, and updates of the knowledge base trigger notifications to the individual subscribers. Several publish/subscribe mechanisms have been developed in the past, e.g., the Semantic Event Notification System (SENS) [3], driven by varying application requirements (e.g., focus on functional behavior or on performance). Nevertheless, a key question is how to evaluate publish/subscribe systems efficiently with a focus on performance measures. Several benchmark frameworks, e.g., LUBM [1], [4], focus on assessing the load, reasoning, and query performance of semantic repositories. However, a standardized approach for evaluating semantic publish/subscribe mechanisms is not yet available. We therefore developed the SEP-BM (Semantic Event Processing Benchmark) framework, which focuses on two common performance metrics, i.e., notification time and publication throughput, and implemented tool support for measuring these metrics for semantic publish/subscribe systems [4].
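To make the underlying publish/subscribe concept concrete, the following minimal sketch (in Python with rdflib; the Subscription class, publish() function, and example data are hypothetical illustrations and not part of SENS) registers a SPARQL query as a subscription and re-evaluates it on every publication, notifying the subscriber when new result bindings appear.

    # Minimal, illustrative semantic publish/subscribe loop (not the SENS implementation).
    from rdflib import Graph, Namespace, Literal

    EX = Namespace("http://example.org/")

    class Subscription:
        """A registered SPARQL query plus a callback invoked when its result set grows."""
        def __init__(self, query, callback):
            self.query = query
            self.callback = callback
            self.last_result = set()

        def evaluate(self, graph):
            result = {tuple(row) for row in graph.query(self.query)}
            if result != self.last_result:
                self.callback(result - self.last_result)  # notify about new bindings
                self.last_result = result

    graph = Graph()
    subscriptions = []

    def publish(triple):
        """A publication is a knowledge base update; it triggers re-evaluation of all subscriptions."""
        graph.add(triple)
        for sub in subscriptions:
            sub.evaluate(graph)

    # Example subscription: notify whenever a new sensor reading above 100 appears.
    query = """
        PREFIX ex: <http://example.org/>
        SELECT ?sensor ?value WHERE {
            ?sensor ex:hasReading ?value .
            FILTER (?value > 100)
        }
    """
    subscriptions.append(Subscription(query, lambda new: print("notification:", new)))

    publish((EX.sensor1, EX.hasReading, Literal(42)))    # no notification
    publish((EX.sensor2, EX.hasReading, Literal(180)))   # triggers a notification

The re-evaluation of every subscription on every update is exactly the kind of naive strategy whose cost the metrics below are meant to expose.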

2 SEP-BM Benchmark Framework

Figure 1 presents the concept of the novel benchmark framework, which consists of a benchmark base configuration and a benchmark runner. The benchmark base configuration comprises data sets based on an ontology and 20 query definitions used as subscriptions for testing the performance measures: the configuration generator provides sequences of publication operations (i.e., scenarios); the reference data generator provides traceability information on publication/notification relationships for measurement purposes. The benchmark runner executes the scenarios and performs the data analyses, i.e., analyzing notification time and publication throughput.

Fig. 1. Benchmark framework components
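The following sketch illustrates how a benchmark runner could derive the two metrics; run_scenario and the reuse of the publish()/Subscription names from the previous sketch are assumptions for illustration, not SEP-BM code, and the mapping of notifications to their triggering publications is simplified to synchronous delivery rather than the reference data used by SEP-BM.

    import time

    # Illustrative benchmark runner: executes a scenario (a sequence of publication
    # operations), records timestamps, and derives notification time and throughput.
    def run_scenario(scenario, subscriptions):
        publication_times = []      # timestamp of each publication
        notification_delays = []    # measured notification time per notification

        def on_notify(new_results):
            # Notification time = delay between the triggering publication and the notification.
            # Assumes synchronous delivery; SEP-BM uses reference data for this traceability.
            notification_delays.append(time.perf_counter() - publication_times[-1])

        for sub in subscriptions:
            sub.callback = on_notify

        start = time.perf_counter()
        for triple in scenario:
            publication_times.append(time.perf_counter())
            publish(triple)         # knowledge base update; may trigger notifications
        elapsed = time.perf_counter() - start

        return {
            "publication_throughput": len(scenario) / elapsed,   # publications per second
            "avg_notification_time": sum(notification_delays) / max(len(notification_delays), 1),
        }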

Focusing on these success-critical aspects of publish/subscribe performance, we applied a set of defined test scenarios to SENS and to an optimized (heuristic-supported) variant. Not surprisingly, the observation of a first SENS [2] prototype (without heuristic optimization) showed that notification times grow and publication throughput rates drop when complex reasoning is required and/or when input data sets are large and subscription queries have low selectivity. Based on these findings we developed a heuristics-based optimization mechanism [2] and identified the general prerequisites for a successful employment of heuristic optimizations, i.e., the conditions that typically lead to performance problems: (a) the need for complex reasoning and querying tasks during the evaluation of subscriptions and (b) large input sets and low selectivity of the subscription query statements. See Murth et al. [4] for details of the SEP-BM framework and the initial evaluation results; an illustrative sketch of such a heuristic follows below.
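The kind of heuristic that exploits these conditions can be illustrated with a simple pre-filter on top of the earlier Subscription sketch (purely hypothetical, i.e., not the mechanism of [2]): a publication triggers full query re-evaluation only if its predicate occurs in the subscription query at all, avoiding expensive reasoning and querying for irrelevant updates in large, low-selectivity settings.

    import re

    # Illustrative pre-filtering heuristic (assumption for illustration only; it does not
    # reproduce the heuristics framework of [2]). Assumes the query uses the ex: prefix
    # and that publish() passes the published triple to evaluate_on_publication().
    def relevant_predicates(query_text):
        """Very rough extraction of ex:-prefixed predicate names mentioned in the query text."""
        return set(re.findall(r"ex:(\w+)", query_text))

    class FilteredSubscription(Subscription):
        def __init__(self, query, callback):
            super().__init__(query, callback)
            self.predicates = relevant_predicates(query)

        def evaluate_on_publication(self, graph, published_triple):
            predicate_local_name = published_triple[1].split("/")[-1]
            if predicate_local_name not in self.predicates:
                return                   # heuristic: the publication cannot affect this query
            self.evaluate(graph)         # fall back to full query evaluation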

3 Conclusion

According to the initial evaluation, the SEP-BM framework [4] helps to (a) enable the detection of performance bottlenecks in publish/subscribe systems, (b) identify root causes of performance bottlenecks and indicate candidate optimization approaches, and (c) support quantifiable performance measurement for evaluating optimization mechanisms.

Acknowledgments. This work has been supported by the Christian Doppler Forschungsgesellschaft and the BMWFJ, Austria.

References

[1] Guo, Y., Pan, Z., Heflin, J.: LUBM: A benchmark for OWL knowledge base systems. Journal of Web Semantics 3(2-3), 158–182 (2005)
[2] Murth, M., Kühn, E.: A heuristics framework for semantic subscription processing. In: Aroyo, L., Traverso, P., Ciravegna, F., Cimiano, P., Heath, T., Hyvönen, E., Mizoguchi, R., Oren, E., Sabou, M., Simperl, E. (eds.) ESWC 2009. LNCS, vol. 5554, pp. 96–110. Springer, Heidelberg (2009)
[3] Murth, M., Kühn, E.: Knowledge-based coordination with a reliable semantic subscription mechanism. In: Proc. 24th ACM Symposium on Applied Computing (SAC) - Special Track on Coordination Models, Languages and Applications, pp. 1374–1380. ACM, New York (2009)
[4] Murth, M., Winkler, D., Biffl, S., Kühn, E., Moser, T.: Performance Testing of Semantic Publish/Subscribe Systems. Technical Report IFS:QSE 10/05, TU Vienna (2010), http://qse.ifs.tuwien.ac.at/publication/IFS-QSE-10-05.pdf