Model-Driven Performance Measurement and Assessment with MoDePeMART

Marko Bošković, Athabasca University, Canada ([email protected])
Wilhelm Hasselbring, University of Kiel, Germany ([email protected])

MODELS 2009, Denver, Colorado, October 7, 2009

Motivation
• Current research is mostly dedicated to performance prediction
• Problems with measurement:
  – Differences between modeling and implementation constructs
  – Performance analysts are usually not experts in design modeling languages
  – Inconsistency between the data structures for data collection and those for metrics computation


MoDePeMART

[Overview figure: a design model with classes A and B and scenarios SA and SB, an assessment specification (MRT : SimpleAssessment over a :Duration metric), generated platform code with probes, generated SQL DDL table definitions with DML initialization code, and generated SQL DML assessment queries.]

The MoDePeMART approach in steps:
1. Design
2. Instrumentation
3. Transformation (and compilation)
4. Deployment
5. Initialization
6. Testing / data collection (measurements)
7. Assessment
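The slide does not reproduce the generated artifacts themselves. As a rough, hypothetical sketch of what steps 3 to 6 could produce on the MySQL target (the table and column names are assumptions for illustration, not MoDePeMART's actual generated schema), the DDL and the probe DML might look like:

  -- Hypothetical sketch only; the schema actually generated by MoDePeMART may differ.

  -- (3) Transformation: SQL DDL defining a table for collected measurements.
  CREATE TABLE measured_event (
    event_id      BIGINT AUTO_INCREMENT PRIMARY KEY,
    scenario_name VARCHAR(64) NOT NULL,  -- e.g. 'SA' or 'SB' from the design model
    start_time_ms BIGINT      NOT NULL,  -- probe-recorded start, milliseconds since the epoch
    end_time_ms   BIGINT      NOT NULL   -- probe-recorded end, milliseconds since the epoch
  );

  -- (6) Data collection: each probe inserts one row per observed scenario event.
  INSERT INTO measured_event (scenario_name, start_time_ms, end_time_ms)
  VALUES ('SA', 1254902400000, 1254902400120);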


Outline
• Software Performance and Evaluation
• Modeling Dimensions
• The Metamodel
• Evaluation
• Future work


Software Performance and Evaluation
• Software performance is the degree to which a software system or component meets its objectives for timeliness (C. U. Smith and L. G. Williams, Performance Solutions: A Practical Guide to Creating Responsive, Scalable Software, Addison-Wesley, 2001)
• Performance metrics:
  – Response time (min, max, distribution, ...)
  – Throughput (jobs/sec)
  – Utilization
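As a concrete illustration, the first two metrics can be computed directly in SQL over collected measurements. The query below uses the hypothetical measured_event table sketched earlier, not an official MoDePeMART artifact:

  -- Response time statistics per scenario (milliseconds).
  SELECT scenario_name,
         MIN(end_time_ms - start_time_ms) AS min_response_ms,
         MAX(end_time_ms - start_time_ms) AS max_response_ms,
         AVG(end_time_ms - start_time_ms) AS mean_response_ms
  FROM measured_event
  GROUP BY scenario_name;

  -- Throughput in jobs/sec: completed events divided by the observation window length.
  SELECT COUNT(*) / ((MAX(end_time_ms) - MIN(start_time_ms)) / 1000.0) AS throughput_jobs_per_sec
  FROM measured_event;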


Outline
• Software Performance and Evaluation
• Modeling Dimensions
  – Transformational Modeling Dimension
  – Reactive Modeling Dimension
• The Metamodel
• Evaluation
• Future work


Transformational Modeling Dimension
• Algorithmic problem decomposition (E. W. Dijkstra, A Discipline of Programming, Prentice Hall, 1976)
• Modeling constructs:
  – Simple commands
  – Composite commands (command blocks): invocation and sequential composition
  – Guarded command (if (c); opt in UML)
  – Guarded command set (switch (c); alternative in UML)
  – Loop


Reactive Modeling Dimension
• Stimulus/response behaviour
• The response depends on the stimuli and the current state
• Modeling constructs:
  – States
  – Transitions
  – Stimuli/signals


Outline
• Software Performance and Evaluation
• Modeling Dimensions
• The Metamodel
  – Transformational context and metamodel part
  – Reactive context and metamodel part
  – Assessment and metrics metamodel part
• Evaluation
• Future work


Transformational Context

[Example model (figure): obtaining a music video from the database.]

How do we measure the response time of obtaining a music video from the database?


Transformational Context Metamodel

[Metamodel excerpt (class diagram). Key metaclasses: InstrumentedElement, Scenario, SubScenario, ScenarioEvent, MeasuredEvent, StateCondition, Negation, and Group, together with the relationships Root, Contain, Precede, and Alternatives.]
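To tie this back to the question on the previous slide: a measurement specified over a ScenarioEvent could ultimately be answered by a generated query of roughly the following shape. The event name and the measured_event table are illustrative assumptions, not names defined by MoDePeMART:

  -- Hypothetical per-event response time of the 'getMusicVideo' scenario event.
  SELECT event_id,
         end_time_ms - start_time_ms AS response_time_ms
  FROM measured_event
  WHERE scenario_name = 'getMusicVideo'
  ORDER BY response_time_ms DESC;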


Reactive Context

What is the impact of compression?


Reactive Context Metamodel

[Metamodel excerpt (class diagram). Key metaclasses: InstrumentedElement, ScenarioEvent, StateCondition, and ConditionElement (with instrumentedElementName : String and conditionRelation : ConditionRelation), composed with the operators Binary, AND, OR, and NOT. The ConditionRelation enumeration comprises contains, during, overlaps, and overlapped.]
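To give a flavour of how such a StateCondition could be evaluated over the collected data, consider an overlaps-style relation between a retrieval and a compression interval. This again uses the hypothetical measured_event table and assumes, purely for illustration, that compression phases are recorded as events of their own:

  -- Hypothetical check: retrievals whose execution overlaps a recorded 'compression' interval.
  SELECT r.event_id,
         r.end_time_ms - r.start_time_ms AS response_time_ms
  FROM measured_event AS r
  JOIN measured_event AS c
    ON  c.scenario_name = 'compression'
    AND r.start_time_ms < c.end_time_ms   -- the retrieval starts before the compression ends
    AND c.start_time_ms < r.end_time_ms   -- the compression starts before the retrieval ends
  WHERE r.scenario_name = 'getMusicVideo';

Comparing the response times of these rows with those of retrievals that overlap no compression interval gives a first answer to "what is the impact of compression?".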


Metrics and Assessment Metamodel
• Metrics metamodel part:
  – Duration (min, max, standard deviation, percentile, distribution)
  – Occurrence rate
  – Percentage
• Assessment metamodel part (see the sketch below):
  – Simple assessments
  – Composite assessments
  – Specification of a validity period
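As a sketch of how a simple assessment with a validity period might be evaluated on the MySQL target (the threshold, the scenario name, and the epoch-millisecond validity bounds are illustrative assumptions, not actual generated code):

  -- Hypothetical simple assessment: maximum response time of 'getMusicVideo'
  -- must stay below 200 ms within the validity period.
  SELECT MIN(end_time_ms - start_time_ms)         AS min_ms,
         MAX(end_time_ms - start_time_ms)         AS max_ms,
         AVG(end_time_ms - start_time_ms)         AS mean_ms,
         STDDEV_SAMP(end_time_ms - start_time_ms) AS stdev_ms,
         (MAX(end_time_ms - start_time_ms) < 200) AS assessment_holds  -- 1 = satisfied, 0 = violated
  FROM measured_event
  WHERE scenario_name = 'getMusicVideo'
    AND start_time_ms BETWEEN 1254902400000 AND 1254906000000;  -- validity period (epoch ms)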


Outline
• Software Performance and Evaluation
• Modeling Dimensions
• The Metamodel
• Evaluation
  – PEMA Profile
  – Comparative Analysis
  – Limitations
• Future work


PEMA Profile (1/2)
• Class diagrams and state diagrams
• Implemented in MagicDraw
• Target platforms are Java RMI and MySQL
• Transformations with openArchitectureWare


PEMA Profile (2/2)


Comparative Analysis

[Table: comparative analysis of related work. Legend: + facilitated, o partially facilitated, - not facilitated.]


Limitations
• Applicable to systems with non-communicating concurrency
• Synchronous communication
• Scenarios without loop-backs
• The granularity of the timing mechanism is larger than the execution of a single command
• Job flow assumption
• Arrival pattern recognition is not supported
• Data assessment is not supported


Future Work
• Metamodel extensions:
  – Utilization
  – Data assessment
  – Loop iteration numbers
  – Arrival patterns
• Profile development:
  – Usage of suitable diagrams
  – Application to activity diagrams
• Use for monitoring


Thank you for your attention! Comments & questions?

