Variability Management in Small Development Organizations – Experiences and Lessons Learned from a Case Study

Authors:
Daniel Pech
Jens Knodel
Ralf Carbon
Clemens Schitter
Dirk Hein

Architecture-centric Quality Engineering

Accepted for Publication at SPLC 2009

IESE-Report No. 077.09/E Version 1.0 June 2009 A publication by Fraunhofer IESE

Fraunhofer IESE is an institute of the Fraunhofer Gesellschaft. The institute transfers innovative software development techniques, methods, and tools into industrial practice, assists companies in building software competencies customized to their needs, and helps them to establish a competitive market position.

Fraunhofer IESE is directed by
Prof. Dr. Dieter Rombach (Executive Director)
Prof. Dr. Peter Liggesmeyer (Director)

Fraunhofer-Platz 1
67663 Kaiserslautern

The joint project ArQuE is funded by the German Federal Ministry of Education and Research (BMBF) under grant number 01 IS F14. The acronym ArQuE stands for Architecture-centric Quality Engineering. The partners of the joint project are the following organizations:

Fraunhofer Institute for Experimental Software Engineering (IESE), Fraunhofer-Platz 1, 67663 Kaiserslautern
Alcatel-Lucent Deutschland AG, Thurn-und-Taxis-Straße 10, 90411 Nürnberg
Testo AG, Testo-Straße 1, 79853 Lenzkirch
WIKON Kommunikationstechnik GmbH, Luxemburger Str. 1-3, 67657 Kaiserslautern
Büren & Partner Software Design GbR, Oedenberger Str. 155, 90491 Nürnberg
Universität Bremen, Software Engineering Group (Fachbereich 3), Linzer Straße 9a, 28334 Bremen
Tynos, Fahrenheitstraße 1, 28237 Bremen

The project is administered by the German Aerospace Center (DLR).

Abstract

Product line practices promise to reduce development and maintenance effort, to improve productivity, and to reduce time to market through the systematic reuse of commonalities and variabilities. However, in order to reap these benefits, an upfront investment is required. This paper presents a case study that analyzes the cost-benefit ratio for one product line discipline – variability management. Wikon GmbH – a small German development organization evolving a product line of remote monitoring and controlling devices – switched from manual, file-based conditional compilation to tool-supported decision models. We discuss the experiences made and show that the break-even was reached with the 4th product derivation.

Keywords:

decision model, evolution, product line engineering, software architecture, variability management

Copyright © Fraunhofer IESE 2009

Table of Contents

1 Introduction
2 Context
  2.1 The Wikon Products
  2.2 Manual Variability Management
3 PuLSE™ Variability Management
  3.1 Decision Models
  3.2 Variability Management with Decision Models
4 Implementing Explicit Variability Management
  4.1 Tool Support
  4.2 Approach
  4.3 Application at Wikon
5 Case Study
  5.1 Setup
    5.1.1 Hypothesis
    5.1.2 Operationalization
    5.1.3 Subjects
    5.1.4 Data Collection
  5.2 Results
  5.3 Threats to Validity
6 Migration Scenarios
  6.1 Effort Extrapolation
  6.2 Wikon Scenario
  6.3 Complex Product Scenario
  6.4 Holistic VM Scenario
  6.5 Evaluation
7 Lessons Learned
8 Conclusion
References

1 Introduction

Successfully transferring new ideas and concepts from research into industrial practice is a challenge we have mastered at Wikon GmbH. In previous work [1], we reported on the introduction of architecture-centric development with strategic reuse. We established well-defined components with respect to system variants and explicit responsibilities. Wikon develops embedded systems (measurement devices that monitor technical facilities remotely) with a team of three engineers. Due to the introduction of reuse-driven, architecture-centric development, Wikon was able to save 12 person-months of development time (from 32 to 20 person-months) and 3 person-months for quality assurance (mainly testing, from 8 to 5 person-months). Hence, we consider the shift of the development paradigm from ad-hoc reuse towards pro-active, systematic reuse successful.

Meanwhile, the number of supported variants, as well as the number of variation points and the complexity of their correlations, has increased. This requires methodical and explicit management of variability in order to efficiently develop and derive new variants, as well as to reduce the maintenance effort for already derived products. This next step shifts Wikon closer towards a full product line organization [2]. Variability management [3] comprises all activities to explicitly model, manage, and document those parts that vary among the single products of a product line [2].

This paper presents a case study in which we analyzed the cost-benefit ratio for the migration from purely manual variant management (editing #ifdef-based configuration files) towards explicit, automated, tool-supported variability management (applying decision models [4]). In particular, we investigate the research question whether or not tool support for the product line discipline "variability management" will, in terms of effort reduction, pay off for small development organizations.
Wikon introduced – after having four products on the market – explicit variability management using decision models, as defined by the PuLSE™ (Product Line Software and System Engineering) methodology [4]. The management of decision models is supported by the Fraunhofer Decision Modeler tool [5]. We report on the data collected while introducing explicit variability management. For the activity of variability resolution, we compared the effort data of the manual steps to that of the tool-supported steps. We further extrapolated the introduction of variability management for three scenarios based on the sample data collected at Wikon, but using different input parameters: (1) the Wikon scenario as observed in the case study, (2) a scenario with a high number of variation points, and (3) a scenario with variation points hierarchically structured towards features. The break-even is reached after the 3rd product in all three scenarios.

The remainder of this paper is structured as follows: Section 2 introduces the Wikon products and describes the variability management practices as performed at Wikon. Section 3 describes the PuLSE™ variability management approach, whereas Section 4 reports on the migration process itself. Section 5 evaluates the migration and describes the empirical case study conducted, including its setup, the results, and the threats to validity. Section 6 then presents the cost-benefit analysis. Our lessons learned are presented in Section 7, while Section 8 draws conclusions.

2 Context

2.1 The Wikon Products

Wikon has been a manufacturer of remote control systems (hardware and software) for more than 15 years. Currently, the company consists of about 25 employees. One of its main applications is battery-powered remote control systems (the XENON series) for the energy sector (e.g., water, gas, heat, or liquid gas meters). The systems collect data from different meter types (e.g., different protocol interfaces, pulse outputs) and transmit the data at regular intervals to the Wikon Internet server platform, called WatchMyHome, using a GSM modem. The WatchMyHome server platform aggregates data from different measurement devices. Customers can access the WatchMyHome server and view the measurement data. If necessary, customers can trigger activities based on the measurement data in order to control the remotely monitored facility (e.g., refill a tank, plan the visits of technical service personnel, etc.).

The scope of the work presented in this paper addresses only the embedded system – the family of measurement devices called the XENON series. The XENON series is implemented in the programming language C (except for about 1% of the lines of code, which are written in Assembler). Figure 1 shows one example product of the XENON series. The XENON variants can be distinguished by the measurement sensor, the underlying hardware, the measurement protocol (when and how often values are measured and transmitted), and the GSM message format.

The development organization for the XENON series comprises three persons, one acting mainly as quality engineer and architect, the other two acting as developers. However, since the development organization is rather small, the roles are interchanged on demand. In addition to software development, the three engineers are responsible for the hardware engineering of the measurement device, but the main effort is spent on software development and maintenance.

Figure 1: XENON8 Series: Variant "GGmm" – Measurement Device for Liquid Gas Tanks

2.2 Manual Variability Management

In order to realize different variants of their XENON8 system, written in the C language, Wikon uses conditional compilation. Conditional compilation is one of the most widespread mechanisms for realizing variability on the implementation level in C/C++ systems. For this, a symbolic constant is defined somewhere in the code using the preprocessor "#define" statement. This assigns a value to a particular keyword that can be queried later at compile time using the preprocessor "#if" statement. Based on this value, the code within an "#if" statement is included or omitted by the compiler. Figure 2 shows an example of conditional compilation in C. Line 1 defines a symbolic constant with the name "DO_A" and the value 1. In lines 10 to 14, this constant is used to decide whether the method executeA() or executeB() is executed. Thus, it is possible to create different variants of a C system out of the same code base. The "#define" statement can also be swapped out to another file, which is then included using the "#include" statement. For further information on the C preprocessor, we refer to [6].

To manage the five different product variants of the XENON8 system, Wikon uses so-called definition files. A definition file consists of a set of "#define" statements defining all symbolic constants necessary to compile a particular variant. Consequently, five variants entail five definition files, each specifying one variant. To make the symbolic constants in the definition files available to all files in the code base, each of those files includes a project.h file, which in turn includes the corresponding definition file, depending on the variant to be compiled (see Figure 3).

 1: #define DO_A 1
    …
10: #if DO_A == 1
11:     executeA();
12: #else
13:     executeB();
14: #endif

Figure 2: Conditional Compilation in C

Figure 3: Manual Variant Management Using Variant-Specific Conditional Compilation

A paired comparison of the definition files showed that the definition files have 84% to 96% of their "#define" statements in common. Moreover, it could be observed that there are dependencies among some of the symbolic constants (choosing a particular value for one symbolic constant implies a particular value for another symbolic constant). The large overlap of the definition files suggests integrating them and introducing explicit variability management for managing the divergences between the variants. Because the variability is mainly encapsulated in the definition files, we decided to focus the variability management on the definition files instead of introducing a new realization mechanism such as a frame technology or aspect orientation.

3 PuLSE™ Variability Management

To manage variability, we describe the concrete variability management practice of the PuLSE™ method. This requires the introduction of decision models and a description of how to use them with definition files, as applied at Wikon.

3.1 Decision Models

A decision model is a dedicated model solely capturing the knowledge about what varies in a product line and how it varies. This is done by structuring the variability information into decisions and constraints. Basically, a decision model is a table in which each row represents a decision. A decision owns the following properties, which are represented by the respective columns:

• ID: A unique identifier for the decision (usually an integer value)
• Question: A question that operationalizes the decision for product derivation
• Resolution set: A set of answers to the decision's question
• Minimum resolution size: The number of answers that must be given at least
• Maximum resolution size: The number of answers that can be given at most
• Effect: An effect for each possible answer to the decision's question (constraints)

In general, there are two different types of decisions: simple decisions and complex decisions. A simple decision's effect only refers to some domain aspect (i.e., a variation point in a code artifact). Complex decisions are used to aggregate simple decisions and other complex decisions. Answering all questions, and thus resolving all decisions, leads to a resolution model. A resolution model consists of all decisions and the answers to their questions. Thus, it constitutes a concrete member of a product line. This activity is also called decision model instantiation or derivation.

3.2 Variability Management with Decision Models

For the explicit documentation of variability, we decided to use decision models as proposed by [7], because they inherently support the traceability of requirements to product line artifacts and enable the modeling of the observed dependencies among the symbolic constants. This has the benefit of maintaining only a single model, the variability model, from which all definition files can be generated. The resulting approach is depicted in Figure 4. Instead of maintaining several definition files, the variability is managed using the Fraunhofer Decision Modeler tool [5], from which a definition file is generated and included in all code files in order to derive a particular variant of the system.

Figure 4: Explicit Variant Management Using Generated Definition Files

4 Implementing Explicit Variability Management

This section presents the tool support for the PuLSE™ variability management. Moreover, we show how the methodology and the tool were introduced at Wikon.

4.1 Tool Support

To improve the method's efficiency, we introduced the Fraunhofer Decision Modeler [5], a variability management tool developed at Fraunhofer IESE as an extension to the Eclipse platform [8]. The main contribution of the Fraunhofer Decision Modeler is its support for all necessary activities in the lifecycle of decision models. For this, it essentially uses the meta-model proposed in [7]. Figure 5 shows the explorer view, which is the main view of the Decision Modeler. From here, decision models can be modified by adding, removing, moving, or modifying decisions, folders, and constraints. Furthermore, an existing integration framework enables the integration of the Decision Modeler with other tools (e.g., CASE tools).

4.2 Approach

The transition towards an explicit variability approach covers four activities: Initialization, Analysis, Derivation, and Maintenance. During initialization, the basic variability infrastructure for the respective product line artifact base is built up. For the kind of variability model and product line artifact base considered here, this is an initial decision model consisting of only simple decisions. Thus, one has to create a decision for each symbolic constant to steer the resolution on instantiation. The decision's name has to correspond with the actual symbolic constant's name, because it is used in the derivation process to name the symbolic constant. The decision's set of choices results from the possible values the symbolic constant can take. Since a symbolic constant can take only one value at a time, the minimum and maximum resolution sizes are always set to 1.

Figure 5: Decision Model Explorer

After all decisions are modeled, the dependencies among the symbolic constants have to be elicited in the analysis activity. The identified dependencies are then modeled using resolution constraints and complex decisions. The Decision Modeler supports constraint modeling by offering a graphical editor for creating logical expressions in a drag & drop manner (Figure 6). By dragging decisions from the model explorer onto the canvas and dropping them there, the corresponding decision is added to the resolution constraint. The palette on the left of the editor provides a set of tools and logical gates, which are used to connect the decisions. For this, the user can select the corresponding element from the palette and click on the canvas to create an instance of the element. Finally, the user has to connect the decisions and gates using the wire tool to create a valid logical expression.

In the derivation activity, the product line artifact base is instantiated (i.e., concrete products are derived). This activity is guided by the decision model. By answering the decisions' questions, and thus resolving them, a resolution model is created. In a final step, the resolution model serves as input to generate the definition file, which is used to compile the desired product variant. For this activity, the Decision Modeler provides a resolution wizard, which lists all available decisions, their questions and an optional description, and the available choices for the selected decision. A decision can easily be resolved by checking the desired choices and clicking a "Resolve" button. Whenever the user resolves a decision, a constraint solver (that is part of the Decision Modeler) checks all resolution constraints and, if necessary, automatically resolves other decisions or limits their available choices. Thus, the consistency of the resulting resolution model is assured at any point in time.

Figure 6: Constraint Modeling Editor

In the subsequently running maintenance activity, new properties (such as new features) are added and thus have to be made available either to the entire product line or only to some variants. Hence, the decision model is changed to reflect those changes.

4.3 Application at Wikon

To set up the initial decision model, the Wikon engineers analyzed four of the already existing definition files from their XENON8 product line. For each symbolic constant they identified in at least one definition file, they created one decision in the decision model (initialization). In a second step, they modeled the dependencies among the decisions. The necessary knowledge was partially available in the form of "#if-#else" statements in the definition files and partially only in the engineers' heads (analysis). After setting up the initial decision model, Wikon used it to derive the four variants for which they had established the decision model (derivation). During the transition phase, Wikon had to provide a fifth variant of their system. Thus, they extended the decision model accordingly and again derived the product (maintenance). To prove that the generated outcome was identical to the one in the definition files, Wikon compared them using a variant comparison tool provided by Fraunhofer IESE [9].

5 Case Study

The case study investigates the impact of an explicit variability management approach and presents data collected before and after the introduction of decision models.

5.1 Setup

5.1.1 Hypothesis

We claim that the introduction of an explicit and tool-supported variability management approach will reduce the overall effort to derive new variants in the long run. The categorical documentation of variability with decision models establishes traceability from variability to variation points in code artifacts, which will greatly improve the understandability of variability. We believe that this, together with the accumulation of variability information in a single dedicated location, will reduce the maintenance effort. A rule of thumb says that when applying product line engineering, savings start between the 2nd and 3rd developed product. Since this paper considers only a single product line practice, we lower our expectations to the 4th product. Hence, our hypothesis is:

H: The cost-benefit ratio for the introduction of explicit and tool-supported variability management for a product line of at least four members is positive (i.e., the return justifies the overhead effort spent on the introduction).

5.1.2 Operationalization

To operationalize the hypothesis, we collected effort data for all activities related to variability management in the development process. For each activity, we distinguish between a) the manual approach and b) the tool-supported explicit approach:

1. Initialization: Creation of the variation points in
   a. an initial definition file
   b. a flat decision model
2. Analysis:
   a. (not performed in the manual approach)
   b. establishing dependencies among the variation points


3. Derivation: Deriving a product by
   a. cloning, adapting, and error-checking a definition file
   b. resolving the variation points
4. Maintenance: Maintaining
   a. the definition files
   b. the decision model

The data collected was the effort (in terms of duration) required for each activity: on the one hand when performed manually, and on the other hand when performed with tool support. Based on the GQM approach [10], we define our measurement goal as shown in Table 1.

5.1.3 Subjects

The subjects of the case study were three Wikon engineers developing the XENON series. Two engineers are developers and one person is mainly responsible for quality assurance. All three engineers have several years of experience in the domain and in developing products of the XENON series.

5.1.4 Data Collection

We collected the effort data in informal interviews with the above-mentioned Wikon engineers – after the Fraunhofer Decision Modeler had been introduced. The effort data is based on estimations by the engineers. Further, we measured the effort spent on introducing the decision model concept and the tool. A posteriori, the numbers of variation points, simple decisions, and complex decisions were distilled from the definition files and the decision model, respectively.

Table 1: GQM Goals

Object          Explicit variability management
Purpose         Evaluation of the return on investment of explicit variability management in small organizations
Quality Aspect  Effort for activities related to variability management (creation, analysis, instantiation, and maintenance of product line infrastructures)
Viewpoint       Product Line Manager
Context         The Wikon XENON8 product line

5.2 Results

Table 2 lists the effort ratios for all activities related to the variability management process (see column one). The activities are normalized to the activity with the lowest effort (represented in Table 2 by 1), which is the derivation activity using tool support. All other activities show ratios proportional to this activity. Column two represents the effort needed with the manual approach, while column three shows the effort for the explicit variability approach. The concrete values were tracked by the Wikon engineers and elicited in informal interviews with them. In the analysis activity, the effort for the manual approach is zero because no dependencies were modeled there.

As can be seen from Table 2, the initial effort for establishing the decision model and creating the dependencies among the decisions (initialization & analysis activities) is 4.4 times higher than with the manual approach: it takes more time to create a decision with all its available resolution values in the tool than to simply code a symbolic constant getting assigned a specific value. However, deriving new products and modifying the underlying model works more efficiently with the explicit variability management approach: only 1 model has to be maintained for n products, compared to n files for n products with the manual approach. This is evidence for our assumption that the initial overhead effort pays off with an increasing number of products. The results from Table 2 form the basis for an effort extrapolation for different scenarios in Section 6.1.

Table 2: Normalized Activity Effort Ratio for Variability Management Activities

Activity        Manual Effort   Effort with explicit variability management
Initialization  24:1            82:1
Analysis        0               24:1
Derivation      3:1             1:1
Maintenance     24:1 ¹          24:1

¹ This is the effort necessary for maintaining one product.

5.3 Threats to Validity

The threats to validity cover three categories – internal, construct, and external validity – as proposed in [11].

Internal Validity: Internal validity is the degree to which the independent variables have an impact on the dependent variables. The following threats to internal validity were identified:

• The collected data were only estimations based on the experience of the participants. Therefore, the data might not be 100% accurate, but due to


the extensive experience of the participants, we believe in the high integrity of the data.

• The type of variation points regarded might be inappropriate, since we handled only variation points in terms of parameterization, even though usually many other types of variation points have to be considered. However, from other experiments and case studies, we know that the approach also works for other kinds of variation points [12, 13].

Construct Validity: Construct validity is the degree to which the settings of the case study in terms of the dependent and independent variables reflect the goal of the case study. The following possible threats were identified:

• The constructs and notations the Decision Modeler tool uses to represent variability might force the developers in a certain direction regarding how they model variability. Maybe they would have modeled some variability in a more efficient way. Nevertheless, we did not receive any feedback regarding such issues.

• It is possible that the measured data is distorted due to the developers' missing experience with the Eclipse framework, on which the Decision Modeler tool is built. However, this effect would amortize when repeating the case study after some months of work with the Decision Modeler tool.

• The involvement of Fraunhofer IESE in modeling variability was constrained to a short introduction to the basic features and use of the Decision Modeler tool.

External Validity: External validity is the degree to which the results of the case study can be transferred to other people and to changed environmental settings. As with any case study, this case study is specific to its particular context and cannot be generalized, because we can identify the following threats to external validity:

• The XENON series might not be a representative product line system.

• The size of the system can differ immensely. In this setting, we observed a rather small system.

• The size of the involved team and the organizational context also differ greatly and thus are not representative in general.

• The number of about 120 variation points may not be representative, as it can range from a few hundred to many thousands of variation points.

• The transition scenario of generating definition files may not be representative.


To overcome the above-mentioned threats, the case study should be repeated several more times with different settings, such as different systems and team sizes. Furthermore, another scenario could be chosen, for example, operating directly on the source code instead of generating definition files.

6 Migration Scenarios

The results of the case study indicate that the cost-benefit ratio at Wikon is positive after deriving four products. We further want to investigate this ratio for other typical product line scenarios. Thus, we extrapolate the variability management effort for a high number of variation points and for variation points hierarchically structured towards features.

6.1 Effort Extrapolation

To extrapolate the effort evolution for different scenarios, we assume the following formula:

E = Eini + Edep + Eres + Emaint

Formula 1: Overall variability management effort

Here, Eini is the effort to build up the initial definition file or decision model, Edep is the effort for establishing dependencies, Eres is the effort needed to derive the products, and Emaint is the maintenance effort for the product line. The effort formulas for the manual and the tool-supported approach differ in the effort factors f1 to f4. The factors were tracked and provided by the Wikon engineers, as illustrated in Table 2. The factors for the manual approach are suffixed with m and those for the tool-supported approach with t, respectively.

Eini_m = f1_m · d
Eini_t = f1_t · d

(d is the number of symbolic constants / decisions)

Formula 2: Effort for setting up an initial definition file / decision model

Edep_m = 0
Edep_t = f2_t · c

(c is the number of complex decisions)

Formula 3: Effort for establishing dependencies

Edep_m in Formula 3 is zero because in the manual approach no dependencies were modeled (see also the cell Analysis / Manual Effort in Table 2).

Eres_m = f3_m · d · n
Eres_t = f3_t · (d − (c · a) + c) · n

(a is the average number of simple decisions constrained by one complex decision; n is the number of derived products)

Formula 4: Effort for deriving n products

The term (d − (c · a) + c) in Formula 4 determines the number of decisions that effectively have to be answered.

Emaint_m = f4_m · d · n
Emaint_t = f4_t · (d + c)

(n is the number of derived products)

Formula 5: Effort for maintenance

We do not claim that the listed formulas are fully correct or complete. For instance, in Formula 5 we assume a linear increase of the manual maintenance effort, even though we think the effort increases much more strongly with an increasing number of products. Nevertheless, the formulas are supposed to show a trend for the efforts when using different parameters. We applied Formula 1 to different scenarios (i.e., with different input parameters), which were chosen to evaluate the impact of different typical product line settings, such as a larger artifact base (i.e., larger and more complex products) or a better organized variability model, on the overall effort for variability management. In each scenario section, the first table shows the input parameters used for the calculation. The second table shows the ratio of the manual approach to the tool-supported approach. A ratio of 1 indicates equal effort for both approaches, a ratio of less than 1 indicates that the manual approach is less effort-intensive, and a ratio greater than 1 indicates that the tool-supported approach is less effort-intensive.

6.2 Wikon Scenario

In this scenario, we have chosen the input parameters as we observed them at Wikon. It can be seen that the break-even point lies between the third and fourth derived product, because the ratio there turns to a value greater than 1. This consolidates the claim of a positive cost-benefit ratio after deriving four products.

Table 3: Scenario 1 – Input parameters

Variable                                                                     Value
d (symbolic constants / decisions)                                           124
c (complex decisions)                                                        12
a (average number of simple decisions constrained by one complex decision)   5

Effort factor        Manual Approach   Tool-Supported Approach
f1 (initialization)  24                82
f2 (analysis)        0                 24
f3 (derivation)      3                 1
f4 (maintenance)     24                24

n        1      2      3      4      5      6      7      10
ratio    0.46   0.70   0.93   1.17   1.40   1.63   1.85   2.52

Table 4: Scenario 1 – Ratio of the manual to the tool-supported approach

6.3 Complex Product Scenario

In this scenario, we significantly increased the number of symbolic constants to simulate a larger product line repository and thus more variation points, as would be the case for a large, complex product line. We observed that the total effort is almost independent of the product line repository's size. This is because we assume a linear increase of the maintenance effort with a growing size of the product line. However, with an increasing number of products, we think the overall effort for the manual approach would grow much more than linearly, since there are many more models a human has to grasp.

Variable                                                      Value
d (symbolic constants / decisions)                            1000
c (complex decisions)                                         12
a (average number of simple decisions per complex decision)   5

                     Manual Approach    Tool-Supported Approach
f1 (initialization)  24                 82
f2 (analysis)        0                  24
f3 (derivation)      3                  1
f4 (maintenance)     24                 24

Table 5: Scenario 2 – Input parameters

n        1      2      3      4      5      6      7      10
ratio    0.47   0.72   0.96   1.20   1.43   1.66   2.10   2.53

Table 6: Scenario 2 – Ratio of the manual to the tool-supported approach

6.4 Holistic VM Scenario

Scenario 1 is based on a quite flat decision model (i.e., a small number of complex decisions). Therefore, scenario 3 simulates the existence of a more hierarchically structured decision model. This would be the case when managing variability in a more systematic and holistic way, i.e., in all phases of the software development lifecycle up to the requirements level. This results in more complex decisions and thus a lower total number of "low-level" decisions to be resolved during product derivation. The results of applying Formula 1 are shown in Table 7 and Table 8. We could make the same observations as in section 6.3.


Variable                                                      Value
d (symbolic constants / decisions)                            124
c (complex decisions)                                         20
a (average number of simple decisions per complex decision)   5

                     Manual Approach    Tool-Supported Approach
f1 (initialization)  24                 82
f2 (analysis)        0                  24
f3 (derivation)      3                  1
f4 (maintenance)     24                 24

Table 7: Scenario 3 – Input parameters

n        1      2      3      4      5      6      7      10
ratio    0.45   0.68   0.91   1.15   1.38   1.61   1.83   2.51

Table 8: Scenario 3 – Ratio of the manual to the tool-supported approach

6.5 Evaluation

Figure 7 compares the output of the three scenarios. The curves are almost identical. This means that the variables we changed in the scenarios do not make the curve steeper in any particular case. From this, we conclude that the main influence on the ratio is the maintenance overhead (which is equal for all scenarios) created by the number of derived products. Even though there are several threats to our conclusion, as mentioned in section 5.3, the data we collected serve as a first guideline for prediction. However, to support the data with statistical evidence, more data (especially from different settings) has to be collected and evaluated.


7 Lessons learned

To introduce explicit variability management at Wikon, we gave a half-day tutorial on decision models and the usage of the Decision Modeler tool. According to the Wikon engineers, two issues hamper the usage of the decision models:

• First, the initial effort to become acquainted with the tool support and the methodology was considered high. Given the currently low number of variants produced at Wikon, they did not perceive the actual cost-benefit ratio as shown by the data.

• Second, the Fraunhofer Decision Modeler is yet another tool to work with. The engineers would appreciate a tool that is integrated into their particular development environment, which does not include the Eclipse platform.

However, they also stated that they see the advantage of the tool when deriving and managing more variants, since it then reduces the workload necessary to keep the different definition files consistent.

Figure 7: Effort ratio for different scenarios (Scenarios 1–3; x-axis: number of products, 1–10; y-axis: effort ratio, 0.00–3.00; break-even marked at ratio 1.0)

To actually migrate from the manual to the tool-supported approach, Wikon needed 1.5 days. The migration encompassed the consolidation and transfer of five definition files into one decision model (1 day) and its optimization (0.5 days). The output of the migration was a decision model consisting of 124 decisions. In the case of Wikon, we were able to observe a positive cost-benefit ratio after the release of the fourth product. In addition, we noted several quality improvements at Wikon:

• Wikon architects were able to derive their products faster than before (within minutes).

• The derivation happened in a more consistent way, because dependencies were checked automatically and feedback on forbidden combinations of values for symbolic constants was given directly during the derivation process.

• The domain expert's workload was reduced. Once the initial decision model is established, domain knowledge is only required for change requests or for problems in deriving products. Before, the domain expert was needed all the time.

• The automated validation of the decision model (all dependencies are checked for consistency whenever the model changes) is expected to reduce resolution errors that occurred in the past when manually editing the definition files.

• Moreover, it is possible to store configurations for several different variants, which can then be managed as a whole at any later point in time.

• The most decisive contribution of the explicit variability management is the reduced maintenance effort. Whenever a system change affected some variability management concern, only the decision model, instead of many definition files, had to be changed.
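For comparison, the manual, file-based approach the bullets refer to can be sketched as a definition file of symbolic constants. The constant names below are invented for illustration, not Wikon's actual definitions; the point is that a forbidden combination was only caught if someone remembered to encode a check by hand, whereas the decision model validates such dependencies automatically:

```c
/* variant_gsm.h -- hypothetical manual definition file for one variant. */
#define HAS_GSM_MODEM    1
#define HAS_DISPLAY      0
#define LOG_BUFFER_KB    64
#define HAS_REMOTE_ALARM 1

/* Hand-written consistency check: remote alarms require the modem.
 * Forgetting this guard silently allows an inconsistent variant. */
#if HAS_REMOTE_ALARM && !HAS_GSM_MODEM
#error "Forbidden combination: remote alarms require the GSM modem"
#endif
```

Keeping several such files consistent by hand is precisely the maintenance effort that Formula 5 models as growing with every derived product.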


8 Conclusion

Variability management – like all product line practices – requires an upfront investment in order to pay off in later product derivations. This paper delivers first empirical evidence that variability management is a worthwhile investment for small development organizations. Supported by Fraunhofer IESE, the Wikon team of XENON8 developers (comprising only three engineers) adopted explicit variability management using decision models. By systematically collecting data on the variability management activities before the transition (i.e., resolving products manually) and after it (resolving products using the Fraunhofer Decision Modeler), we can claim that the cost-benefit ratio is positive after the fourth product.

In short, this paper presents a success story of transferring variability management into industrial practice. The investments made are worthwhile in terms of overall effort reduction and quality improvements. We bridged the challenging gap from research to practice, and we hope that our lessons learned encourage other (small) development organizations to start reasoning about the introduction of product line engineering. Of course, the experiences made at Wikon are not fully applicable in other contexts, because of the vast number of context parameters and their influence on each other. Considering all parameters, an effort calculation becomes impossible because of its complexity; that is why some form of abstraction has to be applied. However, as the discussion of typical migration scenarios shows, a positive cost-benefit ratio was obtained in all scenarios.

In the future, Fraunhofer IESE will continue collecting further empirical data on the introduction of explicit variability management in different settings with different context information to back up the observations made in this paper.


Document Information

Title: Variability Management in Small Development Organizations – Experiences and Lessons Learned from a Case Study
Date: June 2009
Report: IESE-077.09/E
Status: Final
Distribution: Public

Copyright 2011, ArQuE consortium. All rights reserved. No part of this publication may be reproduced, stored in a retrieval system, or transmitted, in any form or by any means including, without limitation, photocopying, recording, or otherwise, without the prior written permission of the publisher.