Advanced Testing of Systems-of-Systems, Volume 2. Practical Aspects. Edition No. 1

  • Book

  • 304 Pages
  • December 2022
  • John Wiley and Sons Ltd
  • ID: 5839237

Society today is so dependent on systems-of-systems that any malfunction can have devastating consequences, both human and financial. Their technical design, functional complexity and numerous interfaces justify a significant investment in testing in order to limit anomalies and malfunctions.

Based on more than 40 years of practice, this book goes beyond the simple testing of an application - already extensively covered by other authors - to focus on methodologies, techniques, continuous improvement processes, workload estimation, metrics and reporting, all illustrated by a case study. It also discusses several challenges for the near future.

Pragmatic and clear, this book offers many examples and references that will help you improve the quality of your systems-of-systems efficiently and effectively, and will help you identify the impact of upstream decisions and their consequences.

Advanced Testing of Systems-of-Systems 2 deals with the practical implementation and use of the techniques and methodologies proposed in the first volume.

Table of Contents

Dedication and Acknowledgments xiii

Preface xv

Chapter 1 Test Project Management 1

1.1 General principles 1

1.1.1 Quality of requirements 2

1.1.2 Completeness of deliveries 3

1.1.3 Availability of test environments 3

1.1.4 Availability of test data 4

1.1.5 Compliance of deliveries and schedules 5

1.1.6 Coordinating and setting up environments 6

1.1.7 Validation of prerequisites - Test Readiness Review (TRR) 6

1.1.8 Delivery of datasets (TDS) 7

1.1.9 Go/No-Go decision - Test Review Board (TRB) 7

1.1.10 Continuous delivery and deployment 8

1.2 Tracking test projects 9

1.3 Risks and systems-of-systems 10

1.4 Particularities related to SoS 11

1.5 Particularities related to SoS methodologies 11

1.5.1 Components definition 12

1.5.2 Testing and quality assurance activities 12

1.6 Particularities related to teams 12

Chapter 2 Testing Process 15

2.1 Organization 17

2.2 Planning 18

2.2.1 Project WBS and planning 19

2.3 Control of test activities 21

2.4 Analysis 22

2.5 Design 23

2.6 Implementation 24

2.7 Test execution 25

2.8 Evaluation 26

2.9 Reporting 28

2.10 Closure 29

2.11 Infrastructure management 29

2.12 Reviews 30

2.13 Adapting processes 31

2.14 RACI matrix 32

2.15 Automation of processes or tests 33

2.15.1 Automate or industrialize? 33

2.15.2 What to automate? 33

2.15.3 Selecting what to automate 34

Chapter 3 Continuous Process Improvement 37

3.1 Modeling improvements 37

3.1.1 PDCA and IDEAL 38

3.1.2 CTP 39

3.1.3 SMART 41

3.2 Why and how to improve? 41

3.3 Improvement methods 42

3.3.1 External/internal reference frameworks 42

3.4 Process quality 46

3.4.1 Fault seeding 46

3.4.2 Statistics 46

3.4.3 A posteriori 47

3.4.4 Avoiding introduction of defects 47

3.5 Effectiveness of improvement activities 48

3.6 Recommendations 50

Chapter 4 Test, QA or IV&V Teams 51

4.1 Need for a test team 52

4.2 Characteristics of a good test team 53

4.3 Ideal test team profile 54

4.4 Team evaluation 55

4.4.1 Skills assessment table 56

4.4.2 Composition 58

4.4.3 Select, hire and retain 59

4.5 Test manager 59

4.5.1 Lead or direct? 60

4.5.2 Evaluate and measure 61

4.5.3 Recurring questions for test managers 62

4.6 Test analyst 63

4.7 Technical test analyst 64

4.8 Test automator 65

4.9 Test technician 66

4.10 Choosing our testers 66

4.11 Training, certification or experience? 67

4.12 Hire or subcontract? 67

4.12.1 Effective subcontracting 68

4.13 Organization of multi-level test teams 68

4.13.1 Compliance, strategy and organization 69

4.13.2 Unit test teams (UT/CT) 70

4.13.3 Integration testing team (IT) 70

4.13.4 System test team (SYST) 70

4.13.5 Acceptance testing team (UAT) 71

4.13.6 Technical test teams (TT) 71

4.14 Insourcing and outsourcing challenges 72

4.14.1 Internalization and co-location 72

4.14.2 Near outsourcing 73

4.14.3 Geographically distant outsourcing 74

Chapter 5 Test Workload Estimation 75

5.1 Difficulty of estimating workload 75

5.2 Evaluation techniques 76

5.2.1 Experience-based estimation 76

5.2.2 Based on function points or TPA 77

5.2.3 Requirements scope creep 79

5.2.4 Estimations based on historical data 80

5.2.5 WBS or TBS 80

5.2.6 Agility, estimation and velocity 81

5.2.7 Retroplanning 82

5.2.8 Ratio between developers and testers 82

5.2.9 Elements influencing the estimate 83

5.3 Test workload overview 85

5.3.1 Workload assessment verification and validation 86

5.3.2 Some values 86

5.4 Understanding the test workload 87

5.4.1 Component coverage 87

5.4.2 Feature coverage 88

5.4.3 Technical coverage 88

5.4.4 Test campaign preparation 89

5.4.5 Running test campaigns 89

5.4.6 Defects management 90

5.5 Defending our test workload estimate 91

5.6 Multi-tasking and crunch 92

5.7 Adapting and tracking the test workload 92

Chapter 6 Metrics, KPI and Measurements 95

6.1 Selecting metrics 96

6.2 Metrics precision 97

6.2.1 Special case of the cost of defects 97

6.2.2 Special case of defects 98

6.2.3 Accuracy or order of magnitude? 98

6.2.4 Measurement frequency 99

6.2.5 Using metrics 99

6.2.6 Continuous improvement of metrics 100

6.3 Product metrics 101

6.3.1 FTR: first time right 101

6.3.2 Coverage rate 102

6.3.3 Code churn 103

6.4 Process metrics 104

6.4.1 Effectiveness metrics 104

6.4.2 Efficiency metrics 107

6.5 Definition of metrics 108

6.5.1 Quality model metrics 109

6.6 Validation of metrics and measures 110

6.6.1 Baseline 110

6.6.2 Historical data 111

6.6.3 Periodic improvements 112

6.7 Measurement reporting 112

6.7.1 Internal test reporting 113

6.7.2 Reporting to the development team 114

6.7.3 Reporting to management 114

6.7.4 Reporting to the clients or product owners 115

6.7.5 Reporting to executives and upper management 116

Chapter 7 Requirements Management 119

7.1 Requirements documents 119

7.2 Qualities of requirements 120

7.3 Good practices in requirements management 122

7.3.1 Elicitation 122

7.3.2 Analysis 123

7.3.3 Specifications 123

7.3.4 Approval and validation 124

7.3.5 Requirements management 124

7.3.6 Requirements and business knowledge management 125

7.3.7 Requirements and project management 125

7.4 Levels of requirements 126

7.5 Completeness of requirements 126

7.5.1 Management of TBDs and TBCs 126

7.5.2 Avoiding incompleteness 127

7.6 Requirements and agility 127

7.7 Requirements issues 128

Chapter 8 Defects Management 129

8.1 Defect management, MOA and MOE 129

8.1.1 What is a defect? 129

8.1.2 Defects and MOA 130

8.1.3 Defects and MOE 130

8.2 Defect management workflow 131

8.2.1 Example 131

8.2.2 Simplify 132

8.3 Triage meetings 133

8.3.1 Priority and severity of defects 133

8.3.2 Defect detection 134

8.3.3 Correction and urgency 135

8.3.4 Compliance with processes 136

8.4 Specificities of TDDs, ATDDs and BDDs 136

8.4.1 TDD: test-driven development 136

8.4.2 ATDD and BDD 137

8.5 Defects reporting 138

8.5.1 Defects backlog management 139

8.6 Other useful reporting 141

8.7 Don’t forget minor defects 141

Chapter 9 Configuration Management 143

9.1 Why manage configuration? 143

9.2 Impact of configuration management 144

9.3 Components 145

9.4 Processes 145

9.5 Organization and standards 146

9.6 Baseline or stages, branches and merges 147

9.6.1 Stages 148

9.6.2 Branches 148

9.6.3 Merge 148

9.7 Change control board (CCB) 149

9.8 Delivery frequencies 149

9.9 Modularity 150

9.10 Version management 150

9.11 Delivery management 151

9.11.1 Preparing for delivery 153

9.11.2 Delivery validation 154

9.12 Configuration management and deployments 155

Chapter 10 Test Tools and Test Automation 157

10.1 Objectives of test automation 157

10.1.1 Find more defects 158

10.1.2 Automating dynamic tests 159

10.1.3 Find all regressions 160

10.1.4 Run test campaigns faster 161

10.2 Test tool challenges 161

10.2.1 Positioning test automation 162

10.2.2 Test process analysis 162

10.2.3 Test tool integration 162

10.2.4 Qualification of tools 163

10.2.5 Synchronizing test cases 164

10.2.6 Managing test data 164

10.2.7 Managing reporting (level of trust in test tools) 165

10.3 What to automate? 165

10.4 Test tooling 166

10.4.1 Selecting tools 167

10.4.2 Computing the return on investment (ROI) 169

10.4.3 Avoiding abandonment of tools and automation 169

10.5 Automated testing strategies 170

10.6 Test automation challenge for SoS 171

10.6.1 Mastering test automation 171

10.6.2 Preparing test automation 173

10.6.3 Defect injection/fault seeding 173

10.7 Typology of test tools and their specific challenges 174

10.7.1 Static test tools versus dynamic test tools 175

10.7.2 Data-driven testing (DDT) 176

10.7.3 Keyword-driven testing (KDT) 176

10.7.4 Model-based testing (MBT) 177

10.8 Automated regression testing 178

10.8.1 Regression tests in builds 178

10.8.2 Regression tests when environments change 179

10.8.3 Prevalidation regression tests, sanity checks and smoke tests 179

10.8.4 What to automate? 180

10.8.5 Test frameworks 182

10.8.6 E2E test cases 183

10.8.7 Automated test case maintenance or not? 184

10.9 Reporting 185

10.9.1 Automated reporting for the test manager 186

Chapter 11 Standards and Regulations 187

11.1 Definition of standards 189

11.2 Usefulness and interest 189

11.3 Implementation 190

11.4 Demonstration of compliance - IADT 190

11.5 Pseudo-standards and good practices 191

11.6 Adapting standards to needs 191

11.7 Standards and procedures 192

11.8 Internal and external coherence of standards 192

Chapter 12 Case Study 195

12.1 Case study: improvement of an existing complex system 195

12.1.1 Context and organization 196

12.1.2 Risks, characteristics and business domains 198

12.1.3 Approach and environment 200

12.1.4 Resources, tools and personnel 210

12.1.5 Deliverables, reporting and documentation 212

12.1.6 Planning and progress 213

12.1.7 Logistics and campaigns 216

12.1.8 Test techniques 217

12.1.9 Conclusions and return on experience 218

Chapter 13 Future Testing Challenges 223

13.1 Technical debt 223

13.1.1 Origin of the technical debt 224

13.1.2 Technical debt elements 225

13.1.3 Measuring technical debt 226

13.1.4 Reducing technical debt 227

13.2 Systems-of-systems specific challenges 228

13.3 Correct project management 229

13.4 DevOps 230

13.4.1 DevOps ideals 231

13.4.2 DevOps-specific challenges 231

13.5 IoT (Internet of Things) 232

13.6 Big Data 233

13.7 Services and microservices 234

13.8 Containers, Docker, Kubernetes, etc. 235

13.9 Artificial intelligence and machine learning (AI/ML) 235

13.10 Multi-platforms, mobility and availability 237

13.11 Complexity 238

13.12 Unknown dependencies 238

13.13 Automation of tests 239

13.13.1 Unrealistic expectations 240

13.13.2 Difficulty reaching ROI 241

13.13.3 Implementation difficulties 242

13.13.4 Think about maintenance 243

13.13.5 Can you trust your tools and your results? 244

13.14 Security 245

13.15 Blindness or cognitive dissonance 245

13.16 Four truths 246

13.16.1 Importance of individuals 247

13.16.2 Quality versus quantity 247

13.16.3 Training, experience and expertise 248

13.16.4 Usefulness of certifications 248

13.17 Need to anticipate 249

13.18 Always reinvent yourself 250

13.19 Last but not least 250

Terminology 253

References 261

Index 267

Summary of Volume 1 269

Authors

Bernard Homes, IEEE Standards Association; TESSCO SAS.