Software Testing and Analysis: Process, Principles, and Techniques

Mauro Pezzè, Università di Milano Bicocca
Michal Young, University of Oregon

PUBLISHER: Daniel Sayre
SENIOR PRODUCTION EDITOR: Lisa Wojcik
EDITORIAL ASSISTANT: Lindsay Murdock
COVER DESIGNER: Madelyn Lesure
COVER PHOTO: Rick Fischer/Masterfile
WILEY 200TH ANNIVERSARY LOGO DESIGN: Richard J. Pacifico

This book was typeset by the authors using pdfLaTeX and printed and bound by Malloy Lithographing. The cover was printed by Phoenix Color Corp. This book is printed on acid-free paper.

Copyright © 2008 John Wiley & Sons, Inc. All rights reserved. No part of this publication may be reproduced, stored in a retrieval system, or transmitted in any form or by any means, electronic, mechanical, photocopying, recording, scanning, or otherwise, except as permitted under Sections 107 or 108 of the 1976 United States Copyright Act, without either the prior written permission of the Publisher, or authorization through payment of the appropriate per-copy fee to the Copyright Clearance Center, Inc., 222 Rosewood Drive, Danvers, MA 01923, website www.copyright.com. Requests to the Publisher for permission should be addressed to the Permissions Department, John Wiley & Sons, Inc., 111 River Street, Hoboken, NJ 07030-5774, (201) 748-6011, fax (201) 748-6008, website http://www.wiley.com/go/permissions.

To order books or for customer service, please call 1-800-CALL-WILEY (225-5945).

ISBN-13 978-0-471-45593-6

Printed in the United States of America

10 9 8 7 6 5 4 3 2 1

Contents

List of Figures
List of Tables

Part I: Fundamentals of Test and Analysis

1  Software Test and Analysis in a Nutshell
   1.1  Engineering Processes and Verification
   1.2  Basic Questions
   1.3  When Do Verification and Validation Start and End?
   1.4  What Techniques Should Be Applied?
   1.5  How Can We Assess the Readiness of a Product?
   1.6  How Can We Ensure the Quality of Successive Releases?
   1.7  How Can the Development Process Be Improved?

2  A Framework for Test and Analysis
   2.1  Validation and Verification
   2.2  Degrees of Freedom
   2.3  Varieties of Software

3  Basic Principles
   3.1  Sensitivity
   3.2  Redundancy
   3.3  Restriction
   3.4  Partition
   3.5  Visibility
   3.6  Feedback

4  Test and Analysis Activities Within a Software Process
   4.1  The Quality Process
   4.2  Planning and Monitoring
   4.3  Quality Goals
   4.4  Dependability Properties
   4.5  Analysis
   4.6  Testing
   4.7  Improving the Process
   4.8  Organizational Factors

Part II: Basic Techniques

5  Finite Models
   5.1  Overview
   5.2  Finite Abstractions of Behavior
   5.3  Control Flow Graphs
   5.4  Call Graphs
   5.5  Finite State Machines

6  Dependence and Data Flow Models
   6.1  Definition-Use Pairs
   6.2  Data Flow Analysis
   6.3  Classic Analyses: Live and Avail
   6.4  From Execution to Conservative Flow Analysis
   6.5  Data Flow Analysis with Arrays and Pointers
   6.6  Interprocedural Analysis

7  Symbolic Execution and Proof of Properties
   7.1  Symbolic State and Interpretation
   7.2  Summary Information
   7.3  Loops and Assertions
   7.4  Compositional Reasoning
   7.5  Reasoning about Data Structures and Classes

8  Finite State Verification
   8.1  Overview
   8.2  State Space Exploration
   8.3  The State Space Explosion Problem
   8.4  The Model Correspondence Problem
   8.5  Granularity of Modeling
   8.6  Intensional Models
   8.7  Model Refinement
   8.8  Data Model Verification with Relational Algebra

Part III: Problems and Methods

9  Test Case Selection and Adequacy
   9.1  Overview
   9.2  Test Specifications and Cases
   9.3  Adequacy Criteria
   9.4  Comparing Criteria

10  Functional Testing
   10.1  Overview
   10.2  Random versus Partition Testing Strategies
   10.3  A Systematic Approach
   10.4  Choosing a Suitable Approach

11  Combinatorial Testing
   11.1  Overview
   11.2  Category-Partition Testing
   11.3  Pairwise Combination Testing
   11.4  Catalog-Based Testing

12  Structural Testing
   12.1  Overview
   12.2  Statement Testing
   12.3  Branch Testing
   12.4  Condition Testing
   12.5  Path Testing
   12.6  Procedure Call Testing
   12.7  Comparing Structural Testing Criteria
   12.8  The Infeasibility Problem

13  Data Flow Testing
   13.1  Overview
   13.2  Definition-Use Associations
   13.3  Data Flow Testing Criteria
   13.4  Data Flow Coverage with Complex Structures
   13.5  The Infeasibility Problem

14  Model-Based Testing
   14.1  Overview
   14.2  Deriving Test Cases from Finite State Machines
   14.3  Testing Decision Structures
   14.4  Deriving Test Cases from Control and Data Flow Graphs
   14.5  Deriving Test Cases from Grammars

15  Testing Object-Oriented Software
   15.1  Overview
   15.2  Issues in Testing Object-Oriented Software
   15.3  An Orthogonal Approach to Test
   15.4  Intraclass Testing
   15.5  Testing with State Machine Models
   15.6  Interclass Testing
   15.7  Structural Testing of Classes
   15.8  Oracles for Classes
   15.9  Polymorphism and Dynamic Binding
   15.10  Inheritance
   15.11  Genericity
   15.12  Exceptions

16  Fault-Based Testing
   16.1  Overview
   16.2  Assumptions in Fault-Based Testing
   16.3  Mutation Analysis
   16.4  Fault-Based Adequacy Criteria
   16.5  Variations on Mutation Analysis

17  Test Execution
   17.1  Overview
   17.2  From Test Case Specifications to Test Cases
   17.3  Scaffolding
   17.4  Generic versus Specific Scaffolding
   17.5  Test Oracles
   17.6  Self-Checks as Oracles
   17.7  Capture and Replay

18  Inspection
   18.1  Overview
   18.2  The Inspection Team
   18.3  The Inspection Process
   18.4  Checklists
   18.5  Pair Programming

19  Program Analysis
   19.1  Overview
   19.2  Symbolic Execution in Program Analysis
   19.3  Symbolic Testing
   19.4  Summarizing Execution Paths
   19.5  Memory Analysis
   19.6  Lockset Analysis
   19.7  Extracting Behavior Models from Execution

Part IV: Process

20  Planning and Monitoring the Process
   20.1  Overview
   20.2  Quality and Process
   20.3  Test and Analysis Strategies
   20.4  Test and Analysis Plans
   20.5  Risk Planning
   20.6  Monitoring the Process
   20.7  Improving the Process
   20.8  The Quality Team

21  Integration and Component-based Software Testing
   21.1  Overview
   21.2  Integration Testing Strategies
   21.3  Testing Components and Assemblies

22  System, Acceptance, and Regression Testing
   22.1  Overview
   22.2  System Testing
   22.3  Acceptance Testing
   22.4  Usability
   22.5  Regression Testing
   22.6  Regression Test Selection Techniques
   22.7  Test Case Prioritization and Selective Execution

23  Automating Analysis and Test
   23.1  Overview
   23.2  Automation and Planning
   23.3  Process Management
   23.4  Static Metrics
   23.5  Test Case Generation and Execution
   23.6  Static Analysis and Proof
   23.7  Cognitive Aids
   23.8  Version Control
   23.9  Debugging
   23.10  Choosing and Integrating Tools

24  Documenting Analysis and Test
   24.1  Overview
   24.2  Organizing Documents
   24.3  Test Strategy Document
   24.4  Analysis and Test Plan
   24.5  Test Design Specification Documents
   24.6  Test and Analysis Reports

Bibliography

Index

List of Figures

1      Selective reading
1.1    Analysis and testing activities
2.1    Validation and verification
2.2    Verification trade-off dimensions
3.1    Unpredictable failure and predictable failure
3.2    Initialize before use problem
4.1    Dependability properties
5.1    Abstraction coalesces execution states
5.2    Constructing control flow graphs
5.3    Java method to collapse adjacent newline characters
5.4    Statements broken across basic blocks
5.5    Linear-code sequence and jump (LCSAJ)
5.6    Over-approximation in a call graph
5.7    Context sensitivity
5.8    Exponential explosion of calling contexts in a call graph
5.9    Finite state machine specification of line-end conversion procedure
5.10   Correctness relations for a finite state machine model
5.11   Procedure to convert among Dos, Unix, and Macintosh line ends
5.12   Completed FSM specification of line-end conversion procedure
6.1    GCD calculation in Java
6.2    Control flow graph of GCD method
6.3    Data dependence graph of GCD method
6.4    Calculating control dependence
6.5    Control dependence tree of GCD method
6.6    Reaching definitions algorithm
6.7    Available expressions algorithm
6.8    Java method with potentially uninitialized variable
6.9    Control flow with definitions and uses
6.10   Annotated CFG for detecting uses of uninitialized variables
6.11   CGI program in Python with misspelled variable
6.12   Powerset lattice
6.13   Spurious execution paths in interprocedural analysis
7.1    Binary search procedure
7.2    Concrete and symbolic tracing
8.1    Finite state verification
8.2    Misapplication of the double-check initialization pattern
8.3    FSM models from Figure 8.2
8.4    Promela finite state model
8.5    Excerpts of Spin verification tool transcript
8.6    Spin guided simulation trace describing race condition
8.7    A graphical interpretation of Spin guided simulation trace
8.8    Dining philosophers in Promela
8.9    A simple data race in Java
8.10   Coarse and fine-grain models of interleaving
8.11   Lost update problem
8.12   OBDD encoding of a propositional formula
8.13   OBDD representation of transition relation
8.14   Data model of a simple Web site
8.15   Alloy model of a Web site
8.16   Alloy model of a Web site (continued)
8.17   A Web site that violates the "browsability" property
9.1    A Java method for collapsing sequences of blanks
10.1   A Java class for finding roots of a quadratic equation
10.2   A quasi-partition of a program's input domain
10.3   The functional testing process
11.1   Specification of Check configuration
11.2   Specification of cgi decode
11.3   Elementary items of specification cgi decode
11.4   Test case specifications for cgi decode generated after step 2
12.1   The C function cgi decode
12.2   Control flow graph of function cgi decode
12.3   The control flow graph of C function cgi decode0
12.4   Deriving a tree from a control flow graph for boundary/interior testing
12.5   Buggy self-organizing list
12.6   Control flow graph of C function search
12.7   Tree of boundary/interior sub-paths for C function search
12.8   Subsumption relations among structural test adequacy criteria
13.1   The C function cgi decode
13.2   A C procedure with a large number of DU paths
13.3   Pointer arithmetic
14.1   Functional specification of feature Maintenance
14.2   The finite state machine corresponding to Maintenance
14.3   Functional specification of feature Pricing
14.4   Decision table for Pricing
14.5   Set of test cases corresponding to the modified adequacy criterion
14.6   Functional specification of Process shipping order
14.7   Control flow model of Process shipping order
14.8   Node-adequate test suite
14.9   Branch-adequate test suite
14.10  Functional specification of Advanced search
14.11  BNF description of Advanced search
14.12  XML schema for Product configuration
14.13  BNF description of Product configuration
14.14  Test case for feature Advanced Search
14.15  The BNF description of Product Configuration
14.16  Sample seed probabilities for the BNF of Product Configuration
15.1   Part of a Java implementation of class Model
15.2   More of the Java implementation of class Model
15.3   Class diagram for the LineItem hierarchy
15.4   Part of a Java implementation of class Account
15.5   Impact of object-oriented design on analysis and test
15.6   Statechart specification of class Model
15.7   Finite state machine corresponding to the statechart in Figure 15.6
15.8   Statechart specification of class Order
15.9   Finite state machine corresponding to the statechart in Figure 15.8
15.10  Class diagram of the Chipmunk Web presence
15.11  Use/include relation for the class diagram in Figure 15.10
15.12  Sequence diagram for configuring an order
15.13  Partial intraclass control flow graph for class Model
15.14  Summary information for structural interclass testing
15.15  Polymorphic method call
15.16  Part of a Java implementation of the abstract class LineItem
15.17  Part of a Java implementation of class CompositeItem
16.1   Program transduce
16.2   Sample mutation operators for C
16.3   Sample mutants for program Transduce
16.4   Edit distance check
17.1   JUnit tests in JFlex
17.2   Test harness with comparison-based test oracle
17.3   Testing with self-checks
17.4   Structural invariant as run-time self-check
18.1   Detailed description referenced by a checklist item
19.1   A C program invoking cgi decode
19.2   Purify verification tool transcript
19.3   Model of memory states
19.4   Concurrent threads with shared variables
19.5   Lockset state transition diagram
19.6   A Java method for inserting a node in an AVL tree
19.7   Sample set of predicates for behavior program analysis
19.8   Test cases for an AVL tree
19.9   Behavioral models for method insert
20.1   Alternative schedules
20.2   A sample A&T schedule
20.3   Typical fault distribution over time
21.1   Chipmunk Web presence hierarchy
22.1   Version 1.0 of the C function cgi decode
22.2   Version 2.0 of the C function cgi decode
22.3   Coverage of structural test cases for cgi decode
22.4   Control flow graph of function cgi decode version 2.0
22.5   New definitions and uses for cgi decode
22.6   Flow graph model of the extended shipping order specification
23.1   CodeCrawler code size visualization
24.1   Sample document naming conventions

List of Tables

11.1   Example categories and value classes
11.2   Test case specifications for Check configuration
11.3   Parameters and values for Display control
11.4   Pairwise coverage of three parameters
11.5   Pairwise coverage of five parameters
11.6   Constraints for Display control
11.7   A test catalog
11.8   Summary of catalog-based test cases for cgi decode
12.1   Test cases for cgi decode
13.1   Definitions and uses for C function cgi decode
13.2   DU pairs for C function cgi decode
14.1   A test suite derived from the FSM of Figure 14.2
15.1   Test cases to satisfy transition coverage criterion
15.2   Simple transition coverage
15.3   Equivalent scenarios
15.4   Pairwise combinatorial coverage of polymorphic binding
15.5   Testing history for class LineItem
15.6   Testing history for class CompositeItem
20.1   Standard severity levels for root cause analysis
21.1   Integration faults

Preface

This book addresses software test and analysis in the context of an overall effort to achieve quality. It is designed for use as a primary textbook for a course in software test and analysis, as a supplementary text in a software engineering course, and as a resource for software developers.

The main characteristics of this book are:

- It assumes that the reader's goal is to achieve a suitable balance of cost, schedule, and quality. It is not oriented toward critical systems for which ultra-high reliability must be obtained regardless of cost, nor will it be helpful if one's aim is to cut cost or schedule regardless of consequence.

- It presents a selection of techniques suitable for near-term application, with sufficient technical background to understand their domain of applicability and to consider variations to suit technical and organizational constraints. Techniques of only historical interest and techniques that are unlikely to be practical in the near future are omitted.

- It promotes a vision of software testing and analysis as integral to modern software engineering practice, equally as important and technically demanding as other aspects of development. This vision is generally consistent with current thinking on the subject, and is approached by some leading organizations, but is not universal.

- It treats software testing and static analysis techniques together in a coherent framework, as complementary approaches for achieving adequate quality at acceptable cost.

Why This Book?

One cannot "test quality into" a badly constructed software product, but neither can one build quality into a product without test and analysis. The goal of acceptable quality at acceptable cost is both a technical and a managerial challenge, and meeting the goal requires a grasp of both the technical issues and their context in software development.

It is widely acknowledged today that software quality assurance should not be a phase between development and deployment, but rather a set of ongoing activities interwoven with every task from initial requirements gathering through evolution of the deployed product. Realization of this vision in practice is often only partial. It requires careful choices and combinations of techniques fit to the organization, products, and processes, but few people are familiar with the full range of techniques, from inspection to testing to automated analyses. Those best positioned to shape the organization and its processes are seldom familiar with the technical issues, and vice versa. Moreover, there still persists in many organizations a perception that quality assurance requires less skill or background than other aspects of development.

This book provides students with a coherent view of the state of the art and practice, and provides developers and managers with technical and organizational approaches to push the state of practice toward the state of the art.

Who Is This Book For?

Students who read portions of this book will gain a basic understanding of principles and issues in software test and analysis, including an introduction to process and organizational issues. Developers, including quality assurance professionals, will find a variety of techniques with sufficient discussion of technical and process issues to support adaptation to the particular demands of their organization and application domain. Technical managers will find a coherent approach to weaving software quality assurance into the overall software process. All readers should obtain a clearer view of the interplay among technical and nontechnical issues in crafting an approach to software quality.

Students, developers, and technical managers with a basic background in computer science and software engineering will find the material in this book accessible without additional preparation.
Some of the material is technically demanding, but readers may skim it on a first reading to get the big picture, and return to it as needed.

A basic premise of this book is that effective quality assurance is best achieved by selection and combination of techniques that are carefully woven into (not grafted onto) a software development process for a particular organization. A software quality engineer seeking technical advice will find here encouragement to consider a wider context and participate in shaping the development process. A manager whose faith lies entirely in process, to the exclusion of technical knowledge and judgment, will find here many connections between technical and process issues, and a rationale for a more comprehensive view.

How to Read This Book

This book is designed to permit selective reading. Most readers should begin with Part I, which presents fundamental principles in a coherent framework and lays the groundwork for understanding the strengths and weaknesses of individual techniques and their application in an effective software process. Part II brings together basic technical background for many testing and analysis methods. Those interested in particular methods may proceed directly to the relevant chapters in Part III of the book. Where there are dependencies, the Required Background section at the beginning of a chapter indicates what should be read in preparation. Part IV discusses how to design a systematic testing and analysis process and incorporate it into an overall development process, and may be read either before or after Part III.

Readers new to the field of software test and analysis can obtain an overview by reading the following chapters:

1   Software Test and Analysis in a Nutshell
2   A Framework for Test and Analysis
4   Test and Analysis Activities Within a Software Process
10  Functional Testing
11  Combinatorial Testing
14  Model-Based Testing
15  Testing Object-Oriented Software
17  Test Execution
18  Inspection
19  Program Analysis
20  Planning and Monitoring the Process

Notes for Instructors

This book can be used in an introductory course in software test and analysis or as a supplementary text in an undergraduate software engineering course.

An introductory graduate-level or undergraduate course in software test and analysis can cover most of the book. In particular, it should include:

- All of Part I (Fundamentals of Test and Analysis), which provides a complete overview.

- Most of Part II (Basic Techniques), which provides fundamental background, possibly omitting the latter parts o
