N95-16456

Transcription

https://ntrs.nasa.gov/search.jsp?R=19950010041 2020-06-16T08:51:10+00:00Z

A Study of Software Standards Used in the Avionics Industry
Kelly J. Hayhurst
Assessment Technology Branch

Within the past decade, software has become an increasingly common element in computing systems. In particular, the role of software used in avionics applications is [...]. [NASA Langley Research Center has conducted software] experiments over the past 15 years with the goal of developing an understanding of the failure behavior of software so that improved methods for [producing reliable software] can be developed. As part of this program, the effectiveness of current industry standards for the development of avionics software is being investigated. This study involves the generation of [...]. The project involves the establishment of an experimentation test-bed to [apply candidate development] methods and collect data that can be used to make statistical inferences about the effectiveness of those methods. This test-bed allows the development and simulated operational testing of multiple implementations of a guidance and control application based on the terminal descent phase of the Viking lander. The test-bed is comprised of software requirements for the guidance and control application, a configuration management and data collection system, and a software simulator to run the [implementations of the guidance and] control [application].

[Although various] techniques for achieving and verifying the reliability of [software exist], and [certain standards], processes, and techniques are mandated by government regulating agencies such as [the Federal Aviation Administration and the Departm]ent of Defense, no one methodology has been shown to consistently produce reliable software. The [discipline of software engineering has] not reached the maturity of its hardware counterpart. To date, existing software development methods and standards have been accepted largely based on [anecdotal evidence rather than empirical data. Data typic]ally collected from a software development process include a description and some classification of [the faults found]. From a statistical perspective, this represents a single replicate of [an experiment. Som]e insight could be gained into the feasibility and impact of the software development method on that particular implementation of software. However, the single replicate does not provide enough information to make statistical inferences with confidence about the effectiveness of the development method in [general, or about the opera]tional behavior of the [software produced. To scien]tifically evaluate and improve software processes and [standards, data that ac]counts for the performance of [those processes are needed. For this reason, the Assessment Technology Branch] has conducted a [series of software experiments].

The simulator is designed to allow one or more implementations of the GCS to run in a multitasking environment and to collect data on the comparison of the results from multiple implementations. This test-bed provides a capability for empirically investigating the [effectiveness of software development methods and for] investigating the reliability of the resultant software. Currently, the GCS test-bed is being used to [evaluate development methods that] comply with [the RTCA/DO-178B guidelines for Airbor]ne Systems and [Equipment Certification, which must be followe]d by vendors, since compliance with [these guidelines is required by the FAA].

[Compliance with] these guidelines is required by the FAA for developing software to be used in [commercial aircr]aft. The purpose of the DO-178B document is to provide guidelines for the production of software for airborne systems that performs its intended function with a level of confidence in safety that complies with airworthiness requirements. It is hoped that following the guidelines in DO-178B will ensure the production of reliable software that is documented, traceable, testable, and maintainable. The guidelines, however, do not stipulate specific reliability requirements for the [software, since current reliability e]stimation techniques do not provide results in which confidence can be placed to the level required for certification purposes. The [DO-178B guidelines describe] three major processes: a [software planning process, the software development process]es, and integral processes. The software planning process defines and coordinates all of the project activities. The software development processes are [those that produce the software produc]t. These include the requirements, design, code, [and integration processes. The integral proc]esses ensure the correctness, control, and confidence of the [software life cycle processes. These include the software verification, configuration] management, quality assurance, and certification liaison processes.

To study the effectiveness of the DO-178B guidelines on the quality of the software, a simple case study in which two GCS implementations are being developed is being conducted. Two teams consisting of a programmer and a verification analyst have each been tasked to develop an implementation of the GCS following the DO-178B guidelines within the [GCS test-bed. Data are being collected on all faults found during the DO-]178B development process. This data includes: a description of the software errors found; the activity when the error was detected, such as [design review or code review; and the] action taken with respect to the error. This data will allow us to not only look at the number of faults detected but, more importantly, the class of [faults detected. The classes of] faults found at different development stages and the relationship among the classes of faults found by [the various verification activities], coupled with the effort data for all [activities, should provide so]me insight into the effectiveness of the various development and [verification methods. The implementations will also be run in a simulated operat]ional environment to help identify any [remaining faults and to gain ins]ight into the [reliabil]ity of the final software products.

Due to the extent of the data collection and configuration management procedures used in the test-bed, any phase in the lifecycle of the GCS implementations can be reproduced. This gives a researcher the capability to go back to any one of the stages of the [life cycle, apply a different development or ver]ification technique to the software, and [produce a revise]d implementation. Hence, the GCS development and verification environment can serve as a test-bed for the [investigation of other software development methods].

[Several lessons have been learned in conducti]ng software experiments during the course of this study of the DO-178B guidelines. A primary lesson is that a simple case study is not an adequate experiment design to evaluate an entire software development process. Conducting a [more rigorous experiment, howeve]r, would require significant resources in terms of time and man-power. Development of the GCS test-bed, though, is a step toward conducting the experimentation necessary to provide the empirical data we need to [understand software development processes a]nd product quality. The presentation provides further detail about the study of the DO-178B guidelines and the effort to conduct valid software [experiments].

A Study of Software Standards Used in the Avionics Industry
Kelly J. Hayhurst
Assessment Technology Branch
Research and Technology Group
The Role of Computers in LaRC R&D Workshop
June 15, 1994

Outline
- Background
- Software Standards
- Guidance and Control Software Project
- Summary

Background
- Software is used in a wide variety of applications:
  - video games, answering machines, automatic teller machines, cars (anti-lock brakes), ...
- Software has many benefits compared to its hardware counterpart:
  - allows for more complex logic
  - provides increased flexibility
  - easier to modify
- Use of software is increasing in life- and safety-critical applications
  - avionics (Airbus 320), control of nuclear power plants

Software Engineering
- Software is a logical rather than a physical system element
- Software is developed or "engineered" -- not manufactured
- Engineering: the systematic application of science and mathematics in the production of a product, system, or process
- Software engineering: the application of sound engineering principles to produce software that works efficiently on real machines

Reliable Software
- Achieving reliable software is a global problem
  - no one knows how to generate perfect software
- Many reliability models proposed (since '64)
  - most consider reliability growth based on faults found in development, as opposed to operational reliability (the concern for life-critical software)
- Often based on simplistic (unverified) assumptions
  - constant failure rates
  - stochastic independence
- Little existing data available to validate models

Software Dilemma
- Software can significantly expand system capability
- Since we don't know how to build perfect software -- Risk
- How do we deal with these risks?
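
The reliability growth models criticized on the Reliable Software slide above can be made concrete with a small sketch. The following is a minimal illustration, not material from the presentation, of a classic model of this kind (a Jelinski-Moranda-style calculation) in which each remaining fault contributes a constant hazard and inter-failure times are independent and exponential; the function names and parameter values are hypothetical.

```python
# Minimal sketch (not from the presentation) of a classic reliability-growth
# model of the kind questioned on this slide: each remaining fault contributes
# a constant hazard, and inter-failure times are assumed independent and
# exponentially distributed.  All parameter values are hypothetical.

def failure_rate(n_initial_faults: int, per_fault_hazard: float, faults_fixed: int) -> float:
    """Program failure rate after `faults_fixed` faults have been removed."""
    remaining = n_initial_faults - faults_fixed
    if remaining <= 0:
        return 0.0  # the model predicts a perfect program once all faults are gone
    return per_fault_hazard * remaining

def expected_time_to_next_failure(n_initial_faults, per_fault_hazard, faults_fixed):
    """Mean time to next failure under the exponential assumption."""
    rate = failure_rate(n_initial_faults, per_fault_hazard, faults_fixed)
    return float("inf") if rate == 0.0 else 1.0 / rate

if __name__ == "__main__":
    # Hypothetical parameters: 20 initial faults, each contributing
    # 0.005 failures per hour of operation.
    for fixed in (0, 5, 10, 15, 19):
        mttf = expected_time_to_next_failure(20, 0.005, fixed)
        print(f"{fixed:2d} faults fixed -> expected {mttf:8.1f} hours to next failure")
```

The slide's point is that assumptions like the constant per-fault hazard and statistical independence are rarely verified against data, which is why the empirical data collection described later matters.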

Software Standards
- There are a number of software guidelines/standards used in industry
  - DO-178B, used by the Federal Aviation Administration (FAA)
  - DoD-2167A, used by the Department of Defense
  - ISO 9000
- Provide guidelines for the production of software that performs its intended function with some level of confidence that complies with the given requirements

Software Standards
- Many software development techniques/standards exist and are in use
  - most have been accepted largely based on logical arguments or anecdotal evidence
- "...we need to codify engineering practices [into] standards as soon as we [can] ... they should, however, be [informed] by evidence ... just make matters worse."
  -- from Digital Woes (Why We Should Not Depend on Software), by Lauren Ruth Wiener

Focus
- We need to become "...informed by evidence"
- Conduct scientific software experiments to understand:
  - software failure -- need to examine operational behavior of software
  - the effect of different software development techniques
- relate that understanding to process models and standards
- Conduct Experiments!  Collect Empirical Evidence!

Software Experiments in ATB
- GOAL: Establish a controlled environment to conduct scientific experiments to address:
  - the reliability of software and
  - the effectiveness of software development methods
- Guidance and Control Software (GCS) Project
  - study of the RTCA/DO-178B guidelines (Software Considerations in Airborne Systems and Equipment Certification)
  - "sponsored" by the FAA

RTCA/DO-178B
- FAA requires compliance with DO-178B for software developed for embedded commercial aircraft equipment
  - software designers must take a disciplined approach to software development
- Gives general guidelines for software development and verification according to "software levels" -- A-E
  - A: anomalous behavior causes a catastrophic failure condition
  - E: anomalous behavior has no effect on operational capacity

Software Life Cycle Processes
- Software Planning Process
- Software Development Processes:
  - Software Requirements Process
  - Software Design Process
  - Software Coding Process
  - Integration Process
- Integral Processes: ensure correctness, control, and confidence
  - Software Verification Process
  - Software Configuration Management Process
  - Software Quality Assurance Process
  - Certification Liaison Process
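
The "software levels" on the RTCA/DO-178B slide above map the consequence of anomalous software behavior to levels A through E. The sketch below records that mapping; the slide states only the endpoints (A and E), so the intermediate descriptions are an assumption based on the standard's usual failure-condition categories, not a quotation from the presentation.

```python
# Sketch of the DO-178B software-level classification mentioned on this slide.
# Only levels A and E are spelled out in the presentation; the B, C, and D
# descriptions below are assumed from the standard's failure-condition
# categories and should be treated as such.

SOFTWARE_LEVELS = {
    "A": "anomalous behavior could cause a catastrophic failure condition",
    "B": "anomalous behavior could cause a hazardous/severe-major failure condition",
    "C": "anomalous behavior could cause a major failure condition",
    "D": "anomalous behavior could cause a minor failure condition",
    "E": "anomalous behavior has no effect on operational capacity",
}

if __name__ == "__main__":
    for level, consequence in SOFTWARE_LEVELS.items():
        print(f"Level {level}: {consequence}")
```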

DO-178B Life Cycle Process Activities
[Diagram: development and verification flow -- development and integral (verification) activities taking software requirements through design, code, units, and modules, with design review, code review, and software verification, to integrated/operational code]

Life Cycle Data
- Plan for Software Aspects of Certification
- Software Requirements Data
- Design Description
- Source Code
- Executable Object Code
- Software Verification Cases & Procedures
- Software Verification Results
- Software Configuration Management Plan
- Software Configuration Management Records
- Development Environment Configuration Index
- Software Configuration Index
- Software Quality Assurance Plan
- Software Quality Assurance Records
- Problem Reports
- Software Accomplishment Summary

CASE Tools
- CASE tools can be used in the development of airborne software
- Any tool used must be qualified
- Qualification is done by type:
  - Software Development Tools: tools whose output is part of the airborne software
    - ex. source code generator
  - Software Verification Tools: tools that cannot introduce errors -- but may fail to detect them
    - ex. analysis of complexity tool

CASE Tools Qualification
- For Software Development Tools:
  - show that the development process used for the tool is equivalent to that used for the airborne software
- For Software Verification Tools:
  - show that the tool complies with its operational requirements under normal operating conditions

Study of DO-178B Guidelines
- Work with the FAA to evaluate methods that comply with the DO-178B guidelines
- Base study on earlier work done at the Research Triangle Institute to study the DO-178A guidelines
- Experiment Design: One-Shot Case Study
  - X: Apply DO-178B and see what you get

Guidance and Control Software Project
- Develop software according to DO-178B
  - use a guidance and control application
  - complete the life cycle starting from software requirements through integration
- Provide a controlled environment
  - extensive documentation
  - extensive data collection and configuration control
    - failure data
    - effort and cost data
- Simulate operation of the software to:
  - determine remaining faults
  - determine reliability

The GCS Application
- Purpose:
  - (1) provide guidance and engine control of a vehicle landing on a planet during terminal descent to the planet's surface
  - (2) communicate sensor information about the vehicle and its descent to the receiving devices
- GCS requirements are based on a simulation program used to study the probability of success of the 1976 Viking lander mission to Mars
[Diagram: terminal descent trajectory, from parachute descent through terminal descent to touchdown]

[GCS Structure]
- The GCS software is composed of units which are divided into:
  - 3 Subframes: Sensor [Processing, Guidance Processing, and Control Law Processing]
  - 1 Frame = 1 iteration of the 3 subframes
  - 1 Trajectory = 2000 frames

GCS Development Processes
- Producing 2 GCS implementations
  - each implementation has a designated programmer & verification analyst
  - each team works from the same requirements document
- Implementations coded in FORTRAN
  - projected size: 1500 - 2000 lines of code
- conduct design review using formal inspection procedures
- conduct code review using formal inspection procedures
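
As an illustration of the scheduling structure described above (three subframes per frame, about 2000 frames per simulated trajectory), here is a minimal sketch. The subframe processing functions are hypothetical placeholders, and the subframe names beyond "Sensor" are taken from the simulator diagram later in the presentation; the actual GCS implementations were written in FORTRAN against the GCS requirements.

```python
# Illustrative sketch of the GCS scheduling structure described on this slide:
# each frame consists of three subframes, and one simulated trajectory is
# about 2000 frames.  The processing functions here are hypothetical
# placeholders, not the real GCS units (which were implemented in FORTRAN).

FRAMES_PER_TRAJECTORY = 2000
SUBFRAMES_PER_FRAME = 3

def sensor_processing(state):       # placeholder for the sensor-processing subframe
    return state

def guidance_processing(state):     # placeholder for the guidance-processing subframe
    return state

def control_law_processing(state):  # placeholder for the control-law subframe
    return state

SUBFRAMES = (sensor_processing, guidance_processing, control_law_processing)

def run_trajectory(initial_state):
    """Run one simulated trajectory: 2000 frames of 3 subframes each."""
    state = initial_state
    for _frame in range(FRAMES_PER_TRAJECTORY):
        for subframe in SUBFRAMES:
            state = subframe(state)
    return state
```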

Integration Process
- Code is integrated at 4 levels:
  - functional units
  - subframes
  - frames
  - trajectory
- Testing conducted at all 4 levels to:
  - demonstrate that the software satisfies its requirements
  - demonstrate (with high confidence) that errors which could lead to unacceptable failure conditions have been removed
- 100% coverage for requirements-based [test cases]

Products
[Diagram: development products and activities -- requirements, design, design review, code, code review, unit test, and coverage]
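
A requirements-based coverage check of the kind the integration process calls for (100% coverage of requirements by test cases) could be automated along the following lines. This is only a sketch; the requirement identifiers and test-case records are hypothetical, not taken from the GCS test-bed.

```python
# Hypothetical sketch of a requirements-based coverage check like the one the
# slide calls for (100% coverage of requirements by test cases).
# Requirement IDs and test-case names are invented for illustration.

def requirements_coverage(requirements, test_cases):
    """Return (fraction of requirements covered, set of uncovered requirement IDs)."""
    covered = set()
    for case in test_cases:
        covered.update(case["requirements"])
    uncovered = set(requirements) - covered
    fraction = 1.0 - len(uncovered) / len(requirements)
    return fraction, uncovered

if __name__ == "__main__":
    requirements = ["GCS-REQ-001", "GCS-REQ-002", "GCS-REQ-003"]   # hypothetical IDs
    test_cases = [
        {"name": "subframe_test_01", "requirements": ["GCS-REQ-001"]},
        {"name": "frame_test_01",    "requirements": ["GCS-REQ-002", "GCS-REQ-003"]},
    ]
    fraction, uncovered = requirements_coverage(requirements, test_cases)
    print(f"coverage: {fraction:.0%}, uncovered: {sorted(uncovered)}")
```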

More Products
[Diagram: development and test activities with milestones -- requirements, software design, design review, code, code review, unit test, subframe test, frame test, trajectory test; design milestones 0 through 7 and code milestones 1 through 5]

Software Products
- Each software product (requirements, design, code, test cases, documentation) is placed under configuration control starting with the initial version
  - the Code Management System (CMS) by Digital Equipment Corp. is being used
- Each subsequent change to a software product is controlled and captured by the configuration management system
- All versions of any software product are preserved and can be reproduced
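
The configuration-control idea described above -- every version of every product is captured and can be reproduced -- is illustrated by the generic sketch below. This is not the interface of DEC's Code Management System (CMS), which the project actually used; it is only a sketch of the concept, with invented names.

```python
# Generic illustration of the configuration-control idea on this slide:
# every version of a product is captured and can be reproduced later.
# NOT the interface of DEC's Code Management System (CMS) used by the project.

import hashlib

class ProductHistory:
    def __init__(self):
        self._versions = []          # list of (version_id, content) pairs, oldest first

    def check_in(self, content: str) -> str:
        """Capture a new version of the product and return its identifier."""
        version_id = hashlib.sha256(content.encode()).hexdigest()[:12]
        self._versions.append((version_id, content))
        return version_id

    def reproduce(self, version_id: str) -> str:
        """Reproduce any previously captured version."""
        for vid, content in self._versions:
            if vid == version_id:
                return content
        raise KeyError(version_id)

if __name__ == "__main__":
    history = ProductHistory()
    v1 = history.check_in("GCS design description, initial version")
    v2 = history.check_in("GCS design description, after design review")
    assert history.reproduce(v1).endswith("initial version")
    assert history.reproduce(v1) != history.reproduce(v2)
```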

Experiment
- Independently generate each [implementation] of the GCS [following the develop]ment [methodology] defined in [DO-178B]
- Collect effort/cost data for all development and verification activities for each implementation
- Collect data on all faults identified in the software products throughout the development and verification processes
- Collect data on all faults identified in simulated operation

GCS Simulator
- Provides inputs (about environment & lander) for sensor processing
- Performs response modeling
- Receives control data for the [response models]
[Diagram: data flow between the simulator's response models and a GCS implementation's processing steps (sensor processing, guidance processing, control law processing), with data sent and recorded at each step]

GCS Simulator
- Serves as a testbed for back-to-back testing of multiple GCS implementations (up to 28)
- For back-to-back testing, one implementation is designated as the "driver" implementation
- The results of all implementations are checked at the end of each subframe:
  - for limit errors, comparing each variable against its predetermined valid range
  - for accuracy errors, comparing results of each implementation with results of the driver implementation
- All miscomparisons are recorded and investigated to determine the source of the problem

Operational Failure Characterization
[Diagram: GCS implementations 1, 2, ..., n running in the GCS simulator for examination]
- Use the software failure data to:
  - estimate reliability of final version of each implementation
  - determine effectiveness of the development methodology
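
The back-to-back checks described on the GCS Simulator slide above (limit errors against a predetermined valid range, accuracy errors against the driver implementation) can be sketched as follows. The variable names, ranges, and tolerance are hypothetical; the real simulator performed these comparisons at the end of each subframe for up to 28 implementations.

```python
# Hypothetical sketch of the back-to-back checks the slide describes:
#   - limit errors:    each variable compared against a predetermined valid range
#   - accuracy errors: each implementation's results compared with the driver's
# Variable names, ranges, and the tolerance below are invented for illustration.

VALID_RANGES = {"altitude": (0.0, 2.0e5), "velocity": (-500.0, 500.0)}   # hypothetical
ACCURACY_TOLERANCE = 1.0e-6                                              # hypothetical

def check_subframe(driver_results, implementation_results):
    """Return a list of miscomparisons found at the end of one subframe."""
    problems = []
    for name, value in implementation_results.items():
        low, high = VALID_RANGES[name]
        if not (low <= value <= high):
            problems.append(("limit error", name, value))
        if abs(value - driver_results[name]) > ACCURACY_TOLERANCE:
            problems.append(("accuracy error", name, value, driver_results[name]))
    return problems

if __name__ == "__main__":
    driver = {"altitude": 1500.0, "velocity": -42.0}
    other  = {"altitude": 1500.0, "velocity": -42.5}   # disagrees with the driver
    for problem in check_subframe(driver, other):
        print(problem)   # each miscomparison would be recorded and investigated
```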

Understanding the Failure Data
- Questions of Interest
  - How many faults in the set?
  - What types of faults?
  - Are there any critical faults?
  - Are there classes of faults found during random testing that are different than those found [during code review]?
  - Is the integration process more effective (or efficient) compared to other fault detection methods?
[Diagram: fault sets from code-reviewed versions and from final versions run in the GCS simulator -- are these fault sets equivalent?]
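
Several of these questions reduce to comparing the sets of faults found by different detection activities. A simple illustration, with hypothetical fault identifiers:

```python
# Simple illustration of the fault-set comparisons behind the questions on
# this slide.  The fault identifiers are hypothetical.

faults_from_code_review    = {"F-03", "F-07", "F-11"}
faults_from_random_testing = {"F-07", "F-12"}

common       = faults_from_code_review & faults_from_random_testing
review_only  = faults_from_code_review - faults_from_random_testing
testing_only = faults_from_random_testing - faults_from_code_review
equivalent   = faults_from_code_review == faults_from_random_testing

print(f"found by both: {sorted(common)}")
print(f"only by code review: {sorted(review_only)}")
print(f"only by random testing: {sorted(testing_only)}")
print(f"fault sets equivalent: {equivalent}")
```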

GCS Project Status
- The following project artifacts have been developed:
  - Requirements for the guidance and control application
  - Configuration management system
  - GCS simulator
  - Data collection system
  - Project documentation
- 2 implementations are in the design phase
- Plan to complete development by end of December '94

Lessons Learned
- Be prepared to document -- and document -- and document
- Allow sufficient time up front for planning -- and documentation of those plans
- Tools can be helpful
  - can help you organize and track items more efficiently
- Tools can be hurtful
  - it takes time ( ) to learn all about new tools and how to use them -- allow for such time while planning
  - everyone involved with the output of a development tool needs to understand that tool

More Lessons
- Complying with the DO-178B guidelines is not cheap
  - developing critical software is time, man-power, and documentation intensive
- Collecting data -- software failure data and cost/effort data -- is difficult
  - software problems are often complex
  - changes can impact many project artifacts
  - reluctance to accurately account for development effort

Summary
- Gathering empirical evidence is difficult -- BUT IMPORTANT!!!
- GCS project provides a controlled environment to observe and collect empirical data on software development methods
  - Realistic application: guidance and control software
  - Applying industry-standard guidelines and practices
- Provide data to increase understanding of software development processes and the quality of their products
- Provide input for improving:
  - software development methods & product quality
  - reliability estimation
  - software standards

Project Plans
- Make the GCS testbed available to other researchers
- Improve the experiment design to allow more statistical analysis

GCS Package
- Software Requirements
- Intermediate & Final Development Products
- Verification Products (checklists, test cases)
- Simulator
- Documentation
