Digital Engineering Metrics - SERC

Transcription

Digital Engineering Metrics
Sponsor: OUSD(R&E)
By Tom McDermott, Nicole Hutchison, Mark Blackburn, Annie Yu, Neal Chen (Stevens); Eileen Van Aken, Alejandro Salado, Kaitlin Henderson (Virginia Tech)
11th Annual SERC Sponsor Research Review
November 19, 2019
FHI 360 Conference Center, 1825 Connecticut Avenue NW, 8th Floor, Washington, DC 20009
www.sercuarc.org
SSRR 2019, November 19, 2019

Enterprise Modeling of the DoD Digital Information Exchange Process

- 2018: SERC Project RT-182 conceptually modeled the 5 goals of the DoD DE Strategy to identify necessary acquisition enterprise changes. Presented at the inaugural DEIXWG meeting, INCOSE IS 2018.
- 2019: Addressing multiple OUSD(R&E) research priorities:
  - DE Metrics (WRT-1001): determine critical ROI measures and improved SE value indicators
  - Model Curation (WRT-1009): curation practices, enablers, and technical innovation opportunities
  - DE Workforce (WRT-1006): DE competency model for DAU
  - DE Policy: building a model using DocGen

[Figure: enterprise changes identified by RT-182: change the nature of AoAs; change what Systems Engineers do; change how we curate data and models; change the requirements process]

WRT-1001 Digital Engineering Metrics: Project Guiding Questions

- If you had a "Program Office Guide to Successful DE Transition," what would that look like?
  - Extend previous SERC work on DE enterprise transformation to the program office level.
- How can the value and effectiveness of DE be described and measured?
  - Determine appropriate metrics for evaluating the benefits of DE transformation.
- Are there game-changing methods and/or technologies that would make a difference?
  - Analyze the DE Innovation System (methods, processes, and tools) to identify gaps, challenges, and potential paths for innovation.
- Can we describe an organizational performance model for DE transformation?
  - Generalize the data and results.

WRT-1001 DE Metrics Project Activities

[Diagram of project activity streams:]
- Horizon Scanning: digital business trends; DevOps trends; technologies & metrics
- RT-182 Results: initial enterprise transition model; acquisition outcomes focus; candidate metrics
- Literature Review: published DE/MBSE value, benefit, and gain; candidate metrics
- DE/MBSE Maturity Survey: assessment of relative maturity, value, and benefits; candidate metrics
- Innovation System Analysis: problem space definition; innovation areas; innovation enablers/barriers
- DoD Workshops: DoD program office focus; evaluation of metrics
- Outputs: Enterprise Performance Analysis; Program Success Guidance

DoD Digital Engineering Strategy: Strategic Management System

Digital Engineering (DE) Vision: Modernize how the Department designs, develops, delivers, operates, and sustains systems.
DE Mission: Securely and safely connect people, processes, data, and capabilities across an end-to-end digital enterprise.
Means: Process, Methods, Tools, Technology, People

Ways / Initiatives:
- Formalize the planning for models to support engineering activities and decision making across the lifecycle: formally develop, integrate, and curate models; use models to support engineering activities and decision making across the lifecycle
- Plan and develop the authoritative source of truth: govern the authoritative source of truth; use the authoritative source of truth across the lifecycle
- Use technological innovation to improve engineering practices: infuse technological innovations to enable the end-to-end digital enterprise; make use of data to improve awareness, insights, and decision making; advance human-machine interactions
- Infrastructure and environments support improved communication and collaboration: develop, mature, and use digital engineering IT infrastructures; develop, mature, and use digital engineering methodologies; secure IT infrastructure and protect intellectual property
- Transform culture and workforce engineering across the lifecycle: improve the digital engineering knowledge base; lead and support digital engineering transformation efforts; build and prepare the workforce

Ends:
- Models are used to inform enterprise and program decision making
- An enduring, authoritative source of truth is used over the lifecycle

Focus: Enterprise Transformation Measures

- Gartner: "Select just 5 to 9 metrics to track, report and act on. The value of a metric lies in its ability to influence business decisions."
- The best metrics:
  - Have a defined and defensible causal relationship to a business outcome
  - Work as a leading, not lagging, indicator
  - Address a specific defined audience, in a way they can understand
  - Drive action when they change from green to yellow to red
- There are no universal metrics; metrics must be enterprise specific.
- Good digital transformation metrics share some traits:
  - Measure people adoption and enterprise process adoption
  - Analyze breadth of usability, and issues with usability
  - Measure productivity indicators
  - Generate new value for the enterprise (revenue, operational efficiency, etc.)
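The green-to-yellow-to-red behavior described above can be sketched as a simple threshold check. This is an illustrative sketch only; the `status` function, the "model reuse rate" metric, and the thresholds are invented for the example, not drawn from Gartner or SERC.

```python
# Hypothetical sketch: classify a tracked metric reading into green/yellow/red
# bands so that a change of band can drive action. Metric names and threshold
# values are invented for illustration.

def status(value, yellow, red, higher_is_worse=True):
    """Return 'green', 'yellow', or 'red' for a metric reading."""
    if not higher_is_worse:
        # Flip the sign so one comparison direction handles both cases.
        value, yellow, red = -value, -yellow, -red
    if value >= red:
        return "red"
    if value >= yellow:
        return "yellow"
    return "green"

# Example leading indicator: model reuse rate, where higher is better.
# A reading of 0.35 sits between the red (0.25) and yellow (0.5) thresholds.
print(status(0.35, yellow=0.5, red=0.25, higher_is_worse=False))  # yellow
```

A real program office would attach an owner and a triggered action to each band change, per the "drive action" trait above.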

Horizon Scanning: Agile/DevOps and Digital Business Metrics

Agile/DevOps:
- Time to market: frequency; lead time; deployment time; customer tickets; volume
- User experience: features; system usage; limitations; change detection; availability; recovery time; security
- Predictability: escapes; failed deployments; error rates; costs

Digital business:
- Traction: adoption/scale; engagement; active usage; productivity; talent & skills
- Finance: change in business or operating model; new value; new revenues

RT-182 Results: DE Success Measures

Mapped against the DE Strategy ends and initiatives:
- Knowledge transfer: data/model reuse; link to mission engineering; depth of review; expanded visualization
- Innovation quality: defects/design escapes; AoA coverage; design space explored; SE rigor; CM
- Velocity/agility: automation; decision times; cycle time/agility
- User experience: collaboration; data/model reuse; interoperability; data search time; standards
- Adoption: pace of adoption; infrastructure investment; enterprise process & tool integration; tool/model interoperability; role/skill transition

RT-182 Results: Outcome-Driven Metrics from RT-217

- Reduce defects & design escapes
- Manage complexity: knowledge sharing & transfer
- More robust, SoS-based AoAs
- Repeatable link to mission-level simulation, capabilities
- Increase design space exploration
- Improved SE rigor & maturity
- Expanded visualizations for pattern analysis & decision making
- SE has more time for quality control
- Comprehensive, effective CM
- Increase speed of finding & using data
- Improve agility
- Amount of system reviewed in SETR process
- Enable innovation
- More rapid program decision cycles
- Pace of DE/MBSE adoption
- Data & model reuse
- Infrastructure investment
- Incorporation of standards
- Enterprise process & tool integration
- Improved collaboration across disciplines & locations
- Tool/model interoperability successes
- Improved reasoning/inference, leading to better automation
- Replace tech writers with modelers
- Improve modularity and interoperability

Literature Review: Initial Results

Searched 20 journals and conference proceedings for any paper that mentions Model-Based Systems Engineering. Identified papers that mention a benefit of MBSE and the source of that benefit: measured gains, observed gains, perceived gains (no source for benefit), or reference.

- Total papers that mention MBSE: 852
  - Papers that mention benefits: 361
    - Measured gains: 3
    - Observed gains: 27
    - Perceived gains: 236
    - Reference: 114
    - Misc.: 2

*Kaitlin Henderson (VT) PhD studies

DE/MBSE Maturity Survey: MBSE Benchmarking Survey

Objective:
- Assess the value and effectiveness of MBSE adoption for improving business outcomes (gov't, industry) relative to traditional methods. Develop a profile of MBSE use and meeting expectations across the life cycle.
- Where are we as organizations, and as an industry? Building models, or using models? Applying what we learn.
- Enable adopters to conduct a qualitative or quantitative assessment of their progress against MBSE best practices, with guidance on developing an improvement roadmap.

Method:
- Conduct an industry survey of MBSE capability. Align with the INCOSE draft DE Capabilities Definition matrix.
- Characterize MBSE practices, capability, value, and benefits.
- Probe alignment and integration with other adopter initiatives (e.g., PLM, DevOps, cross-discipline).
- Collect and share best practices and assets on MBSE benefits/value from the community.

Organizational Involvement:
- Participation call through industry associations: INCOSE (lead), NDIA
- Government sponsorship and support: DoD (OUSD(R&E)), FFRDCs (SERC)
- Survey administration by DoD SERC (Stevens Institute), an "honest broker" to protect proprietary data

Schedule:
- Survey: define (Sep-Oct); instrument (Oct-Nov); distribute (Dec-Jan); analyze (Feb-Mar)

Core Team:
- INCOSE: Garry Roedler; Troy Peterson
- NDIA: M&S Committee (Chris Schreiber); SE Division (Joe Elm, Geoff Draper, Garry Roedler)
- SERC: Tom McDermott, Nicole Hutchison

DE/MBSE Maturity Survey: MBSE Survey Overview

Topics and summary of survey questions:
1. MBSE Usage: (1) MBSE strategy documented at enterprise level; (2) MBSE processes & tools integrated, inform enterprise staff; (3) Q: Primary value of cross-functional MBSE integration?
2. Model Management: (4) Taxonomy for modeling across organization; (5) Well-defined processes/tools for model management; (6) Standard org guidance for model management/tools; (7) Q: Business value from consistent model management?
3. Technical Management: (8) Modeling basis for enterprise org processes; (9) MBSE process support for technical reviews; (10) Q: Value of MBSE (or digital engrg) in technical reviews?
4. Metrics: (11) Modeling provides measurable improvement across projects; (12) Consistent metrics across programs/enterprise?; (13) Q: Most useful metrics?
5. Model Quality: (14) Defined processes/tools for V&V of models; (15) Defined processes/tools for data/model quality assurance
6. Data Management: (16) Org approach for data interface between tools; (17) Data managed independent of tools for portability; (18) Q: Data management roles/processes?
7. Model Sharing and Reuse: (19) Teams establish, share, reuse org model libraries; (20) Org interface around models for stakeholder use; (21) Shared models used to consistently manage programs across lifecycle; (22) Q: Org implementation for data/model discovery, reuse?
8. Modeling Environments: (23) Modeling environment security; (24) Modeling environment protects IP; (25) Cross-discipline processes for tools, data interoperability; (26) Q: Value from collaborating on models across disciplines?
9. Organizational Implementation: (27) Q: Most challenging org obstacles for MBSE?; (28) Q: Best organizational enablers for MBSE?; (29) Q: Biggest changes our org needs for MBSE?
10. Workforce: (30) Organization defined critical roles to support MBSE; (31) Q: Top MBSE roles in your organization?; (32) Org staffing adequate to fill MBSE-related roles?
11. MBSE Skills: (33) Defined critical skills for MBSE; (34) Q: The most critical skills for MBSE?
12. Demographics: organizational size, domain, MBSE experience

Survey content is derived from the draft INCOSE Digital Engineering Capabilities Definition.

DE/MBSE Maturity Survey: Draft INCOSE DE Capabilities Definition Matrix

- Basis from The Aerospace Corporation MBSE Community Roadmap and the NASA MSFC MBSE Maturity Matrix
- Developed through a series of workshops with INCOSE and NDIA to form a proposed comprehensive Model-Based Enterprise Capability Matrix
- Will complete and release as an INCOSE DE Capabilities Document, Jan 2020

Model-based capability stages (Tools & IT Infrastructure excerpt):
- Collaboration: Stage 0, e-mail, telecom; Stage 1, system model file exchange; Stage 2, various organizations working on different parts of the model, full model integrated by a single organization; Stage 3, partial on-line, real-time collaboration amongst distributed teams; Stage 4, on-line, real-time collaboration amongst distributed teams
- Database/tool interoperability: Stage 0, none; Stage 1, tool-to-tool, ad hoc, supporting tools interact through file transfer; Stage 2, disparate database/tool management system interoperability; Stage 3, partial federated database (FDBMS), main tools interoperable, data interchanged among tools; Stage 4, fully federated with standard "plug-and-play" interfaces
- Inter-database/tool data item associations: Stage 0, databases/tools are independent; Stage 1, associations defined; Stage 2, associations defined, captured, managed; Stage 3, associations among all data items defined, captured, managed, and traceable; Stage 4, associations among all data items defined, captured, managed, and traceable, where changes in one data source alert owners of other data sources of intended updates
- Viewpoint/views: Stage 0, N/A; Stage 1, user IF, Doc Gen; Stage 2, UI draws from model; Stage 3, UI draws from multiple models/DBs; Stage 4, UI supports interrogation, multiple configs

DE/MBSE Maturity Survey: Organizational Maturity Capabilities

- DE incorporated into organization policy & work instructions
- Appropriate tools, environments, methods, resources available
- Organizational lexicon & taxonomies integrated into data repositories
- Model management activities in place
- DE basis incorporated into SE process descriptions for each phase
- Model CM applied
- Digital Thread and Digital Twin artifact baselines maintained in process
- Requirements traceable across programs at the enterprise level
- Model development practices in place
- DE basis incorporated into SETR criteria, processes, and artifacts
- Have a quality and improvement program that incorporates modeling
- Have a DE-based SE metrics program, including model metrics
- Model reuse standards, processes, and activities
- Model development standards, processes, and activities
- Model assurance standards, processes, and activities
- Fully federated data & IT infrastructure
- Data interchange across tools
- Data change notification and traceability
- Data and model libraries established and shared
- Data interrogation standards, processes, and activities
- Simulation standards, processes, and activities
- Data & information supports discoverable knowledge (data-driven decision processes)
- Enterprise planning & decisions from digital artifacts (threads and twins)
- Tool research & improvement forums in place
- Model exchange standards, processes, and activities with acquirers and subcontractors
- Training programs
- Model development standards
- Roles, competencies, and skills in place

DE/MBSE Maturity Survey: WRT-1008 Model Curation, INCOSE SE Leading Indicators Guide

- Requirements: growth, volatility, discovery
- Process Compliance: discrepancies
- System Definition Change Backlog: rate, resolution time, closure rate
- Defects/Errors: discovery, closure, phase containment
- Facility & Equipment Availability
- Interface Trends: discovery
- Requirements Validation: completed
- System Affordability: cost & confidence
- Requirements Verification: completed
- Architecture: base measures, maturity by review cycle
- Work Product Approval: in-work, rework
- Schedule & Cost Pressure: budget, schedule, risks
- Review Action Item Closure: burndown
- Risk Exposure: number, burndown
- Risk Treatment: number of actions
- Technology Maturity: TRLs and change
- Technical Measurement: TPMs
- SE Staffing & Skills: plan/actual

* Affected by DE
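As a concrete illustration of one leading indicator in the list above, requirements volatility is commonly computed as the fraction of the requirements baseline that was added, modified, or deleted in a reporting period. This sketch is illustrative only; the function name and the numbers are invented, and the exact definition in the INCOSE guide may differ.

```python
# Illustrative sketch of a "requirements volatility" leading indicator:
# churn in a reporting period as a fraction of the current baseline.
# (Hypothetical numbers; not taken from the INCOSE guide.)

def requirements_volatility(added, modified, deleted, baseline_total):
    """Fraction of the requirements baseline churned this period."""
    if baseline_total <= 0:
        raise ValueError("baseline must be non-empty")
    return (added + modified + deleted) / baseline_total

# Example period: 12 new, 30 changed, 8 removed against a 500-requirement baseline.
vol = requirements_volatility(added=12, modified=30, deleted=8, baseline_total=500)
print(f"{vol:.1%}")  # 10.0%
```

Tracked period over period, a rising value warns of instability well before it shows up in cost or schedule, which is what makes it a leading rather than lagging indicator.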

Horizon Scanning: Which of These Digital Innovations Will Transform the Engineering Disciplines?

- 5G mobility: enhanced bandwidth and connectivity mobile services
- Collaborative telepresence: highly realistic, haptics-enabled video conferences
- AI and ML: artificial intelligence and machine learning
- Immersive realities: human, augmented, and/or virtual reality technology integration
- Blockchain: blockchain-derived technologies to manage workflows
- Cloud evolution: evolving cloud computing architectures
- NL/chatbots/social robots: truly human-realistic natural language interfaces
- IoT: Internet of Things sensors and architectures
- Low-code SW: domain-specific design languages / visual composition design methods
- DevSecOps: secure continuous development and deployment environments
- Quantum computing: evolving non-binary computing architectures
- Advanced manufacturing: rapid programming/realization of hardware designs
- DNA-based data storage: high-speed, ultra-high-capacity storage devices
- Digital identities: computer (not human) determines identity verification

* SERC Project WRT-1001, presented to the Digital Engineering Working Group 08/2019

Innovation System Analysis: Competing Views of Innovation System Evolution

[Figures: "The wave model" (Dahmann et al.); "Multilevel view of technology transitions" (Geels)]

Innovation System Analysis: GPI Application Generator

Start with a pressing challenge, then generate ideas.
Challenge: How might we improve tool and model interoperability?

[Figure: GPI cards]

Summary and Completion

[Repeats the project activities diagram: Horizon Scanning; RT-182 Results; Literature Review; DE/MBSE Maturity Survey; Innovation System Analysis; DoD Workshops; Enterprise Performance Analysis; Program Success Guidance]

Questions?
Thank you!
