Introduction To Scientific Machine Learning

Transcription

Introduction to Scientific Machine Learning
Jian Tao
jtao@tamu.edu
HPRC Short Course
03/26/2021

LAB MEMBERS
Pilot Project 1 - Microstructure Informatics: Raymundo Arroyave, Ulisses Braga-Neto, Levi McClenny, Vahid Attari
Pilot Project 2 - Reservoir Simulation: Eduardo Gildin, Ulisses Braga-Neto, Yalchin Efendiev
Pilot Project 3 - Thermonuclear Supernovae: Lifan Wang, Jian Tao, Lisa Perez
https://sciml.tamids.tamu.edu/

TensorDiffEq (Levi McClenny, Ulisses Braga-Neto) is a Python package built on top of TensorFlow to provide scalable and efficient PINN solvers. TensorDiffEq's primary purpose is the scalable solving of PINNs (inference) and inverse problems (discovery). Additionally, TensorDiffEq is the only package that fully supports and implements Self-Adaptive PINN solvers and is the only multi-GPU PINN solution suite that is fully open-source.

Upcoming TAMIDS SciML Lab Talk (April 14)
https://github.com/ChrisRackauckas

Upcoming Hackathon on Material Design with Graph Learning (April 19 - 23)
A week-long hackathon to explore potential applications of graph learning in material design.
Please contact jtao@tamu.edu if you are interested.
https://github.com/divelab/DIG/

Upcoming Tutorial on TensorDiffEq (Early May)
https://github.com/tensordiffeq/TensorDiffEq

Introduction to Scientific Machine Learning
01 Part I: Setting up a working environment on Terra (15 mins)
02 Part II: Introduction to Scientific Machine Learning (90 mins)
03 Part III: Hands-on Demos (30 mins)
Q&A (5 mins/part)

Part I. Working Environment
HPRC Portal

Log in to the HPRC Portal (Terra)

Terra Shell Access - I

Terra Shell Access - II

Using the Pre-installed Julia Module
Step 1. Find the module to be loaded:
  module spider Julia
  Julia/1.0.5-linux-x86_64
  Julia/1.4.1-linux-x86_64
  Julia/1.5.1-linux-x86_64
(You can also use the web-based interface to find software modules available on HPRC systems; see SW:Julia - TAMU HPRC. Experienced users can jump to Step 2.)
Step 2. Load the module:
  module load Julia/1.5.1-linux-x86_64
Step 3. Start the Julia REPL:
  julia

Using Your Own Julia Installation
Step 1. Find the version to be installed.
(You can find different versions of Julia at Download Julia; the latest stable version of Julia is highly recommended. Right-click, copy the link, and paste it below. See also SW:Julia - TAMU HPRC.)
Step 2. Download & unzip:
  cd $SCRATCH
  wget https://.../julia-1.5.4-linux-x86_64.tar.gz
  tar zxvf julia-1.5.4-linux-x86_64.tar.gz
Step 3. Start the Julia shell:
  cd $SCRATCH/julia-1.5.4/bin; ./julia
(Ctrl-D or type exit() to quit the Julia shell.)

Install Julia Packages
# Set the Julia depot path under $SCRATCH
  export JULIA_DEPOT_PATH=$SCRATCH/.julia
# Start Julia
  cd $SCRATCH/julia-1.5.4/bin; ./julia
# Type ']' to open the Pkg REPL
  julia> ]
  (@v1.5) pkg> add Plots

Julia - Quickstart
The julia program starts the interactive REPL. You will be immediately switched to the shell mode if you type a semicolon. A question mark will switch you to the help mode. The TAB key can help with autocompletion.
  julia> versioninfo()
  julia> VERSION
Special symbols can be typed with the escape symbol and TAB, but they might not show properly on the web-based terminal.
  julia> \sqrt TAB → √
  julia> for i ∈ 1:10 println(i) end   # ∈ is typed with \in TAB

Part II. Introduction to Scientific Machine Learning (SciML)
References:
- Workshop Report on Basic Research Needs for Scientific Machine Learning: Core Technologies for Artificial Intelligence, by Baker, Nathan, et al. https://doi.org/10.2172/1478744
- Physics-informed neural networks: A deep learning framework for solving forward and inverse problems involving nonlinear partial differential equations, by Raissi, M., Perdikaris, P., & Karniadakis, G. E. Journal of Computational Physics, 378, 686-707 (2019).
- SciML: Scientific Machine Learning Software, by Chris Rackauckas et al. https://sciml.ai/

Data-Driven vs Theory-Driven
- Theory-Driven Modeling (Numerical Simulation): Model + Data → Computer → Prediction
- Data-Driven Modeling (Supervised Machine Learning): Data + Prediction → Computer → Model

Balance between Data and Theory
- Theory-Driven Model: less data, more theory.
- Data-Driven / ML Model: more data, less theory; noisy, realistic, blackbox.
* Data assimilation is somewhere in between but not necessarily balanced.

Middle Ground - Data Assimilation
(Image Credit: Simplistic Overview of Reanalysis Data Assimilation Methods, NCAR)

SciML - Best of the Two Worlds
How a machine learning model can work with a theory-driven model:
- Feeding: scenario studies, model validation, parameter search
- Replacing: post-processing, neural solver, surrogate model
- Nesting: inverse problems, faster programs, NAS (neural architecture search)

Theory / Numerical Modeling

Data-Driven Model - Supervised Learning
When both the input variables X and the output variables Y are known, one can approximate the mapping function from X to Y, as sketched below.
Step 1: Training - Training Data → ML Algorithm → Model
Step 2: Testing - Test Data → Model
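To make the two steps concrete, here is a minimal Julia sketch of supervised learning; the linear model, the 80/20 split, and all variable names are illustrative assumptions, not from the slides.

# Toy supervised learning: approximate the mapping from X to Y
using Random
Random.seed!(1)

X = collect(0.0:0.1:10.0)
Y = 3 .* X .+ 2 .+ 0.1 .* randn(length(X))   # noisy observations of y = 3x + 2

# Step 1: Training - fit a line by least squares on the first 80% of the data
n = round(Int, 0.8 * length(X))
A = [X[1:n] ones(n)]
a, b = A \ Y[1:n]

# Step 2: Testing - evaluate the model on the held-out 20%
Y_pred = a .* X[n+1:end] .+ b
rmse = sqrt(sum(abs2, Y_pred .- Y[n+1:end]) / (length(X) - n))
println("test RMSE ≈ $rmse")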

Artificial Neural Network
Input → Hidden Layers → Output
(Image Credit: Wikipedia)

Supervised Deep Learning with Neural Networks
From one layer to the next:
  Y3 = f(W1*X1 + W2*X2 + W3*X3 + b3)
where f is the activation function, Wi are the weights, and bi is the bias.
(Figure: inputs X1, X2, X3 feed through hidden layers to the output.)
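A minimal Julia sketch of this layer-to-layer computation, assuming a single output neuron and a sigmoid activation (the numeric values are made up for illustration):

# One dense layer: Y = f(W*X + b)
σ(z) = 1 / (1 + exp(-z))          # sigmoid activation f

X = [0.5, -1.2, 3.0]              # inputs X1, X2, X3
W = [0.2 -0.4 0.1]                # weights W1, W2, W3 (1×3 matrix)
b = 0.1                           # bias b3

Y3 = σ.(W * X .+ b)               # apply f elementwise to the weighted sum
println(Y3)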

Activation Functions
(Image Credit: towardsdatascience.com)
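For reference, the common activation functions from such charts can be written as Julia one-liners (a sketch; the slide itself only shows their plots):

sigmoid(z)    = 1 / (1 + exp(-z))        # squashes input to (0, 1)
tanh_act(z)   = tanh(z)                  # squashes input to (-1, 1)
relu(z)       = max(zero(z), z)          # zero for negative inputs
leaky_relu(z) = z > 0 ? z : 0.01 * z     # small slope instead of zero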

Training - Minimizing the Loss
The loss function L is defined with respect to the weights and biases (W1, b1), (W2, b2), (W3, b3). The weight update is computed by moving a step in the opposite direction of the cost gradient:
  w ← w − η ∂L/∂w
Iterate until L stops decreasing.
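A dependency-free Julia sketch of this update rule on a toy one-parameter loss; the loss L(w) = (w − 2)², the learning rate η, and the step count are illustrative assumptions:

L(w)    = (w - 2)^2        # toy loss with its minimum at w = 2
dLdw(w) = 2 * (w - 2)      # gradient of the loss

function descend(w; η = 0.1, steps = 100)
    for _ in 1:steps
        w -= η * dLdw(w)   # step opposite to the gradient
    end
    return w
end

println(descend(0.0))      # ≈ 2.0, where L stops decreasing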

Deep Neural Network as a Universal Approximator
Universal Approximation Theorem (Cybenko, 1989): universal approximation theorems imply that neural networks can represent a wide variety of functions.
Pinkus Theorem (Pinkus, 1999): Pinkus' theorem implies that neural networks can represent derivatives of a function simultaneously.
- Training: given input and output, find the best-fit F.
- Inference: given input and F, predict the output.
(Figure: forward propagation maps inputs X1, X2, X3 to outputs y1, y2; backward propagation carries gradients back through the network.)

SciML - Best of the Two Worlds (recap)
How a machine learning model can work with a theory-driven model:
- Feeding: scenario studies, model validation, parameter search
- Replacing: post-processing, neural solver, surrogate model
- Nesting: inverse problems, faster programs, NAS (neural architecture search)

Sample Workflow of a SciML Application
(Diagram: a machine learning model and a theory-driven model are combined in a nested function; a loss function evaluates the result and is minimized by gradient descent.)
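A minimal Julia sketch of such a workflow, nesting a theory-driven term with a small learned correction and fitting the correction by gradient descent; the physics term, the correction form, and the data are invented for illustration:

physics(x) = sin(x)                        # theory-driven part of the model
correction(x, θ) = θ[1] * x + θ[2]         # tiny "ML" term with parameters θ
model(x, θ) = physics(x) + correction(x, θ)

data_x = 0.0:0.1:1.0
data_y = sin.(data_x) .+ 0.5 .* data_x     # observations the theory alone misses

loss(θ) = sum(abs2, model.(data_x, Ref(θ)) .- data_y)

# Gradient descent with finite-difference gradients, to keep the sketch dependency-free
function fit(θ; η = 0.01, steps = 2000, h = 1e-6)
    for _ in 1:steps
        g = [(loss(θ .+ h .* e) - loss(θ .- h .* e)) / (2h)
             for e in ([1.0, 0.0], [0.0, 1.0])]
        θ = θ .- η .* g
    end
    return θ
end

println(fit([0.0, 0.0]))   # ≈ [0.5, 0.0]: the learned correction is 0.5x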

Automatic Differentiation
An autodiff system converts the program into a sequence of primitive operations to compute derivatives.
(Figure: a computational graph with intermediate values w1 ... w5, evaluated in a forward pass.)
Given a function, derivatives can also be computed numerically (finite difference) or symbolically.
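A small Julia comparison of the three routes to the derivative of f(x) = x² sin(x) at x = 1; the use of ForwardDiff.jl is an assumption (the slide names no package), chosen as the standard forward-mode autodiff library in Julia:

using ForwardDiff

f(x) = x^2 * sin(x)

# Automatic: the program is decomposed into primitive operations
ad = ForwardDiff.derivative(f, 1.0)

# Numerical: central finite difference, accuracy limited by the step h
h = 1e-6
fd = (f(1.0 + h) - f(1.0 - h)) / (2h)

# Symbolic (by hand): d/dx [x^2 sin x] = 2x sin x + x^2 cos x
sym = 2 * 1.0 * sin(1.0) + 1.0^2 * cos(1.0)

println((ad, fd, sym))   # all three agree to ≈ 6 digits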

Physics-Informed Neural Networks (PINNs)
Lu Lu, Xuhui Meng, Zhiping Mao, George E. Karniadakis, DeepXDE: A deep learning library for solving differential equations (https://arxiv.org/pdf/1907.04502.pdf)
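In outline, a PINN trains a network u_θ to satisfy both the data/boundary conditions and the differential equation itself; a generic form of the composite loss (notation assumed here, not taken from the slide) is

\mathcal{L}(\theta) =
  \frac{1}{N_b} \sum_{i=1}^{N_b} \left| u_\theta(x_b^i) - g(x_b^i) \right|^2
  + \frac{1}{N_f} \sum_{j=1}^{N_f} \left| \mathcal{N}[u_\theta](x_f^j) \right|^2

where the first term penalizes mismatch with the boundary/initial data g, and the second penalizes the residual of the differential operator \mathcal{N} at collocation points.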

Hands-on Session
Getting Started with NeuralPDE.jl

Install Julia Packages
# Set the Julia depot path under $SCRATCH
  export JULIA_DEPOT_PATH=$SCRATCH/.julia
# Start Julia
  cd $SCRATCH/julia-1.5.4/bin; ./julia
# Type ']' to open the Pkg REPL
  julia> ]
  (@v1.5) pkg> add Plots, NeuralPDE, Flux, ModelingToolkit, GalacticOptim, Optim, DiffEqFlux
# Type 'Backspace' to get back to the REPL, then paste the code directly into the shell.
  julia> using NeuralPDE, Flux, ModelingToolkit, GalacticOptim, Optim, DiffEqFlux

ODE with a 3rd-Order Derivative
(Reference: ODE with a 3rd-Order Derivative · NeuralPDE.jl)

using NeuralPDE, Flux, ModelingToolkit, GalacticOptim, Optim, DiffEqFlux

@parameters x
@variables u(..)
Dxxx = Differential(x)^3
Dx = Differential(x)

# ODE
eq = Dxxx(u(x)) ~ cos(pi*x)

# Initial and boundary conditions
bcs = [u(0.) ~ 0.0,
       u(1.) ~ cos(pi),
       Dx(u(1.)) ~ 1.0]

# Space and time domains
domains = [x ∈ IntervalDomain(0.0, 1.0)]

# Neural network
chain = FastChain(FastDense(1, 8, Flux.σ), FastDense(8, 1))

discretization = PhysicsInformedNN(chain, QuasiRandomTraining(20))
pde_system = PDESystem(eq, bcs, domains, [x], [u])
prob = discretize(pde_system, discretization)

cb = function (p, l)
    println("Current loss is: $l")
    return false
end

res = GalacticOptim.solve(prob, ADAM(0.01); cb = cb, maxiters = 2000)
phi = discretization.phi

# Compare the prediction against the analytic solution
analytic_sol_func(x) = (π*x*(-x + (π^2)*(2*x - 3) + 1) - sin(π*x)) / (π^3)
dx = 0.05
xs = [domain.domain.lower:dx/10:domain.domain.upper for domain in domains][1]
u_real = [analytic_sol_func(x) for x in xs]
u_predict = [first(phi(x, res.minimizer)) for x in xs]

using Plots
x_plot = collect(xs)
plot(x_plot, u_real, title = "real")
plot!(x_plot, u_predict, title = "predict")

Machine Learning Pilot Projects
Pilot Project 1 - Microstructure Informatics
Pilot Project 2 - Reservoir Simulation
Pilot Project 3 - Thermonuclear Supernovae