
Advanced LS-OPT®: Deterministic & Probabilistic Optimization

Instructor

  • Anirban Basudhar, Ph.D.

Price

  • $600
  • $300 for students

What's Included

  • Three days of instruction
  • Class notes
  • Thirty-day demo license
  • Continental breakfasts
  • Lunches
  • One class dinner


Prerequisites

  • Required: Basic knowledge of metamodel-based optimization and result analysis using LS-OPT.
  • Strongly recommended: The Introduction to LS-OPT class, since it provides a foundation for some of the advanced topics.
  • Recommended but not required: An introductory class in LS-DYNA® for familiarity with a few keywords.

Syllabus

This course is intended to help engineers with a basic knowledge of LS-DYNA and LS-OPT become proficient in advanced optimization and probabilistic design methods. It aims to make you more productive at design and parameter identification of complex systems, such as multidisciplinary systems with competing objectives, advanced material testing and models, and systems with discontinuous responses. We will also provide insight into reliability and robustness in order to facilitate higher-quality product design. Additionally, we will introduce classification-based adaptive sampling constraints as a tool for enhancing efficiency.

In this course, we will discuss both the theoretical and practical aspects of design. We will also cover advanced topics such as multi-objective and collaborative optimization, digital image correlation, statistical classification, and probabilistic optimization. During workshop sessions, we will apply the theoretical topics discussed in lecture. We will use the LS-OPT Version 6.0 graphical user interface to teach input preparation and post-processing, and we will emphasize interfacing with LS-DYNA.

Content

Day 1

  • Course Outline
  • Introduction to Design Optimization & LS-OPT Basic Features Summary
  • Theory: Parameter Identification
    • Noisy Data: Filtering Computed Curves
    • Dynamic Time Warping (DTW)
    • Digital Image Correlation (DIC)
  • Examples: Set up, run, and post-process parameter identification examples
    • GISSMO failure model example
    • Defining multi-point histories for spatial data
    • Full-field calibration using DIC data
  • Theory: Collaborative Optimization
    • Multidisciplinary Optimization (MDO)
    • Multilevel Optimization
  • Examples: Set up, run, and post-process collaborative optimization examples
    • Mode tracking
    • Variable screening
    • MDO using a reduced set of variables
    • Multilevel Optimization
  • Theory: Classification-Based Constraint Handling
    • Discontinuous and binary responses
    • Classification-based constraint boundary definition
    • Support Vector Machine Classification (SVC)
  • Example: Optimization with Discontinuous Constraint Response
    • Defining a classifier
    • Optimization using a constraint defined by an SVC classifier
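To give a flavor of the Dynamic Time Warping topic covered on Day 1: DTW compares a computed curve against a test curve while tolerating local time shifts, which makes it useful for curve matching in parameter identification. The sketch below is a minimal, pure-Python illustration of the classic DTW distance recursion, not LS-OPT's implementation; the function name `dtw_distance` is ours.

```python
def dtw_distance(a, b):
    """Minimal Dynamic Time Warping distance between two 1-D curves.

    Illustrative only: a stand-in for the DTW curve-matching idea,
    not LS-OPT's implementation.
    """
    n, m = len(a), len(b)
    inf = float("inf")
    # D[i][j] = cost of the best alignment of a[:i] with b[:j]
    D = [[inf] * (m + 1) for _ in range(n + 1)]
    D[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            # a point may match, repeat, or skip relative to the other curve
            D[i][j] = cost + min(D[i - 1][j], D[i][j - 1], D[i - 1][j - 1])
    return D[n][m]
```

Because DTW allows one sample to align with several samples of the other curve, a time-shifted copy of a curve can still achieve zero distance, unlike a point-by-point norm.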

Day 2

  • Theory: Multiobjective Optimization (MOO)
    • Pareto front definition and MOO algorithm
    • Analyzing the Pareto front using the Viewer
  • Example: Setting up, running, and post-processing an MOO example
    • Create Pareto-optimal front
    • Trade-off Plot, Parallel Coordinate Plot (PCP), Self-Organizing Maps (SOM), Hyper-Radial Visualization (HRV)
  • Theory: Probabilistic Analysis
    • Statistics fundamentals
    • Probabilistic analysis methods
  • Example: Direct Monte Carlo Analysis
    • Uncertainty quantification using noise variables and statistical distributions
    • Latin Hypercube Sampling
    • Failure probability calculation
    • Statistical post-processing tools
    • DYNAStats
  • Example: Metamodel-Based Monte Carlo Analysis
    • Reliability calculation with noise variables and control variables
    • Statistical post-processing tools
    • Stochastic contribution
    • DYNAStats
  • Theory: Probabilistic Optimization
    • Reliability-based design optimization (RBDO)
    • Robust design
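As a concept sketch for two Day 2 topics, Latin Hypercube Sampling and failure probability calculation: LHS places exactly one sample in each equal-probability stratum of a variable's range, which typically stabilizes Monte Carlo estimates compared to plain random sampling. The code below is a pure-Python, one-dimensional illustration only; the names `latin_hypercube` and `failure_probability` are ours, not LS-OPT functions.

```python
import random


def latin_hypercube(n, seed=0):
    """One-dimensional Latin Hypercube sample on [0, 1).

    Places exactly one point in each of the n equal-probability strata
    [k/n, (k+1)/n), then shuffles the order. Illustrative sketch only.
    """
    rng = random.Random(seed)
    points = [(k + rng.random()) / n for k in range(n)]
    rng.shuffle(points)
    return points


def failure_probability(samples, limit_state):
    """Monte Carlo estimate of P[g(x) < 0].

    By the usual reliability convention, g < 0 denotes failure.
    """
    failures = sum(1 for x in samples if limit_state(x) < 0)
    return failures / len(samples)
```

For example, with the limit state g(u) = 0.8 − u (failure when u exceeds 0.8) and a uniform input on [0, 1), the stratification guarantees the estimate lands very close to the exact failure probability of 0.2.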

Day 3

  • Example: Reliability-Based Design Optimization
    • Optimization of Control Variables
    • Target probability of failure
  • Example: Robust Design
    • Noise and control variables
    • Standard deviation composite
    • Minimize effect of noise variables
  • Example: Sequential Metamodel-Based Monte Carlo Analysis
  • Example: Sequential Monte Carlo Analysis with Classifier-Based Adaptive Sampling Constraint
  • Stochastic Fields
  • Outlier Analysis (optional)
  • Metal Forming (optional)
  • Tolerance Optimization (optional)
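To illustrate the "standard deviation composite" idea from the Day 3 robust design example: instead of optimizing only the mean response over the control variables, a robust design ranks candidates by a composite such as mean + w·std evaluated over samples of the noise variables, so designs that are insensitive to noise are preferred. The sketch below is a toy pure-Python illustration under that assumption; `robust_optimum` and the weight value are ours, not LS-OPT's formulation.

```python
import random
import statistics


def robust_optimum(designs, response, noise_samples, weight=3.0):
    """Pick the design minimizing a mean + weight*std composite.

    Evaluates response(x, z) over fixed noise samples z for each
    candidate design x. Toy sketch of the robust-design idea only.
    """
    def composite(x):
        vals = [response(x, z) for z in noise_samples]
        return statistics.mean(vals) + weight * statistics.pstdev(vals)
    return min(designs, key=composite)


# Hypothetical response: deterministic optimum at x = 1, but the
# noise sensitivity grows with x, so the robust choice differs.
def response(x, z):
    return (x - 1.0) ** 2 + x * z
```

With Gaussian noise, the design x = 1 minimizes the mean response here, yet x = 0 wins the composite because its response is completely insensitive to the noise variable — exactly the trade-off robust design formalizes.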