The workshop is free of charge
Online via MS Teams
November 24-25, 2021
In materials and component research, artificial intelligence methodologies will lead to massive upheavals in the coming years. The processes of material development, material processing, lifetime prediction and material characterization will change significantly. By combining AI methods and new forms of knowledge representation, the data-based management of product life cycles will take on new qualities. To address this emerging field of research, Fraunhofer IWM has set up the online workshop »AI Methods for Fatigue Behavior Assessment and Component Lifetime Prediction« on November 24 and 25, 2021. We are pleased to invite you to participate.
Manufacturers and operators of facilities and plants are faced with the challenge of ensuring and reconciling performance and economic efficiency as well as the reliability and safety of their systems. This requires suitable monitoring and maintenance concepts plus valid decision-making fundamentals for adapting operating points to changing operating conditions. Prerequisites for this are material models for service life assessment, methods for the qualification of critical components and a sound database.
The combination of AI methods and knowledge graphs introduces new possibilities for the data-based management of product life cycles. With a view to assessing the fatigue behavior of materials and predicting the service life of components, this results in a new quality of predictions and new starting points for reducing failure costs and increasing plant and systems availability.
In the workshop, renowned experts from science and industry will present such concepts and show how artificial intelligence methods and the digitalization of materials can be integrated into product development and into the operation of systems and facilities.
Prof. Chris Eberl, Deputy Director, Fraunhofer IWM
|Collecting and handling of operational data
How can the gap between sensor data and material information be closed? What are the relevant data streams? How can information on material behavior be extracted?
|01:30 p.m.||The need to increase efficiency in the generation and evaluation of fatigue data from operations
Dr. Matthias Funk, Schaeffler Technologies, Herzogenaurach
A lot of fatigue testing data is generated in component assessment tests. Unfortunately, in most cases it is not stored sustainably in standardized formats, even though this would be needed for a truly holistic evaluation of the data and for gaining knowledge from it.
The presented solution is a stack of browser-based tools that demonstrates Schaeffler's vision in the field of fatigue strength modelling and lifetime prediction in operations. The stack comprises a tool for machine-readable documentation, a tool for lifetime prediction based on machine learning models, and a test assistant that supports test program planning for a confident test result with a minimum number of specimens.
||Collecting and handling of operational data - status quo and development needs
|Material Data Structures for AI Applications
What are critical features that foster AI performance? How does AI relate to the scale of data structures?
|02:30 p.m.||Representation Learning and Knowledge Graphs
Dr. Mehwish Alam, Leibniz Institute for Information Infrastructure, FIZ Karlsruhe
Knowledge Graphs (KGs) constitute a large network of real-world entities and their relationships. KGs have recently gained attention in many tasks such as recommender systems, question answering, etc. Due to automated generation and the open-world assumption, these KGs are never complete. This talk will focus on the problem of KG completion, i.e., link prediction and entity typing based on KG embeddings as well as language models. It will also discuss a recently published benchmark dataset for KG completion based on Wikidata.
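The link-prediction task mentioned in the abstract can be sketched with a toy TransE-style scorer. Everything below (the miniature entity/relation vocabulary and the random, untrained embeddings) is invented for illustration; it shows only the scoring and ranking mechanics, not a trained model:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy vocabulary for a miniature materials KG; names are hypothetical.
entities = ["steel_A", "steel_B", "fatigue_test", "tensile_test"]
relations = ["characterized_by"]

dim = 8
ent_emb = {e: rng.normal(size=dim) for e in entities}
rel_emb = {r: rng.normal(size=dim) for r in relations}

def transe_score(head, relation, tail):
    """TransE plausibility: smaller ||h + r - t|| means a more likely triple,
    so we return the negative distance (higher = more plausible)."""
    return -np.linalg.norm(ent_emb[head] + rel_emb[relation] - ent_emb[tail])

def predict_tail(head, relation):
    """Link prediction: rank all other entities as candidate tails."""
    scores = {t: transe_score(head, relation, t) for t in entities if t != head}
    return max(scores, key=scores.get)

best = predict_tail("steel_A", "characterized_by")
```

In a real system the embeddings would be trained so that observed triples score higher than corrupted ones; here the ranking is arbitrary but the mechanics are the same.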
|03:30 p.m.||A new AI/ML Framework for materials innovation
Prof. Surya R. Kalidindi, Georgia Tech, Atlanta
A novel information gain-driven Bayesian ML framework is presented with the following main features: (i) explicit consideration of the physics parameters (describing deformation/damage mechanisms) as inputs (i.e., regressors) in the formulation of process-structure-property (PSP) surrogate models needed to drive materials improvement workflows; (ii) information gain-driven autonomous workflows for training efficient ML surrogates to otherwise computationally expensive physics-based simulations; (iii) versatile feature engineering for multiscale material internal structure using the formalism of n-point spatial correlations; (iv) amenable to a broad suite of surrogate model building approaches (including Gaussian Process regression (GPR), convolutional neural networks (CNN)); and (v) Markov chain Monte Carlo (MCMC)-based computation of posteriors for physics parameters using all available experimental observations (usually disparate and sparse). The benefits of this framework in capturing material response in monotonic and cyclic loading in heterogeneous materials will be demonstrated using multiple case studies.
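Feature (iii) of the abstract, n-point spatial correlations, can be illustrated for the simplest case, the 2-point autocorrelation of a binary microstructure, computed efficiently via FFT. The synthetic 64x64 "microstructure" below is random noise, purely for illustration; a useful sanity check is that the zero-shift value of the autocorrelation equals the phase volume fraction:

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic binary microstructure (1 = phase of interest), illustrative only.
m = (rng.random((64, 64)) < 0.3).astype(float)

def two_point_autocorrelation(field):
    """Periodic 2-point spatial correlation via FFT:
    S2(r) = <field(x) * field(x + r)>, averaged over all positions x."""
    F = np.fft.fftn(field)
    return np.fft.ifftn(F * np.conj(F)).real / field.size

s2 = two_point_autocorrelation(m)
phi = m.mean()  # volume fraction; equals S2 at zero shift for a binary field
```

Statistics such as these serve as low-dimensional, translation-invariant inputs to surrogate models (e.g., GPR or CNN regressors) in PSP workflows.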
|AI supported Development Processes
How are process aspects from the manufacture of components taken into account during development with the aim of achieving high fatigue strength? How can existing data be used? How are fatigue properties predicted during development?
|04:00 p.m.||Deep Learning in Materials Science and Technology
Dr. Tim Dahmen, German Research Center for AI, DFKI, Saarbrücken
|04:30 p.m.||AI-Accelerated Alloy Design
Bryce Meredig, Ph.D., Citrine Informatics, Redwood City, California
AI has emerged as a key tool for alloy design, as it can augment the ability of metallurgists to rapidly develop materials with improved properties. AI has several unique advantages that enable this accelerated development, including low computational cost and the ability to model complex, multiscale behavior such as fatigue and corrosion. These characteristics of AI further suggest a compelling connection with the integrated computational materials engineering (ICME) modeling toolset. We will discuss how AI fits with ICME and is poised to enable a new frontier of ICME capabilities.
|05:00 p.m.||Get Together|
|06:00 p.m.||End of day 1|
Dr. Christoph Schweizer, Fraunhofer IWM
|Evaluating the service life of critical components
Which models need which information? How can existing data be linked (data fusion)? How do testing methods and AI complement each other?
|02:00 p.m.||Coping with materials variance using transfer learning
Ali Riza Durmaz, Fraunhofer IWM
Materials' microstructures exist in pronounced variety, as they are signatures of the alloying composition and processing route. As materials become increasingly intricate and their development is accelerated, deep learning (DL) becomes relevant for automated and objective quantification of microstructure constituents. While DL frequently outperforms classical techniques by a large margin, its shortcomings are poor data efficiency and limited inter-domain generalizability, which clash with expensive data annotation and the diversity of materials.
To alleviate this issue, we propose to apply a sub-class of transfer learning methods called unsupervised domain adaptation (UDA). This class of learning algorithms addresses the task of finding domain-invariant features when supplied with annotated source data and unannotated target data, such that performance on the latter distribution is optimized despite the absence of annotations. This study addresses different surface etchings and imaging modalities in a complex phase steel segmentation task. The UDA approach substantially surpasses the generalizability of models trained with supervision alone.
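The flavor of unsupervised domain adaptation can be sketched with CORAL (correlation alignment), a simple non-adversarial baseline that aligns the second-order statistics of source features to the target; the talk's actual method is not specified here, and the two synthetic feature clouds below merely stand in for features from two etchings or imaging modalities:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic "source" and "target" feature clouds with different covariances
# (hypothetical stand-ins for two imaging domains).
Xs = rng.normal(size=(500, 4)) @ np.diag([1.0, 2.0, 0.5, 1.5])
Xt = rng.normal(size=(500, 4)) @ np.diag([2.0, 0.5, 1.0, 3.0])

def matrix_power(mat, p):
    """Fractional power of a symmetric positive-definite matrix via eigh."""
    vals, vecs = np.linalg.eigh(mat)
    return vecs @ np.diag(vals ** p) @ vecs.T

def coral(source, target, eps=1e-5):
    """CORAL: whiten source features, then re-color with target covariance."""
    d = source.shape[1]
    cs = np.cov(source, rowvar=False) + eps * np.eye(d)
    ct = np.cov(target, rowvar=False) + eps * np.eye(d)
    return source @ matrix_power(cs, -0.5) @ matrix_power(ct, 0.5)

aligned = coral(Xs, Xt)
```

After alignment, a model trained on the transformed source features sees statistics much closer to the unannotated target domain.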
||Bayesian geometric learning as a step towards nonparametric metamodeling in multiscale solid mechanics
Pierre Kerfriden, Ph.D., MINES ParisTech
In this talk, I will give an overview of the deep learning meta-modelling methodology recently developed in our group to build surrogate models for multiscale stress analysis, i.e., to accelerate repeated and expensive multiscale simulations by means of tailored regression algorithms. Typically, our approach is to solve coarse-scale equations online and correct the resulting solutions using (graph) convolutional neural networks. In this way, the corrections can be performed locally, allowing us to deploy efficient networks with a relatively small receptive field. Our networks are Bayesian, which is important for active learning, whereby only meaningful multiscale simulations are performed to build the meta-model.
|03:00 p.m.||Digital twins for monitoring purposes
Dr. Jörg F. Unger, Federal Institute for Materials Research and Testing, BAM
Safe and robust performance is a key criterion when building and maintaining structures and components. Ensuring this criterion at different stages of the lifetime can be supported by applying continuous monitoring concepts in combination with a numerical model, resulting in a digital twin of the structure. In this presentation, the advantages and challenges of updating a digital twin with monitoring data via Bayesian methods are discussed, including data structures, Bayesian parameter estimation with quantification of the uncertainties, and efficient modelling techniques in the context of reduced-order modelling. The methods are illustrated using different examples, from lab scale up to the structural level.
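The core of Bayesian model updating with monitoring data can be sketched with a minimal grid approximation of Bayes' rule. The scenario below is entirely hypothetical: noisy sensor readings of a single stiffness-like parameter `theta`, a vague Gaussian prior, and a posterior whose width quantifies the remaining uncertainty:

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical monitoring data: 20 noisy readings of a parameter theta.
theta_true, sigma = 2.0, 0.3
data = theta_true + sigma * rng.normal(size=20)

# Grid approximation of Bayes' rule: posterior ∝ likelihood × prior.
grid = np.linspace(0.0, 4.0, 2001)
prior = np.exp(-0.5 * ((grid - 1.5) / 1.0) ** 2)        # vague Gaussian prior
loglik = (-0.5 * ((data[:, None] - grid[None, :]) / sigma) ** 2).sum(axis=0)
post = prior * np.exp(loglik - loglik.max())            # shift for stability
post /= post.sum()

post_mean = (grid * post).sum()
post_std = np.sqrt(((grid - post_mean) ** 2 * post).sum())
```

Real digital-twin updating replaces the grid with MCMC or variational schemes over many parameters of a finite-element model, but the logic (prior, likelihood from monitoring data, posterior with quantified uncertainty) is the same.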
|The path to a lifespan app
How can material models, AI methods and operating data be linked for lifetime predictions? What would be scenarios of apps in use?
|04:00 p.m.||On the future role of digital knowledge bases/expert systems to support fatigue lifetime predictions
Dr. Christoph Schweizer, Fraunhofer IWM Freiburg
The fatigue life prediction of components is usually based on structural-mechanical models, which are often fed by material parameters from standards. Here, batch-specific influences are generally neglected, since the characterization effort is only worthwhile in exceptional cases. This talk discusses the future role of digital knowledge bases and expert systems, which have the potential to link relevant information from component production with quality assurance data in order to enable more precise service life evaluations.
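A common structural-mechanics baseline for the life prediction mentioned above is a Basquin-type S-N curve, which is linear in log-log space and can be fitted by ordinary least squares. The data points below are invented for illustration (not from any standard or real test series):

```python
import numpy as np

# Illustrative S-N data: stress amplitude (MPa) vs. cycles to failure.
stress = np.array([400.0, 350.0, 300.0, 250.0, 200.0])
cycles = np.array([1e4, 3e4, 1e5, 5e5, 3e6])

# Basquin relation sigma_a = C * N**b is linear in log-log space,
# so a degree-1 polyfit recovers the exponent b and coefficient C.
b, logC = np.polyfit(np.log10(cycles), np.log10(stress), 1)
C = 10.0 ** logC

def predicted_life(sigma_a):
    """Invert the fitted Basquin curve: N = (sigma_a / C)**(1 / b)."""
    return (sigma_a / C) ** (1.0 / b)
```

A knowledge-base-driven workflow would replace the generic parameters `C` and `b` with batch-specific values linked from production and quality assurance data.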
|04:30 p.m.||Final discussion|
|05:00 p.m.||End of the workshop|