Session: 16-03-04: Foundational Framework II
Paper Number: 172629
A Simulation Workflow for Additive Manufacturing
A simulation workflow captures the steps necessary to form a satisfactory representation of a physical process using physics-based and statistical models as well as calibration data. Assessing the quality of the experimental data and the confidence in a modeling hierarchy is critical before deploying simulation applications. This presentation will share an example of the uncertainty quantification and trustworthiness assessment of a simulation workflow for laser-based powder bed fusion (PBF-LB) additive manufacturing (AM).
PBF-LB AM has demonstrated its capability in building near-net-shape products, but its complex process design can result in high variability in the mechanical properties of as-built parts. A predictive capability for the processing-microstructure-properties relationship is essential for AM practitioners to evaluate the sensitivity of as-built product properties to the AM process parameters and to guide process design. Furthermore, while the uncertainty of a single model may not directly degrade predictive accuracy, it can be amplified through the modeling hierarchy and increase the overall workflow uncertainty, implying lower trustworthiness of the workflow for some engineering applications. To demonstrate the trustworthiness assessment, including the verification, validation, and uncertainty quantification (VVUQ) plan, this work adopts software tools to simulate the laser-matter interaction during the PBF-LB AM process.
The proposed assessment plan includes four steps to establish and assess confidence in the workflow:
(1) Function definition is a procedure to define the purpose of implementing or adopting a computational tool
(2) Data engineering refers to the process of managing data and metadata, including analyzing data quality
(3) Credibility evidence applies various processes and metrics to assess the trustworthiness of the workflow for its intended use
(4) Sensitivity analysis identifies the workflow components with the greatest impact on decision-making
A use case demonstrates the use of AM Bench data and signal processing tools, such as phase shift, noise reduction, and image processing, with statistical functions to assess the variability of the AM process. The thermography and melt pool images are analyzed for melt pool features and laser-matter interactions. A pipeline of computations combines commercial and open-source software packages to predict melt pool features, microstructural characteristics, and mechanical properties such as yield strength and ultimate tensile strength. These software tools are modularized for a workflow management tool, Snakemake, to streamline the working procedures. This presentation will also demonstrate a strategy for model calibration, verification, validation, and uncertainty quantification using the National Institute of Standards and Technology (NIST)'s AM Bench data.
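The modular pipeline described above can be sketched as a chain of stand-in Python functions, mirroring how Snakemake would invoke each model as a separate rule. The function names, formulas, and parameter values here are illustrative assumptions only, not the actual commercial or open-source modules used in the workflow:

```python
# Minimal sketch of the processing-microstructure-properties hierarchy.
# Each function is a hypothetical stand-in for a pipeline stage that
# Snakemake would run as an independent, modularized rule.

def predict_melt_pool(process_params):
    """Stand-in for a laser-matter interaction (thermal) model."""
    # Illustrative placeholder: melt pool width scales with power / scan speed.
    return {"width_um": 100.0 * process_params["power_W"] / process_params["speed_mm_s"]}

def predict_microstructure(melt_pool):
    """Stand-in for a solidification / microstructure model."""
    # Illustrative placeholder: grain size scales with melt pool width.
    return {"grain_size_um": 0.1 * melt_pool["width_um"]}

def predict_properties(microstructure):
    """Stand-in for a structure-property model (Hall-Petch-type relation)."""
    return {"yield_strength_MPa": 500.0 + 50.0 / microstructure["grain_size_um"] ** 0.5}

def run_workflow(process_params):
    """Chain the modules; uncertainty in any stage propagates downstream."""
    melt_pool = predict_melt_pool(process_params)
    microstructure = predict_microstructure(melt_pool)
    return predict_properties(microstructure)

if __name__ == "__main__":
    print(run_workflow({"power_W": 200.0, "speed_mm_s": 800.0}))
```

Because each stage consumes only the previous stage's output, the chain illustrates how a single model's uncertainty can be amplified through the hierarchy, which is what the VVUQ assessment quantifies.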
Presenting Author: Shengyen Li NIST
Presenting Author Biography: Dr. Li is interested in materials development and process optimization at the systems engineering level for additive manufacturing, using science-based models assisted by data-driven methods and data informatics. To create a computational system for this purpose, Dr. Li created integrated computational materials engineering (ICME) and Digital Twin frameworks, which provide a cost-effective approach to managing data from multiple sources, conducting statistical analyses, and executing a series of simulation modules. Integrating data, models, and tools in an automated process can optimize the controlled parameters to meet project objectives. Hybrid repositories using XML and HDF data formats were developed to archive research data, including image files, phase-based information, and mechanical properties, with an interactive user interface. He implemented simulation modules for quantitative prediction of phase transformations during heat treatments and evaluation of material properties under service conditions. Dr. Li also used machine learning algorithms to parameterize the models and optimize the targeted variables. He applied this framework to different projects, such as developing high-strength superalloys and optimizing the machining conditions for carbon steels.
Dr. Li's extensive experience and education in materials engineering and science are vital to establishing his credibility. He received his Ph.D. from the Department of Mechanical Engineering at Texas A&M University. From 2008 to 2013, he studied the phase transformation of the transformation-induced plasticity (TRIP) steels and designed treatments to maximize the work-to-necking. In 2013, he joined the Material Genome Initiative (MGI) program at the National Institute of Standards and Technology (NIST). In just five years, he developed an ICME framework, the Material Design Toolkit, to assist in decision-making for material design. From 2018 to 2023, Dr. Li joined the Department of Materials Engineering at Southwest Research Institute in San Antonio, Texas, where he continued to work on process optimization and quality assurance for industrial sectors, including additive manufacturing. Since 2023, Shengyen has served as a research staff member of NIST, working on data integration and digital twin implementation for additive manufacturing.
Authors:
Shengyen Li NIST
Jonathan Guyer National Institute of Standards and Technology
Daniel Wheeler National Institute of Standards and Technology
Dilip Banerjee National Institute of Standards and Technology
Paper Type
Technical Presentation