Digital Validation & Beyond

By Adithya Ramamurthy on September 10, 2018 in PROCESSES & FABRICATING

Lightweighting, Safety and Dimensional Compliance

Driven by lightweighting and safety needs, the application of press hardening steels to Body-in-White components has exploded in recent years. Product design, manufacturing engineering and production technologies have witnessed tremendous innovation toward producing parts with optimally developed and tailored properties. Digital engineering and validation tools have also kept pace with strong innovations of their own to actively support this complex process.

The pre-eminent demand that is placed on digital engineering tools is to facilitate early and reliable decisions on critical aspects of the design and manufacturing process. This demand translates into these concrete needs:

1. Enable design assessments starting very early in design, minimizing design-triggered downstream production issues
2. Dovetail design process seamlessly into development and maturation of the manufacturing process
3. Account for all necessary thermomechanical sheet, die and process conditions in simulation
4. Provide reliable and detailed feedback on all required quality metrics
5. Provide detailed diagnostic and issue-resolution tools
6. Enable cost-quality-time balanced decisions based on exhaustive pre-computed what-if studies


Vehicle/Product Development Process

Figure 1 shows a generic, commonly recognized progression of the vehicle and product development process, from program kickoff to start of production, and the associated engineering tasks relating to sheet-metal product and process engineering.

Digital engineering, validation and diagnostic tools have traditionally been engaged late in the game, during process engineering. In recent years, however, capable technologies, meaning those that incorporate all of the capabilities listed earlier, have come to be recognized as indispensable from start to finish over the entire development process.

Fig. 1 – Generic vehicle/product development process

This article illustrates an application of interactive digital engineering, diagnostic and validation tools, integrated into the engineering of hot stamped products and the hot stamping process.


A State-of-the-Art Application to Hot Forming

Each phase of the development process brings different types and quality of data to the engineering table and demands different work-product outcomes. An early process feasibility assessment on a hot stamped part requires:

Current shape of product, which is susceptible to numerous changes early on
Plausible assumptions on die and process elements
Reliable, though generic, characterization of all thermomechanical die and process conditions

The expected outcome is a conservative judgment on the feasibility of producing the product shape to acceptable quality metrics.

Fig. 2 – Early feasibility outcome on B-pillar

Figure 2 shows the feasibility outcome on an early version of a B-pillar. This assessment was carried out based on historically driven assumptions on blank and tool temperatures, transport and die cycle times, etc. Thinning levels are seen to far exceed the acceptable limit (17 percent).


This issue cannot wait until the product is released to be addressed, and it may require changing the product’s shape as a countermeasure. However, before concluding that a product change, with its ramifications for assembled components and performance, is necessary, it is important to examine whether adjustments to the die and production process can mitigate the feasibility concerns.

A review of die, process and production conditions requires escalation to, and collaboration with, processing/manufacturing engineers, in-house or at the die source. Such collaborative reviews have become commonplace in our industry. The processing engineer brings discipline and clarity to die and process conditions that are most viable in production environments, discarding the vagueness of plausible conditions assumed in early assessments.

In conventional simulation-based engineering, experience and intuition drive a trial-and-error, one-at-a-time approach to trying out different process adjustments. The outcome from one trial guides the adjustments that follow. This approach is slow and sequential, often leaves the best alternatives unexplored, and is very expensive, particularly if a viable resolution cannot be reached even after lengthy trials.

State-of-the-art technology today is capable of much better. It takes a systematic approach in which

relevant die, process and timing elements are designated as design variables
comprehensive, even conflicting, quality targets on the final part are defined upfront
a viable process, or process window, for achieving the above targets is arrived at through a systematic and exhaustive exploration of design variables (and their combinations) over physically meaningful and realizable ranges

This is shown in schematic view in Figure 3.

Defining critical limits and target areas on the sheet requires the expertise of product and manufacturing engineers and also helps ensure compliance with standards. Stochastic methods are leveraged to run multiple simulations, unsupervised, and to explore combinations of different process adjustments.

Simulation results are statistically processed toward one of two major outcomes: i) a process window of die and process conditions within which acceptable parts can be made, or ii) the determination that no viable process exists for making acceptable parts.
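To make the flow concrete, the following minimal Python sketch illustrates the general idea of such a stochastic exploration: design variables are sampled over designated ranges, each realization is evaluated against the quality targets, and the feasible realizations define the process window. The ranges, the simulate_forming stub and the meets_targets limits are illustrative assumptions, not the actual solver or its algorithm.

```python
import random

# Illustrative design-variable ranges (compare the specific ranges given later in the article)
DESIGN_VARIABLES = {
    "blank_temp_C": (800.0, 950.0),
    "tool_temp_C": (60.0, 85.0),
    "quench_time_s": (20.0, 40.0),
    "transport_time_s": (7.0, 11.0),
}

def sample_settings(rng):
    """Draw one random combination of settings within the designated ranges."""
    return {name: rng.uniform(lo, hi) for name, (lo, hi) in DESIGN_VARIABLES.items()}

def simulate_forming(settings):
    """Stand-in for a forming simulation returning quality metrics.
    A real study would call the solver here; this stub returns fixed dummy values
    so the loop runs end to end."""
    return {"max_thinning_pct": 15.0, "martensite_pct": 98.5}

def meets_targets(metrics):
    """Quality targets defined upfront (example limits taken from the article)."""
    return metrics["max_thinning_pct"] <= 17.0 and metrics["martensite_pct"] >= 98.0

def explore(n_realizations=128, seed=0):
    """Run unsupervised realizations and collect the feasible process window."""
    rng = random.Random(seed)
    window = []
    for _ in range(n_realizations):
        settings = sample_settings(rng)
        if meets_targets(simulate_forming(settings)):
            window.append(settings)
    return window  # an empty list means no viable process was found within the ranges

if __name__ == "__main__":
    print(f"{len(explore())} feasible realizations found")
```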

In this systematic approach:

the experience and expertise of design and manufacturing engineers are critical to success and to ensuring that theoretically feasible outcomes are also practically viable
the loss of expert resources to persistent monitoring and tracking, inherent in the conventional sequential approach, is minimized or even eliminated

Fig. 3 – A systematic approach to designing and improving processes

 

This systematic process assessment was carried out on the B-pillar shown earlier, attempting to improve the thinning outcome: blank and tool temperatures, die quenching force and timing, and transport times were selected as design variables and were allowed to vary over their respective, physically meaningful ranges. (See Figure 4.)

Fig. 4 – Design variables selected and their ranges, areas of concern to be addressed in the systematic approach

Simulation results were statistically processed to automatically search for a solution within the defined ranges of the design variables, in other words, to resolve the critical issues identified on the part. In this case, it quickly became clear that no amount of adjustment to the design variables, individually or in combination, was capable of producing a feasible condition.

This outcome, although negative, was clear and unambiguous, and is therefore still very valuable. It was arrived at systematically with minimal loss of expert resources. The conventional approach might have reached the same conclusion, but only after considerable sequential and manual adjustment, along with constant active monitoring.

Based on the above, the available alternatives were to modify the product or to change the production process. It was decided to keep the product shape intact, to minimize the impact on product performance and assembly, and to change the production sequence instead. Rather than laser trimming after forming, the initial blank shape would be fully developed so that the forming operation produces the final part shape. This was both viable and sensible, given the location of the observed thinning concerns.

Fig. 5 – Optimization setup for blank shape

Figure 5 shows a schematic of the simulation setup for blank adjustment, or optimization. The original blank produces a part showing large deviation from the nominal part boundary. Adjusting this deviation at the marked corners, in the vicinity of the thinning concerns, is also expected to mitigate those concerns. This blank adjustment process lends itself to automation and requires only two inputs: the target part boundary and the sheet-edge tolerance that must be achieved relative to that boundary. Simulations are run in sequence, with the blank shape adjusted before each simulation based on the tolerance outcome of the previous one.
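As an illustration of that sequential adjustment, the sketch below shows a simple correction loop in Python. The representation of the blank as control-point offsets, and the simulate_trimline_deviation and adjust_blank_contour stand-ins, are hypothetical simplifications; a real forming simulation would replace the stand-in solver call.

```python
def simulate_trimline_deviation(blank_offsets, target_offsets):
    """Stand-in for a forming simulation: here the deviation is simply the gap
    between the current blank control-point offsets and the target (mm)."""
    return [b - t for b, t in zip(blank_offsets, target_offsets)]

def adjust_blank_contour(blank_offsets, deviations, relaxation=0.8):
    """Pull each control point back by a fraction of its measured deviation."""
    return [b - relaxation * d for b, d in zip(blank_offsets, deviations)]

def optimize_blank(blank_offsets, target_offsets, tolerance_mm=0.5, max_iter=10):
    """Run simulations in sequence, adjusting the blank shape before each one
    based on the tolerance outcome of the previous one (illustrative sketch)."""
    for iteration in range(max_iter):
        deviations = simulate_trimline_deviation(blank_offsets, target_offsets)
        if max(abs(d) for d in deviations) <= tolerance_mm:
            return blank_offsets, iteration  # formed edge within tolerance
        blank_offsets = adjust_blank_contour(blank_offsets, deviations)
    raise RuntimeError("Blank shape did not converge within the allowed iterations")

if __name__ == "__main__":
    blank, iters = optimize_blank([5.0, -3.0, 2.0], [0.0, 0.0, 0.0])
    print(f"Converged after {iters} adjustment(s): {blank}")
```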

The outcome after automatic optimization of the blank shape is shown in Figure 6. Besides minimal deviation from the target part boundary, this figure also shows, as expected, a complete mitigation of thinning levels at the corners.

This mitigation of thinning issues opened up opportunities for energy, time and cost savings, without compromising part quality.

These opportunities were explored in another systematic process investigation. What is the least expensive set of tools, thermal energy and timing conditions that can produce parts meeting the following critical quality considerations?

Thinning below the maximum acceptable (17 percent) over the entire part
Minimal wrinkling and compression in critical areas
Full martensite, and tensile strength, development over the entire part

Fig. 6 – Tolerance and simulation outcomes following blank optimization

Wider, yet meaningful, latitude was provided in this second stochastic run for the following design variables representing important die, thermal and timing elements:

Binder gap: 0.2-0.6 mm
Maintained tool temperature: 60-85 degrees Celsius
Temperature to which blank is heated: 800-950 degrees Celsius
In-die quenching time: 20-40 s
Transport time from furnace to press: 7-11 s
In-die quenching force

This systematic investigation was run with 128 individual realizations, each of which represented a random combination of specific settings for each of these parameters within their designated ranges. On top of these controllable conditions, blank thickness and lubrication were allowed to vary, representing noise conditions that cannot be controlled.
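The construction of such a realization set can be pictured as in the Python sketch below. The noise ranges for thickness and lubrication are illustrative assumptions, since the article does not list them, and the in-die quenching force range is omitted here for the same reason.

```python
import random

# Controllable design variables and the ranges given above
CONTROL = {
    "binder_gap_mm": (0.2, 0.6),
    "tool_temp_C": (60.0, 85.0),
    "blank_temp_C": (800.0, 950.0),
    "quench_time_s": (20.0, 40.0),
    "transport_time_s": (7.0, 11.0),
    # In-die quenching force is also a design variable; its range is not listed in the article.
}

# Uncontrollable noise conditions (illustrative ranges, not from the article)
NOISE = {
    "blank_thickness_scatter_pct": (-5.0, 5.0),
    "lubrication_factor": (0.8, 1.2),
}

def build_realizations(n=128, seed=1):
    """Create n random combinations of control settings, each overlaid with
    random noise conditions, as inputs for unsupervised simulations."""
    rng = random.Random(seed)
    runs = []
    for _ in range(n):
        run = {k: rng.uniform(lo, hi) for k, (lo, hi) in CONTROL.items()}
        run.update({k: rng.uniform(lo, hi) for k, (lo, hi) in NOISE.items()})
        runs.append(run)
    return runs

if __name__ == "__main__":
    print(len(build_realizations()), "realizations prepared for simulation")
```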

Areas of concern were marked on the sheet for each of the three quality considerations listed earlier. Foremost consideration was given to formability, or thinning. Martensite and strength development was a very close second, followed by minimized wrinkling/compression.

As observed earlier, the prior blank optimization already provided a very formable nominal condition. The outcome of this subsequent study indicated that thinning could be contained within the necessary limits across nearly the entire range of settings for the different design variables.

This allowed for parameter settings to be specifically optimized for martensite development without compromising formability. How high can the tools be allowed to heat up, how much can transport and quench times be shortened, what is the lowest temperature to which the blank can be heated and yet meet the 98-percent martensitic volume fraction requirement? This digital trial was executed based on the rich data generated over the 128 individual realizations and starting from an automatically determined combination: tool temperature 60 degrees Celsius, transport and quench times 7 s and 27 s respectively, and blank temperature 912 degrees Celsius. (See Figure 7.)

Fig. 7 – Tool and blank temperatures, automatically determined, for required martensite development
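Conceptually, this digital trial amounts to querying the pre-computed realization data. A minimal sketch under that assumption follows; the dictionary keys and the ranking preference (shortest combined transport and quench time, then lowest blank temperature) are illustrative choices, not the software's actual selection logic.

```python
def pick_trial_start(results, martensite_min_pct=98.0, thinning_max_pct=17.0):
    """From pre-computed realization results, keep only those meeting the quality
    requirements, then prefer short transport/quench times and low blank heating.
    Keys such as 'martensite_pct' are assumed names used for illustration."""
    feasible = [
        r for r in results
        if r["martensite_pct"] >= martensite_min_pct
        and r["max_thinning_pct"] <= thinning_max_pct
    ]
    if not feasible:
        return None  # no realization satisfies the requirements
    return min(
        feasible,
        key=lambda r: (r["transport_time_s"] + r["quench_time_s"], r["blank_temp_C"]),
    )

if __name__ == "__main__":
    demo = [
        {"martensite_pct": 98.4, "max_thinning_pct": 14.0,
         "transport_time_s": 7.0, "quench_time_s": 27.0, "blank_temp_C": 912.0},
        {"martensite_pct": 97.1, "max_thinning_pct": 13.0,
         "transport_time_s": 8.0, "quench_time_s": 22.0, "blank_temp_C": 870.0},
    ]
    print(pick_trial_start(demo))  # first entry meets both requirements
```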

 

Fig. 8 – Consequence of higher tool temperature on martensite development

This initial trial also showed that an adequate martensite volume fraction could be developed over an extended range of settings for the different parameters. Taking advantage of this observation, it was decided to explore further, in trial mode, whether a combination of sheet and tool temperature settings exists when binder gap, quench and transport times are locked at practically favorable settings: a fixed binder gap, transport and quench times at their respective shortest durations, and quench force at minimum tonnage. The automatically identified tool and blank temperatures and the resulting prediction of martensitic volume fraction are shown in Figure 7.

This trial indicated that sheet temperature needed to be above 861 degrees Celsius for martensite formation. Additional exploration, in trial mode, indicated that higher tool temperatures required higher levels of blank heating temperatures. While this is easy to explain, the big benefit here was to find the minimum temperature to which a blank needed to be heated if tools were going to be maintained at the highest end of the temperature range. This is important since tools are expected to heat up over the forming process, and factoring in high tool temperatures eases the burden on complex cooling system design. Figure 8 shows that higher tool temperatures, without countermeasures in terms of higher blank temperature, lead to inadequate martensite following quenching.

The colors on the Blank Temperature slider in Figure 8 may be interpreted as follows: if the tool temperature is set at its highest level of 85 degrees Celsius, the blank needs to be heated to at least 885 degrees Celsius to produce the necessary volume fraction of martensite, this temperature representing the right edge of the red zone. Ideally, the blank should be heated beyond the left edge of the green zone, 900 degrees Celsius. The green zone represents the process window for blank temperature, and the red zone shows the range of temperatures over which no useful resolution is possible under the parameter settings already imposed on the other design variables. Figure 9 shows a prediction of martensite volume fraction at the lowest useful blank temperature and lists the settings of the other die and process conditions.

Fig. 9 – Martensitic volume fraction predicted at a blank temperature of 900 degrees Celsius
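Numerically, reading the slider corresponds to scanning blank temperature against a response function fitted to the realization data. The sketch below uses a hypothetical predict_martensite_pct ramp as a stand-in for that fitted model, tuned only so that the example reproduces the 900 degrees Celsius figure quoted above; it is not the actual metamodel.

```python
def predict_martensite_pct(blank_temp_C, tool_temp_C=85.0):
    """Hypothetical response surface standing in for the fitted metamodel:
    a simple ramp that reaches 98 percent at 900 degrees Celsius."""
    return min(99.5, 98.0 + 0.1 * (blank_temp_C - 900.0))

def blank_temperature_window(t_min=800.0, t_max=950.0, step=1.0, target_pct=98.0):
    """Scan blank temperatures and return the lowest one predicted to meet the
    martensite requirement at the highest allowed tool temperature."""
    t = t_min
    while t <= t_max:
        if predict_martensite_pct(t) >= target_pct:
            return t
        t += step
    return None  # no useful blank temperature within the scanned range

if __name__ == "__main__":
    print("Lowest useful blank temperature:", blank_temperature_window(), "degrees Celsius")
```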

The above finalized set of die, blank and timing conditions was arrived at in a systematic, step-by-step process, starting from wide-open options and progressing through a prioritized set of decisions driven by practical needs of overall cycle time and process control capabilities.

The next step was to examine wrinkling/compression status for the set of die and process conditions that had been arrived at systematically to achieve optimal martensite and formability outcomes. Minor waviness was predicted on the sidewalls with minimal wrinkling concerns in the more critical regions close to the header. This condition was accepted as a reasonable compromise in favor of the two more important quality considerations.

How well does this solution stand the test of noise variations, production conditions that typically cannot be controlled, in terms of blank thickness and lubrication? This was examined by evaluating the variation in martensite and thinning levels driven by these noise variations. Cpk, the widely used process capability index, was applied as the metric for this purpose. To compute Cpk, the specification limits for thinning and martensite volume fraction were defined as 17 percent and 98 percent, respectively. Figure 10 shows the Cpk distribution on the B-pillar for thinning. It is clear that the systematically determined process is also capable of repeatability, producing parts that are all within the required tolerance.
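For reference, Cpk measures the distance from the process mean to the nearest specification limit in units of three standard deviations; with a one-sided limit, only that side is evaluated. A short sketch follows, using made-up thinning values rather than the actual simulation results.

```python
import statistics

def cpk(samples, lsl=None, usl=None):
    """Process capability index: distance from the mean to the nearest
    specification limit, expressed in units of three standard deviations.
    For a one-sided specification, pass only lsl or only usl."""
    mu = statistics.mean(samples)
    sigma = statistics.stdev(samples)
    candidates = []
    if usl is not None:
        candidates.append((usl - mu) / (3.0 * sigma))
    if lsl is not None:
        candidates.append((mu - lsl) / (3.0 * sigma))
    return min(candidates)

# Illustrative thinning values (percent) across noise realizations, not real data:
thinning = [12.1, 12.8, 13.0, 12.5, 12.9, 12.4, 13.2, 12.6]
print("Cpk for thinning against the 17 percent upper limit:",
      round(cpk(thinning, usl=17.0), 2))
```

The same function, called with lsl=98.0 on martensite volume fraction values, gives the corresponding capability against the lower martensite limit.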

It was also important to review the ability of the systematically developed process to produce parts within acceptable dimensional tolerances and to validate that this tolerance is repeatable, that is, that the process is robust despite uncontrolled variations in thickness and lubrication. These results are validated in Figure 11, based on an assumed acceptable tolerance band of +/- 1 mm.


Thermomechanical Engineering of the Hot Forming System

In all of the studies above, it was assumed that tool temperatures hold steady at a constant level over multiple hits. This is not the case in the real world: tools heat up upon contact with the blank and, at a constant production rate, are expected to arrive at steady-state thermal conditions only after a number of hits. Any digital validation of the hot forming process needs to account for this temperature rise in the tools. The most efficient way to accomplish this is within the thermomechanical system used for validation of forming outcomes. Although disconnected, or independent, computational fluid dynamics (CFD) analyses are capable of generating accurate and detailed tool surface-temperature maps, a fully coupled assessment is necessary to predict the evolution of temperature over typical production cycle times and to evaluate the critical influence of tool temperatures on the capability of the process to deliver the desired forming outcomes.

Fig. 10 – Thinning variation quantified using Cpk

 

Fig. 11 – Shape distortion following quenching and cooling for the systematically developed process

 

Fig. 12 – 3-D tool properties for coupled thermomechanical simulation of the hot forming system

Essential inputs to this coupled thermomechanical assessment include the following:

3-D geometry of tools, not just the 3-D tool surfaces
Thermal properties of tool materials such as heat capacity and thermal conductivity
Ambient temperature
Cooling channel topology
Flow conditions and temperature of the cooling fluid
Thermal properties of the fluid-tool interface (heat transfer coefficient)

These are shown in Figure 12.
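One way to picture these inputs is as a single configuration record handed to the coupled thermal solver alongside the forming setup. The dataclass below is purely illustrative; the field names and units are assumptions, not the software's actual input format.

```python
from dataclasses import dataclass, field

@dataclass
class CoolingChannel:
    """One cooling circuit: centerline points (mm) and flow conditions."""
    path_points_mm: list
    flow_rate_l_per_min: float
    coolant_temp_C: float

@dataclass
class ToolThermalSetup:
    """Thermal description of one tool for coupled thermomechanical simulation."""
    solid_geometry_file: str             # full 3-D tool volume, not just the tool surfaces
    heat_capacity_J_per_kgK: float
    thermal_conductivity_W_per_mK: float
    ambient_temp_C: float
    fluid_tool_htc_W_per_m2K: float      # heat transfer coefficient at the channel walls
    cooling_channels: list = field(default_factory=list)
```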

The above tool conditions need to be fully embedded into the simulation of the hot forming process. The hot forming process may then be simulated in a repeated, cyclic fashion, starting from the loading of the first blank into the press, through forming and quenching, followed by the opening of the dies and the loading of the next blank, and so on, until tool temperatures evolve naturally to a steady-state distribution on the tool surfaces. “Steady state” implies the condition in which subsequent forming operations produce no further change in tool temperatures and represents a balance between the heat flux from sheet to tools and the transport of heat out of the tools via the cooling channels. All timing elements, such as the idle time while the formed sheet awaits pickup from the open die and the waiting time before the next blank is loaded, are important to this repeated cyclic simulation.
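The cyclic procedure can be summarized as a convergence loop over successive press cycles, as in the sketch below. The simulate_cycle argument is a hypothetical stand-in for the coupled forming-and-quenching simulation of one blank; steady state is declared when the cycle-to-cycle change in tool surface temperature falls below a tolerance.

```python
def run_to_steady_state(tool_temps_C, simulate_cycle, max_cycles=50, tol_C=1.0):
    """Repeat the forming/quenching cycle, carrying the tool temperature field
    from one cycle to the next, until the cycle-to-cycle change falls below tol_C."""
    history = [list(tool_temps_C)]
    for cycle in range(1, max_cycles + 1):
        new_temps = simulate_cycle(tool_temps_C)   # one blank: load, form, quench, unload
        change = max(abs(n - o) for n, o in zip(new_temps, tool_temps_C))
        history.append(list(new_temps))
        tool_temps_C = new_temps
        if change <= tol_C:
            return tool_temps_C, cycle, history    # steady state reached
    raise RuntimeError("Tool temperatures did not stabilize within max_cycles")

if __name__ == "__main__":
    # Toy stand-in: each surface node relaxes toward an assumed hot-spot value.
    steady = [165.0, 120.0, 95.0]
    def simulate_cycle(temps):
        return [t + 0.4 * (s - t) for t, s in zip(temps, steady)]
    final, cycles, _ = run_to_steady_state([20.0, 20.0, 20.0], simulate_cycle)
    print(f"Steady state after {cycles} cycles: {[round(t, 1) for t in final]}")
```

In this toy example the tool temperatures simply relax toward an assumed hot-spot distribution; the article's coupled simulation, discussed below, reached steady state after 16 cycles.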

Outcomes from such simulations are of critical practical significance and go far beyond the capabilities of limited evaluations of just the forming process:

Temperature rise and surface temperature distributions on tools, identification of tool hot spots
Consequent formability and developed mechanical properties on sheet, such as martensite (or other phase) volume fractions, tensile strength and hardness

Fig. 13 – Punch surface temperature distribution after 16 cycles; chart shows temperature history over the 16 cycles

Together with useful diagnostics such as current and critical cooling rates and the history of temperature distribution and phase transformations, this self-contained, coupled process provides a meaningful opportunity to comprehensively optimize the overall cycle time needed to produce acceptable parts, while simultaneously validating these conditions reliably against the desired forming outcomes: formability, strength and hardness, and panel distortion.

Figure 13 shows the surface temperature distribution on the punch at steady state and the evolution of temperature at a sample location on the tool surface over the entire history of the 16 cycles it took to arrive at steady state. The peak temperature was determined to be more than 165 degrees Celsius, far above the 85 degrees Celsius assumed in the earlier study. This, in turn, leads to a lack of the martensite development expected over the full part geometry.


Summary

Digital engineering and validation tools need to faithfully represent the physical die and process elements of production conditions. They need to be capable of processing complex forming conditions. And they need to provide all necessary diagnostics, as well as meaningful turnaround and interactivity in order to empower, even embolden, engineers toward product and process innovations. This article reviewed the state-of-the-art capabilities available today and illustrated a step-by-step application of these capabilities to a hypothetical, but commonplace, engineering challenge.

 

By Adithya Ramamurthy

Adithya Ramamurthy has been an application engineer with AutoForm Engineering, USA, since July 2015. Apart from core software knowledge, he specializes in the AutoForm Sigma module, which provides a robust and systematic approach to the sheet-metal forming process. He also conducts training on the compensator module, which deals with the know-how of measuring and compensating for springback in sheet metals.

Prior to joining AutoForm, Mr. Ramamurthy graduated with a master’s degree in mechanical engineering from the University of Missouri, with a specialization in nonlinear finite element structural analysis. He also completed his bachelor’s degree in mechanical engineering, after which he worked for a year in the offshore development team of Jaguar and Land Rover in India.
