FINITE ELEMENT ANALYSIS: Post-processing
by Steve Roensch, President, Roensch & Associates
...Last in a four-part series
After a finite element model has been prepared and checked,
boundary conditions have been applied, and the model has
been solved, it is time to investigate the results of the
analysis. This activity is known as the post-processing
phase of the finite element method.
Post-processing begins with a thorough check for problems
that may have occurred during solution. Most solvers
provide a log file, which should be searched for warnings or
errors, and which will also provide a quantitative measure
of how well-behaved the numerical procedures were during
solution. Next, reaction loads at restrained nodes should
be summed and examined as a "sanity check". Reaction loads
that do not closely balance the applied load resultant for a
linear static analysis should cast doubt on the validity of
other results. Error norms such as strain energy density
and stress deviation among adjacent elements might be looked
at next, but for h-code analyses these quantities are best
used to target subsequent adaptive remeshing.
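As a concrete illustration of the reaction-load sanity check, the short Python sketch below sums the reactions at restrained nodes and compares them with the resultant of the applied loads. It assumes the nodal force vectors have already been extracted from the solver into simple arrays; the function name and data layout are hypothetical, not any particular package's output format.

    import numpy as np

    def check_reaction_balance(applied_loads, reaction_loads, rel_tol=1e-3):
        # Sum the applied nodal forces and the reactions at restrained nodes.
        applied_resultant = applied_loads.sum(axis=0)
        reaction_resultant = reaction_loads.sum(axis=0)
        # For static equilibrium the two resultants should cancel.
        residual = applied_resultant + reaction_resultant
        scale = max(np.linalg.norm(applied_resultant), 1e-30)
        return np.linalg.norm(residual) / scale < rel_tol, residual

    # Example: a 1000 N downward load reacted at two restrained nodes.
    applied = np.array([[0.0, 0.0, -1000.0]])
    reactions = np.array([[0.0, 0.0, 499.7], [0.0, 0.0, 500.1]])
    ok, residual = check_reaction_balance(applied, reactions)
    print(ok, residual)

For a well-behaved linear static solution the residual should be a small fraction of the applied resultant; a large imbalance suggests problems such as unintended restraints or errors in load application.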
Once the solution is verified to be free of numerical
problems, the quantities of interest may be examined. Many
display options are available, the choice of which depends
on the mathematical form of the quantity as well as its
physical meaning. For example, the displacement of a solid
linear brick element's node is a 3-component spatial vector,
and the model's overall displacement is often displayed by
superposing the deformed shape over the undeformed shape.
Dynamic viewing and animation capabilities aid greatly in
obtaining an understanding of the deformation pattern.
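The deformed-shape display itself amounts to little more than adding a magnified displacement field to the undeformed nodal coordinates, as the minimal sketch below suggests. The array layout and the scale factor are illustrative assumptions.

    import numpy as np

    def deformed_coordinates(nodes, displacements, scale=50.0):
        # nodes:         (n, 3) undeformed nodal coordinates
        # displacements: (n, 3) computed nodal displacement vectors
        # scale:         magnification, since real deflections are often
        #                too small to be visible at true scale
        return nodes + scale * displacements

    # Superposition: plot both `nodes` and the returned array with the
    # same element connectivity to show deformed over undeformed shape.
    nodes = np.array([[0.0, 0.0, 0.0], [100.0, 0.0, 0.0]])
    u = np.array([[0.0, 0.0, 0.0], [0.0, 0.0, -0.12]])
    print(deformed_coordinates(nodes, u))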
Stresses, being tensor quantities, currently lack a good
single visualization technique, and thus derived stress
quantities are extracted and displayed. Principal stress
vectors may be displayed as color-coded arrows, indicating
both direction and magnitude. The magnitude of principal
stresses or of a scalar failure stress such as the von Mises
stress may be displayed on the model as colored bands. When
this type of display is treated as a 3D object subjected to
light sources, the resulting image is known as a shaded
image stress plot. Displacement magnitude may also be
displayed by colored bands, but this can lead to
misinterpretation as a stress plot.
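The von Mises equivalent stress that drives such color-band plots is a scalar derived from the stress tensor at each node or element centroid. A small sketch of the standard formula follows, assuming the six independent Cauchy stress components have already been recovered from the solution; the input values shown are illustrative only.

    import numpy as np

    def von_mises(s):
        # s = (sxx, syy, szz, sxy, syz, szx), the six independent
        # components of the Cauchy stress tensor
        sxx, syy, szz, sxy, syz, szx = s
        return np.sqrt(0.5 * ((sxx - syy)**2 + (syy - szz)**2
                              + (szz - sxx)**2)
                       + 3.0 * (sxy**2 + syz**2 + szx**2))

    # A single stress state in MPa (illustrative values only)
    print(von_mises((120.0, 40.0, 0.0, 25.0, 0.0, 0.0)))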
An area of post-processing that is rapidly gaining
popularity is that of adaptive remeshing. Error norms such
as strain energy density are used to remesh the model,
placing a denser mesh in regions needing improvement and a
coarser mesh in areas of overkill. Adaptivity requires an
associative link between the model and the underlying CAD
geometry, and works best if boundary conditions may be
applied directly to the geometry, as well. Adaptive
remeshing is a recent demonstration of the iterative nature
of h-code analysis.
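A simplified view of the remeshing decision is sketched below: elements whose error indicator exceeds a target are assigned a smaller requested size, while elements far below the target may be coarsened. The data layout and the square-root refinement rule are illustrative assumptions; a real adaptive loop would hand the new size field back to the mesher through the associative CAD link.

    def target_element_sizes(element_sizes, error_indicators, target_error,
                             coarsen_ratio=0.25):
        # element_sizes and error_indicators are parallel lists, one entry
        # per element (hypothetical layout).
        new_sizes = []
        for h, err in zip(element_sizes, error_indicators):
            if err > target_error:
                # Refine roughly in proportion to how far over target we are
                new_sizes.append(h * (target_error / err) ** 0.5)
            elif err < coarsen_ratio * target_error:
                # Coarsen regions of "overkill"
                new_sizes.append(h * 1.5)
            else:
                new_sizes.append(h)
        return new_sizes

    # Three elements of size 5.0 with differing error indicators
    print(target_element_sizes([5.0, 5.0, 5.0], [0.8, 0.02, 0.2],
                               target_error=0.2))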
Optimization is another area enjoying recent advancement.
Based on the values of various results, the model is
modified automatically in an attempt to satisfy certain
performance criteria and is solved again. The process
iterates until some convergence criterion is met. In its
scalar form, optimization modifies beam cross-sectional
properties, thin shell thicknesses and/or material
properties in an attempt to meet maximum stress constraints,
maximum deflection constraints, and/or vibrational frequency
constraints. Shape optimization is more complex, with the
actual 3D model boundaries being modified. This is best
accomplished by using the driving dimensions as optimization
parameters, but mesh quality at each iteration can be a
concern.
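For the scalar (sizing) case, one simple heuristic is a "fully stressed design" update, sketched below for a single shell thickness and a maximum-stress constraint. The solve_model callback stands in for a complete FEA solution at each iteration; the update rule and the names used are illustrative assumptions, not any specific package's optimizer.

    def size_for_stress(solve_model, thickness, allowable_stress,
                        max_iters=20, tol=0.01):
        # solve_model(thickness) -> max von Mises stress for that thickness
        # (hypothetical callback standing in for a full FEA solve)
        for _ in range(max_iters):
            stress = solve_model(thickness)
            ratio = stress / allowable_stress
            if abs(ratio - 1.0) < tol:
                break              # converged: stress is near the allowable
            # For membrane-dominated shells stress scales roughly with 1/t,
            # so scaling the thickness by the stress ratio drives it toward
            # the constraint.
            thickness *= ratio
        return thickness

    # Toy example: pretend stress is inversely proportional to thickness
    print(size_for_stress(lambda t: 400.0 / t, thickness=1.0,
                          allowable_stress=200.0))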
Another direction clearly visible in the finite element
field is the integration of FEA packages with so-called
"mechanism" packages, which analyze motion and forces of
large-displacement multi-body systems. A long-term goal
would be real-time computation and display of displacements
and stresses in a multi-body system undergoing large
displacement motion, with frictional effects and fluid flow
taken into account when necessary. It is difficult to
estimate the increase in computing power necessary to
accomplish this feat, but 2 or 3 orders of magnitude is
probably close. Algorithms to integrate these fields of
analysis may be expected to follow the computing power
increases.
In summary, the finite element method is a relatively recent
discipline that has quickly become a mature method,
especially for structural and thermal analysis. The costs
of applying this technology to everyday design tasks have
been dropping, while the capabilities delivered by the
method expand constantly. With education in the technique
and in the commercial software packages becoming more and
more available, the question has moved from "Why apply FEA?"
to "Why not?". The method is fully capable of delivering
higher quality products in a shorter design cycle with a
reduced chance of field failure, provided it is applied by a
capable analyst. It is also a valid indication of thorough
design practices, should unexpected litigation arise.
The time is now for industry to make greater use of this and
other analysis techniques.
© 2008-2013 Roensch & Associates. All rights reserved.
      1.   Introduction
      2.   Pre-processing
      3.   Solution
      4.   Post-processing
This four-article series was published in a newsletter of
the American Society of Mechanical Engineers (ASME).
It serves as an introduction to the recent analysis discipline
known as the finite element method. The author
is an engineering consultant and expert witness specializing
in finite element analysis.