A New Testing Paradigm For Today's Product Development Process - Part 2
Test Definition
A modern modal analysis test can easily comprise more than
1000 degrees of freedom. In particular, when validating and
updating finite-element (FE) models, a sufficiently dense grid
of test points is needed. The FE model grid is much finer than
the test grid, requiring a substantial reduction. Selecting a
minimal set of test and excitation points and their locations is
therefore key to optimizing test efficiency. The FE model itself
can be used for such a pretest analysis, reducing test costs and
limiting prototype unavailability.
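By way of illustration, one widely used pretest technique for this kind of reduction is the Effective Independence (EfI) method, which iteratively discards the candidate FE degrees of freedom (DOFs) that contribute least to the linear independence of the target mode shapes. The following is a minimal sketch of that idea, assuming a matrix phi of FE target mode shapes; it is not the algorithm of any particular commercial pretest tool.

```python
import numpy as np

def efi_sensor_selection(phi, n_sensors):
    """Effective Independence (EfI) sensor placement.

    phi       : (n_dofs, n_modes) target mode shapes from the FE model
    n_sensors : number of measurement DOFs to retain
    Returns the indices of the retained candidate DOFs.
    """
    candidates = np.arange(phi.shape[0])
    phi_r = phi.copy()
    while len(candidates) > n_sensors:
        # Inverse of the Fisher information matrix of the current set
        a_inv = np.linalg.inv(phi_r.T @ phi_r)
        # EfI value of each DOF: diag(phi_r @ a_inv @ phi_r.T),
        # i.e. each row's contribution to mode-shape independence
        efi = np.einsum('ij,jk,ik->i', phi_r, a_inv, phi_r)
        worst = np.argmin(efi)  # least informative DOF
        candidates = np.delete(candidates, worst)
        phi_r = np.delete(phi_r, worst, axis=0)
    return candidates

# e.g. keep = efi_sensor_selection(phi_fe, 64) reduces a candidate
# FE grid to 64 accelerometer locations for the target modes.
```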
Test preparation can even identify target modes in view of
the subsequent FE correlation to be performed. An optimal set
of excitation and response nodes can also be identified for these
modes. In aerospace applications in particular, FE models
are always available before the tests are conducted, and the tests
are so expensive that proper test preparation (and even a virtual dry run using the FE model) is a prerequisite before
setting up the actual instrumentation and testing.
In all applications, another tedious task is establishing precise test geometry. This also relies on the FE model and on the operator's ability to instrument exactly at predefined locations (unrealistic in most cases); otherwise the
geometry of the test structure must be measured by hand. Novel methods are being investigated to reduce this time using coordinate
measurement systems or an articulated arm equipped with
electromagnetic or electro-optical sensing. Developed solutions
include ultrasonic sensing, photogrammetry, grid projection and triangulation techniques. An example of the photogrammetry approach, using a digital camera to measure the 3D geometry of
an aircraft propeller, is shown in Figure 1. The recently developed scanning technique is very convenient; it links each
measurement location to an external reference frame through
automated image processing.
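As a simple illustration of the triangulation step in such photogrammetry, the sketch below reconstructs one 3D target coordinate from two calibrated camera views using the standard linear (DLT) method. The projection matrices P1 and P2 are assumed to come from a prior camera calibration; this is only an idealized fragment of a full measurement pipeline.

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Linear (DLT) triangulation of one optical target from two views.

    P1, P2 : (3, 4) camera projection matrices from a prior calibration
    x1, x2 : (u, v) pixel coordinates of the target in each image
    Returns the 3D point in the external reference frame.
    """
    A = np.vstack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    # The homogeneous solution is the right singular vector
    # associated with the smallest singular value of A.
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]
```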
Based on a paper presented at ISMA2004, the 2004 International
Conference on Modal Analysis, Noise and Vibration Engineering,
Leuven, Belgium, September 2004.
Test Instrumentation
The increasing complexity and size of the tests that are conducted mean that instrumentation setup takes up a significant
part of the total test duration. State-of-the-art instrumentation
consists of multichannel data-acquisition front ends with local conditioning, analog-to-digital conversion and signal processing connected to powerful computers. Channel counts
range from 16 to 64, and up to as many as 1000, depending
on the application. Signal bandwidths typically range from 100
Hz to 20 kHz. Operational measurements require mobile systems, which in the case of large-scale structures like aircraft
need to consist of multiple, distributed but time-synchronized
units.
As errors in cabling and sensor identification become more
difficult to notice and correct, automated procedures,
including the use of smart transducers with embedded information on calibration, position, etc., become more
widespread. The concept of a transducer electronic data
sheet, or TEDS, embedded in the transducer is covered by the
IEEE 1451 standard.
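As a minimal sketch of how acquisition software can exploit such embedded data, the fragment below auto-configures a channel from a simplified record. The field names and the channel object are hypothetical; the actual IEEE 1451 standard defines packed binary TEDS templates rather than this illustrative structure.

```python
from dataclasses import dataclass

@dataclass
class Teds:
    """Illustrative stand-in for an IEEE 1451 TEDS record; only the
    role of the fields matters here."""
    manufacturer: str
    model: str
    serial: int
    sensitivity_mv_per_g: float  # stored calibration factor
    position: str                # e.g. 'wing:node12:+Z' (hypothetical)

def configure_channel(channel, teds):
    """Auto-configure one front-end channel from the transducer's own
    data (the 'channel' object is a hypothetical front-end API)."""
    channel.scale = 1000.0 / teds.sensitivity_mv_per_g  # volts -> g
    channel.name = teds.position
    channel.info = f"{teds.manufacturer} {teds.model} #{teds.serial}"
```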
Another important efficiency gain is that the structure under test can be pre-instrumented (including sensor calibration)
outside the test room. Each sensor remembers its location,
calibration factor, history, and identification, requiring only
blind cable connections to start the actual test. This optimizes
the occupancy and throughput of costly test rooms (for example, semi-anechoic rooms for studying vibro-acoustic effects) and significantly reduces cabling errors.
Data Validation
Measurement errors must be detected as early as possible. Often, test time and access to the structure are so
limited that redoing the test (even partially) is not feasible, but
at the very least one should not use (or should use only with
caution) any flagged data.
Essentially, it boils down to providing test engineers with
easy online and post-analysis data validation tools. Upon completing the test, a report assessing the plausibility of the test
data should be available. Figure 2 shows a global coherence plot, projecting FRF coherence onto the geometry.
Data Management
Modern tests generate data at high speed, quickly accumulating tens of gigabytes. Adequate acquisition settings must be used to cover not only the
current analysis requirements, but also those of potential secondary uses. In engine testing, this may lead to conflicting requirements on the sampling level (octave-band analysis,
engine-order tracking analysis and narrow-band spectral analysis), necessitating a reconsideration of the applied methodologies.
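As a small worked example of such a sampling trade-off, the rate required for engine-order tracking follows directly from the highest order of interest and the maximum speed; the factor of 2.56 used below is the common engineering allowance for the anti-alias filter.

```python
def required_sample_rate(max_order, max_rpm, margin=2.56):
    """Sample rate needed to track a given engine order up to a given
    speed; 2.56x is the usual allowance for the anti-alias filter."""
    f_max = max_order * max_rpm / 60.0  # highest order frequency in Hz
    return margin * f_max

# e.g. order 36 at 6000 RPM: f_max = 3600 Hz, so fs >= ~9.2 kHz,
# while 1/3-octave monitoring alone would tolerate a lower rate.
```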
Automated Analysis
A further concern of test data analysts is that many processing techniques are very user-intensive, requiring interaction with expert users. Modal analysis is a typical example,
requiring the selection of several analysis parameters and the
interpretation of stabilization diagrams. This complex process
is needed to separate mathematical poles from physical system
poles and to assess the decoupling of individual poles. While
this procedure proves to be adequate for interactive selection
of valid system solutions, the challenge is to automate the process. For systems with clearly isolated poles, this is addressed
by rather simple methods, but for systems with coupled poles,
this is still an open question. Researched solutions include:
estimation methods that are much more robust with respect to
the appearance of spurious poles; fully automated, self-tuning
algorithms such as the maximum-likelihood estimator; and the
implementation of heuristic pole-selection rules in an automated
procedure. The objective of the latter solution, for example, was
to automate the modal identification that has to be done on
each Space Shuttle after landing. While a full automation of
this measurement process still seems too ambitious, it would
be a major achievement.
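To make the stabilization-diagram procedure concrete, the sketch below implements the basic idea behind time-domain estimators such as the least-squares complex exponential (LSCE) method discussed below: fit complex exponentials to an impulse response at increasing model orders, and mark a pole as "stabilized" when it reappears at the next order within frequency and damping tolerances. The 1% and 5% tolerances are typical heuristics; this is a didactic fragment, not the implementation of any commercial system.

```python
import numpy as np

def lsce_poles(h, fs, n_modes):
    """Least-squares complex exponential (LSCE) fit of one impulse
    response (e.g. the IFFT of a measured FRF).

    Returns continuous-time poles; each physical mode appears as a
    complex-conjugate pair.
    """
    n = 2 * n_modes                       # polynomial degree
    rows = len(h) - n
    # Linear prediction: sum_k beta_k * h[t+k] = -h[t+n]
    H = np.column_stack([h[i:i + rows] for i in range(n)])
    beta = np.linalg.lstsq(H, -h[n:n + rows], rcond=None)[0]
    z = np.roots(np.r_[1.0, beta[::-1]])  # discrete-time poles
    return np.log(z) * fs                 # map to continuous time

def is_stabilized(p, prev_poles, f_tol=0.01, d_tol=0.05):
    """Heuristic check: the pole reappears at the previous model order
    within 1% in frequency and 5% in damping ratio."""
    f, d = abs(p.imag), -p.real / abs(p)
    for q in prev_poles:
        fq, dq = abs(q.imag), -q.real / abs(q)
        if abs(f - fq) <= f_tol * fq and abs(d - dq) <= d_tol * max(dq, 1e-9):
            return True
    return False

# A stabilization diagram is built by calling lsce_poles() for
# increasing n_modes and plotting the poles that pass is_stabilized().
```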
An example of applying a more robust parameter estimation
method, such as the LMS PolyMAX (least-squares complex
frequency domain) method, is shown in Figure 3 where a
ground vibration test (GVT) analysis of an aircraft is presented.
The LMS PolyMAX stabilization diagram is compared to the
standard LSCE (least-squares complex exponential) results. The
clearer stabilization behavior and reduced analysis complexity are obvious.
Future research will involve both intelligent decision and
learning mechanisms for capturing the user's expertise, and fundamental investigation into the nature of mathematical poles
and identification criteria.
Figure 3. (a) LSCE stabilization diagram. (b) LMS PolyMAX GVT stabilization diagram.
Optical Measurements
One of the instrumentation constraints with modal analysis
is the effect of the transducer mass on the structure. With increasing frequency, the influence of transducer mass increases
(larger inertia effect), while the number of transducers has to
increase (higher spatial complexity of the modes). Noncontact
optical vibration measurement techniques have been developed to counteract this problem.
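The magnitude of the mass-loading effect can be estimated with a single-degree-of-freedom approximation, in which the added transducer mass lowers the natural frequency while leaving the stiffness unchanged; the numbers in the example below are assumed for illustration only.

```python
import math

def mass_loaded_frequency(f0, modal_mass, transducer_mass):
    """Single-DOF estimate of the frequency shift caused by transducer
    mass: f ~ sqrt(k/m) with the stiffness k unchanged."""
    return f0 * math.sqrt(modal_mass / (modal_mass + transducer_mass))

# e.g. a 10 g accelerometer on a mode with 200 g effective mass:
# an 800 Hz mode drops to about 781 Hz, a 2.4% error that worsens
# for lighter structures and higher-frequency (more local) modes.
```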
Essentially two approaches are used. One is based on laser
Doppler vibrometers (LDVs), which sequentially scan the vibrating surface. This approach supports broadband and sinusoidal testing, is integrated into standard test systems and procedures, and is frequently used. The second approach is based
on full-field electronic speckle pattern interferometry (ESPI).
This approach uses continuous-strobe or pulsed lasers and requires sinusoidal excitation. The imaging takes place
using a special charge-coupled-device (CCD) camera. Specific
image processing transforms the qualitative interferograms to
quantitative FRF values at the excitation frequency. The complete FRF is then built up frequency by frequency. Special data-reduction
methods allow pixel-density vibration response fields to be
reduced to the spatial resolution needed for a proper
modal analysis. The first industrial applications of this approach on cars have been documented.
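Such a data-reduction step can be as simple as averaging non-overlapping pixel blocks of the full-field response down to a coarser measurement grid, as in the sketch below; plain block averaging is only one possible reduction scheme.

```python
import numpy as np

def reduce_field(field, block):
    """Reduce a full-field (pixel-density) response map to a coarser
    measurement grid by averaging non-overlapping pixel blocks.

    field : (ny, nx) complex FRF values at one excitation frequency
    block : block size in pixels along each axis
    """
    ny, nx = field.shape
    ny -= ny % block                      # crop to a whole number
    nx -= nx % block                      # of blocks per axis
    f = field[:ny, :nx].reshape(ny // block, block, nx // block, block)
    return f.mean(axis=(1, 3))            # one value per block
```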
High-frequency modal analysis or response analysis is essential when studying vibro-acoustic problems. Structural modeling at acoustically relevant frequencies is not straightforward,
and for trimmed structures, not many successful applications
are documented. Updating the structural models by test or
using the test models in hybrid combination with numerical
acoustic models is a practical alternative.
In-Operation Modal Analysis
A key advantage of in-operation modal analysis is that the resulting models take into account the environmental influences on system behavior (prestress of suspensions, load-induced stiffening, aero-elastic interaction, etc.).
Another characteristic of in-operation modal analysis stems
from the fact that in many cases (swell excitation of off-shore
platforms, traffic/wind excitation of civil structures), forced-excitation tests are very difficult to conduct, and operating data
are often all that are available. Therefore, a considerable interest exists in extracting valid models directly from operating
data. Finally, in many cases (car road tests, aircraft/spacecraft
flight tests), large in-operation data sets are measured anyway
for level verification, for operating field shape analysis, and
other purposes. Extending classical operating data analysis
procedures with modal parameter identification allows a better exploitation of these data.
Only response data are measurable in most cases, while actual loading conditions are unknown. Consequently, over recent years, several modal parameter estimation techniques have
been proposed and studied for modal parameter extraction
from output-only data. They include auto-regressive
moving-average (ARMA) models, the natural excitation technique
(NExT), and stochastic realization methods. The approach is
becoming well accepted in the industry. Recently the LMS
PolyMAX method was also extended to output-only data. Figure 7 shows mode shapes from a test on a highway bridge excited by the wind and traffic. Despite the low vibration levels
and the fact that only response data were available, the resulting mode shapes are of high quality.
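The preprocessing idea behind NExT can be sketched compactly: under broadband ambient excitation, the cross-correlations between the response channels and a reference response decay like impulse responses, so they can be handed to the same time-domain estimators used for classical FRF data (such as the LSCE fragment shown earlier). The choice of reference channel and correlation length below are user assumptions.

```python
import numpy as np
from scipy.signal import correlate

def next_correlations(responses, ref_channel, max_lag):
    """NExT preprocessing: cross-correlations of the outputs with a
    reference output, usable as pseudo impulse responses.

    responses : (n_channels, n_samples) operating response data
    Returns (n_channels, max_lag) correlation functions.
    """
    ref = responses[ref_channel]
    n = responses.shape[1]
    out = []
    for y in responses:
        r = correlate(y, ref, mode='full')[n - 1:n - 1 + max_lag]
        out.append(r / n)  # biased correlation estimate
    return np.array(out)

# Each row decays like a free vibration response and can be fed to a
# time-domain modal estimator.
```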
Conclusions
The pressure to shift product performance optimization
to earlier stages of the development process has been answered
by a revolution in computer-aided engineering methods, resulting in a virtual-prototype engineering approach to product development. But contrary to the belief that this would eliminate
testing from the development process, it has resulted in new
demands for testing that are more stringent than ever. While
testing was traditionally executed in the context of a test-analyze-and-fix strategy, experimental data collection and analysis are
now integrated throughout the various phases of the virtual
product development process, from benchmarking and target
setting through component model validation to establishing
true hybrid product models.
Test data are no longer retained in isolated islands of excellence, but play an essential role throughout the development
process and are used throughout the extended organization.
This requires that each of the applied test procedures (operational data collection, modal analysis, acoustic testing, noise
source identification, etc.) be critically revisited in view
of the new requirements of virtual prototype refinement.
The authors may be contacted at: herman.vanderauweraer@lms.be.