
A New Testing Paradigm for Today's Product Development Process - Part 2

Herman Van der Auweraer and Jan Leuridan, LMS International, Leuven, Belgium
Modern product development increasingly relies on simulation methods to optimize functional performance. The test-analyze-fix approach of physical prototypes has moved to computer-aided, engineering-based, virtual prototyping. Contrary to the belief that this would reduce the demands for testing, it has opened new applications, new challenges and opportunities. Part 2 of this article covers these techniques.
(See page 14 of the September 2005 issue of Sound & Vibration for Part 1 of this article.)
The demands on speed and ease-of-use of testing procedures have become very severe. Testing a component, subsystem or full structure will always be in the critical path of the development project. The number of prototypes to be tested will certainly dwindle, leaving less time for each test. Furthermore, since the analyzed structure will be in use for other tests immediately after completion of the current one, it is essential that all necessary data are acquired and validated during the short period that the structure is available, leading to a test-right-first-time demand. With testing becoming a supporting technical commodity instead of a highly specialized task, it will be increasingly delegated to technicians and operators instead of being performed by engineering specialists.

Test Definition
A modern modal analysis test can easily comprise more than 1000 degrees of freedom. In particular, when validating and updating finite-element (FE) models, a sufficiently dense grid of test points is needed. The FE model grid is much finer than the test grid, requiring a substantial reduction. Selecting a minimal set of test and excitation points and their locations is very valuable in optimizing test efficiency. The FE model itself can be used for such a pretest analysis, maximizing test efficiency, reducing test costs and limiting unavailability of the prototype.
Test preparation can even identify target modes in view of the subsequent FE correlation to be performed. An optimal set of excitation and response nodes can also be identified for these modes. With aerospace applications in particular, FE models are always available before the tests are conducted, and the tests are so expensive that proper test preparation (and even a virtual dry run of the test using the FE model) is a prerequisite before setting up the actual instrumentation and testing.
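As an illustration of such a pretest analysis, the sketch below prunes a candidate sensor set with the effective-independence method, a common choice for this selection task (the article does not name a specific algorithm, so the method and all variable names here are assumptions):

import numpy as np

def efi_sensor_selection(phi, n_sensors):
    # phi: candidate mode-shape matrix from the FE model
    # (rows = candidate DOFs, columns = target modes)
    keep = np.arange(phi.shape[0])
    while len(keep) > n_sensors:
        p = phi[keep]
        # Fisher information matrix of the current candidate set
        a = p.T @ p
        # effective-independence value of each DOF: its leverage in the
        # mode-shape basis; a low value adds little information
        ed = np.einsum('ij,jk,ik->i', p, np.linalg.inv(a), p)
        keep = np.delete(keep, np.argmin(ed))
    return keep  # indices of the retained measurement points

Starting from the full FE grid, the loop drops one degree of freedom at a time, so the retained points keep the target modes as linearly independent as possible.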
In all applications, another tedious task is establishing a precise test geometry. Either this relies on the FE model and on the capability of the operator to instrument exactly at the predefined locations (which is unrealistic in most cases), or the geometry of the test structure has to be measured by hand. Novel methods are being investigated to reduce this time, using coordinate measurement systems or an articulated arm equipped with electromagnetic or electro-optical sensing. Developed solutions include ultrasonic photogrammetry, grid projection and triangulation techniques. An example of the photogrammetry approach, using a digital camera to measure the 3D geometry of an aircraft propeller, is shown in Figure 1. The recently developed scanning technique is very convenient; it is based on linking each measurement location to an external reference frame through automated image processing.
Based on a paper presented at ISMA2004, the 2004 International
Conference on Modal Analysis, Noise and Vibration Engineering,
Leuven, Belgium, September 2004.


Test Instrumentation
The increasing complexity and size of the tests that are conducted mean that instrumentation setup takes up a significant
part of the total test duration. State-of-the-art instrumentation
consists of multichannel data-acquisition front ends with local conditioning, analog-to-digital conversion and signal processing connected to powerful computers. Channel counts
range between 16 and 64, up to as much as 1000, depending
on the application. Signal bandwidths typically range from 100
Hz to 20 kHz. Operational measurements require mobile systems, which, in the case of large-scale structures like aircraft, need to consist of multiple, distributed, but time-synchronized units.
As errors in cabling and sensor identification become more difficult to notice and correct, automated procedures, including the use of smart transducers with embedded information on calibration, position, etc., are becoming more widespread. The concept of a transducer electronic data sheet, or TEDS, embedded in the transducer is covered by the IEEE 1451 standard.
Another important efficiency gain is that the structure under test can be pre-instrumented (including sensor calibration)
outside the test room. The sensor remembers its location,
calibration factor, history, and identification, only requiring
blind cable connections to start the actual test. This optimizes
the occupancy and throughput of costly test rooms (for example, semi-anechoic rooms for studying vibro-acoustic effects) and significantly reduces cabling errors.
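A minimal sketch of how such self-identifying sensors streamline channel setup is given below; the field names and values are simplified illustrations, not the IEEE 1451 binary template:

from dataclasses import dataclass

@dataclass
class TransducerEDS:
    # simplified stand-in for an IEEE 1451 TEDS record
    manufacturer_id: int
    model: str
    serial: str
    sensitivity_mv_per_g: float  # stored calibration factor
    position_label: str          # measurement point on the test geometry

def assign_channel(channel_table, channel, teds):
    # the front end reads the TEDS over the signal wires, so the operator
    # connects cables "blind" and the channel table fills itself in
    channel_table[channel] = {
        "point": teds.position_label,
        "sensitivity": teds.sensitivity_mv_per_g,
        "serial": teds.serial,
    }

channels = {}
assign_channel(channels, 7,
               TransducerEDS(0x17, "ICP-100", "SN4711", 102.3, "wing:LE:12"))
print(channels[7])  # sensor identity and calibration recovered from the wire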

Test Data Plausibility Validation


The key concern with acquiring test data is that these data must be valid. Errors in the setup or during the test must be detected and identified before the test is closed and the setup dismantled. Often there is no second chance to redo part of the tests; doing so would be very expensive and would disrupt the testing schedule. Testing-right-first-time is the paradigm. Detecting test data problems has to be done on several levels:
• Transducers: Are they properly connected to the structure? Are they properly connected to the test system? Are they functioning properly?
• Are the test conditions in accordance with the specifications (excitation, engine speed profiles, references, suspension or boundary conditions)?
• Is the quality of the measurement signals adequate (levels, spectrum, linearity, noise, digital signal processing (DSP) errors)?
• Is it possible to automatically correct identified errors or test data problems (drop-outs, bias, drift, spikes)?
• Can problem areas be easily identified in terms of structural locations, frequency ranges, and test conditions?
• Is the consistency between the signals adequate in view of the analysis (correlations, coherences, frequency response functions (FRFs), etc.)?
• Are preliminary, on-line analysis results available to monitor the test process (RMS levels, coherences, statistical parameters, order curves, modal parameters)?
To yield data with maximum usability, indicators for the various test plausibility factors have to be defined, calculated, and monitored on-line. Deviations have to be recognized immediately, and decisions must be made on whether to redo part of the test, correct data afterward, or just flag data as being invalid or of decreased plausibility. Time can be so limited that redoing the test (even partially) is not feasible, but at least one should not use (or use only with caution) any flagged data.

Figure 1. (a) Aircraft modal test setup. (b) Aircraft propeller geometry from photogrammetry.

Figure 2. (a) Overall coherence value. (b) Coherence in critical frequency band.
Essentially, it boils down to providing test engineers with easy on-line and post-analysis data validation tools. Upon completing the test, a report with an assessment of the test data plausibility should be available. Figure 2 shows a global coherence plot, projecting FRF coherence on the geometry.
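A minimal sketch of such an on-line indicator is given below; it flags response channels whose coherence with the excitation collapses in a band of interest (the band limits and the 0.9 threshold are illustrative assumptions):

import numpy as np
from scipy.signal import coherence

def flag_low_coherence(x, ys, fs, band=(20.0, 200.0), threshold=0.9):
    # return the channels whose mean coherence with the excitation x
    # drops below the threshold inside the frequency band of interest
    suspect = []
    for ch, y in enumerate(ys):
        f, cxy = coherence(x, y, fs=fs, nperseg=1024)
        mask = (f >= band[0]) & (f <= band[1])
        if cxy[mask].mean() < threshold:
            suspect.append(ch)
    return suspect

# example: 8 response channels, one of them dead or unplugged
fs = 2048
x = np.random.randn(10 * fs)                      # broadband excitation
ys = [x + 0.1 * np.random.randn(x.size) for _ in range(8)]
ys[3] = np.random.randn(x.size)                   # simulate a faulty channel
print(flag_low_coherence(x, ys, fs))              # -> [3]

Run per acquisition block, such an indicator catches a loose cable or a dead channel while the structure is still instrumented.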

Optimal Data Exploitation


Most current applications either provide on-line measurement of dedicated functions, or the data are stored on some recording device for deferred processing. The latter is done for safety reasons or because of insufficient on-line processing capability.
Directly linked to the requirement for condensing test programs are requests from test performers to combine on-line processing with storage of raw test data for later processing. This is necessary so that analyses can be redone with other parameters (frequency resolution, order analysis methods or in-depth time-domain analysis, for example) or additional analyses can be launched without having to set up a new test. The considerations are similar to those discussed above for data validation: minimal use of lab time, minimal constraints on the testing routine and minimal test time. In-flight tests or jet engine tests are very expensive, and the total number of such tests should be minimized by getting as much information as possible out of a single test. In some cases, as with aerospace vibration testing, it may even be impossible to set up a second test, since the maximum allowed loading was already consumed in the first test. Also, for systems with time-variant behavior (temperature effects due to heating of an exhaust or of an engine, for example), redoing a test under comparable conditions involves long delays related to cooling the structure.
The solution is to store the raw time data using the data processing system. In the case of an in-flight test, this requires high-speed acquisition with a data capacity of up to tens of gigabytes. Adequate acquisition settings must be used to cover not only the current analysis requirements, but also those of potential secondary uses. In engine testing, this may lead to conflicting requirements on the sampling level (octave-band analysis, engine order tracking analysis and narrow-band spectral analysis), necessitating a reconsideration of the applied methodologies.
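A back-of-the-envelope sketch of the volumes involved is given below; the channel count, sample rate and word length are illustrative assumptions, not figures from the article:

# illustrative throughput estimate for raw time-data recording
channels = 100
sample_rate_hz = 25_600   # covers a 10 kHz bandwidth with margin
bytes_per_sample = 4      # 24-bit ADC samples stored as 32-bit words
duration_s = 3600         # one hour of in-flight measurement

rate_mb_s = channels * sample_rate_hz * bytes_per_sample / 1e6
total_gb = rate_mb_s * duration_s / 1e3
print(f"{rate_mb_s:.1f} MB/s sustained, {total_gb:.1f} GB per test")
# -> 10.2 MB/s sustained, 36.9 GB per test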

Automated Analysis
A further concern of test data analysts is that many processing techniques are very user intensive, requiring interaction with expert users. Modal analysis is a typical example, requiring the selection of several analysis parameters and the interpretation of stabilization diagrams. This complex process is needed to separate mathematical poles from physical system poles and to assess the uncoupling of individual poles. While this procedure proves adequate for the interactive selection of valid system solutions, the challenge is to automate the process. For systems with clearly isolated poles, this is addressed by rather simple methods, but for systems with coupled poles, it is still an open question. Researched solutions include: estimation methods that are much more robust with respect to the appearance of spurious poles; fully automated, self-tuning algorithms such as the maximum-likelihood estimator; and the implementation of heuristic pole selection rules in an automated procedure. The objective of the latter solution, for example, was to automate the modal identification that has to be done on each Space Shuttle after landing. While full automation of this measurement process still seems too ambitious, it would be a major achievement.
An example of applying a more robust parameter estimation method, the LMS PolyMAX (least-squares complex frequency-domain) method, is shown in Figure 3, where a ground vibration test (GVT) analysis of an aircraft is presented. The LMS PolyMAX stabilization diagram is compared to the standard LSCE (least-squares complex exponential) results. The clearer stabilization behavior and reduced analysis complexity are obvious.
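The kind of heuristic that such automation builds on can be sketched as follows: a pole is declared stable when it reappears at the next model order within tight frequency and damping tolerances. The tolerances below are typical textbook values, an assumption rather than the article's settings:

import numpy as np

def freq_damp(p):
    # natural frequency [Hz] and damping ratio of a continuous-time pole
    wn = abs(p)
    return wn / (2 * np.pi), -p.real / wn

def stable_poles(poles_by_order, ftol=0.01, dtol=0.05):
    # keep a pole at order n if some pole at order n-1 matches it
    # within the relative frequency and damping tolerances
    stable = []
    for prev, cur in zip(poles_by_order, poles_by_order[1:]):
        ref = [freq_damp(q) for q in prev]
        for p in cur:
            f, d = freq_damp(p)
            if any(abs(f - fq) <= ftol * fq and abs(d - dq) <= dtol * max(dq, 1e-6)
                   for fq, dq in ref):
                stable.append(p)
    return stable

Here poles_by_order would hold the complex poles returned by any estimator (LSCE, PolyMAX, maximum likelihood) at successive model orders; automated pole selection then reduces to clustering the surviving poles.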
Future research will involve both intelligent decision and learning mechanisms for capturing the user's expertise and fundamental investigation into the nature of mathematical poles and identification criteria.

Data Quality and Consistency


One of the most involved aspects of the new role of testing is that much higher accuracy is expected from experimental data. Key to this is that test results are not only used for qualitative analysis or troubleshooting, but that these data will also contribute to building hybrid models. All test aspects related to data quality (which is something different from the basic validity of the data) have to be treated seriously. This involves constraints induced by the boundary conditions of the test (suspension influences, location of rigid-body modes, shaker-induced nonmeasurable secondary loads such as moments or lateral forces), the influence of transducers, noise levels on orders or FRFs, leakage errors, frequency resolution limitations and so on.
A key problem often originates from a lack of consistency between the data due to small shifts in setup constraints, different mass loading, or temperature variations. While these errors are minor and would perhaps not really affect the quality of each individual measurement, global processing of these data may be subject to severe errors. It is recognized that in FRF substructuring, the frequency inconsistency between individual FRFs is one of the major error sources.
Automated procedures are being developed to integrate
modal estimation results from slightly inconsistent data
patches. The rationale is that small changes in some global
modal parameters may be far less influential for the end result
than the inconsistency itself.

Service Load Simulation


Testing a vehicle in test drives on public roads and test tracks is still the ultimate evaluation of vehicle performance. For durability, such tests can become very expensive and time consuming, and significant gains can be obtained by simulating the measured service loads in a laboratory environment consisting of multi-axial test rigs.
The advantage of simulating service loads on a multi-axial
test rig in a laboratory environment is that it allows testing 24
hours a day, 7 days a week, without a driver in the car and
without adverse influence from weather or traffic. It offers
much better surveillance of the test and an earlier detection of
fatigue cracks.
A major advantage of the service load simulation approach is the possibility of running the durability test with compressed target load time histories that preserve the same damage potential as the originally measured signals. This so-called fatigue-sensitive data reduction dramatically reduces the length of the target signals and therefore also the time spent in the service load simulation phase.
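The data-reduction step can be illustrated with a simple range gate on the turning points of a measured load signal. A production tool would verify damage equivalence with rainflow counting and a damage model; the gate of 5% of the signal range below is an arbitrary assumption:

import numpy as np

def turning_points(x):
    # reduce a load history to its local extrema (reversal points)
    dx = np.diff(x)
    keep = np.concatenate(([True], dx[:-1] * dx[1:] < 0, [True]))
    return x[keep]

def range_gate(tp, gate):
    # drop reversals smaller than the gate; small cycles contribute
    # little fatigue damage for typical S-N slopes, so the edited
    # signal keeps nearly the same damage potential while being
    # much shorter
    out = [tp[0]]
    for v in tp[1:]:
        if abs(v - out[-1]) >= gate:
            out.append(v)
    return np.asarray(out)

# example: a noisy service load with a few large events
rng = np.random.default_rng(0)
load = np.cumsum(rng.standard_normal(100_000)) * 0.05
tp = turning_points(load)
edited = range_gate(tp, gate=0.05 * (load.max() - load.min()))
print(len(load), "->", len(tp), "->", len(edited), "samples")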
Compared with test track durability testing, another major
advantage of service load simulation on test rigs is that it is
possible to run a more customer-relevant durability test. In
an attempt to speed up testing, special test tracks have been
designed to accelerate the test program by focusing on the fatigue-relevant events and by omitting smaller nondamaging
load cycles. However, fatigue loading on test tracks is often
significantly more severe, certainly in terms of maximum amplitudes. Such a generic test track cannot be customer-relevant for all types of cars and drivers. As a result, test track
driving is mostly used for design optimization, during which
different designs are compared to each other with respect to
durability performance. On the other hand, service load simulation testing allows a more realistic durability test program
by running an optimized mix of test track subsections. Figure
4 shows the setup of such a multi-axial test rig.
Similar approaches are adopted for noise and vibration comfort studies. Figure 5 shows a high-frequency, six-degree-of-freedom (DOF) test platform used for suspension and road noise testing at Katholieke Universiteit Leuven.


Figure 3. (a) LSCE stabilization diagram. (b) LMS PolyMAX GVT stabilization diagram.

Figure 4. Multi-axial test rig for service load simulation.

On the level of NVH or human comfort testing, research is conducted to synthesize mission-equivalent test conditions representative of the original road data in the relevant metrics.
Next to the process improvements discussed above, it is essential to make testing fit the requirements of today's advanced applications. Some specific innovations have been introduced, expanding the limits of structural testing and making testing specifically fit the needs of virtual prototyping.

Figure 5. Six-degree-of-freedom shaker table vehicle setup.

Figure 6. Acoustic mode shape of an automobile interior.

Optical Measurements
One of the instrumentation constraints with modal analysis
is the effect of the transducer mass on the structure. With increasing frequency, the influence of transducer mass increases
(larger inertia effect), while the number of transducers has to
increase (higher spatial complexity of the modes). Noncontact
optical vibration measurement techniques have been developed to counteract this problem.
Essentially two approaches are used. One is based on laser
Doppler vibrometers (LDVs), which sequentially scan the vibrating surface. This approach supports broadband and sinusoidal testing, is integrated in standard test systems and procedures and is frequently used. The second approach is based
on full-field electronic speckle pattern interferometry (ESPI).
This approach uses continuous-strobe or pulsed lasers and requires use of sinusoidal excitation. The imaging takes place
using a special charge-coupled-device (CCD) camera. Specific
image processing transforms the qualitative interferograms to
quantitative FRF values at the excitation frequency. The complete FRF is then built up frequency by frequency. Special data reduction methods allow pixel-dense vibration response fields to be reduced to the spatial resolution needed for a proper modal analysis. The first industrial applications of this approach on cars have been documented.
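One simple form such a data reduction can take is complex block averaging of the measured field, sketched below as an assumed stand-in for the dedicated methods referred to above:

import numpy as np

def decimate_field(field, block=16):
    # field: pixel-dense complex FRF values at one excitation
    # frequency, as obtained from the processed interferograms
    h, w = field.shape
    h2, w2 = (h // block) * block, (w // block) * block
    f = field[:h2, :w2].reshape(h2 // block, block, w2 // block, block)
    # average each block of pixels down to one "virtual sensor"
    return f.mean(axis=(1, 3))

# example: a 512 x 512 pixel field becomes a 32 x 32 response grid
field = np.random.randn(512, 512) + 1j * np.random.randn(512, 512)
print(decimate_field(field).shape)  # -> (32, 32)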
High-frequency modal analysis or response analysis is essential when studying vibro-acoustic problems. Structural modeling at acoustically relevant frequencies is not straightforward, and for trimmed structures, not many successful applications are documented. Updating the structural models by test, or using the test models in hybrid combination with numerical acoustic models, is a practical alternative.

Vibro-Acoustic Modal Analysis
In many interior noise problems, not only the structure but also the cavity may show resonance behavior, such as the booming noise in a car where a cavity resonance is excited by engine or road-induced vibrations. The concept of modal analysis can also be applied to acoustical or mixed structural-acoustical systems, using a volume velocity source as the acoustical input variable and sound pressure as the acoustical output. Unique to this approach is that the eigenvalues are due to the coupled behavior of structure and acoustics. In general, they cannot be separated into a purely acoustical or purely structural cause. The eigenvectors, of course, each have a specific acoustic and structural part. Special scaling considerations related to the vibro-acoustic reciprocity principle have to be taken into account. Figure 6 shows an example of an acoustic mode of a car interior.
The acoustic pressure in the measured vehicle sections can be represented by gray levels or color scales or by wire-frame motion. The modal behavior is clearly seen.
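In textbook (u, p) finite-element notation, the coupled problem behind these mixed modes takes a form like the following; the notation is generic (sign conventions for the coupling matrix C vary between references) and is not reproduced from the article:

\left( \begin{bmatrix} K_s & C \\ 0 & K_a \end{bmatrix} - \omega^2 \begin{bmatrix} M_s & 0 \\ \rho_0 C^T & M_a \end{bmatrix} \right) \begin{Bmatrix} u \\ p \end{Bmatrix} = \begin{Bmatrix} f_s \\ f_a \end{Bmatrix}

The off-diagonal blocks make the system matrices unsymmetric, so each eigenvalue belongs to the coupled system as a whole, and each eigenvector carries both a structural part u and an acoustic part p.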

In-Operation Modal Analysis


In the classical modal identification approach, the baseline data that are processed are FRFs or impulse responses (IRs) measured under laboratory conditions. But in many applications, real operating conditions may differ significantly from those applied during the modal test, hence the need to derive models under real operational conditions. Since all real-world systems are nonlinear to some extent, the models obtained under real loading will be linearized for much more representative working points. Additionally, they will properly take into account the environmental influences on system behavior (prestress of suspensions, load-induced stiffening, aero-elastic interaction, etc.).
Another characteristic of in-operation modal analysis stems
from the fact that in many cases (swell excitation of off-shore
platforms, traffic/wind excitation of civil construction), forced-excitation tests are very difficult to conduct, and operating data
are often all that are available. Therefore, a considerable interest exists in extracting valid models directly from operating
data. Finally, in many cases (car road tests, aircraft/spacecraft
flight tests), large in-operation data sets are measured anyway
for level verification, for operating field shape analysis, and
other purposes. Extending classical operating data analysis
procedures with modal parameter identification allows a better exploitation of these data.
Only response data are measurable in most cases, while the actual loading conditions are unknown. Consequently, over recent years, several modal parameter estimation techniques have been proposed and studied for modal parameter extraction from output-only data. They include auto-regressive moving-average (ARMA) models, the natural excitation technique (NExT), and stochastic realization methods. The approach is becoming well accepted in industry. Recently the LMS PolyMAX method was also extended to output-only data. Figure 7 shows mode shapes from a test on a highway bridge excited by wind and traffic. Despite the low vibration levels and the fact that only response data were available, the resulting mode shapes are of high quality.

Figure 7. Two mode shapes of a bridge from output-only data.
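A toy sketch of the output-only idea, using a high-order AR model (a simple relative of the ARMA-based methods mentioned above; the model order and the simulated mode are arbitrary assumptions):

import numpy as np
from scipy.signal import lsim, TransferFunction

def ar_poles(y, order, fs):
    # fit y[t] = a1*y[t-1] + ... + ap*y[t-p] by least squares;
    # no input measurement is needed, only the response y
    Y = np.column_stack([y[order - k - 1: len(y) - k - 1] for k in range(order)])
    a, *_ = np.linalg.lstsq(Y, y[order:], rcond=None)
    z = np.roots(np.concatenate(([1.0], -a)))  # discrete-time poles
    return np.log(z) * fs                      # map to the s-plane

# simulate a 5 Hz, 2%-damped mode driven by unmeasured white noise
fs, n = 256, 2**14
wn, zeta = 2 * np.pi * 5.0, 0.02
t = np.arange(n) / fs
sys = TransferFunction([wn**2], [1.0, 2 * zeta * wn, wn**2])
_, y, _ = lsim(sys, np.random.randn(n), t)

s = ar_poles(y, order=20, fs=fs)
# pick the estimated pole nearest the simulated one for display
best = s[np.argmin(np.abs(s - (-zeta * wn + 1j * wn)))]
print(abs(best) / (2 * np.pi), -best.real / abs(best))  # ~5 Hz, ~0.02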

Figure 8. Effect of residual modes on modal synthesis calculation.

Testing for Hybrid Modeling


Some final considerations can be devoted to the fact that
hybrid modeling requires its own specific test results. The classical Craig-Bampton approach using fixed interface modes and
static modes is not applicable when using test data. Hybrid approaches are based either on FRF substructuring or residual
flexibility methods. The latter approach is based on free-interface modes, a condition fulfilled by standard modal tests, but it requires additional, dedicated tests for rigid-body modes and residual flexibility modes.
The residual flexibility modes are necessary to model the effect of truncated higher modes that affect (or even dominate) the local stiffness at the coupling points. This is shown in Figure 8, which evaluates a modal resynthesis of an FRF before and after accounting for residual stiffness.
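A small sketch of this effect: synthesizing a receptance FRF from a truncated mode set, with a constant residual-flexibility term standing in for the higher modes (all modal values below are made up for illustration):

import numpy as np

def synth_frf(omega, modes, residual_flexibility=0.0):
    # modal superposition of a receptance FRF from
    # (phi_j, phi_k, wn, zeta) tuples, plus a constant upper residual
    H = np.zeros_like(omega, dtype=complex)
    for phi_j, phi_k, wn, zeta in modes:
        H += phi_j * phi_k / (wn**2 - omega**2 + 2j * zeta * wn * omega)
    return H + residual_flexibility

w = 2 * np.pi * np.linspace(1, 100, 2000)
kept = [(1.0, 1.0, 2 * np.pi * 20, 0.02), (0.8, -0.5, 2 * np.pi * 60, 0.02)]
truncated = (0.9, 0.9, 2 * np.pi * 300, 0.02)
# the static contribution of the truncated mode approximates
# its in-band effect
res = truncated[0] * truncated[1] / truncated[2] ** 2
H_trunc = synth_frf(w, kept)
H_resid = synth_frf(w, kept, residual_flexibility=res)
H_true = synth_frf(w, kept + [truncated])
# the residual term largely removes the truncation error in-band
print(abs(H_true - H_trunc)[0], ">", abs(H_true - H_resid)[0])

At low frequencies the constant term reproduces the static stiffness contribution of the truncated mode almost exactly, which is the effect Figure 8 illustrates.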
The modal truncation problem can be overcome by estimating residual modes (or pseudo modes) from static and dynamic
compensation terms. Much improved results are obtained by
including these residual modes in a conventional modal synthesis calculation. New methods to estimate the residual modes
from measured FRF data have also been developed. An example of a hybrid substructuring calculation on a part of a
vehicle assembly is shown in Figure 9, comparing the assembly FRF after combining two substructure models before and
after including residual modes. Also, specific procedures based
on a standard modal analysis have been developed for the experimental establishment of actual rigid-body modes and inertial parameters (center of gravity) of a structure.

Figure 9. (a) Predicted FRF without residual modes. (b) Predicted FRF with residual modes.
Another example is the increased awareness of the effect of
rotational degrees of freedom. These quantities, which are traditionally neither measured nor used in experimental models
(modal analysis), play an important role in numerical calculations. When combining experimental and numerical models,
their influence must be assessed, and dedicated measurements
need to be made where needed.

Conclusions
The pressure to shift product performance optimization to earlier stages of the development process has been relieved by a revolution in computer-aided engineering methods, resulting in a virtual-prototype engineering approach to product development. But contrary to the belief that this would eliminate
testing from the development process, it has resulted in new demands for testing that are more stringent than ever. While testing was traditionally executed in the context of a test-analyze-and-fix strategy, experimental data collection and analysis are
now integrated throughout the various phases of the virtual
product development process, from benchmarking and target
setting through component model validation to establishing
true hybrid product models.
Test data are no longer retained in isolated islands of excellence, but play an essential role throughout the development
process and are used throughout the extended organization.
This requires that each of the applied test procedures (operational data collection, modal analysis, acoustic testing, noise source identification, etc.) be critically revisited in view of the new requirements of virtual prototype refinement.
The authors may be contacted at: herman.vanderauweraer@lms.be.
