E4433B Calibration
Agilent Technologies
ESG Family Signal Generators
Printed in USA
March 2011
Contents
1. Equipment Required
Required Equipment . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .1-2
2. Operation Verification
Verification Procedures . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .2-2
3. Service Software
Required Test Equipment . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .3-2
Installing the ESG_B or ESG_APDP Service Software . . . . . . . . . . . . . . . . . . . . . . . . . . . . .3-3
Uninstalling the Software . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .3-5
ESG Family Support Software Administration . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .3-6
Running the Service Support Software. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .3-20
4. Performance Tests
Support Software . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .4-3
1. Internal FM Accuracy and Distortion . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .4-9
2. Internal AM Accuracy and Distortion. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .4-10
3. Phase Modulation Accuracy and Distortion . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .4-12
4. FM Frequency Response . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .4-13
5. AM Frequency Response . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .4-14
6. Phase Modulation Frequency Response . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .4-16
7. DCFM Frequency Offset Relative to CW . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .4-17
8. Residual FM (ESG-A and ESG-D only). . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .4-18
9. Harmonic, Subharmonic, and Nonharmonic Spurious Signals . . . . . . . . . . . . . . . . . . . . .4-20
10. Power Level Accuracy . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .4-21
11. Timebase Aging Rate (ESG-AP, ESG-DP or Option 1E5 only) . . . . . . . . . . . . . . . . . . . .4-24
12. Digital Modulation Level Accuracy . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .4-27
13. Internal Digital Modulation Quality . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .4-28
14. Custom I/Q RF Modulation Quality (Option UN8 only) . . . . . . . . . . . . . . . . . . . . . . . . .4-29
15. I/Q Modulation Quality (Options UN3, UN4 & UN8) . . . . . . . . . . . . . . . . . . . . . . . . . . .4-31
16. Pulse Modulation On/Off Ratio . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .4-32
17. Burst Modulation On/Off Ratio (ESG-D only) . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .4-33
18. CDMA Adjacent Channel Power (Option UN5 only) . . . . . . . . . . . . . . . . . . . . . . . . . . . .4-34
19. Alternate Timeslot Power Settling Time (Option UNA only) . . . . . . . . . . . . . . . . . . . . .4-35
20. Pulse Rise/Fall Time (Option 1E6 only) . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .4-36
21. Measuring Phase Noise and Residual FM
(ESG-AP and ESG-DP Series Signal Generators) . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .4-37
22. Dual Arbitrary Waveform Generator Check (Option UND only) . . . . . . . . . . . . . . . . . .4-42
23. GSM Loopback BER Check (Option 300 only) . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .4-44
24. Frac-N Check (ESG-AP, ESG-DP only) . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .4-45
25. Sampler/YO Driver Check (ESG-AP, ESG-DP only). . . . . . . . . . . . . . . . . . . . . . . . . . . . .4-45
Performance Test Records . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .4-46
5. Adjustments
Adjustment Relationships . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .5-2
Internal Reference Oscillator Adjustment . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .5-4
Analog Bus ADC Calibration. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .5-5
Pretune Calibration (ESG-AP and ESG-DP only) . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .5-6
Warranty . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .7-7
Assistance . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .7-8
Certification . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .7-9
1 Equipment Required
This chapter lists the equipment recommended for performing the performance tests and
adjustments for the ESG Family Signal Generators.
Critical Specifications
Frequency Range: 100 kHz to 10 MHz
Noise Figure: 1.5 dB
Gain: 50 dB
1 dB Compression Minimum: +5 dBm
External Power Supply Required: +15 Vdc @ 25 mA
Configuration
Two amplifiers are cascaded to meet the necessary requirements. The specifications
listed below are the combined specifications of the cascaded amplifiers.
Critical Specifications
Frequency Range: 10 MHz to 2 GHz
Noise Figure: 1.8 dB
Gain: > 56 dB
1 dB Compression Minimum: +10 dBm
External Power Supply Required: +15 Vdc @ 300 mA
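When only individual amplifier data sheets are available, the combined gain and noise figure of a cascade can be estimated with the Friis formula. The sketch below is a minimal illustration in Python; the single-amplifier gain and noise-figure values it uses are hypothetical examples, not the specifications of any particular amplifier pair.

    import math

    def cascade_gain_nf(gain1_db, nf1_db, gain2_db, nf2_db):
        """Combined gain and noise figure of two cascaded amplifiers (Friis formula)."""
        g1 = 10 ** (gain1_db / 10)      # first-stage gain, linear
        f1 = 10 ** (nf1_db / 10)        # first-stage noise factor, linear
        f2 = 10 ** (nf2_db / 10)        # second-stage noise factor, linear
        f_total = f1 + (f2 - 1) / g1    # second stage referred to the cascade input
        return gain1_db + gain2_db, 10 * math.log10(f_total)

    # Hypothetical pair: two 28 dB amplifiers with 1.6 dB and 4.0 dB noise figures
    gain_db, nf_db = cascade_gain_nf(28, 1.6, 28, 4.0)
    print(f"combined gain = {gain_db:.0f} dB, combined noise figure = {nf_db:.2f} dB")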
2 Operation Verification
This chapter provides procedures that either ensure that the signal generator is operating
correctly or help point to problem areas if it is not. Operation verification does not ensure
performance to specifications, but it provides a reasonable level of confidence, in a minimum
amount of time, that the signal generator is operating correctly. Operation verification is
appropriate for incoming inspection, after repair when a fully calibrated performance test is
not required, or whenever the integrity of the signal generator is in question.
Verification Procedures
Perform the following procedures in the order they are presented. The tables referenced by
the tests are located in the back of the chapter where they can be copied easily.
1. Power On the Signal Generator on page 2-2
2. Check for Error Messages on page 2-3
3. Frequency Range and Accuracy Check on page 2-3
4. Power Level Accuracy Check on page 2-4
5. FM Accuracy Check on page 2-7
6. AM Accuracy Check on page 2-8
7. I/Q Modulation Check (ESG-D Only) on page 2-9
Equipment Required
• HP/Agilent 5350A Frequency Counter
• HP/Agilent 438A Power Meter
• HP/Agilent 8482A Power Sensor
• HP/Agilent 8563E Spectrum Analyzer
• HP/Agilent 8491A/B Option 006 Attenuator (6 dB)
• HP/Agilent 8491A/B Option 010 Attenuator (10 dB)
• HP/Agilent 8902A Measuring Receiver
NOTE For ESG-AP or ESG-DP signal generators, or those with Option 1E5, ERROR
514, Reference Oven Cold occurs whenever the signal generator is first
connected to AC line power. The OVEN COLD annunciator and the ERR
annunciator both turn on. The OVEN COLD annunciator automatically clears
after approximately 5 minutes. The error queue cannot be cleared, however,
until the OVEN COLD annunciator has turned off.
2. Cycle the power to the signal generator. The green LED should again be lit and the
signal generator will perform a check.
NOTE Set the gate time to > 5 seconds for maximum counter accuracy. Verify that
the counter is phase-locked to the 10 MHz external reference.
5. FM Accuracy Check
Connect the Test Equipment
6. AM Accuracy Check
Connect the Test Equipment
Test Tables
Table 2-1 Frequency Accuracy
Frequency (MHz)   Lower Limit (Hz)   Measured (Hz)   Upper Limit (Hz)
Table 2-2 Power Level Accuracy Setup 1 (Signal Generators without Option 1E6)
Power Level Setting (dBm)   Lower Limit (dBm)   Upper Limit (dBm)
+7 6.5 7.5
0 −0.5 0.5
−5 −5.5 −4.5
+7 6.5 7.5
0 −0.5 0.5
−5 −5.5 −4.5
+7 6.5 7.5
0 −0.5 0.5
−5 −5.5 −4.5
+7 6.5 7.5
0 −0.5 0.5
−5 −5.5 −4.5
+7 6.5 7.5
0 −0.5 0.5
−5 −5.5 −4.5
+7 6.5 7.5
0 −0.5 0.5
−5 −5.5 −4.5
+7 6.5 7.5
0 −0.5 0.5
−5 −5.5 −4.5
+7 6.1 7.9
0 −0.9 0.9
−5 −5.9 −4.1
+7 6.1 7.9
0 −0.9 0.9
−5 −5.9 −4.1
0 −0.9 0.9
−5 −5.9 −4.1
0 −0.9 0.9
−5 −5.9 −4.1
Table 2-3 Power Level Accuracy Setup 1 (Signal Generators with Option 1E6)
Power Level Setting (dBm)   Lower Limit (dBm)   Upper Limit (dBm)
+3 2.5 3.5
0 −0.5 0.5
−5 −5.5 −4.5
+3 2.5 3.5
0 −0.5 0.5
−5 −5.5 −4.5
+3 2.5 3.5
0 −0.5 0.5
−5 −5.5 −4.5
+3 2.5 3.5
0 −0.5 0.5
−5 −5.5 −4.5
+3 2.5 3.5
0 −0.5 0.5
−5 −5.5 −4.5
+3 2.5 3.5
0 −0.5 0.5
−5 −5.5 −4.5
+3 2.5 3.5
0 −0.5 0.5
−5 −5.5 −4.5
+3 2.1 3.9
0 −0.9 0.9
−5 −5.9 −4.1
+3 2.1 3.9
0 −0.9 0.9
−5 −5.9 −4.1
0 −0.9 0.9
−5 −5.9 −4.1
0 −0.9 0.9
−5 −5.9 −4.1
Power Level Accuracy test table column headings: Frequency Setting, Power Level Setting (dBm), Power Meter Reading for −15 dBm (dBm), Spectrum Analyzer Marker (dB), Actual Power Level (dBm), Lower Limit (dBm), Upper Limit (dBm)
FM Accuracy test table column headings: Frequency (MHz), Deviation (kHz), Lower Limit (kHz), Measured (kHz), Upper Limit (kHz)
AM Accuracy test table column headings: Frequency (MHz), Depth (%), Lower Limit (%), Measured (%), Upper Limit (%)
3 Service Software
The ESG Family Support Software contains the program and supporting files necessary to
run the automated performance tests and adjustments for your signal generator. This
chapter lists the equipment required to run the software, and gives instructions for
installing, administering, using and un-installing the software. For a description of the
individual performance tests and adjustments, refer to Chapter 4, “Performance Tests,”
and Chapter 5, “Adjustments.”
Installing the ESG_B or ESG_APDP Service Software
Follow this procedure to install the ESG_B or ESG_APDP Service Software on your
personal computer.
1. Insert “Disk 1” into the disk drive.
2. To display the Run dialog box:
• For MS Windows 95 or Windows NT: Select the Start button, then select Run ... from
the pop-up menu.
• For MS Windows version 3.x: Open the Program Manager, then select Run ... from
the File drop-down menu.
3. From the Run dialog box, type a:setup and select the OK button.
The Setup window is displayed as it loads files for the installation. Once these files are
loaded, the ESG Service Software’s Welcome screen is displayed.
4. Continue with the setup by selecting the Next button.
5. The ESG Service Software’s Important Information window is displayed. It contains
information that is vital to installing and using the software. In addition, any new
information may also be shown on this screen. Stop and read this information before
continuing with the software installation!
6. Continue with the setup by selecting the Next button.
The Choose Destination Location window is displayed. The default location for
installing the software is C:\HP_SVC\ESG_B (or C:\HP_SVC\ESG_APDP). Use this
as the software’s destination folder.
7. Continue with the setup by selecting the Next button.
The ESG Service Software’s Select Program Folder window is displayed. This
installation procedure will install the service software icons into a program folder. You
can enter a new folder name in the Program Folders text field or select a folder from the
Existing Folders field. It is recommended to use “ESG_B Service Software” (or
“ESG_APDP Service Software”) for the folder name.
NOTE This documentation refers to the folder name as “ESG_B Service Software.” If
you use another name for the folder, be aware of this difference.
NOTE This is the last point that you can cancel the installation. If you select the
Next button, the installation proceeds until the software is completely
installed.
NOTE The MS Windows program must be restarted before you can use the software.
When you select restart, the computer reboots.
If you do not want to restart MS Windows at this time, select the No, I will restart my
computer later radio button. If you select restart, the computer reboots and you can
start using the software.
Software Configuration
Follow the instructions below to configure the software to run in either User Mode or
Administration Mode.
Start the software using the steps appropriate for the version of MS Windows that is
installed on your PC.
• For MS Windows version 3.x:
1. Open the Program Manager window.
2. Open the HP Service Support program group.
3. Open the HP Service Software icon.
• For MS Windows 95 or Windows NT:
1. Select Start.
2. Select HP Service Software for PC’s.
3. Select HP Service Software.
The service support software has two configurations: User and Administration. The
following sections explain the difference between the two configurations.
1. Refer to the illustration above and fill in the fields in the User Information window:
a. In the User Name field, type in the word Admin. (Case sensitive.)
b. In the Password field, type in the word Falcon. (Case sensitive.)
2. Click OK. (Cancel closes the software application.)
1. Make sure that ESG_B is selected in the Select an Instrument Family list.
2. In the Select Model list, select the instrument model of the DUT to be adjusted or
tested.
3. In the Serial Number box, enter the complete serial number of the DUT.
4. In the Address box, enter the two-digit GPIB address of the DUT. (To display the
address on the signal generator, press Utility > GPIB/RS-232. The GPIB Address softkey
shows the current address.)
5. Select the OK button.
6. The Select Test Equipment and Tests window appears on the display. Close this
window.
2. Refer to Figure 3-3. In the Test Equipment dialog box, select the Device Type for the
new equipment you are adding.
Figure 3-3 Adding Test Equipment Using the Test Equipment Menu
NOTE The power sensor must be assigned GPIB address −1 (negative 1).
Cal Due Date the projected calibration due date of the new equipment.
Trace Number the calibration tracking number. This is the last required item.
Calibration Type (optional) is used only when special calibration data is associated
with the device being added, and only as a function setup by the
factory for equipment requiring specific calibration data. The only
devices currently requiring this feature are power sensors.
Table 3-1 is an example of the calibration information required
to ensure accuracy for measurements using the power sensor.
This window is accessed for data entry by selecting the words
CAL DATA from the Calibration Type field in the Edit or New
Test Equipment window, Figure 3-4.
Frequency (MHz)   Calibration Factor (%)
0.1000            97.6
0.3000            98.9
1.0               99.1
3.0               99.4
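The calibration factors correct power readings at the measurement frequency; for frequencies that fall between table entries, a factor can be interpolated. The sketch below is only an illustration of linear interpolation over the example values of Table 3-1, not the algorithm used by the service software.

    # Example calibration-factor table from Table 3-1 (frequency in MHz, factor in %)
    CAL_FACTORS = [(0.1, 97.6), (0.3, 98.9), (1.0, 99.1), (3.0, 99.4)]

    def cal_factor(freq_mhz):
        """Linearly interpolate the calibration factor for a given frequency."""
        pts = sorted(CAL_FACTORS)
        if freq_mhz <= pts[0][0]:
            return pts[0][1]
        if freq_mhz >= pts[-1][0]:
            return pts[-1][1]
        for (f0, c0), (f1, c1) in zip(pts, pts[1:]):
            if f0 <= freq_mhz <= f1:
                return c0 + (c1 - c0) * (freq_mhz - f0) / (f1 - f0)

    print(cal_factor(0.5))  # a value between the 0.3 MHz and 1.0 MHz entries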
Figure 3-4 Adding the Equipment Information Using the New Test Equipment
Window
6. Click OK.
NOTE The serial number of the test equipment added will be displayed in the
Equipment field of the Test Equipment dialog box (Figure 3-3).
Removal of test equipment is accomplished using the Test Equipment dialog box. Refer to
Figure 3-5.
1. Select the Device Type of the test equipment to be removed.
Figure 3-5 Removing and Editing Test Equipment Using the Test Equipment
Window
2. Select the model of the test equipment to be removed from the Models field.
3. Select the serial number of the test equipment to be removed from the Equipment field.
4. Click Remove.
5. Click Close.
1. Log into the software normally until the Select Test Equipment and Tests window is
displayed.
2. Click the Cancel button to close the Select Test Equipment and Tests window.
3. Refer to Figure 3-6. In the File drop-down menu, select Test Equipment Drivers.
This selection allows for the addition or removal of software drivers for the test
equipment being used to verify the performance of the DUT.
4. To add a device driver to the existing list of test equipment drivers, click Add
(Figure 3-7).
Figure 3-7 Adding a Device Driver Using the Test Equipment Drivers Window
5. Refer to Figure 3-8. Using the standard file search procedure, select the driver that you
are adding and click OK.
Figure 3-8 Using the Open Dialog Box to Search for a Device Driver File to
Add
The selected driver should now be displayed in the Test Equipment Drivers dialog box,
as seen in Figure 3-9.
Figure 3-9 Removing a Device Driver Using the Test Equipment Drivers
Window
2. Ensure that the information displayed in the Version, Device Type, and Models
Supported fields reflects the correct information for the selected driver being removed.
3. Click Remove.
4. Click Close.
2. Refer to Figure 3-11. To add a test driver to the existing list of test drivers, click Add.
Figure 3-11 Adding a Test Driver Using the Test Drivers Window
3. Refer to Figure 3-12. Using the standard file search procedure, select the test driver
that you are adding and click OK.
Figure 3-12 Using the Open Dialog Box to Search for a Test Driver File to Add
The selected driver should now be displayed in the Test Drivers dialog box, as seen in
Figure 3-11.
4. Click Close (Figure 3-11).
Figure 3-13 Removing a Test Driver Using the Test Drivers Window
2. Ensure that the information displayed in the Version, Required Devices, and Tests
Supported fields reflects the correct information for the selected driver being removed.
3. Click Remove.
4. Click Close.
Adding Datapacks
Adding datapacks for test procedures is accomplished using the Datapacks dialog box.
1. Refer to Figure 3-14. In the File drop-down menu, select Datapacks.
2. Refer to Figure 3-15. To add a datapack to the existing list of datapacks, click Add.
3. Refer to Figure 3-16. Using the standard file search procedure, select the datapack that
you are adding and click OK.
Figure 3-16 Using the Open Dialog Box to Search for a Datapack File to Add
The selected driver should now be displayed in the Datapacks dialog box, as seen in
Figure 3-15.
4. Click Close (Figure 3-15).
Removing Datapacks
Removing datapacks is accomplished using the Datapacks dialog box. Refer to Figure 3-17.
1. Select the datapack (.000) file to be removed.
2. Click Remove.
3. Click Close.
1. Make sure that ESG_B is selected in the Select An Instrument Family list.
2. In the Select Model list, select the signal generator model of the DUT to be adjusted or
tested.
3. In the Serial Number box, enter the complete serial number of the DUT.
4. In the Address box, enter the two-digit GPIB address of the DUT. (To display the
address on the signal generator, press Utility > GPIB/RS-232. The GPIB Address softkey
shows the current address.)
5. Select the OK button.
1. Select either the Performance Tests radio button to display the list of automated
performance tests or the Adjustments radio button to display the list of automated
adjustments.
2. From the list of performance tests or adjustments, select the tests or adjustments that
will be performed on the DUT. Select a test name by clicking on it (the DCFM
FREQUENCY OFFSET test is used as the example in the sections that follow). The
selected test is highlighted.
Select all of the performance tests or adjustments by selecting the Select All button.
Unselect all of the selected tests or adjustments by choosing the Unselect button.
As each test or adjustment is selected, the equipment required to perform it is listed in
the Required Test Equipment box; as each test or adjustment is removed from the list,
its required equipment is removed as well. Only the test equipment for the highlighted
tests is displayed in the list.
3. Once the test is highlighted, identify all test equipment listed in the Required Test
Equipment box. To identify test equipment:
a. Select the type of device from the Device Type list.
b. Select the model number from the Model list for the device.
c. Select the device’s serial number from the Available Test Equipment list.
d. Select the Add button to add the device to the list in the Selected Test Equipment
box.
The following buttons are available for the Selected Test Equipment box:
Add Copies the test equipment highlighted in the Available Test
Equipment box to the Selected Test Equipment box.
Remove Removes the highlighted test equipment from the Selected Test
Equipment box.
Clear Removes all of the test equipment from the Selected Test Equipment
box.
NOTE If necessary, the test equipment GPIB address can be changed after it is
added to the Selected Test Equipment box. Change the GPIB address by
pressing the right arrow on the keyboard until the GPIB address selection in
the Selected Test Equipment box is selected. (The GPIB address is selected
when it has a dark box around the selection.) Then, type the new GPIB
address and press Enter to change the address.
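Independently of the service software, you can confirm that the DUT and each piece of test equipment actually respond at the GPIB addresses you have assigned. The sketch below assumes a PC with a VISA library and the PyVISA package installed; neither is supplied with the service software, and the address shown is only an example.

    import pyvisa

    GPIB_ADDRESS = 19  # the two-digit address shown under Utility > GPIB/RS-232 on the signal generator

    rm = pyvisa.ResourceManager()
    instrument = rm.open_resource(f"GPIB0::{GPIB_ADDRESS}::INSTR")
    print(instrument.query("*IDN?"))  # should return the manufacturer, model, and serial number
    instrument.close()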
5. If the appropriate tests or adjustments are listed in the Selected Tests box and the
appropriate test equipment is listed in the Selected Test Equipment box, select the OK
button.
The results file name suffix is .log. The results are saved automatically to the .log file.
You may select the directory into which you would like to save the file by changing the
drive and folder in this window. This file is a text file and can be viewed using many text
editors. Notepad, a standard accessory in MS Windows, is an example of a text editor
that can be used to view text files.
2. Once you have selected the drive, folder, and assigned a file name, select the OK button
to save these settings for when the tests are complete.
1. Select the Run button to start the automated tests or adjustments displayed in the
Selected Tests box.
The software steps through the tests or adjustments sequentially.
2. Follow the instructions displayed on the PC.
A description for each automated performance test or adjustment can be found in
Chapter 4, “Performance Tests,” and Chapter 5, “Adjustments.”
The Selected Tests box displays the name of the selected tests, the pass/fail status (P/F) of
each test that has been run, the total number of points that each test checks, and the
number of points that passed and failed for each test. The pass/fail status indicates a
failure if any point in that test fails.
The Selected Test Results box shows the results of the test that is highlighted in the
Selected Tests box. The Selected Test Results box shows the pass/fail status (P/F), the
lower limits (LL), the measured value (Result), the upper limits (UL), and the measured
units (for example, kHz, mV, or dBm) for each test point checked by the performance tests
and some adjustments. “**” indicates values not displayed by the adjustments.
The Current test box shows the results of the test that is currently running. The Current
test box shows the pass/fail status (P/F), the lower limits (LL), the measured value
(Result), the upper limits (UL), and the measured units (for example, kHz, mV, or dBm) for
each test point checked by the performance tests and some adjustments. “**” indicates
values not displayed by the adjustments.
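The pass/fail bookkeeping described above reduces to a simple rule: a point passes when its measured result lies between the lower and upper limits, and a test fails if any of its points fail. A minimal sketch of that logic:

    def point_passes(lower_limit, result, upper_limit):
        """A test point passes when the result is within [lower_limit, upper_limit]."""
        return lower_limit <= result <= upper_limit

    def test_passes(points):
        """A test fails if any point in the test fails; points are (LL, Result, UL) tuples."""
        return all(point_passes(ll, result, ul) for ll, result, ul in points)

    # Example: three measured points for one test
    print(test_passes([(-0.5, 0.12, 0.5), (-0.5, -0.31, 0.5), (-0.5, 0.49, 0.5)]))  # True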
The following buttons are also displayed on the HP Service Support Software window.
These buttons are used to control the testing. Only the buttons that are appropriate are
active. For example, if the test is already in progress, the Run button would not be active.
Run Start running the highlighted test when initially starting the testing. Also
used to start testing at the same point where the test was stopped.
Stop Stop the test that is currently running. The test stops after making the
next measurement.
Restart Rerun a test that was running when the testing was stopped. This restarts
the test from the beginning.
Next Test Quit running the current test and give it a Fail status. Then, continue
testing with the next test.
Rerun Restart the testing at the beginning of the first test.
Abort Quit testing. Abort all tests.
The log file is the file in which the test (or adjustment) results are stored. Select the Yes
button to print the test results using the printer connected to LPT1. Choosing the No
button allows you to exit the program without printing the test results.
There are two other methods of printing the test results.
The first method is selecting the Print Log File selection from the File drop-down menu.
The software asked you to define the computer path and file name when the tests were
performed. The default directory is the “log” subdirectory of the destination directory
where you installed the software. (C:\HP_SVC\ESG_B\ was the default destination
directory.)
The second alternative method of printing is opening the file in a text editor and printing
the file in the text editor.
4 Performance Tests
Unless stated otherwise, the procedures in this chapter enable you to test the electrical
performance of the signal generator to its specifications.
Calibration Cycle
This instrument requires periodic verification of performance. Under normal use and
environmental conditions, the instrument should be calibrated every two years. Normal
use is defined as about 2,000 hours of use per year.
Support Software
The ESG Family Support Software runs the performance tests and, where applicable,
generates reports of the test results. The following manual tests are not listed in
the software:
This chapter shows how to run the software to test and verify the performance of a signal
generator. Chapter 3, “Service Software,” explains how to set up the software for a
particular set of test instruments and interfaced components in a test environment.
6. Click OK.
7. In the Select Test Equipment and Tests window (Figure 4-3), create a list of test
equipment that will verify the signal generator’s performance:
a. In the Device Type list (item 6), select a device type that you want to add to the
equipment list.
b. In the Model list (item 7), select the model of the device type that you want to add to
the equipment list.
c. In the Available Test Equipment box (item 8), select the serial number of an
instrument. There may be more than one instrument available for any model. Make
sure that the instrument calibration due date has not passed. If the due date has
passed, use another available instrument. Also, alert your system administrator that
the instrument is due for calibration.
d. Click Add (item 9) to add the instrument to the test equipment list.
e. Continue adding test equipment to the list by repeating these steps.
NOTE You can correct mistakes while you are entering instruments in the test
equipment list:
• To delete a single entry in the equipment list, click on the unwanted
selection in the Selected Test Equipment box and then click Remove (item
11).
• To delete the entire list of equipment, click Clear (item 12).
8. Create a list of performance tests that you want the software to run (Figure 4-3):
a. In the Available Tests list (item 13), select a performance test.
b. Click Add (item 14) to add the test to the list of tests that the software will run.
You can view the Required Test Equipment list (item 15) for a performance test by
clicking on the test title in the Available Tests (item 13) box.
c. Continue adding performance tests to the list by repeating these steps.
NOTE You can correct mistakes while you are entering tests in the performance test
list:
• To delete a single entry in the test list, select the unwanted test in the
Selected Tests box (item 16) and then click Remove (item 17).
• To delete the entire list of tests, click Clear (item 18).
10. To save the test results to a file, make the following selections (Figure 4-4):
a. Click on the Drives window (item 20) and select the drive where you want to save the
test results.
b. Click on the Save file as type window (item 21) and select the file type for saving the
test results.
c. Click on the File name window (item 22) and type in a name for the test results file.
11. Click OK (item 23).
12. In the Main Test and Results window (Figure 4-5), verify that the selected DUT (item
24), serial number (item 24), and selected tests (item 25) are correct. Then click Run.
The selected performance tests are listed in the order in which they will run. You
cannot change this order.
During the testing sequence the window shows the following information:
• the model and serial number of the signal generator under test (item 24)
• the list of selected performance tests with the current test highlighted (item 25)
• the performance test currently running (item 26)
• the data points taken (item 27)
• the total number of data points that will be measured during the test (item 28)
• the number of points that have currently passed the test (item 29)
• the number of points that have currently failed the test (item 30)
• the lower limits (LL) (item 31) and upper limits (UL) (item 32) for each test point
• the measured results (item 33) and the pass/fail (P/F) (item 34) indication
You can scroll the results window, vertically and horizontally. You can also click on the
following buttons during the test sequence:
Run Start running the highlighted test when initially starting the testing.
Also used to start testing at the same point where the test was stopped.
Stop Stop the test that is currently running. The test stops after making the
next measurement.
Restart Rerun a test that was running when the testing was stopped. This
restarts the test from the beginning.
Next Test Quit running the current test and give it a Fail status. Then, continue
testing with the next test.
Rerun Restart the testing at the beginning of the first test.
Abort Quit testing. Abort all tests.
To begin a performance test sequence with a new DUT, click File and New Session. This
allows you to reselect the DUT, the equipment list, and the performance test list.
13. When the software has run all of the selected performance tests, the status bar
indicates that the tests are completed (item 35).
Recommended Equipment
• HP/Agilent 8902A Measuring Receiver Option 010
• HP/Agilent 8903B Audio Analyzer
Equipment Setup
Figure 4-6 Internal FM Accuracy and Distortion Tests Setup
Recommended Equipment
• HP/Agilent 8902A Measuring Receiver Option 010
• HP/Agilent 8903B Audio Analyzer
• HP/Agilent 8663A Signal Generator
• MD/MDC-174 Mixer (1 to 2800 MHz)
• MD/MDC-164 Mixer (0.5 to 9 GHz)
Equipment Setups
Figure 4-7 Accuracy and Distortion for AM ≤ 1300 MHz Test Setup
Figure 4-8 Accuracy and Distortion for AM > 1300 MHz and ≤ 2500 MHz Test Setup
Figure 4-9 Accuracy and Distortion for AM > 2500 MHz Test Setup
Recommended Equipment
• HP/Agilent 8902A Measuring Receiver Option 010
• HP/Agilent 8903B Audio Analyzer
Equipment Setup
Figure 4-10 Accuracy & Distortion for Phase Modulation Tests Setup
4. FM Frequency Response
This automated test verifies the FM frequency response specifications. The equipment
measures the variations in frequency deviations due to changes in the applied FM rate; dc
to 100 kHz. The variations are expressed relative to a reference signal; 1 kHz rate set at
100 kHz deviation, in dB. Each frequency is tested with this sequence:
1. A reference signal is measured.
2. The set deviation is measured for each applied rate change.
3. Each deviation measurement is compared to the reference signal deviation.
The comparison is a calculation of the difference in the deviations relative to the reference
rate. The relative value is the FM frequency response.
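The relative value in step 3 is expressed in dB against the 1 kHz, 100 kHz-deviation reference. A minimal sketch of that comparison, assuming the response is taken as 20·log10 of the deviation ratio (the exact formula used by the software is not stated here):

    import math

    def fm_response_db(measured_deviation_hz, reference_deviation_hz=100e3):
        """FM frequency response at one rate, relative to the 1 kHz-rate reference deviation."""
        return 20 * math.log10(measured_deviation_hz / reference_deviation_hz)

    # Hypothetical reading: deviation measured at a 100 kHz rate vs. the 1 kHz-rate reference
    print(f"{fm_response_db(99.2e3):+.2f} dB")  # about -0.07 dB

The AM frequency response test that follows makes the same comparison with modulation depth in place of deviation.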
Recommended Equipment
• HP/Agilent 8902A Measuring Receiver Option 010
• HP/Agilent 8904A Function Generator
Equipment Setup
Figure 4-11 FM Frequency Response Tests Setup
5. AM Frequency Response
This automated test verifies the AM frequency response specifications. The equipment
measures the variations in modulation depth due to changes in the applied AM rate; dc to
10 kHz. The response is relative to a 1 kHz rate set at the test depth, and expressed in dB.
Each frequency is tested with the following sequence:
1. A reference signal is measured.
2. The set depth is measured for each applied rate change.
3. Each depth measurement is compared to the reference signal depth.
The comparison is a calculation of the difference in the depths relative to the reference
rate. The relative value is the AM Frequency Response.
Recommended Equipment
• HP/Agilent 8902A Measuring Receiver Option 010
• HP/Agilent 8904A Function Generator
• HP/Agilent 8663A Signal Generator
• MD/MDC-164 Mixer (0.5 to 9 GHz)
Equipment Setups
Figure 4-12 Frequency Response for AM ≤ 1300 MHz Test Setup
Figure 4-13 Frequency Response for AM > 1300 MHz Test Setup
Recommended Equipment
• HP/Agilent 8902A Measuring Receiver Option 010
• HP/Agilent 8904A Function Generator
Equipment Setup
Figure 4-14 Phase Modulation Frequency Response Test Setup
Recommended Equipment
• HP/Agilent 53132A Frequency Counter Option 050
Equipment Setup
Figure 4-15 DCFM Frequency Offset Test Setup
Recommended Equipment
• HP/Agilent 8902A Measuring Receiver Option 010
• HP/Agilent 8903B Audio Analyzer Options 051, 010
• HP/Agilent 8663A Signal Generator
• MD/MDC-174 Mixer (1 to 2800 MHz)
• MD/MDC-164 Mixer (0.5 to 9 GHz)
Equipment Setups
Figure 4-16 Residual FM ≤ 2500 MHz Test Setup
Recommended Equipment
• HP/Agilent 8563E Spectrum Analyzer
• HP/Agilent 8491A/B Attenuator (10 dB) Option 010
Equipment Setup
Figure 4-18 Harmonic, Subharmonic, and Nonharmonic Spurious Signals Tests Setup
Recommended Equipment
• HP/Agilent 438A Power Meter
• HP/Agilent 8482A Power Sensor
• HP/Agilent 89441A Vector Signal Analyzer
• HP/Agilent 8563E Option 001 Signal Analyzer
• HP/Agilent 8491A/B Option 006 Attenuator (6 dB)
• HP/Agilent 8495G Programmable Step Attenuator (0 to 70 dB)
• HP/Agilent 11713A Step Attenuator Driver
Equipment Setups
Figure 4-21 Low-Power, Power Level Accuracy (≥ 10 MHz and ≤ 2 GHz) Setup
NOTE The internal timebase can be tested 10 minutes after AC power is reconnected.
For best accuracy, retest after the instrument has been on for 24 hours.
Frequency changes due to either a change in orientation with respect to the earth's
magnetic field or a change in altitude usually go away when the instrument is returned
to its original position. A frequency change due to mechanical shock usually appears as a
fixed frequency error.
Recommended Equipment
• HP/Agilent 54610B Digital Oscilloscope
• HP/Agilent 5071A Primary Frequency Standard
Equipment Setup
Figure 4-23 Timebase Aging Rate Test Setup
Procedure
1. Preset all instruments and let them warm up for at least one hour.
2. If the oscilloscope does not have a 50Ω input impedance, connect channel 1 through a
50Ω feedthrough.
3. On the oscilloscope, adjust the external triggering for a display of the 10 MHz REF
OUTPUT signal from the synthesizer.
a. On Channel 1 of the oscilloscope, set the following:
• Display: On
• Volts/Division: 500 mV
• Input Coupling: dc
• Input Impedance: 50 ohms (or use a 50 ohm feedthrough)
b. On Channel 2 of the oscilloscope, set the following:
• Display: Off
• Input Coupling: dc
• Input Impedance: 50 ohms (or use a 50 ohm feedthrough)
c. On the Timebase of the oscilloscope, set the following:
• Time/Division: 5 ns
d. On the Trigger of the oscilloscope, set the following:
• Trigger Source: CH 2
• Trigger Mode: Normal
• Trigger Level: 0 V
4. If the signal drifts a full cycle (360°) in less than 2 minutes, refer to Chapter 5,
“Adjustments,” and perform the “Internal Reference Oscillator Adjustment.” After the
adjustment, restart this performance test.
5. Watch the oscilloscope display and monitor the time. Notice the time required for a 360°
phase change and record this time as T1.
6. Wait 3 to 24 hours. Record the time that you waited as T2.
7. Repeat steps 1 through 6. Notice the time required for a 360° phase change and record
this time as T3.
8. Calculate the aging rate as follows:
Aging Rate = (1 cycle/10 MHz) (1/T1 − 1/T3) (24 hours/T2)
Example:
T1 = 351 seconds
T2 = 3 hours
T3 = 349 seconds
Aging Rate = (1 cycle/10 MHz)(1/351 s − 1/349 s)(24 h/3 h) = 1.306 × 10⁻¹¹ per day
9. Write the test results on the performance test record located at the end of this chapter.
Compare the results to the limits in the test record.
NOTE If the absolute frequency of the standard and the timebase oscillator are
extremely close, you can reduce the measurement time (T1 and T3) by
measuring the time required for a phase change of less than 360°. In the formula
in step 8, change 1 cycle to 0.5 cycle for 180° or 0.25 cycle for 90°.
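The arithmetic in step 8 is easy to script. The sketch below simply evaluates the aging-rate formula for the example values given above (T1 = 351 s, T2 = 3 h, T3 = 349 s); the cycles argument covers the 360°, 180°, and 90° cases mentioned in the note.

    def aging_rate_per_day(t1_s, t2_h, t3_s, ref_hz=10e6, cycles=1.0):
        """Timebase aging rate: (cycles/ref) x (1/T1 - 1/T3) x (24 h/T2), fractional change per day."""
        return abs((cycles / ref_hz) * (1 / t1_s - 1 / t3_s) * (24 / t2_h))

    print(f"{aging_rate_per_day(351, 3, 349):.3e} per day")  # 1.306e-11, matching the worked example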
Recommended Equipment
• HP/Agilent 438A Power Meter
• HP/Agilent 8482A Power Sensor
• HP/Agilent 33120A Option 001 Arbitrary Waveform Generator (2)
• HP/Agilent 89441A Options AY7, AY9, AYA, UFG or UTH Vector Signal Analyzer
Equipment Setup
Figure 4-24 Digital Modulation Level Accuracy Test Setup
This automated test verifies the RF modulation quality of the signal generator’s internal
I/Q modulation. A vector signal analyzer is connected to the signal generator’s RF output.
The internal baseband generator modulates the RF carrier in each of the available digital
modulation formats. The vector signal analyzer measures the appropriate error parameter
for the modulation generated (EVM for PHS, PDC and NADC formats; global phase error
for GSM format).
Recommended Equipment
• HP/Agilent 89441A Options AYA, AY9, UFG or UTH Vector Signal Analyzer
Equipment Setup
Figure 4-25 Internal Digital Modulation Quality Test Setup
Recommended Equipment
• MD/MDC-174 Mixer (0.001 to 2.8 GHz)
• HP/Agilent 8663A Signal Generator
• HP/Agilent 89441A Options AY9, AYA, AYH, UFG or UTH Vector Signal Analyzer
Equipment Setups
Figure 4-26 Custom I/Q RF Modulation Quality ≤ 2000 MHz Test Setup
Figure 4-27 Custom I/Q RF Modulation Quality > 2000 MHz Test Setup
Procedure
1. Connect the equipment as shown in Figure 4-26 on page 4-29.
2. Preset all of the equipment.
3. Follow the instructions as they appear on the controller’s display.
Recommended Equipment
• HP/Agilent 89441A Vector Signal Analyzer
Equipment Setup
Figure 4-28 I/Q Modulation Quality Test Setup (Options UN3, UN4, & UN8)
Procedure
1. Connect the equipment as shown in Figure 4-28.
2. Preset all of the equipment.
3. Follow the instructions as they appear on the controller’s display.
NOTE This test does not test the high performance pulse circuitry used in
Option 1E6 instruments. See “20. Pulse Rise/Fall Time (Option 1E6 only)” on
page 4-36.
Recommended Equipment
• HP/Agilent 8563E Spectrum Analyzer
• HP/Agilent 8491A/B Option 010 Attenuator (10 dB)
• HP/Agilent 33120A Arbitrary Waveform Generator
Equipment Setup
Figure 4-29 Pulse Modulation On/Off Ratio Test Setup
Recommended Equipment
• HP/Agilent 8563E Spectrum Analyzer
• HP/Agilent 8491A/B Option 010 Attenuator (10 dB)
• HP/Agilent 33120A Arbitrary Waveform Generator
Equipment Setup
Figure 4-30 Burst Modulation On/Off Ratio Test Setup
Recommended Equipment
• HP/Agilent 8563E Spectrum Analyzer
• HP/Agilent 8491A/B Option 010 Attenuator (10 dB)
Equipment Setup
Figure 4-31 Adjacent Channel Power Test Setup (Option UN5)
Recommended Equipment
• HP/Agilent 8563E Option 007 Spectrum Analyzer
• HP/Agilent 8491A/B Option 010 Attenuator (10 dB)
Equipment Setup
Figure 4-32 Alternate Timeslot Power Settling Time Test Setup (Option UNA)
Procedure
1. Connect the equipment as shown in Figure 4-32.
2. Preset all of the equipment.
3. Follow the instructions as they appear on the controller’s display.
Recommended Equipment
• HP/Agilent 54750A Digitizing Oscilloscope
• HP/Agilent 54751A or 54752A Plug-in
• HP/Agilent 33120A Function Generator
• HP/Agilent 8491A/B Opt 010 Attenuator (10 dB)
• HP/Agilent 8491A/B Opt 020 Attenuator (20 dB)
Equipment Setup
Figure 4-33 Pulse Rise/Fall Time Setup (Option 1E6)
Figure 4-34
Figure 4-35
Figure 4-36
To get an accurate idea of the actual phase noise of the DUT, it may be necessary to
combine the two plots, always using the lowest result of both (see Figure 4-37). Keep in
mind that the actual phase noise results are probably better than what is displayed.
The results shown on the E5500 system may be worse than the actual DUT. If a tested
device has the same phase noise characteristics as the system’s down-converting source,
the displayed result will be 3 dB worse than either device. To eliminate this error, measure
three different sources. This results in three equations and three unknowns, and the
software can then sort out the actual results for each device. This 3-source substitution
method generates a correction table that can be used for future measurements, effectively
eliminating the combined error (of up to 3 dB) for subsequent measurements. For best
results, use this method. Refer to the E5500 documentation for instructions on how to
implement a 3-source substitution. When the plots in this section were taken, this method
was not used. If it had been used, portions of the plots could have been up to 3 dB better.
Another way of stating the above: for any phase noise measurement where a 3-source
substitution is not performed, the actual phase noise of one of the two sources involved
(DUT or down-converting source) is at least 3 dB better than shown.
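The "three equations and three unknowns" solution works because each pairwise measurement is, to a good approximation, the power sum of the two sources' phase noise. A minimal sketch of that separation at a single offset frequency, assuming the pair measurements are available in dBc/Hz; this illustrates the principle only and is not the E5500 correction-table procedure.

    import math

    def dbc_to_lin(x_dbc):
        return 10 ** (x_dbc / 10)

    def lin_to_dbc(x_lin):
        return 10 * math.log10(x_lin)

    def separate_three_sources(l_ab, l_ac, l_bc):
        """Individual phase noise of sources A, B, and C from the three pair measurements
        (all values in dBc/Hz at one offset), assuming each pair reading is the power sum
        of the two sources involved."""
        p_ab, p_ac, p_bc = map(dbc_to_lin, (l_ab, l_ac, l_bc))
        p_a = (p_ab + p_ac - p_bc) / 2
        p_b = (p_ab + p_bc - p_ac) / 2
        p_c = (p_ac + p_bc - p_ab) / 2
        return tuple(lin_to_dbc(p) for p in (p_a, p_b, p_c))

    # Hypothetical pair measurements: two equal sources read 3 dB worse than either one alone
    print(separate_three_sources(-120.0, -120.0, -120.0))  # each source is about -123 dBc/Hz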
Figure 4-37
Measuring Residual FM
Residual FM is closely related to phase noise. Good phase noise typically implies good
residual FM. Directly measuring residual FM at very low levels is difficult, but the
HP/Agilent E5500 system can integrate the phase noise results to determine an accurate
value. This is done under the E5500 software Trace Integration menu by selecting a data
type of Snu(f) or Spectral density of frequency fluctuations.
In the ESG-P series, residual FM is defined only within a 300-3000 Hz bandwidth and
with a specified frequency response within this band. That frequency response is specified
by a CCITT filter characteristic. In the E5500 system, software revision 5.0 or higher, use
of this filter response can be achieved by simply checking the CCITT weighting box under
the Trace Integration menu.
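Numerically, the integration amounts to converting L(f) to the spectral density of frequency fluctuations, commonly approximated as Snu(f) = 2·f²·L(f) with L(f) in linear units, integrating over the 300 Hz to 3000 Hz band, and taking the square root. The sketch below is a plain trapezoid integration without the CCITT weighting that the specification calls for, so it illustrates the idea rather than reproducing the E5500 trace integration.

    import math

    def residual_fm_hz(offsets_hz, l_dbc_hz, f_lo=300.0, f_hi=3000.0):
        """RMS residual FM from SSB phase noise L(f), trapezoid-integrated over [f_lo, f_hi]."""
        pts = [(f, 2 * f ** 2 * 10 ** (l / 10))       # Snu(f) = 2 * f^2 * L(f), L in linear units
               for f, l in zip(offsets_hz, l_dbc_hz) if f_lo <= f <= f_hi]
        integral = sum(0.5 * (s0 + s1) * (f1 - f0)
                       for (f0, s0), (f1, s1) in zip(pts, pts[1:]))
        return math.sqrt(integral)

    # Hypothetical flat -120 dBc/Hz phase noise across the band, sampled every 100 Hz
    freqs = [300 + 100 * i for i in range(28)]
    print(f"{residual_fm_hz(freqs, [-120.0] * len(freqs)):.3f} Hz rms")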
If you use the older HP/Agilent 3048 phase noise system, there is no direct way to measure
residual FM, but because residual FM and phase noise are so closely related and
interdependent, it is safe to assume that if the DUT’s phase noise meets specification,
looks typical (between 300 and 3000 Hz), and shows no unexpected large spikes in the
response, then it will most likely also meet the residual FM specification.
NOTE This is not a performance test. This check is only provided to ensure that the
dual arbitrary waveform generator is operational. The results are verified
visually on the oscilloscope, and there is no data that is automatically logged
or that can be printed out.
Recommended Equipment
• An oscilloscope with 2 input channels
Equipment Setup
Figure 4-38 Dual Arbitrary Waveform Generator Setup (Option UND)
Procedure
1. Connect the test equipment as shown. The oscilloscope is not connected to GPIB; this
allows for the use of most general 2-channel oscilloscopes.
2. Set the oscilloscope to display both Channel 1 and Channel 2. Set the oscilloscope to
trigger on Channel 1. The recommended settings are:
• Vertical Scale: 500 mV/div
• Timebase: 5 microseconds/div
3. Verify that Channel 1 and Channel 2 are both displaying triangular waveforms
approximately 180 degrees out of phase and that they do not have any discontinuities.
Refer to the illustration below.
Recommended Equipment
No equipment is required.
Equipment Setup
Figure 4-39 GSM Loopback BER Check Setup (Option 300)
Recommended Equipment
No equipment is required.
Recommended Equipment
No equipment is required.
Performance Test Records

Test Equipment Used   Model Number   Trace Number   Cal Due Date

Test Description   Specification   Results   Measurement Uncertainty
Table 4-3 ESG-AP and ESG-DP Series Signal Generators Phase Noise and Residual FM Performance Test Record

Frequency (GHz)   Specification (dBc/Hz)   Measurement   Pass/Fail?
1.0   <1
2.0   <2
3.0   <3
4.0   <4
5 Adjustments
This chapter contains the adjustment procedures that may be required for the signal
generator.
Adjustment Relationships
Anytime an adjustment is made to the signal generator, other related adjustments may be
affected. For optimal performance, whenever an adjustment is performed, the related
adjustments should also be performed.
Procedure
Figure 5-1 Internal Reference Oscillator Adjustment Setup
Description
This test is used to calibrate the gain of the ABUS. The ABUS is connected to the ground
node (ACOM) and the ADC is zeroed. The ABUS is then connected to the 10 V reference
and measured. The result of the measured value divided by the ideal value is the ABUS
gain calibration constant. This value is then saved in the signal generator’s firmware.
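In other words, the gain constant is the ratio of the zero-corrected 10 V reference reading to its ideal value. A minimal sketch of that arithmetic, with illustrative variable names rather than the firmware's:

    def abus_gain_constant(acom_reading_v, ref_reading_v, ideal_ref_v=10.0):
        """ABUS gain calibration constant: the measured 10 V reference, zeroed against ACOM,
        divided by the ideal value."""
        return (ref_reading_v - acom_reading_v) / ideal_ref_v

    # Hypothetical ADC readings: 2 mV residual offset at ACOM, 10.013 V at the reference node
    print(f"{abus_gain_constant(0.002, 10.013):.4f}")  # about 1.0011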
Procedure
Figure 5-2 Analog Bus ADC Calibration Setup
Description
This adjustment determines the YO offset and gain calibration constants that minimize
the YO phase lock error voltage. The phase lock error voltage is measured with the
internal analog bus and is minimized at both low and high YO frequencies by controlling
the YO pretune DAC. The YO pretune DAC settings are used to calculate the YO offset and
gain calibration constants.
Procedure
Figure 5-3 Pretune Calibration Setup
Description
This test is used to calibrate the internal source amplitude versus frequency. The values
for offset and gain are set to their default values in the internal source calibration arrays.
Next, the offset calibration factor is determined by connecting the DVM to the ABUS and
measuring the dc offset of the motherboard common ground ABUS node and the offset of
the DSP ABUS node with the DSP set to 0 Vdc. The calibration factor is the difference
between these two measurements. The scaling factors are determined by setting the DSP
to output a full-scale sinewave with the DVM connected to the front panel LF OUTPUT
port. Measurements are then made in 1 kHz steps and the calibration factors are
calculated to achieve 2 Vpeak on the motherboard by accounting for the nominal gain
presented by the reference board. Upon completion, the calibration factors are stored in
the signal generator’s firmware.
Procedure
Figure 5-4 Internal Source Calibration Setup
Description
This test sets the VCO bias potentiometer at a level that will keep the VCO in a stable
operating region over the entire frequency and temperature range. First, the F/2 and the
lock angle potentiometers are set fully CW (clock-wise). The signal generator is set to
750 MHz and the potentiometer is adjusted until the F/2 oscillations disappear. The
voltage at the SYNTH_F2 ABUS node is measured and then the potentiometer is adjusted
for a 0.77 V to 0.80 V drop.
After you have performed this adjustment, you must perform the Lock Angle
Potentiometer Adjustment on the following page.
Procedure
Figure 5-5 VCO Bias Potentiometer Adjustment Setup
Description
This test is used to optimize the phase detector sampling of the synthesizer phase-locked
loop reference frequencies. The lock angle adjustment sets the time during the reference
cycle when the ultra-quiet time phase detector measurement occurs. The phase detector
needs to make its measurement at the quietest point in the reference cycle. By adjusting
this potentiometer to minimize the level of the fractional-N spur, the time of the phase
detector sample can be optimized.
Procedure
Figure 5-6 Lock Angle Potentiometer Adjustment Setup
Description
This test determines the tuning sensitivity of the synthesizer loop. To measure the
sensitivity, the tuning voltage is measured as the frequency is stepped from 500 to
1000 MHz in 10 MHz steps. At each incremental frequency the tuning voltage is measured
(Vtune1) and again at the incremental frequency +300 kHz (Vtune2). The sensitivity is
then calculated in units of MHz/V and stored in the signal generator’s firmware.
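At each step the sensitivity follows from the 300 kHz frequency increment and the two tuning-voltage readings. A minimal sketch, assuming KV is simply the frequency step divided by the tuning-voltage change (the variable names are illustrative):

    def kv_mhz_per_volt(vtune1_v, vtune2_v, delta_f_mhz=0.3):
        """Synthesizer tuning sensitivity: the 300 kHz step divided by the tuning-voltage change."""
        return delta_f_mhz / (vtune2_v - vtune1_v)

    # Hypothetical readings at one of the 10 MHz steps between 500 and 1000 MHz
    print(f"{kv_mhz_per_volt(4.120, 4.135):.1f} MHz/V")  # 0.3 MHz over 15 mV, about 20 MHz/V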
Procedure
Figure 5-7 KV versus Frequency Calibration Setup
Description
This test calibrates the AM path to remove any offset generated when LIN AM, LIN
BURST, or LOG BURST are enabled. This test determines the ALC_REF_DAC delta value
which is used to correct the offset when the modulation is enabled. This value is then
stored in the appropriate calibration constant.
Procedure
Figure 5-8 AM Audio Path Offset Calibration Setup
Description
This test ensures that the signal generator has warmed up sufficiently and then adjusts
the coarse and fine reference timebase DACs for minimum internal reference frequency
error. The coarse and fine DAC calibration factors are then stored in the signal generator’s
firmware.
Procedure
Figure 5-9 Timebase DAC Calibration Setup
Description
This test is used to remove the offset associated with the FM SCALE DAC operational
amplifier located on the reference board. This calibration results in a DAC value for FM
OFFSET DAC 2. After this DAC value has been properly adjusted, the effects of the FM
SCALE DAC value on the offset will be minimized.
Procedure
Figure 5-10 FM Scale DAC Offset Calibration Setup
Description
This test is used to remove the offsets associated with the various FM1 and FM2 audio
paths on the reference board. When FM is enabled, voltage offsets on the reference and
synthesizer boards appear as frequency shifts on the synthesizer VCO. By using a
frequency counter to measure the frequency of the VCO, the voltage offsets can be
quantified. The voltage offsets are adjusted and stored in the signal generator’s firmware.
Procedure
Figure 5-11 FM Path Offset Calibration Setup
Description
This test is used to remove the offset associated with the FM IN-BAND DAC located on the
synthesizer board. The calibration determines the DAC value for the FM IN-BAND
OFFSET DAC on the synthesizer board which will remove the offset.
Procedure
Figure 5-12 FM In-Band DAC Offset Calibration
Description
This test is used to remove the offset associated with the differential inverting amplifier on
the FM input of the synthesizer board. The calibration determines the DAC value for the
FM OFFSET DAC 1 on the reference board which will remove the offset associated with
the amplifier.
Procedure
1. Preset the signal generator.
2. Follow the instructions as they appear on the controller’s display.
Description
This test equalizes the gain between the FM1 and FM2 paths. The gain of the FM2 path is
adjusted using the FM SCALE DAC and the resulting DAC value is stored in the signal
generator’s firmware. This calibration only affects source-independent gains. When
uncalibrated sources feed into the paths, the gains are adjusted using the Modulation
Source Relative Gain Calibration.
Procedure
Figure 5-13 FM 1/2 Path Ratio Gain Calibration Setup
Modulation Source
Relative Gain Compression Calibration
Description
This test provides a scaling factor for all of the multiplexed FM modulation inputs. The
scaling factor is used by the signal generator’s firmware to scale the actual requested FM
deviation from the synthesizer board when the corresponding input is selected. Three
scaling factors (EXT1, EXT2, and INT1) are generated during this calibration. The
resulting values are used to calculate the calibration values that are then stored in the
signal generator’s firmware.
Procedure
Figure 5-14 Modulation Source Relative Gain Compression Calibration Setup
Description
This test adjusts the FM out-of-band deviation to match the in-band FM deviation. It also
determines the attenuation values of the out-of-band attenuators and sets the values of
some other FM constants. The loop bandwidth of the synthesizer phase-locked loop is
approximately 5 kHz. This calibration is used to set the FM deviation at rates above the
loop bandwidth (out-of-band) so they will equal the deviation rates within the loop
bandwidths (in-band).
Procedure
Figure 5-15 FM Out-of-Band Calibration Setup
Description
This test adjusts the match between the delay of the signal passing through the FM
in-band path to the delay of the signal passing through the FM out-of-band path. This
calibration is the final adjustment required to achieve good FM performance over a wide
range of different rates. The FM delay potentiometer forms part of an RC network on the
input of the FM out-of-band circuit. The adjustment is used to flatten the delay of the FM
response from low rates to high rates. The delay affects the deviation accuracy. By
monitoring the deviation, the delay can be adjusted.
Procedure
Figure 5-16 FM Delay Potentiometer Adjustment Setup
Description
This test is used to calibrate the phase modulation circuitry on the synthesizer board. The
phase modulation has two operational modes: normal and wide bandwidth. This
calibration determines the attenuation correction factors for maximum deviation for each
of the phase modulation out-of-band ranges for normal and wide bandwidth modes and
in-band and out-of-band operation. This calibration MUST be performed at the same
frequency as the FM out-of-band calibration (1 GHz). The resulting correction factors are
then stored in the signal generator’s firmware.
Procedure
Figure 5-17 Wide Bandwidth Phase Modulation Calibration Setup
Description
This adjustment calibrates the FM/PM out-of-band paths on the Fractional-N and YO
driver modules. The internal ABUS is connected to the phase lock loop integrator node on
each board, and the attenuators and DAC are adjusted to minimize the voltage. This
ensures that the in-band and out-of-band paths are matched. The “FM In-Band DAC
Offset Calibration” on page 5-15 must be performed prior to this adjustment.
Procedure
Figure 5-18 FM/PM Out-of-Band Calibration Setup
Description
This adjustment calibrates the FM/PM YIG oscillator frequency compensation latches on
the YO driver assembly. These latches adjust the flatness of the YO FM/PM paths for rates
greater than 100 kHz. The “FM In-Band DAC Offset Calibration” on page 5-15 and the
“FM/PM Out-of-Band Calibration (ESG-AP and ESG-DP only)” on page 5-22 must be
performed prior to this adjustment.
Procedure
Figure 5-19 FM/PM YO Freq. Compensation Setup
DCFM Calibration
Description
This test removes all of the dc offsets associated with the FM path while in DCFM mode.
This test uses only the FM1 path to verify the functionality of the circuitry. The resulting
values are stored in the signal generator’s firmware.
Procedure
Figure 5-20 DCFM Calibration Setup
Description
The LF output provides a calibrated audio frequency signal. This test is used to set the
full-scale amplitude of the LF-OUT DAC on the reference board. When the calibration is
complete, the voltage on the LF output should be equal to the input voltage, plus or minus
1 mV. The resulting DAC values are then stored in the signal generator's firmware.
Procedure
Figure 5-21 Low Frequency (LF) Output Calibration Setup
Description
This test is used to calibrate the positive trip level of the EXT1 and EXT2 peak detectors
located on the reference board. The calibration generates DAC values for the modulation
comparator DACs which provide a voltage to the window comparator operational
amplifiers. When the peak detector level DACs are set correctly, the overmodulation
indicator will trigger whenever the voltage to the EXT1/EXT2 input is greater than 1.03 V.
The resulting DAC values are saved as a calibration constant in the signal generator’s
firmware.
Procedure
Figure 5-22 External Input Peak Detector Calibration Setup
Description
This test is used to adjust the bias modulator circuitry to provide an accurate logarithmic
drop in power level for a linear input voltage. When properly adjusted, a one-volt signal on
the input will result in a 10 dB drop in power level. The adjustment involves the
adjustment of three DACs (BURST BIAS, BURST GAIN, and BURST OFFSET) at several
different frequencies. The BURST OFFSET DAC sets the initial current level through the
burst modulator diode. The BURST GAIN DAC is used to calibrate the input voltage level
to the burst driver circuit. The BURST BIAS DAC sets the breakpoint at which the
modulator switches from a logarithmic to a linear transfer function. The results are stored
in the calibration arrays associated with each DAC.
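When the burst modulator is properly adjusted, the logarithmic region of its transfer function drops the output power by 10 dB for each volt of drive. The sketch below only tabulates that ideal relationship as a sanity check for the adjustment; it says nothing about where the BURST BIAS breakpoint actually falls.

    def expected_log_burst_drop_db(v_in):
        """Ideal logarithmic burst transfer: a 1 V input gives a 10 dB drop, i.e. 10 dB per volt."""
        return 10.0 * v_in

    for v in (0.0, 0.5, 1.0):
        print(f"{v:.1f} V -> {expected_log_burst_drop_db(v):.1f} dB drop")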
Procedure
Figure 5-23 Burst Modulator Calibration Setup
Description
This test adjusts the scaling of the linear burst audio path until a -0.99 Vdc input to the
EXT1 connector results in a 40 dB drop in power relative to 0.00 Vdc input when linear
burst is activated. The AM 1 DAC value is changed until this result is achieved and its
value is stored as a calibration array.
Procedure
Figure 5-24 Burst Audio Path Gain Calibration Setup
Description
This test calibrates the PRE LEVEL REF DAC on the output board. This DAC is used to
control the RF power level that is incident upon the marble I/Q modulator by setting the
control point for the prelevel loop. The prelevel loop detector is on the output of the marble
I/Q modulator, its drive circuitry on the output board, and its RF modulator on the
synthesizer board. This calibration ensures that there will be sufficient power available at
the marble I/Q modulator to perform the desired modulation while minimizing
intermodulation distortion when the digital modulation is activated. The signal generator
is set to a frequency and the auxiliary output amplitude level is adjusted using the PRE
LEVEL DAC to within the specified tolerance of the desired value. This process is repeated
for all the frequencies in the marble I/Q modulator index array. The results are then stored
in the signal generator’s firmware.
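The frequency loop can be pictured as follows. This sketch is illustrative only; the callables are hypothetical stand-ins for the service software's instrument control, and the DAC range, starting point, and tolerance are assumptions.

def calibrate_prelevel(frequencies_hz, set_frequency, set_prelevel_dac,
                       read_aux_level_dbm, target_dbm, tol_db=0.1):
    """For each frequency in the marble I/Q modulator index array, step the
    PRE LEVEL DAC until the auxiliary output level is within tolerance of the
    desired value."""
    cal_array = {}
    for freq in frequencies_hz:
        set_frequency(freq)
        dac = 2048                                  # assumed mid-scale starting point
        set_prelevel_dac(dac)
        for _ in range(200):                        # bounded search
            error_db = read_aux_level_dbm() - target_dbm
            if abs(error_db) <= tol_db:
                break
            dac += -1 if error_db > 0 else 1        # step direction is illustrative
            set_prelevel_dac(dac)
        cal_array[freq] = dac                       # the real software stores this in firmware
    return cal_array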
Procedure
Figure 5-25 Prelevel Calibration Setup
Prelevel Calibration
Description
This adjustment determines and stores calibration data for the prelevel reference DAC on
the output module. This DAC maintains a consistent RF power level over the full
frequency range between the synthesizer module and the output module. Measurements
are performed at the RF OUTPUT connector. DAC calibration is determined by making
measurements at a high power level, reducing the power level by 10 dB with the ALC DAC, and
then setting the prelevel reference DAC so that the RF output is +10 dBm.
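As a rough picture of that sequence, consider the sketch below. It is illustrative only; the callables are hypothetical stand-ins for the power meter and signal generator control, and the DAC range is an assumption.

def calibrate_prelevel_reference(read_power_dbm, step_alc_dac_down_10db,
                                 set_prelevel_ref_dac, target_dbm=10.0,
                                 dac_codes=range(0, 4096, 8)):
    """Drop the level 10 dB via the ALC DAC, then find the prelevel reference
    DAC code that brings the RF OUTPUT closest to +10 dBm."""
    step_alc_dac_down_10db()                       # reduce the level 10 dB with the ALC DAC
    best_code, best_err = None, float("inf")
    for code in dac_codes:
        set_prelevel_ref_dac(code)
        err = abs(read_power_dbm() - target_dbm)   # power-meter reading at the RF OUTPUT
        if err < best_err:
            best_code, best_err = code, err
    return best_code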
Procedure
Figure 5-26 Prelevel Calibration Setup
Gain Adjust Calibration
Description
This test calibrates the GAIN ADJUST DAC on the output board. This DAC is used to
control the RF power level that is incident upon the switched filters and subsequent RF
amplifiers by setting the control point for the gain adjust modulator. This calibration
ensures that there will be sufficient power available at the RF amplifiers to perform the
desired modulation while minimizing intermodulation distortion when the digital
modulation is activated. After adjustment, the results are stored in the signal generator’s
firmware.
Procedure
Figure 5-27 Gain Adjust Calibration Setup
LNF Gain Adjust Calibration
Description
This adjustment calibrates the gain adjust DAC on the output module for low noise floor
(LNF) mode. The instrument is set up with attenuation applied by way of the burst
modulator. This ensures that the RF chain is not in compression. The gain adjust DAC is
then set for a specific RF output power, as measured with a power meter.
Procedure
Figure 5-28 LNF Gain Adjust Setup
ALC Modulation Driver Bias Calibration
Description
This test adjusts the ALC MOD DRV BIAS DAC on the output board. This DAC is used to
control the bias current to the ALC modulator driver. It is primarily used to accommodate
unmatched VBE values in the drivers, but it has a strong influence on modulator gain. This
test adjusts the DAC until the ALC modulator gain is balanced around its nominal design
center. The results are stored as a calibration array in the signal generator’s firmware.
Procedure
1. Preset the signal generator.
2. Follow the instructions as they appear on the controller’s display.
VBLO Mixer Bias Calibration
Description
This test calibrates the VBLO MIXER BIAS DAC on the output board. This DAC is used to
control the bias voltage to the internal marble mixers. It is primarily used to
adjust the mixer bias for optimum I/Q modulation linearity, but it has a secondary
influence on mixer gain. This calibration ensures that there will be sufficient power
available through the marble mixers to perform the desired modulation while minimizing
intermodulation distortion when the digital modulation is activated. After adjustment, the
results are stored in the signal generator’s firmware.
Procedure
1. Preset the signal generator.
2. Follow the instructions as they appear on the controller’s display.
Description
This test adjusts both power flatness and power level accuracy. First, the power flatness
and accuracy calibration constants are initialized to zero. Then power flatness is measured
with a power meter and corrected, with the internal attenuator set to 0 dB. The power
meter then measures the RF output over the +13 to –15 dBm dynamic range, and
calibration constants are set to optimize power level accuracy over this range.
For power levels below −15 dBm, a vector signal analyzer is used to make relative power
measurements. A low noise amplifier (LNA) and step attenuator are connected in series to
control the absolute power level input to the signal analyzer. For power level settings
≥ −60 dBm, the step attenuator is set to approximately cancel the gain of the LNA. Below
−60 dBm, the step attenuator is set to 0 dB, which keeps the signal level well above the
signal analyzer’s noise floor.
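The attenuator choice can be summarized as a simple rule. The function below is only a sketch of that rule; the 10 dB step granularity and the lna_gain_db parameter are assumptions, not values taken from this procedure.

def step_attenuator_setting_db(power_setting_dbm, lna_gain_db):
    """Choose the external step-attenuator value described above: at or above
    -60 dBm the attenuator approximately cancels the LNA gain; below -60 dBm it
    is set to 0 dB so the signal stays well above the analyzer noise floor."""
    if power_setting_dbm >= -60.0:
        return round(lna_gain_db / 10.0) * 10      # nearest 10 dB step (assumed step size)
    return 0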
Relative measurements are accomplished by setting the DUT amplitude. For example:
The DUT is set to a frequency and the internal step attenuator is set to 0 dB. The signal
analyzer does a peak search, sets the reference level, and sets the marker to delta mode
(0 dB). The DUT internal attenuator is set to 20 dB, and the signal analyzer makes
another measurement. The signal analyzer relative measurement should be −20 dB. Any
variation from this value is used to set a calibration constant for the 20 dB attenuator
setting at that frequency.
To determine calibration values for internal step attenuator settings greater than 60 dB, the
signal analyzer makes the 0 dB marker delta measurement with the internal step
attenuator set to 60 dB, and all lower-level measurements are made relative to this
reference. The previously measured error for the 60 dB setting is then accounted for in the
measurement.
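The arithmetic behind each calibration constant is straightforward. The sketch below illustrates it with an assumed sign convention; it is not the service software's actual computation or storage format.

def attenuator_cal_constant(nominal_step_db, marker_delta_db, reference_error_db=0.0):
    """Calibration constant for one internal-attenuator setting, following the
    relative-measurement scheme described above.  marker_delta_db is the signal
    analyzer's marker-delta reading (e.g. -20.3 dB for a nominal 20 dB step);
    reference_error_db is the previously measured error of the 60 dB setting,
    applied when the measurement is made relative to that setting."""
    measured_step_db = -marker_delta_db            # the delta reading is negative going down
    return measured_step_db - nominal_step_db + reference_error_db

# 20 dB setting measured as -20.3 dB relative to 0 dB: constant = +0.3 dB
print(attenuator_cal_constant(20, -20.3))
# 80 dB setting measured as -19.8 dB relative to the 60 dB reference, whose own
# error was +0.3 dB: constant = (19.8 - 20) + 0.3 = +0.1 dB
print(attenuator_cal_constant(20, -19.8, reference_error_db=0.3))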
For frequencies above 2 GHz, an HP/Agilent 8563E signal analyzer is used to downconvert
the frequency to 321.4 MHz.
• Low Frequency Low Noise Amplifier (LNA)
• High Frequency Low Noise Amplifier (LNA)
Procedure
1. Connect the equipment as shown below.
2. Preset all of the equipment.
3. Follow the instructions as they appear on the controller’s display.
Figure 5-31 Low-Power, Power Level Accuracy (≥ 10 MHz and ≤ 2 GHz) Setup
Level Meter Calibration
Description
This test determines the level meter gain and offset calibration constants on the output
board. The level meter function is used when the ALC loop is open. It allows the output
power of the signal generator to be monitored and controlled without using the ALC loop.
After the level meter gain and offsets are measured, a calibration constant is calculated
and stored in the appropriate calibration array.
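One way to picture the gain and offset computation is as a two-point linear fit. The sketch below is illustrative only; the actual service software may use a different model or more measurement points, and the example readings are invented.

def level_meter_gain_offset(reading_1, power_1_dbm, reading_2, power_2_dbm):
    """Derive gain and offset for a linear level-meter model
    power_dbm = gain * reading + offset from two measurement points."""
    gain = (power_2_dbm - power_1_dbm) / (reading_2 - reading_1)
    offset = power_1_dbm - gain * reading_1
    return gain, offset

# Example (invented readings): 0.210 corresponds to -10 dBm and 0.430 to 0 dBm
print(level_meter_gain_offset(0.210, -10.0, 0.430, 0.0))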
Procedure
1. Preset the signal generator.
2. Follow the instructions as they appear on the controller’s display.
ALC Modulation Flatness Calibration
Description
This test calibrates the ALC REF DAC corrections necessary to maintain power flatness in
the ALC open-loop mode of operation. The nominal ALC REF DAC settings are adjusted to
re-align the ALC open-loop power with the desired level. The resulting values are stored as
calibration arrays in the signal generator’s firmware.
Procedure
Figure 5-33 ALC Modulation Flatness Calibration Setup
AM Gain Calibration
Description
This test adjusts the gain of the AM circuitry to provide a 10 dB drop for a 1 V input signal.
First a power level is set and a one-volt signal is connected to the input. AM is enabled and
the AM DACs are adjusted for an exact 10 dB drop. The resulting DAC value is stored as a
calibration array.
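The adjustment reduces to a feedback loop that trims the AM DAC until the measured drop equals 10 dB. The sketch below is illustrative only; set_am_dac and measure_drop_db are hypothetical hooks, and the step direction, tolerance, and starting code are assumptions.

def adjust_am_gain_dac(set_am_dac, measure_drop_db, start_code=2048,
                       target_db=10.0, tol_db=0.02, max_iter=100):
    """Iteratively trim the AM DAC until a 1 V input produces a 10 dB drop
    as measured with the power meter."""
    code = start_code
    for _ in range(max_iter):
        set_am_dac(code)
        err = measure_drop_db() - target_db        # positive: drop is too large
        if abs(err) <= tol_db:
            return code                            # stored as the calibration array value
        code += -1 if err > 0 else 1               # step direction is illustrative
    return code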
Procedure
Figure 5-34 AM Gain Calibration Setup
I/Q Gain/Offset/Quadrature Calibration
Description
This test determines the required I/Q gain, offset, and quadrature calibration
constants/arrays that will minimize the I/Q modulation imperfections on the output board.
Several calibration constants are determined and then the I/Q gain, offset, and quadrature
DACs are adjusted over frequency to minimize the static vector modulation errors. The
results are stored in the signal generator as calibration arrays.
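The minimization can be pictured as a coordinate-descent search over the adjustment DACs. The sketch below is illustrative only; set_dacs and measure_error are hypothetical stand-ins, the number of DACs shown is arbitrary, and the real adjustment repeats the search over frequency.

def minimize_iq_errors(set_dacs, measure_error, start=(2048, 2048, 2048),
                       step=8, passes=4):
    """Tweak the gain, offset, and quadrature DACs one at a time and keep any
    change that reduces the measured static vector modulation error."""
    dacs = list(start)
    set_dacs(*dacs)
    best = measure_error()
    for _ in range(passes):
        for i in range(len(dacs)):
            for delta in (+step, -step):
                trial = dacs.copy()
                trial[i] += delta
                set_dacs(*trial)
                err = measure_error()
                if err < best:
                    dacs, best = trial, err
        set_dacs(*dacs)                             # restore the best settings found so far
    return dacs, best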
Procedure
1. Preset the signal generator.
2. Follow the instructions as they appear on the controller’s display.
I/Q Impairment Adjustment
Description
This adjustment sets internal calibration array values to minimize errors in the I/Q
adjustments (I/Q gain, I offset, Q offset, and quadrature skew) in the I/Q menu.
Procedure
1. Preset the signal generator.
2. Follow the instructions as they appear on the controller’s display.
Baseband Generator I/Q Gain and Offset Adjustment
Description
This adjustment uses a DVM to set the baseband generator’s four potentiometers (I Gain, I
Offset, Q Gain, and Q Offset) to minimize the dc offset and set the AC voltage level to
0.5 Vpk (into 50 ohms) at the rear-panel I and Q output connectors.
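If the level is verified with an rms-reading DVM and the calibration waveform is sinusoidal (an assumption; the procedure only specifies 0.5 Vpk into 50 ohms), the equivalent rms target works out as shown below.

import math

TARGET_VPK = 0.5          # required peak level into 50 ohms at the I and Q outputs

def target_vrms_for_sine(v_pk=TARGET_VPK):
    """Equivalent rms value of a 0.5 Vpk sine wave: Vpk / sqrt(2)."""
    return v_pk / math.sqrt(2)

print(round(target_vrms_for_sine(), 4))   # approximately 0.3536 Vrms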
Procedure
Figure 5-35 Baseband Generator I/Q Gain and Offset Adjustment Setup
Figure 5-36 Baseband Generator I/Q Gain and Offset Adjustment Location
UNITED STATES
Instrument Support Center
Agilent Technologies
(800) 403-0801
Great Britain
Agilent Technologies
Eskdale Road, Winnersh Triangle
Wokingham, Berkshire RG41 5DZ
England
(44 118) 9696622
Safety Notes
The following safety notes are used throughout this manual. Familiarize yourself with
each of the notes and its meaning before operating this instrument.
Instrument Markings
The following markings and caution and warning labels are used on the instrument. Be
sure to observe all cautions and warnings.
Statement of Compliance
This product has been designed and tested in accordance with IEC Publication 1010,
Safety Requirements for Electronic Measuring Apparatus, and has been supplied in a safe
condition. The instruction documentation contains information and warnings which must
be followed by the user to ensure safe operation and to maintain the product in a safe
condition.
Electrostatic Discharge
Electrostatic discharge (ESD) can damage or destroy electronic components. Therefore, all
work performed on assemblies consisting of electronic components should be done at a
static-free work station. Figure 7-1 shows an example of a static-safe work station using
two kinds of ESD protection:
• conductive table mat and wrist-strap combination
• conductive floor mat and heel-strap combination
These methods may be used together or separately.
Warranty
This Agilent Technologies instrument product is warranted against defects in material
and workmanship for a period of three years from date of shipment. During the warranty
period, Agilent Technologies will, at its option, either repair or replace products which
prove to be defective.
For warranty service or repair, this product must be returned to a service facility
designated by Agilent Technologies. Buyer shall prepay shipping charges to Agilent
Technologies and Agilent Technologies shall pay shipping charges to return the product to
Buyer. However, Buyer shall pay all shipping charges, duties, and taxes for products
returned to Agilent Technologies from another country.
Agilent Technologies warrants that its software and firmware designated by Agilent
Technologies for use with an instrument will execute its programming instructions when
properly installed on that instrument. Agilent Technologies does not warrant that the
operation of the instrument, or software, or firmware will be uninterrupted or error-free.
LIMITATION OF WARRANTY
The foregoing warranty shall not apply to defects resulting from improper or inadequate
maintenance by Buyer, Buyer-supplied software or interfacing, unauthorized modification
or misuse, operation outside of the environmental specifications for the product, or
improper site preparation or maintenance.
NO OTHER WARRANTY IS EXPRESSED OR IMPLIED. AGILENT TECHNOLOGIES
SPECIFICALLY DISCLAIMS THE IMPLIED WARRANTIES OF MERCHANTABILITY
AND FITNESS FOR A PARTICULAR PURPOSE.
EXCLUSIVE REMEDIES
THE REMEDIES PROVIDED HEREIN ARE BUYER’S SOLE AND EXCLUSIVE
REMEDIES. AGILENT TECHNOLOGIES SHALL NOT BE LIABLE FOR ANY DIRECT,
INDIRECT, SPECIAL, INCIDENTAL, OR CONSEQUENTIAL DAMAGES, WHETHER
BASED ON CONTRACT, TORT, OR ANY OTHER LEGAL THEORY.
Assistance
Product maintenance agreements and other customer assistance agreements are available
for Agilent Technologies products. For any assistance, contact your nearest Agilent
Technologies sales and service office. (See “Sales and Service Offices” on page 6-3.)
Certification
Agilent Technologies certifies that this product met its published specifications at the time
of shipment from the factory. Agilent Technologies further certifies that its calibration
measurements are traceable to the United States National Institute of Standards and
Technology, to the extent allowed by the Institute’s calibration facility, and to the
calibration facilities of other International Standards Organization members.
Index

A
AC symbol, 7-3
add/remove programs, 3-5
adding datapacks, 3-17
adjustments, 5-1
  ALC modulation driver bias, 5-33
  ALC modulation flatness, 5-39
  AM audio path offset, 5-11
  AM gain, 5-40
  analog bus ADC, 5-5
  baseband generator I/Q gain and offset, 5-43
  burst audio path gain, 5-28
  burst modulator, 5-27
  DCFM, 5-24
  external input peak detector, 5-26
  FM 1/2 path ratio gain, 5-17
  FM delay potentiometer, 5-20
  FM in-band DAC offset, 5-15
  FM inverting amplifier offset, 5-16
  FM out-of-band, 5-19
  FM path offset, 5-14
  FM scale DAC offset, 5-13
  FM/PM OB cal and delay potentiometer, 5-22
  FM/PM YO frequency compensation calibration, 5-23
  gain adjust, 5-31
  I/Q gain/offset/quadrature, 5-41
  I/Q impairment, 5-42
  internal reference oscillator, 5-4
  internal source, 5-7
  KV versus frequency, 5-10
  level meter, 5-38
  LNF gain adjust, 5-32
  lock angle potentiometer, 5-9
  low frequency output, 5-25
  modulation source relative gain compression, 5-18
  power level accuracy, 5-35
  prelevel, 5-29, 5-30
  pretune calibration, 5-6
  relationships between adjustments, 5-2
  timebase DAC, 5-12
  VBLO mixer bias, 5-34
  VCO bias potentiometer, 5-8
  wide bandwidth phase modulation, 5-21
administration configuration
  service software, 3-7
administration of service software, 3-6
ALC
  calibrations
    modulation driver bias, 5-33
    modulation flatness, 5-39
    power level accuracy (4-point fit), 5-35
alternate timeslot power settling time performance test, 4-35
AM
  accuracy, 4-10
AM accuracy check, 2-8
AM audio path offset calibration, 5-11
AM frequency response performance test, 4-14
AM gain calibration, 5-40
analog bus ADC calibration, 5-5
analyzer
  audio, 1-2
  spectrum, 1-2
  vector signal, 1-2
arbitrary waveform generator, 1-3
assistance, 7-8
attenuator, 1-3
audio analyzer, 1-2
automatic leveling control calibration, 5-35

B
baseband generator I/Q gain and offset adjustment, 5-43
burst audio path gain calibration, 5-28
burst modulation on/off ratio performance test, 4-33
burst modulator calibration, 5-27

C
calibration cycle, 4-2
calibration due date, 3-9
calibration tracking number, 3-9
calibration type, 3-9
Canadian Standards Association, 7-3
carrier frequency offset, 4-17
caution sign, 7-2
CDMA adjacent channel power performance test, 4-34
CE mark, 7-3
certification, 7-9
  specifications, 7-9
choose destination location screen, 3-3
compliance
  statement of, 7-5
configuring the software, 3-6
counter
  universal, 1-3
CSA mark, 7-3
custom I/Q RF modulation quality performance test, 4-29

D
datapacks
  adding, 3-17
internal AM accuracy and distortion test, 4-10
internal digital modulation quality performance test, 4-28
internal reference oscillator adjustment, 5-4
internal source calibration, 5-7
ISM 1-A symbol, 7-3

K
KV versus frequency calibration, 5-10

L
level meter calibration, 5-38
list, 4-5
lock angle potentiometer adjustment, 5-9
low frequency output calibration, 5-25
low noise floor (LNF) gain adjust, 5-32

M
main test and results window, 4-7
maintenance agreements, 7-8
maintenance and service, 6-1
maintenance procedures
  cleaning the cabinet, 6-2
  cleaning the display, 6-2
measuring receiver, 1-2
meter
  power, 1-3
mixer, 1-3
modulation, 4-2
  burst on/off ratio performance test, 4-33
  internal digital quality performance test, 4-28
modulation source relative gain compression calibration, 5-18
multimeter, digital, 1-3

N
new test equipment window, 3-9
nonharmonic spurious signals, 4-20

O
operation verification, 2-1
option 1E6
  pulse rise/fall time test, 4-36
oscilloscope
  digital, 1-2
  digitizing, 1-2
  plug-in, 1-2
output power level check, 2-4

P
password
  service software, 3-20
performance tests
  alternate timeslot power settling time, 4-35
  AM frequency response, 4-14
  burst modulation on/off ratio, 4-33
  CDMA adjacent channel power, 4-34
  custom I/Q RF modulation quality, 4-29
  DCFM frequency offset relative to CW, 4-17
  digital modulation level accuracy, 4-27
  dual arbitrary waveform generator check, 4-42
  FM frequency response, 4-13
  Frac-N/Divider assembly check, 4-45
  GSM loopback BER check, 4-44
  harmonic, subharmonic, and nonharmonic spurious signals, 4-20
  I/Q modulation quality, 4-31
  internal AM accuracy and distortion, 4-10
  internal digital modulation quality, 4-28
  internal FM accuracy and distortion, 4-9
  list, 4-5
  measuring phase noise and residual FM, 4-37
  modulation source frequency accuracy, 4-2
  phase modulation accuracy and distortion, 4-12
  phase modulation frequency response, 4-16
  power level accuracy, 4-21
  pulse modulation on/off ratio, 4-32
  pulse rise/fall time (option 1E6), 4-36
  records, 4-2, 4-46, 4-47
  residual FM, 4-18
  sampler/YO driver check, 4-45
  timebase aging rate, 4-2, 4-24
phase modulation accuracy and distortion performance test, 4-12
phase modulation frequency response performance test, 4-16
plug-in, oscilloscope, 1-2
power
  meter, 1-3
  sensor, 1-3
power flatness calibration, 5-35
power level accuracy performance test, 4-21
power on symbol, 7-3
prelevel calibration, 5-29, 5-30
pretune calibration, 5-6
preventive maintenance procedures, 6-2
printer, 1-3
printing test results, 4-8
U
uninstalling the software, 3-5
universal counter, 1-3
user configuration
  service software, 3-7
user information window, 4-3
user name
  service software, 3-20

V
VBLO mixer bias calibration, 5-34
VCO bias potentiometer adjustment, 5-8
vector signal analyzer, 1-2
verification procedure tables, 2-10
verification procedures, 2-2
  test tables/records, 2-10
verifying operation, 2-1

W
warning sign, 7-2
warranty, 7-7
waveform generator, 1-3
welcome screen, 3-3
wide bandwidth phase modulation calibration, 5-21
window
  DUT, 4-4
  main test and results, 4-7
  save as, 4-6
  select test equipment and tests, 4-5
  user information, 4-3

Y
YO
  sampler driver check performance test, 4-45