Abstract
The Silent Cerebral Infarct Multicenter Transfusion (SIT) Trial is a multi-institutional intervention trial in which children with silent cerebral infarcts are randomized to receive either blood transfusion therapy or observation (standard care) for 36 months. The SIT Trial is scheduled to enroll approximately 1,880 children with sickle cell disease from 29 clinical sites in the United States, Canada, the United Kingdom, and France. Each child undergoes a screening magnetic resonance imaging (MRI) examination of the brain to detect the presence of silent cerebral infarct-like lesions, a pre-randomization (baseline) MRI, and an exit MRI to determine whether there are new or enlarged cerebral infarcts, all acquired using a designated, prospective imaging protocol. The objective of this manuscript is to describe the innovative method used to process and adjudicate imaging studies for an international trial with a primary endpoint that includes neuroimaging. Investigators at each site were provided with computer hardware and software for transmission of MRI images, allowing them to strip the scans of all personal information and add unique study identifiers. Three neuroradiologists at separate academic centers review the MRI studies and determine the presence or absence of silent cerebral infarct-like lesions. Their findings are subsequently recorded on web-based case report forms and sent to the Statistical Coordinating Center. The average time from imaging center receipt of the MRI study to the radiology committee report back to the local site is less than two working days. This novel strategy was designed to maximize efficiency and minimize the cost of a complex, large multicenter trial that depends heavily on neuroimaging for entry criteria and assessment of the primary outcome measures. The technology, process, and expertise used in the SIT Trial can be adapted to virtually any clinical research trial with digital imaging requirements.
Key words: Clinical trial imaging, PHI, case report forms, central review
Background
Sickle Cell Disease
Sickle cell disease is a recessive genetic disorder caused by a point mutation that results in the substitution of valine for glutamic acid at the sixth position in the beta chain of hemoglobin. Sickle cell disease affects one of every 400 African American newborns and approximately 70,000 persons in the United States.1 Vaso-occlusion is the major cause of morbidity and mortality in sickle cell disease. Complications of sickle cell disease can occur in many organs, but cerebral ischemic injury is one of the most devastating, leading to life-threatening and disabling sequelae.
Overt and Silent Cerebral Infarcts
Overt strokes in children with sickle cell disease usually involve the recognition of rapidly evolving neurological deficits, leading to immediate treatment. Historically, overt strokes have occurred in approximately 11% of the children with sickle cell disease before their 14th birthday.2 While new screening methods using transcranial Doppler measurements may reduce the prevalence of overt strokes,3,4 they remain common in this population. Silent cerebral infarcts are defined by an area of abnormal magnetic resonance imaging (MRI) signal intensity on fluid attenuated inversion recovery (FLAIR) T2-weighted images of the brain, in a child with no prior history or physical findings of a focal neurological deficit, as ascertained by physical exam.5 Silent cerebral infarcts are seen in approximately 22% of children with sickle cell disease who do not have evidence of overt stroke.6 Despite their name, silent cerebral infarcts are not truly “silent,” but are associated with a variety of neurological morbidities; these include an increase in risk for overt stroke,6 poor academic performance,7 and lower I.Q. in comparison to children with sickle cell disease who do not have silent cerebral infarction8 or their siblings who do not have sickle cell disease.9 Individuals with silent cerebral infarctions are also at risk for the development of new or progressive brain lesions (silent cerebral infarcts) on MRI.10
The Silent Cerebral Infarct Transfusion Trial
There is currently no systematic strategy to identify or treat children with silent cerebral infarction to prevent the significant neurological morbidity and risk of further neurological compromise associated with this finding. The Silent Cerebral Infarct Transfusion (SIT) Trial is a cooperative study performed by 28 pediatric study centers in collaboration with a clinical coordinating center, a statistical and data coordinating center, and staff from the National Institute of Neurological Disorders and Stroke (NINDS), Rockville, Maryland. This work is supported by funding from NINDS grant U01 NS042804 (ClinicalTrials.gov). In the SIT Trial, children with silent cerebral infarcts are randomized to receive either blood transfusion therapy or observation (standard care) for 36 months. The primary hypothesis is that monthly prophylactic blood transfusion therapy in children with silent cerebral infarcts will result in an 86% reduction in the proportion of children with clinically evident strokes or new or progressive silent cerebral infarcts.
Neuroimaging Requirements of the Trial
All patients in the SIT Trial undergo a first screening MRI (MRI-1), a second, expanded (pre-randomization) MRI examination of the brain immediately prior to randomization, and a third MRI at study exit. The first MRI establishes the presence of a potential silent infarction (a silent infarct-like lesion). The trial requires that a single neuroradiologist review this MRI within two working days of receipt of the images to assess for unsuspected, clinically significant findings that might require immediate medical attention. A second MRI was included to establish a true baseline for the MRI of the brain prior to randomization, because new lesions may develop in the interval between the first and second MRI. Additionally, unscheduled MRI examinations are required for suspected neurologic events occurring during the course of the SIT Trial. The study procedures must also accommodate an imaging panel of three neuroradiologists, located in separate cities, who evaluate the images both for eligibility screening and for outcome analysis. The SIT Trial therefore poses numerous challenges for the acquisition, processing, dissemination, and adjudication of neurologic images, all of which must be completed in a timely fashion for the trial to operate smoothly and effectively.
The objective of this manuscript is to describe the innovative method used to process and adjudicate imaging studies for an international trial with a primary endpoint that includes neuroimaging.
Materials and Methods
SIT Trial Screening and Randomization
The SIT Trial is being conducted over an eight-and-a-half-year period in two stages consisting of enrollment and an active clinical trial period. During the first stage, 1,880 eligible children will receive screening MRIs. In addition to the first screening MRI (MRI-1), a second, expanded (pre-randomization) MRI examination of the brain is performed immediately prior to randomization to establish a true baseline, given that the interval between the first and second MRI scans, which may be as long as 6 months, could allow new lesions to develop that were not present on the original scan. A third MRI is performed at study exit for the outcome measurement. Additionally, during the course of the trial, suspected neurological events will result in unscheduled MRI examinations.
An imaging panel of three neuroradiologists, located in three academic centers and cities, performs the evaluation of images for both eligibility screening and analysis of imaging from enrolled subjects, including children with a suspected or confirmed event. A neuroradiologist is available to read MRIs within two working days to allow for the rapid detection of unexpected, clinically significant findings. In the event that a clinically significant finding is identified, a redundant notification system has been established to contact the site investigator.
The operational definition of an infarct-like lesion is an MRI signal abnormality at least 3 mm in one direction and visible on two views on FLAIR T2-weighted images. The presence or absence of an infarct-like lesion is determined by a consensus of two of the three study neuroradiologists. Any lesion must be independently determined to be silent by two pediatric neurologists who are part of the Neurology Committee. A cerebral lesion is defined as a silent cerebral infarct if the child has a normal neurological examination or an abnormality on neurological examination that cannot be explained by the location of the infarct-like MRI lesion.
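Because the lesion criterion is a simple rule, it can be illustrated in a few lines of code. The following is a minimal sketch only, with hypothetical field names; the SIT Trial records these judgments on web-based case report forms rather than in software.

```python
# Hypothetical sketch of the infarct-like lesion rule described above;
# the SIT Trial itself records these judgments on web-based CRFs.
from dataclasses import dataclass

@dataclass
class LesionRead:
    max_dimension_mm: float   # greatest measured dimension on FLAIR T2
    seen_on_two_views: bool   # lesion visible on two imaging planes

def is_infarct_like(read: LesionRead) -> bool:
    """Signal abnormality at least 3 mm in one direction, visible on two views."""
    return read.max_dimension_mm >= 3.0 and read.seen_on_two_views
```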
Image Collection and Transmission
Collecting and viewing multisequence, multiplanar MRI studies of the brain from a large subject population is a challenge. This challenge is increased by the requirements in clinical research studies where multireader consensus interpretation and database audit trails are mandated. The Imaging Center for the SIT Trial, the Electronic Radiology Laboratory (ERL) of the Mallinckrodt Institute of Radiology at Washington University in St. Louis, was charged with implementing an efficient and effective process to collect images from participating sites worldwide, present the MRI studies to the panel of neuroradiologists for interpretation and report the findings to both the Neurology Committee and the statistical coordinating center.
Early in the design phase of the SIT Trial, the acquisition from sites and distribution to the radiology panel of hard copy film was determined to be too costly and too slow. An electronic distribution and reporting system was the obvious solution. However, the challenge was to design a process that took advantage of available technology that could be economically adapted for multi-institutional research use. In addition, because the MRI studies serve as a key element in screening potential subjects for the SIT Trial, not only was a thorough interpretation required, but the findings needed to be disseminated quickly in order to transition the eligible participants into the randomization phase of the trial.
Figure 1 displays the technology and workflow implemented by the ERL to support the SIT Trial. Initially, the ERL provided each of the 28 participant enrolling sites (United States, Canada, United Kingdom and France) with two software programs to transmit the MRI images electronically to the imaging center at Washington University in St. Louis. The first program was the commercially available VPN Client (Cisco Systems, San Jose, CA, USA). This software creates a secure virtual private network between a sending and receiving site which allows the images to be sent encrypted over the internet.
The second software program was the Clinical Studies Workstation (CSW) program. CSW is a Windows (Microsoft, Redmond, WA, USA) program that was previously developed by ERL to transmit images for clinical trials. This software allows site personnel to receive DICOM studies directly from a local modality/PACS or a CD-ROM. CSW also provides a graphical user interface with which site personnel can remove Protected Health Information (PHI) and replace it with unique study identifiers, and then transmit the study to a DICOM receiver over the internet using a secure VPN connection. CSW was chosen for the SIT Trial over other competing products because it is easier to support remotely and has an ideal balance between the functionality required for the project and the technical abilities of the users at the clinical sites. In addition, CSW is a well-tested software package that incorporates a de-identification algorithm that has been successfully applied in two previous major clinical trials11–13 and is conformant to the DICOM committee recommendation for de-identification of DICOM objects.14 Figure 2 is an example of the CSW program showing how the PHI is replaced with the participant identifiers.
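CSW itself is proprietary to the ERL, but the kind of attribute-level de-identification described in Supplement 55 can be sketched with the open-source pydicom library. The tag list and identifier scheme below are illustrative assumptions, not the CSW's actual algorithm.

```python
# Illustrative sketch only: removes a few common PHI attributes and re-keys
# the study with trial identifiers, in the spirit of DICOM Supplement 55.
# The tag list and identifier scheme are hypothetical, not the CSW's.
import pydicom

PHI_TAGS = [
    (0x0010, 0x0010),  # Patient's Name
    (0x0010, 0x0020),  # Patient ID
    (0x0010, 0x0030),  # Patient's Birth Date
    (0x0008, 0x0090),  # Referring Physician's Name
    (0x0008, 0x0080),  # Institution Name
]

def deidentify(path_in, path_out, study_id, participant_code):
    ds = pydicom.dcmread(path_in)
    for tag in PHI_TAGS:
        if tag in ds:
            del ds[tag]
    # Re-key the study with the trial's own identifiers
    ds.PatientName = participant_code   # e.g. an assigned participant code
    ds.PatientID = study_id             # unique study identifier
    ds.remove_private_tags()            # drop vendor private tags
    ds.save_as(path_out)
```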
The enrollment sites were provided with a laptop computer with the VPN Client and Clinical Studies Workstation software preinstalled. The laptop provides the portability required at many sites, where study personnel collect MRI images from an MRI scanner or PACS on the hospital network but later transmit the images while attached to a university or research network. Some sites had sufficient personal computer resources and required only the installation of the VPN Client and CSW programs.
Site personnel were trained on both the VPN Client and CSW programs. The sites use a variety of personnel to transmit the images: MRI technologists, PACS administrators, Radiology Department IT support, or study coordinators (clinical nurses). An operations manual provides detailed instructions for transmitting MRI studies and troubleshooting software and network problems. Because of frequent turnover in site personnel, a web-based conference call training session was also developed to orient new personnel.
The transmission and receipt of MRI studies is electronic. Nevertheless, several safeguards ensure studies are not misplaced. First, the enrollment sites are required to fax to the Imaging Center a worksheet that provides the subject’s study identifiers and the details on the series and number of images transmitted. This provides the Imaging Center personnel with a “shipping” document to check that all the images are received. Second, the Imaging Center notifies the site via email whenever a study has been received. This closes the loop with the site and ensures that the transmission process was successful.
Study Check-in
The DICOM images transmitted by the clinical sites are received at ERL by a DICOM storage service running on the Data Check-in Server (PowerEdge 2950, Dell Computer, Round Rock, TX, USA). Several times a day, the ERL data analysts assigned to the SIT Trial manually run a UNIX script that generates a list of new studies that have been received. The ERL staff also runs this script whenever an Image Tracking Worksheet has been received on the fax machine. If images have been received without an accompanying Image Tracking Worksheet, the ERL staff will contact the originating clinical site. The site is also contacted if an Image Tracking Worksheet has been received without accompanying images. Steps are taken to quickly resolve any transmission failures.
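The ERL's actual UNIX script is not published; as a minimal sketch of the same idea, assuming received DICOM files are written under a hypothetical receive directory, the "list new studies" step might look like the following.

```python
# Minimal sketch of a "list new studies" scan; the directory path and
# output format are assumptions, not the ERL's actual script.
import os
from collections import defaultdict
import pydicom

RECEIVE_DIR = "/data/sit/incoming"   # hypothetical receive directory

def list_new_studies(receive_dir=RECEIVE_DIR):
    studies = defaultdict(int)   # (PatientID, StudyInstanceUID) -> image count
    for root, _dirs, files in os.walk(receive_dir):
        for name in files:
            try:
                ds = pydicom.dcmread(os.path.join(root, name),
                                     stop_before_pixels=True)
            except Exception:
                continue  # skip non-DICOM files
            studies[(ds.PatientID, ds.StudyInstanceUID)] += 1
    for (patient_id, study_uid), n_images in sorted(studies.items()):
        print(f"{patient_id}  {study_uid}  {n_images} images")

if __name__ == "__main__":
    list_new_studies()
```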
The imaging center data analysts use a web-based system to register the participant in the database and to upload studies to the image viewing system. An automated telephone response registration system, managed by the Data Coordinating Center, is used by the clinical sites to register participants for the SIT Trial. After the required registration information has been provided by the site study coordinator via telephone, the automated telephone response system automatically faxes the assigned participant code and identification number to the imaging center. The assigned participant code and identification numbers are then entered into the imaging database through the web interface. Next, the registration form is compared to the Image Tracking Worksheet and the transmitted image files. If the identifiers match, the data analysts execute an upload command and a check-in screen is presented that displays the details of a study. Figure 3 is an example of a check-in screen. This check-in screen is used both to process the study before it is entered into the image database and to upload it to the image viewing system. The check-in process begins by obtaining a unique accession number, which is used to identify the study in both the database and the image viewing system. Next, an exam type is selected, which later determines what series of case report forms (CRFs) are generated. Then the study is assigned to the “on-call” radiologist, who reviews each study in less than 24 h to assess for any routine or urgent incidental findings.
The table of series parameters in Figure 3 serves two functions. First, it allows the data analysts to conduct a QA review to ensure that the number of images received matches the number indicated on the faxed Image Tracking Worksheet and to ensure that the pulse sequences submitted meet the imaging protocol requirements. Second, the data analysts select a common Protocol Code description and appropriate image plane for each series. This information is collected in the image database and helps facilitate data mining for ancillary studies.
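As a hypothetical illustration of these two checks, counts against the faxed worksheet and sequences against the protocol, the following sketch uses assumed series names and a simple dictionary comparison; it is not the ERL's check-in software.

```python
# Hypothetical sketch of the two QA checks: image counts against the faxed
# Image Tracking Worksheet, and submitted pulse sequences against the protocol.
REQUIRED_SEQUENCES = {"AX FLAIR", "AX T2", "AX T1", "SAG T1"}  # assumed protocol codes

def qa_study(received_series, worksheet_counts):
    """received_series: {series_description: image_count} parsed from the DICOM study.
    worksheet_counts:  {series_description: image_count} transcribed from the fax."""
    problems = []
    for series, expected in worksheet_counts.items():
        got = received_series.get(series, 0)
        if got != expected:
            problems.append(f"{series}: worksheet lists {expected} images, received {got}")
    missing = REQUIRED_SEQUENCES - set(received_series)
    if missing:
        problems.append("protocol sequences missing: " + ", ".join(sorted(missing)))
    return problems  # an empty list means the study passes these checks
```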
The workflow embedded in the check-in program automatically triggers the next step of the process, the automated upload of the study to the image viewing system. The QA process on the image viewing system ensures the participant’s study can be located by patient identifier or accession number. This process also ensures that previous MRI studies for this participant are presented together. The MRI study images are then checked to make sure all PHI has been removed. Next, the number of series and images is checked to ensure the upload was successful. The image quality assurance steps evaluate the MRI study for adherence to the imaging protocol, confirm that the images cover the entire brain, and determine if obscuring artifacts are present. If all QA checks are passed, the study is ready for viewing by the radiology panel.
The last part of the check-in process consists of email notifications. After a study or group of studies has been processed, the neuroradiologists are notified that new studies have been added to their web-based work list. This eliminates the need for the neuroradiologists to disrupt their daily activities and connect to the web site just to see if there is new work. This also allows the imaging center to provide any special instructions to the radiologists. In addition, the clinical site is notified that their study has been received and processing has begun. This step provides reassurance to site personnel that the transmission process is complete and establishes a time frame for when they should expect to receive the results.
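The notification step is ordinary email; a minimal sketch using Python's standard library is shown below. The addresses, subject line, and message wording are placeholders, not the ERL's actual notifications.

```python
# Hypothetical sketch of the notification step; addresses and wording are
# illustrative placeholders, not the ERL's actual messages.
import smtplib
from email.message import EmailMessage

def notify(recipients, subject, body, smtp_host="localhost"):
    msg = EmailMessage()
    msg["From"] = "sit-imaging@example.edu"   # placeholder sender
    msg["To"] = ", ".join(recipients)
    msg["Subject"] = subject
    msg.set_content(body)
    with smtplib.SMTP(smtp_host) as server:
        server.send_message(msg)

# e.g. after check-in of a study with accession number 12345:
# notify(radiologist_list, "SIT Trial: new study on your worklist",
#        "Accession 12345 has been uploaded to the viewing system and a CRF "
#        "has been added to your worklist.")
```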
An important task for an imaging core is to maintain a log of studies received and where they are in the process. Figure 4 is an example of a Microsoft Excel worksheet that the ERL uses to track the progress of each study as it works its way through the QA and image review process. This simple but comprehensive report provides study managers with up-to-date status on each study received and provides the ability to generate overall or site specific volume reports.
Image Viewing
The image viewing requirements of the SIT Trial dictate a system that allows the individual neuroradiologists to view the studies and make precise measurements, allows the radiology panel to access each member’s findings to help arrive at a consensus position, and allows the study neurologists and other personnel to view the images and the neuroradiologists’ findings. The system must also support simultaneous viewing of each study by the neuroradiologists, with the ability to review studies from different time points for individual participants to determine possible change in infarct-like lesions. Because the SIT Trial does not use a Radiology Information System to enter participant information, the image viewing system must accept studies without the usual patient registration and processing. The product chosen for the image viewing system was Philips Corporation iSite PACS (Philips Healthcare Informatics, Foster City, CA, USA). The iSite/Radiology client module provides the radiology panel with a full-featured PACS environment to interpret the studies. The iSite/Enterprise module allows the Neurology Committee easy access to the studies over the Internet. The iSite system administrator’s module allows ERL personnel to serve as PACS administrators with minimal training. Philips iSite has proven to be well suited to the clinical research environment with its lesion measurement tools and limited subject demographics. Figure 5 is an example of an iSite screen showing the measurement of an infarct-like lesion.
Image Viewing Workstation Calibration
The capabilities of web-based server and browser technologies are well suited to the task of delivering images to geographically dispersed reading panel members. An important element in reducing variability between the readers is standardized viewing monitors calibrated to achieve the same results in different settings. In addition, in order to provide consistent diagnostic interpretation over time, it is important to conduct periodic calibration tests to maintain the monitors in a near steady state.
For the SIT Trial, we have implemented two types of monitor calibration procedures. In these procedures, controls were set to take full advantage of the display’s capabilities to achieve set minimum and maximum luminance targets. During deployment and setup of the radiology viewing workstations, our goal was to maximize the performance of the particular display model for the reading environment used by each neuroradiologist. The deployment calibration procedure required a photometer with a “field of view” or “angle of acceptance” in the 8° to 15° range. For the SIT Trial, we used a UDT Instruments (San Diego, CA, USA) Model 371 Meter with a Model 265 Probe, which comes with a photopic filter and an occluder to keep out the ambient light. During the calibration, luminance adjustments were made with the backlight, contrast, and brightness controls on the monitor to achieve the specified luminance values while viewing a series of test patterns.
Given the observation that monitor performance changes with time, the second calibration procedure used in the SIT Trial provided a method to set a display’s controls so that its performance continues to match the specified luminance targets. The ERL staff or radiologists perform a quarterly calibration check that requires only a few minutes of their time. Should the monitor fail this simple test, the monitor is recalibrated by ERL staff using the same procedures as the deployment calibration. The quarterly check consists of viewing the test pattern in Figure 6 to determine whether the “0” square, upper left, is totally black and the “255” square, lower right, is totally white. The interior squares in the “2” and “250” squares may or may not be visible; however, the interior squares in all of the other boxes of the top and bottom lines should be visible.
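For readers unfamiliar with such patterns, the following is an illustrative sketch, not the SIT Trial's Figure 6, that builds a simplified row of gray squares, each containing a slightly offset interior square, which can be displayed full screen for a visual check of this kind.

```python
# Illustrative sketch only (not the SIT Trial's actual test pattern): a row of
# gray squares at fixed levels, each with an interior square offset by a few
# gray levels. On a well-calibrated display the 0 square should appear solid
# black, the 255 square solid white, and the mid-range interior squares visible.
import numpy as np

def step_pattern(levels=(0, 2, 50, 100, 150, 200, 250, 255),
                 size=128, inner=48, delta=4):
    patches = []
    for level in levels:
        patch = np.full((size, size), level, dtype=np.uint8)
        start = (size - inner) // 2
        # interior square nudged toward mid-gray by `delta` gray levels
        inner_level = min(255, level + delta) if level < 128 else max(0, level - delta)
        patch[start:start + inner, start:start + inner] = inner_level
        patches.append(patch)
    return np.hstack(patches)   # uint8 image, one row of labeled-by-position squares

# e.g. save and display full screen with an imaging library:
# from PIL import Image; Image.fromarray(step_pattern()).save("pattern.png")
```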
By following these display calibration procedures with periodic checks, the displays provide sufficient image quality for the clinical evaluations of SIT Trial MRIs, and they can be relied upon to behave similarly throughout the study.
Image Data Collection
The different imaging time points of the SIT Trial require MRIs with different pulse sequences. The workflow implemented in the check-in program supports a web-based reporting process that provides the neuroradiologist with a series of CRFs. Each CRF is designed to collect the appropriate information for the type of MRI being interpreted. In addition to collecting the findings from the individual radiologist, this web-based system provides the consensus reading of the radiology panel to the SIT Trial Neurology Committee.
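The exam-type-driven generation of CRFs can be thought of as a simple lookup; the mapping below is a hypothetical sketch with assumed form names, not the SIT Trial's actual form identifiers.

```python
# Hypothetical mapping from exam type to the CRFs generated at check-in;
# form names are illustrative, not the trial's actual form identifiers.
CRFS_BY_EXAM_TYPE = {
    "MRI-1 (screening)":     ["incidental-findings", "lesion-assessment"],
    "Pre-randomization MRI": ["incidental-findings", "lesion-assessment", "lesion-measurement"],
    "MRI-36 (exit)":         ["incidental-findings", "lesion-assessment", "lesion-measurement"],
    "Unscheduled MRI":       ["incidental-findings", "lesion-assessment", "lesion-measurement"],
}

def forms_for(exam_type):
    """Return the list of CRFs to create for a newly checked-in study."""
    return CRFS_BY_EXAM_TYPE[exam_type]
```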
The neuroradiologists use the iSite image viewing system to view the images and the web-based Case Report Form system to record their findings. The review process starts with the neuroradiologists being notified via email that a new study has been added to their worklist. Each radiologist accesses the web site to view a worklist similar to the one presented in Figure 7. The worklist provides the radiologist the accession number that can be used in iSite/Radiology to retrieve the study.
After viewing the study in iSite, the radiologist can click on the CRF in the Form No. column and be taken to the web-based CRF for that study. Figure 8 is an example of the CRF completed by the “on-call” radiologist after the initial assessment for incidental findings. Completing the form is very quick, only requiring the radiologist to make one selection and save the results. This information is automatically entered into the imaging database. Because the radiologists are blinded to the actual subject demographics, any specific incidental findings are communicated to the Clinical Coordinating Center staff.
Each neuroradiologist completes a lesion assessment CRF that collects the findings for a particular type of study. The example in Figure 9 is the CRF for the MRI-1 (screening) study. The participant identifiers and exam particulars are provided to help ensure that the CRF matches the study being read. The neuroradiologists simply record their findings by selecting the appropriate radio button and saving their results. This process was designed to allow the neuroradiologists the maximum time to view images and to minimize the time spent recording their findings. Even though the CRFs for subsequent studies ask additional questions, e.g., “Are the lesions found on the screening exam still visible?”, the process still provides the neuroradiologists with a quick and easy-to-use method of reporting their findings.
The Consensus Process
The ERL data analysts closely monitor the completion of the CRFs to determine if a consensus has been reached on the presence of a qualifying infarct-like lesion. Figure 10 is an example of the information that is available on the web interface. As soon as two radiologists are in agreement, the site is notified of the results.
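The agreement rule itself is simple two-of-three concordance; the following minimal sketch, with hypothetical finding labels, illustrates the check the analysts apply as readings arrive. It is not the ERL's monitoring software.

```python
# Hypothetical sketch of the two-of-three agreement check described above.
def consensus(readings):
    """readings: {radiologist_id: "lesion present" | "no lesion"} for one study.
    Returns the agreed finding once any two readers match, else None."""
    for finding in set(readings.values()):
        if sum(1 for r in readings.values() if r == finding) >= 2:
            return finding          # two-reader agreement reached
    return None                     # discordant or still awaiting reads

# e.g. consensus({"A": "lesion present", "B": "no lesion"}) -> None
#      consensus({"A": "lesion present", "B": "no lesion", "C": "lesion present"})
#      -> "lesion present"; the clinical site would then be notified.
```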
The radiology committee conducts a biweekly conference call to review each study. For each study, the “on-call” radiologist is designated as the recording radiologist. This radiologist’s worklist includes a CRF similar to the one shown in Figure 11. This CRF displays the findings of each radiologist along with the ability to indicate a consensus lesion assessment. A consensus presentation state is also created in iSite, which shows the location and size of the lesion(s). Studies with discordant interpretations are reviewed in depth. Quality failures are also discussed, and the clinical sites are notified of specific problems.
Several imaging time points require the collection of lesion measurement data. Although all of the radiologists participate in determining the lesion size, for consistency in reporting, the chairman of the radiology committee has been designated as the recorder of the measurement data. Figure 12 is an example of the web-based CRF used to collect measurement data. The participant identifiers and exam particulars ensure that the CRF matches the study being read. Although this CRF collects a large amount of data, it is designed to minimize the workload for the radiologist. The form is populated with an input column based on the number of lesions found. Radio buttons and check boxes are provided to facilitate entering the data. Text boxes are also provided to capture any additional information.
At the conclusion of each consensus call, the ERL staff provides members of the Neurology Committee with a summary report of findings on each case with a qualifying infarct-like lesion. The study neurologists can use the report as shown in Figure 13 to locate the lesions on the images in the iSite/Enterprise viewing system. By viewing the presentation state in iSite along with the comprehensive neurological exam from each site’s neurologist, the Neurology Committee makes the final determination if the infarct-like lesion is silent and the child is eligible to participate in the SIT Trial.
Status Reports
The ERL process was designed to provide timely reports to SIT Trial management on the status and volume of image studies received and individual or cumulative findings. The principal investigator and the co-principal investigator receive a weekly volume report similar to Figure 14. This report provides a snapshot of the number and type of studies received from each clinical site for the week and cumulative totals. With this report, study management is regularly updated on the volume of studies received.
Results
Since active enrollment in the SIT Trial began in December 2004, ERL has received and processed MRIs for over 600 children. Figure 15 shows the monthly distribution of the studies received to date. The total number of MRIs processed includes over 600 screening MRIs (MRI-1), over 100 pre-randomization MRIs (Pre-Rand), and 7 unscheduled MRIs (MRI-Un) on children with clinical events. Beginning in March 2008, we will start receiving the study exit MRIs (MRI-36) on those children who have been in the trial for 3 years.
Transmission Time
In response to a survey, most sites report that the typical time to prepare a study for transmission and actually transmit the study is between 5 and 40 min. The longer preparation and transmission times are caused by the need to upload a study to the laptop from a CD and by slow local network speeds. Once the transmission process starts, it can run unattended; therefore, the actual workload on the site is minimal. The electronic transmission of studies provides the capability for the imaging center to receive the images the same day they are acquired at the clinical site. As shown in Figure 16, over 10% of the SIT Trial studies are received at ERL the same day the participant is scanned. Although the transmission of some studies is delayed because they were acquired late in the day or on a weekend, and frequently a local radiologist will read the study before it is released for transmission, over 50% of the SIT Trial studies are still received at ERL within 3 days and over two-thirds (68.7%) are received within 5 days.
Processing Time
Once the study is received at the Imaging Center, the average time to process an MRI study for a central review by the three neuroradiologists and provide the results back to the clinical site is less than 2 days. This is achieved by the ERL staff closely monitoring the system, quickly processing the newly arrived studies, and immediately notifying the radiologists when new studies have been added to their work list. The turnaround time is also dependent upon the radiologists making time available in their schedule to read the studies and complete the CRFs. The last component in the process requires ERL to closely monitor the completion of the CRFs and quickly notify the clinical sites when at least two of the radiologists have reached identical findings. Although typically the findings of the radiologists do not change, occasionally, at the panel’s bi-weekly consensus call, the radiologist in the minority agrees to join the consensus position. More rarely (0.5% of cases), a minority position becomes the consensus and the site must be notified of a change in findings.
Figure 17 shows the average number of days for the Radiology Committee to process a study and the number of studies received each month for the most recent 12 months of data. The Radiology Committee processing time includes the time required by ERL to conduct the QA and upload the study to the image viewing system. The time also includes the time required for the radiologists to reach consensus on each study. The QA process starts as soon as the images have been transmitted and the Image Tracking Worksheet has been received. Although ERL is only staffed to receive studies during normal working hours and is closed on weekends and holidays, the Cisco VPN server, web servers, and ERL facsimile machines are available to receive studies 24 h a day, 7 days a week. The time required to process a study begins when a study is received, regardless of whether the office is staffed, and ends when the clinical site, study management, and the statistical coordinating center have been provided with the findings.
Imaging Quality
Another criterion for evaluating the effectiveness of the ERL process is the number of studies that cannot be evaluated because of image quality issues or failure to follow the imaging protocol. The key to reducing the number of rejected studies is to develop and communicate standard acquisition parameters, to train the sites, and to provide immediate feedback when problems arise. The Imaging Site Procedures Manual provides the MRI acquisition guidelines and pulse sequence specifications. Each site was trained and then tested on its ability to follow the imaging protocol. Studies with quality issues were quickly reviewed and the deficiencies reported back to the imaging site. Table 1 shows the percentage of studies received that were protocol failures because of either image quality or failure to comply with the protocol requirements.
Table 1. Protocol failures by study type

| Study Type | Studies Received | Number of Protocol Failures | Percent Failed (%) |
|---|---|---|---|
| Screening | 628 | 17 | 2.7 |
| Prerandomization | 109 | 5 | 4.6 |
| Unscheduled | 7 | 0 | 0.0 |
| Total | 744 | 22 | 3.0 |
The design of the imaging database has not only facilitated the collection and reporting of data for the specific aims of the SIT Trial, but has also supported data queries for ancillary research projects approved by the SIT Trial Executive Committee. Current ancillary studies include: “Prevalence of Incidental Intracranial Findings on MRI in Children Screened for the Silent Infarct Transfusion Trial”; “The Epidemiology of Cerebral Vasculopathy in Patients with Sickle Cell Disease and Silent Cerebral Infarcts”; and “The Effect of Prophylactic Blood Transfusion Therapy on Progression of Cerebral Vasculopathy in Sickle Cell Disease”. With the planned accumulation of MRIs on over 1,800 children with sickle cell disease, we will have a rich source of images and descriptive data that can be efficiently mined to support additional research.
Discussion
Prior to the design of this innovative method to process imaging studies for clinical trials, researchers in multicenter trials faced the labor-intensive and time-delayed task of acquiring and distributing hard copy films. The effort becomes even more daunting when the study design calls for the image analysis to be conducted at different geographic locations. Figure 18 displays the typical approach of collecting radiologic images on hard copy film and then presenting them to a radiologist for reading. What is obvious in this example is the amount of time lost, not only in creating and sending hard copy films but also in finding and correcting any errors. A more subtle problem with this process is the difficulty of adjudicating the findings of multiple readers.
The process implemented for the SIT Trial not only uses technology to eliminate transmission delays and the cost of sending hard copy films, but also provides the ability for a reading panel to simultaneously review imaging studies and reach consensus positions on the findings more quickly and accurately. The workflow implemented in the study check-in program is adaptable to meet the needs of different clinical trials. For the SIT Trial, the workflow automatically sends the images to an image viewing system and creates case report forms. The workflow process can easily be modified to support multiple trials with differing data capture requirements.
The use of a digital viewing system has not only improved reader productivity but also improved diagnostic confidence. In addition, the database stores all of the data collected during check-in, the readings from the individual radiologists, and their consensus reports. This repository of study information will allow future queries concerning the specifics of the lesions found during the SIT Trial. We are collecting a wealth of information on children with sickle cell disease that can be easily accessed by other researchers.
Conclusion
The SIT Trial, an important National Institutes of Health funded clinical trial, is in its third year of enrolling children. The number of children screened to date, and the large number to follow until the trial is complete in 2011, would have been difficult to manage without the adoption of a new strategy to process imaging studies. The SIT multidisciplinary team, which includes pediatric hematologists, neurologists, neuroradiologists, and information systems engineers, has combined readily available technology with a customized data collection and reporting process. This process easily obtains digital studies from anywhere in the world and quickly provides them to one or more radiologists in different locations for interpretation. The entire process also ensures compliance with the Health Insurance Portability and Accountability Act and standard research practice that requires protection of privacy. This novel strategy has successfully minimized the cost of collecting data for the SIT Trial, while still meeting the requirements for timely review of the scans, along with a central adjudication process. The basic workflow pattern of clinical trials is fairly constant, and the trial-to-trial variations can be handled by flexible procedures, adaptable software, and multiple visualization and analysis applications. The technology and expertise used in the SIT Trial can be adapted to virtually any clinical research study with digital imaging requirements.
Acknowledgments
Substantial contributions to the study were also made by individual members from the following SIT Trial committees: Executive Committee—Michael R. DeBaun, MD, MPH (Chair, Co PI for the SIT Trial), Washington University in St. Louis School of Medicine, Bruce Barton, PhD, (PI for the Statistical Coordinating Center) Maryland Medical Research Institute, Inc., James F. Casella, MD (Vice-Chair and Co-PI for the SIT Trial), Johns Hopkins University School of Medicine, Deborah Hirtz, MD, National Institute of Neurological Disorders and Stroke, Rebecca N. Ichord, MD, Children’s Hospital of Philadelphia, Robert C. McKinstry, MD, PhD, Michael Noetzel, MD, Washington University School of Medicine, Fred Prior, PhD, Mallinckrodt Institute of Radiology, Washington University, and Desiree A. White, PhD, Washington University. Neuropsychology Committee—Desiree A. White, PhD (Chair), Washington University, T. David Elkin, PhD, University of Mississippi Medical Center, Kevin R. Krull, PhD, Baylor College of Medicine, Kimberly Rennie, PhD, Medical College of Wisconsin, and H. Gerry Taylor, PhD, Rainbow Babies and Children’s Hospital. Neuroradiology Committee—Robert C. McKinstry, MD, PhD (Chair), Washington University School of Medicine, William S. Ball, Jr., MD, University of Cincinnati, Michael A. Kraut, MD, PhD, Johns Hopkins University School of Medicine, and Marilyn Siegel, MD, Washington University School of Medicine. Neurology Committee—Michael J. Noetzel, MD, (Chair) Washington University School of Medicine, Rebecca N. Ichord, MD, Children’s Hospital of Philadelphia, E. Steve Roach, MD, Children’s Research Institute, The Ohio State University, Deborah Hirtz, MD, National Institute of Neurological Disorders and Stroke, and Michael Dowling, MD, PhD, University of Texas Southwestern Medical Center. The following investigators participated in the Silent Infarct Transfusion Trial: Michael R. DeBaun, MD, MPH, Washington University School of Medicine, St. Louis, MO.; Bruce Barton, PhD, Maryland Medical Research Institute, Inc., Baltimore, MD.; Fred Prior, PhD, Mallinckrodt Institute of Radiology at the Washington University School of Medicine, St. Louis, MO.; James F. Casella, MD, Harold Lehmann, MD and John J. Strouse, MD, Johns Hopkins University School of Medicine, Baltimore, MD; Scott T. Miller, MD, State University of New York-Downstate Medical Center/Kings County Hospital Center, Brooklyn, NY; Caterina Minniti, MD, Children’s Research Institute–Children’s National Medical Center, Washington, DC; Rupa Redding-Lallinger, MD, University of North Carolina at Chapel Hill, Chapel Hill, NC; Charles Daeschner, MD, East Carolina University, Greenville, NC; Anthony Villella, MD, Case Western Reserve University/Rainbow Babies and Children’s Hospital, Cleveland, OH; Mark A. Ranalli, MD, Columbus Children’s Hospital/Ohio State University College of Medicine and Public Health, Columbus OH; Karen Kalinyak, MD, University of Cincinnati, Cincinnati, OH; Mark Heiny, MD, Riley Hospital for Children/Indiana University, Indianapolis, IN; Ingrid A. Sarnaik, MD, Wayne State University, Detroit, MI; Alexis A. Thompson, MD, Northwestern University–Children’s Memorial Hospital, Chicago, IL; Julie Panepinto, MD, MSPH, Medical College of Wisconsin/Children’s Research Institute of the Children’s Hospital of Wisconsin, Milwaukee, WI; Gerald M. Woods, MD, University of Missouri-Kansas/Children’s Mercy Hospital, Kansas City, MO; Allison King, MD, MPH, Washington University School of Medicine/St. Louis Children’s Hospital, St. 
Louis, MO; Suzanne L. Saccente, MD, University of Arkansas/Arkansas Children’s Hospital Research Institute, Little Rock, AR; Thomas Howard, MD, University of Alabama at Birmingham, Birmingham, AL; Rathi V. Iyer, MD, University of Mississippi Medical Center, Jackson, MS; Gladstone Airewele, MD, MPH and Donald H. Mahoney, Jr., MD, Baylor College of Medicine, Houston, TX; Charles Scher, MD, Tulane University Health Sciences Center, New Orleans, LA; Charles T. Quinn, MD, University of Texas Southwestern Medical Center, Dallas, TX; Thomas Coates, MD, University of Southern California, Los Angeles, CA; Hernan Sabio, MD, Wake Forest School of Medicine, Winston-Salem, NC; Fenella Kirkham, MD, University College Hospital, Institute of Child Health, London, United Kingdom; Baba Inusa, MD, Guy’s and St. Thomas’ Foundation Trust, London, United Kingdom; Françoise Bernaudin, MD, Hôpital Intercommunal de Créteil, Créteil, France; Melanie A. Kirby, FRCP, The Hospital for Sick Children, Toronto, ON; and Deborah Hirtz, MD, National Institutes of Health (NINDS), Bethesda, MD.
References
1. Ashley-Koch A, Yang Q, Olney RS. Sickle hemoglobin (HbS) allele and sickle cell disease: a HuGE review. Am J Epidemiol. 2000;151:839–845. doi:10.1093/oxfordjournals.aje.a010288.
2. Ohene-Frempong K, Weiner SJ, Sleeper LA, et al. Cerebrovascular accidents in sickle cell disease: rates and risk factors. Blood. 1998;91(1):288–294.
3. Adams RJ, McKie VC, Brambilla D, et al. Stroke prevention trial in sickle cell anemia. Control Clin Trials. 1998;19(1):110–129. doi:10.1016/S0197-2456(97)00099-8.
4. Adams RJ. Stroke prevention and treatment in sickle cell disease. Arch Neurol. 2001;58(4):565–568. doi:10.1001/archneur.58.4.565.
5. Buchanan GR, DeBaun MR, Quinn CT, Steinberg MH. Sickle cell disease. ASH Education Program Book. 2004:35–47.
6. Miller ST, Sleeper LA, Pegelow CH, et al. Prediction of adverse outcomes in children with sickle cell disease. N Engl J Med. 2000;342(2):83–89. doi:10.1056/NEJM200001133420203.
7. Schatz J, Brown RT, Pascual JM, Hsu L, DeBaun MR. Poor school and cognitive functioning with silent cerebral infarcts and sickle cell disease. Neurology. 2001;56(8):1109–1111. doi:10.1212/wnl.56.8.1109.
8. Armstrong FD, Thompson RJ Jr, Wang W, et al. Cognitive functioning and brain magnetic resonance imaging in children with sickle cell disease. Neuropsychology Committee of the Cooperative Study of Sickle Cell Disease. Pediatrics. 1996;97(6 Pt 1):864–870.
9. DeBaun MR, Schatz J, Siegel MJ, et al. Cognitive screening examinations for silent cerebral infarcts in sickle cell disease. Neurology. 1998;50(6):1678–1682. doi:10.1212/wnl.50.6.1678.
10. Pegelow CH, Macklin EA, Moser FG, et al. Longitudinal changes in brain magnetic resonance imaging findings in children with sickle cell disease. Blood. 2002;99(8):3014–3018. doi:10.1182/blood.V99.8.3014.
11. Moore SM, Maffitt DR, Blaine GJ, et al. A workstation acquisition node for multi-center imaging studies. Proc SPIE Medical Imaging 2001: PACS and Integrated Medical Information Systems: Design and Evaluation. 2001:271–277.
12. Moore SM, Gierada DS, Clark KW. Image quality assurance in the Prostate, Lung, Colorectal, and Ovarian (PLCO) Cancer Screening Trial Network of the National Lung Screening Trial (NLST). Proc SCAR 2004. Vancouver, BC; 2004.
13. Clark KW, Gierada DS, Moore SM, et al. Creation of a CT image library for the lung screening study of the National Lung Screening Trial. J Digit Imaging. 2007;20(1):23–31. doi:10.1007/s10278-006-0589-5.
14. DICOM Standards Committee. Digital Imaging and Communications in Medicine (DICOM) Supplement 55: Attribute Level Confidentiality. 2002.