Transfection of a bovine stomach cDNA encoding an NK2 receptor into murine fibroblasts produced a clone that exhibited specific binding of NKA, a selective NK2 agonist. In these transfected cells, NKA mediated hydrolysis of phosphatidylinositol (PI) with an EC50 value of 10 nM ...
Geographic information systems (GIS), global positioning systems and remote sensing have been increasingly used in public health settings since the 1990s, but application of these methods in humanitarian emergencies has been less documented. Recent areas of application of GIS methods in humanitarian emergencies include hazard, vulnerability, and risk assessments; rapid assessment and survey methods; disease distribution and outbreak investigations; planning and implementation of health information systems; data and programme integration; and programme monitoring and evaluation. The main use of GIS in these areas is to provide maps for decision-making and advocacy, which allow overlaying types of information that may not normally be linked. GIS is also used to improve data collection in the field (for example, for rapid health assessments or mortality surveys). Development of GIS methods requires further research. Although GIS methods may save resources and reduce error, initial investment in equipment and capacity building may be substantial. Especially in humanitarian emergencies, equipment and methodologies must be practical and appropriate for field use. Add-on software to process GIS data needs to be developed and modified. As equipment becomes more user-friendly and costs decrease, GIS will become more of a routine tool for humanitarian aid organisations in humanitarian emergencies, and new and innovative uses will evolve.
On 20 November 2019 the Tonkolili District Health Office was notified that a physician working in the district hospital had been diagnosed with Lassa Fever (LF). The Tonkolili District had its last LF case in 2012. An investigation was performed to determine the mode of transmission, magnitude and scope of this outbreak. Clinical information, exposure history, and blood samples were collected. Active case search and an Infection Prevention and Control (IPC) assessment were conducted in the hospital and community. Three of five people with symptoms compatible with LF were polymerase chain reaction positive. The primary case, a pregnant woman from the community, was admitted with severe bleeding and operated on by two surgeons and an anesthetist. The same medical staff operated on another woman later that day. Three of the five cases died. The hospital assessment revealed non-adherence to IPC procedures. The primary case's residence had unhygienic conditions and inappropriate food storage. A low index of suspicion for LF and non-compliance with IPC procedures contributed to the spread of infection to the associated healthcare workers. Health workers were sensitized to LF and trained on IPC. Education of the community in high-risk areas about LF recognition, transmission and ways to decrease rodent populations in and around their homes is recommended.
Background: In February 2021 Kazakhstan began offering COVID-19 vaccines to adults. Breakthrough SARS-CoV-2 infections raised concerns about real-world vaccine effectiveness. We aimed to evaluate the effectiveness of four vaccines against SARS-CoV-2 infection. Methods: We conducted a retrospective cohort analysis among adults in Almaty using aggregated vaccination data and individual-level breakthrough COVID-19 cases (≥14 days from 2nd dose) using national surveillance data. We ran a time-adjusted Cox proportional-hazards model with a sensitivity analysis accounting for varying entry into the vaccinated cohort to assess vaccine effectiveness for each vaccine (measured as 1 minus the adjusted hazard ratio), using the unvaccinated population as reference (N = 565,390). We separately calculated daily cumulative hazards for COVID-19 breakthrough among vaccinated persons by age and vaccination month. Results: From February 22 to September 1, 2021, in Almaty, 747,558 (57%) adults were fully vaccinated (received 2 doses), and 108,324 COVID-19 cases (11,472 breakthrough) were registered. Vaccine effectiveness against infection was 79% [sensitivity estimates (SE): 74%–82%] for QazVac, 77% (SE: 71%–81%) for Sputnik V, 71% (SE: 69%–72%) for Hayat-Vax, and 70% (SE: 65%–72%) for CoronaVac. Among vaccinated persons, the 90-day follow-up cumulative hazard for breakthrough infection was 2.2%. Cumulative hazard was 2.9% among people aged ≥60 years versus 1.9% among persons aged 18–39 years (p < 0.001), and 1.2% for people vaccinated in February–May versus 3.3% in June–August (p < 0.001). Conclusion: Our analysis demonstrates high effectiveness of COVID-19 vaccines against infection in Almaty, similar to other observational studies. The higher cumulative hazard of breakthrough among people ≥60 years of age and during variant surges warrants targeted booster vaccination campaigns.
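The effectiveness measure used in the study above is 1 minus the adjusted hazard ratio, and a confidence interval for it comes from flipping the hazard-ratio interval. A minimal sketch of that conversion (the numbers below are illustrative, not taken from the study's fitted models):

```python
def ve_from_hr(hazard_ratio: float) -> float:
    """Vaccine effectiveness = 1 - adjusted hazard ratio (vaccinated vs unvaccinated)."""
    return 1.0 - hazard_ratio

def ve_ci(hr_lower: float, hr_upper: float) -> tuple[float, float]:
    """The HR interval flips: the lower HR bound gives the upper VE bound."""
    return (1.0 - hr_upper, 1.0 - hr_lower)

# Illustrative hazard ratio and 95% CI (hypothetical values):
hr, hr_lo, hr_hi = 0.21, 0.18, 0.26
print(f"VE = {ve_from_hr(hr):.0%}")
lo, hi = ve_ci(hr_lo, hr_hi)
print(f"95% CI: {lo:.0%}-{hi:.0%}")
```

Note that because lower hazard ratios mean higher effectiveness, the bounds swap when converting the interval.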
Neurokinin A (NKA) mediated a concentration-dependent increase in the intracellular free Ca2+ concentration, [Ca2+]i, in B82 fibroblasts transfected with the neurokinin 2 (NK2) receptor. The EC50 value of this response was 24 nM. A selective NK2 antagonist, MEN 10207, at a concentration of 1 μM completely inhibited the [Ca2+]i rise to 0.1 μM NKA. These results suggest that activation of NK2 receptors expressed in the transfected fibroblasts is functionally coupled to intracellular calcium mobilization.
In a study conducted by the US Geological Survey and the Centers for Disease Control and Prevention, 24 water samples were collected at selected locations within a drinking-water-treatment (DWT) facility and from the two streams that serve the facility to evaluate the potential for wastewater-related organic contaminants to survive a conventional treatment process and persist in potable-water supplies. Stream-water samples as well as samples of raw, settled, filtered, and finished water were collected during low-flow conditions, when the discharge of effluent from upstream municipal sewage-treatment plants accounted for 37-67% of flow in stream 1 and 10-20% of flow in stream 2. Each sample was analyzed for 106 organic wastewater-related contaminants (OWCs) that represent a diverse group of extensively used chemicals. Forty OWCs were detected in one or more samples of stream water or raw-water supplies in the treatment plant; 34 were detected in more than 10% of these samples. Several of these compounds also were frequently detected in samples of finished water; these compounds include selected prescription and non-prescription drugs and their metabolites, fragrance compounds, flame retardants and plasticizers, cosmetic compounds, and a solvent. The detection of these compounds suggests that they resist removal through conventional water-treatment processes. Other compounds that also were frequently detected in samples of stream water and raw-water supplies were not detected in samples of finished water; these include selected prescription and non-prescription drugs and their metabolites, disinfectants, detergent metabolites, and plant and animal steroids.
The non-detection of these compounds indicates that their concentrations are reduced to levels less than analytical detection limits or that they are transformed to degradates through conventional DWT processes. Concentrations of OWCs detected in finished water generally were low and did not exceed Federal drinking-water standards or lifetime health advisories, although such standards or advisories have not been established for most of these compounds. Also, at least 11 and as many as 17 OWCs were detected in samples of finished water. Drinking-water criteria currently are based on the toxicity of individual compounds and not combinations of compounds. Little is known about potential human-health effects associated with chronic exposure to trace levels of multiple OWCs through routes such as drinking water. The occurrence in drinking-water supplies of many of the OWCs analyzed for during this study is unregulated and most
Many Bangladeshis suffer from arsenic-related health concerns. Most mitigation activities focus on identifying contaminated wells and reducing the amount of arsenic ingested from well water. Food as a source of arsenic exposure has recently been documented. The objectives of this study were to measure the main types of arsenic in commonly consumed foods in Bangladesh and to estimate the average daily intake (ADI) of arsenic from food and water. Total, organic, and inorganic arsenic were measured in drinking water and in cooked rice and vegetables from Bangladeshi households. The mean total arsenic level was 358 microg/kg (range: 46 to 1,110 microg/kg dry weight) in 46 rice samples and 333 microg/kg (range: 19 to 2,334 microg/kg dry weight) in 39 vegetable samples. Inorganic arsenic, calculated as arsenite and arsenate, made up 87% of the total arsenic measured in rice and 96% of the total arsenic in vegetables. Total arsenic in water ranged from 200 to 500 microg/L. Using individual, self-reported data on daily consumption of rice and drinking water, the total arsenic ADI was 1,176 microg (range: 419 to 2,053 microg), with 14% attributable to inorganic arsenic in cooked rice. The ADI is a conservative estimate; vegetable arsenic was not included due to limitations in self-reported daily consumption amounts. Given the arsenic levels measured in food and water and the consumption of these items, cooked rice and vegetables are a substantial exposure pathway for inorganic arsenic. Intervention strategies must consider all sources of dietary arsenic intake.
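The average-daily-intake arithmetic described above multiplies a measured concentration in each medium by the amount consumed per day and sums across media. A minimal sketch using the study's measured concentrations but assumed (hypothetical) consumption amounts, since individual self-reported intakes are not given in the abstract:

```python
# Measured concentrations from the study:
WATER_AS_UG_PER_L = 350.0   # total arsenic in drinking water (reported range 200-500 microg/L)
RICE_AS_UG_PER_KG = 358.0   # mean total arsenic in cooked rice, dry weight (microg/kg)

# Hypothetical daily consumption, for illustration only:
water_l_per_day = 3.0
rice_kg_per_day = 0.5

adi_water = WATER_AS_UG_PER_L * water_l_per_day   # microg/day from water
adi_rice = RICE_AS_UG_PER_KG * rice_kg_per_day    # microg/day from rice
adi_total = adi_water + adi_rice

print(f"ADI = {adi_total:.0f} microg/day; rice share = {adi_rice / adi_total:.0%}")
```

With these assumed intakes the water pathway dominates, which is consistent with the abstract's finding that rice contributed a minority (14%) of the inorganic-arsenic ADI.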
International Journal of Tropical Disease & Health, Sep 24, 2022
Background: In December 2019, the COVID-19 pandemic began in Wuhan and quickly spread in China and other countries of the world. The SARS-CoV-2 virus reached Bangladesh in March 2020, and the index case of the first cluster of COVID-19 was reported on 13 March 2020 in Madaripur District. Methods: A team from the Institute of Epidemiology, Disease Control and Research (IEDCR) of the Ministry of Health and Family Welfare, Bangladesh investigated the cluster, established active syndromic surveillance for respiratory diseases, and implemented control activities. Results: The index case traveled from Italy to Bangladesh, developed respiratory symptoms, and sought medical treatment in Dhaka. He was diagnosed with COVID-19 and transferred to and isolated in a hospital on the day of diagnosis. We followed up his contacts as soon as we obtained their names and contact information. We quarantined 34 of 139 contacts; the remainder were missed contacts. The attack rate among the index case's contacts was 18% (6/34).
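The attack rate reported above is simply the number of cases among followed contacts divided by the number of contacts followed:

```python
def attack_rate(cases: int, contacts_followed: int) -> float:
    """Attack rate among contacts = cases / contacts followed up."""
    return cases / contacts_followed

# 6 cases among the 34 quarantined contacts:
print(f"{attack_rate(6, 34):.0%}")
```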
Western Pacific Surveillance and Response, Sep 25, 2017
Problem: The Pacific region has widely dispersed populations, limited financial and human resources and a high burden of disease. There is an urgent need to improve the availability, reliability and timeliness of useable health data. Context: The purpose of this paper is to share lessons learnt from a three-year pilot field epidemiology training programme that was designed to respond to these Pacific health challenges. The pilot programme built on and further developed an existing field epidemiology training programme for Pacific health staff. Action: The programme was delivered in country by epidemiologists working for Pacific Public Health Surveillance Network partners. The programme consisted of five courses: four one-week classroom-based courses and one field epidemiology project. Sessions were structured so that theoretical understanding was achieved through interaction and reinforced through practical hands-on group activities, case studies and other interactive practical learning methods. Outcome: As of September 2016, 258 students had commenced the programme. Twenty-six course workshops were delivered and one cohort of students had completed the full five-course programme. The programme proved popular and gained a high level of student engagement. Discussion: Face-to-face delivery, a low student-to-facilitator ratio, substantial group work and practical exercises were identified as key factors that contributed to the students developing skills and confidence. Close engagement of leaders and the need to quickly evaluate and adapt the curriculum were important lessons, and the collaboration between external partners was considered important for promoting a harmonized approach to health needs in the Pacific.
Objective: To identify risk factors for death or injury from landmines and ordnance in Kabul City, Afghanistan, so programs can target preventive actions. Methods: Active surveillance in hospitals and communities for injuries and deaths from landmine and ordnance explosions in Kabul City. Results: Of the 571 people the authors identified during the 25-month period, 161 suffered a traumatic amputation and 94 were killed by a landmine or ordnance explosion. Of those asked, 19% of victims had received mine awareness education before the incident, and of those, the majority were injured while handling or playing with an explosive device. Most victims were young males with a few years of education. The occupation types most at risk were students and laborers, and unemployment was common among the victims. Collecting wood or paper and playing with or handling an explosive were the most frequent activities associated with injuries and deaths. Conclusions: From May 1996 to July 1998, explosions from landmines and ordnance claimed 571 victims and were an important preventable cause of injury and death among people in Kabul City. Prevention strategies should focus on high-risk groups and on changing risky behaviors, such as tampering with explosive devices.
Much progress has been made in recent years to address the estimation of summary statistics using data that are subject to censoring of results that fall below the limit of detection (LOD) for the measuring instrument. Truncated-data methods (e.g., Tobit regression) and multiple imputation are two approaches for analyzing results that are below the LOD. Applying these methods requires an assumption about the underlying distribution of the data. Because the log-normal distribution has been shown to fit many data sets obtained from environmental measurements, the common practice is to assume that measurements of environmental factors can be described by log-normal distributions. This article describes methods for obtaining estimates of percentiles and their associated confidence intervals when the results are log-normal and a fraction of the results are below the LOD. We present limited simulations to demonstrate the bias of the proposed estimates and the coverage probability of their associated confidence intervals. The estimation methods are used to generate summary statistics for 2,3,7,8-tetrachlorodibenzo-p-dioxin (2,3,7,8-TCDD) using data from a 2001 background exposure study in which PCDDs/PCDFs/cPCBs in human blood serum were measured in a Louisiana population. Because the congener measurements used in this study were subject to variable LODs, we also present simulation results to demonstrate the effect of variable LODs on the multiple-imputation process.
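The censored-data estimation the abstract describes can be sketched as a maximum-likelihood fit of a normal distribution on the log scale to left-censored data: detected values contribute density terms to the likelihood, while non-detects contribute the probability of falling below their LOD (a CDF term). This is a simplified illustration on simulated data, not the authors' exact procedure:

```python
import numpy as np
from scipy import optimize, stats

def censored_lognormal_mle(values, detected, lods):
    """MLE of (mu, sigma) on the log scale for left-censored log-normal data.

    values: measured results (used where detected is True)
    lods: detection limits (used where detected is False)
    """
    logs = np.log(np.where(detected, values, lods))

    def neg_log_lik(params):
        mu, log_sigma = params
        sigma = np.exp(log_sigma)  # parameterize on log scale to keep sigma > 0
        ll_detected = stats.norm.logpdf(logs[detected], mu, sigma).sum()
        ll_censored = stats.norm.logcdf(logs[~detected], mu, sigma).sum()
        return -(ll_detected + ll_censored)

    res = optimize.minimize(neg_log_lik, x0=[logs.mean(), 0.0], method="Nelder-Mead")
    return res.x[0], np.exp(res.x[1])

# Simulated log-normal measurements with a single fixed LOD:
rng = np.random.default_rng(0)
x = rng.lognormal(mean=1.0, sigma=0.5, size=500)
lod = 2.0
detected = x > lod

mu, sigma = censored_lognormal_mle(x, detected, np.full_like(x, lod))
p95 = np.exp(mu + stats.norm.ppf(0.95) * sigma)  # estimated 95th percentile
print(f"mu = {mu:.3f}, sigma = {sigma:.3f}, 95th percentile = {p95:.2f}")
```

Substituting the LOD itself (rather than LOD/2 or LOD/sqrt(2)) for non-detects before taking logs here only provides a starting value for the optimizer; the likelihood treats non-detects correctly as censored observations.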
Archives of Environmental & Occupational Health, Jul 1, 2005
To study the association between levels of lead in blood and bone among female former smelter workers in Bunker Hill, Idaho, the authors performed a longitudinal study using homeostatic regulators of calcium and biomarkers of bone turnover. The authors measured participants' blood lead levels (by means of a graphite furnace atomic absorption spectrophotometer) and tibia-bone lead levels (by means of the 109Cd K x-ray fluorescence system) in 1994 and again in 2000; serum ionized calcium, parathyroid hormone, osteocalcin, urinary deoxypyridinoline, pyridinoline, and 1,25-dihydroxyvitamin D were also measured. After controlling for weight and age, significant predictors of changes in blood lead levels from 1994 to 2000 in postmenopausal women were duration of employment, higher ionized calcium levels, alcohol consumption, and higher parathyroid hormone levels. Predictors of change in tibia-bone lead levels in the same group of women were employment in a technical job such as mining and higher urinary pyridinoline levels (p < .05). Changes in blood and bone lead levels over time were associated with increased bone resorption, especially among postmenopausal women.
Transfection of a bovine stomach cDNA encoding for an NK 2 receptor into murine fibroblasts produ... more Transfection of a bovine stomach cDNA encoding for an NK 2 receptor into murine fibroblasts produced a clone that exhibited specific binding of NKA, a selective NK 2 agonist. In these transfected cells, NKA mediated hydrolysis of phosphatidyl-inositol (PI) with an EC 50 value of 10 nM ...
Geographic information systems (GIS), global positioning systems and remote sensing have been inc... more Geographic information systems (GIS), global positioning systems and remote sensing have been increasingly used in public health settings since the 1990s, but application of these methods in humanitarian emergencies has been less documented. Recent areas of application of GIS methods in humanitarian emergencies include hazard, vulnerability, and risk assessments; rapid assessment and survey methods; disease distribution and outbreak investigations; planning and implementation of health information systems; data and programme integration; and programme monitoring and evaluation. The main use of GIS in these areas is to provide maps for decisionmaking and advocacy, which allow overlaying types of information that may not normally be linked. GIS is also used to improve data collection in the field (for example, for rapid health assessments or mortality surveys). Development of GIS methods requires further research. Although GIS methods may save resources and reduce error, initial investment in equipment and capacity building may be substantial. Especially in humanitarian emergencies, equipment and methodologies must be practical and appropriate for field use. Add-on software to process GIS data needs to be developed and modified. As equipment becomes more user-friendly and costs decrease, GIS will become more of a routine tool for humanitarian aid organisations in humanitarian emergencies, and new and innovative uses will evolve.
On 20 Nov 2019 the Tonkolili District Health Office was notified that a physician working in the ... more On 20 Nov 2019 the Tonkolili District Health Office was notified that a physician working in the district hospital was diagnosed with Lassa Fever (LF). The Tonkolili District had its last LF case in 2012. An investigation was performed to determine mode of transmission, magnitude and scope of this outbreak. Clinical information, exposure history, and blood samples were collected. Active case search and Infection Prevention and Control (IPC) assessment were conducted in the hospital and community. Three of five people with symptoms compatible with LF were polymerase chain reaction positive. The primary case, a pregnant woman from the community, was admitted with severe bleeding and operated by two surgeons and anesthetist. The same medical staff operated on another woman later that day. Three of the five cases died. The hospital assessment revealed non-adherence to IPC procedures. The primary case's residence had unhygienic conditions and inappropriate food storage. Low index of suspicion for LF and non-compliance to IPC procedures contributed to the associated healthcare workers' infection spread. Health workers were sensitized to LF and trained on IPC. Education of the community in high-risk areas about LF recognition, transmission and ways to decrease rodent populations in and around their homes is recommended.
BackgroundIn February 2021 Kazakhstan began offering COVID-19 vaccines to adults. Breakthrough SA... more BackgroundIn February 2021 Kazakhstan began offering COVID-19 vaccines to adults. Breakthrough SARS-CoV-2 infections raised concerns about real-world vaccine effectiveness. We aimed to evaluate effectiveness of four vaccines against SARS-CoV-2 infection.MethodsWe conducted a retrospective cohort analysis among adults in Almaty using aggregated vaccination data and individual-level breakthrough COVID-19 cases (≥14 days from 2nd dose) using national surveillance data. We ran time-adjusted Cox-proportional-hazards model with sensitivity analysis accounting for varying entry into vaccinated cohort to assess vaccine effectiveness for each vaccine (measured as 1-adjusted hazard ratios) using the unvaccinated population as reference (N = 565,390). We separately calculated daily cumulative hazards for COVID-19 breakthrough among vaccinated persons by age and vaccination month.ResultsFrom February 22 to September 1, 2021, in Almaty, 747,558 (57%) adults were fully vaccinated (received 2 doses), and 108,324 COVID-19 cases (11,472 breakthrough) were registered. Vaccine effectiveness against infection was 79% [sensitivity estimates (SE): 74%–82%] for QazVac, 77% (SE: 71%–81%) for Sputnik V, 71% (SE: 69%–72%) for Hayat-Vax, and 70% (SE: 65%–72%) for CoronaVac. Among vaccinated persons, the 90-day follow-up cumulative hazard for breakthrough infection was 2.2%. Cumulative hazard was 2.9% among people aged ≥60 years versus 1.9% among persons aged 18–39 years (p < 0.001), and 1.2% for people vaccinated in February–May versus 3.3% in June–August (p < 0.001).ConclusionOur analysis demonstrates high effectiveness of COVID-19 vaccines against infection in Almaty similar to other observational studies. Higher cumulative hazard of breakthrough among people ≥60 years of age and during variant surges warrants targeted booster vaccination campaigns.
Neurokinin A (NKA) mediated a concentration dependent increase in the intracellular free Ca 2+ co... more Neurokinin A (NKA) mediated a concentration dependent increase in the intracellular free Ca 2+ concentration, [Ca2+]i, in B82 fibroblasts transfected with the neurokinin 2 (NK2) receptor. The ECs0 value of this response was 24 nM. A selective NK 2 antagonist. MEN 10207, at a concentration of 1 ~M completely inhibited the [Ca2+]i rise to 0.1 #M NKA. These results suggest that activation of NK 2 receptors expressed in the transfected fibroblasts are functionally coupled to intracellular calcium mobilization.
In a study conducted by the US Geological Survey and the Centers for Disease Control and Preventi... more In a study conducted by the US Geological Survey and the Centers for Disease Control and Prevention, 24 water samples were collected at selected locations within a drinking-water-treatment (DWT) facility and from the two streams that serve the facility to evaluate the potential for wastewater-related organic contaminants to survive a conventional treatment process and persist in potable-water supplies. Stream-water samples as well as samples of raw, settled, filtered, and finished water were collected during low-flow conditions, when the discharge of effluent from upstream municipal sewage-treatment plants accounted for 37-67% of flow in stream 1 and 10-20% of flow in stream 2. Each sample was analyzed for 106 organic wastewater-related contaminants (OWCs) that represent a diverse group of extensively used chemicals. Forty OWCs were detected in one or more samples of stream water or raw-water supplies in the treatment plant; 34 were detected in more than 10% of these samples. Several of these compounds also were frequently detected in samples of finished water; these compounds include selected prescription and non-prescription drugs and their metabolites, fragrance compounds, flame retardants and plasticizers, cosmetic compounds, and a solvent. The detection of these compounds suggests that they resist removal through conventional water-treatment processes. Other compounds that also were frequently detected in samples of stream water and rawwater supplies were not detected in samples of finished water; these include selected prescription and non-prescription drugs and their metabolites, disinfectants, detergent metabolites, and plant and animal steroids. 
The non-detection of these compounds indicates that their concentrations are reduced to levels less than analytical detection limits or that they are transformed to degradates through conventional DWT processes. Concentrations of OWCs detected in finished water generally were low and did not exceed Federal drinking-water standards or lifetime health advisories, although such standards or advisories have not been established for most of these compounds. Also, at least 11 and as many as 17 OWCs were detected in samples of finished water. Drinking-water criteria currently are based on the toxicity of individual compounds and not combinations of compounds. Little is known about potential human-health effects associated with chronic exposure to trace levels of multiple OWCs through routes such as drinking water. The occurrence in drinking-water supplies of many of the OWCs analyzed for during this study is unregulated and most
Many Bangladeshi suffer from arsenic-related health concerns. Most mitigation activities focus on... more Many Bangladeshi suffer from arsenic-related health concerns. Most mitigation activities focus on identifying contaminated wells and reducing the amount of arsenic ingested from well water. Food as a source of arsenic exposure has been recently documented. The objectives of this study were to measure the main types of arsenic in commonly consumed foods in Bangladesh and estimate the average daily intake (ADI) of arsenic from food and water. Total, organic and inorganic, arsenic were measured in drinking water and in cooked rice and vegetables from Bangladeshi households. The mean total arsenic level in 46 rice samples was 358 microg/kg (range: 46 to 1,110 microg/kg dry weight) and 333 microg/kg (range: 19 to 2,334 microg/kg dry weight) in 39 vegetable samples. Inorganic arsenic calculated as arsenite and arsenate made up 87% of the total arsenic measured in rice, and 96% of the total arsenic in vegetables. Total arsenic in water ranged from 200 to 500 microg/L. Using individual, self-reported data on daily consumption of rice and drinking water the total arsenic ADI was 1,176 microg (range: 419 to 2,053 microg), 14% attributable to inorganic arsenic in cooked rice. The ADI is a conservative estimate; vegetable arsenic was not included due to limitations in self-reported daily consumption amounts. Given the arsenic levels measured in food and water and consumption of these items, cooked rice and vegetables are a substantial exposure pathway for inorganic arsenic. Intervention strategies must consider all sources of dietary arsenic intake.
In a study conducted by the US Geological Survey and the Centers for Disease Control and Preventi... more In a study conducted by the US Geological Survey and the Centers for Disease Control and Prevention, 24 water samples were collected at selected locations within a drinking-water-treatment (DWT) facility and from the two streams that serve the facility to evaluate the potential for wastewater-related organic contaminants to survive a conventional treatment process and persist in potable-water supplies. Stream-water samples as well as samples of raw, settled, filtered, and finished water were collected during low-flow conditions, when the discharge of effluent from upstream municipal sewage-treatment plants accounted for 37-67% of flow in stream 1 and 10-20% of flow in stream 2. Each sample was analyzed for 106 organic wastewater-related contaminants (OWCs) that represent a diverse group of extensively used chemicals. Forty OWCs were detected in one or more samples of stream water or raw-water supplies in the treatment plant; 34 were detected in more than 10% of these samples. Several of these compounds also were frequently detected in samples of finished water; these compounds include selected prescription and non-prescription drugs and their metabolites, fragrance compounds, flame retardants and plasticizers, cosmetic compounds, and a solvent. The detection of these compounds suggests that they resist removal through conventional water-treatment processes. Other compounds that also were frequently detected in samples of stream water and rawwater supplies were not detected in samples of finished water; these include selected prescription and non-prescription drugs and their metabolites, disinfectants, detergent metabolites, and plant and animal steroids. 
The non-detection of these compounds indicates that their concentrations are reduced to levels less than analytical detection limits or that they are transformed to degradates through conventional DWT processes. Concentrations of OWCs detected in finished water generally were low and did not exceed Federal drinking-water standards or lifetime health advisories, although such standards or advisories have not been established for most of these compounds. Also, at least 11 and as many as 17 OWCs were detected in samples of finished water. Drinking-water criteria currently are based on the toxicity of individual compounds and not combinations of compounds. Little is known about potential human-health effects associated with chronic exposure to trace levels of multiple OWCs through routes such as drinking water. The occurrence in drinking-water supplies of many of the OWCs analyzed for during this study is unregulated and most
International Journal of Tropical Disease & Health, Sep 24, 2022
Background: In December 2019, the COVID-19 pandemic began in Wuhan and quickly spread within China and to other countries. The SARS-CoV-2 virus reached Bangladesh in March 2020, and the index case of the country's first COVID-19 cluster was reported on 13 March 2020 in Madaripur District. Methods: A team from the Institute of Epidemiology, Disease Control and Research (IEDCR) of the Ministry of Health and Family Welfare, Bangladesh, investigated the cluster, established active syndromic surveillance for respiratory diseases, and implemented control activities. Results: The index case traveled from Italy to Bangladesh, developed respiratory symptoms, and sought medical treatment in Dhaka. He was diagnosed with COVID-19 and transferred to and isolated in a hospital on the day of diagnosis. We followed up his contacts as soon as we obtained their names and contact information. We quarantined 34 of his 139 contacts; the remainder could not be traced. The attack rate among the index case's contacts was 18% (6/34).
Western Pacific Surveillance and Response, Sep 25, 2017
Problem: The Pacific region has widely dispersed populations, limited financial and human resources, and a high burden of disease. There is an urgent need to improve the availability, reliability, and timeliness of usable health data. Context: The purpose of this paper is to share lessons learnt from a three-year pilot field epidemiology training programme designed to respond to these Pacific health challenges. The pilot programme built on and further developed an existing field epidemiology training programme for Pacific health staff. Action: The programme was delivered in country by epidemiologists working for Pacific Public Health Surveillance Network partners. It consisted of five courses: four one-week classroom-based courses and one field epidemiology project. Sessions were structured so that theoretical understanding was achieved through interaction and reinforced through practical hands-on group activities, case studies, and other interactive learning methods. Outcome: As of September 2016, 258 students had commenced the programme. Twenty-six course workshops had been delivered, and one cohort of students had completed the full five-course programme. The programme proved popular and achieved a high level of student engagement. Discussion: Face-to-face delivery, a low student-to-facilitator ratio, substantial group work, and practical exercises were identified as key factors in developing the students' skills and confidence. Close engagement of leaders and the need to quickly evaluate and adapt the curriculum were important lessons, and collaboration between external partners was considered important for promoting a harmonized approach to health needs in the Pacific.
To identify risk factors for death or injury from landmines and ordnance in Kabul City, Afghanistan, so that programs can target preventive actions. Methods: Active surveillance in hospitals and communities for injuries and deaths from landmine and ordnance explosions in Kabul City. Results: Of the 571 people the authors identified during the 25-month period, 161 suffered a traumatic amputation and 94 were killed by a landmine or ordnance explosion. Of those asked, 19% of victims had received mine awareness education before the incident, and of those, the majority were injured while handling or playing with an explosive device. Most victims were young males with a few years of education. The occupation types most at risk were students and laborers, and unemployment was common among the victims. Collecting wood or paper and playing with or handling an explosive were the activities most frequently associated with injuries and deaths. Conclusions: From May 1996 to July 1998, explosions from landmines and ordnance claimed 571 victims and were an important preventable cause of injury and death among people in Kabul City. Prevention strategies should focus on high-risk groups and on changing risky behaviors, such as tampering with explosive devices.
Much progress has been made in recent years on estimating summary statistics from data subject to censoring of results that fall below the limit of detection (LOD) of the measuring instrument. Truncated-data methods (e.g., Tobit regression) and multiple imputation are two approaches for analyzing results below the LOD. Applying these methods requires an assumption about the underlying distribution of the data. Because the log-normal distribution has been shown to fit many data sets obtained from environmental measurements, the common practice is to assume that measurements of environmental factors can be described by log-normal distributions. This article describes methods for obtaining estimates of percentiles and their associated confidence intervals when the results are log-normal and a fraction of the results are below the LOD. We present limited simulations to demonstrate the bias of the proposed estimates and the coverage probability of their associated confidence intervals. The estimation methods are used to generate summary statistics for 2,3,7,8-tetrachlorodibenzo-p-dioxin (2,3,7,8-TCDD) using data from a 2001 background exposure study in which PCDDs/PCDFs/cPCBs in human blood serum were measured in a Louisiana population. Because the congener measurements used in this study were subject to variable LODs, we also present simulation results to demonstrate the effect of variable LODs on the multiple-imputation process.
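The imputation idea described in this abstract can be sketched in a few lines. This is not the article's method: the values are hypothetical, the log-normal parameters are fit naively to detected results only (which the article's censored-data methods are designed to correct), and a single imputation is shown where the article pools many:

```python
import math
import random
import statistics

# Hypothetical serum measurements; None marks a result below the LOD.
LOD = 1.0
raw = [None, None, 1.2, 1.5, 2.1, 2.8, 3.3, 4.0, 5.6, 7.9]

# Common shortcut: substitute LOD / sqrt(2) for every censored result.
substituted = [LOD / math.sqrt(2) if x is None else x for x in raw]

# Simple imputation under a log-normal assumption: fit mu and sigma to the
# logs of the detected values, then draw each censored result from the
# fitted distribution truncated above at the LOD (rejection sampling).
logs = [math.log(x) for x in raw if x is not None]
mu, sigma = statistics.mean(logs), statistics.stdev(logs)

def impute_below_lod(rng):
    """Draw from LogNormal(mu, sigma) conditional on being below the LOD."""
    while True:
        draw = math.exp(rng.gauss(mu, sigma))
        if draw < LOD:
            return draw

rng = random.Random(42)
imputed = [impute_below_lod(rng) if x is None else x for x in raw]

# Repeating the imputation M times and pooling the estimates (Rubin's
# rules) yields multiple-imputation percentiles and confidence intervals.
print(sorted(imputed)[len(imputed) // 2])  # crude median estimate
```

Fitting mu and sigma to detected values alone biases both upward when censoring is heavy, which is one reason the article evaluates bias and coverage by simulation rather than relying on this shortcut.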
Archives of Environmental & Occupational Health, Jul 1, 2005
To study the association between levels of lead in blood and bone among female former smelter workers in Bunker Hill, Idaho, the authors performed a longitudinal study using homeostatic regulators of calcium and biomarkers of bone turnover. The authors measured participants' blood lead levels (by graphite furnace atomic absorption spectrophotometry) and tibia-bone lead levels (by the 109Cd K x-ray fluorescence system) in 1994 and again in 2000; serum ionized calcium, parathyroid hormone, osteocalcin, urinary deoxypyridinoline, pyridinoline, and 1,25-dihydroxyvitamin D were also measured. After controlling for weight and age, significant predictors of changes in blood lead levels from 1994 to 2000 in postmenopausal women were duration of employment, higher ionized calcium levels, alcohol consumption, and higher parathyroid hormone levels. Predictors of change in tibia-bone lead levels in the same group of women were employment in a technical job such as mining and higher urinary pyridinoline levels (p < .05). Changes in blood and bone lead levels over time were associated with increased bone resorption, especially among postmenopausal women.
Papers by Alden Henderson