1.
Geohealth ; 3(7): 190-200, 2019 Jul.
Article in English | MEDLINE | ID: mdl-32159041

ABSTRACT

A growing literature has documented that rising concentrations of carbon dioxide in the atmosphere threaten to reduce the iron, zinc, and protein content of staple food crops including rice, wheat, barley, legumes, maize, and potatoes, potentially creating or worsening global nutritional deficiencies for over a billion people worldwide. A recent study extended these previous nutrient analyses to include B vitamins and found that, in rice alone, the average loss of the major B vitamins thiamin, riboflavin, and folate was 17-30% when grown under higher CO2. Here, we employ the EAR cut-point method, using estimates of national-level nutrient supplies and requirements, to estimate how B vitamin dietary adequacy may be affected by the CO2-induced loss of nutrients from rice only. Furthermore, we use the global burden of disease comparative risk assessment framework to quantify one small portion of the health burden related to rising deficiency: a higher likelihood of neural tube defects for folate-deficient mothers. We find that, as a result of this effect alone, risk of folate deficiency could rise by 1.5 percentage points (95% confidence interval: 0.6-2.6), corresponding to 132 million (57-239 million) people. Risk of thiamin deficiency could rise by 0.7 points (0.3-1.1), or 67 million people (30-110 million), and risk of riboflavin deficiency by 0.4 points (0.2-0.6), or 40 million people (22-59 million). Because elevated CO2 concentrations are likely to reduce B vitamins in other crops beyond rice, our findings likely represent an underestimate of the impact of anthropogenic CO2 emissions on the sufficiency of B vitamin intake.
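To make the arithmetic in the abstract above concrete, the following is a minimal Python sketch of the EAR cut-point idea and of converting a percentage-point rise in deficiency risk into a head count. The intake distribution, the assumed 25% folate loss, the assumed rice share of folate supply, and the population figure are hypothetical illustrations, not the study's data or model.

# Minimal sketch of the EAR cut-point method and the percentage-point to
# head-count conversion described in the abstract above. All numbers and
# the normal-intake assumption are hypothetical, not the study's model.
from statistics import NormalDist

def prevalence_of_inadequacy(mean_intake, sd_intake, ear):
    # EAR cut-point method: the share of the population whose usual intake
    # falls below the Estimated Average Requirement (EAR), assuming a
    # normal distribution of usual intakes.
    return NormalDist(mean_intake, sd_intake).cdf(ear)

# Hypothetical national folate supply (µg dietary folate equivalents/day).
baseline = prevalence_of_inadequacy(mean_intake=420, sd_intake=105, ear=320)

# Apply an assumed 25% folate loss in rice, modeled crudely as a
# proportional drop in total supply via an assumed 30% rice share.
rice_share = 0.30
elevated = prevalence_of_inadequacy(mean_intake=420 * (1 - 0.25 * rice_share),
                                    sd_intake=105, ear=320)

population = 90_000_000  # hypothetical national population
extra_points = (elevated - baseline) * 100
extra_people = (elevated - baseline) * population
print(f"Risk of folate deficiency rises by {extra_points:.1f} percentage points "
      f"({extra_people:,.0f} people)")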

2.
Geohealth ; 1(6): 248-257, 2017 Aug.
Article in English | MEDLINE | ID: mdl-32158990

ABSTRACT

Iron deficiency reduces capacity for physical activity, lowers IQ, and increases maternal and child mortality, affecting roughly a billion people worldwide. Recent studies have shown that certain highly consumed crops (C3 grains such as wheat, rice, and barley; legumes; and maize) have iron concentrations 4-10% lower when grown under increased atmospheric CO2 concentrations (550 ppm). We examined diets in 152 countries globally (95.5% of the population) to estimate the percentage of dietary iron lost as a result of anthropogenic CO2 emissions between now and 2050, specifically among vulnerable age-sex groups: children (1-5 years) and women of childbearing age (15-49 years), holding diets constant. We also cross-referenced these estimates with the current prevalence of anemia to identify the most at-risk countries. We found that 1.4 billion children aged 1-5 and women of childbearing age (59% of the global total for these groups) live in high-risk countries, where the prevalence of anemia exceeds 20% and the modeled loss in dietary iron falls in the most severe tertile (>3.8%). The countries with the highest anemia prevalence also derive their iron from the smallest number of foods, even after excluding countries consuming large amounts of unaccounted wild-harvest foods. The potential risk of increased iron deficiency adds greater incentive for mitigating anthropogenic CO2 emissions and highlights the need to address anticipated health impacts via improved health delivery systems, dietary behavioral changes, or agricultural innovation. Because these are effects on nutrient content rather than yield, it is unlikely that consumers will perceive this health threat and adapt to it without education.
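The high-risk screen described above reduces to a simple two-threshold filter. The sketch below illustrates it on an invented country table; only the thresholds (anemia prevalence above 20% and modeled iron loss above 3.8%, the most severe tertile) come from the abstract, while the column names and values are hypothetical.

# Illustrative filter for "high-risk" countries as defined in the abstract
# above: anemia prevalence > 20% and modeled dietary iron loss > 3.8%.
# The rows below are invented examples, not the study's country data.
import pandas as pd

countries = pd.DataFrame({
    "country":            ["A", "B", "C", "D"],
    "anemia_prev_pct":    [34.0, 18.5, 27.2, 41.0],  # anemia in vulnerable groups, %
    "iron_loss_pct":      [4.6, 2.1, 3.9, 5.2],      # modeled dietary iron loss, %
    "vulnerable_pop_mln": [120, 45, 60, 210],        # children 1-5 plus women 15-49
})

high_risk = countries[(countries["anemia_prev_pct"] > 20.0)
                      & (countries["iron_loss_pct"] > 3.8)]
print(high_risk)
print("Vulnerable people in high-risk countries (millions):",
      high_risk["vulnerable_pop_mln"].sum())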

3.
J Palliat Med ; 4(3): 325-32, 2001.
Article in English | MEDLINE | ID: mdl-11596543

ABSTRACT

In September 2000, the Quality Interagency Coordination (QuIC) Task Force invited the RAND Center to Improve Care of the Dying and Americans for Better Care of the Dying to testify at its National Summit on Medical Errors and Patient Safety Research. In their testimony, the organizations urged the QuIC to consider the special vulnerability and needs of individuals at the end of life in crafting its research agenda. Patients at the end of life are particularly vulnerable to medical errors and other lapses in patient safety for three reasons: (1) substantially increased exposure to medical errors; (2) more serious effects from errors, because they cannot protect themselves from risks and have less reserve with which to overcome the effects; and (3) pervasive patterns of care that run counter to well-substantiated evidence-based practices. A national research agenda on preventing medical errors and increasing patient safety must include a focus on how to remedy the shortcomings affecting these vulnerable patients. The QuIC's preliminary research agenda, released in October 2000, included patients nearing the end of life. The Agency for Healthcare Research and Quality, the lead federal agency for research on patient safety and medical errors, released six Requests for Applications for research into medical errors between November 2000 and April 2001.


Subject(s)
Health Services Research , Medication Errors/prevention & control , Palliative Care/standards , Safety Management , Terminal Care/standards , Chronic Disease/classification , Health Policy , Humans , Quality of Health Care , Terminally Ill , United States , United States Agency for Healthcare Research and Quality
4.
AIDS ; 7(7): 925-31, 1993 Jul.
Article in English | MEDLINE | ID: mdl-8357554

ABSTRACT

OBJECTIVE: Direct HIV testing of individual injecting drug users is not always feasible. As an alternative, we have evaluated the sensitivity and specificity of several techniques for detecting HIV-1-specific products in used syringes. DESIGN: Polymerase chain reaction (PCR) and antibody-capture assays were compared using syringes prepared with blood from HIV-1-positive and -negative individuals. METHODS: PCR sensitivity was maximized, enabling detection of single copies of HIV-1-specific proviral DNA. The limits of detection from used syringes were determined for PCR by diluting extracts and were correlated with CD4+ cell counts. Similarly, limits of detection were determined for enzyme immunoassays (EIA) and Western blot. RESULTS: All techniques were highly specific, although occasional false-positives were detected with PCR. EIA proved more sensitive than Western blot in detecting needles containing blood from HIV-1-infected individuals. Even after prolonged storage of syringes at room temperature, EIA was equal or superior to PCR as an HIV-1 detection technique. The most sensitive method for detecting HIV-1 was the viral-based EIA with the recommended predilution step omitted. CONCLUSIONS: EIA proved preferable to PCR because of its higher sensitivity, absence of false-positives, and easier sample preparation and analysis.
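For readers comparing the assays, the short sketch below shows how sensitivity and specificity are computed for each technique against syringes of known HIV-1 status; the counts are invented for illustration and are not the paper's results.

# Illustrative computation of sensitivity and specificity for two assays
# evaluated against syringes of known HIV-1 status. The counts are made up
# for demonstration and do not come from the paper.
def sensitivity(true_pos, false_neg):
    # Proportion of truly positive syringes the assay detects.
    return true_pos / (true_pos + false_neg)

def specificity(true_neg, false_pos):
    # Proportion of truly negative syringes the assay correctly rejects.
    return true_neg / (true_neg + false_pos)

# Hypothetical counts: syringes prepared with HIV-1-positive/-negative blood.
assays = {
    "EIA": {"tp": 46, "fn": 4, "tn": 50, "fp": 0},
    "PCR": {"tp": 39, "fn": 11, "tn": 48, "fp": 2},
}

for name, r in assays.items():
    print(f"{name}: sensitivity={sensitivity(r['tp'], r['fn']):.2f}, "
          f"specificity={specificity(r['tn'], r['fp']):.2f}")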


Subject(s)
DNA, Viral/isolation & purification , HIV Antibodies/analysis , HIV Seropositivity/diagnosis , Substance Abuse, Intravenous , Syringes , Blotting, Western , HIV-1/isolation & purification , Humans , Immunoenzyme Techniques , Polymerase Chain Reaction