Phenological sensitivity to climate across taxa and trophic levels.
Thackeray, Stephen J; Henrys, Peter A; Hemming, Deborah; Bell, James R; Botham, Marc S; Burthe, Sarah; Helaouet, Pierre; Johns, David G; Jones, Ian D; Leech, David I; Mackay, Eleanor B; Massimino, Dario; Atkinson, Sian; Bacon, Philip J; Brereton, Tom M; Carvalho, Laurence; Clutton-Brock, Tim H; Duck, Callan; Edwards, Martin; Elliott, J Malcolm; Hall, Stephen J G; Harrington, Richard; Pearce-Higgins, James W; Høye, Toke T; Kruuk, Loeske E B; Pemberton, Josephine M; Sparks, Tim H; Thompson, Paul M; White, Ian; Winfield, Ian J; Wanless, Sarah
2016-07-14
Differences in phenological responses to climate change among species can desynchronise ecological interactions and thereby threaten ecosystem function. To assess these threats, we must quantify the relative impact of climate change on species at different trophic levels. Here, we apply a Climate Sensitivity Profile approach to 10,003 terrestrial and aquatic phenological data sets, spatially matched to temperature and precipitation data, to quantify variation in climate sensitivity. The direction, magnitude and timing of climate sensitivity varied markedly among organisms within taxonomic and trophic groups. Despite this variability, we detected systematic variation in the direction and magnitude of phenological climate sensitivity. Secondary consumers showed consistently lower climate sensitivity than other groups. We used mid-century climate change projections to estimate that the timing of phenological events could change more for primary consumers than for species in other trophic levels (6.2 versus 2.5-2.9 days earlier on average), with substantial taxonomic variation (1.1-14.8 days earlier on average).
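At its simplest, the "climate sensitivity" quantified here is a regression slope of event timing against climate in a seasonal window, multiplied by a projected change to estimate a future shift. The sketch below illustrates only that core idea with simulated data and an assumed +1.5 °C warming; it is not the full Climate Sensitivity Profile approach used in the study.

```python
import numpy as np
from scipy import stats

# Hypothetical example: day-of-year of a phenological event over 30 years,
# paired with mean temperature (deg C) in the species' sensitive seasonal window.
rng = np.random.default_rng(0)
temp = 9.0 + 0.03 * np.arange(30) + rng.normal(0, 0.8, 30)        # gentle warming trend
event_doy = 130 - 4.0 * (temp - temp.mean()) + rng.normal(0, 3, 30)

# Phenological climate sensitivity = slope of event timing vs. temperature.
fit = stats.linregress(temp, event_doy)
print(f"sensitivity: {fit.slope:.1f} days per deg C (r^2 = {fit.rvalue**2:.2f})")

# Projected mid-century shift under an assumed warming of +1.5 deg C.
print(f"projected shift: {fit.slope * 1.5:.1f} days")
```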
NASA Astrophysics Data System (ADS)
Conte, Eric D.; Barry, Eugene F.; Rubinstein, Harry
1996-12-01
Certain individuals may be sensitive to specific compounds in consumer products. It is important to quantify these analytes in food products in order to monitor their intake. Caffeine is one such compound. Determination of caffeine in beverages by spectrophotometric procedures requires an extraction step, which can prove time-consuming. Although the corresponding determination by HPLC allows for direct injection, capillary zone electrophoresis provides several advantages, such as extremely low solvent consumption, smaller sample volume requirements, and improved sensitivity.
Plant phenolics and their potential role in mitigating iron overload disorder in wild animals.
Lavin, Shana R
2012-09-01
Phenolic compounds are bioactive chemicals found in all vascular plants but are difficult to characterize and quantify, and comparative analyses on these compounds are challenging due to chemical structure complexity and inconsistent laboratory methodologies employed historically. These chemicals can elicit beneficial or toxic effects in consumers, depending on the compound, dose and the species of the consumer. In particular, plant phenolic compounds such as tannins can reduce the utilization of iron in mammalian and avian consumers. Multiple zoo-managed wild animal species are sensitive to iron overload, and these species tend to be offered diets higher in iron than most of the plant browse consumed by these animals in the wild and in captivity. Furthermore, these animals likely consume diets higher in polyphenols in the wild as compared with in managed settings. Thus, in addition to reducing dietary iron concentrations in captivity, supplementing diets with phenolic compounds capable of safely chelating iron in the intestinal lumen may reduce the incidence of iron overload in these animal species. It is recommended to investigate various sources and types of phenolic compounds for use in diets intended for iron-sensitive species. Candidate compounds should be screened both in vitro and in vivo using model species to reduce the risk of toxicity in target species. In particular, it would be important to assess potential compounds in terms of 1) biological activity including iron-binding capacity, 2) accessibility, 3) palatability, and 4) physiological effects on the consumer, including changes in nutritional and antioxidant statuses.
Single molecule Raman spectroscopic assay to detect transgene from GM plants.
Kadam, Ulhas S; Chavhan, Rahul L; Schulz, Burkhard; Irudayaraj, Joseph
2017-09-01
Substantial concerns have been raised about the safety of transgenics for human health and the environment. Many organizations, consumer groups, and environmental agencies advocate for stringent regulations to prevent contamination of the food cycle or the environment by transgene products. Here we demonstrate a novel approach using surface enhanced Raman spectroscopy (SERS) to detect and quantify transgenes from GM plants. We show highly sensitive and accurate quantification of transgene DNA from multiple transgenic lines of Arabidopsis. The assay allows us to detect and quantify transgenes at levels as low as 0.10 pg without the need for PCR amplification. This technology is relatively cheap, quick, simple, and suitable for detection at low target concentrations. Copyright © 2017 Elsevier Inc. All rights reserved.
Romero, Peggy; Miller, Ted; Garakani, Arman
2009-12-01
Current methods to assess neurodegeneration in dorsal root ganglion cultures as a model for neurodegenerative diseases are imprecise and time-consuming. Here we describe two new methods to quantify neuroprotection in these cultures. The neurite quality index (NQI) builds upon earlier manual methods, incorporating additional morphological events to increase sensitivity for the detection of early degeneration events. Neurosight is a machine vision-based method that recapitulates many of the strengths of NQI while enabling high-throughput screening applications at decreased cost.
Duan, Lingfeng; Han, Jiwan; Guo, Zilong; Tu, Haifu; Yang, Peng; Zhang, Dong; Fan, Yuan; Chen, Guoxing; Xiong, Lizhong; Dai, Mingqiu; Williams, Kevin; Corke, Fiona; Doonan, John H; Yang, Wanneng
2018-01-01
Dynamic quantification of drought response is a key issue both for variety selection and for functional genetic study of rice drought resistance. Traditional assessment of drought resistance traits, such as stay-green and leaf-rolling, has utilized manual measurements that are often subjective, error-prone, poorly quantified and time-consuming. To relieve this phenotyping bottleneck, we demonstrate a feasible, robust and non-destructive method that dynamically quantifies response to drought, under both controlled and field conditions. Firstly, RGB images of individual rice plants at different growth points were analyzed to derive 4 features that were influenced by imposition of drought. These include a feature related to the ability to stay green, which we termed greenness plant area ratio (GPAR), and 3 shape descriptors [total plant area/bounding rectangle area ratio (TBR), perimeter area ratio (PAR) and total plant area/convex hull area ratio (TCR)]. Experiments showed that these 4 features were capable of discriminating reliably between drought-resistant and drought-sensitive accessions, and dynamically quantifying the drought response under controlled conditions across time (at either daily or half-hourly time intervals). We compared the 3 shape descriptors and concluded that PAR was more robust and sensitive to leaf-rolling than the other shape descriptors. In addition, PAR and GPAR proved to be effective in quantification of drought response in the field. Moreover, the values obtained in field experiments using the collection of rice varieties were correlated with those derived from pot-based experiments. The general applicability of the algorithms is demonstrated by their ability to probe archival Miscanthus data previously collected on an independent platform. In conclusion, this image-based technology is robust, providing a platform-independent tool for quantifying drought response that should be of general utility for breeding and functional genomics in the future.
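The four image features are described concretely enough to sketch. The following is an illustrative re-implementation only: the excess-green segmentation, the thresholds, and the "still green" definition used for GPAR are assumptions, not the published pipeline.

```python
import numpy as np
from scipy.spatial import ConvexHull

def drought_shape_features(rgb):
    """Illustrative version of GPAR, TBR, PAR and TCR for an RGB image of a
    single plant; segmentation and greenness thresholds are assumptions."""
    r, g, b = (rgb[..., i].astype(float) for i in range(3))
    plant = (2 * g - r - b) > 20               # crude excess-green segmentation
    green = plant & (g > 1.1 * r)              # pixels that still look green

    ys, xs = np.nonzero(plant)
    plant_area = plant.sum()
    bounding_area = (ys.max() - ys.min() + 1) * (xs.max() - xs.min() + 1)

    # Perimeter approximated as the count of plant pixels touching background.
    pad = np.pad(plant, 1)
    interior = (pad[1:-1, 1:-1] & pad[:-2, 1:-1] & pad[2:, 1:-1]
                & pad[1:-1, :-2] & pad[1:-1, 2:])
    perimeter = plant_area - interior.sum()

    convex_area = ConvexHull(np.column_stack([xs, ys])).volume   # 2-D hull area

    return {"GPAR": green.sum() / plant_area,     # greenness plant area ratio
            "TBR": plant_area / bounding_area,    # area / bounding rectangle
            "PAR": perimeter / plant_area,        # perimeter / area (leaf rolling)
            "TCR": plant_area / convex_area}      # area / convex hull area

# Tiny synthetic demo: a green rectangle standing in for a plant.
img = np.zeros((60, 60, 3), dtype=np.uint8)
img[20:40, 15:45, 1] = 180
img[20:40, 15:45, 0] = 60
print(drought_shape_features(img))
```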
IsoWeb: A Bayesian Isotope Mixing Model for Diet Analysis of the Whole Food Web
Kadoya, Taku; Osada, Yutaka; Takimoto, Gaku
2012-01-01
Quantitative description of food webs provides fundamental information for the understanding of population, community, and ecosystem dynamics. Recently, stable isotope mixing models have been widely used to quantify dietary proportions of different food resources to a focal consumer. Here we propose a novel mixing model (IsoWeb) that estimates diet proportions of all consumers in a food web based on stable isotope information. IsoWeb requires a topological description of a food web, and stable isotope signatures of all consumers and resources in the web. A merit of IsoWeb is that it takes into account variation in trophic enrichment factors among different consumer-resource links. Sensitivity analysis using realistic hypothetical food webs suggests that IsoWeb is applicable to a wide variety of food webs differing in the number of species, connectance, sample size, and data variability. Sensitivity analysis based on real topological webs showed that IsoWeb can allow for a certain level of topological uncertainty in target food webs, including erroneously assuming false links, omission of existent links and species, and trophic aggregation into trophospecies. Moreover, using an illustrative application to a real food web, we demonstrated that IsoWeb can compare the plausibility of different candidate topologies for a focal web. These results suggest that IsoWeb provides a powerful tool to analyze food-web structure from stable isotope data. We provide R and BUGS codes to aid efficient applications of IsoWeb. PMID:22848427
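The mixing-model principle behind IsoWeb can be shown in its simplest textbook form: a single consumer, two sources, and one isotope. The sketch below is only that special case with an assumed trophic enrichment factor; IsoWeb itself estimates diet proportions for every consumer in the web simultaneously via MCMC (R and BUGS code supplied by the authors).

```python
def two_source_mix(d_consumer, d_source1, d_source2, tef=3.4):
    """Single-isotope, two-source mixing calculation; `tef` is an assumed
    trophic enrichment factor applied to the consumer signature."""
    p1 = (d_consumer - tef - d_source2) / (d_source1 - d_source2)
    p1 = min(1.0, max(0.0, p1))
    return p1, 1.0 - p1

# Example: a consumer whose corrected signature lies between two resources.
p_algae, p_detritus = two_source_mix(d_consumer=8.0, d_source1=6.0, d_source2=2.0)
print(f"algae: {p_algae:.2f}, detritus: {p_detritus:.2f}")
```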
Measurements of degree of sensitization (DoS) in aluminum alloys using EMAT ultrasound.
Li, Fang; Xiang, Dan; Qin, Yexian; Pond, Robert B; Slusarski, Kyle
2011-07-01
Sensitization in 5XXX aluminum alloys is an insidious problem characterized by the gradual formation and growth of beta phase (Mg2Al3) at grain boundaries, which increases the susceptibility of alloys to intergranular corrosion (IGC) and intergranular stress-corrosion cracking (IGSCC). The degree of sensitization (DoS) is currently quantified by the ASTM G67 Nitric Acid Mass Loss Test, which is destructive and time-consuming. A fast, reliable, and non-destructive method for rapid detection and assessment of the condition of DoS in AA5XXX aluminum alloys in the field is highly desirable. In this paper, we describe a non-destructive method for measurements of DoS in aluminum alloys with an electromagnetic acoustic transducer (EMAT). AA5083 aluminum alloy samples were sensitized at 100°C with processing times varying from 7 days to 30 days. The DoS of sensitized samples was first quantified with the ASTM G67 test in the laboratory. Both ultrasonic velocity and attenuation in sensitized specimens were then measured using EMAT and the results were correlated with the DoS data. We found that the longitudinal wave velocity was almost constant, independent of the sensitization, which suggests that the longitudinal wave can be used to determine the sample thickness. The shear wave velocity and especially the shear wave attenuation are sensitive to DoS. Relationships between DoS and the shear velocity, as well as the shear attenuation, have been established. Finally, we performed data mining to evaluate and improve the accuracy of DoS measurements in aluminum alloys with EMAT. Copyright © 2010 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Khawli, Toufik Al; Gebhardt, Sascha; Eppelt, Urs; Hermanns, Torsten; Kuhlen, Torsten; Schulz, Wolfgang
2016-06-01
In production industries, parameter identification, sensitivity analysis and multi-dimensional visualization are vital steps in the planning process for achieving optimal designs and gaining valuable information. Sensitivity analysis and visualization can help in identifying the most-influential parameters and quantify their contribution to the model output, reduce the model complexity, and enhance the understanding of the model behavior. Typically, this requires a large number of simulations, which can be both very expensive and time consuming when the simulation models are numerically complex and the number of parameter inputs increases. There are three main constituent parts in this work. The first part is to substitute the numerical, physical model by an accurate surrogate model, the so-called metamodel. The second part includes a multi-dimensional visualization approach for the visual exploration of metamodels. In the third part, the metamodel is used to provide the two global sensitivity measures: i) the Elementary Effect for screening the parameters, and ii) the variance decomposition method for calculating the Sobol indices that quantify both the main and interaction effects. The application of the proposed approach is illustrated with an industrial application with the goal of optimizing a drilling process using a Gaussian laser beam.
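The variance-decomposition step (first-order Sobol indices) can be estimated with a generic pick-freeze Monte Carlo scheme. The sketch below uses a standard sensitivity test function rather than the laser-drilling metamodel described in the abstract, and does not include the Elementary Effects screening or the metamodel construction.

```python
import numpy as np

def first_order_sobol(model, n_params, n=100_000, seed=1):
    """Pick-freeze Monte Carlo estimate of first-order Sobol indices for a
    model with independent inputs on [0, 1]; a generic sketch only."""
    rng = np.random.default_rng(seed)
    A, B = rng.random((n, n_params)), rng.random((n, n_params))
    yA, yB = model(A), model(B)
    var_y = yA.var()
    s1 = []
    for i in range(n_params):
        BAi = B.copy()
        BAi[:, i] = A[:, i]                  # freeze parameter i from sample A
        s1.append(np.mean(yA * (model(BAi) - yB)) / var_y)
    return np.array(s1)

def ishigami(u):                             # standard sensitivity test function
    x = 2 * np.pi * (u - 0.5)
    return np.sin(x[:, 0]) + 7 * np.sin(x[:, 1])**2 + 0.1 * x[:, 2]**4 * np.sin(x[:, 0])

print(np.round(first_order_sobol(ishigami, 3), 2))   # roughly [0.31, 0.44, 0.00]
```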
Kuo, Tony; Jarosz, Christopher J; Simon, Paul; Fielding, Jonathan E
2009-09-01
We conducted a health impact assessment to quantify the potential impact of a state menu-labeling law on population weight gain in Los Angeles County, California. We utilized published and unpublished data to model consumer response to point-of-purchase calorie postings at large chain restaurants in Los Angeles County. We conducted sensitivity analyses to account for uncertainty in consumer response and in the total annual revenue, market share, and average meal price of large chain restaurants in the county. Assuming that 10% of the restaurant patrons would order reduced-calorie meals in response to calorie postings, resulting in an average reduction of 100 calories per meal, we estimated that menu labeling would avert 40.6% of the 6.75 million pound average annual weight gain in the county population aged 5 years and older. Substantially larger impacts would be realized if higher percentages of patrons ordered reduced-calorie meals or if average per-meal calorie reductions increased. Our findings suggest that mandated menu labeling could have a sizable salutary impact on the obesity epidemic, even with only modest changes in consumer behavior.
Multiple products monitoring as a robust approach for peptide quantification.
Baek, Je-Hyun; Kim, Hokeun; Shin, Byunghee; Yu, Myeong-Hee
2009-07-01
Quantification of target peptides and proteins is crucial for biomarker discovery. Approaches such as selected reaction monitoring (SRM) and multiple reaction monitoring (MRM) rely on liquid chromatography and mass spectrometric analysis of defined peptide product ions. These methods are not very widespread because the determination of quantifiable product ions using either SRM or MRM is a very time-consuming process. We developed a novel approach for quantifying target peptides without such an arduous process of ion selection. This method is based on monitoring multiple product ions (multiple products monitoring: MpM) from full-range MS2 spectra of a target precursor. The MpM method uses a scoring system that considers both the absolute intensities of product ions and the similarities between the query MS2 spectrum and the reference MS2 spectrum of the target peptide. Compared with conventional approaches, MpM greatly improves the sensitivity and selectivity of peptide quantification using an ion-trap mass spectrometer.
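A toy illustration of the scoring idea follows: combine the absolute intensity of matched product ions with the similarity of the query spectrum to a reference MS2 spectrum. The m/z tolerance and the multiplicative weighting below are assumptions, not the published MpM scoring system.

```python
import numpy as np

def mpm_like_score(query_mz, query_int, ref_mz, ref_int, tol=0.5):
    """Toy score combining matched-ion abundance with spectral similarity;
    tolerance and weighting are illustrative assumptions."""
    query_mz, query_int = np.asarray(query_mz), np.asarray(query_int, float)
    ref_int = np.asarray(ref_int, float)
    matched = np.array([query_int[np.abs(query_mz - mz) <= tol].sum() for mz in ref_mz])
    similarity = matched @ ref_int / (np.linalg.norm(matched) * np.linalg.norm(ref_int) + 1e-12)
    abundance = np.log10(matched.sum() + 1.0)
    return similarity * abundance

# Toy spectra: (m/z, intensity) lists for a reference and a query MS2 scan.
ref_mz, ref_int = [175.1, 286.2, 399.3], [100.0, 60.0, 30.0]
qry_mz, qry_int = [175.1, 286.3, 401.0, 450.2], [8000.0, 4500.0, 50.0, 20.0]
print(f"score: {mpm_like_score(qry_mz, qry_int, ref_mz, ref_int):.2f}")
```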
Mokhtari, Amirhossein; Moore, Christina M; Yang, Hong; Jaykus, Lee-Ann; Morales, Roberta; Cates, Sheryl C; Cowen, Peter
2006-06-01
We describe a one-dimensional probabilistic model of the role of domestic food handling behaviors on salmonellosis risk associated with the consumption of eggs and egg-containing foods. Six categories of egg-containing foods were defined based on the amount of egg contained in the food, whether eggs are pooled, and the degree of cooking practiced by consumers. We used bootstrap simulation to quantify uncertainty in risk estimates due to sampling error, and sensitivity analysis to identify key sources of variability and uncertainty in the model. Because of typical model characteristics such as nonlinearity, interaction between inputs, thresholds, and saturation points, Sobol's method, a novel sensitivity analysis approach, was used to identify key sources of variability. Based on the mean probability of illness, examples of foods from the food categories ranked from most to least risk of illness were: (1) home-made salad dressings/ice cream; (2) fried eggs/boiled eggs; (3) omelettes; and (4) baked foods/breads. For food categories that may include uncooked eggs (e.g., home-made salad dressings/ice cream), consumer handling conditions such as storage time and temperature after food preparation were the key sources of variability. In contrast, for food categories associated with undercooked eggs (e.g., fried/soft-boiled eggs), the initial level of Salmonella contamination and the log10 reduction due to cooking were the key sources of variability. Important sources of uncertainty varied with both the risk percentile and the food category under consideration. This work adds to previous risk assessments focused on egg production and storage practices, and provides a science-based approach to inform consumer risk communications regarding safe egg handling practices.
Price sensitivity and innovativeness for fashion among Korean consumers.
Goldsmith, Ronald E; Kim, Daekwan; Flynn, Leisa R; Kim, Wan-Min
2005-10-01
Price sensitivity is how consumers react to price levels and to price changes. Consumer innovativeness is a tendency to welcome and to adopt new products. Researchers (e.g., R. E. Goldsmith & S. J. Newell, 1997) consider innovative consumers relatively more price insensitive than other consumers, so there should be a negative correlation between measures of these constructs. The results of the present study supported the psychometric soundness of a self-report measure of price sensitivity among 860 Korean consumers and replicated earlier findings of the negative correlation between the 2 constructs.
Krapfenbauer, Kurt
2017-12-01
Diabetes mellitus develops and progresses as a consequence of complex and gradual processes involving a variety of alterations of the endocrine pancreas, which mainly result in beta cell failure. These molecular alterations can be found in the bloodstream, which suggests that specific biomarkers could be quantified in plasma or serum by very sensitive methods before diabetes mellitus is diagnosed. However, classical methods of protein analysis such as electrophoresis, Western blot, ELISA, and liquid chromatography are generally time-consuming, lab-intensive, and not sensitive enough to detect such alterations in the pre-symptomatic state of the disease. A novel, very sensitive analytical detection conjugate system combining polyfluorophore technology with a protein microchip method was developed. This innovative system enables very sensitive microchip assays that measure selected biomarkers in a small sample volume (10 μL) with much higher sensitivity (92%) compared to common immunoassay systems. Further advantages of this technology are miniaturization and faster quantification (around 10 min). It offers great promise for point-of-care clinical testing and monitoring of specific biomarkers for diabetes at the femtogram level in serum or plasma. In conclusion, the results indicate that the technical performance of this new technology is valid and that the assay is able to quantify PPY-specific antigens in plasma at femtogram levels, which can be used to identify beta cell dysfunction at the pre-symptomatic stage of diabetes mellitus.
Menu Labeling as a Potential Strategy for Combating the Obesity Epidemic: A Health Impact Assessment
Jarosz, Christopher J.; Simon, Paul; Fielding, Jonathan E.
2009-01-01
Objectives. We conducted a health impact assessment to quantify the potential impact of a state menu-labeling law on population weight gain in Los Angeles County, California. Methods. We utilized published and unpublished data to model consumer response to point-of-purchase calorie postings at large chain restaurants in Los Angeles County. We conducted sensitivity analyses to account for uncertainty in consumer response and in the total annual revenue, market share, and average meal price of large chain restaurants in the county. Results. Assuming that 10% of the restaurant patrons would order reduced-calorie meals in response to calorie postings, resulting in an average reduction of 100 calories per meal, we estimated that menu labeling would avert 40.6% of the 6.75 million pound average annual weight gain in the county population aged 5 years and older. Substantially larger impacts would be realized if higher percentages of patrons ordered reduced-calorie meals or if average per-meal calorie reductions increased. Conclusions. Our findings suggest that mandated menu labeling could have a sizable salutary impact on the obesity epidemic, even with only modest changes in consumer behavior. PMID:19608944
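The headline estimate follows from simple arithmetic once the number of affected meals is known. The sketch below re-creates the shape of that calculation with hypothetical inputs: the annual meal count is not the study's figure and is chosen only so the illustration lands near the reported scale; the 3500 kcal-per-pound conversion is the usual convention.

```python
# Back-of-envelope version of the calculation described above (hypothetical inputs).
annual_meals = 950_000_000        # assumed large-chain meals sold per year in the county
responder_share = 0.10            # 10% of patrons order a reduced-calorie meal
calorie_cut_per_meal = 100        # average reduction among responders (kcal)
kcal_per_pound = 3500             # conventional conversion

pounds_averted = annual_meals * responder_share * calorie_cut_per_meal / kcal_per_pound
baseline_gain_lb = 6_750_000      # county-wide average annual weight gain (from the abstract)
print(f"averted: {pounds_averted:,.0f} lb, "
      f"{100 * pounds_averted / baseline_gain_lb:.1f}% of the baseline gain")
```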
PIGE as a screening tool for Per- and polyfluorinated substances in papers and textiles
NASA Astrophysics Data System (ADS)
Ritter, Evelyn E.; Dickinson, Margaret E.; Harron, John P.; Lunderberg, David M.; DeYoung, Paul A.; Robel, Alix E.; Field, Jennifer A.; Peaslee, Graham F.
2017-09-01
Per- and polyfluoroalkyl substances (PFASs) comprise a large array of man-made fluorinated chemicals. It is an emerging chemical class of concern because many PFASs are environmentally persistent and some have known ecological and human toxicity. Consumer products treated with PFASs result in human exposure to PFASs through inhalation, ingestion, and environmental exposure to emissions from wastewater or from landfills. A rapid screening method based on total fluorine was developed and applied to quantify PFASs on consumer papers and textiles. Particle-Induced Gamma Ray Emission (PIGE) spectroscopy provides a non-destructive and quantitative measurement of total fluorine on papers and textiles. This technique is both rapid and sensitive, with a limit of detection (LOD) of 13 nmol F/cm² for papers and 24-45 nmol F/cm² for textiles, with reproducibility of ±12% RSD for both. PIGE is a high throughput (>20 samples/hr typically) method that was applied to 50 papers and 50 textiles in commerce to demonstrate the method.
A simple and sensitive enzymatic method for cholesterol quantification in macrophages and foam cells
Robinet, Peggy; Wang, Zeneng; Hazen, Stanley L.; Smith, Jonathan D.
2010-01-01
A precise and sensitive method for measuring cellular free and esterified cholesterol is required in order to perform studies of macrophage cholesterol loading, metabolism, storage, and efflux. Until now, the use of an enzymatic cholesterol assay, commonly used for aqueous phase plasma cholesterol assays, has not been optimized for use with solid phase samples such as cells, due to inefficient solubilization of total cholesterol in enzyme compatible solvents. We present an efficient solubilization protocol compatible with an enzymatic cholesterol assay that does not require chemical saponification or chromatographic separation. Another issue with enzyme compatible solvents is the presence of endogenous peroxides that interfere with the enzymatic cholesterol assay. We overcame this obstacle by pretreatment of the reaction solution with the enzyme catalase, which consumed endogenous peroxides resulting in reduced background and increased sensitivity in our method. Finally, we demonstrated that this method for cholesterol quantification in macrophages yields results that are comparable to those measured by stable isotope dilution gas chromatography with mass spectrometry detection. In conclusion, we describe a sensitive, simple, and high-throughput enzymatic method to quantify cholesterol in complex matrices such as cells. PMID:20688754
Quantifying learning in biotracer studies.
Brown, Christopher J; Brett, Michael T; Adame, Maria Fernanda; Stewart-Koster, Ben; Bunn, Stuart E
2018-04-12
Mixing models have become requisite tools for analyzing biotracer data, most commonly stable isotope ratios, to infer dietary contributions of multiple sources to a consumer. However, Bayesian mixing models will always return a result that defaults to their priors if the data poorly resolve the source contributions, and thus, their interpretation requires caution. We describe an application of information theory to quantify how much has been learned about a consumer's diet from new biotracer data. We apply the approach to two example data sets. We find that variation in the isotope ratios of sources limits the precision of estimates for the consumer's diet, even with a large number of consumer samples. Thus, the approach which we describe is a type of power analysis that uses a priori simulations to find an optimal sample size. Biotracer data are fundamentally limited in their ability to discriminate consumer diets. We suggest that other types of data, such as gut content analysis, must be used as prior information in model fitting, to improve model learning about the consumer's diet. Information theory may also be used to identify optimal sampling protocols in situations where sampling of consumers is limited due to expense or ethical concerns.
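A common information-theoretic way to quantify "how much has been learned" is the Kullback-Leibler divergence from the prior to the posterior; the sketch below assumes that framing (the paper's exact measure may differ) and uses simulated samples of a single diet proportion.

```python
import numpy as np

def kl_from_samples(posterior, prior, bins=30, eps=1e-12):
    """KL divergence (nats) between histogram estimates of posterior and prior
    samples of a diet proportion on [0, 1]; a generic sketch only."""
    edges = np.linspace(0, 1, bins + 1)
    p = np.histogram(posterior, bins=edges)[0].astype(float)
    q = np.histogram(prior, bins=edges)[0].astype(float)
    p = (p + eps) / (p + eps).sum()
    q = (q + eps) / (q + eps).sum()
    return float(np.sum(p * np.log(p / q)))

rng = np.random.default_rng(2)
prior = rng.uniform(0, 1, 50_000)       # uninformative prior on one diet proportion
posterior = rng.beta(8, 4, 50_000)      # hypothetical posterior after fitting a mixing model
print(f"information gained: {kl_from_samples(posterior, prior):.2f} nats")
```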
Consumer trophic diversity as a fundamental mechanism linking predation and ecosystem functioning.
Hines, Jes; Gessner, Mark O
2012-11-01
1. Primary production and decomposition, two fundamental processes determining the functioning of ecosystems, may be sensitive to changes in biodiversity and food web interactions. 2. The impacts of food web interactions on ecosystem functioning are generally quantified by experimentally decoupling these linked processes and examining either primary production-based (green) or decomposition-based (brown) food webs in isolation. This decoupling may strongly limit our ability to assess the importance of food web interactions on ecosystem processes. 3. To evaluate how consumer trophic diversity mediates predator effects on ecosystem functioning, we conducted a mesocosm experiment and a field study using an assemblage of invertebrates that naturally co-occur on North Atlantic coastal saltmarshes. We measured the indirect impact of predation on primary production and leaf decomposition as a result of prey communities composed of herbivores alone, detritivores alone or both prey in combination. 4. We find that primary consumers can influence ecosystem process rates not only within, but also across green and brown sub-webs. Moreover, by feeding on a functionally diverse consumer assemblage comprised of both herbivores and detritivores, generalist predators can diffuse consumer effects on decomposition, primary production and feedbacks between the two processes. 5. These results indicate that maintaining functional diversity among primary consumers can alter the consequences of traditional trophic cascades, and they emphasize the role of the detritus-based sub-web when seeking key biotic drivers of plant production. Clearly, traditional compartmentalization of empirical food webs can limit our ability to predict the influence of food web interactions on ecosystem functioning. © 2012 The Authors. Journal of Animal Ecology © 2012 British Ecological Society.
Schmidt, Debra A; Iambana, R Bernard; Britt, Adam; Junge, Randall E; Welch, Charles R; Porton, Ingrid J; Kerley, Monty S
2010-01-01
The purpose of this study was to quantify the concentrations of crude protein, fat, ash, neutral detergent fiber, acid detergent fiber, lignin, nonstructural carbohydrates, and gross energy in plant foods consumed by wild black and white ruffed lemurs (Varecia variegata). Calcium, phosphorous, magnesium, potassium, sodium, iron, zinc, copper, manganese, molybdenum, and selenium concentrations were also determined. A total of 122 samples from 33 plant families and more than 60 species were collected and analyzed for their nutritional content. The specific nutrient needs of black and white ruffed lemurs are unknown, but quantifying the nutritional composition of the foods they consume in the wild will help nutritionists and veterinarians formulate more appropriate diets for captive ruffed lemurs. This information will also supply information on how man-induced habitat changes affect the nutritional composition of foods consumed by free-ranging lemurs. (c) 2009 Wiley-Liss, Inc.
Seong-Hoon Cho; Michael Bowker; Roland K. Roberts; Seunggyu Kim; Taeyoung Kim; Dayton M. Lambert
2015-01-01
This research quantifies changes in consumer welfare due to changes in visitor satisfaction with the availability of information about recreational sites. The authors tested the hypothesis that an improvement in visitor satisfaction with recreation information increases the number of visits to national forests, resulting in increased consumer welfare. They...
Quantifying and Disaggregating Consumer Purchasing Behavior for Energy Systems Modeling
Consumer behaviors such as energy conservation, adoption of more efficient technologies, and fuel switching represent significant potential for greenhouse gas mitigation. Current efforts to model future energy outcomes have tended to use simplified economic assumptions ...
Quantifying the Release of Silver from Nanotechnology-Based Consumer Products for Children
We assessed the potential for children’s exposure to bioavailable silver during the realistic use of selected nanotechnology-based consumer products (plush toy, fabric products, breast milk storage bags, sippy cups, cleaning products). All products had at least one componen...
A Bayesian Multi-Level Factor Analytic Model of Consumer Price Sensitivities across Categories
ERIC Educational Resources Information Center
Duvvuri, Sri Devi; Gruca, Thomas S.
2010-01-01
Identifying price sensitive consumers is an important problem in marketing. We develop a Bayesian multi-level factor analytic model of the covariation among household-level price sensitivities across product categories that are substitutes. Based on a multivariate probit model of category incidence, this framework also allows the researcher to…
Liu, Changqi; Chhabra, Guneet S; Zhao, Jing; Zaffran, Valerie D; Gupta, Sahil; Roux, Kenneth H; Gradziel, Thomas M; Sathe, Shridhar K
2017-10-01
A commercially available monoclonal antibody (mAb)-based direct sandwich enzyme-linked immunosorbent assay (ELISA) kit (BioFront Technologies, Tallahassee, Fla., U.S.A.) was compared with an in-house developed mAb 4C10-based ELISA for almond detection. The assays were comparable in sensitivity (limit of detection < 1 ppm full fat almond, limit of quantification < 5 ppm full fat almond), specificity (no cross-reactivity with 156 tested foods at a concentration of 100000 ppm whole sample), and reproducibility (intra- and interassay variability < 15% CV). The target antigens were stable and detectable in whole almond seeds subjected to autoclaving, blanching, frying, microwaving, and dry roasting. The almond recovery ranges for spiked food matrices were 84.3% to 124.6% for 4C10 ELISA and 81.2% to 127.4% for MonoTrace ELISA. The almond recovery ranges for commercial and laboratory prepared foods with declared/known almond amounts were 30.9% to 161.2% for 4C10 ELISA and 38.1% to 207.6% for MonoTrace ELISA. Neither assay registered any false-positive or false-negative results among the tested commercial and laboratory prepared samples. The ability to detect and quantify trace amounts of almond is important for improving the safety of almond-sensitive consumers. Two monoclonal antibody-based ELISAs were compared for almond detection. The information is useful to the food industry, regulatory agencies, the scientific community, and almond consumers. © 2017 Institute of Food Technologists®.
Dietary species richness as a measure of food biodiversity and nutritional quality of diets
Raneri, Jessica E.; Smith, Katherine Walker; Kolsteren, Patrick; Van Damme, Patrick; Verzelen, Kaat; Penafiel, Daniela; Vanhove, Wouter; Kennedy, Gina; Hunter, Danny; Odhiambo, Francis Oduor; Ntandou-Bouzitou, Gervais; De Baets, Bernard; Ratnasekera, Disna; Ky, Hoang The; Remans, Roseline; Termote, Céline
2018-01-01
Biodiversity is key for human and environmental health. Available dietary and ecological indicators are not designed to assess the intricate relationship between food biodiversity and diet quality. We applied biodiversity indicators to dietary intake data and assessed associations with the diet quality of women and young children. Data from 24-hour diet recalls (55% in the wet season) of n = 6,226 participants (34% women) in rural areas from seven low- and middle-income countries were analyzed. Mean adequacies of vitamin A, vitamin C, folate, calcium, iron, and zinc and diet diversity score (DDS) were used to assess diet quality. Associations of biodiversity indicators with nutrient adequacy were quantified using multilevel models, receiver operating characteristic curves, and test sensitivity and specificity. A total of 234 different species were consumed, of which <30% were consumed in more than one country. Nine species were consumed in all countries and provided, on average, 61% of total energy intake and a significant contribution of micronutrients in the wet season. Compared with Simpson’s index of diversity and functional diversity, species richness (SR) showed stronger associations and better diagnostic properties with micronutrient adequacy. For every additional species consumed, dietary nutrient adequacy increased by 0.03 (P < 0.001). Diets with higher nutrient adequacy were mostly obtained when both SR and DDS were maximal. Adding SR to the minimum cutoff for minimum diet diversity improved the ability to detect diets with higher micronutrient adequacy in women but not in children. Dietary SR is recommended as the most appropriate measure of food biodiversity in diets. PMID:29255049
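The ROC/AUC and cutoff analysis described here can be reproduced generically with scikit-learn. The data below are simulated and the adequacy cutoff is an assumption; this is only a sketch of the diagnostic-property calculation, not the study's multilevel analysis.

```python
import numpy as np
from sklearn.metrics import roc_auc_score, roc_curve

# Simulated stand-in data: per-respondent dietary species richness (SR) and a
# binary indicator of adequate mean micronutrient intake (cutoff assumed).
rng = np.random.default_rng(3)
species_richness = rng.poisson(6, 500)
p_adequate = 1 / (1 + np.exp(-(species_richness - 6)))
adequate = (rng.random(500) < p_adequate).astype(int)

auc = roc_auc_score(adequate, species_richness)
fpr, tpr, thresholds = roc_curve(adequate, species_richness)
best = np.argmax(tpr - fpr)                          # Youden's J to pick a cutoff
print(f"AUC = {auc:.2f}; SR cutoff = {thresholds[best]}, "
      f"sensitivity = {tpr[best]:.2f}, specificity = {1 - fpr[best]:.2f}")
```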
Identity theft and consumers' reaction to preventive technological innovations.
Ainscough, Thomas L; Brody, Richard G; Trocchia, Philip J
2007-08-01
The use of identification technology by commercial entities has broad and, for some consumers, disturbing social implications. This two-phase study was done to specify consumers' concerns regarding various identification technologies which may be encountered in retail environments. From the qualitative findings, a 26-item survey was constructed to quantify identified areas of concern with 303 survey participants (147 women and 156 men), whose mean age category was 30 to 39 years. Using exploratory factor analysis (principal components with varimax rotation), five dimensions of consumers' concern emerged: privacy, ethics, health, humanity, and complexity.
Swan, Melanie
2009-01-01
A new class of patient-driven health care services is emerging to supplement and extend traditional health care delivery models and empower patient self-care. Patient-driven health care can be characterized as having an increased level of information flow, transparency, customization, collaboration and patient choice and responsibility-taking, as well as quantitative, predictive and preventive aspects. The potential exists to both improve traditional health care systems and expand the concept of health care though new services. This paper examines three categories of novel health services: health social networks, consumer personalized medicine and quantified self-tracking. PMID:19440396
Oh, Woon Yong; Lee, Ji Woong; Lee, Chong Eon; Ko, Moon Seok; Jeong, Jae Hong
2009-12-01
In this study, a structured survey questionnaire was used to determine consumers' preferences and behavior with regard to horse meat at a horse meat restaurant located in Jeju, Korea, from October 1 to December 24, 2005. The questionnaire employed in this study consisted of 20 questions designed to characterize six general attributes: horse meat sensory property, physical appearance, health condition, origin, price, and other attributes. Of the 1370 questionnaires distributed, 1126 completed questionnaires were retained based on the completeness of the answers, representing an 82.2% response rate. Two issues were investigated that might facilitate the search for ways to improve horse meat production and marketing programs in Korea. The first step was to determine certain important factors, called principal components, which enabled the researchers to understand the needs of horse meat consumers via principal component analysis. The second step was to define consumer segments with regard to their preferences for horse meat, which was accomplished via cluster analysis. The results of the current study showed that health condition, price, origin, and leanness were the most critical physical attributes affecting the preferences of horse meat consumers. Four segments of consumers, with different demands for horse meat attributes, were identified: origin-sensitive consumers, price-sensitive consumers, quality and safety-sensitive consumers, and non-specific consumers. Significant differences existed among segments of consumers in terms of age, nature of work, frequency of consumption, and general level of acceptability of horse meat.
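The two-step analysis (principal components, then clustering of component scores into consumer segments) can be sketched generically with scikit-learn. The response matrix below is a random stand-in for the actual questionnaire data, and the choice of four components and four clusters simply mirrors the abstract.

```python
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

# Random stand-in for the 20-item questionnaire (rows: respondents, columns: item scores).
rng = np.random.default_rng(4)
responses = rng.integers(1, 6, size=(1126, 20)).astype(float)   # Likert 1-5, illustrative

pca = PCA(n_components=4)
scores = pca.fit_transform(StandardScaler().fit_transform(responses))
print("variance explained:", np.round(pca.explained_variance_ratio_, 2))

segments = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(scores)
print("segment sizes:", np.bincount(segments))
```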
Selecting Tasks for Evaluating Human Performance as a Function of Gravity
NASA Technical Reports Server (NTRS)
Norcross, Jason R.; Gernhardt, Michael L.
2011-01-01
A challenge in understanding human performance as a function of gravity is determining which tasks to research. Initial studies began with treadmill walking, which was easy to quantify and control. However, with the development of pressurized rovers, it is less important to optimize human performance for ambulation as pressurized rovers will likely perform gross translation for them. Future crews are likely to spend much of their extravehicular activity (EVA) performing geology, construction,a nd maintenance type tasks. With these types of tasks, people have different performance strategies, and it is often difficult to quantify the task and measure steady-state metabolic rates or perform biomechanical analysis. For many of these types of tasks, subjective feedback may be the only data that can be collected. However, subjective data may not fully support a rigorous scientific comparison of human performance across different gravity levels and suit factors. NASA would benefit from having a wide variety of quantifiable tasks that allow human performance comparison across different conditions. In order to determine which tasks will effectively support scientific studies, many different tasks and data analysis techniques will need to be employed. Many of these tasks and techniques will not be effective, but some will produce quantifiable results that are sensitive enough to show performance differences. One of the primary concerns related to EVA performance is metabolic rate. The higher the metabolic rate, the faster the astronaut will exhaust consumables. The focus of this poster will be on how different tasks affect metabolic rate across different gravity levels.
Lanzarotta, Adam; Lorenz, Lisa; Voelker, Sarah; Falconer, Travis M; Batson, JaCinta S
2018-05-01
This manuscript is a continuation of a recent study that described the use of fully integrated gas chromatography with direct deposition Fourier transform infrared detection and mass spectrometric detection (GC-FT-IR-MS) to identify and confirm the presence of sibutramine and AB-FUBINACA. The purpose of the current study was to employ the GC-FT-IR portion of the same instrument to quantify these compounds, thereby demonstrating the ability to identify, confirm, and quantify drug substances using a single GC-FT-IR-MS unit. The performance of the instrument was evaluated by comparing quantitative analytical figures of merit to those measured using an established, widely employed method for quantifying drug substances, high performance liquid chromatography with ultraviolet detection (HPLC-UV). The results demonstrated that GC-FT-IR was outperformed by HPLC-UV with regard to sensitivity, precision, and linear dynamic range (LDR). However, sibutramine and AB-FUBINACA concentrations measured using GC-FT-IR were not significantly different at the 95% confidence interval compared to those measured using HPLC-UV, which demonstrates promise for using GC-FT-IR as a semi-quantitative tool at the very least. The most significant advantage of GC-FT-IR compared to HPLC-UV is selectivity; a higher level of confidence regarding the identity of the analyte being quantified is achieved using GC-FT-IR. Additional advantages of using a single GC-FT-IR-MS instrument for identification, confirmation, and quantification are efficiency, increased sample throughput, decreased consumption of laboratory resources (solvents, chemicals, consumables, etc.), and thus cost.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gilbert, Andrew J.; McDonald, Benjamin S.; Smith, Leon E.
The methods currently used by the International Atomic Energy Agency to account for nuclear materials at fuel fabrication facilities are time consuming and require in-field chemistry and operation by experts. Spectral X-ray radiography, along with advanced inverse algorithms, is an alternative inspection that could be completed noninvasively, without any in-field chemistry, with inspections of tens of seconds. The proposed inspection system and algorithms are presented here. The inverse algorithm uses total variation regularization and adaptive regularization parameter selection with the unbiased predictive risk estimator. Performance of the system is quantified with simulated X-ray inspection data and sensitivity of the output is tested against various inspection system instabilities. Material quantification from a fully-characterized inspection system is shown to be very accurate, with biases on nuclear material estimations of < 0.02%. It is shown that the results are sensitive to variations in the fuel powder sample density and detector pixel gain, which increase biases to 1%. Options to mitigate these inaccuracies are discussed.
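Total-variation regularization, the core of the inverse algorithm described here, can be illustrated with a minimal 1-D denoising example. This is a generic sketch only: it is not the spectral X-ray reconstruction itself, and the regularization weight is fixed by hand rather than chosen by the unbiased predictive risk estimator.

```python
import numpy as np

def tv_denoise(y, lam=0.5, eps=1e-2, iters=3000, step=0.05):
    """Minimize 0.5*||x - y||^2 + lam * sum_i sqrt((x[i+1]-x[i])^2 + eps)
    by gradient descent on a smoothed total-variation penalty."""
    x = y.astype(float).copy()
    for _ in range(iters):
        dx = np.diff(x)
        w = dx / np.sqrt(dx**2 + eps)      # gradient of the smoothed TV term
        grad_tv = np.zeros_like(x)
        grad_tv[:-1] -= w
        grad_tv[1:] += w
        x -= step * ((x - y) + lam * grad_tv)
    return x

rng = np.random.default_rng(6)
y = np.concatenate([np.zeros(50), np.ones(50)]) + rng.normal(0, 0.3, 100)
print(np.round(tv_denoise(y)[45:55], 2))   # edge preserved, noise suppressed
```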
Development of a brief validated geriatric depression screening tool: the SLU "AM SAD".
Chakkamparambil, Binu; Chibnall, John T; Graypel, Ernest A; Manepalli, Jothika N; Bhutto, Asif; Grossberg, George T
2015-08-01
Our objective was to combine five commonly observed symptoms of late-life depression into a short depression screening tool with sensitivity and specificity similar to the conventional, more time-consuming tools. We developed the St. Louis University AM SAD (Appetite, Mood, Sleep, Activity, and thoughts of Death) questionnaire. The frequency of each symptom in the prior 2 weeks is quantified as 0, 1, or 2. Patients 65 years or older from our clinics were administered the AM SAD, the Geriatric Depression Scale (GDS-15), the Montgomery-Asberg Depression Rating Scale (MADRS), and the St. Louis University Mental Status Exam (SLUMS); 100 patients were selected. The AM SAD correlated 0.72 with the GDS-15 and 0.80 with the MADRS. The AM SAD yielded a sensitivity and specificity of 79% and 62% against a diagnosis of depression; 88% and 62% against the GDS-15; and 92% and 71% against the MADRS. The AM SAD can be reliably used as a short depression screening tool in patients with a SLUMS score of 20 or higher. Copyright © 2015 American Association for Geriatric Psychiatry. Published by Elsevier Inc. All rights reserved.
Quantification of tumor fluorescence during intraoperative optical cancer imaging.
Judy, Ryan P; Keating, Jane J; DeJesus, Elizabeth M; Jiang, Jack X; Okusanya, Olugbenga T; Nie, Shuming; Holt, David E; Arlauckas, Sean P; Low, Phillip S; Delikatny, E James; Singhal, Sunil
2015-11-13
Intraoperative optical cancer imaging is an emerging technology in which surgeons employ fluorophores to visualize tumors, identify tumor-positive margins and lymph nodes containing metastases. This study compares instrumentation to measure tumor fluorescence. Three imaging systems (Spectropen, Glomax, Flocam) measured and quantified fluorescent signal-to-background ratios (SBR) in vitro, murine xenografts, tissue phantoms and clinically. Evaluation criteria included the detection of small changes in fluorescence, sensitivity of signal detection at increasing depths and practicality of use. In vitro, spectroscopy was superior in detecting incremental differences in fluorescence than luminescence and digital imaging (Ln[SBR] = 6.8 ± 0.6, 2.4 ± 0.3, 2.6 ± 0.1, p = 0.0001). In fluorescent tumor cells, digital imaging measured higher SBRs than luminescence (6.1 ± 0.2 vs. 4.3 ± 0.4, p = 0.001). Spectroscopy was more sensitive than luminometry and digital imaging in identifying murine tumor fluorescence (SBR = 41.7 ± 11.5, 5.1 ± 1.8, 4.1 ± 0.9, p = 0.0001), and more sensitive than digital imaging at detecting fluorescence at increasing depths (SBR = 7.0 ± 3.4 vs. 2.4 ± 0.5, p = 0.03). Lastly, digital imaging was the most practical and least time-consuming. All methods detected incremental differences in fluorescence. Spectroscopy was the most sensitive for small changes in fluorescence. Digital imaging was the most practical considering its wide field of view, background noise filtering capability, and sensitivity to increasing depth.
Gallaher, Sean D.; Berk, Arnold J.
2013-01-01
Adenoviruses are employed in the study of cellular processes and as expression vectors used in gene therapy. The success and reproducibility of these studies is dependent in part on having accurate and meaningful titers of replication competent and helper-dependent adenovirus stocks, which is problematic due to the use of varied and divergent titration protocols. Physical titration methods, which quantify the total number of viral particles, are used by many, but are poor at estimating activity. Biological titration methods, such as plaque assays, are more biologically relevant, but are time consuming and not applicable to helper-dependent gene therapy vectors. To address this, a protocol was developed called “infectious genome titration” in which viral DNA is isolated from the nuclei of cells ~3 h post-infection, and then quantified by Q-PCR. This approach ensures that only biologically active virions are counted as part of the titer determination. This approach is rapid, robust, sensitive, reproducible, and applicable to all forms of adenovirus. Unlike other Q-PCR-based methods, titers determined by this protocol are well correlated with biological activity. PMID:23624118
Smith, Matthew M.; Schmutz, Joel A.; Apelgren, Chloe; Ramey, Andy M.
2015-01-01
Microscopic examination of blood smears can be effective at diagnosing and quantifying hematozoa infections. However, this method requires highly trained observers, is time consuming, and may be inaccurate for detection of infections at low levels of parasitemia. To develop a molecular methodology for identifying and quantifying Leucocytozoon parasite infection in wild waterfowl (Anseriformes), we designed a real-time, quantitative PCR protocol to amplify Leucocytozoon mitochondrial DNA using TaqMan fluorogenic probes and validated our methodology using blood samples collected from waterfowl in interior Alaska during late summer and autumn (n = 105). By comparing our qPCR results to those derived from a widely used nested PCR protocol, we determined that our assay showed high levels of sensitivity (91%) and specificity (100%) in detecting Leucocytozoon DNA from host blood samples. Additionally, results of a linear regression revealed significant correlation between the raw measure of parasitemia produced by our qPCR assay (Ct values) and numbers of parasites observed on blood smears (R2 = 0.694, P = 0.003), indicating that our assay can reliably determine the relative parasitemia levels among samples. This methodology provides a powerful new tool for studies assessing effects of haemosporidian infection in wild avian species.
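The validation steps described (sensitivity and specificity against microscopy, and regression of Ct values on smear counts) are generic enough to sketch with simulated data; the paired calls, Ct values and the ~3.3 cycles per 10-fold change used below are illustrative assumptions, not the study's data.

```python
import numpy as np
from scipy import stats

def sens_spec(test_positive, reference_positive):
    """Sensitivity and specificity of the qPCR call, treating the blood-smear
    result as the reference standard (illustrative only)."""
    t = np.asarray(test_positive, bool)
    r = np.asarray(reference_positive, bool)
    return (t & r).sum() / r.sum(), (~t & ~r).sum() / (~r).sum()

qpcr = [1, 1, 1, 0, 1, 0, 0, 1, 0, 1]      # hypothetical paired calls
smear = [1, 1, 1, 0, 0, 0, 0, 1, 0, 1]
sens, spec = sens_spec(qpcr, smear)
print(f"sensitivity = {sens:.2f}, specificity = {spec:.2f}")

# Hypothetical Ct values vs. log10 smear counts.
rng = np.random.default_rng(5)
log_parasites = rng.uniform(0.5, 3.5, 40)
ct = 38 - 3.3 * log_parasites + rng.normal(0, 1.2, 40)
fit = stats.linregress(ct, log_parasites)
print(f"R^2 = {fit.rvalue**2:.2f}, slope = {fit.slope:.2f} log10 parasites per cycle")
```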
Kuuliala, L; Al Hage, Y; Ioannidis, A-G; Sader, M; Kerckhof, F-M; Vanderroost, M; Boon, N; De Baets, B; De Meulenaer, B; Ragaert, P; Devlieghere, F
2018-04-01
During fish spoilage, microbial metabolism leads to the production of volatile organic compounds (VOCs), characteristic off-odors and eventual consumer rejection. The aim of the present study was to contribute to the development of intelligent packaging technologies by identifying and quantifying VOCs that indicate spoilage of raw Atlantic cod (Gadus morhua) under atmospheres (% v/v CO2/O2/N2) of 60/40/0, 60/5/35 and air. Spoilage was examined by microbiological, chemical and sensory analyses over storage time at 4 or 8 °C. Selected-ion flow-tube mass spectrometry (SIFT-MS) was used for quantifying selected VOCs and amplicon sequencing of the 16S rRNA gene was used for the characterization of the cod microbiota. OTUs classified within the Photobacterium genus increased in relative abundance over time under all storage conditions, suggesting that Photobacterium contributed to spoilage and VOC production. The onset of exponential VOC concentration increase and sensory rejection occurred at high total plate counts (7-7.5 log). Monitoring of early spoilage thus calls for sensitivity to low VOC concentrations. Copyright © 2017 Elsevier Ltd. All rights reserved.
Analyzing the Sensitivity of Hydrogen Vehicle Sales to Consumers' Preferences
DOE Office of Scientific and Technical Information (OSTI.GOV)
Greene, David L; Lin, Zhenhong; Dong, Jing
2013-01-01
The success of hydrogen vehicles will depend on consumer behavior as well as technology, energy prices and public policy. This study examines the sensitivity of the future market shares of hydrogen-powered vehicles to alternative assumptions about consumers' preferences. The Market Acceptance of Advanced Automotive Technologies model was used to project future market shares. The model has 1,458 market segments, differentiated by travel behavior, geography, and tolerance to risk, among other factors, and it estimates market shares for twenty advanced power-train technologies. The market potential of hydrogen vehicles is most sensitive to the improvement of drive train technology, especially cost reduction. The long-run market success of hydrogen vehicles is less sensitive to the price elasticity of vehicle choice, how consumers evaluate future fuel costs, the importance of fuel availability and limited driving range. The importance of these factors will likely be greater in the early years following initial commercialization of hydrogen vehicles.
Scott, Michael J.; Daly, Don S.; Hejazi, Mohamad I.; ...
2016-02-06
Here, one of the most important interactions between humans and climate is in the demand and supply of water. Humans withdraw, use, and consume water and return waste water to the environment for a variety of socioeconomic purposes, including domestic, commercial, and industrial use, production of energy resources and cooling thermal-electric power plants, and growing food, fiber, and chemical feed stocks for human consumption. Uncertainties in the future human demand for water interact with future impacts of climatic change on water supplies to impinge on water management decisions at the international, national, regional, and local level, but until recently tools were not available to assess the uncertainties surrounding these decisions. This paper demonstrates the use of a multi-model framework in a structured sensitivity analysis to project and quantify the sensitivity of future deficits in surface water in the context of climate and socioeconomic change for all U.S. states and sub-basins. The framework treats all sources of water demand and supply consistently from the world to local level. The paper illustrates the capabilities of the framework with sample results for a river sub-basin in the U.S. state of Georgia.
Isselmann DiSantis, Katherine; Kumanyika, Shiriki; Carter-Edwards, Lori; Rohm Young, Deborah; Grier, Sonya A; Lassiter, Vikki
2017-10-29
Food marketing environments of Black American consumers are heavily affected by ethnically-targeted marketing of sugar sweetened beverages, fast foods, and other products that may contribute to caloric overconsumption. This qualitative study assessed Black consumers' responses to targeted marketing. Black adults (2 mixed gender groups; total n = 30) and youth (2 gender specific groups; total n = 35) from two U.S. communities participated before and after a sensitization procedure, a critical practice used to understand social justice concerns. Pre-sensitization focus groups elicited responses to scenarios about various targeted marketing tactics. Participants were then given an informational booklet about targeted marketing to Black Americans, and all returned for the second (post-sensitization) focus group one week later. Conventional qualitative content analysis of transcripts identified several salient themes: seeing the marketer's perspective ("it's about demand"; "consumers choose"), respect for community ("marketers are setting us up for failure"; "making wrong assumptions"), and food environments as a social justice issue ("no one is watching the door"; "I didn't realize"). Effects of sensitization were reflected in participants' stated reactions to the information in the booklet, and also in the relative occurrence of marketer-oriented themes (less frequent) and social justice-oriented themes (more frequent) after sensitization.
77 FR 39222 - Consumer Use of Reverse Mortgages
Federal Register 2010, 2011, 2012, 2013, 2014
2012-07-02
... influence reverse mortgage consumers' decision-making, consumers' use of reverse mortgage loan proceeds.... Sensitive personal information such as account numbers or Social Security numbers should not be included... personal information that could be used to identify an individual consumer or account, nor should they...
NASA Astrophysics Data System (ADS)
Wardrop, Nicola A.; Dzodzomenyo, Mawuli; Aryeetey, Genevieve; Hill, Allan G.; Bain, Robert E. S.; Wright, Jim
2017-08-01
Packaged water consumption is growing in low- and middle-income countries, but the magnitude of this phenomenon and its environmental consequences remain unclear. This study aims to quantify both the volumes of packaged water consumed relative to household water requirements and associated plastic waste generated for three West African case study countries. Data from household expenditure surveys for Ghana, Nigeria and Liberia were used to estimate the volumes of packaged water consumed and thereby quantify plastic waste generated in households with and without solid waste disposal facilities. In Ghana, Nigeria and Liberia respectively, 11.3 (95% confidence interval: 10.3-12.4), 10.1 (7.5-12.5), and 0.38 (0.31-0.45) Ml day⁻¹ of sachet water were consumed. This generated over 28 000 tonnes yr⁻¹ of plastic waste, of which 20%, 63% and 57% was among households lacking formal waste disposal facilities in Ghana, Nigeria and Liberia respectively. Reported packaged water consumption provided sufficient water to meet daily household drinking-water requirements for 8.4%, less than 1% and 1.6% of households in Ghana, Nigeria and Liberia respectively. These findings quantify packaged water’s contribution to household water needs in our study countries, particularly Ghana, but indicate significant subsequent environmental repercussions.
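The plastic-waste figure follows from the consumption volumes once a sachet size and film mass are assumed. Both per-sachet numbers below are assumptions, chosen only to show the order of magnitude; the daily consumption volumes are taken from the abstract.

```python
# Order-of-magnitude reproduction of the plastic-waste estimate (assumed per-sachet values).
sachet_volume_l = 0.5                        # assumed 500 ml sachet
sachet_plastic_g = 2.0                       # assumed mass of film per sachet
consumption_l_per_day = {"Ghana": 11.3e6, "Nigeria": 10.1e6, "Liberia": 0.38e6}

tonnes_per_year = sum(
    v / sachet_volume_l * sachet_plastic_g / 1e6 * 365
    for v in consumption_l_per_day.values())
print(f"~{tonnes_per_year:,.0f} tonnes of sachet film per year")
```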
Boenzi, Sara; Deodato, Federica; Taurisano, Roberta; Martinelli, Diego; Verrigni, Daniela; Carrozzo, Rosalba; Bertini, Enrico; Pastore, Anna; Dionisi-Vici, Carlo; Johnson, David W
2014-11-01
Two oxysterols, cholestan-3β,5α,6β-triol (C-triol) and 7-ketocholesterol (7-KC), have been recently proposed as diagnostic markers of Niemann-Pick type C (NP-C) disease, representing a potential alternative diagnostic tool to the more invasive and time-consuming filipin test in cultured fibroblasts. Usually, the oxysterols are detected and quantified by a liquid chromatography-tandem mass spectrometry (LC-MS/MS) method using atmospheric pressure chemical ionization (APCI) or electrospray ionization (ESI) sources, after a variety of derivatization procedures to enhance sensitivity. We developed a sensitive LC-MS/MS method to quantify the oxysterols in plasma as the dimethylaminobutyrate ester, suitable for ESI analysis. This method, with an easy liquid-phase extraction and a short derivatization procedure, has been validated to demonstrate specificity, linearity, recovery, lowest limit of quantification, accuracy and precision. The assay was linear over a concentration range of 0.5-200 ng/mL for C-triol and 1.0-200 ng/mL for 7-KC. Intra-day and inter-day coefficients of variation (CV%) were <15% for both metabolites. Receiver operating characteristic analysis estimated that the area under the curve was 0.998 for C-triol and 0.972 for 7-KC, implying significant discriminatory power of both oxysterols in this patient population. In summary, our method provides a simple, rapid and non-invasive diagnostic tool for the biochemical diagnosis of NP-C disease. Copyright © 2014 Elsevier B.V. All rights reserved.
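For readers unfamiliar with the metric, a ROC area under the curve of the kind reported here can be computed from patient and control marker concentrations via the rank-based (Mann-Whitney) formulation, AUC = P(X_patient > X_control). The values in this sketch are invented and do not reproduce the study's data.

```python
# Illustrative ROC AUC from two groups of marker concentrations, using the
# rank-based equivalence AUC = P(X_patient > X_control). Made-up values.
import numpy as np

patients = np.array([45.0, 60.2, 38.5, 72.1, 55.0])    # hypothetical C-triol, ng/mL
controls = np.array([8.1, 12.4, 6.9, 15.0, 10.2, 9.3])

# Count concordant pairs (patient value above control value); ties count 0.5.
diff = patients[:, None] - controls[None, :]
auc = (np.sum(diff > 0) + 0.5 * np.sum(diff == 0)) / diff.size
print(f"AUC = {auc:.3f}")
```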
NASA Astrophysics Data System (ADS)
Lopez-Duarte, P. C.; Able, K.; Fodrie, J.; McCann, M. J.; Melara, S.; Noji, C.; Olin, J.; Pincin, J.; Plank, K.; Polito, M. J.; Jensen, O.
2016-02-01
Multiple studies conducted over five years since the 2010 Macondo oil spill in the Gulf of Mexico indicate that oil impacts vary widely among taxonomic groups. For instance, fishes inhabiting the marsh surface show no clear differences in either community composition or population characteristics between oiled and unoiled sites, despite clear evidence of physiological impacts on individual fish. In contrast, marsh insects and spiders are sensitive to the effects of hydrocarbons. Both insects and spiders are components of the marsh food web and represent an important trophic link between marsh plants and higher trophic levels. Because differences in oil impacts throughout the marsh food web have the potential to significantly alter food webs and energy flow pathways and reduce food web resilience, our goal is to quantify differences in marsh food webs between oiled and unoiled sites to test the hypothesis that oiling has resulted in simpler and less resilient food webs. Diets and food web connections were quantified through a combination of stomach content, stable isotope, and fatty acid analysis. The combination of these three techniques provides a more robust approach to quantifying trophic relationships than any of these methods alone. Stomach content analysis provides a detailed snapshot of diets, while fatty acid and stable isotopes reflect diets averaged over weeks to months. Initial results focus on samples collected in May 2015 from a range of terrestrial and aquatic consumer species, including insects, mollusks, crustaceans, and piscivorous fishes.
Evolutions in food marketing, quantifying the impact, and policy implications.
Cairns, Georgina
2013-03-01
A case study on interactive digital marketing examined the adequacy of extant policy controls and their underpinning paradigms to constrain the effects of this rapidly emerging practice. The findings were that interactive digital marketing is expanding the strategies available to promote products, brands and consumer behaviours. It facilitates relational marketing, the collection of personal data for marketing and the integration of the marketing mix, and provides a platform for consumers to engage in the co-creation of marketing communications. The paradigmatic logic of current policies to constrain youth-oriented food marketing does not address the interactive nature of digital marketing. The evidence base on the effects of HFSS marketing and policy interventions is based on conceptualizations of marketing as a force promoting transactions rather than interactions. Digital technologies are generating rich consumer data. Interactive digital technologies increase the complexity of the task of quantifying the impact of marketing. The rapidity of its uptake also increases the urgency of identifying appropriate effects measures. Independent analysis of commercial consumer data (appropriately transformed to protect commercial confidentiality and personal privacy) would provide evidence sources for policy on the impacts of commercial food and beverage marketing and policy controls. Copyright © 2012 Elsevier Ltd. All rights reserved.
Glyphosate and aminomethylphosphonic acid are not detectable in human milk.
McGuire, Michelle K; McGuire, Mark A; Price, William J; Shafii, Bahman; Carrothers, Janae M; Lackey, Kimberly A; Goldstein, Daniel A; Jensen, Pamela K; Vicini, John L
2016-05-01
Although animal studies have shown that exposure to glyphosate (a commonly used herbicide) does not result in glyphosate bioaccumulation in tissues, to our knowledge there are no published data on whether it is detectable in human milk and therefore consumed by breastfed infants. We sought to determine whether glyphosate and its metabolite aminomethylphosphonic acid (AMPA) could be detected in milk and urine produced by lactating women and, if so, to quantify typical consumption by breastfed infants. We collected milk (n = 41) and urine (n = 40) samples from healthy lactating women living in and around Moscow, Idaho and Pullman, Washington. Milk and urine samples were analyzed for glyphosate and AMPA with the use of highly sensitive liquid chromatography-tandem mass spectrometry methods validated for and optimized to each sample matrix. Our milk assay, which was sensitive down to 1 μg/L for both analytes, detected neither glyphosate nor AMPA in any milk sample. Mean ± SD glyphosate and AMPA concentrations in urine were 0.28 ± 0.38 and 0.30 ± 0.33 μg/L, respectively. Because of the complex nature of milk matrixes, these samples required more dilution before analysis than did urine, thus decreasing the sensitivity of the assay in milk compared with urine. No difference was found in urine glyphosate and AMPA concentrations between subjects consuming organic compared with conventionally grown foods or between women living on or near a farm/ranch and those living in an urban or suburban nonfarming area. Our data provide evidence that glyphosate and AMPA are not detectable in milk produced by women living in this region of the US Pacific Northwest. By extension, our results therefore suggest that dietary glyphosate exposure is not a health concern for breastfed infants. This study was registered at clinicaltrials.gov as NCT02670278. © 2016 American Society for Nutrition.
Oliveira, Denize; Ares, Gastón; Deliza, Rosires
2018-06-01
Sugar reduction in beverages can contribute to reducing consumption of this nutrient and to improving the health status of the population. However, such reduction can negatively affect consumer perception. Label information can be an effective tool to increase consumer interest in sugar-reduced products. In this context, the aim of the present work was to study the influence of health/hedonic claims on consumer hedonic and sensory perception of sugar reduction in orange/passionfruit nectars under expected and informed conditions. Sugar-reduced orange/passionfruit nectars (20% and 40% reduced in added sugar) featuring different claims (none, health claim or hedonic claim) were evaluated, together with a control product without reduction. Following a between-subjects experimental design, 206 participants evaluated the nectars under two experimental conditions: (a) expected, looking at the packages, and (b) informed, looking at the packages and tasting the nectars. In each experimental condition, participants evaluated their overall liking using a 9-point hedonic scale and answered a check-all-that-apply question related to the sensory characteristics of the nectars. Results showed that although consumers did not have negative expectations about sugar-reduced nectars, the sensory characteristics of the products were the main determinants of consumers' hedonic reaction towards the nectars. The influence of claims on consumers' perception was modulated by their hedonic sensitivity towards sugar reduction. The hedonic claim increased overall liking among consumers with low hedonic sensitivity towards sugar reduction, whereas it had the opposite effect on the most sensitive consumers. Results from the present work suggest that although hedonic claims hold potential for a consumer segment, care must be taken to avoid the generation of unrealistic expectations about the sensory characteristics of sugar-reduced products. Copyright © 2018 Elsevier Ltd. All rights reserved.
Tutorial: Parallel Computing of Simulation Models for Risk Analysis.
Reilly, Allison C; Staid, Andrea; Gao, Michael; Guikema, Seth D
2016-10-01
Simulation models are widely used in risk analysis to study the effects of uncertainties on outcomes of interest in complex problems. Often, these models are computationally complex and time consuming to run. This latter point may be at odds with time-sensitive evaluations or may limit the number of parameters that are considered. In this article, we give an introductory tutorial focused on parallelizing simulation code to better leverage modern computing hardware, enabling risk analysts to better utilize simulation-based methods for quantifying uncertainty in practice. This article is aimed primarily at risk analysts who use simulation methods but do not yet utilize parallelization to decrease the computational burden of these models. The discussion is focused on conceptual aspects of embarrassingly parallel computer code and software considerations. Two complementary examples are shown using the languages MATLAB and R. A brief discussion of hardware considerations is located in the Appendix. © 2016 Society for Risk Analysis.
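The tutorial's own examples are in MATLAB and R; the sketch below is a comparable minimal illustration in Python (a language choice assumed here, not taken from the tutorial) of an embarrassingly parallel Monte Carlo simulation distributed across cores.

```python
# Minimal sketch of an embarrassingly parallel simulation in the spirit of
# the tutorial (which uses MATLAB and R); this Python version is an
# illustration only. Each worker runs independent Monte Carlo replicates.
import multiprocessing as mp
import random


def one_replicate(seed: int) -> float:
    """Run a single simulation replicate; here, a toy loss model."""
    rng = random.Random(seed)
    # Placeholder risk model: sum of 1000 random shocks.
    return sum(rng.gauss(0.0, 1.0) for _ in range(1000))


if __name__ == "__main__":
    n_replicates = 10_000
    with mp.Pool() as pool:                      # defaults to all available cores
        losses = pool.map(one_replicate, range(n_replicates))
    mean_loss = sum(losses) / len(losses)
    p95 = sorted(losses)[int(0.95 * len(losses))]
    print(f"mean outcome: {mean_loss:.2f}, 95th percentile: {p95:.2f}")
```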
de Toledo, Fernanda Crossi Pereira; Yonamine, Mauricio; de Moraes Moreau, Regina Lucia; Silva, Ovandir Alves
2003-12-25
The present work describes a highly precise and sensitive method developed to detect cocaine (COC), benzoylecgonine (BE, its main metabolite) and cocaethylene (CE, transesterification product of the coingestion of COC with ethanol) in human head hair samples. The method was based on an alkylchloroformate derivatization of benzoylecgonine and the extraction of the analytes by solid-phase microextraction (SPME). Gas chromatography-mass spectrometry (GC-MS) was used to identify and quantify the analytes in selected ion monitoring mode (SIM). The limits of quantification and detection (LOQ and LOD) were: 0.1 ng/mg for COC and CE, and 0.5 ng/mg for BE. Good inter- and intra-assay precision was observed. The dynamic range of the assay was 0.1-50 ng/mg. The method is not time consuming and was shown to be easy to perform.
Di Giuseppe, Antonella M A; Giarretta, Nicola; Lippert, Martina; Severino, Valeria; Di Maro, Antimo
2015-02-15
In 2013, following the scandal of undeclared horse meat found in various processed beef products across Europe, several studies were undertaken to safeguard consumer health. In this framework, an improved UPLC separation method has been developed to detect the presence of horse myoglobin in raw meat samples. The separation of both horse and beef myoglobins was achieved in only seven minutes. The methodology was improved by preparing mixtures with different composition percentages of horse and beef meat. By using myoglobin as a marker, low amounts (0.50 mg/0.50 g, w/w; ∼0.1%) of horse meat can be detected and quantified in minced raw meat samples with high reproducibility and sensitivity, thus offering a valid alternative to conventional PCR techniques. Copyright © 2014 Elsevier Ltd. All rights reserved.
Adjoint-Based Uncertainty Quantification with MCNP
DOE Office of Scientific and Technical Information (OSTI.GOV)
Seifried, Jeffrey E.
2011-09-01
This work serves to quantify the instantaneous uncertainties in neutron transport simulations born from nuclear data and statistical counting uncertainties. Perturbation and adjoint theories are used to derive implicit sensitivity expressions. These expressions are transformed into forms that are convenient for construction with MCNP6, creating the ability to perform adjoint-based uncertainty quantification with MCNP6. These new tools are exercised on the depleted-uranium hybrid LIFE blanket, quantifying its sensitivities and uncertainties to important figures of merit. Overall, these uncertainty estimates are small (< 2%). Having quantified the sensitivities and uncertainties, physical understanding of the system is gained and some confidence in the simulation is acquired.
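For context, once sensitivities and a nuclear-data covariance matrix are available, a first-order ("sandwich rule") propagation gives the response uncertainty. The sketch below is generic: the sensitivity vector S and covariance matrix C are illustrative numbers, not results from this work.

```python
# Generic first-order ("sandwich rule") propagation of nuclear-data
# uncertainty to a response R: var(R) = S^T C S, where S holds relative
# sensitivities dR/R per dX/X and C is the relative covariance matrix.
# The values below are illustrative, not results from the cited work.
import numpy as np

# Relative sensitivities of the response to three hypothetical data parameters.
S = np.array([0.8, -0.3, 0.1])

# Relative covariance matrix of those parameters (symmetric, positive semidefinite).
C = np.array([
    [4.0e-4, 1.0e-4, 0.0],
    [1.0e-4, 9.0e-4, 0.0],
    [0.0,    0.0,    1.0e-4],
])

rel_variance = S @ C @ S
rel_std = np.sqrt(rel_variance)
print(f"relative uncertainty on the response: {100 * rel_std:.2f}%")
```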
Valls, Rosa-Maria; Soler, Aranzazu; Girona, Josefa; Heras, Mercedes; Romero, Maria-Paz; Covas, Maria-Isabel; Solà, Rosa; Masana, Lluis; Motilva, Maria-Jose
2010-09-21
The effect of repeated consumption of virgin olive oil on endogenous phenolic metabolites of fasting plasma is unknown. For this reason, we hypothesized that regular long-term virgin olive oil intake could have an indirect protection effect on the endogenous phenols. Thus, the aim of the study was to determine the phenolic profile of human plasma in a fasting state of long-term regular virgin olive oil consumers, using the fasting plasma of non-consumers as a natural control. Forty participants living in the area of Reus (Catalonia, Spain) were selected, 20 life-long regular consumers of virgin olive oil and a natural control of 20 non-consumers, the latter being Rumanians who dislike the taste of olive oil. The diet was obtained from 3-day food records. The results showed similar phenolic composition of fasting plasmas of the two volunteer groups. Of special interest is that more of the compounds quantified showed higher concentration in fasting plasma from habitual virgin olive oil consumers. The compounds were semi-quantified using caffeic acid as the calibration standard. The quantification of fasting consumer's plasma showed higher concentration of a hydroxyflavanone type compound (2.90+/-0.04 microM vs 1.5+/-0.04 microM) and a catecholamine derivative (0.70+/-0.03 microM vs 0.56+/-0.03 microM) than the plasma of non-consumers (P<0.05). The results suggest an indirect protective mechanism of long-term regular virgin olive oil consumption related to the protection of the endogenous antioxidant system. Copyright 2010 Elsevier B.V. All rights reserved.
Costs and benefits of direct-to-consumer advertising: the case of depression.
Block, Adam E
2007-01-01
Direct-to-consumer advertising (DTCA) is legal in the US and New Zealand, but illegal in the rest of the world. Little or no research exists on the social welfare implications of DTCA. To quantify the total costs and benefits associated with both appropriate and inappropriate care due to DTCA, for the case of depression. A cost-benefit model was developed using parameter estimates from available survey, epidemiological and experimental data. The model estimates the total benefits and costs (year 2002 values) of new appropriate and inappropriate care stimulated by DTCA for depression. Uncertainty in model parameters is addressed with sensitivity analyses. This study provides evidence that 94% of new antidepressant use due to DTCA is from non-depressed individuals. However, the average health benefit to each new depressed user is 63-fold greater than the cost per treatment, creating a positive overall social welfare effect; a net benefit of >72 million US dollars. This analysis suggests that DTCA may lead to antidepressant treatment in 15-fold as many non-depressed people as depressed people. However, the costs of treating non-depressed people may be vastly outweighed by the much larger benefit accruing to treated depressed individuals. The cost-benefit ratio can be improved through better targeting of advertisements and higher quality treatment of depression.
Mass Spectrometry-based Assay for High Throughput and High Sensitivity Biomarker Verification
DOE Office of Scientific and Technical Information (OSTI.GOV)
Guo, Xuejiang; Tang, Keqi
Searching for disease-specific biomarkers has become a major undertaking in the biomedical research field, as the effective diagnosis, prognosis and treatment of many complex human diseases are largely determined by the availability and the quality of the biomarkers. A successful biomarker, as an indicator of a specific biological or pathological process, is usually selected from a large group of candidates by a strict verification and validation process. To be clinically useful, the validated biomarkers must be detectable and quantifiable by the selected testing techniques in their related tissues or body fluids. Because blood is easily accessible, protein biomarkers would ideally be identified in blood plasma or serum. However, most disease-related protein biomarkers in blood exist at very low concentrations (<1 ng/mL) and are “masked” by many non-significant species at orders of magnitude higher concentrations. The extreme requirements of measurement sensitivity, dynamic range and specificity make the method development extremely challenging. The current clinical protein biomarker measurement primarily relies on antibody-based immunoassays, such as ELISA. Although the technique is sensitive and highly specific, the development of a high-quality protein antibody is both expensive and time-consuming. The limited capability of assay multiplexing also makes the measurement an extremely low-throughput one, rendering it impractical when hundreds to thousands of potential biomarkers need to be quantitatively measured across multiple samples. Mass spectrometry (MS)-based assays have recently been shown to be a viable alternative for high-throughput and quantitative candidate protein biomarker verification. Among them, the triple quadrupole MS-based assay is the most promising one. When coupled with liquid chromatography (LC) separation and an electrospray ionization (ESI) source, a triple quadrupole mass spectrometer operating in a special selected reaction monitoring (SRM) mode, also known as multiple reaction monitoring (MRM), is capable of quantitatively measuring hundreds of candidate protein biomarkers from a relevant clinical sample in a single analysis. The specificity, reproducibility and sensitivity could be as good as ELISA. Furthermore, SRM MS can also quantify protein isoforms and post-translational modifications, for which traditional antibody-based immunoassays often don’t exist.
Sample preparation: a critical step in the analysis of cholesterol oxidation products.
Georgiou, Christiana A; Constantinou, Michalis S; Kapnissi-Christodoulou, Constantina P
2014-02-15
In recent years, cholesterol oxidation products (COPs) have drawn scientific interest, particularly due to their implications for human health. A large number of these compounds have been demonstrated to be cytotoxic, mutagenic, and carcinogenic. The main source of COPs is through diet, and particularly from the consumption of cholesterol-rich foods. This raises questions about the safety of consumers, and it suggests the necessity for the development of a sensitive and reliable analytical method in order to identify and quantify these components in food samples. Sample preparation is a necessary step in the analysis of COPs in order to eliminate interferences and increase sensitivity. Numerous publications have, over the years, reported the use of different methods for the extraction and purification of COPs. However, no method has, so far, been established as a routine method for the analysis of COPs in foods. Therefore, it was considered important to review different sample preparation procedures and evaluate the different preparative parameters, such as time of saponification, the type of organic solvents for fat extraction, the stationary phase in solid phase extraction, etc., according to recovery, precision and simplicity. Copyright © 2013 Elsevier Ltd. All rights reserved.
Analysis of Consumers' Preferences and Price Sensitivity to Native Chickens.
Lee, Min-A; Jung, Yoojin; Jo, Cheorun; Park, Ji-Young; Nam, Ki-Chang
2017-01-01
This study analyzed consumers' preferences and price sensitivity to native chickens. A survey was conducted from Jan 6 to 17, 2014, and data were collected from consumers (n=500) living in Korea. Statistical analyses evaluated the consumption patterns of native chickens, preference marketing for native chicken breeds which will be newly developed, and price sensitivity measurement (PSM). Of the subjects who preferred broilers, 24.3% do not purchase native chickens because of the dryness and tough texture, while those who preferred native chickens liked their chewy texture (38.2%). Of the total subjects, 38.2% preferred fried native chickens (38.2%) for processed food, 38.4% preferred direct sales for native chicken distribution, 51.0% preferred native chickens to be slaughtered in specialty stores, and 32.4% wanted easy access to native chickens. Additionally, the price stress range (PSR) was 50 won and the point of marginal cheapness (PMC) and point of marginal expensiveness (PME) were 6,980 won and 12,300 won, respectively. Evaluation of the segmentation market revealed that consumers who prefer broiler to native chicken breeds were more sensitive to the chicken price. To accelerate the consumption of newly developed native chicken meat, it is necessary to develop a texture that each consumer needs, to increase the accessibility of native chickens, and to have diverse menus and recipes as well as reasonable pricing for native chickens.
The Logic of Collective Rating
NASA Astrophysics Data System (ADS)
Nax, Heinrich
2016-05-01
The introduction of participatory rating mechanisms on online sales platforms has had substantial impact on firms' sales and profits. In this note, we develop a dynamic model of consumer influences on ratings and of rating influences on consumers, focussing on standard 5-star mechanisms as implemented by many platforms. The key components of our social influence model are the consumer trust in the `wisdom of crowds' during the purchase phase and indirect reciprocity during the rating decision. Our model provides an overarching explanation for well-corroborated empirical regularities. We quantify the performance of the voluntary rating mechanism in terms of realized consumer surplus with the no-mechanism and full-information benchmarks, and identify how it could be improved.
Smith, Matthew M; Schmutz, Joel; Apelgren, Chloe; Ramey, Andrew M
2015-04-01
Microscopic examination of blood smears can be effective at diagnosing and quantifying hematozoa infections. However, this method requires highly trained observers, is time consuming, and may be inaccurate for detection of infections at low levels of parasitemia. To develop a molecular methodology for identifying and quantifying Leucocytozoon parasite infection in wild waterfowl (Anseriformes), we designed a real-time, quantitative PCR protocol to amplify Leucocytozoon mitochondrial DNA using TaqMan fluorogenic probes and validated our methodology using blood samples collected from waterfowl in interior Alaska during late summer and autumn (n=105). By comparing our qPCR results to those derived from a widely used nested PCR protocol, we determined that our assay showed high levels of sensitivity (91%) and specificity (100%) in detecting Leucocytozoon DNA from host blood samples. Additionally, results of a linear regression revealed significant correlation between the raw measure of parasitemia produced by our qPCR assay (Ct values) and numbers of parasites observed on blood smears (R(2)=0.694, P=0.003), indicating that our assay can reliably determine the relative parasitemia levels among samples. This methodology provides a powerful new tool for studies assessing effects of haemosporidian infection in wild avian species. Published by Elsevier B.V.
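The reported sensitivity and specificity follow from a simple confusion-matrix calculation against the reference method. The sketch below uses fabricated detection calls purely to show the arithmetic; it is not the study's data.

```python
# Simple sketch of how assay sensitivity and specificity can be computed
# from paired detection calls (new assay vs. reference method). The example
# arrays are fabricated and do not reproduce the study's samples.
new_assay = [1, 1, 0, 1, 0, 0, 1, 1, 0, 1]   # 1 = Leucocytozoon DNA detected
reference = [1, 1, 1, 1, 0, 0, 1, 1, 0, 1]   # reference (e.g. nested PCR) calls

tp = sum(1 for n, r in zip(new_assay, reference) if n == 1 and r == 1)
fn = sum(1 for n, r in zip(new_assay, reference) if n == 0 and r == 1)
tn = sum(1 for n, r in zip(new_assay, reference) if n == 0 and r == 0)
fp = sum(1 for n, r in zip(new_assay, reference) if n == 1 and r == 0)

sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)
print(f"sensitivity = {sensitivity:.2f}, specificity = {specificity:.2f}")
```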
Quantification of Carbon Nanotubes in Different ...
Carbon nanotubes (CNTs) have been incorporated into numerous consumer products, and have also been employed in various industrial areas because of their extraordinary properties. The large scale production and wide applications of CNTs make their release into the environment a major concern. Therefore, it is crucial to determine the degree of potential CNT contamination in the environment, which requires a sensitive and accurate technique for selectively detecting and quantifying CNTs in environmental matrices. In this study, a simple device based on utilizing heat generated/temperature increase from CNTs under microwave irradiation was built to quantify single-walled CNTs (SWCNTs), multi-walled CNTs (MWCNTs) and carboxylated CNTs (MWCNT-COOH) in three environmentally relevant matrices (sand, soil and sludge). Linear temperature vs CNT mass relationships were developed for the three environmental matrices spiked with known amounts of different types of CNTs that were then irradiated in a microwave at low energies (70-149 W) for a short time (15-30 s). MWCNTs had a greater microwave response in terms of heat generated/temperature increase than SWCNTs and MWCNT-COOH. An evaluation of microwave behavior of different carbonaceous materials showed that the microwave measurements of CNTs were not affected even with an excess of other organic, inorganic carbon or carbon based nanomaterials (fullerene, granular activated carbon and graphene oxide) mainly because micr
Weeks, William B; Ventelou, Bruno; Paraponaris, Alain
2016-05-01
Admissions for ambulatory care sensitive conditions (ACSCs) are considered preventable and indicators of poor access to primary care. We wondered whether per-capita rates of admission for ACSCs in France demonstrated geographic variation, were changing, were related to other independent variables, or were comparable to those in other countries; further, we wanted to quantify the resources such admissions consume. We calculated per-capita rates of admission for five categories (chronic, acute, vaccination preventable, alcohol-related, and other) of ACSCs in 94 departments in mainland France in 2009 and 2010, examined measures and causes of geographic variation in those rates, computed the costs of those admissions, and compared rates of admission for ACSCs in France to those in several other countries. The highest ACSC admission rates generally occurred in the young and the old, but rates varied across French regions. Over the 2-year period, rates of most categories of ACSCs increased; higher ACSC admission rates were associated with lower incomes and a higher supply of hospital beds. We found that the local supply of general practitioners was inversely associated with rates of chronic and total ACSC admission rates, but that this relationship disappeared if we accounted for patients' use of general practitioners in neighboring departments. ACSC admissions cost 4.755 billion euros in 2009 and 5.066 billion euros in 2010; they consumed 7.86 and 8.74 million bed days of care, respectively. France had higher rates of ACSC admissions than most other countries examined. Because admissions for ACSCs are generally considered a failure of outpatient care, cost French taxpayers substantial monetary and hospital resources, and appear to occur more frequently in France than in other countries, policymakers should prioritize targeted efforts to reduce them.
Mookerjee, Shona A.; Gerencser, Akos A.; Nicholls, David G.; Brand, Martin D.
2017-01-01
Partitioning of ATP generation between glycolysis and oxidative phosphorylation is central to cellular bioenergetics but cumbersome to measure. We describe here how rates of ATP generation by each pathway can be calculated from simultaneous measurements of extracellular acidification and oxygen consumption. We update theoretical maximum ATP yields by mitochondria and cells catabolizing different substrates. Mitochondrial P/O ratios (mol of ATP generated per mol of [O] consumed) are 2.73 for oxidation of pyruvate plus malate and 1.64 for oxidation of succinate. Complete oxidation of glucose by cells yields up to 33.45 ATP/glucose with a maximum P/O of 2.79. We introduce novel indices to quantify bioenergetic phenotypes. The glycolytic index reports the proportion of ATP production from glycolysis and identifies cells as primarily glycolytic (glycolytic index > 50%) or primarily oxidative. The Warburg effect is a chronic increase in glycolytic index, quantified by the Warburg index. Additional indices quantify the acute flexibility of ATP supply. The Crabtree index and Pasteur index quantify the responses of oxidative and glycolytic ATP production to alterations in glycolysis and oxidative reactions, respectively; the supply flexibility index quantifies overall flexibility of ATP supply; and the bioenergetic capacity quantifies the maximum rate of total ATP production. We illustrate the determination of these indices using C2C12 myoblasts. Measurement of ATP use revealed no significant preference for glycolytic or oxidative ATP by specific ATP consumers. Overall, we demonstrate how extracellular fluxes quantitatively reflect intracellular ATP turnover and cellular bioenergetics. We provide a simple spreadsheet to calculate glycolytic and oxidative ATP production rates from raw extracellular acidification and respiration data. PMID:28270511
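The paper supplies a spreadsheet for these calculations; the snippet below is a heavily simplified, hypothetical sketch, not the authors' full accounting (which also corrects extracellular acidification for respiratory CO2 and buffering power). It shows how coupled respiration and lactate efflux might be converted into oxidative and glycolytic ATP rates and a glycolytic index; all input rates are made up, and only the maximum P/O of 2.79 for glucose comes from the abstract.

```python
# Heavily simplified sketch of converting extracellular flux measurements
# into ATP production rates and a glycolytic index. The full method also
# corrects acidification for CO2-derived protons and buffering power; those
# steps are omitted here and the input rates are hypothetical.
P_O_RATIO = 2.79            # max ATP per O atom for complete glucose oxidation (from the paper)
ATP_PER_LACTATE = 1.0       # glycolytic ATP per lactate exported (glucose -> 2 lactate + 2 ATP)

# Hypothetical measured rates (e.g. pmol/min per well).
ocr_total = 120.0           # total oxygen consumption rate (O2)
ocr_nonmito = 15.0          # non-mitochondrial OCR (e.g. after rotenone/antimycin A)
ocr_leak = 20.0             # proton-leak OCR (e.g. after oligomycin)
lactate_efflux = 90.0       # lactate export rate inferred from corrected acidification

ocr_coupled = ocr_total - ocr_nonmito - ocr_leak     # ATP-linked respiration (O2)
atp_ox = ocr_coupled * 2.0 * P_O_RATIO               # x2 converts O2 to O atoms
atp_glyc = lactate_efflux * ATP_PER_LACTATE

glycolytic_index = 100.0 * atp_glyc / (atp_glyc + atp_ox)
print(f"oxidative ATP: {atp_ox:.0f}, glycolytic ATP: {atp_glyc:.0f}")
print(f"glycolytic index: {glycolytic_index:.1f}% "
      f"({'primarily glycolytic' if glycolytic_index > 50 else 'primarily oxidative'})")
```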
Stanhope, Kimber L.; Schwarz, Jean Marc; Keim, Nancy L.; Griffen, Steven C.; Bremer, Andrew A.; Graham, James L.; Hatcher, Bonnie; Cox, Chad L.; Dyachenko, Artem; Zhang, Wei; McGahan, John P.; Seibert, Anthony; Krauss, Ronald M.; Chiu, Sally; Schaefer, Ernst J.; Ai, Masumi; Otokozawa, Seiko; Nakajima, Katsuyuki; Nakano, Takamitsu; Beysen, Carine; Hellerstein, Marc K.; Berglund, Lars; Havel, Peter J.
2009-01-01
Studies in animals have documented that, compared with glucose, dietary fructose induces dyslipidemia and insulin resistance. To assess the relative effects of these dietary sugars during sustained consumption in humans, overweight and obese subjects consumed glucose- or fructose-sweetened beverages providing 25% of energy requirements for 10 weeks. Although both groups exhibited similar weight gain during the intervention, visceral adipose volume was significantly increased only in subjects consuming fructose. Fasting plasma triglyceride concentrations increased by approximately 10% during 10 weeks of glucose consumption but not after fructose consumption. In contrast, hepatic de novo lipogenesis (DNL) and the 23-hour postprandial triglyceride AUC were increased specifically during fructose consumption. Similarly, markers of altered lipid metabolism and lipoprotein remodeling, including fasting apoB, LDL, small dense LDL, oxidized LDL, and postprandial concentrations of remnant-like particle–triglyceride and –cholesterol significantly increased during fructose but not glucose consumption. In addition, fasting plasma glucose and insulin levels increased and insulin sensitivity decreased in subjects consuming fructose but not in those consuming glucose. These data suggest that dietary fructose specifically increases DNL, promotes dyslipidemia, decreases insulin sensitivity, and increases visceral adiposity in overweight/obese adults. PMID:19381015
Large effects of consumer offense on ecosystem structure and function.
Chislock, Michael F; Sarnelle, Orlando; Olsen, Brianna K; Doster, Enrique; Wilson, Alan E
2013-11-01
Study of the role of within-species adaptation in ecological dynamics has focused largely on prey adaptations that reduce consumption risk (prey defense). Few, if any, studies have examined how consumer adaptations to overcome prey defenses (consumer offense) affect ecosystem structure and function. We manipulated two sets of genotypes of a planktonic herbivore (Daphnia pulicaria) in a highly productive ecosystem with abundant toxic prey (cyanobacteria). The two sets of consumer genotypes varied widely in their tolerance of toxic cyanobacteria in the diet (i.e., sensitive vs. tolerant). We found a large effect of tolerant D. pulicaria on phytoplankton biomass and gross primary productivity but no effect of sensitive genotypes, this result stemming from genotype-specific differences in population growth in the presence of toxic prey. The former effect was as large as effects seen in previous Daphnia manipulations at similar productivity levels. Thus, we demonstrated that the effect of consumer genotypes with contrasting offensive adaptations was as large as the effect of consumer presence/absence.
The Release of Nanosilver from Consumer Products Used in the Home
Benn, Troy; Cavanagh, Bridget; Hristovski, Kiril; Posner, Jonathan D.; Westerhoff, Paul
2016-01-01
Nanosilver has become one of the most widely used nanomaterials in consumer products because of its antimicrobial properties. Public concern over the potential adverse effects of nanosilver's environmental release has prompted discussion of federal regulation. In this paper, we assess several classes of consumer products for their silver content and potential to release nanosilver into water, air, or soil. Silver was quantified in a shirt, a medical mask and cloth, toothpaste, shampoo, detergent, a towel, a toy teddy bear, and two humidifiers. Silver concentrations ranged from 1.4 to 270,000 μg Ag g product−1. Products were washed in 500 mL of tap water to assess the potential release of silver into aqueous environmental matrices (wastewater, surface water, saliva, etc.). Silver was released in quantities up to 45 μg Ag g product−1, and size fractions were both larger and smaller than 100 nm. Scanning electron microscopy confirmed the presence of nanoparticle silver in most products as well as in the wash water samples. Four products were subjected to a toxicity characterization leaching procedure to assess the release of silver in a landfill. The medical cloth released an amount of silver comparable to the toxicity characterization limit. This paper presents methodologies that can be used to quantify and characterize silver and other nanomaterials in consumer products. The quantities of silver in consumer products can in turn be used to estimate real-world human and environmental exposure levels. PMID:21284285
Airborne multispectral identification of individual cotton plants using consumer-grade cameras
USDA-ARS?s Scientific Manuscript database
Although multispectral remote sensing using consumer-grade cameras has successfully identified fields of small cotton plants, improvements to detection sensitivity are needed to identify individual or small clusters of plants. The imaging sensors of consumer-grade cameras are based on a Bayer patter...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wendelberger, Laura Jean
In large datasets, it is time consuming or even impossible to pick out interesting images. Our proposed solution is to find statistics to quantify the information in each image and use those to identify and pick out images of interest.
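One simple statistic of the kind described is the Shannon entropy of an image's intensity histogram; the sketch below, using random arrays as stand-in images, ranks images by that score. This is a generic illustration, not necessarily the statistic chosen in the report.

```python
# Illustrative "information per image" statistic: Shannon entropy of the
# grey-level histogram. Random arrays stand in for real images; this is a
# generic example, not necessarily the statistic used in the report.
import numpy as np

rng = np.random.default_rng(0)
images = [rng.integers(0, 256, size=(64, 64)) for _ in range(5)]
images.append(np.full((64, 64), 128))          # a flat, "uninteresting" image


def histogram_entropy(img: np.ndarray) -> float:
    counts = np.bincount(img.ravel(), minlength=256)
    p = counts / counts.sum()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())      # bits per pixel


scores = [histogram_entropy(img) for img in images]
ranked = sorted(range(len(images)), key=lambda i: scores[i], reverse=True)
print("images ranked by entropy (most informative first):", ranked)
```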
Observing the operational significance of discord consumption
NASA Astrophysics Data System (ADS)
Gu, Mile; Chrzanowski, Helen M.; Assad, Syed M.; Symul, Thomas; Modi, Kavan; Ralph, Timothy C.; Vedral, Vlatko; Lam, Ping Koy
2012-09-01
Coherent interactions that generate negligible entanglement can still exhibit unique quantum behaviour. This observation has motivated a search beyond entanglement for a complete description of all quantum correlations. Quantum discord is a promising candidate. Here, we demonstrate that under certain measurement constraints, discord between bipartite systems can be consumed to encode information that can only be accessed by coherent quantum interactions. The inability to access this information by any other means allows us to use discord to directly quantify this `quantum advantage'. We experimentally encode information within the discordant correlations of two separable Gaussian states. The amount of extra information recovered by coherent interaction is quantified and directly linked with the discord consumed during encoding. No entanglement exists at any point of this experiment. Thus we introduce and demonstrate an operational method to use discord as a physical resource.
Quantification of tumor fluorescence during intraoperative optical cancer imaging
Judy, Ryan P.; Keating, Jane J.; DeJesus, Elizabeth M.; Jiang, Jack X.; Okusanya, Olugbenga T.; Nie, Shuming; Holt, David E.; Arlauckas, Sean P.; Low, Phillip S.; Delikatny, E. James; Singhal, Sunil
2015-01-01
Intraoperative optical cancer imaging is an emerging technology in which surgeons employ fluorophores to visualize tumors, identify tumor-positive margins and lymph nodes containing metastases. This study compares instrumentation to measure tumor fluorescence. Three imaging systems (Spectropen, Glomax, Flocam) measured and quantified fluorescent signal-to-background ratios (SBR) in vitro, murine xenografts, tissue phantoms and clinically. Evaluation criteria included the detection of small changes in fluorescence, sensitivity of signal detection at increasing depths and practicality of use. In vitro, spectroscopy was superior in detecting incremental differences in fluorescence than luminescence and digital imaging (Ln[SBR] = 6.8 ± 0.6, 2.4 ± 0.3, 2.6 ± 0.1, p = 0.0001). In fluorescent tumor cells, digital imaging measured higher SBRs than luminescence (6.1 ± 0.2 vs. 4.3 ± 0.4, p = 0.001). Spectroscopy was more sensitive than luminometry and digital imaging in identifying murine tumor fluorescence (SBR = 41.7 ± 11.5, 5.1 ± 1.8, 4.1 ± 0.9, p = 0.0001), and more sensitive than digital imaging at detecting fluorescence at increasing depths (SBR = 7.0 ± 3.4 vs. 2.4 ± 0.5, p = 0.03). Lastly, digital imaging was the most practical and least time-consuming. All methods detected incremental differences in fluorescence. Spectroscopy was the most sensitive for small changes in fluorescence. Digital imaging was the most practical considering its wide field of view, background noise filtering capability, and sensitivity to increasing depth. PMID:26563091
Dámaso, D; Moreno-López, M; Martínez-Beltrán, J
1977-01-01
The bacteriostatic activity of fosfomycin was studied in vitro against 1,243 clinical isolates of gram-positive cocci and 4,086 isolates of gram-negative bacilli that were obtained in 1973, 1974 and in the period from January to May of 1975. The MIC was determined by the agar diffusion method, quantifying it by means of a standard curve worked out with the strain E. coli NCTC 10418. A slight increase in resistance was observed in the gram-positive cocci: 64 μg/ml were inhibitory for 63% of the 249 isolates obtained in 1973, 59.1% of the 716 isolates obtained in 1974, and 57.5% of the 278 isolates from 1975. A slight loss of sensitivity was also observed in the gram-negative bacilli: the aforementioned concentration of fosfomycin inhibited 36% of the 742 isolates from 1973, 33.6% of the 2,387 isolates from 1974 and 32.6% of the 957 isolates from 1975. A total of 933 g of this antibiotic was consumed in our hospital in 1973, 4,203 g in 1974 and 957 g in 1975; the consumption rate per patient per year was 0.15, 0.72 and 0.20 g, respectively. In conclusion, although no change was observed in the sensitivity of some bacterial strains to fosfomycin, the overall study indicates a slight decrease in sensitivity, although this apparently bears no relationship to the consumption of fosfomycin in our hospital.
Spatial variations in δ13C and δ15N values of primary consumers in a coastal lagoon
NASA Astrophysics Data System (ADS)
Como, S.; Magni, P.; Van Der Velde, G.; Blok, F. S.; Van De Steeg, M. F. M.
2012-12-01
The analysis of the contribution of a food source to a consumer's diet or the trophic position of a consumer is highly sensitive to the variability of the isotopic values used as input data. However, little is known in coastal lagoons about the spatial variations in the isotopic values of primary consumers considered 'end members' in the isotope mixing models for quantifying the diet of secondary consumers or as a baseline for estimating the trophic position of consumers higher up in the food web. We studied the spatial variations in the δ13C and δ15N values of primary consumers and sedimentary organic matter (SOM) within a selected area of the Cabras lagoon (Sardinia, Italy). Our aim was to assess how much of the spatial variation in isotopic values of primary consumers was due to the spatial variability between sites and how much was due to differences over short distances from the shore. Samples were collected at four stations (50-100 m apart) selected randomly at two sites (1.5-2 km apart) chosen randomly at two distances from the shore (i.e. in proximity to the shore, 'Nearshore', and about 200 m away from the shore, 'Offshore'). The sampling was repeated in March, May and August 2006 using new sites at the two chosen distances from the shore on each date. The isotopic values of size-fractionated seston and macrophytes were also analyzed as a complementary characterization of the study area. While δ15N did not show any spatial variations, the δ13C values of deposit feeders, Alitta (=Neanthes) succinea, Lekanesphaera hookeri, Hydrobia acuta and Gammarus aequicauda, were more depleted Offshore than Nearshore. For these species, there were significant effects of distance or distance × date on the mean δ13C values, irrespective of the intrinsic variation between sites. SOM showed similar spatial variations in δ13C values, with Nearshore-Offshore differences up to 6‰. This indicates that the spatial isotopic changes are transferred from the food sources to the deposit feeders studied. In contrast, δ13C and δ15N values of suspension feeders, Ficopomatus enigmaticus and Amphibalanus amphitrite, did not show major variations, either between sites or between Nearshore and Offshore. These different patterns between deposit feeders and suspension feeders are probably due to a weaker trophic link of the latter with SOM. We suggest that the Nearshore-Offshore gradient might be an important source of isotopic variation that needs to be considered in future food web studies in coastal lagoons.
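For context, a standard two-source, single-isotope mixing calculation of the kind these end-member values feed into is sketched below; the δ13C end members and the trophic discrimination factor are hypothetical, not data from this study.

```python
# Standard two-source, single-isotope mixing model of the kind that uses
# primary consumers as end members. All delta values and the trophic
# discrimination factor below are hypothetical, not data from the study.
D13C_SOURCE_NEARSHORE = -18.0   # per mil, hypothetical end member 1
D13C_SOURCE_OFFSHORE = -24.0    # per mil, hypothetical end member 2
TROPHIC_DISCRIMINATION = 0.8    # per mil enrichment per trophic step (assumed)


def nearshore_fraction(d13c_consumer: float) -> float:
    """Fraction of the diet derived from the nearshore end member."""
    corrected = d13c_consumer - TROPHIC_DISCRIMINATION
    f = (corrected - D13C_SOURCE_OFFSHORE) / (D13C_SOURCE_NEARSHORE - D13C_SOURCE_OFFSHORE)
    return min(1.0, max(0.0, f))     # clamp to the feasible range


for d13c in (-19.5, -21.0, -23.0):
    print(f"consumer d13C = {d13c:+.1f} -> nearshore fraction = {nearshore_fraction(d13c):.2f}")
```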
NASA Astrophysics Data System (ADS)
Yoshikawa, K.; Ueyama, M.; Takagi, K.; Kominami, Y.
2015-12-01
Methane (CH4) budgets in forest ecosystems have not been accurately quantified due to limited measurements and considerable spatiotemporal heterogeneity. In order to quantify CH4 fluxes at temperate forests at various spatiotemporal scales, we have continuously measured CH4 fluxes at two upland forests based on the micrometeorological hyperbolic relaxed eddy accumulation (HREA) and automated dynamic closed chamber methods. The measurements have been conducted at Teshio experimental forest (TSE) since September 2013 and Yamashiro forest meteorology research site (YMS) since November 2014. Three automated chambers were installed at each site. Our system can measure CH4 flux by the micrometeorological HREA method, vertical concentration profiles at four heights, and chamber measurements using a laser-based gas analyzer (FGGA-24r-EP, Los Gatos Research Inc., USA). Seasonal variations of canopy-scale CH4 fluxes differed between the two sites. CH4 was consumed during the summer but emitted during the fall and winter at TSE; consequently, the site acted as a net annual CH4 source. At YMS, CH4 was steadily consumed during the winter, but CH4 fluxes fluctuated between uptake and emission during the spring and summer; YMS acted as a net annual CH4 sink. CH4 uptake at the canopy scale generally decreased with rising soil temperature and increased under drier conditions at both sites. CH4 fluxes measured by most of the chambers showed sensitivity to the environmental variables consistent with that observed at the canopy scale. CH4 fluxes from a few chambers located in wet conditions were independent of variations in soil temperature and moisture at both sites. The magnitude of soil CH4 uptake was higher than that of canopy-scale CH4 uptake. Our results showed that the canopy-scale CH4 fluxes differed markedly from the plot-scale CH4 fluxes measured by the chambers, suggesting considerable spatial heterogeneity in CH4 flux at these temperate forests.
The social costs of dangerous products: an empirical investigation.
Shapiro, Sidney; Ruttenberg, Ruth; Leigh, Paul
2009-01-01
Defective consumer products impose significant costs on consumers and third parties when they cause fatalities and injuries. This Article develops a novel approach to measuring the true extent of such costs, which may not be accurately captured under current methods of estimating the cost of dangerous products. Current analysis rests on a narrowly defined set of costs, excluding certain types of costs. The cost-of-injury estimates utilized in this Article address this omission by quantifying and incorporating these costs to provide a more complete picture of the true impact of defective consumer products. The new estimates help to gauge the true value of the civil liability system.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Beach, Connor A.; Krumm, Christoph; Spanjers, Charles S.
Analysis of trace compounds, such as pesticides and other contaminants, within consumer products, fuels, and the environment requires quantification of increasingly complex mixtures of difficult-to-quantify compounds.
Sensitizing Black Adult and Youth Consumers to Targeted Food Marketing Tactics in Their Environments
Isselmann DiSantis, Katherine; Kumanyika, Shiriki; Rohm Young, Deborah; Grier, Sonya A.; Lassiter, Vikki
2017-01-01
Food marketing environments of Black American consumers are heavily affected by ethnically-targeted marketing of sugar sweetened beverages, fast foods, and other products that may contribute to caloric overconsumption. This qualitative study assessed Black consumers’ responses to targeted marketing. Black adults (2 mixed gender groups; total n = 30) and youth (2 gender specific groups; total n = 35) from two U.S. communities participated before and after a sensitization procedure—a critical practice used to understand social justice concerns. Pre-sensitization focus groups elicited responses to scenarios about various targeted marketing tactics. Participants were then given an informational booklet about targeted marketing to Black Americans, and all returned for the second (post-sensitization) focus group one week later. Conventional qualitative content analysis of transcripts identified several salient themes: seeing the marketer’s perspective (“it’s about demand”; “consumers choose”), respect for community (“marketers are setting us up for failure”; “making wrong assumptions”), and food environments as a social justice issue (“no one is watching the door”; “I didn’t realize”). Effects of sensitization were reflected in participants’ stated reactions to the information in the booklet, and also in the relative occurrence of marketer-oriented themes and social justice-oriented themes, respectively, less and more after sensitization. PMID:29109377
Characteristics and consumption patterns of Australian organic consumers.
Oates, Liza; Cohen, Marc; Braun, Lesley
2012-11-01
Increasingly, Australians are choosing to consume organically produced food, but only a small percentage consume organic food exclusively, and there is little information in the scientific literature that describes their actual level of intake. In order to provide a more meaningful description of Australian organic consumers the 'Organic Consumption Survey' and 'Organic Food Intake Survey' were conducted online in 2010. The aims were to provide information about the characteristics of regular organic consumers and quantify levels of organic consumption. The majority of participants (n = 318) were female (80.3%), 25-55 years old (80.3%), living in urban areas (61.2%), born in Australia (68.9%) and were in a healthy weight range (55.5%). Organic fruit and vegetables had the highest uptake by organic consumers and meat products the lowest. The majority of participants consumed at least 65% organic food in their diet, including 35% certified organic food. A better understanding of organic consumers may help to serve the long-term interests of the organic industry and other stakeholders of food marketing. Clearer definitions of organic consumers may also inform research evaluating the purported health benefits of organic foods. Copyright © 2012 Society of Chemical Industry.
Assessing contributory risk using economic input-output life-cycle analysis.
Miller, Ian; Shelly, Michael; Jonmaire, Paul; Lee, Richard V; Harbison, Raymond D
2005-04-01
The contribution of consumer purchases of non-essential products to environmental pollution is characterized. Purchase decisions by consumers induce a complex sequence of economy-wide production interactions that influence the production and consumption of chemicals and subsequent exposure and possible public health risks. An economic input-output life-cycle analysis (EIO-LCA) was used to link resource consumption and production by manufacturers to corresponding environmental impacts. Using the US Department of Commerce's input-output tables together with the US Environmental Protection Agency's Toxics Release Inventory and AIRData databases, the economy-wide air discharges resulting from purchases of household appliances, motor homes, and games and toys were quantified. The economic and environmental impacts generated from a hypothetical 10,000 US dollar purchase for selected consumer items were estimated. The analysis shows how purchases of seemingly benign consumer products increase the output of air pollutants along the supply chain and contribute to the potential risks associated with environmental chemical exposures to both consumers and non-consumers alike.
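The EIO-LCA step that links a final-demand purchase to economy-wide output and emissions can be written as x = (I − A)^-1 y, with sector emission intensities applied to x. The three-sector matrices below are fabricated illustrations, not the Department of Commerce input-output tables or the EPA inventories used in the study.

```python
# Minimal economic input-output life-cycle sketch: a final-demand purchase y
# induces total sector output x = (I - A)^-1 y, and sector emission
# intensities f convert that output into economy-wide air releases.
# The 3-sector A matrix, intensities, and purchase are fabricated examples.
import numpy as np

A = np.array([           # technical coefficients: inputs per $ of sector output
    [0.10, 0.05, 0.02],
    [0.20, 0.10, 0.10],
    [0.05, 0.15, 0.05],
])
f = np.array([0.4, 1.2, 0.8])    # kg of air pollutant per $1000 of output (hypothetical)

y = np.array([0.0, 10.0, 0.0])   # final demand: a $10,000 purchase from sector 2 (in $1000)

x = np.linalg.solve(np.eye(3) - A, y)     # total (direct + indirect) output by sector
emissions = f * x                          # kg released per sector along the supply chain

print("total output by sector ($1000):", np.round(x, 2))
print(f"economy-wide emissions: {emissions.sum():.2f} kg")
```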
Involving consumers in product design through collaboration: the case of online role-playing games.
Yeh, Shu-Yu
2010-12-01
The release of software attributes to users by software designers for the creation of user-designed forms is regarded as a producer-consumer collaboration, leading consumers to expend significant effort on a specific product. This article identifies such software/product attributes within online role-playing games and then explores how consumers' prior experience affects the evaluation of such attributes. In this article, product attributes comprise customized, content, and interactive externality-sensitive and complementary externality-sensitive attributes, with the value of each attribute being greater for experts than for novices. In Study 1, data were collected and analyzed for the purpose of identifying such features in online role-playing games. The results can also be generalized to convergent products, such as TV games that have been redesigned as online games or mobile games found in Study 2. For the introduction of a convergent product to be successful, our research suggests that the potential market-segment focus should be on knowledgeable consumers who accept such products more readily.
Song, Anna V; Brown, Paul; Glantz, Stanton A
2014-02-01
In its graphic warning label regulations on cigarette packages, the Food and Drug Administration severely discounts the benefits of reduced smoking because of the lost "pleasure" smokers experience when they stop smoking; this is quantified as lost "consumer surplus." Consumer surplus is grounded in rational choice theory. However, empirical evidence from psychological cognitive science and behavioral economics demonstrates that the assumptions of rational choice are inconsistent with complex multidimensional decisions, particularly smoking. Rational choice does not account for the roles of emotions, misperceptions, optimistic bias, regret, and cognitive inefficiency that are germane to smoking, particularly because most smokers begin smoking in their youth. Continued application of a consumer surplus discount will undermine sensible policies to reduce tobacco use and other policies to promote public health.
An analysis of strategic price setting in retail gasoline markets
NASA Astrophysics Data System (ADS)
Jaureguiberry, Florencia
This dissertation studies price-setting behavior in the retail gasoline industry. The main questions addressed are: How important are a retail station's brand and proximity to competitors when retail stations set prices? How do retailers adjust their pricing when they cater to consumers who are less aware of competing options or have less discretion over where they purchase gasoline? These questions are explored in two separate analyses using unique datasets containing the retail pricing behavior of stations in California and in 24 different metropolitan areas. The evidence suggests that brand and location generate local market power for gasoline stations. After controlling for market and station characteristics, the analysis finds a spread of 11 cents per gallon between the highest- and lowest-priced retail gasoline brands. The analysis also indicates that when the nearest competitor is located over 2 miles away as opposed to next door, consumers will pay an additional 1 cent per gallon of gasoline. In order to quantify the significance of local market power, data for stations located near major airport rental car locations are utilized. The presumption here is that rental car users are less aware of, or less sensitive to, fueling options near the rental car return location and are to some extent "captured consumers". Retailers located near rental car locations have incentives to adjust their pricing strategies to exploit this. The analysis of pricing near rental car locations indicates that retailers charge prices that are 4 cents per gallon higher than other stations in the same metropolitan area. This analysis is of interest to regulators who are concerned with issues of consolidation, market power, and pricing in the retail gasoline industry. This dissertation concludes with a discussion of the policy implications of the empirical analysis.
Marine fisheries declines viewed upside down: human impacts on consumer-driven nutrient recycling.
Layman, Craig A; Allgeier, Jacob E; Rosemond, Amy D; Dahlgren, Craig P; Yeager, Lauren A
2011-03-01
We quantified how two human impacts (overfishing and habitat fragmentation) in nearshore marine ecosystems may affect ecosystem function by altering the role of fish as nutrient vectors. We empirically quantified size-specific excretion rates of one of the most abundant fishes (gray snapper, Lutjanus griseus) in The Bahamas and combined these with surveys of fish abundance to estimate population-level excretion rates. The study was conducted across gradients of two human disturbances: overfishing and ecosystem fragmentation (estuaries bisected by roads), to evaluate how each could result in reduced population-level nutrient cycling by consumers. Mean estimated N and P excretion rates for gray snapper populations were on average 456% and 541% higher, respectively, in unfished sites. Ecosystem fragmentation resulted in significant reductions of recycling rates by snapper, with degree of creek fragmentation explaining 86% and 72% of the variance in estimated excretion for dissolved N and P, respectively. Additionally, we used nutrient limitation assays and primary producer nutrient content to provide a simple example of how marine fishery declines may affect primary production. This study provides an initial step toward integrating marine fishery declines and consumer-driven nutrient recycling to more fully understand the implications of human impacts in marine ecosystems.
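The scaling from individual fish to population-level nutrient supply amounts to summing size-specific excretion over the observed size-abundance distribution. In the sketch below the allometric coefficients and survey counts are hypothetical placeholders, not the study's fitted values.

```python
# Sketch of scaling size-specific excretion to the population level:
# per-capita excretion is modelled allometrically (E = a * mass^b) and
# summed over the surveyed size distribution. Coefficients and counts are
# hypothetical, not the study's fitted values.
A_N, B_N = 0.5, 0.80      # hypothetical allometric parameters for N excretion (umol/h)
A_P, B_P = 0.05, 0.75     # hypothetical allometric parameters for P excretion (umol/h)

# Survey result: (mean wet mass in g, number of gray snapper observed) per size class.
size_classes = [(50.0, 120), (150.0, 60), (400.0, 25), (900.0, 5)]

n_rate = sum(count * A_N * mass**B_N for mass, count in size_classes)
p_rate = sum(count * A_P * mass**B_P for mass, count in size_classes)

print(f"population-level excretion: {n_rate:.0f} umol N/h, {p_rate:.0f} umol P/h")
```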
Use of automated monitoring to assess behavioral toxicology in fish: Linking behavior and physiology
Brewer, S.K.; DeLonay, A.J.; Beauvais, S.L.; Little, E.E.; Jones, S.B.
1999-01-01
We measured locomotory behaviors (distance traveled, speed, tortuosity of path, and rate of change in direction) with computer-assisted analysis in 30 day posthatch rainbow trout (Oncorhynchus mykiss) exposed to pesticides. We also examined cholinesterase inhibition as a potential endpoint linking physiology and behavior. Sublethal exposure to chemicals often causes changes in swimming behavior, reflecting alterations in sensory and motor systems. Swimming behavior also integrates functions of the nervous system. Rarely are the connections between physiology and behavior made. Although behavior is often suggested as a sensitive, early indicator of toxicity, behavioral toxicology has not been used to its full potential because conventional methods of behavioral assessment have relied on manual techniques, which are often time-consuming and difficult to quantify. This has severely limited the application and utility of behavioral procedures. Swimming behavior is particularly amenable to computerized assessment and automated monitoring. Locomotory responses are sensitive to toxicants and can be easily measured. We briefly discuss the use of behavior in toxicology and automated techniques used in behavioral toxicology. We also describe the system we used to determine locomotory behaviors of fish, and present data demonstrating the system's effectiveness in measuring alterations in response to chemical challenges. Lastly, we correlate behavioral and physiological endpoints.
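The locomotory endpoints named here can be computed directly from digitized (x, y, t) tracks. The sketch below shows one common formulation (path length, mean speed, straightness-based tortuosity, and mean turning rate) on a fabricated track; it is an illustration, not the commercial tracking system used in the study.

```python
# One common way to compute locomotory endpoints from a digitized (x, y, t)
# track: total distance, mean speed, tortuosity (path length / net
# displacement), and mean rate of change in direction. The short track
# below is fabricated for illustration.
import math

track = [(0.0, 0.0, 0.0), (1.0, 0.2, 1.0), (2.1, 0.1, 2.0),
         (2.8, 1.0, 3.0), (3.5, 2.2, 4.0)]          # (x cm, y cm, t s)

steps = list(zip(track[:-1], track[1:]))
step_lengths = [math.hypot(x2 - x1, y2 - y1) for (x1, y1, _), (x2, y2, _) in steps]
total_distance = sum(step_lengths)
elapsed = track[-1][2] - track[0][2]
mean_speed = total_distance / elapsed

net_displacement = math.hypot(track[-1][0] - track[0][0], track[-1][1] - track[0][1])
tortuosity = total_distance / net_displacement       # 1.0 = perfectly straight path

headings = [math.atan2(y2 - y1, x2 - x1) for (x1, y1, _), (x2, y2, _) in steps]
turns = [abs(math.atan2(math.sin(b - a), math.cos(b - a)))   # wrapped turning angles
         for a, b in zip(headings[:-1], headings[1:])]
turn_rate = math.degrees(sum(turns)) / elapsed

print(f"distance {total_distance:.2f} cm, speed {mean_speed:.2f} cm/s, "
      f"tortuosity {tortuosity:.2f}, turning {turn_rate:.1f} deg/s")
```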
Cybersecurity: Authoritative Reports and Resources
2013-10-25
[Search-result excerpt; recoverable hearing listings: "Reporting Data Breaches: Is Federal Legislation Needed to Protect Consumers?" (July 18, 2013; Energy and Commerce); "Cyber Espionage and the Theft of U.S. Intellectual Property and..." (Energy and Commerce, Oversight and Investigations); a June 15, 2011 hearing on protection for sensitive consumer data and timely notification in case of breach (Energy and Commerce; Commerce, Manufacturing, and ...).]
Cybersecurity: Authoritative Reports and Resources
2013-04-17
[Search-result excerpt; recoverable hearing listings: "Cybersecurity: An Overview of Risks to Critical Infrastructure" (July 26, 2011; Energy and Commerce, Oversight and Investigations); a June 15, 2011 hearing on greater protection for sensitive consumer data and timely notification in case of breach (Energy and Commerce); a June 21, 2011 hearing on the financial sector (Commerce, Science and Transportation); "Privacy and Data Security: Protecting Consumers in the Modern World" (June 29, 2011).]
Ranjit, Suman; Dobrinskikh, Evgenia; Montford, John; Dvornikov, Alexander; Lehman, Allison; Orlicky, David J.; Nemenoff, Raphael; Gratton, Enrico; Levi, Moshe; Furgeson, Seth
2017-01-01
All forms of progressive renal disease converge on a final pathway of tubulointerstitial fibrosis and glomerulosclerosis. Renal fibrosis is usually quantified using histological staining, a process that is time-consuming and pathologist dependent. The work described here shows the development of a fast and operator-independent method to measure fibrosis. To study renal fibrosis, the unilateral ureteral obstruction (UUO) model was chosen. Mice develop a time-dependent increase in fibrosis in the obstructed kidneys; contralateral kidneys are used as controls. After UUO, kidneys were analyzed at three time points: 7 days, 14 days, and 21 days. Fibrosis was investigated using FLIM (Fluorescence Lifetime Imaging) and SHG (Second Harmonic Generation) in the deep tissue imaging microscope called DIVER (Deep Imaging via Enhanced photon Recovery). This microscope was developed for deep tissue, SHG and THG (Third Harmonic Generation) imaging and has extraordinary sensitivity towards harmonic generation. SHG data suggest the presence of more fibrillar collagen in the diseased kidneys. The combination of short-wavelength FLIM and SHG analysis yields a robust procedure that is independent of observer interpretation and allowed us to create a criterion to quantify the extent of fibrosis directly from the image. The progression of fibrosis in the UUO model has been studied using this new FLIM-SHG technique, which shows a remarkable improvement in the quantification of fibrosis compared with standard histological techniques. PMID:27555119
Moisture-induced caking of beverage powders.
Chávez Montes, Edgar; Santamaría, Nadia Ardila; Gumy, Jean-Claude; Marchal, Philippe
2011-11-01
Beverage powders can exhibit caking during storage due to high temperature and moisture conditions, leading to consumer dissatisfaction. Caking problems can be aggravated by the presence of sensitive ingredients. The caking behaviour of cocoa beverage powders, with varying amounts of a carbohydrate sensitive ingredient, as affected by climate conditions was studied in this work. Sorption isotherms of beverage powders were determined at water activities (a(w)) ranging from 0.1 to 0.6 in a moisture sorption analyser by gravimetry and fitted to the Brunauer-Emmett-Teller (BET) or the Guggenheim-Anderson-de Boer (GAB) equation. Glass transition temperatures (T(g)) at several a(w) were analysed by differential scanning calorimetry and fitted to the Gordon-Taylor equation. The deduced T(g) = f(a(w)) functions helped to identify stability or caking zones. Specific experimental methods, based on the analysis of mechanical properties of powder cakes formed under compression, were used to quantify the degree of caking. Pantry tests complemented this study by documenting the visual perception of powder caking with increasing a(w). The glass transition approach was useful to predict the risks of caking but was limited to products where T(g) can be measured. On the other hand, quantification of the caking degree by analysis of mechanical properties allowed estimation of the extent of degradation for each product. This work demonstrated that increasing amounts of a carbohydrate sensitive ingredient in cocoa beverages negatively affected their storage stability. Copyright © 2011 Society of Chemical Industry.
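As an editorial illustration of the fitting steps described above, the following minimal sketch fits a GAB sorption isotherm and a Gordon-Taylor glass-transition curve to entirely hypothetical moisture and DSC data; parameter values and data are assumptions, not the authors' results.

```python
import numpy as np
from scipy.optimize import curve_fit

def gab(aw, m0, C, K):
    # equilibrium moisture content (g/100 g solids) vs water activity
    return m0 * C * K * aw / ((1 - K * aw) * (1 - K * aw + C * K * aw))

def gordon_taylor(w_water, Tg_solid, k, Tg_water=-135.0):
    # Tg (degC) of a binary solid-water mixture; w_water is the water mass fraction
    w_solid = 1.0 - w_water
    return (w_solid * Tg_solid + k * w_water * Tg_water) / (w_solid + k * w_water)

aw = np.array([0.11, 0.23, 0.33, 0.44, 0.53, 0.60])
moisture = np.array([1.8, 2.9, 3.8, 5.1, 6.6, 8.2])            # hypothetical sorption data
(m0, C, K), _ = curve_fit(gab, aw, moisture, p0=[3, 10, 0.8])

w_water = moisture / (100 + moisture)                          # convert to mass fraction
Tg_obs = np.array([62, 48, 38, 25, 12, 2])                     # hypothetical DSC values (degC)
(Tg_solid, k), _ = curve_fit(lambda w, Tg_s, kk: gordon_taylor(w, Tg_s, kk),
                             w_water, Tg_obs, p0=[80, 5])
print(m0, C, K, Tg_solid, k)
```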
Attomole quantitation of protein separations with accelerator mass spectrometry
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vogel, J S; Grant, P G; Buccholz, B A
2000-12-15
Quantification of specific proteins depends on separation by chromatography or electrophoresis followed by chemical detection schemes such as staining and fluorophore adhesion. Chemical exchange of short-lived isotopes, particularly sulfur, is also prevalent despite the inconveniences of counting radioactivity. Physical methods based on isotopic and elemental analyses offer highly sensitive protein quantitation that has linear response over wide dynamic ranges and is independent of protein conformation. Accelerator mass spectrometry quantifies long-lived isotopes such as 14C to sub-attomole sensitivity. We quantified protein interactions with small molecules such as toxins, vitamins, and natural biochemicals at precisions of 1-5%. Micro-proton-induced X-ray emission quantifies elemental abundances in separated metalloprotein samples to nanogram amounts and is capable of quantifying phosphorylated loci in gels. Accelerator-based quantitation is a possible tool for quantifying the translation of the genome into the proteome.
Newsome, Seth D.; Bentall, Gena B.; Tinker, M. Tim; Oftedal, Olav T.; Ralls, Katherine; Estes, James A.; Fogel, Marilyn L.
2010-01-01
The ability to quantify dietary inputs using stable isotope data depends on accurate estimates of isotopic differences between a consumer (c) and its diet (d), commonly referred to as trophic discrimination factors (TDFs) and denoted by Δc-d. At present, TDFs are available for only a few mammals and are usually derived in captive settings. The magnitude of TDFs and the degree to which they vary in wild populations is unknown. We determined δ13C and δ15N TDFs for vibrissae (i.e., whiskers), a tissue that is rapidly becoming an informative isotopic substrate for ecologists, of a wild population of sea otters for which individual diet has been quantified through extensive observational study. This is one of the very few studies that report TDFs for free-living wild animals feeding on natural diets. Trophic discrimination factors of 2.2 ± 0.7 for δ13C and 3.5 ± 0.6 for δ15N (mean ± SD) were similar to those reported for captive carnivores, and variation in individual δ13C TDFs was negatively but significantly related to sea urchin consumption. This pattern may relate to the lipid-rich diet consumed by most sea otters in this population and suggests that it may not be appropriate to lipid-extract prey samples when using the isotopic composition of keratinaceous tissues to examine diet in consumers that frequently consume lipid-rich foods, such as many marine mammals and seabirds. We suggest that inherent variation in TDFs should be included in isotopically based estimates of trophic level, food chain length, and mixing models used to quantify dietary inputs in wild populations; this practice will further define the capabilities and limitations of isotopic approaches in ecological studies.
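For reference, the defining relation for a trophic discrimination factor, and its use to back-calculate diet values from consumer tissue, can be written as below (an editorial note using the vibrissae values reported above; the notation is illustrative).

```latex
% TDF definition and its use to infer diet isotope values; units are per mil (‰).
\[
  \Delta_{c-d} \;=\; \delta_{\text{consumer}} \;-\; \delta_{\text{diet}},
  \qquad
  \hat{\delta}_{\text{diet}} \;=\; \delta_{\text{vibrissae}} \;-\; \Delta_{c-d}
\]
% Values reported above for wild sea otter vibrissae (mean ± SD):
% Delta 13C = 2.2 ± 0.7,  Delta 15N = 3.5 ± 0.6
```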
NASA Astrophysics Data System (ADS)
Sakai, Takamasa; Kohno, Motohiro; Hirae, Sadao; Nakatani, Ikuyoshi; Kusuda, Tatsufumi
1993-09-01
In this paper, we discuss a novel approach to semiconductor surface inspection based on analysis of the C-V curve measured noncontact by the metal-air-semiconductor (MAIS) technique. A new gap-sensing method using the so-called Goos-Haenchen effect was developed to achieve the noncontact C-V measurement. The MAIS technique exhibited sensitivity and repeatability comparable to those of conventional C-V measurement and, hence, good reproducibility and resolution for quantifying electrically active impurities on the order of 1 × 10⁹/cm², which is better than most spectrometric techniques, such as secondary ion mass spectroscopy (SIMS), electron spectroscopy for chemical analysis (ESCA) and Auger electron spectroscopy (AES), which are time-consuming and destructive. Because the measurement requires no electrical contact metal electrode, it suggests, for the first time, the possibility of measuring an intrinsic characteristic of the semiconductor surface, as illustrated with a concrete example.
Bliem, Rupert; Schauer, Sonja; Plicka, Helga; Obwaller, Adelheid; Sommer, Regina; Steinrigl, Adolf; Alam, Munirul; Reischer, Georg H.; Farnleitner, Andreas H.
2015-01-01
Vibrio cholerae is a severe human pathogen and a frequent member of aquatic ecosystems. Quantification of V. cholerae in environmental water samples is therefore fundamental for ecological studies and health risk assessment. Besides time-consuming cultivation techniques, quantitative PCR (qPCR) has the potential to provide reliable quantitative data and offers the opportunity to quantify multiple targets simultaneously. A novel triplex qPCR strategy was developed in order to simultaneously quantify toxigenic and nontoxigenic V. cholerae in environmental water samples. To obtain quality-controlled PCR results, an internal amplification control was included. The qPCR assay was specific, highly sensitive, and quantitative across the tested 5-log dynamic range down to a method detection limit of 5 copies per reaction. Repeatability and reproducibility were high for all three tested target genes. For environmental application, global DNA recovery (GR) rates were assessed for drinking water, river water, and water from different lakes. GR rates ranged from 1.6% to 76.4% and were dependent on the environmental background. Uncorrected and GR-corrected V. cholerae abundances were determined in two lakes with extremely high turbidity. Uncorrected abundances ranged from 4.6 × 10² to 2.3 × 10⁴ cell equivalents liter⁻¹, whereas GR-corrected abundances ranged from 4.7 × 10³ to 1.6 × 10⁶ cell equivalents liter⁻¹. GR-corrected qPCR results were in good agreement with an independent cell-based direct detection method but were up to 1.6 log higher than cultivation-based abundances. We recommend the newly developed triplex qPCR strategy as a powerful tool to simultaneously quantify toxigenic and nontoxigenic V. cholerae in various aquatic environments for ecological studies as well as for risk assessment programs. PMID:25724966
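As an editorial illustration of the quantification and recovery-correction steps described above (not the published assay's calibration), the following minimal sketch converts qPCR Cq values to copy numbers via a hypothetical standard curve and then applies a global DNA recovery (GR) correction; all numbers are assumptions.

```python
import numpy as np

slope, intercept = -3.32, 38.0        # hypothetical standard curve: Cq = slope*log10(copies) + intercept

def copies_from_cq(cq):
    return 10 ** ((cq - intercept) / slope)

cq_values = np.array([31.5, 29.8, 33.2])   # hypothetical sample Cq values
vol_filtered_l = 0.5                       # litres of water filtered per sample
reaction_fraction = 1 / 50                 # fraction of the DNA extract analysed per reaction
gr = 0.25                                  # measured global DNA recovery (25%)

copies_per_rxn = copies_from_cq(cq_values)
uncorrected = copies_per_rxn / reaction_fraction / vol_filtered_l   # cell equivalents per litre
gr_corrected = uncorrected / gr
print(uncorrected, gr_corrected)
```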
Is competition needed for ecological character displacement? Does displacement decrease competition?
Abrams, Peter A.; Cortez, Michael H.
2015-01-01
Interspecific competition for resources is generally considered to be the selective force driving ecological character displacement, and displacement is assumed to reduce competition. Skeptics of the prevalence of character displacement often cite lack of evidence of competition. The present article uses a simple model to examine whether competition is needed for character displacement and whether displacement reduces competition. It treats systems with competing resources, and considers cases when only one consumer evolves. It quantifies competition using several different measures. The analysis shows that selection for divergence of consumers occurs regardless of the level of between‐resource competition or whether the indirect interaction between the consumers is competition (−,−), mutualism (+,+), or contramensalism (+,−). Also, divergent evolution always decreases the equilibrium population size of the evolving consumer. Whether divergence of one consumer reduces or increases the impact of a subsequent perturbation of the other consumer depends on the parameters and the method chosen for measuring competition. Divergence in mutualistic interactions may reduce beneficial effects of subsequent increases in the other consumer's population. The evolutionary response is driven by an increase in the relative abundance of the resource the consumer catches more rapidly. Such an increase can occur under several types of interaction. PMID:26548922
Performance evaluation of spectral vegetation indices using a statistical sensitivity function
Ji, Lei; Peters, Albert J.
2007-01-01
A great number of spectral vegetation indices (VIs) have been developed to estimate biophysical parameters of vegetation. Traditional techniques for evaluating the performance of VIs are regression-based statistics, such as the coefficient of determination and root mean square error. These statistics, however, are not capable of quantifying the detailed relationship between VIs and biophysical parameters because the sensitivity of a VI is usually a function of the biophysical parameter instead of a constant. To better quantify this relationship, we developed a “sensitivity function” for measuring the sensitivity of a VI to biophysical parameters. The sensitivity function is defined as the first derivative of the regression function, divided by the standard error of the dependent variable prediction. The function elucidates the change in sensitivity over the range of the biophysical parameter. The Student's t- or z-statistic can be used to test the significance of VI sensitivity. Additionally, we developed a “relative sensitivity function” that compares the sensitivities of two VIs when the biophysical parameters are unavailable.
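A schematic form of the sensitivity function described above is given below as an editorial note; the symbols are illustrative, with f the fitted regression function relating the VI to the biophysical parameter x.

```latex
% Sensitivity of a vegetation index to a biophysical parameter x: first derivative
% of the fitted regression function, scaled by the standard error of the prediction.
\[
  S(x) \;=\; \frac{\mathrm{d}f(x)/\mathrm{d}x}{\operatorname{SE}\!\left[\hat{f}(x)\right]}
\]
% Significance of S(x) can then be assessed with a Student's t- or z-statistic.
```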
Song, Anna V.; Brown, Paul
2014-01-01
In its graphic warning label regulations on cigarette packages, the Food and Drug Administration severely discounts the benefits of reduced smoking because of the lost “pleasure” smokers experience when they stop smoking; this is quantified as lost “consumer surplus.” Consumer surplus is grounded in rational choice theory. However, empirical evidence from psychological cognitive science and behavioral economics demonstrates that the assumptions of rational choice are inconsistent with complex multidimensional decisions, particularly smoking. Rational choice does not account for the roles of emotions, misperceptions, optimistic bias, regret, and cognitive inefficiency that are germane to smoking, particularly because most smokers begin smoking in their youth. Continued application of a consumer surplus discount will undermine sensible policies to reduce tobacco use and other policies to promote public health. PMID:24328661
Consumer price sensitivity in Dutch health insurance.
van Dijk, Machiel; Pomp, Marc; Douven, Rudy; Laske-Aldershof, Trea; Schut, Erik; de Boer, Willem; de Boo, Anne
2008-12-01
To estimate the price sensitivity of consumer choice of health insurance firm. Using panel data on the flows of insured between pairs of Dutch sickness funds during the period 1993-2002, we estimate the sensitivity of these flows to differences in insurance premiums. The price elasticity of residual demand for health insurance was low during the period 1993-2002, confirming earlier findings based on annual changes in market share. We find small but significant elasticities for basic insurance but insignificant elasticities for supplementary insurance. Young enrollees are more price sensitive than older enrollees. Competition was weak in the market for health insurance during the period under study. For the market-based reforms that are currently under way, this implies that measures to promote competition in the health insurance industry may be needed.
Schroeder, Thomas J; Rodgers, Gregory B
2013-10-01
While unintentional injuries and hazard patterns involving consumer products have been studied extensively in recent years, little attention has focused on the characteristics of those who are hospitalized after treatment in emergency departments, as opposed to those treated and released. This study quantifies the impact of the age and sex of the injury victims, and other factors, on the likelihood of hospitalization. The analysis focuses on consumer product injuries, and was based on approximately 400,000 injury cases reported through the U.S. Consumer Product Safety Commission's National Electronic Injury Surveillance System, a national probability sample of U.S. hospital emergency departments. Logistic regression was used to quantify the factors associated with the likelihood of hospitalization. The analysis suggests a smooth U-shaped relationship between the age of the victim and the likelihood of hospitalization, declining from about 3.4% for children under age 5 years to 1.9% for 15-24 year-olds, but then rising to more than 25% for those ages 75 years and older. The likelihood of hospitalization was also significantly affected by the victim's sex, as well as by the types of products involved, fire involvement, and the size and type of hospital at which the injury was treated. This study shows that the probability of hospitalization is strongly correlated with the characteristics of those who are injured, as well as other factors. Published by Elsevier Ltd.
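As an editorial illustration of the kind of model described above (hospitalization versus treat-and-release), the following minimal sketch fits a logistic regression to synthetic data with age, sex, and fire-involvement predictors; the variables, coefficients, and data are assumptions, not the NEISS analysis itself.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 5000
age = rng.uniform(0, 90, n)
male = rng.integers(0, 2, n)
fire = (rng.random(n) < 0.05).astype(int)

# Synthetic U-shaped age effect plus sex and fire involvement
logit = -4.0 + 0.002 * (age - 20) ** 2 / 10 + 0.3 * male + 1.5 * fire
p_hosp = 1 / (1 + np.exp(-logit))
hospitalized = (rng.random(n) < p_hosp).astype(int)

X = np.column_stack([age, age ** 2, male, fire])
model = LogisticRegression(max_iter=1000).fit(X, hospitalized)
print(model.coef_, model.intercept_)
```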
National Prevalence and Effects of Multiple Chemical Sensitivities
Steinemann, Anne
2018-01-01
Objective: The aim of this study was to assess the prevalence of multiple chemical sensitivities (MCS), its co-occurrence with asthma and fragrance sensitivity, and effects from exposure to fragranced consumer products. Methods: A nationally representative cross-sectional population-based sample of adult Americans (n = 1137) was surveyed in June 2016. Results: Among the population, 12.8% report medically diagnosed MCS and 25.9% report chemical sensitivity. Of those with MCS, 86.2% experience health problems, such as migraine headaches, when exposed to fragranced consumer products; 71.0% are asthmatic; 70.3% cannot access places that use fragranced products such as air fresheners; and 60.7% lost workdays or a job in the past year due to fragranced products in the workplace. Conclusion: Prevalence of diagnosed MCS has increased over 300%, and self-reported chemical sensitivity over 200%, in the past decade. Reducing exposure to fragranced products could help reduce adverse health and societal effects. PMID:29329146
Nanomaterials containing metals are finding increasing use in consumer, industrial, and medical products, and they are subsequently being released into the environment. Methods for detecting, quantifying, and characterizing these materials in complex matrices are critical for the...
USDA-ARS's Scientific Manuscript database
Color is an important attribute that contributes to the appearance of a sweetpotato genotype. A consumer uses color, along with geometric attributes (e.g., gloss, luster, sheen, texture, opaqueness, shape), to subjectively evaluate the appearance of a sweetpotato root. Color can be quantified by t...
USDA-ARS's Scientific Manuscript database
All agricultural systems have environmental and societal costs and benefits that should be objectively quantified before recommending specific management practices. Agricultural biotechnology, which takes advantage of genetically engineered organisms (GEOs), along with organic cropping systems, econ...
Borrisser-Pairó, F; Panella-Riera, N; Gil, M; Kallas, Z; Linares, M B; Egea, M; Garrido, M D; Oliver, M A
2017-01-01
Boar taint is an unpleasant odour and flavour present in some entire male pigs that is due to the presence of androstenone and skatole. The aim of the study was to assess the sensitivity of 150 consumers to androstenone and to compare the acceptability and liking of meat from castrated and entire pigs, cooked with different cooking methods. Meat samples consisted of loins from castrated (CM) and entire male pigs (EM) with high levels of androstenone cooked by two cooking methods: sous-vide and fried/breaded with garlic and parsley. Consumers evaluated smell and flavour acceptability, and overall liking of CM and EM for each cooking method. The results of the study showed that dislike of androstenone odour increased significantly with sensitivity. The results of acceptability and overall liking were similar in CM and EM for both cooking methods. Therefore, the two cooking methods used in the study may be useful to mask boar taint. Copyright © 2016 Elsevier Ltd. All rights reserved.
An initial assessment of freight bottlenecks on highways.
DOT National Transportation Integrated Search
2005-10-01
This white paper is an initial effort to identify and quantify, on a national basis, highway bottlenecks that delay trucks and increase costs to businesses and consumers. The paper is the first to look specifically at the impacts and costs of highway...
Comparative Exposure Assessment of ESBL-Producing Escherichia coli through Meat Consumption
Pielaat, Annemarie; Smid, Joost H.; van Duijkeren, Engeline; Vennemann, Francy B. C.; Wijnands, Lucas M.; Chardon, Jurgen E.
2017-01-01
The presence of extended-spectrum β-lactamase (ESBL) and plasmidic AmpC (pAmpC) producing Escherichia coli (EEC) in food animals, especially broilers, has become a major public health concern. The aim of the present study was to quantify the EEC exposure of humans in The Netherlands through the consumption of meat from different food animals. Calculations were done with a simplified Quantitative Microbiological Risk Assessment (QMRA) model. The model took the effect of pre-retail processing, storage at the consumers home and preparation in the kitchen (cross-contamination and heating) on EEC numbers on/in the raw meat products into account. The contribution of beef products (78%) to the total EEC exposure of the Dutch population through the consumption of meat was much higher than for chicken (18%), pork (4.5%), veal (0.1%) and lamb (0%). After slaughter, chicken meat accounted for 97% of total EEC load on meat, but chicken meat experienced a relatively large effect of heating during food preparation. Exposure via consumption of filet americain (a minced beef product consumed raw) was predicted to be highest (61% of total EEC exposure), followed by chicken fillet (13%). It was estimated that only 18% of EEC exposure occurred via cross-contamination during preparation in the kitchen, which was the only route by which EEC survived for surface-contaminated products. Sensitivity analysis showed that model output is not sensitive for most parameters. However, EEC concentration on meat other than chicken meat was an important data gap. In conclusion, the model assessed that consumption of beef products led to a higher exposure to EEC than chicken products, although the prevalence of EEC on raw chicken meat was much higher than on beef. The (relative) risk of this exposure for public health is yet unknown given the lack of a modelling framework and of exposure studies for other potential transmission routes. PMID:28056081
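As an editorial illustration of a simplified exposure chain of the kind described above (retail concentration, kitchen cross-contamination, and heat inactivation), the following Monte Carlo sketch uses entirely hypothetical parameter values; it is not the published QMRA model.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000

log10_conc_retail = rng.normal(0.5, 1.0, n)           # log10 CFU/g on raw meat at retail (hypothetical)
serving_g = rng.normal(100, 20, n).clip(10, None)     # grams of meat handled/consumed per serving

transfer_frac = rng.beta(1, 200, n)                   # fraction transferred via cross-contamination
log10_heat_reduction = rng.uniform(4, 7, n)           # log10 inactivation from cooking

cfu_raw = 10 ** log10_conc_retail * serving_g
dose_cross = cfu_raw * transfer_frac                  # reaches the consumer unheated
dose_cooked = cfu_raw * 10 ** (-log10_heat_reduction) # survives cooking
dose_total = dose_cross + dose_cooked

print("median dose (CFU/serving):", np.median(dose_total))
print("share of exposure via cross-contamination:", dose_cross.sum() / dose_total.sum())
```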
Quantifying the sensitivity of post-glacial sea level change to laterally varying viscosity
NASA Astrophysics Data System (ADS)
Crawford, Ophelia; Al-Attar, David; Tromp, Jeroen; Mitrovica, Jerry X.; Austermann, Jacqueline; Lau, Harriet C. P.
2018-05-01
We present a method for calculating the derivatives of measurements of glacial isostatic adjustment (GIA) with respect to the viscosity structure of the Earth and the ice sheet history. These derivatives, or kernels, quantify the linearised sensitivity of measurements to the underlying model parameters. The adjoint method is used to enable efficient calculation of theoretically exact sensitivity kernels within laterally heterogeneous earth models that can have a range of linear or non-linear viscoelastic rheologies. We first present a new approach to calculate GIA in the time domain, which, in contrast to the more usual formulation in the Laplace domain, is well suited to continuously varying earth models and to the use of the adjoint method. Benchmarking results show excellent agreement between our formulation and previous methods. We illustrate the potential applications of the kernels calculated in this way through a range of numerical calculations relative to a spherically symmetric background model. The complex spatial patterns of the sensitivities are not intuitive, and this is the first time that such effects are quantified in an efficient and accurate manner.
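A schematic form of the linearised sensitivity described above is sketched below as an editorial note; the notation is illustrative, with J a GIA observable, eta the viscosity field, and I the ice-load history, and the kernels obtained from paired forward and adjoint calculations.

```latex
% Linearised sensitivity of an observable J to viscosity and ice-history perturbations.
\[
  \delta J
  \;=\;
  \int_{V} K_{\eta}(\mathbf{x})\, \delta \ln \eta(\mathbf{x}) \,\mathrm{d}V
  \;+\;
  \int_{\partial V} \int_{t_0}^{t_1} K_{I}(\mathbf{x},t)\, \delta I(\mathbf{x},t)\, \mathrm{d}t\, \mathrm{d}S
\]
```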
Economic Assessment of FMDv Releases from the National Bio and Agro Defense Facility
Pendell, Dustin L.; Marsh, Thomas L.; Coble, Keith H.; Lusk, Jayson L.; Szmania, Sara C.
2015-01-01
This study evaluates the economic consequences of hypothetical foot-and-mouth disease releases from the future National Bio and Agro Defense Facility in Manhattan, Kansas. Using an economic framework that estimates the impacts to agricultural firms and consumers, quantifies costs to non-agricultural activities in the epidemiologically impacted region, and assesses costs of response to the government, we find the distribution of economic impacts to be very significant. Furthermore, agricultural firms and consumers bear most of the impacts followed by the government and the regional non-agricultural firms. PMID:26114546
Gannon, Bryan M; Pungarcher, India; Mourao, Luciana; Davis, Christopher R; Simon, Philipp; Pixley, Kevin V; Tanumihardjo, Sherry A
2016-07-01
Crops such as maize, sorghum, and millet are being biofortified with provitamin A carotenoids to ensure adequate vitamin A (VA) intakes. VA assessment can be challenging because serum retinol concentrations are homeostatically controlled and more sensitive techniques are resource-intensive. We investigated changes in the serum retinol ¹³C/¹²C isotope amount ratio (expressed as δ¹³C), which arise from natural ¹³C fractionation in C3 compared with C4 plants, as a biomarker to detect provitamin A efficacy from biofortified (orange) maize and high-carotene carrots. The design was a 2 × 2 × 2 maize (orange compared with white) by carrot (orange compared with white) by VA fortificant (VA+ compared with VA-) study in weanling male Mongolian gerbils (n = 55), which included a 14-d VA depletion period and a 62-d treatment period (1 baseline and 8 treatment groups; n = 5-7/group). Liver VA and serum retinol were quantified, purified by HPLC, and analyzed by GC combustion isotope ratio mass spectrometry for ¹³C. Treatments affected liver VA concentrations (0.048 ± 0.039 to 0.79 ± 0.24 μmol/g; P < 0.0001) but not overall serum retinol concentrations (1.38 ± 0.22 μmol/L). Serum retinol and liver VA δ¹³C were significantly correlated (R² = 0.92; P < 0.0001). Serum retinol δ¹³C differentiated control groups that consumed white maize and white carrots (-27.1 ± 1.2‰) from treated groups that consumed orange maize and white carrots (-21.6 ± 1.4‰; P < 0.0001) and white maize and orange carrots (-30.6 ± 0.7‰; P < 0.0001). A prediction model demonstrated the relative contribution of orange maize to total dietary VA for groups that consumed VA from mixed sources. Provitamin A efficacy and quantitative estimation of the relative contribution to dietary VA were demonstrated with the use of serum retinol δ¹³C. This method could be used for maize efficacy or effectiveness studies and with other C4 crops biofortified with provitamin A carotenoids (e.g., millet, sorghum). Advantages include no extrinsic tracer dose, 1 blood sample, and higher sensitivity than serum retinol concentrations alone.
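A simplified two-end-member form of the prediction idea described above is noted below for illustration; it is not the authors' full model, and the end-member values would have to be measured for the specific diets used.

```latex
% Fraction of dietary vitamin A derived from the C4 source (e.g., orange maize),
% estimated from serum retinol d13C and two dietary end members.
\[
  f_{\mathrm{C4}}
  \;\approx\;
  \frac{\delta^{13}\mathrm{C}_{\text{serum retinol}} - \delta^{13}\mathrm{C}_{\mathrm{C3}}}
       {\delta^{13}\mathrm{C}_{\mathrm{C4}} - \delta^{13}\mathrm{C}_{\mathrm{C3}}},
  \qquad 0 \le f_{\mathrm{C4}} \le 1
\]
```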
A Simple Visual Estimation of Food Consumption in Carnivores
Potgieter, Katherine R.; Davies-Mostert, Harriet T.
2012-01-01
Belly-size ratings or belly scores are frequently used in carnivore research as a method of rating whether and how much an animal has eaten. This method provides only a rough ordinal measure of fullness and does not quantify the amount of food an animal has consumed. Here we present a method for estimating the amount of meat consumed by individual African wild dogs Lycaon pictus. We fed 0.5 kg pieces of meat to wild dogs being temporarily held in enclosures and measured the corresponding change in belly size using lateral side photographs taken perpendicular to the animal. The ratio of belly depth to body length was positively related to the mass of meat consumed and provided a useful estimate of the consumption. Similar relationships could be calculated to determine amounts consumed by other carnivores, thus providing a useful tool in the study of feeding behaviour. PMID:22567086
Buchmueller, Thomas C
2009-12-01
For many years, leading health care reform proposals have been based on market-oriented strategies. In the 1990s, a number of reform proposals were built around the concept of "managed competition," but more recently, "consumer-directed health care" models have received attention. Although price-conscious consumer demand plays a critical role in both the managed competition and consumer-directed health care models, the two strategies are based on different visions of the health care marketplace and the best way to use market forces to achieve greater systemwide efficiencies. This article reviews the research literature that tests the main hypotheses concerning the two policy strategies. Numerous studies provide consistent evidence that consumers' health plan choices are sensitive to out-of-pocket premiums. The elasticity of demand appears to vary with consumers' health risk, with younger, healthier individuals being more price sensitive. This heterogeneity increases the potential for adverse selection. Biased risk selection also is a concern when the menu of health plan options includes consumer-directed health plans. Several studies confirm that such plans tend to attract healthier enrollees. A smaller number of studies test the main hypothesis regarding consumer-directed health plans, which is that they result in lower medical spending than do more generous plans. These studies find little support for this claim. The experiences of employers that have adopted key elements of managed competition are generally consistent with the key hypotheses underlying that strategy. Research in this area, however, has focused on only a narrow range of questions. Because consumer-directed health care is such a recent phenomenon, research on this strategy is even more limited. Additional studies on both topics would be valuable.
Recent developments in identifying and quantifying emotions during food consumption.
Kenney, Erica; Adhikari, Koushik
2016-08-01
Emotions and the consumption of food and beverages are inextricably intertwined. As the fields of sensory and consumer science seek to better conceptualize the consumer experience, interest in emotion measurement is growing. Emotions can provide key information to differentiate between products and predict consumer choice as well as give more detail about product perception. There are several emotion measurement instruments, including physiological methods and facial recognition, self-reported verbal emotion measurement and self-reported visual emotion measurement. This review discusses the purpose of measuring emotions, the definition of an emotion, and the different instruments available, and touches upon some promising research to deepen the connection between food and emotions. © 2016 Society of Chemical Industry.
Consumer perceptions of strain differences in Cannabis aroma
DiVerdi, Joseph A.
2018-01-01
The smell of marijuana (Cannabis sativa L.) is of interest to users, growers, plant breeders, law enforcement and, increasingly, to state-licensed retail businesses. The numerous varieties and strains of Cannabis produce strikingly different scents but to date there have been few, if any, attempts to quantify these olfactory profiles directly. Using standard sensory evaluation techniques with untrained consumers we have validated a preliminary olfactory lexicon for dried cannabis flower, and characterized the aroma profile of eleven strains sold in the legal recreational market in Colorado. We show that consumers perceive differences among strains, that the strains form distinct clusters based on odor similarity, and that strain aroma profiles are linked to perceptions of potency, price, and smoking interest. PMID:29401526
Alcohol consumption promotes mammary tumor growth and insulin sensitivity
Hong, Jina; Holcomb, Valerie B.; Tekle, Samrawit A.; Fan, Betty; Núñez, Nomelí P.
2010-01-01
Epidemiological data show that in women, alcohol has a beneficial effect by increasing insulin sensitivity but also a deleterious effect by increasing breast cancer risk. These effects have not been shown concurrently in an animal model of breast cancer. Our objective is to identify a mouse model of breast cancer whereby alcohol increases insulin sensitivity and promotes mammary tumorigenesis. Our results from the glucose tolerance test and the homeostasis model assessment show that alcohol consumption improved insulin sensitivity. However, alcohol-consuming mice developed larger mammary tumors and developed them earlier than water-consuming mice. In vitro results showed that alcohol exposure increased the invasiveness of breast cancer cells in a dose-dependent manner. Thus, this animal model, an in vitro model of breast cancer, may be used to elucidate the mechanism(s) by which alcohol affects breast cancer. PMID:20202743
Hart, Andy; Hoekstra, Jeljer; Owen, Helen; Kennedy, Marc; Zeilmaker, Marco J; de Jong, Nynke; Gunnlaugsdottir, Helga
2013-04-01
The EU project BRAFO proposed a framework for risk-benefit assessment of foods, or changes in diet, that present both potential risks and potential benefits to consumers (Hoekstra et al., 2012a). In higher tiers of the BRAFO framework, risks and benefits are integrated quantitatively to estimate net health impact measured in DALYs or QALYs (disability- or quality-adjusted life years). This paper describes a general model that was developed by a second EU project, Qalibra, to assist users in conducting these assessments. Its flexible design makes it applicable to a wide range of dietary questions involving different nutrients, contaminants and health effects. Account can be taken of variation between consumers in their diets and also other characteristics relevant to the estimation of risk and benefit, such as body weight, gender and age. Uncertainty in any input parameter may be quantified probabilistically, using probability distributions, or deterministically by repeating the assessment with alternative assumptions. Uncertainties that are not quantified should be evaluated qualitatively. Outputs produced by the model are illustrated using results from a simple assessment of fish consumption. More detailed case studies on oily fish and phytosterols are presented in companion papers. The model can be accessed as web-based software at www.qalibra.eu. Copyright © 2012. Published by Elsevier Ltd.
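As an editorial illustration of a probabilistic net-health-impact calculation in the spirit described above (risks and benefits integrated in DALYs across a simulated population), the following sketch uses entirely hypothetical distributions and effect sizes; it is not the Qalibra model.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 50_000

intake_g_per_week = rng.lognormal(mean=np.log(150), sigma=0.6, size=n)   # e.g. fish intake

# Hypothetical dose-response slopes (DALYs per person-year per g/week);
# negative values represent DALYs averted (benefit).
benefit_slope = rng.normal(-2e-6, 5e-7, n)
risk_slope = rng.normal(5e-7, 2e-7, n)

net_daly = intake_g_per_week * (benefit_slope + risk_slope)
print("median net impact (DALY/person-year):", np.median(net_daly))
print("probability of net benefit:", np.mean(net_daly < 0))
```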
Jongeneel, W P; Delmaar, J E; Bokkers, B G H
2018-06-08
A methodology to assess the health impact of skin sensitizers is introduced, which consists of the comparison of the probabilistic aggregated exposure with a probabilistic (individual) human sensitization or elicitation induction dose. The health impact of potential policy measures aimed at reducing the concentration of a fragrance allergen, geraniol, in consumer products is analysed in a simulated population derived from multiple product use surveys. Our analysis shows that current dermal exposure to geraniol from personal care and household cleaning products leads to new cases of contact allergy and induces clinical symptoms in those already sensitized. We estimate that this exposure results yearly in 34 new cases of geraniol contact allergy per million consumers in Western and Northern Europe, mainly due to exposure to household cleaning products. About twice as many consumers (60 per million) are projected to suffer from clinical symptoms due to re-exposure to geraniol. Policy measures restricting geraniol concentrations to <0.01% will noticeably reduce new cases of sensitization and decrease the number of people with clinical symptoms as well as the frequency of occurrence of these clinical symptoms. The estimated numbers should be interpreted with caution and provide only a rough indication of the health impact. Copyright © 2018 The Authors. Published by Elsevier Ltd. All rights reserved.
Boonen, Lieke H H M; Schut, Frederik T; Koolman, Xander
2008-03-01
Consumer channeling is an important element in the insurer-provider bargaining process. Health insurers can influence provider choice by offering insurance contracts with restricted provider networks. Alternatively, they can offer contracts with unrestricted access and use incentives to motivate consumers to visit preferred providers. Little is known, however, about the effectiveness of this alternative strategy of consumer channeling. Using data from two natural experiments in the Dutch pharmacy market, we examine how consumers respond to incentives used by health insurers to influence their choice of provider. We find that consumers are sensitive to rather small incentives and that temporary incentives may have a long-term effect on provider choice. In addition, we find that both consumer and provider characteristics determine whether consumers are willing to switch to preferred pharmacies.
Medical tourists: who goes and what motivates them?
Gan, Lydia L; Frederick, James R
2013-01-01
This study relates consumers' attitudes toward medical tourism to a number of consumer characteristics, such as age, education, income, and insurance status. Principal components analysis of the attitudes of 289 consumers from various communities of North Carolina resulted in three attitude-related factors: economic, treatment-related, and travel-related. Major findings include: (a) the uninsured and low-income consumers are more sensitive to economic factors than the insured and the middle-income consumers; (b) the 51- to 64-year-olds are less motivated by economic factors than young adults; (c) surprisingly, the better one's health, the more one is motivated by treatment-related factors.
Probabilistic modeling of the flows and environmental risks of nano-silica.
Wang, Yan; Kalinina, Anna; Sun, Tianyin; Nowack, Bernd
2016-03-01
Nano-silica, the engineered nanomaterial with one of the largest production volumes, has a wide range of applications in consumer products and industry. This study aimed to quantify the exposure of nano-silica to the environment and to assess its risk to surface waters. Concentrations were calculated for four environmental (air, soil, surface water, sediments) and two technical compartments (wastewater, solid waste) for the EU and Switzerland using probabilistic material flow modeling. The corresponding median concentration in surface water is predicted to be 0.12 μg/l in the EU (0.053-3.3 μg/l, 15/85% quantiles). The concentrations in sediments in the complete sedimentation scenario were found to be the largest among all environmental compartments, with a median annual increase of 0.43 mg/kg · y in the EU (0.19-12 mg/kg · y, 15/85% quantiles). Moreover, probabilistic species sensitivity distributions (PSSD) were computed and the risk of nano-silica in surface waters was quantified by comparing the predicted environmental concentration (PEC) with the predicted no-effect concentration (PNEC) distribution, which was derived from the cumulative PSSD. This assessment suggests that nano-silica currently poses no risk to aquatic organisms in surface waters. Further investigations are needed to assess the risk of nano-silica in other environmental compartments, which is currently not possible due to a lack of ecotoxicological data. Copyright © 2015 Elsevier B.V. All rights reserved.
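As an editorial illustration of the probabilistic PEC/PNEC comparison described above, the following sketch draws exposure and no-effect concentrations from hypothetical lognormal summaries loosely shaped around the reported median; it is not the study's flow model or its species sensitivity distribution.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 100_000

pec_ug_per_l = rng.lognormal(mean=np.log(0.12), sigma=1.2, size=n)    # predicted environmental concentration
pnec_ug_per_l = rng.lognormal(mean=np.log(1000), sigma=1.5, size=n)   # drawn from an assumed sensitivity distribution

rcr = pec_ug_per_l / pnec_ug_per_l        # risk characterization ratio
print("median RCR:", np.median(rcr))
print("P(RCR > 1):", np.mean(rcr > 1))    # probability that exposure exceeds the no-effect level
```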
Assessing the Potential of Low-Cost 3D Cameras for the Rapid Measurement of Plant Woody Structure
Nock, Charles A; Taugourdeau, Olivier; Delagrange, Sylvain; Messier, Christian
2013-01-01
Detailed 3D plant architectural data have numerous applications in plant science, but many existing approaches for 3D data collection are time-consuming and/or require costly equipment. Recently, there has been rapid growth in the availability of low-cost 3D cameras and related open source software applications. 3D cameras may provide measurements of key components of plant architecture such as stem diameters and lengths; however, few tests of 3D cameras for the measurement of plant architecture have been conducted. Here, we measured Salix branch segments ranging from 2 to 13 mm in diameter with an Asus Xtion camera to quantify the limits and accuracy of branch diameter measurement with a 3D camera. By scanning at a variety of distances we also quantified the effect of scanning distance. In addition, we tested the ability of the program KinFu for continuous 3D object scanning and modeling, as well as other similar software, to accurately record stem diameters and capture plant form (<3 m in height). Given its ability to accurately capture the diameter of branches >6 mm, the Asus Xtion may provide a novel method for the collection of 3D data on the branching architecture of woody plants. Improvements in camera measurement accuracy and available software are likely to further improve the utility of 3D cameras for plant sciences in the future. PMID:24287538
Quantifying outdoor water consumption of urban land use/land cover: sensitivity to drought.
Kaplan, Shai; Myint, Soe W; Fan, Chao; Brazel, Anthony J
2014-04-01
Outdoor water use is a key component in arid city water systems for achieving sustainable water use and ensuring water security. Using evapotranspiration (ET) calculations as a proxy for outdoor water consumption, the objectives of this research are to quantify outdoor water consumption of different land use and land cover types, and compare the spatio-temporal variation in water consumption between drought and wet years. An energy balance model was applied to Landsat 5 TM time series images to estimate daily and seasonal ET for the Central Arizona Phoenix Long-Term Ecological Research region (CAP-LTER). Modeled ET estimations were correlated with water use data in 49 parks within CAP-LTER and showed good agreement (r² = 0.77), indicating model effectiveness to capture the variations across park water consumption. Seasonally, active agriculture shows high ET (>500 mm) for both wet and dry conditions, while the desert and urban land cover types experienced lower ET during drought (<300 mm). Within urban locales of CAP-LTER, xeric neighborhoods show significant differences from year to year, while mesic neighborhoods retain their ET values (400-500 mm) during drought, implying considerable use of irrigation to sustain their greenness. Considering the potentially limiting water availability of this region in the future due to large population increases and the threat of a warming and drying climate, maintaining large water-consuming, irrigated landscapes challenges sustainable practices of water conservation and the need to provide amenities of this desert area for enhancing quality of life.
Improving the accuracy of energy baseline models for commercial buildings with occupancy data
Liang, Xin; Hong, Tianzhen; Shen, Geoffrey Qiping
2016-07-07
More than 80% of energy is consumed during the operation phase of a building's life cycle, so energy efficiency retrofit for existing buildings is considered a promising way to reduce energy use in buildings. The investment strategies of retrofit depend on the ability to quantify energy savings by "measurement and verification" (M&V), which compares actual energy consumption to how much energy would have been used without retrofit (called the "baseline" of energy use). Although numerous models exist for predicting the baseline of energy use, a critical limitation is that occupancy has not been included as a variable. However, occupancy rate is essential for energy consumption and was emphasized by previous studies. This study develops a new baseline model which is built upon the Lawrence Berkeley National Laboratory (LBNL) model but includes the use of building occupancy data. The study also proposes metrics to quantify the accuracy of prediction and the impacts of variables. However, the results show that including occupancy data does not significantly improve the accuracy of the baseline model, especially for HVAC load. The reasons are discussed further. In addition, sensitivity analysis is conducted to show the influence of parameters in baseline models. To conclude, the results from this study can help us understand the influence of occupancy on energy use, improve energy baseline prediction by including the occupancy factor, reduce risks of M&V and facilitate investment strategies of energy efficiency retrofit.
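As an editorial illustration of comparing a baseline energy model with and without an occupancy term, the following sketch fits ordinary least squares regressions to synthetic hourly data and reports CV(RMSE), a common M&V accuracy metric; the variables and data are assumptions, not the LBNL model.

```python
import numpy as np

rng = np.random.default_rng(4)
n_hours = 24 * 90
outdoor_temp = 15 + 10 * np.sin(np.arange(n_hours) * 2 * np.pi / 24) + rng.normal(0, 2, n_hours)
occupancy = np.clip(rng.normal(0.5, 0.3, n_hours), 0, 1)
energy_kwh = 50 + 3.0 * np.maximum(outdoor_temp - 18, 0) + 20 * occupancy + rng.normal(0, 5, n_hours)

def fit_and_score(X, y):
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ beta
    cvrmse = np.sqrt(np.mean(resid ** 2)) / y.mean()   # coefficient of variation of RMSE
    return beta, cvrmse

cooling_dd = np.maximum(outdoor_temp - 18, 0)
_, cv_no_occ = fit_and_score(cooling_dd.reshape(-1, 1), energy_kwh)
_, cv_with_occ = fit_and_score(np.column_stack([cooling_dd, occupancy]), energy_kwh)
print(cv_no_occ, cv_with_occ)
```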
Lewis, Sarah R; Dym, Cheryl; Chai, Christina; Singh, Amreeta; Kest, Benjamin; Bodnar, Richard J
2007-01-30
Genetic variation across inbred and outbred mouse strains has been observed for intake of sweet solutions, salts, bitter tastants and a high-fat diet. Our laboratory recently reported marked strain differences in the amounts and/or percentages of kilocalories of sucrose consumed among 11 inbred and one outbred mouse strains exposed to a wide range of nine sucrose concentrations (0.0001-5%) in two-bottle 24-h preference tests. To assess whether differences in fat intake were similarly associated with genetic variation, the present study examined intake of chow, water and an emulsified fat source (Intralipid) across nine different concentrations (0.00001-5%) in the same 11 inbred and 1 outbred mouse strains using two-bottle 24-h preference tests, which controlled for Intralipid concentration presentation effects, Intralipid and water bottle positions, and measurement of kilocalorie intake consumed as Intralipid or chow. Strains displayed differential increases in Intralipid intake relative to the corresponding water intake, with significant effects observed at the seven (BALB/cJ: 0.001% threshold sensitivity), four (AKR/J, C57BL/6J, DBA/2J, SWR/J: 0.5% threshold sensitivity), three (CD-1, C57BL/10J, SJL/J: 1% threshold sensitivity) and two (A/J, CBA/J, C3H/HeJ, 129P3/J: 2% threshold sensitivity) highest concentrations. In assessing the percentage of kilocalories consumed as Intralipid, SWR/J mice consumed significantly more at the three highest concentrations, to a greater degree than the BALB/cJ, C57BL/6J, CD-1, C3H/HeJ, DBA/J and 129P3/J strains, which in turn consumed more than the A/J, AKR/J, CBA/J, C57BL/10J and SJL/J strains. Relatively strong (h² = 0.73-0.79) heritability estimates were obtained for weight-adjusted Intralipid intake at those concentrations (0.001-1%) that displayed the largest strain-specific effects in sensitivity to Intralipid. The identification of strains with diverging abilities to regulate kilocalorie intake when presented with high Intralipid concentrations may lead to the successful mapping of genes related to hedonics and obesity.
Basketter, D A; Broekhuizen, C; Fieldsend, M; Kirkwood, S; Mascarenhas, R; Maurer, K; Pedersen, C; Rodriguez, C; Schiff, H-E
2010-02-09
A wide range of substances have been recognized as sensitizing, either to the skin and/or to the respiratory tract. Many of these are useful materials, so to ensure that they can be used safely it is necessary to characterize the hazards and establish appropriate exposure limits. Under new EU legislation (REACH), there is a requirement to define a derived no effect level (DNEL). Where a DNEL cannot be established, e.g. for sensitizing substances, then a derived minimal effect level (DMEL) is recommended. For the bacterial and fungal enzymes which are well recognized respiratory sensitizers and have widespread use industrially as well as in a range of consumer products, a DMEL can be established by thorough retrospective review of occupational and consumer experience. In particular, setting the validated employee medical surveillance data against exposure records generated over an extended period of time is vital in informing the occupational DMEL. This experience shows that a long established limit of 60 ng/m³ for pure enzyme protein has been a successful starting point for the definition of occupational health limits for sensitization in the detergent industry. Application to this of adjustment factors has limited sensitization induction, avoided any meaningful risk of the elicitation of symptoms with known enzymes and provided an appropriate level of security for new enzymes whose potency has not been fully characterized. For example, in the detergent industry, this has led to general use of occupational exposure limits 3-10 times lower than the 60 ng/m³ starting point. In contrast, consumer exposure limits vary because the types of exposure themselves cover a wide range. The highest levels shown to be safe in use, 15 ng/m³, are associated with laundry trigger sprays, but very much lower levels (e.g. 0.01 ng/m³) are commonly associated with other types of safe exposure. Consumer limits typically will lie between these values and depend on the actual exposure associated with product use. © 2009 Elsevier Ireland Ltd. All rights reserved.
Santo, Antonio S; Santo, Ariana M; Browne, Richard W; Burton, Harold; Leddy, John J; Horvath, Steven M; Horvath, Peter J
2010-12-01
Studies examining the effect of soy protein on cardiovascular disease (CVD) risk factors have not taken advantage of the postprandial state as an adjunct to the fasting lipid profile. The American Heart Association has acknowledged that the efficacy of soy protein in reducing CVD risk factors is limited. We hypothesized that the postprandial state would be more sensitive to any favorable changes associated with consuming soy protein than the fasting lipid profile, and that the presence of isoflavones in soy would enhance this effect. Thirty sedentary males aged 18-30 years were randomly assigned to milk protein (Milk), isoflavone-poor soy (Soy-), or isoflavone-rich soy (Soy+). Usual diets were supplemented with 25 g/day of protein for 28 days. Serum samples were collected before and after supplementation in a fasted state and postprandially at 30, 60, 120, 240, and 360 min after a high-fat, 1,000 kcal shake. Triacylglycerol (TAG), total cholesterol, non-esterified fatty acids, apolipoproteins B-100 and A-I, and glucose concentrations were quantified. Fasting concentrations were not different after any protein supplementation. Postprandial TAG and TAG AUC increased after Soy- consumption, supporting the postprandial state as a more sensitive indicator of the effects of soy ingestion on CVD risk factors than the fasting lipid profile. Furthermore, the absence of isoflavones in soy protein may have deleterious consequences for purported cardio-protective effects.
Detecting Cancer Quickly and Accurately
NASA Astrophysics Data System (ADS)
Gourley, Paul; McDonald, Anthony; Hendricks, Judy; Copeland, Guild; Hunter, John; Akhil, Ohmar; Capps, Heather; Curry, Marc; Skirboll, Steve
2000-03-01
We present a new technique for high throughput screening of tumor cells in a sensitive nanodevice that has the potential to quickly identify a cell population that has begun the rapid protein synthesis and mitosis characteristic of cancer cell proliferation. Currently, pathologists rely on microscopic examination of cell morphology using century-old staining methods that are labor-intensive, time-consuming and frequently in error. New micro-analytical methods for automated, real time screening without chemical modification are critically needed to advance pathology and improve diagnoses. We have teamed scientists with physicians to create a microlaser biochip (based upon our R&D award winning bio-laser concept)1 which evaluates tumor cells by quantifying their growth kinetics. The key new discovery was demonstrating that the lasing spectra are sensitive to the biomolecular mass in the cell, which changes the speed of light in the laser microcavity. Initial results with normal and cancerous human brain cells show that only a few hundred cells -- the equivalent of a billionth of a liter -- are required to detect abnormal growth. The ability to detect cancer in such a minute tissue sample is crucial for resecting a tumor margin or grading highly localized tumor malignancy. 1. P. L. Gourley, NanoLasers, Scientific American, March 1998, pp. 56-61. This work supported under DOE contract DE-AC04-94AL85000 and the Office of Basic Energy Sciences.
Quantifying hypoxia in human cancers using static PET imaging.
Taylor, Edward; Yeung, Ivan; Keller, Harald; Wouters, Bradley G; Milosevic, Michael; Hedley, David W; Jaffray, David A
2016-11-21
Compared to FDG, the signal of ¹⁸F-labelled hypoxia-sensitive tracers in tumours is low. This means that in addition to the presence of hypoxic cells, transport properties contribute significantly to the uptake signal in static PET images. This sensitivity to transport must be minimized in order for static PET to provide a reliable standard for hypoxia quantification. A dynamic compartmental model based on a reaction-diffusion formalism was developed to interpret tracer pharmacokinetics and applied to static images of FAZA in twenty patients with pancreatic cancer. We use our model to identify tumour properties (well-perfused without substantial necrosis or partitioning) for which static PET images can reliably quantify hypoxia. Normalizing the measured activity in a tumour voxel by the value in blood leads to a reduction in the sensitivity to variations in 'inter-corporal' transport properties (blood volume and clearance rate) as well as imaging study protocols. Normalization thus enhances the correlation between static PET images and the FAZA binding rate K₃, a quantity which quantifies hypoxia in a biologically significant way. The ratio of FAZA uptake in spinal muscle and blood can vary substantially across patients due to long muscle equilibration times. Normalized static PET images of hypoxia-sensitive tracers can reliably quantify hypoxia for homogeneously well-perfused tumours with minimal tissue partitioning. The ideal normalizing reference tissue is blood, either drawn from the patient before PET scanning or imaged using PET. If blood is not available, uniform, homogeneously well-perfused muscle can be used. For tumours that are not homogeneously well-perfused or for which partitioning is significant, only an analysis of dynamic PET scans can reliably quantify hypoxia.
Nougadère, Alexandre; Sirot, Véronique; Kadar, Ali; Fastier, Antony; Truchot, Eric; Vergnet, Claude; Hommet, Frédéric; Baylé, Joëlle; Gros, Philippe; Leblanc, Jean-Charles
2012-09-15
Chronic dietary exposure to pesticide residues was assessed for the French population using a total diet study (TDS) to take into account realistic levels in foods as consumed at home (table-ready). Three hundred and twenty-five pesticides and their transformation products, grouped into 283 pesticides according to their residue definition, were sought in 1235 composite samples corresponding to 194 individual food items that cover 90% of the adult and child diet. To make up the composite samples, about 19,000 food products were bought during different seasons from 2007 to 2009 in 36 French cities and prepared according to the food preparation practices recorded in the individual and national consumption survey (INCA2). The results showed that 37% of the samples contained one or more residues. Seventy-three pesticides were detected and 55 quantified at levels ranging from 0.003 to 8.7 mg/kg. The most frequently detected pesticides, identified as monitoring priorities in 2006, were the post-harvest insecticides pirimiphos-methyl and chlorpyrifos-methyl (particularly in wheat-based products), together with chlorpyrifos, iprodione, carbendazim and imazalil, mainly in fruit and fruit juices. Dietary intakes were estimated for each subject of the INCA2 survey under two contamination scenarios to handle left-censored data: a lower-bound (LB) scenario, in which undetected results were set to zero, and an upper-bound (UB) scenario, in which undetected results were set to the detection limit. For 90% of the pesticides, exposure levels were below the acceptable daily intake (ADI) under both scenarios. Under the LB scenario, which tends to underestimate exposure levels, only dimethoate intakes exceeded the ADI for high-level consumers of cherry (0.6% of children and 0.4% of adults). This pesticide, authorised in Europe, and its metabolite were detected in both cherries and endives. Under the UB scenario, which overestimates exposure, a chronic risk could not be excluded for nine other pesticides (dithiocarbamates, ethoprophos, carbofuran, diazinon, methamidophos, disulfoton, dieldrin, endrin and heptachlor). For these pesticides, more sensitive analyses of the main food contributors are needed to refine the exposure assessment. Copyright © 2012 Elsevier Ltd. All rights reserved.
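A minimal sketch of the two left-censoring substitution rules described above (LB: non-detects set to zero; UB: non-detects set to the limit of detection); the measurement list and LOD value are illustrative assumptions.

def substitute_censored(measurements, lod, scenario):
    # Apply LB/UB substitution to left-censored residue data.
    # measurements: measured concentrations; None marks a non-detect
    # lod: limit of detection (same units as measurements)
    # scenario: "LB" (non-detects -> 0) or "UB" (non-detects -> lod)
    fill = 0.0 if scenario == "LB" else lod
    return [m if m is not None else fill for m in measurements]

# Illustrative example: three composite samples, one non-detect, LOD = 0.01 mg/kg
samples = [0.05, None, 0.12]
lb = substitute_censored(samples, 0.01, "LB")  # [0.05, 0.0, 0.12]
ub = substitute_censored(samples, 0.01, "UB")  # [0.05, 0.01, 0.12]
# Mean exposure estimates computed from lb and ub bracket the true value.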
INCORPORATING CONCENTRATION DEPENDENCE IN STABLE ISOTOPE MIXING MODELS
Stable isotopes are frequently used to quantify the contributions of multiple sources to a mixture; e.g., C and N isotopic signatures can be used to determine the fraction of three food sources in a consumer's diet. The standard dual isotope, three source linear mixing model ass...
INCORPORATING CONCENTRATION DEPENDENCE IN STABLE ISOTOPE MIXING MODELS
Stable isotopes are often used as natural labels to quantify the contributions of multiple sources to a mixture. For example, C and N isotopic signatures can be used to determine the fraction of three food sources in a consumer's diet. The standard dual isotope, three source li...
Mobile monitoring of fugitive methane emissions from natural gas consumer industries
Natural gas is used as a feedstock for major industrial processes, such as ammonia and fertilizer production. However, fugitive methane emissions from many major end-use sectors of the natural gas supply chain have not yet been well quantified. This presentation introduces new m...
The Mobile Monitoring of fugitive methane emissions from natural gas consumer industries
Natural gas is used as a feedstock for major industrial processes, such as ammonia and fertilizer production. However, fugitive methane emissions from many major end-use sectors of the natural gas supply chain have not been quantified yet. This presentation introduces new tools ...
Advances in spectroscopic methods for quantifying soil carbon
USDA-ARS?s Scientific Manuscript database
The gold standard for soil C determination is combustion. However, this method requires expensive consumables, is limited to the determination of total carbon, and is limited in the number of samples that can be processed (~100/d). With increased interest in soil C sequestration, faster methods are needed....
Abstract: Native Americans who consume seafood often have higher seafood consumption rates and consequently greater exposures to contaminants in seafood than the general U.S. population. Defensible and quantifiable tribal seafood consumption rates are needed for development of ...
Best practices for use of stable isotope mixing models in food-web studies
Stable isotope mixing models are increasingly used to quantify contributions of resources to consumers. While potentially powerful tools, these mixing models have the potential to be misused, abused, and misinterpreted. Here we draw on our collective experiences to address the qu...
Schaefer, Alexandre; Buratto, Luciano G; Goto, Nobuhiko; Brotherhood, Emilie V
A large body of evidence shows that buying behaviour is strongly determined by consumers' price expectations and the extent to which real prices violate these expectations. Despite the importance of this phenomenon, little is known regarding its neural mechanisms. Here we show that two patterns of electrical brain activity known to index prediction errors, the Feedback-Related Negativity (FRN) and the feedback-related P300, were sensitive to price offers that were cheaper than participants' expectations. In addition, we found that FRN amplitude time-locked to price offers predicted whether a product would subsequently be purchased or not, and further analyses suggest that this result was driven by the sensitivity of the FRN to positive price expectation violations. This finding strongly suggests that ensembles of neurons coding positive prediction errors play a critical role in real-life consumer behaviour. Further, these findings indicate that theoretical models based on the notion of prediction error, such as Reinforcement Learning Theory, can provide a neurobiologically grounded account of consumer behaviour.
Skin sensitization remains an important endpoint for consumers, manufacturers and regulators. Although the development of alternative approaches to assess skin sensitization potential has been extremely active over many years, the implication of regulations such as REACH and the ...
Analysis of carbohydrate deficient transferrin by capillary zone electrophoresis.
Prasad, R; Stout, R L; Coffin, D; Smith, J
1997-09-01
We report a capillary zone electrophoresis method to separate the various sialylated isoforms of transferrin. The separation is carried out under nondenaturing conditions and at basic pH. Under these conditions, transferrin exhibits two major and three minor peaks. Plasma samples from a population consuming varying amounts of alcohol at different intervals were studied. A cut-off value of 3% carbohydrate-deficient transferrin (CDT: disialo, monosialo, and asialo transferrin) results in a clinical sensitivity of 88% in a population consuming at least 70 g/day alcohol for a minimum of two weeks. The sensitivity dropped significantly in a population consuming less than 70 g/day. This confirms previous reports of CDT as a specific marker for significant and chronic use of alcohol. Capillary electrophoresis offers an alternative method with respect to analysis time and throughput in the clinical laboratory.
Comparative sensitizing potencies of fragrances, preservatives, and hair dyes.
Lidén, Carola; Yazar, Kerem; Johansen, Jeanne D; Karlberg, Ann-Therese; Uter, Wolfgang; White, Ian R
2016-11-01
The local lymph node assay (LLNA) is used for assessing sensitizing potential in hazard identification and risk assessment for regulatory purposes. Sensitizing potency on the basis of the LLNA is categorized into extreme (EC3 value of ≤0.2%), strong (>0.2% to ≤2%), and moderate (>2%). To compare the sensitizing potencies of fragrance substances, preservatives, and hair dye substances, which are skin sensitizers that frequently come into contact with the skin of consumers and workers, LLNA results and EC3 values for 72 fragrance substances, 25 preservatives and 107 hair dye substances were obtained from two published compilations of LLNA data and opinions by the Scientific Committee on Consumer Safety and its predecessors. The median EC3 values of fragrances (n = 61), preservatives (n = 19) and hair dyes (n = 59) were 5.9%, 0.9%, and 1.3%, respectively. The majority of sensitizing preservatives and hair dyes are thus strong or extreme sensitizers (EC3 value of ≤2%), and fragrances are mostly moderate sensitizers. Although fragrances are typically moderate sensitizers, they are among the most frequent causes of contact allergy. This indicates that factors other than potency need to be addressed more rigorously in risk assessment and risk management. © 2016 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
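A minimal sketch of the LLNA potency categorization used above, applied to the median EC3 values reported for the three substance groups; the classification thresholds are those stated in the abstract.

def llna_potency_category(ec3_percent):
    # Potency bands from the abstract: extreme <= 0.2%, strong > 0.2% to <= 2%, moderate > 2%.
    if ec3_percent <= 0.2:
        return "extreme"
    if ec3_percent <= 2.0:
        return "strong"
    return "moderate"

# Median EC3 values reported above for the three substance groups:
for name, ec3 in [("fragrances", 5.9), ("preservatives", 0.9), ("hair dyes", 1.3)]:
    print(name, llna_potency_category(ec3))
# fragrances -> moderate, preservatives -> strong, hair dyes -> strong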
Quantifying the morphodynamics of river restoration schemes using Unmanned Aerial Vehicles (UAVs)
NASA Astrophysics Data System (ADS)
Williams, Richard; Byrne, Patrick; Gilles, Eric; Hart, John; Hoey, Trevor; Maniatis, George; Moir, Hamish; Reid, Helen; Ves, Nikolas
2017-04-01
River restoration schemes are particularly sensitive to morphological adjustment during the first set of high-flow events that they are subjected to. Quantifying elevation change associated with morphological adjustment can contribute to improved adaptive decision making to ensure river restoration scheme objectives are achieved. To date, the relatively high cost, technical demands and challenging logistics associated with acquiring repeat, high-resolution topographic surveys have been a significant barrier to monitoring the three-dimensional morphodynamics of river restoration schemes. The availability of low-cost, consumer-grade Unmanned Aerial Vehicles capable of acquiring imagery for processing using Structure-from-Motion Multi-View Stereo (SfM MVS) photogrammetry has the potential to transform surveys of the morphodynamics of river restoration schemes. Application guidance does, however, need to be developed to fully exploit the advances of UAV technology and SfM MVS processing techniques. In particular, there is a need to quantify the effect of the number and spatial distribution of ground targets on vertical error. This is particularly significant because vertical errors propagate when mapping morphological change, and thus determine the evidence that is available for decision making. This presentation reports results from a study that investigated how the number and spatial distribution of targets influenced vertical error, and then used the findings to determine survey protocols for a monitoring campaign that has quantified morphological change across a number of restoration schemes. At the Swindale river restoration scheme, Cumbria, England, 31 targets were distributed across a 700 m long reach and the centre of each target was surveyed using RTK-GPS. The targets, used as Ground Control Points (GCPs) or checkpoints, were divided into three different spatial patterns (centre, edge and random) and used for processing images acquired from a SenseFly Swinglet CAM UAV with a Canon IXUS 240 HS camera. Results indicate that if targets were distributed centrally then vertical distortions were most notable in the outer region of the processing domain; if an edge pattern was used then vertical errors were greatest in the central region of the processing domain; if targets were distributed randomly then errors were more evenly distributed. For this optimal random layout, vertical errors were lowest when 15 to 23 targets were used as GCPs. The best solution achieved planimetric (XY) errors of 0.006 m and vertical (Z) errors of 0.05 m. This result was used to determine target density and distribution for repeat surveys on two other restoration schemes, Whit Beck (Cumbria, England) and Allt Lorgy (Highlands, Scotland). These repeat surveys have been processed to produce DEMs of Difference (DoDs). The DoDs have been used to quantify the spatial distribution of erosion and deposition at these schemes due to high-flow events. Broader interpretation enables insight into patterns of morphological sensitivity that are related to scheme design.
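A minimal sketch of the DEM-of-Difference step described above: subtracting two co-registered DEMs and masking change below a propagated error threshold. The grid values are illustrative assumptions; only the 0.05 m vertical error is taken from the abstract.

import numpy as np

def dem_of_difference(dem_new, dem_old, sigma_new, sigma_old, k=1.96):
    # Thresholded DEM of Difference (DoD).
    # dem_new, dem_old: co-registered elevation grids (m)
    # sigma_new, sigma_old: vertical errors of each survey (m)
    # k: confidence multiplier (1.96 ~ 95%)
    dod = dem_new - dem_old
    min_lod = k * np.sqrt(sigma_new**2 + sigma_old**2)  # propagated minimum level of detection
    return np.where(np.abs(dod) > min_lod, dod, 0.0)

# Illustrative 2x2 grids, both epochs assumed to have the 0.05 m vertical error reported above
change = dem_of_difference(np.array([[10.10, 9.80], [9.95, 10.40]]),
                           np.array([[10.00, 10.00], [10.00, 10.00]]),
                           sigma_new=0.05, sigma_old=0.05)
erosion = change[change < 0].sum()      # total detectable lowering (m per cell)
deposition = change[change > 0].sum()   # total detectable raising (m per cell)
print(change, erosion, deposition)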
Tungsten recycling in the United States in 2000
Shedd, Kim B.
2011-01-01
This report, which is one of a series of reports on metals recycling, defines and quantifies the flow of tungsten-bearing materials in the United States from imports and stock releases through consumption and disposition in 2000, with particular emphasis on the recycling of industrial scrap (new scrap) and used products (old scrap). Because of tungsten's many diverse uses, numerous types of scrap were available for recycling by a wide variety of processes. In 2000, an estimated 46 percent of U.S. tungsten supply was derived from scrap. The ratio of tungsten consumed from new scrap to that consumed from old scrap was estimated to be 20:80. Of all the tungsten in old scrap available for recycling, an estimated 66 percent was either consumed in the United States or exported to be recycled.
Colyar, Jessica M; Eggett, Dennis L; Steele, Frost M; Dunn, Michael L; Ogden, Lynn V
2009-09-01
The relative sensitivity of side-by-side and sequential monadic consumer liking protocols was compared. In the side-by-side evaluation, all samples were presented at once and evaluated together 1 characteristic at a time. In the sequential monadic evaluation, 1 sample was presented and evaluated on all characteristics, then returned before panelists received and evaluated another sample. Evaluations were conducted on orange juice, frankfurters, canned chili, potato chips, and applesauce. Five commercial brands, having a broad quality range, were selected as samples for each product category to assure a wide array of consumer liking scores. Without their knowledge, panelists rated the same 5 retail brands by 1 protocol and then 3 wk later by the other protocol. For 3 of the products, both protocols yielded the same order of overall liking. Slight differences in order of overall liking for the other 2 products were not significant. Of the 50 pairwise overall liking comparisons, 44 were in agreement. The different results obtained by the 2 protocols in order of liking and significance of paired comparisons were due to experimental variation and differences in sensitivity. Hedonic liking scores were subjected to statistical power analyses and used to calculate the minimum number of panelists required to achieve varying degrees of sensitivity when using side-by-side and sequential monadic protocols. In most cases, the side-by-side protocol was more sensitive, thus providing the same information with fewer panelists. The side-by-side protocol was less sensitive in cases where sensory fatigue was a factor.
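A minimal sketch of the kind of power calculation described above for sizing a consumer panel, using a normal approximation for a two-sided paired comparison; the detectable difference, standard deviation, and alpha/power settings are illustrative assumptions, not values from the study (requires scipy).

from math import ceil
from scipy import stats

def panelists_required(mean_diff, sd, alpha=0.05, power=0.80):
    # Approximate panel size needed to detect a given hedonic-score difference
    # with a two-sided paired comparison (normal approximation).
    z_alpha = stats.norm.ppf(1 - alpha / 2)
    z_beta = stats.norm.ppf(power)
    effect = mean_diff / sd
    return ceil(((z_alpha + z_beta) / effect) ** 2)

# Illustrative: detect a 0.5-point difference on a 9-point hedonic scale,
# assuming a standard deviation of 1.8 points for within-panelist differences.
print(panelists_required(0.5, 1.8))  # ~102 panelists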
Reconstructing fish movements between coastal wetland and nearshore habitats of the Great Lakes
The use of resources from multiple habitats has been shown to be important to the production of aquatic consumers. To quantify the support of Great Lakes coastal wetland (WL) and nearshore (NS) habitats to yellow perch, we used otolith microchemistry to trace movements between th...
USDA-ARS?s Scientific Manuscript database
In this study, we quantified anthocyanin (ANC), proanthocyanidin (PAC), and chlorogenic acid (CA) concentrations in wild blueberry fruit (WBB) exposed to a variety of postharvest handling practices relevant to consumers and to industry. Additionally, we analyzed the bioactive potential of WBB subjec...
Perfluorinated compounds (PFCs) have been widely used in industrial applications and consumer products. Their persistent nature and potential health impacts are of concern. Given the high cost of collecting serum samples, this study is to understand whether we can quantify PFC se...
Cutting Costs and Improving Outcomes for Janitorial Services
ERIC Educational Resources Information Center
Campbell, Jeffery L.
2011-01-01
Recent research reveals that janitorial services account for nearly 30 percent of facility budgets, which translates into billions of dollars annually. With janitorial services consuming such a large share of budgets, other industry findings are alarming. Most cleaning systems: 1) have no quantifiable standards; 2) are based solely on appearance;…
The Insulation Board Industry - An Economic Analysis
Albert T. Schuler
1978-01-01
An econometric model of the domestic insulation board industry was developed to identify and quantify the major factors affecting quantity consumed and price. The factors identified were housing starts, residential improvement activity, disposable personal income, productivity, pulpwood and residue prices, and power costs. Disposable personal income was the most...
A Quantitative ADME-base Tool for Exploring Human Exposure to Consumer Product Ingredients
Exposure to a wide range of chemicals through our daily habits and routines is ubiquitous and largely unavoidable within modern society. The potential for human exposure, however, has not been quantified for the vast majority of chemicals with wide commercial use. Creative advanc...
49 CFR 211.9 - Content of rulemaking and waiver petitions.
Code of Federal Regulations, 2010 CFR
2010-10-01
... including an evaluation of anticipated impacts of the action sought; each evaluation shall include an estimate of resulting costs to the private sector, to consumers, and to Federal, State and local governments as well as an evaluation of resulting benefits, quantified to the extent practicable. Each...
High Throughput Modeling of Indoor Exposures to Chemicals (SOT)
Risk due to chemical exposure is a function of both chemical hazard and exposure. Proximate sources of exposure due to the presence of a chemical in consumer products (i.e. near-field exposure) are identified as key drivers of exposure and yet are not well quantified or understo...
USDA-ARS?s Scientific Manuscript database
The standard sampling technique used to quantify cotton fleahopper, Pseudatomoscelis seriatus (Reuter), abundance in cotton, Gossypium hirsutum L., involves direct counts of adults and nymphs on plants. This method, however, becomes increasingly laborious and time consuming as plants increase in si...
Yawson, A E; Biritwum, R B; Nimo, P K
2012-12-01
In 2003, Ghana introduced the national health insurance scheme (NHIS) to promote access to healthcare. This study determines the consumer and provider factors that most influence the NHIS at a municipal health facility in Ghana. This is an analytical cross-sectional study at the Winneba Municipal Hospital (WHM) in Ghana between January and March 2010. A total of 170 insured and 175 uninsured out-patients were interviewed and information extracted from their folders using a questionnaire. Consumers were from both the urban and rural areas of the municipality. The mean number of visits by insured consumers to a health facility in the previous six months was 2.48 +/- 1.007, and that for uninsured consumers was 1.18 +/- 0.387 (p-value < 0.001). Insured consumers visited the health facility at significantly more frequent intervals than uninsured consumers (χ² = 55.413, p-value < 0.001). Overall, insured consumers received more different types of medications for similar disease conditions and more laboratory tests per visit than the uninsured. In treating malaria (the commonest condition seen), providers added multivitamins, haematinics, vitamin C and intramuscular injections as additional medications more often for insured consumers than for uninsured consumers. Findings suggest consumer and provider moral hazard may be two critical factors affecting the NHIS in the Effutu Municipality. These have implications for the optimal functioning of the NHIS and may affect the long-term sustainability of the NHIS in the municipality. Further studies to quantify the financial/economic cost to the NHIS arising from moral hazard will be of immense benefit to the optimal functioning of the NHIS.
Rudner Lugo, Nancy; O'Grady, Eileen T; Hodnicki, Donna; Hanson, Charlene
2010-01-01
The widely varied regulations in the 50 states often limit consumer access to nurse practitioners (NPs). In 22 states, the Board of Nursing (BON) must share NP regulatory authority with another profession, usually physicians. This study examines the relationship between the BON as the sole authority regulating NPs or sharing that authority with another profession and the NP regulatory environment. Independent t tests compared the NP regulatory environments for consumer access and choice in states with sole BON regulation with those in states with involvement of another profession. The states' NP regulatory environments were quantified with an 11-measure tool assessing domains of consumer access to NPs, NP patients' access to service, and NP patients' access to prescription medications. BON-regulated states were less restrictive (P < .01, effect size 1.02) and supported NP professional autonomy. Entry into practice regulations did not differ in the two groups of states. Having another profession involved in regulation correlates with more restrictions on consumer access to NPs and more restrictions to the full deployment of NPs. Copyright 2010 Elsevier Inc. All rights reserved.
O'Quinn, T G; Woerner, D R; Engle, T E; Chapman, P L; Legako, J F; Brooks, J C; Belk, K E; Tatum, J D
2016-02-01
Sensory analysis of ground LL samples representing 12 beef product categories was conducted in 3 different regions of the U.S. to identify flavor preferences of beef consumers. Treatments characterized production-related flavor differences associated with USDA grade, cattle type, finishing diet, growth enhancement, and postmortem aging method. Consumers (N=307) rated cooked samples for 12 flavors and overall flavor desirability. Samples were analyzed to determine fatty acid content. Volatile compounds produced by cooking were extracted and quantified. Overall, consumers preferred beef that rated high for beefy/brothy, buttery/beef fat, and sweet flavors and disliked beef with fishy, livery, gamey, and sour flavors. Flavor attributes of samples higher in intramuscular fat with greater amounts of monounsaturated fatty acids and lesser proportions of saturated, odd-chain, omega-3, and trans fatty acids were preferred by consumers. Of the volatiles identified, diacetyl and acetoin were most closely correlated with desirable ratings for overall flavor and dimethyl sulfide was associated with an undesirable sour flavor. Copyright © 2015 Elsevier Ltd. All rights reserved.
Long-range airplane study: The consumer looks at SST travel
NASA Technical Reports Server (NTRS)
Landes, K. H.; Matter, J. A.
1980-01-01
The attitudes of long-range air travelers toward several basic air travel decisions were surveyed. Of interest were tradeoffs involving time versus comfort and time versus cost as they pertain to supersonic versus conventional wide-body aircraft on overseas routes. The market focused upon was the segment of air travelers most likely to make that type of tradeoff decision: those having flown overseas routes for business or personal reasons in the recent past. The information generated is intended to provide quantifiable insight into consumer demand for supersonic as compared to wide-body aircraft alternatives for long-range overseas air travel.
Adverse Selection and Inertia in Health Insurance Markets: When Nudging Hurts.
Handel, Benjamin R
2013-12-01
This paper investigates consumer inertia in health insurance markets, where adverse selection is a potential concern. We leverage a major change to insurance provision that occurred at a large firm to identify substantial inertia, and develop and estimate a choice model that also quantifies risk preferences and ex ante health risk. We use these estimates to study the impact of policies that nudge consumers toward better decisions by reducing inertia. When aggregated, these improved individual-level choices substantially exacerbate adverse selection in our setting, leading to an overall reduction in welfare that doubles the existing welfare loss from adverse selection.
Nijkamp, M M; Bokkers, B G H; Bakker, M I; Ezendam, J; Delmaar, J E
2015-10-01
A quantitative risk assessment was performed to establish whether consumers are at risk of being dermally sensitized by the fragrance geraniol. Aggregate dermal exposure to geraniol was estimated using the Probabilistic Aggregate Consumer Exposure Model, containing data on the use of personal care products and household cleaning agents. Consumer exposure to geraniol via personal care products appeared to be higher than via household cleaning agents. The hands were the body parts receiving the highest exposure to geraniol. Dermal sensitization studies were assessed to derive the point of departure needed for the estimation of the Acceptable Exposure Level (AEL). Two concentrations were derived, one based on human studies and the other from dose-response analysis of the available murine local lymph node assay data. The aggregate dermal exposure assessment resulted in body-part-specific median exposures of up to 0.041 μg/cm² (highest exposure 102 μg/cm²) for the hands. Comparison of the exposure with the lowest AEL (55 μg/cm²) shows that 0.02-0.86% of the population may have an aggregated exposure that exceeds the AEL. Furthermore, it is demonstrated that personal care products contribute more to the consumer's geraniol exposure than household cleaning agents. Copyright © 2015 Elsevier Inc. All rights reserved.
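A minimal sketch of the screening comparison described above: checking what fraction of simulated individuals have an aggregate dermal exposure above the AEL. The simulated exposure values are illustrative assumptions; only the 55 μg/cm² AEL is taken from the abstract.

def fraction_exceeding_ael(exposures, ael):
    # Fraction of simulated individuals whose aggregate exposure exceeds the AEL.
    # exposures: per-individual aggregate dermal loads (ug/cm2)
    # ael: acceptable exposure level (ug/cm2)
    return sum(e > ael for e in exposures) / len(exposures)

# Illustrative aggregate hand exposures (ug/cm2) from a probabilistic model run;
# the AEL of 55 ug/cm2 is the lowest value cited in the abstract.
simulated = [0.01, 0.04, 0.3, 2.1, 60.0, 0.02, 0.05, 0.9, 110.0, 0.03]
print(fraction_exceeding_ael(simulated, 55.0))  # 0.2 in this toy sample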
Pemberton, Mark A; Lohmann, Barbara S
2014-08-01
Acrylic, poly(methyl methacrylate) (PMMA)-based polymers are found in many industrial, professional and consumer products and are of low toxicity, but do contain very low levels of residual monomers and process chemicals that can leach out during handling and use. Methyl methacrylate, the principal monomer, is of low toxicity but is a recognized weak skin sensitizer. The risk of induction of contact allergy in consumers was determined using a method based upon the Exposure-based Quantitative Risk Assessment approach developed for fragrance ingredients. The No Expected Sensitization Induction Level (NESIL) was based on the threshold for induction of sensitization (EC3) in the Local Lymph Node Assay (LLNA), since no Human Repeat Insult Patch Test (HRIPT) data were available. Categorical estimation of the Consumer Exposure Level was replaced with a worst-case assumption based upon the quantitative determination of MMA monomer migration into simulants. Application of default and Chemical-Specific Adjustment Factors results in a Risk Characterization Ratio (RCR) of 10,000 and a high Margin of Safety for induction of Allergic Contact Dermatitis (ACD) in consumers handling polymers under conservative exposure conditions. Although there are no data available to derive an RCR for elicitation of ACD, it is likely to be lower than that for induction. Crown Copyright © 2014. Published by Elsevier Inc. All rights reserved.
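A minimal sketch of the exposure-based QRA arithmetic referenced above (an acceptable exposure level derived from a NESIL via assessment factors, then compared with the consumer exposure level); all numeric values, the factor list, and the margin-of-safety convention are illustrative assumptions, not the figures or definitions from this assessment.

def acceptable_exposure_level(nesil, assessment_factors):
    # AEL = NESIL divided by the product of the assessment factors.
    ael = nesil
    for factor in assessment_factors:
        ael /= factor
    return ael

def margin_of_safety(ael, consumer_exposure_level):
    # How far the consumer exposure sits below the acceptable level.
    return ael / consumer_exposure_level

# Illustrative values only: NESIL of 500 ug/cm2, default factors for
# inter-individual variability, matrix and use considerations (10 x 3 x 3),
# and a worst-case migration-derived exposure of 0.005 ug/cm2.
ael = acceptable_exposure_level(500.0, [10, 3, 3])  # ~5.6 ug/cm2
print(margin_of_safety(ael, 0.005))                 # ~1111 in this toy example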
Colorimetric Sensor Array for White Wine Tasting.
Chung, Soo; Park, Tu San; Park, Soo Hyun; Kim, Joon Yong; Park, Seongmin; Son, Daesik; Bae, Young Min; Cho, Seong In
2015-07-24
A colorimetric sensor array was developed to characterize and quantify the taste of white wines. A charge-coupled device (CCD) camera captured images of the sensor array from 23 different white wine samples, and the changes in the R, G, B color components from the control were analyzed by principal component analysis. Additionally, high-performance liquid chromatography (HPLC) was used to analyze the chemical components of each wine sample responsible for its taste. A two-dimensional score plot was created with 23 data points. It revealed clusters created from the same type of grape, and trends of sweetness, sourness, and astringency were mapped. An artificial neural network model was developed to predict the degree of sweetness, sourness, and astringency of the white wines. The coefficients of determination (R²) between the HPLC results and the sweetness, sourness, and astringency were 0.96, 0.95, and 0.83, respectively. This research could provide a simple, low-cost, yet sensitive taste prediction system and, by aiding consumer selection, could have a positive effect on the wine industry.
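A minimal sketch of the colour-difference analysis described above: stacking the per-spot R, G, B changes from the control into a feature matrix and projecting it onto two principal components. The number of spots and all values are illustrative assumptions (requires scikit-learn).

import numpy as np
from sklearn.decomposition import PCA

# Rows = wine samples, columns = (delta R, delta G, delta B) for each sensor spot,
# i.e. colour change relative to the control image (illustrative: 4 samples x 2 spots).
delta_rgb = np.array([
    [12,  -3,  5,  8, -1, 2],
    [30, -10, 15, 22, -6, 9],
    [ 5,   0,  1,  3,  1, 0],
    [18,  -7,  9, 14, -4, 6],
])

pca = PCA(n_components=2)
scores = pca.fit_transform(delta_rgb)  # coordinates for a 2-D score plot, one point per wine
print(scores)
print(pca.explained_variance_ratio_)   # variance captured by each principal component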
Kolla, SriDurgaDevi; Morcos, Mary; Martin, Brian; Vandenberg, Laura N
2018-03-08
Throughout life, mammary tissue is strongly influenced by hormones. Scientists have hypothesized that synthetic chemicals with hormonal activities could disrupt mammary gland development and contribute to breast diseases and dysfunction. Bisphenol S (BPS) is an estrogenic compound used in many consumer products. In this study, CD-1 mice were exposed to BPS (2 or 200 μg/kg/day) during pregnancy and lactation. Mice exposed to 0.01 or 1 μg/kg/day ethinyl estradiol (EE2), a pharmaceutical estrogen, were also evaluated. Mammary glands from female offspring were collected prior to the onset of puberty, during puberty, and in early adulthood. Growth parameters, histopathology, cell proliferation and expression of hormone receptors were quantified. Our evaluations revealed age- and dose-specific effects of BPS that were different from the effects of EE2, and distinct from the effects of BPA that have been reported previously. These assessments suggest that individual xenoestrogens may have unique effects on this sensitive tissue. Copyright © 2018 Elsevier Inc. All rights reserved.
Miniaturized system of a gas chromatograph coupled with a Paul ion trap mass spectrometer
NASA Technical Reports Server (NTRS)
Shortt, B. J.; Darrach, M. R.; Holland, Paul M.; Chutjian, A.
2005-01-01
Miniature gas chromatography (GC) and miniature mass spectrometry (MS) instrumentation has been developed to identify and quantify the chemical compounds present in complex mixtures of gases. The design approach utilizes micro-GC components coupled with a Paul quadrupole ion trap (QIT) mass spectrometer. Inherent to the system are high sensitivity, good dynamic range, good QIT resolution, low GC flow-rates to minimize vacuum requirements and the need for consumables; and the use of a modular approach to adapt to volatile organic compounds dissolved in water or present in sediment. Measurements are reported on system response to gaseous species at concentrations varying over four orders of magnitude. The ability of the system to deal with complicated mixtures is demonstrated, and future improvements are discussed. The GC/QIT system described herein has a mass, volume and power that are, conservatively, one-twentieth of those of commercial off-the-shelf systems. Potential applications are to spacecraft cabin-air monitoring, robotic planetary exploration and trace-species detection for residual gas analysis and environmental monitoring.
Can Canals Effectively Replace Groundwater Irrigation in Over-exploited Regions in India?
NASA Astrophysics Data System (ADS)
Jain, M.; Fishman, R.; Mondal, P.; Galford, G. L.; Bhattarai, N.; Naeem, S.; DeFries, R. S.
2017-12-01
We use high-resolution data on irrigation and cropping intensity across India to empirically estimate the impacts of losing access to groundwater irrigation in regions with critically exploited aquifers. India is the largest consumer of groundwater globally and is facing severe groundwater depletion. Canals are being promoted as an alternate irrigation source, yet few studies have quantified the effects that this transition may have on agricultural production. Our results suggest that farmers will be 50% less likely to plant a winter crop, have 20% less cropped area, and have cropped areas that are increasingly sensitive to rainfall variability when switching to canal irrigation. We estimate that national winter cropped area will decrease by approximately 13% if farmers lose access to groundwater irrigation in critically over-exploited regions, and 6% if farmers in these regions switch to canal irrigation. These results suggest that groundwater and canal irrigation are not substitutable, and farmers may have to switch to less water intensive crops or improve water use efficiency to maintain current levels of production in the future.
Tenenbaum, S; DiNardo, J; Morris, W E; Wolf, B A; Schnetzinger, R W
1984-10-01
A quantitative in vitro method for phototoxic evaluation of chemicals has been developed and validated. The assay uses Saccharomyces cerevisiae seeded in an agar overlay on top of a plate count agar base. 8-Methoxypsoralen is used as a reference standard against which materials are measured. Activity is quantified by cytotoxicity, measured as zones of inhibition. Several known phototoxins (heliotropine, lyral, phantolid, and bergamot oil) and photoallergens (6-methyl coumarin and musk ambrette) are used to validate the assay. An excellent correlation is observed between in vivo studies employing Hartley albino guinea pigs and the in vitro assay for several fragrance raw materials and other chemicals. The in vitro assay exhibits a 2- to 500-fold greater sensitivity. For three fragrance oils, the in vitro assay detects low levels of photobiological activity while the in vivo assay is negative. Although the in vitro assay does not discriminate between phototoxins and photoallergens, it can be used for screening of raw materials so that a reduction in animal usage can be achieved while maintaining the protection of the consumer.
Zeeman, Heidi; Kendall, Elizabeth; Whitty, Jennifer A; Wright, Courtney J; Townsend, Clare; Smith, Dianne; Lakhani, Ali; Kennerley, Samantha
2016-03-15
Identifying the housing preferences of people with complex disabilities is a much needed, but under-developed area of practice and scholarship. Despite the recognition that housing is a social determinant of health and quality of life, there is an absence of empirical methodologies that can practically and systematically involve consumers in this complex service delivery and housing design market. A rigorous process for making effective and consistent development decisions is needed to ensure resources are used effectively and the needs of consumers with complex disability are properly met. This 3-year project aims to identify how the public and private housing market in Australia can better respond to the needs of people with complex disabilities whilst simultaneously achieving key corporate objectives. First, using the Customer Relationship Management framework, qualitative (Nominal Group Technique) and quantitative (Discrete Choice Experiment) methods will be used to quantify the housing preferences of consumers and their carers. A systematic mixed-method, quasi-experimental design will then be used to quantify the development priorities of other key stakeholders (e.g., architects, developers, Government housing services etc.) in relation to inclusive housing for people with complex disabilities. Stakeholders randomly assigned to Group 1 (experimental group) will participate in a series of focus groups employing Analytical Hierarchical Process (AHP) methodology. Stakeholders randomly assigned to Group 2 (control group) will participate in focus groups employing existing decision making processes to inclusive housing development (e.g., Risk, Opportunity, Cost, Benefit considerations). Using comparative stakeholder analysis, this research design will enable the AHP methodology (a proposed tool to guide inclusive housing development decisions) to be tested. It is anticipated that the findings of this study will enable stakeholders to incorporate consumer housing preferences into commercial decisions. Housing designers and developers will benefit from the creation of a parsimonious set of consumer-led housing preferences by which to make informed investments in future housing and contribute to future housing policy. The research design has not been applied in the Australian research context or elsewhere, and will provide a much needed blueprint for market investment to develop viable, consumer directed inclusive housing options for people with complex disability.
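A minimal sketch of the Analytical Hierarchy Process step mentioned above: deriving priority weights from a pairwise-comparison matrix via its principal eigenvector. The three criteria and the comparison values are illustrative assumptions, not outputs of this study.

import numpy as np

def ahp_priorities(pairwise):
    # Priority weights from an AHP pairwise-comparison matrix
    # (principal eigenvector, normalized to sum to 1).
    vals, vecs = np.linalg.eig(pairwise)
    principal = np.real(vecs[:, np.argmax(np.real(vals))])
    return principal / principal.sum()

# Illustrative 3x3 comparison of hypothetical housing criteria
# (accessibility vs location vs cost) on the Saaty 1-9 scale.
pairwise = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])
print(ahp_priorities(pairwise))  # roughly [0.65, 0.23, 0.12]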
Delistraty, Damon
2013-02-15
The purpose of this study was to quantify three groups of polychlorinated biphenyl (PCB) congeners (i.e., dioxin-like toxic equivalents [TEQ], non-dioxin-like PCBs, and total PCBs) in fish across several species, tissues, and locations in the Columbia River near the Hanford Site. For TEQ and total PCBs, fish ecotoxicity and risk to human fish consumers were also evaluated. Non-dioxin-like PCBs were not assessed for toxicity due to a lack of available benchmarks. In sturgeon liver, TEQ was significantly higher (P<0.05) within the Hanford Site study areas relative to upriver. However, this same spatial relationship in sturgeon liver did not attain statistical significance for non-dioxin-like PCBs and total PCBs. Non-dioxin-like PCBs and total PCBs were significantly higher (P<0.05) in whitefish fillet than in other species (except carp) and significantly higher (P<0.05) in carp fillet relative to bass. All PCB residues in carcass were significantly elevated (P<0.005) in comparison to fillet. In addition to PCB source, many factors (e.g., dietary composition, tissue lipid content, fish mobility and home range, age, toxicokinetic processes, seasonal adaptations) influence patterns in PCB bioaccumulation across species, tissues, and locations. TEQ and total PCB residues in liver, fillet, and carcass observed in this study were below the corresponding no-effect residues for TEQ and Aroclors in the literature for fish survival, growth, and reproduction. In contrast, TEQ and total PCBs in fillet in this study exceeded USEPA tissue screening levels for cancer (1E-6 risk) and noncancer (hazard quotient [HQ]=1) toxicity for human fish consumers. Key uncertainties in these comparisons to assess toxicity relate to variation in fish species sensitivity to PCBs and the use of Aroclor data in the literature to represent total PCBs. Copyright © 2012 Elsevier B.V. All rights reserved.
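A minimal sketch of the toxic-equivalents calculation behind the TEQ values above: TEQ is the sum of each dioxin-like congener concentration multiplied by its toxic equivalency factor (TEF). The congener list, concentrations, and TEF values shown are illustrative assumptions, not the study's data.

def toxic_equivalents(congener_concentrations, tefs):
    # TEQ = sum over congeners of concentration x TEF.
    # congener_concentrations: dict of congener -> concentration (e.g. pg/g wet weight)
    # tefs: dict of congener -> toxic equivalency factor
    return sum(conc * tefs[name] for name, conc in congener_concentrations.items())

# Illustrative values only:
concentrations = {"PCB-126": 2.0, "PCB-169": 1.5, "PCB-118": 400.0}
tefs = {"PCB-126": 0.1, "PCB-169": 0.03, "PCB-118": 0.00003}
print(toxic_equivalents(concentrations, tefs))  # 0.257 pg TEQ/g in this toy example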
Buchmueller, Thomas C
2009-01-01
Context: For many years, leading health care reform proposals have been based on market-oriented strategies. In the 1990s, a number of reform proposals were built around the concept of “managed competition,” but more recently, “consumer-directed health care” models have received attention. Although price-conscious consumer demand plays a critical role in both the managed competition and consumer-directed health care models, the two strategies are based on different visions of the health care marketplace and the best way to use market forces to achieve greater systemwide efficiencies. Methods: This article reviews the research literature that tests the main hypotheses concerning the two policy strategies. Findings: Numerous studies provide consistent evidence that consumers’ health plan choices are sensitive to out-of-pocket premiums. The elasticity of demand appears to vary with consumers’ health risk, with younger, healthier individuals being more price sensitive. This heterogeneity increases the potential for adverse selection. Biased risk selection also is a concern when the menu of health plan options includes consumer-directed health plans. Several studies confirm that such plans tend to attract healthier enrollees. A smaller number of studies test the main hypothesis regarding consumer-directed health plans, which is that they result in lower medical spending than do more generous plans. These studies find little support for this claim. Conclusions: The experiences of employers that have adopted key elements of managed competition are generally consistent with the key hypotheses underlying that strategy. Research in this area, however, has focused on only a narrow range of questions. Because consumer-directed health care is such a recent phenomenon, research on this strategy is even more limited. Additional studies on both topics would be valuable. PMID:20021587
Sensitivity-Uncertainty Techniques for Nuclear Criticality Safety
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brown, Forrest B.; Rising, Michael Evan; Alwin, Jennifer Louise
2017-08-07
The sensitivity and uncertainty analysis course will introduce students to k-eff sensitivity data, cross-section uncertainty data, how k-eff sensitivity data and k-eff uncertainty data are generated, and how they can be used. Discussion will include how sensitivity/uncertainty data can be used to select applicable critical experiments, to quantify a defensible margin to cover validation gaps and weaknesses, and in the development of upper subcritical limits.
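A minimal sketch of the relative sensitivity coefficient that k-eff sensitivity data encode, S = (dk/k)/(dσ/σ), estimated here by a central finite difference; the perturbation size and k-eff values are illustrative assumptions.

def keff_sensitivity(k_plus, k_minus, k_nominal, rel_perturbation):
    # Relative sensitivity of k-eff to a cross section:
    # S = (dk / k) / (dsigma / sigma), via central difference.
    # k_plus, k_minus: k-eff with the cross section perturbed by +/- rel_perturbation
    # k_nominal: unperturbed k-eff
    # rel_perturbation: fractional perturbation of the cross section (e.g. 0.01)
    dk_over_k = (k_plus - k_minus) / (2.0 * k_nominal)
    return dk_over_k / rel_perturbation

# Illustrative: a 1% perturbation of a capture cross section shifts k-eff
# from 1.0000 to 0.9980 / 1.0020.
print(keff_sensitivity(1.0020, 0.9980, 1.0000, 0.01))  # 0.2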
Saha, S; Hollands, W; Needs, P W; Ostertag, L M; de Roos, B; Duthie, G G; Kroon, P A
2012-06-01
Epicatechin is a widely consumed dietary flavonoid and there is substantial evidence that it contributes to the health benefits reported for flavanol-rich cocoa products including dark chocolate. Numerous reports have described the appearance of epicatechin and epicatechin phase-2 conjugates (sulfates and glucuronides of epicatechin and methylepicatechin) in blood and urine samples of subjects following ingestion of epicatechin. The most widely reported method of quantifying total epicatechin in plasma and urine samples involves hydrolysis with a mixture of β-glucuronidase and sulfatase to convert the conjugates to epicatechin aglycone which is subsequently quantified. We observed a lack of hydrolysis of epicatechin sulfates and methylepicatechin sulfates using commercial sulfatases and investigated this further. Samples of urine or plasma from subjects who had consumed epicatechin were subjected to enzyme hydrolysis and then analysed using LC-MS/MS, or analysed without enzyme hydrolysis. Attempts to increase the extent of hydrolysis of epicatechin conjugates were made by increasing the amount of enzyme, hydrolysis pH and length of incubations, and using alternative sources of enzyme. The standard hydrolysis conditions failed to hydrolyse the majority of epicatechin sulfates and methylepicatechin sulfates. Even when the quantity of enzyme and incubation period was increased, the pH optimised, or alternative sources of sulfatases were used, epicatechin monosulfates and methylepicatechin monosulfates remained as major peaks in the chromatograms of the samples. An assessment of literature data strongly suggested that the majority of reports where enzyme hydrolysis was used had significantly underestimated epicatechin bioavailability in humans. Methods for quantifying epicatechin concentrations in blood and urine need to take account of the lack of hydrolysis of (methyl)epicatechin-sulfates, for example by quantifying these directly using LC-MS/MS. Copyright © 2012 Elsevier Ltd. All rights reserved.
Utilization of community pharmacy space to enhance privacy: a qualitative study.
Hattingh, H Laetitia; Emmerton, Lynne; Ng Cheong Tin, Pascale; Green, Catherine
2016-10-01
Community pharmacists require access to consumers' information about their medicines and health-related conditions to make informed decisions regarding treatment options. Open communication between consumers and pharmacists is ideal although consumers are only likely to disclose relevant information if they feel that their privacy requirements are being acknowledged and adhered to. This study sets out to explore community pharmacy privacy practices, experiences and expectations and the utilization of available space to achieve privacy. Qualitative methods were used, comprising a series of face-to-face interviews with 25 pharmacists and 55 pharmacy customers in Perth, Western Australia, between June and August 2013. The use of private consultation areas for certain services and sensitive discussions was supported by pharmacists and consumers although there was recognition that workflow processes in some pharmacies may need to change to maximize the use of private areas. Pharmacy staff adopted various strategies to overcome privacy obstacles such as taking consumers to a quieter part of the pharmacy, avoiding exposure of sensitive items through packaging, lowering of voices, interacting during pharmacy quiet times and telephoning consumers. Pharmacy staff and consumers regularly had to apply judgement to achieve the required level of privacy. Management of privacy can be challenging in the community pharmacy environment, and on-going work in this area is important. As community pharmacy practice is increasingly becoming more involved in advanced medication and disease state management services with unique privacy requirements, pharmacies' layouts and systems to address privacy challenges require a proactive approach. © 2015 The Authors. Health Expectations Published by John Wiley & Sons Ltd.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hughes, Justin Matthew
These are the slides for a graduate presentation at Mississippi State University. It covers the following: the BRL shaped-charge geometry in PAGOSA, a mesh refinement study, surrogate modeling using a radial basis function network (RBFN), ruling out parameters using sensitivity analysis (equation of state study), uncertainty quantification (UQ) methodology, and sensitivity analysis (SA) methodology. In summary, a mesh convergence study was used to ensure that solutions were numerically stable by comparing PDV data between simulations. A Design of Experiments (DOE) method was used to reduce the simulation space to study the effects of the Jones-Wilkins-Lee (JWL) parameters for the Composition B main charge. Uncertainty was quantified by computing the 95% data range about the median of simulation output using a brute-force Monte Carlo (MC) random sampling method. Parameter sensitivities were quantified using the Fourier Amplitude Sensitivity Test (FAST) spectral analysis method, where it was determined that detonation velocity, initial density, C1, and B1 controlled jet tip velocity.
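A minimal sketch of the brute-force Monte Carlo uncertainty quantification described above: sample the uncertain inputs, run the model, and report the 95% data range about the median of the output. The input distributions and the toy response function are illustrative assumptions standing in for the PAGOSA simulations.

import numpy as np

rng = np.random.default_rng(0)

def toy_jet_tip_velocity(detonation_velocity, initial_density):
    # Stand-in for the expensive simulation (illustrative response only).
    return 0.9 * detonation_velocity * (initial_density / 1.7) ** 0.5

# Sample uncertain inputs (illustrative means and spreads), run the model, summarize.
n = 10_000
d_vel = rng.normal(7.9, 0.1, n)     # detonation velocity, km/s
rho0 = rng.normal(1.70, 0.02, n)    # initial density, g/cm3
outputs = toy_jet_tip_velocity(d_vel, rho0)

median = np.median(outputs)
lo, hi = np.percentile(outputs, [2.5, 97.5])  # 95% data range about the median
print(median, (lo, hi))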
High Throughput Exposure Modeling of Semi-Volatile Chemicals in Articles of Commerce (ACS)
Risk due to chemical exposure is a function of both chemical hazard and exposure. Near-field exposures to chemicals in consumer products are identified as the main drivers of exposure and yet are not well quantified or understood. The ExpoCast project is developing a model that e...
Cradle to Gate Life Cycle Assessment of North American Cellulosic Fiberboard Production
Maureen Puettmann; Richard Bergman; Elaine Oneil
2016-01-01
All consumer products have an environmental footprint. Quantifying that footprint has become more common with the advent of Environmental Preferential Purchasing (EPP), an emergent world-wide phenomenon. The forest products industry in particular has been challenged regarding its environmental sustainability. The greatest challenges with respect to practices center on...
A Market-oriented Approach To Maximizing Product Benefits: Cases in U.S. Forest Products Industries
Vijay S. Reddy; Robert J. Bush; Ronen Roudik
1996-01-01
Conjoint analysis, a decompositional customer preference modelling technique, has seen little application to forest products. However, the technique provides useful information for marketing decisions by quantifying consumer preference functions for multiattribute product alternatives. The results of a conjoint analysis include the contribution of each attribute and...
ERIC Educational Resources Information Center
Cotterill, Stewart T.
2015-01-01
The development of effective learning environments in higher education (HE) appears to become increasingly prioritised by HE institutions. This approach reflects an increasingly "consumer" focused student body, and HE attempt to further quantify the quality of their products. However, all too often attempts to build more effective…
Orique, Sabrina B; Patty, Christopher M; Sandidge, Alisha; Camarena, Emma; Newsom, Rose
2017-12-01
The aim of this article is to describe the use of Hospital Consumer Assessment of Healthcare Providers and Systems (HCAHPS) data to measure missed nursing care and construct a missed nursing care metric. Missed nursing care varies widely within and between US hospitals. Missed nursing care can be measured utilizing the HCAHPS data. This cross-sectional study used HCAHPS data to measure missed care. This analysis includes HCAHPS data from 1125 acute care patients discharged between January 2014 and December 2014. A missed care index was computed by dividing the total number of missed care occurrences as reported by the patient into the total number of survey responses that did not indicate missed care. The computed missed care index for the organization was 0.6 with individual unit indices ranging from 0.2 to 1.4. Our methods utilize existing data to quantify missed nursing care. Based on the assessment, nursing leaders can develop interventions to decrease the incidence of missed care. Further data should be gathered to validate the incidence of missed care from HCAHPS reports.
Volatile aroma compounds in various brewed green teas.
Lee, Jeehyun; Chambers, Delores H; Chambers, Edgar; Adhikari, Koushik; Yoon, Youngmo
2013-08-20
This study identifies and semi-quantifies aroma volatiles in brewed green tea samples. The objectives were to identify, using a gas chromatograph-mass spectrometer (GC-MS) paired with headspace solid-phase micro-extraction (HS-SPME), the common volatile compounds that may be responsible for the aroma/flavor of the brewed liquor of a range of green tea samples from various countries as consumed, and to determine whether green teas from the same region have similarities in volatile composition when prepared for consumption. Twenty-four green tea samples from eight different countries were brewed as recommended for consumer brewing. The aroma volatiles were extracted by HS-SPME, separated on a gas chromatograph and identified using a mass spectrometer. Thirty-eight compounds were identified and their concentrations semi-quantified. The concentrations were lower than those reported by other researchers, probably because this research examined headspace volatiles from brewed tea rather than solvent extraction of leaves. No relationship to country of origin was found, which indicates that other factors have a greater influence than country of origin on aroma.
Cook, Brendan; Gazzano, Jerrome; Gunay, Zeynep; Hiller, Lucas; Mahajan, Sakshi; Taskan, Aynur; Vilogorac, Samra
2012-04-23
The electric grid in the United States has been suffering from underinvestment for years, and now faces pressing challenges from rising demand and deteriorating infrastructure. High congestion levels in transmission lines are greatly reducing the efficiency of electricity generation and distribution. In this paper, we assess the faults of the current electric grid and quantify the costs of maintaining the current system into the future. While the proposed "smart grid" contains many proposals to upgrade the ailing infrastructure of the electric grid, we argue that smart meter installation in each U.S. household will offer a significant reduction in peak demand on the current system. A smart meter is a device which monitors a household's electricity consumption in real-time, and has the ability to display real-time pricing in each household. We conclude that these devices will provide short-term and long-term benefits to utilities and consumers. The smart meter will enable utilities to closely monitor electricity consumption in real-time, while also allowing households to adjust electricity consumption in response to real-time price adjustments.
Analysis of Consumers’ Preferences and Price Sensitivity to Native Chickens
Lee, Min-A; Jung, Yoojin; Jo, Cheorun
2017-01-01
This study analyzed consumers’ preferences and price sensitivity to native chickens. A survey was conducted from Jan 6 to 17, 2014, and data were collected from consumers (n=500) living in Korea. Statistical analyses evaluated the consumption patterns of native chickens, preferences regarding native chicken breeds which will be newly developed, and price sensitivity measurement (PSM). Of the subjects who preferred broilers, 24.3% do not purchase native chickens because of the dryness and tough texture, while those who preferred native chickens liked their chewy texture (38.2%). Of the total subjects, 38.2% preferred fried native chickens as a processed food, 38.4% preferred direct sales for native chicken distribution, 51.0% preferred native chickens to be slaughtered in specialty stores, and 32.4% wanted easier access to native chickens. Additionally, the price stress range (PSR) was 50 won, and the point of marginal cheapness (PMC) and point of marginal expensiveness (PME) were 6,980 won and 12,300 won, respectively. Evaluation of the segmented market revealed that consumers who prefer broilers to native chicken breeds were more sensitive to the chicken price. To accelerate the consumption of newly developed native chicken meat, it is necessary to develop a texture that each consumer needs, to increase the accessibility of native chickens, and to have diverse menus and recipes as well as reasonable pricing for native chickens. PMID:28747834
Alagandula, Ravali; Zhou, Xiang; Guo, Baochuan
2017-01-15
Liquid chromatography/tandem mass spectrometry (LC/MS/MS) is the gold standard of urine drug testing. However, current LC-based methods are time-consuming, limiting the throughput of MS-based testing and increasing the cost. This is particularly problematic for quantification of drugs such as phenobarbital, which is often analyzed in a separate run because it must be negatively ionized. This study examined the feasibility of using a dilute-and-shoot flow-injection method without LC separation to quantify drugs, with phenobarbital as a model system. Briefly, a urine sample containing phenobarbital was first diluted 10 times, followed by flow injection of the diluted sample into the mass spectrometer. Quantification and detection of phenobarbital were achieved by an electrospray negative-ionization MS/MS system operated in the multiple reaction monitoring (MRM) mode with the stable-isotope-labeled drug as internal standard. The dilute-and-shoot flow-injection method developed was linear with a dynamic range of 50-2000 ng/mL of phenobarbital and a correlation coefficient > 0.9996. The coefficients of variation and relative errors for intra- and inter-assays at four quality control (QC) levels (50, 125, 445 and 1600 ng/mL) were 3.0% and 5.0%, respectively. The total run time to quantify one sample was 2 min, and the sensitivity and specificity of the method did not deteriorate even after 1200 consecutive injections. Our method can accurately and robustly quantify phenobarbital in urine without LC separation. Because of its 2 min run time, the method can process 720 samples per day. This feasibility study shows that the dilute-and-shoot flow-injection method can be a general approach for fast analysis of drugs in urine. Copyright © 2016 John Wiley & Sons, Ltd.
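A minimal sketch of quantification against a stable-isotope-labelled internal standard as used above: the analyte/IS peak-area ratio is converted to concentration via a linear calibration. The calibration slope, intercept, and peak areas are illustrative assumptions, not values from the study.

def quantify_with_internal_standard(analyte_area, is_area, slope, intercept=0.0):
    # Concentration from the analyte/internal-standard area ratio using a
    # linear calibration: ratio = slope * concentration + intercept.
    ratio = analyte_area / is_area
    return (ratio - intercept) / slope

# Illustrative calibration (slope in ratio units per ng/mL) and MRM peak areas:
conc = quantify_with_internal_standard(analyte_area=8.4e5, is_area=1.2e6, slope=0.0014)
print(round(conc))  # ~500 ng/mL in this toy example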
GC/MS screening method for phthalate esters in children's toys.
Ting, Keh-Chuh; Gill, Modan; Garbin, Orlando
2009-01-01
Phthalate esters are commonly added into polyvinyl chloride (PVC) as softeners to make the plastic material flexible. Phthalates are suspected cancer-causing agents and possible teratogens; they have been linked to liver and kidney damage, as well as the underdevelopment of reproductive organs in humans and animals. Public safety concerns about human exposure to phthalates are on the rise because they do not chemically bond to PVC and leach from the material over time. Following the lead of the European Union and Japan in restricting the use of certain phthalates, a legal limit of 0.1% in children's toys was established by the California State Legislature (AB-1108). In addition to its mission to protect public health and the environment from toxic harm, the California Department of Toxic Substances Control (DTSC) has been delegated the role of lead agency for consumer product safety. To support DTSC's Green Chemistry activities, the Environmental Chemistry Laboratory Mobile Laboratory Team has developed an on-site screening method to monitor phthalates in children's toys. This method is simple, fast, and effective, with ample sensitivity to quantify the 6 restricted phthalates in children's toys at 100 ppm (limit of quantitation = 100 microg/g) which is 10 times lower than the legal allowable level of 0.1%. Additionally, the method has a high throughput capability and enables testing of approximately 6-10 samples per day, depending on the complexity of the sample matrix and concentration. This method is designed to survey the 6 phthalates in children's toys and other consumer products for compliance with the threshold of 0.1% (1000 ppm).
IgE Sensitization Patterns to Commonly Consumed Foods Determined by Skin Prick Test in Korean Adults
2016-01-01
Offending food allergens can vary with regional preferences in food consumption. In this study, we analysed sensitization rates to commonly consumed foods in Korean adults suspected of having food allergy. One hundred and thirty four subjects underwent a skin prick test (SPT) with 55 food allergens, of which 13 were made by our laboratory and the rest were commercially purchased. Of the 134 patients, 73 (54.5%) were sensitized to one or more food allergens. Sensitization to chrysalis was detected most frequently, at a rate of 25.4%. Sensitization rates to other food allergens were as follows: maize grain (13.4%), shrimp (11.9%), almond (11.1%), wheat flour (8.2%), lobster (8.2%), buckwheat (8.2%), mackerel (5.2%), pollack (5.2%), halibut (4.5%), peanut (4.5%), anchovy (4.4%), squid (3.7%), saury (3.0%), common eel (3.0%), yellow corvina (3.0%), hairtail (2.2%), octopus (2.2%), and others. In addition to well-known food allergens, sensitivity to mackerel, chrysalis, pollack, and halibut, which are popular foods in Korea, was observed at high rates in Korean adults. We suggest that the SPT panel for food allergy in Korea should include these allergens. PMID:27478328
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liu, Tao; Hossain, Mahmud; Schepmoes, Athena A.
2012-08-03
Sandwich immunoassay is the standard technique used in clinical labs for quantifying protein biomarkers for disease detection, monitoring and therapeutic intervention. Albeit highly sensitive, the development of a specific immunoassay is rather time-consuming and associated with extremely high cost due to the requirement for paired immunoaffinity reagents of high specificity. Recently, mass spectrometry-based methods, specifically selected reaction monitoring mass spectrometry (SRM-MS), have been increasingly applied to measure low abundance biomarker candidates in tissue and biofluids, owing to high sensitivity and specificity, simplicity of assay configuration, and great multiplexing capability. In this study, we report for the first time the development of immunoaffinity depletion-based workflows and SRM-MS assays that enable sensitive and accurate quantification of total and free prostate-specific antigen (PSA) in serum without the requirement for specific PSA antibodies. With stable isotope dilution and external calibration, low ng/mL level detection of both total and free PSA was consistently achieved in both PSA-spiked female serum samples and actual patient serum samples. Moreover, comparison of the results obtained when SRM PSA assays and conventional immunoassays were applied to the same samples showed very good correlation (R2 values ranging from 0.90 to 0.99) in several independent clinical serum sample sets, including a set of 33 samples assayed in a blinded test. These results demonstrate that the workflows and SRM assays developed here provide an attractive alternative for reliably measuring total and free PSA in human blood. Furthermore, simultaneous measurement of free and total PSA and many other biomarkers can be performed in a single analysis using high-resolution liquid chromatographic separation coupled with SRM-MS.
Assessment of the impact of increased solar ultraviolet radiation upon marine ecosystems
NASA Technical Reports Server (NTRS)
Vandyke, H.
1977-01-01
Specifically, the study has addressed the following: (1) the potential for irreversible damage to the productivity, structure and/or functioning of a model estuarine ecosystem by increased UV-B radiation, or whether such ecosystems are highly stable or amenable to adaptive change, and (2) the sensitivity of key community components (the primary producers, consumers, and decomposers) to increased UV-B radiation. Three areas of study were examined during the past year: (1) a continuation of the study utilizing the two seminatural ecosystem chambers, (2) a pilot study utilizing three flow-through ecosystem tanks enclosed in a small, outdoor greenhouse, and (3) sensitivity studies of representative primary producers and consumers.
Skodje, Gry I; Sarna, Vikas K; Minelle, Ingunn H; Rolfsen, Kjersti L; Muir, Jane G; Gibson, Peter R; Veierød, Marit B; Henriksen, Christine; Lundin, Knut E A
2018-02-01
Non-celiac gluten sensitivity is characterized by symptom improvement after gluten withdrawal in the absence of celiac disease. The mechanisms of non-celiac gluten sensitivity are unclear, and there are no biomarkers for this disorder. Foods with gluten often contain fructans, a type of fermentable oligo-, di-, and monosaccharides and polyols. We aimed to investigate the effect of gluten and fructans separately in individuals with self-reported gluten sensitivity. We performed a double-blind crossover challenge of 59 individuals on a self-instituted gluten-free diet, in whom celiac disease had been excluded. The study was performed at Oslo University Hospital in Norway from October 2014 through May 2016. Participants were randomly assigned to groups placed on diets containing gluten (5.7 g), fructans (2.1 g), or placebo, concealed in muesli bars, for 7 days. Following a minimum 7-day washout period (until the symptoms induced by the previous challenge were resolved), participants crossed over into a different group, until they completed all 3 challenges (gluten, fructan, and placebo). Symptoms were measured using the irritable bowel syndrome version of the Gastrointestinal Symptom Rating Scale (GSRS-IBS). A linear mixed model was used for the analysis. Overall GSRS-IBS scores differed significantly during gluten, fructan, and placebo challenges; mean values were 33.1 ± 13.3, 38.6 ± 12.3, and 34.3 ± 13.9, respectively (P = .04). Mean scores for GSRS-IBS bloating were 9.3 ± 3.5, 11.6 ± 3.5, and 10.1 ± 3.7, respectively, during the gluten, fructan, and placebo challenges (P = .004). The overall GSRS-IBS score for participants consuming fructans was significantly higher than for participants consuming gluten (P = .049), as was the GSRS bloating score (P = .003). Thirteen participants had the highest overall GSRS-IBS score after consuming gluten, 24 had the highest score after consuming fructan, and 22 had the highest score after consuming placebo. There was no difference in GSRS-IBS scores between gluten and placebo groups. In a randomized, double-blind, placebo-controlled crossover study of individuals with self-reported non-celiac gluten sensitivity, we found fructans to induce symptoms, measured by the GSRS-IBS. Clinicaltrials.gov no: NCT02464150. Copyright © 2018 AGA Institute. Published by Elsevier Inc. All rights reserved.
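The analysis described above is a linear mixed model fitted to repeated scores per participant across the three challenges. The sketch below is only an illustration of that type of model on simulated data; the data frame, column names, and effect sizes are hypothetical, not the trial's records.

```python
# Illustrative sketch only: linear mixed model with participant as a random
# effect, analogous to the crossover analysis described above (simulated data).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
subjects = np.repeat(np.arange(59), 3)
challenge = np.tile(["gluten", "fructan", "placebo"], 59)
base = rng.normal(34, 12, 59)                                   # per-subject baseline
effect = {"gluten": 0.0, "fructan": 4.0, "placebo": 0.5}        # assumed challenge effects
score = base[subjects] + np.array([effect[c] for c in challenge]) + rng.normal(0, 5, 59 * 3)

df = pd.DataFrame({"subject": subjects, "challenge": challenge, "gsrs_ibs": score})
model = smf.mixedlm("gsrs_ibs ~ C(challenge, Treatment('placebo'))", df, groups=df["subject"])
print(model.fit().summary())
```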
NASA Astrophysics Data System (ADS)
Vargo, L. J.; Galewsky, J.; Rupper, S.; Ward, D. J.
2018-04-01
The subtropical Andes (18.5-27 °S) have been glaciated in the past, but are presently glacier-free. We use idealized model experiments to quantify glacier sensitivity to changes in climate in order to investigate the climatic drivers of past glaciations. We quantify the equilibrium line altitude (ELA) sensitivity (the change in ELA per change in climate) to temperature, precipitation, and shortwave radiation for three distinct climatic regions in the subtropical Andes. We find that in the western cordillera, where conditions are hyper-arid with the highest solar radiation on Earth, ELA sensitivity is as high as 34 m per % increase in precipitation, and 70 m per % decrease in shortwave radiation. This is compared with the eastern cordillera, where precipitation is the highest of the three regions, and ELA sensitivity is only 10 m per % increase in precipitation, and 25 m per % decrease in shortwave radiation. The high ELA sensitivity to shortwave radiation highlights the influence of radiation on the mass balance of high-elevation and low-latitude glaciers. We also consider these quantified ELA sensitivities in the context of previously dated glacial deposits from the regions. Our results suggest that glaciation of the humid eastern cordillera was driven primarily by lower temperatures, while glaciations of the arid Altiplano and western cordillera were also influenced by increases in precipitation and decreases in shortwave radiation. Using paleoclimate records from the timing of glaciation, we find that glaciation of the hyper-arid western cordillera can be explained by precipitation increases of 90-160% (1.9-2.6× higher than modern), in conjunction with associated decreases in shortwave radiation of 7-12% and in temperature of 3.5 °C.
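As a back-of-envelope illustration of how the quoted sensitivities contrast between regions, the sketch below linearly combines them for a small climate perturbation. The perturbation values are arbitrary, and a linear combination is only meaningful near the modern climate; it is not how the paper treats the large (90-160%) precipitation changes.

```python
# Back-of-envelope linearization of the ELA sensitivities quoted above.
# Valid only for small perturbations about the modern climate; the example
# perturbation (+5% precipitation, -2% shortwave) is arbitrary.
sens = {
    "western cordillera": {"precip_m_per_pct": 34, "sw_m_per_pct": 70},
    "eastern cordillera": {"precip_m_per_pct": 10, "sw_m_per_pct": 25},
}
d_precip_pct, d_sw_pct = +5.0, -2.0   # hypothetical small climate perturbation

for region, s in sens.items():
    lowering = s["precip_m_per_pct"] * d_precip_pct + s["sw_m_per_pct"] * (-d_sw_pct)
    print(f"{region}: ELA lowering of roughly {lowering:.0f} m")
# western cordillera: roughly 310 m; eastern cordillera: roughly 100 m
```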
2013-01-01
This analysis supplements the Annual Energy Outlook 2013 alternative cases which imposed hypothetical carbon dioxide emission fees on fossil fuel consumers. It offers further cases that examine the impacts of fees placed only on the emissions from electric power facilities, impacts of returning potential revenues to consumers, and two cap-and-trade policies.
A force-based, parallel assay for the quantification of protein-DNA interactions.
Limmer, Katja; Pippig, Diana A; Aschenbrenner, Daniela; Gaub, Hermann E
2014-01-01
Analysis of transcription factor binding to DNA sequences is of utmost importance for understanding the intricate regulatory mechanisms that underlie gene expression. Several techniques exist that quantify DNA-protein affinity, but they are either very time-consuming or, like many high-throughput techniques, suffer from possible misinterpretation due to complicated algorithms or approximations. We present a more direct method to quantify DNA-protein interaction in a force-based assay. In contrast to single-molecule force spectroscopy, our technique, the Molecular Force Assay (MFA), parallelizes force measurements so that it can test one or multiple proteins against several DNA sequences in a single experiment. The interaction strength is quantified by comparison to the well-defined rupture stability of different DNA duplexes. As a proof of principle, we measured the interaction of the zinc finger construct Zif268/NRE against six different DNA constructs. We could show the specificity of our approach and quantify the strength of the protein-DNA interaction.
Invasive plant species alters consumer behavior by providing refuge from predation.
Dutra, Humberto P; Barnett, Kirk; Reinhardt, Jason R; Marquis, Robert J; Orrock, John L
2011-07-01
Understanding the effects of invasive plants on native consumers is important because consumer-mediated indirect effects have the potential to alter the dynamics of coexistence in native communities. Invasive plants may promote changes in consumer pressure due to changes in protective cover (i.e., the architectural complexity of the invaded habitat) and in food availability (i.e., subsidies of fruits and seeds). No experimental studies have evaluated the relative interplay of these two effects. In a factorial experiment, we manipulated cover and food provided by the invasive shrub Amur honeysuckle (Lonicera maackii) to evaluate whether this plant alters the foraging activity of native mammals. Using tracking plates to quantify mammalian foraging activity, we found that removal of honeysuckle cover, rather than changes in the fruit resources it provides, reduced the activity of important seed consumers, mice in the genus Peromyscus. Two mesopredators, Procyon lotor and Didelphis virginiana, were also affected. Moreover, we found rodents used L. maackii for cover only on cloudless nights, indicating that the effect of honeysuckle was weather-dependent. Our work provides experimental evidence that this invasive plant species changes habitat characteristics, and in so doing alters the behavior of small- and medium-sized mammals. Changes in seed predator behavior may lead to cascading effects on the seeds that mice consume.
Predictors of assistive technology use: the importance of personal and psychosocial factors.
Scherer, Marcia J; Sax, Caren; Vanbiervliet, Alan; Cushman, Laura A; Scherer, John V
2005-11-15
To validate an assistive technology (AT) baseline and outcomes measure and to quantify the measure's value in determining the best match of consumer and AT considering consumer ratings of their subjective quality of life, mood, support from others, motivation for AT use, program/therapist reliance, and self-determination/self-esteem. Prospective multi-cohort study. Vocational rehabilitation offices and community. Over 150 vocational rehabilitation counselors in 25 U.S. states with one consumer each receiving new AT. Counselor training in the Matching Person and Technology (MPT) Model and consumer completion of the MPT measure, Assistive Technology Device Predisposition Assessment (ATD PA). Total and subscale scores on the ATD PA as well as counselor-completed questionnaires. ATD PA items differentiated consumer predispositions to AT use as well as AT and user match. There were no significant differences due to gender, physical locality, or age within this sample of working-age adult consumers. Vocational rehabilitation counselors exposed to training in the MPT Model achieved enhanced AT service delivery outcomes. The ATD PA is a valid measure of predisposition to use an AT and the subsequent match of AT and user. Rehabilitation practitioners who use the ATD PA will achieve evidence-based practice and can expect to see enhanced AT service delivery outcomes.
Consumer awareness and interest toward sodium reduction trends in Korea.
Kim, Mina K; Lee, Kwang-Geun
2014-07-01
Reduction of dietary sodium intake by lowering the amount of sodium in foods is a global industry target. Quantitative information on current consumer knowledge of sodium reduction trends in Korea is lacking. The objective of this study was to quantify consumer knowledge and awareness of sodium and salt reduction in foods and to characterize consumer interest in health labeling on food packages. Additionally, consumer knowledge in Korea was compared with that in the United States. Consumers (n = 289) participated in an internet survey designed to gauge consumer knowledge of and attitudes toward dietary sodium, the sodium content of representative food products (n = 27), and their interest in specific health claims, including sodium labeling. Questions regarding demographics as well as consumption characteristics were asked. A sodium knowledge index and a saltiness belief index were calculated based on the number of correct responses regarding the salt level and sodium content of the given food products. Kano analysis was conducted to determine the role of nutrition labels in consumer satisfaction with products. Current consumer knowledge of the sodium content of food products was high, and consumers were adept at matching sodium content with the salty taste intensity of food products. Consumers' knowledge of the relationship between diets high in sodium and an increased risk of developing previously reported sodium-related diseases, such as hypertension, coronary heart disease, kidney disease, and stomach cancer, was also high. Information on the nutrition panel that influences consumer satisfaction (trans fat, sodium, ingredient list, and country of origin), as well as adjective-nutrition claim pairs that positively affect purchase intent, was identified. This work documented the current status of Korean consumer knowledge of the amount of sodium in food and of sodium as a risk factor for chronic disease. It also provided practical information to food marketers on what consumers like and what they want to see on product labels in Korea. © 2014 Institute of Food Technologists®
Novick, Rachel M; Nelson, Mindy L; Unice, Kenneth M; Keenan, James J; Paustenbach, Dennis J
2013-06-01
1,2-Benzisothiazolin-3-one (BIT; CAS # 2634-33-5) is a preservative used in consumer products. Dermal exposure to BIT at sufficient dose and duration can produce skin sensitization and allergic contact dermatitis in animals and in susceptible humans. The purpose of this study was to derive a maximal concentration of BIT in various consumer products that would result in exposures below the No Expected Sensitization Induction Level (NESIL), a dose below which skin sensitization should not occur. A screening-level exposure estimate was performed for several product use scenarios with sunscreen, laundry detergent, dish soap, and spray cleaner. We calculated that BIT concentrations below 0.0075%, 0.035%, 0.035%, and 0.021% in sunscreen, laundry detergent, dish soap, and spray cleaner, respectively, are unlikely to induce skin sensitization. We completed a pilot study consisting of bulk sample analysis of one representative product from each category labelled as containing BIT, and found BIT concentrations of 0.0009% and 0.0027% for sunscreen and dish soap, respectively. BIT was not detected in the laundry detergent and spray cleaner products above the limit of detection of 0.0006%. Based on publicly available data on product formulations and our results, we established that cleaning products and sunscreens likely contain BIT at concentrations similar to or less than our calculated maximal safe concentrations and that exposures are unlikely to induce skin sensitization in most users. Copyright © 2013 Elsevier Ltd. All rights reserved.
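A generic screening-level calculation of the kind described above works backwards from the NESIL to a maximal product concentration. The sketch below uses a leave-on-product scenario with entirely hypothetical parameter values (NESIL, assessment factor, applied amount, retention, skin area); these are placeholders, not the study's inputs.

```python
# Generic screening-level sketch (not the study's actual inputs): derive a
# maximal product concentration from a NESIL using assumed exposure parameters.
NESIL_ug_cm2 = 40.0           # hypothetical NESIL, µg/cm²/day
SAF = 300.0                   # hypothetical combined sensitization assessment factor
AEL = NESIL_ug_cm2 / SAF      # acceptable exposure level, µg/cm²/day

product_amount_mg = 10000.0   # hypothetical amount of product applied per day, mg (leave-on)
retention = 1.0               # hypothetical fraction retained on skin
skin_area_cm2 = 17500.0       # hypothetical exposed skin area, cm²

# highest preservative weight fraction keeping dermal exposure at or below the AEL
amount_ug = product_amount_mg * 1000.0
max_fraction = AEL * skin_area_cm2 / (amount_ug * retention)
print(f"maximal concentration under these assumptions: {max_fraction * 100:.3f} % w/w")
```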
Quantification of M13 and T7 bacteriophages by TaqMan and SYBR green qPCR.
Peng, Xiujuan; Nguyen, Alex; Ghosh, Debadyuti
2018-02-01
TaqMan and SYBR Green quantitative PCR (qPCR) methods were developed as DNA-based approaches to reproducibly enumerate M13 and T7 phages from phage display selection experiments, individually and simultaneously. The genome copies of M13 and T7 phages were quantified by TaqMan or SYBR Green qPCR referenced against M13 and T7 DNA standard curves of known concentrations. TaqMan qPCR was capable of quantifying M13 and T7 phage DNA simultaneously, with detection ranges of 2.75 × 10^1 - 2.75 × 10^8 genome copies (gc)/μL and 2.66 × 10^1 - 2.66 × 10^8 gc/μL, respectively. TaqMan qPCR demonstrated amplification efficiencies (E) of 0.97 and 0.90 for M13 and T7 phage DNA, respectively. SYBR Green qPCR was ten-fold more sensitive than TaqMan qPCR, able to quantify 2.75 - 2.75 × 10^7 gc/μL and 2.66 × 10^1 - 2.66 × 10^7 gc/μL of M13 and T7 phage DNA, with amplification efficiencies of 1.06 and 0.78, respectively. Due to its superior sensitivity, SYBR Green qPCR was used to enumerate M13 and T7 phage display clones selected against a cell line, and the quantified titers demonstrated accuracy comparable to titers from the traditional double-layer plaque assay. Compared to enzyme-linked immunosorbent assay, both qPCR methods exhibited increased detection sensitivity and reproducibility. These qPCR methods are reproducible, sensitive, and time-saving for determining phage titers and for quantifying large numbers of phage samples individually or simultaneously, avoiding the need for the time-intensive double-layer plaque assay. These findings highlight the attractiveness of qPCR for phage enumeration in applications ranging from selection to next-generation sequencing (NGS). Copyright © 2017 Elsevier B.V. All rights reserved.
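The standard-curve quantification and the amplification efficiencies reported above follow the usual qPCR relationships: copies are interpolated from a Cq-versus-log(copies) regression, and E = 10^(-1/slope) - 1. The sketch below illustrates this with hypothetical Cq values, not the study's data.

```python
# Minimal sketch of absolute quantification against a DNA standard curve and
# of the amplification efficiency (E) reported above. Cq values are hypothetical.
import numpy as np

log10_copies = np.arange(1, 9, dtype=float)                        # 10^1 .. 10^8 gc/µL standards
cq = np.array([34.1, 30.8, 27.4, 24.1, 20.7, 17.4, 14.0, 10.7])    # hypothetical Cq values

slope, intercept = np.polyfit(log10_copies, cq, 1)
efficiency = 10 ** (-1.0 / slope) - 1.0                            # E = 1.0 means perfect doubling

def copies_per_ul(sample_cq):
    """Interpolate genome copies/µL from a sample Cq via the standard curve."""
    return 10 ** ((sample_cq - intercept) / slope)

print(f"slope={slope:.2f}, E={efficiency:.2f}, unknown with Cq 22.5 is about {copies_per_ul(22.5):.2e} gc/µL")
```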
76 FR 4920 - Government-Owned Inventions; Availability for Licensing
Federal Register 2010, 2011, 2012, 2013, 2014
2011-01-27
... appear to be distinct from each other. Available for licensing is a Plk1 ELISA assay using peptide... binding property, an easy and reliable ELISA assay has been developed to quantify Plk1 expression levels... sequence. ELISA assay to quantify Plk1 expression and kinase activity. Advantages: Rapid, highly sensitive...
Effects of temperature on consumer-resource interactions.
Amarasekare, Priyanga
2015-05-01
Understanding how temperature variation influences the negative (e.g. self-limitation) and positive (e.g. saturating functional responses) feedback processes that characterize consumer-resource interactions is an important research priority. Previous work on this topic has yielded conflicting outcomes with some studies predicting that warming should increase consumer-resource oscillations and others predicting that warming should decrease consumer-resource oscillations. Here, I develop a consumer-resource model that both synthesizes previous findings in a common framework and yields novel insights about temperature effects on consumer-resource dynamics. I report three key findings. First, when the resource species' birth rate exhibits a unimodal temperature response, as demonstrated by a large number of empirical studies, the temperature range over which the consumer-resource interaction can persist is determined by the lower and upper temperature limits to the resource species' reproduction. This contrasts with the predictions of previous studies, which assume that the birth rate exhibits a monotonic temperature response, that consumer extinction is determined by temperature effects on consumer species' traits, rather than the resource species' traits. Secondly, the comparative analysis I have conducted shows that whether warming leads to an increase or decrease in consumer-resource oscillations depends on the manner in which temperature affects intraspecific competition. When the strength of self-limitation increases monotonically with temperature, warming causes a decrease in consumer-resource oscillations. However, if self-limitation is strongest at temperatures physiologically optimal for reproduction, a scenario previously unanalysed by theory but amply substantiated by empirical data, warming can cause an increase in consumer-resource oscillations. Thirdly, the model yields testable comparative predictions about consumer-resource dynamics under alternative hypotheses for how temperature affects competitive and resource acquisition traits. Importantly, it does so through empirically quantifiable metrics for predicting temperature effects on consumer viability and consumer-resource oscillations, which obviates the need for parameterizing complex dynamical models. Tests of these metrics with empirical data on a host-parasitoid interaction yield realistic estimates of temperature limits for consumer persistence and the propensity for consumer-resource oscillations, highlighting their utility in predicting temperature effects, particularly warming, on consumer-resource interactions in both natural and agricultural settings. © 2014 The Author. Journal of Animal Ecology © 2014 British Ecological Society.
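To make the modelling framework above concrete, the sketch below integrates a generic Rosenzweig-MacArthur consumer-resource model in which the resource birth rate has a unimodal (Gaussian) temperature response, as emphasised in the abstract. The functional forms and every parameter value are illustrative placeholders, not the study's model.

```python
# Illustrative consumer-resource model (Rosenzweig-MacArthur form) with a
# unimodal temperature response for the resource birth rate. Parameters and
# functional forms are generic placeholders, not those of the study above.
import numpy as np
from scipy.integrate import solve_ivp

def birth_rate(T, b_max=2.0, T_opt=25.0, width=6.0):
    return b_max * np.exp(-((T - T_opt) / width) ** 2)   # unimodal temperature response

def rhs(t, y, T):
    R, C = y
    b = birth_rate(T)
    q = 0.02            # resource self-limitation
    a, h = 0.5, 0.4     # attack rate, handling time (type II functional response)
    e, d = 0.6, 0.3     # conversion efficiency, consumer mortality
    f = a * R / (1.0 + a * h * R)
    return [b * R - q * R * R - f * C, e * f * C - d * C]

for T in (20.0, 25.0, 32.0):
    sol = solve_ivp(rhs, (0, 500), [10.0, 2.0], args=(T,), max_step=0.5)
    R_end, C_end = sol.y[:, -1]
    print(f"T={T:>5} C  ->  R={R_end:6.2f}, C={C_end:6.2f}")
```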
USDA-ARS?s Scientific Manuscript database
Analytical methods for the determination of mycotoxins in foods are commonly based on chromatographic techniques (GC, HPLC or LC-MS). Although these methods permit a sensitive and accurate determination of the analyte, they require skilled personnel and are time-consuming, expensive, and unsuitable ...
Hjelmeland, Anna K; Wylie, Philip L; Ebeler, Susan E
2016-02-01
Methoxypyrazines are volatile compounds found in plants, microbes, and insects that have potent vegetal and earthy aromas. With sensory detection thresholds in the low ng/L range, modest concentrations of these compounds can profoundly impact the aroma quality of foods and beverages, and high levels can lead to consumer rejection. The wine industry routinely analyzes the most prevalent methoxypyrazine, 2-isobutyl-3-methoxypyrazine (IBMP), to aid in harvest decisions, since concentrations decrease during berry ripening. In addition to IBMP, three other methoxypyrazines, IPMP (2-isopropyl-3-methoxypyrazine), SBMP (2-sec-butyl-3-methoxypyrazine), and EMP (2-ethyl-3-methoxypyrazine), have been identified in grapes and/or wine and can impact aroma quality. Despite their routine analysis in the wine industry (mostly IBMP), accurate methoxypyrazine quantitation is hindered by two major challenges: sensitivity and resolution. With extremely low sensory detection thresholds (~8-15 ng/L in wine for IBMP), highly sensitive analytical methods to quantify methoxypyrazines at trace levels are necessary. Here we were able to achieve resolution of IBMP, as well as IPMP, EMP, and SBMP, from co-eluting compounds using one-dimensional chromatography coupled to positive chemical ionization tandem mass spectrometry. Three extraction techniques, HS-SPME (headspace solid-phase microextraction), SBSE (stir bar sorptive extraction), and HSSE (headspace sorptive extraction), were validated and compared. A 30 min extraction time was used for the HS-SPME and SBSE extraction techniques, while 120 min was necessary to achieve sufficient sensitivity for HSSE extractions. All extraction methods have limits of quantitation (LOQ) at or below 1 ng/L for all four methoxypyrazines analyzed, i.e., LOQs at or below reported sensory detection limits in wine. The method is high throughput, with resolution of all compounds possible with a relatively rapid 27 min GC oven program. Copyright © 2015 Elsevier B.V. All rights reserved.
Estimating consumer familiarity with health terminology: a context-based approach.
Zeng-Treitler, Qing; Goryachev, Sergey; Tse, Tony; Keselman, Alla; Boxwala, Aziz
2008-01-01
Effective health communication is often hindered by a "vocabulary gap" between language familiar to consumers and jargon used in medical practice and research. To present health information to consumers in a comprehensible fashion, we need to develop a mechanism to quantify health terms as being more likely or less likely to be understood by typical members of the lay public. Prior research has used approaches including syllable count, easy word list, and frequency count, all of which have significant limitations. In this article, we present a new method that predicts consumer familiarity using contextual information. The method was applied to a large query log data set and validated using results from two previously conducted consumer surveys. We measured the correlation between the survey result and the context-based prediction, syllable count, frequency count, and log normalized frequency count. The correlation coefficient between the context-based prediction and the survey result was 0.773 (p < 0.001), which was higher than the correlation coefficients between the survey result and the syllable count, frequency count, and log normalized frequency count (p < or = 0.012). The context-based approach provides a good alternative to the existing term familiarity assessment methods.
Quantifying the bending of bilayer temperature-sensitive hydrogels
NASA Astrophysics Data System (ADS)
Dong, Chenling; Chen, Bin
2017-04-01
Stimuli-responsive hydrogels can serve as manipulators, such as grippers and sensors, whose structures can undergo significant bending. Here, a finite-deformation theory is developed to quantify the evolution of the curvature of bilayer temperature-sensitive hydrogels subjected to a temperature change. Analysis of the theory indicates that there is an optimal thickness ratio that yields the largest curvature in the bilayer, and also suggests that the sign or magnitude of the curvature can be significantly affected by pre-stretches or small pores in the bilayer. This study may provide important guidelines for fabricating temperature-responsive bilayers with desirable mechanical performance.
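A useful small-strain point of comparison for the finite-deformation theory described above is the classical Timoshenko bilayer result, which likewise predicts an optimal thickness ratio. The sketch below evaluates that classical formula; the mismatch strain, modulus ratio, and thickness are arbitrary illustrative values, and this is not the paper's model.

```python
# Classical small-strain bilayer (Timoshenko) curvature as a point of comparison
# with the finite-deformation theory above. All numbers are illustrative.
import numpy as np

def curvature(m, n, h, eps):
    """Timoshenko bilayer curvature; m = t1/t2, n = E1/E2, h = total thickness, eps = mismatch strain."""
    return 6.0 * eps * (1.0 + m) ** 2 / (
        h * (3.0 * (1.0 + m) ** 2 + (1.0 + m * n) * (m ** 2 + 1.0 / (m * n)))
    )

h, eps, n = 1e-3, 0.05, 0.5          # 1 mm bilayer, 5% swelling mismatch, modulus ratio 0.5
ratios = np.linspace(0.05, 5.0, 500)
kappa = curvature(ratios, n, h, eps)
best = ratios[np.argmax(kappa)]
print(f"optimal thickness ratio near {best:.2f}, max curvature near {kappa.max():.1f} 1/m")
```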
A microplate assay to measure classical and alternative complement activity.
Puissant-Lubrano, Bénédicte; Fortenfant, Françoise; Winterton, Peter; Blancher, Antoine
2017-05-01
We developed and validated a kinetic microplate hemolytic assay (HA) to quantify classical and alternative complement activity in a single dilution of human plasma or serum. The assay is based on monitoring hemolysis of sensitized sheep (or uncoated rabbit) red blood cells with a 96-well microplate reader. The activity of the calibrator was evaluated by reference to 200 healthy adults. The conversion of the 50% hemolysis time into a percentage of activity was obtained using a calibration curve plotted daily. The linearity of the assay as well as interference (by hemolysis, bilirubinemia and lipemia) was assessed for the classical pathway (CP). The within-day and between-day precision was satisfactory relative to the performance of commercially available liposome immunoassay (LIA) and ELISA. Patients with hereditary or acquired complement deficiencies were detected (measured activity <30%). We also provide a reference range obtained from 200 blood donors. The agreement of CP results evaluated on samples from 48 patients was 94% with LIA and 87.5% with ELISA. The sensitivity of our assay was better than that of LIA, and the cost was lower than that of either LIA or ELISA. In addition, this assay was less time consuming than previously reported HAs. The assay allows the simultaneous measurement of 36 samples in duplicate per run of a 96-well plate. The use of a daily calibration curve allows standardization of the method and leads to good reproducibility. The same technique was also adapted for the quantification of alternative pathway (AP) activity.
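The conversion step described above (50% hemolysis time to percent activity via a daily calibration curve) can be sketched as follows. The calibrator activities, t50 values, and the log-log fit form are assumptions for illustration, not the published assay's data or curve shape.

```python
# Sketch of converting 50%-hemolysis time (t50) into percent complement activity
# via a daily calibration curve. Calibrator values and the log-log form are hypothetical.
import numpy as np

cal_activity = np.array([25.0, 50.0, 100.0, 200.0])   # % activity of calibrator dilutions
cal_t50 = np.array([820.0, 460.0, 250.0, 140.0])      # seconds to reach 50% hemolysis

# assume t50 falls roughly log-linearly with activity, so fit log(activity) vs log(t50)
slope, intercept = np.polyfit(np.log(cal_t50), np.log(cal_activity), 1)

def percent_activity(t50_seconds):
    return float(np.exp(intercept + slope * np.log(t50_seconds)))

print(f"sample with t50 = 600 s  ->  about {percent_activity(600.0):.0f}% activity")
```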
Platter, W J; Tatum, J D; Belk, K E; Chapman, P L; Scanga, J A; Smith, G C
2003-11-01
Logistic regression was used to quantify and characterize the effects of changes in marbling score, Warner-Bratzler shear force (WBSF), and consumer panel sensory ratings for tenderness, juiciness, or flavor on the probability of overall consumer acceptance of strip loin steaks from beef carcasses (n = 550). Consumers (n = 489) evaluated steaks for tenderness, juiciness, and flavor using nine-point hedonic scales (1 = like extremely and 9 = dislike extremely) and for overall steak acceptance (satisfied or not satisfied). Predicted acceptance of steaks by consumers was high (> 85%) when the mean consumer sensory rating for tenderness, juiciness, or flavor for a steak was 3 or lower on the hedonic scale. Conversely, predicted consumer acceptance of steaks was low (< or = 10%) when the mean consumer rating for tenderness, juiciness, or flavor for a steak was 5 or higher on the hedonic scale. As mean consumer sensory ratings for tenderness, juiciness, or flavor worsened from 3 to 5, the probability of acceptance of steaks by consumers diminished rapidly in a linear fashion. These results suggest that small changes in consumer sensory ratings for these sensory traits have dramatic effects on the probability of acceptance of steaks by consumers. Marbling score displayed a weak (adjusted R2 = 0.053), yet significant (P < 0.01), relationship to acceptance of steaks by consumers, and the shape of the predicted probability curve for steak acceptance was approximately linear over the entire range of marbling scores (Traces 67 to Slightly Abundant 97), suggesting that the likelihood of consumer acceptance of steaks increases by approximately 10% for each full marbling score increase between Slight and Slightly Abundant. The predicted probability curve for consumer acceptance of steaks was sigmoidal for the WBSF model, with a steep decline in predicted probability of acceptance as WBSF values increased from 3.0 to 5.5 kg. Changes in WBSF within the high (> 5.5 kg) or low (< 3.0 kg) portions of the range of WBSF values had little effect on the probability of consumer acceptance of steaks.
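The WBSF model described above is an ordinary binary logistic regression of acceptance on shear force. The sketch below fits the same type of model to simulated data (not the study's records) and shows the sigmoidal decline in predicted acceptance across the 3.0-5.5 kg range.

```python
# Illustrative logistic regression of overall steak acceptance on Warner-Bratzler
# shear force (WBSF), mirroring the type of model described above. Data are simulated.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
wbsf = rng.uniform(2.0, 7.0, 550)                        # kg
logit_p = 6.0 - 1.5 * wbsf                               # assumed steep decline around 3-5.5 kg
accepted = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit_p)))

X = sm.add_constant(wbsf)
fit = sm.Logit(accepted, X).fit(disp=0)
for w in (3.0, 4.5, 5.5):
    p = fit.predict([1.0, w])[0]
    print(f"WBSF {w} kg -> predicted acceptance {p:.2f}")
```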
Quantifying food waste in Hawaii's food supply chain.
Loke, Matthew K; Leung, PingSun
2015-12-01
Food waste highlights a considerable loss of resources invested in the food supply chain. While it receives a lot of attention in the global context, the assessment of food waste is deficient at the sub-national level, owing primarily to an absence of quality data. This article serves to explore that gap and aims to quantify the edible weight, economic value, and calorie equivalent of food waste in Hawaii. The estimates are based on available food supply data for Hawaii and the US Department of Agriculture's (USDA's) loss-adjusted food availability data for defined food groups at three stages of the food supply chain. At its highest aggregated level, we estimate Hawaii's food waste generation at 237,122 t or 26% of available food supply in 2010. This is equivalent to food waste of 161.5 kg per person, per annum. Additionally, this food waste is valued at US$1.025 billion annually or the equivalent of 502.6 billion calories. It is further evident that the occurrence of food waste by all three measures is highest at the consumer stage, followed by the distribution and retail stage, and is lowest at the post-harvest and packing stage. The findings suggest that any meaningful intervention to reduce food waste in Hawaii should target the consumer, and distribution and retail stages of the food supply chain. Interventions at the consumer stage should focus on the two protein groups, as well as fresh fruits and fresh vegetables. © The Author(s) 2015.
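The per-person figure quoted above follows from a simple division of the annual total by a de facto population. The check below is ours, and the interpretation of the implied denominator as residents plus visitors is an assumption, not a statement from the article.

```python
# Quick consistency check of the per-person figure quoted above.
total_waste_kg = 237_122 * 1000            # 237,122 t expressed in kg
per_person_kg = 161.5
implied_population = total_waste_kg / per_person_kg
print(f"implied (de facto) population of about {implied_population / 1e6:.2f} million")
# roughly 1.47 million, i.e. on the order of Hawaii's residents plus daily visitors
```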
Bucher, Tamara; Collins, Clare; Rollo, Megan E; McCaffrey, Tracy A; De Vlieger, Nienke; Van der Bend, Daphne; Truby, Helen; Perez-Cueto, Federico J A
2016-06-01
Nudging or 'choice architecture' refers to strategic changes in the environment that are anticipated to alter people's behaviour in a predictable way, without forbidding any options or significantly changing their economic incentives. Nudging strategies may be used to promote healthy eating behaviour. However, to date, the scientific evidence has not been systematically reviewed to enable practitioners and policymakers to implement, or argue for the implementation of, specific measures to support nudging strategies. This systematic review investigated the effect of positional changes of food placement on food choice. In total, seven scientific databases were searched using relevant keywords to identify interventions that manipulated food position (proximity or order) to generate a change in food selection, sales or consumption, among normal-weight or overweight individuals across any age group. From 2576 identified articles, fifteen articles comprising eighteen studies met our inclusion criteria. This review has identified that manipulation of food product order or proximity can influence food choice. Such approaches offer promise in terms of impacting on consumer behaviour. However, there is a need for high-quality studies that quantify the magnitude of positional effects on food choice in conjunction with measuring the impact on food intake, particularly in the longer term. Future studies should use outcome measures such as change in grams of food consumed or energy intake to quantify the impact on dietary intake and potential impacts on nutrition-related health. Research is also needed to evaluate potential compensatory behaviours secondary to such interventions.
Context-Sensitive Spelling Correction of Consumer-Generated Content on Health Care.
Zhou, Xiaofang; Zheng, An; Yin, Jiaheng; Chen, Rudan; Zhao, Xianyang; Xu, Wei; Cheng, Wenqing; Xia, Tian; Lin, Simon
2015-07-31
Consumer-generated content, such as postings on social media websites, can serve as an ideal source of information for studying health care from a consumer's perspective. However, consumer-generated content on health care topics often contains spelling errors, which, if not corrected, will be obstacles to downstream computer-based text analysis. In this study, we proposed a framework with a spelling correction system designed for consumer-generated content and a novel ontology-based evaluation system, which was used to efficiently assess the correction quality. Additionally, we emphasized the importance of context sensitivity in the correction process, and demonstrated why correction methods designed for electronic medical records (EMRs) fail to perform well with consumer-generated content. First, we developed our spelling correction system based on Google Spell Checker. The system processed postings acquired from MedHelp, a biomedical bulletin board system (BBS), and saved misspelled words (eg, sertaline) and corresponding corrected words (eg, sertraline) into two separate sets. Second, to reduce the number of words needing manual examination in the evaluation process, we matched the words in the two sets against terms in two biomedical ontologies: RxNorm and Systematized Nomenclature of Medicine -- Clinical Terms (SNOMED CT). The ratio of words that could be matched and appropriately corrected was used to evaluate the correction system's overall performance. Third, we categorized the misspelled words according to the types of spelling errors. Finally, we calculated the ratio of abbreviations in the postings, which differs markedly between EMRs and consumer-generated content and can largely influence the overall performance of spelling checkers. An uncorrected word and the corresponding corrected word were called a spelling pair, and the two words in the spelling pair were its members. In our study, 271 spelling pairs were detected, among which 58 (21.4%) pairs had one or two members matched in the selected ontologies. The ratio of appropriate corrections among the 271 overall spelling errors was 85.2% (231/271). The ratio among the 58 spelling pairs was 86% (50/58), close to the overall ratio. We also found that linguistic errors accounted for 31.4% (85/271) of all errors detected, and only 0.98% (210/21,358) of words in the postings were abbreviations, much lower than the ratio in EMRs (33.6%). We conclude that our system can accurately correct spelling errors in consumer-generated content. Context sensitivity is indispensable in the correction process. Additionally, it can be confirmed that consumer-generated content differs from EMRs in that consumers seldom use abbreviations. Also, the evaluation method, taking advantage of biomedical ontology, can effectively estimate the accuracy of the correction system and reduce manual examination time.
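To illustrate why context sensitivity matters for this task, the toy sketch below generates spelling candidates by string similarity and then ranks them using the neighbouring word. The vocabulary, bigram counts, similarity threshold, and function names are invented placeholders; the actual system builds on Google Spell Checker rather than anything shown here.

```python
# Toy illustration (not the authors' system) of context-sensitive correction:
# generate candidates by string similarity, then rank them by a simple bigram
# count with the word to the left of the misspelling.
from difflib import SequenceMatcher

vocabulary = {"sertraline", "sertaconazole", "certain"}
# toy counts of (left-context word, candidate) pairs harvested from postings
bigram_counts = {("taking", "sertraline"): 42, ("taking", "sertaconazole"): 1}

def similarity(a, b):
    return SequenceMatcher(None, a, b).ratio()

def correct(word, left_context):
    candidates = [w for w in vocabulary if similarity(word, w) > 0.7]
    if not candidates:
        return word
    return max(candidates, key=lambda w: (bigram_counts.get((left_context, w), 0), similarity(word, w)))

print(correct("sertaline", "taking"))   # -> sertraline
```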
Powell, Brian S; Kerry, Colette G; Cornuelle, Bruce D
2013-10-01
Measurements of acoustic ray travel-times in the ocean provide synoptic integrals of the ocean state between source and receiver. It is known that the ray travel-time is sensitive to variations in the ocean at the transmission time, but the sensitivity of the travel-time to spatial variations in the ocean prior to the acoustic transmission has not been quantified. This study examines the sensitivity of ray travel-time to the temporally and spatially evolving ocean state in the Philippine Sea using the adjoint of a numerical model. A one-year series of five-day backward integrations of the adjoint model quantifies the sensitivity of travel-times to varying dynamics that can alter the travel-time of a 611 km ray by 200 ms. The early evolution of the sensitivities reveals high-mode internal waves that dissipate quickly, leaving the lowest three modes, providing a connection to variations in internal tide generation prior to the sample time. The sensitivities are also strongly affected by advective effects that alter density along the ray path. These sensitivities reveal how travel-time measurements are affected by both nearby and distant waters. Temporal nonlinearity of the sensitivities suggests that prior knowledge of the ocean state is necessary to exploit the travel-time observations.
Sensitivity of whitewater rafting consumers surplus to pecuniary travel cost specifications
Donald B.K. English; J. Michael Bowker
1996-01-01
Considerable research has examined how different ways of accounting for onsite and travel time affect surplus estimates from travel cost models. However, little has been done regarding different definitions of out-of-pocket costs. Estimates of per trip consumer surplus are developed for a zonal travel cost model for outfitted rafting on the Chattooga River. Nine price...
ERIC Educational Resources Information Center
Malone, Stephen M.; McGue, Matt; Iacono, William G.
2010-01-01
Background: The maximum number of alcoholic drinks consumed in a single 24-hr period is an alcoholism-related phenotype with both face and empirical validity. It has been associated with severity of withdrawal symptoms and sensitivity to alcohol, genes implicated in alcohol metabolism, and amplitude of a measure of brain activity associated with…
USDA-ARS?s Scientific Manuscript database
To evaluate newer indirect calorimetry system to quantify energetic parameters, 8 cross-bred beef steers (initial BW = 241 ± 4.10 kg) were used in a 77-d experiment to examine energetics parameters calculated from carbon dioxide (CO2), methane (CH4), and oxygen (O2) fluxes. Steers were individually ...
The pigments of sorghum pericarp are associated with the contents of carotenoids and pro-vitamin A
USDA-ARS?s Scientific Manuscript database
Sorghum is a staple crop consumed in certain regions of Africa and Asia, where vitamin A deficiency is prevalent. However, the correlation of sorghum intake and vitamin A deficiency is contradictory. The objective of this study was to identify and quantify the carotenoids and pro-vitamin A in the se...
Overlap and partitioning of the ecological and isotopic niches
Elizabeth A. Flaherty; Merav Ben-David
2010-01-01
Recently, it was proposed that stable isotope patterns can be used to quantify the width of the ecological niche of animals. However, the potential effects of habitat use on isotopic patterns of consumers have not been fully explored and consequently isotopic patterns may yield deceptive estimates of niche width. Here, we simulated four different scenarios of a...
Cinnamic aldehyde: a survey of consumer patch-test sensitization.
Danneman, P J; Booman, K A; Dorsky, J; Kohrman, K A; Rothenstein, A S; Sedlak, R I; Steltenkamp, R J; Thompson, G R
1983-12-01
The potential for cinnamic aldehyde, an important fragrance and flavour ingredient, to induce or to elicit delayed contact hypersensitivity reactions in man was evaluated by analysing patch-test data. Results of studies involving a total of 4117 patch tests on various consumer products and fragrance blends containing cinnamic aldehyde and on the material itself were collected from fragrance and formulator companies. The data indicate that cinnamic aldehyde contained in consumer products and fragrance blends at concentrations up to 6 × 10^-1 %, and patch-tested at concentrations up to 8 × 10^-3 %, has no detectable potential to induce hypersensitivity. Cinnamic aldehyde when tested alone induced a dose-related hypersensitivity response. According to published reports, cinnamic aldehyde elicited positive delayed hypersensitivity responses in dermatitic patients. However, results of the current survey show that when cinnamic aldehyde was tested alone or as part of a mixture in subjects in the general population, no pre-existing hypersensitivity reactions to the fragrance material were observed in any of the 4117 patch tests which constituted the survey. Cinnamic aldehyde, at the concentrations contained in consumer products and fragrances, has a very low potential to induce hypersensitivity ('induced' reactions) or to elicit sensitization reactions ('elicited' reactions) in the general population.
Creating Peer-Led Media to Teach Sensitive Topics: Recommendations from Practicing Health Educators
ERIC Educational Resources Information Center
Hudson, Heather K.; Bliss, Kadi R.; Bice, Matthew R.; Lodyga, Marc G.; Ragon, Bruce M.
2014-01-01
Purpose: The purpose of the study was to evaluate consumer (instructor) reception of Channel Surfing Contraceptives in order to determine components necessary for creation of peer-led educational videos to teach sensitive contraceptive topics. Methods: Two focus group interviews with introductory-level undergraduate personal health instructors…
RAPID PCR-BASED MONITORING OF INFECTIOUS ENTEROVIRUSES IN DRINKING WATER. (R824756)
Currently, the standard method for the detection of enteroviruses and hepatitis A virus in water involves cell culture assay which is expensive and time consuming. Direct RT-PCR offers a rapid and sensitive alternative to virus detection but sensitivity is oft...
Byrne, Sahara; Niederdeppe, Jeff; Avery, Rosemary J; Cantor, Jonathan
2013-01-01
Previous research suggests that direct-to-consumer (DTC) advertisements for pharmaceutical drugs have the potential to influence consumers' perceptions of whether symptoms should be treated medically and/or through behavior change. However, the relative frequency of messages emphasizing these approaches in pharmaceutical advertising remains largely unknown. A content analysis of print and television advertisements for cholesterol management medication between 1994 and 2005 (for print) and between 1999 and 2007 (for television) was conducted. First, the extent to which established theoretical constructs drawn from health communication scholarship are depicted in the content of DTC cholesterol advertisements is quantified. Second, specific claims about behavior change inefficacy when a pharmaceutical alternative is available are identified. Findings indicate that DTC ads offer many mixed messages about the efficacy of diet and exercise in reducing cholesterol and risk of heart disease. Theoretical and practical implications of this work are discussed.
Decision-making patterns for dietary supplement purchases among women aged 25 to 45 years.
Miller, Carla K; Russell, Teri; Kissling, Grace
2003-11-01
Women frequently consume dietary supplements but the criteria used to select supplements have received little investigation. This research identified the decision-making criteria used for dietary supplements among women aged 25 to 45 years who consumed a supplement at least four times per week. Participants (N=51) completed an in-store shopping interview that was audiotaped, transcribed, and analyzed qualitatively for the criteria used to make supplement selections. Qualitative analysis revealed 10 key criteria and the number of times each person used each criterion was quantified. Cluster analysis identified five homogeneous subgroups of participants based on the criteria used. These included brand shopper, bargain shopper, quality shopper, convenience shopper, and information gatherer. Supplement users vary in the criteria used to make point-of-purchase supplement selections. Dietetics professionals can classify supplement users according to the criteria used to tailor their nutrition counseling and better meet the educational needs of consumers.
Using a drug facts box to communicate drug benefits and harms: two randomized trials.
Schwartz, Lisa M; Woloshin, Steven; Welch, H Gilbert
2009-04-21
Direct-to-consumer prescription drug ads typically fail to provide fundamental information that consumers need to make informed decisions: data on how well the drug works. To see whether providing consumers with a drug facts box-a table quantifying outcomes with and without the drug-improves knowledge and affects judgments about prescription medications. Two randomized, controlled trials conducted between October 2006 and April 2007: a symptom drug box trial using direct-to-consumer ads for a histamine-2 blocker and a proton-pump inhibitor to treat heartburn, and a prevention drug box trial using direct-to-consumer ads for a statin and clopidogrel to prevent cardiovascular events. National sample of U.S. adults identified by random-digit dialing. Adults age 35 to 70 years who completed a mailed survey; the final samples comprised 231 participants with completed surveys in the symptom drug box trial (49% response rate) and 219 in the prevention drug box trial (46% response rate). In both trials, the control group received 2 actual drug ads (including both the front page and brief summary). The drug box group received the same ads, except that the brief summary was replaced by a drug facts box. Choice between drugs (primary outcome of the symptom drug box trial) and accurate perceptions of drug benefits and side effects (primary outcome of the prevention drug box trial). In the symptom drug box trial, 70% of the drug box group and 8% of the control group correctly identified the PPI as being "a lot more effective" than the histamine-2 blocker (P < 0.001), and 80% and 38% correctly recognized that the side effects of the 2 drugs were similar (P < 0.001). When asked what they would do if they had bothersome heartburn and could have either drug for free, 68% of the drug box group and 31% of the control group chose the proton-pump inhibitor, the superior drug (P < 0.001). In the prevention drug box trial, the drug box improved consumers' knowledge of the benefits and side effects of a statin and clopidogrel. For example, 72% of the drug box group and 9% of the control group correctly quantified the benefit (absolute risk reduction) of the statin (P < 0.001). Most of the control participants overestimated this benefit, and 65% did so by a factor of 10 or more. The trials tested drug boxes in only 4 direct-to-consumer ads. If other direct-to-consumer ads were to communicate outcome data better, the effect of the drug box would be reduced. A drug facts box improved U.S. consumers' knowledge of prescription drug benefits and side effects. It resulted in better choices between drugs for current symptoms and corrected the overestimation of benefit in the setting of prevention. National Cancer Institute and Attorney General Consumer and Prescriber Education Program.
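The benefit measure that the drug facts box asks consumers to judge, absolute risk reduction, is a simple difference in event rates. The worked example below uses hypothetical event rates, not figures from the trials above.

```python
# Worked definition of absolute risk reduction (ARR). Event rates are hypothetical.
control_event_rate = 0.030   # e.g. 3.0% had a cardiovascular event without the drug
treated_event_rate = 0.020   # e.g. 2.0% had one with the drug
arr = control_event_rate - treated_event_rate
nnt = 1.0 / arr              # number needed to treat to prevent one event
print(f"ARR = {arr:.1%} (about {nnt:.0f} people treated to prevent one event)")
```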
Talio, María Carolina; Alesso, Magdalena; Acosta, Mariano; Wills, Verónica S; Fernández, Liliana P
2017-11-01
In this work, a new procedure was developed for the separation and preconcentration of nickel(II) and cadmium(II) in several varied tobacco samples. Tobacco samples were selected considering the main products consumed by segments of the population, in particular the age (youth) and lifestyle of the consumer. To guarantee representative samples, a randomized sampling strategy was used. In the first step, chemofiltration on a nylon membrane is carried out employing eosin (Eo) and carbon nanotubes dispersed in sodium dodecylsulfate (SDS) solution (phosphate buffer, pH 7). Under these conditions, Ni(II) was selectively retained on the solid support. After that, the filtrate containing Cd(II) was re-conditioned with acetic acid/acetate buffer solution (pH 5) before detection. Spectrofluorimetric determination of both metals was carried out, on the solid support for Ni(II) and in the filtered aqueous solution for Cd(II). The solid surface fluorescence (SSF) determination was performed at λem = 545 nm (λex = 515 nm) for the Ni(II)-Eo complex, and the fluorescence of Cd(II)-Eo was quantified in aqueous solution at λem = 565 nm (λex = 540 nm). The calibration graphs were linear over ranges of 0.058-29.35 µg/L for Ni(II) and 0.124-56.20 µg/L for Cd(II), with detection limits of 0.019 and 0.041 µg/L (S/N = 3). The developed methodology shows good sensitivity and adequate selectivity, and it was successfully applied to the determination of trace amounts of nickel and cadmium in tobacco samples (refill solutions for e-cigarettes, snuff used in narguile (molasses), and traditional tobacco) with satisfactory results. The methodology was validated against ICP-MS with adequate agreement. The proposed methodology represents a novel fluorescence application for Ni(II) and Cd(II) quantification, with sensitivity and accuracy similar to atomic spectroscopies, introducing for the first time the quenching effect in SSF. Copyright © 2017 Elsevier B.V. All rights reserved.
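Detection limits quoted at S/N = 3, as above, are commonly estimated from the blank noise and the calibration slope. The sketch below shows that routine calculation with hypothetical intensities and blank replicates; it is not the study's data.

```python
# Sketch of estimating a detection limit at S/N = 3 from a fluorescence
# calibration line. Intensities and blank replicates are hypothetical.
import numpy as np

conc = np.array([0.5, 2.0, 5.0, 10.0, 25.0])              # µg/L standards
intensity = np.array([12.0, 45.0, 110.0, 221.0, 548.0])   # fluorescence signal (a.u.)
blank = np.array([1.9, 2.3, 2.1, 1.8, 2.2, 2.0, 2.1, 1.9, 2.2, 2.0])

slope, intercept = np.polyfit(conc, intensity, 1)
lod = 3.0 * blank.std(ddof=1) / slope      # 3 * sigma(blank) / sensitivity
loq = 10.0 * blank.std(ddof=1) / slope
print(f"slope = {slope:.1f} a.u. per µg/L, LOD about {lod:.3f} µg/L, LOQ about {loq:.3f} µg/L")
```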
Sensitivity to psychostimulants in mice bred for high and low stimulation to methamphetamine.
Kamens, H M; Burkhart-Kasch, S; McKinnon, C S; Li, N; Reed, C; Phillips, T J
2005-03-01
Methamphetamine (MA) and cocaine induce behavioral effects primarily through modulation of dopamine neurotransmission. However, the genetic regulation of sensitivity to these two drugs may be similar or disparate. Using selective breeding, lines of mice were produced with extreme sensitivity (high MA activation; HMACT) and insensitivity (low MA activation; LMACT) to the locomotor stimulant effects of acute MA treatment. Studies were performed to determine whether there is pleiotropic genetic influence on sensitivity to the locomotor stimulant effect of MA and to other MA- and cocaine-related behaviors. The HMACT line exhibited more locomotor stimulation in response to several doses of MA and cocaine, compared to the LMACT line. Both lines exhibited locomotor sensitization to 2 mg/kg of MA and 10 mg/kg of cocaine; the magnitude of sensitization was similar in the two lines. However, the lines differed in the magnitude of sensitization to a 1 mg/kg dose of MA, a dose that did not produce a ceiling effect that may confound interpretation of studies using higher doses. The LMACT line consumed more MA and cocaine in a two-bottle choice drinking paradigm; the lines consumed similar amounts of saccharin and quinine, although the HMACT line exhibited slightly elevated preference for a low concentration of saccharin. These results suggest that some genes that influence sensitivity to the acute locomotor stimulant effect of MA have a pleiotropic influence on the magnitude of behavioral sensitization to MA and sensitivity to the stimulant effects of cocaine. Further, extreme sensitivity to MA may protect against MA and cocaine self-administration.
NASA Astrophysics Data System (ADS)
Song, X.; Chen, X.; Dai, H.; Hammond, G. E.; Song, H. S.; Stegen, J.
2016-12-01
The hyporheic zone is an active region for biogeochemical processes such as carbon and nitrogen cycling, where groundwater and surface water with distinct biogeochemical and thermal properties mix and interact. The biogeochemical dynamics within the hyporheic zone are driven by both river water and groundwater hydraulic dynamics, which are directly affected by climate change scenarios. In addition, the hydraulic and thermal properties of local sediments and the microbial and chemical processes also play important roles in biogeochemical dynamics. Thus, a comprehensive understanding of the biogeochemical processes in the hyporheic zone requires a coupled thermo-hydro-biogeochemical model. As multiple uncertainty sources are involved in the integrated model, it is important to identify its key modules/parameters through sensitivity analysis. In this study, we develop a 2D cross-section model of the hyporheic zone at the DOE Hanford site adjacent to the Columbia River and use this model to quantify module and parametric sensitivity in the assessment of climate change. To achieve this purpose, we 1) develop a facies-based groundwater flow and heat transfer model that incorporates facies geometry and heterogeneity characterized from a field data set, 2) derive multiple reaction networks/pathways from batch experiments with in-situ samples and integrate temperature-dependent reactive transport modules into the flow model, 3) assign multiple climate change scenarios to the coupled model by analyzing historical river stage data, and 4) apply a variance-based global sensitivity analysis to quantify scenario/module/parameter uncertainty in a hierarchical manner. The objectives of the research are to 1) identify the key controlling factors of the coupled thermo-hydro-biogeochemical model in the assessment of climate change, and 2) quantify the carbon consumption under different climate change scenarios in the hyporheic zone.
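The variance-based global sensitivity analysis mentioned above can be illustrated with a generic Sobol workflow. In the sketch below the parameter names, bounds, and the placeholder function f() are hypothetical; in practice f() would wrap a run of the coupled thermo-hydro-biogeochemical model rather than an analytic stand-in.

```python
# Generic variance-based (Sobol) sensitivity analysis of a placeholder model,
# illustrating the hierarchical screening described above. Parameter names,
# bounds, and the model function are hypothetical.
import numpy as np
from SALib.sample import saltelli
from SALib.analyze import sobol

problem = {
    "num_vars": 3,
    "names": ["hydraulic_conductivity", "reaction_rate", "thermal_conductivity"],
    "bounds": [[1e-5, 1e-3], [1e-8, 1e-6], [0.5, 2.5]],
}

def f(x):
    k, r, lam = x
    return np.log10(k) * 2.0 + np.log10(r) + 0.1 * lam   # stand-in for simulated carbon consumption

X = saltelli.sample(problem, 1024)
Y = np.apply_along_axis(f, 1, X)
Si = sobol.analyze(problem, Y)
print(dict(zip(problem["names"], np.round(Si["S1"], 2))))   # first-order Sobol indices
```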
DOE Office of Scientific and Technical Information (OSTI.GOV)
Groen, E.A., E-mail: Evelyne.Groen@gmail.com; Heijungs, R.; Leiden University, Einsteinweg 2, Leiden 2333 CC
Life cycle assessment (LCA) is an established tool to quantify the environmental impact of a product. A good assessment of uncertainty is important for making well-informed decisions in comparative LCA, as well as for correctly prioritising data collection efforts. Under- or overestimation of output uncertainty (e.g. output variance) will lead to incorrect decisions in such matters. The presence of correlations between input parameters during uncertainty propagation can increase or decrease the output variance. However, most LCA studies that include uncertainty analysis ignore correlations between input parameters during uncertainty propagation, which may lead to incorrect conclusions. Two approaches to include correlations between input parameters during uncertainty propagation and global sensitivity analysis were studied: an analytical approach and a sampling approach. The use of both approaches is illustrated for an artificial case study of electricity production. Results demonstrate that both approaches yield approximately the same output variance and sensitivity indices for this specific case study. Furthermore, we demonstrate that the analytical approach can be used to quantify the risk of ignoring correlations between input parameters during uncertainty propagation in LCA. We demonstrate that: (1) we can predict whether including correlations among input parameters in uncertainty propagation will increase or decrease the output variance; (2) we can quantify the risk of ignoring correlations on the output variance and the global sensitivity indices. Moreover, this procedure requires only little data. - Highlights: • Ignoring correlation leads to under- or overestimation of the output variance. • We demonstrated that the risk of ignoring correlation can be quantified. • The procedure proposed is generally applicable in life cycle assessment. • In some cases, ignoring correlation has a minimal effect on decision-making tools.
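The analytical point made above can be seen in a first-order (Taylor) propagation: with sensitivities j evaluated at the input means and input covariance Sigma, Var(y) is approximately j Sigma j^T, and ignoring correlations amounts to zeroing Sigma's off-diagonal terms. The numeric values below are arbitrary and not from the paper's case study.

```python
# Minimal numeric illustration (not the paper's case study) of how input
# correlation changes first-order propagated output variance.
import numpy as np

j = np.array([2.0, -1.0])                 # sensitivities (partial derivatives) of y to x1, x2
sd = np.array([0.3, 0.5])                 # input standard deviations
rho = 0.8                                 # correlation between x1 and x2

cov = np.diag(sd**2)
cov[0, 1] = cov[1, 0] = rho * sd[0] * sd[1]

var_with_corr = j @ cov @ j
var_no_corr = j @ np.diag(sd**2) @ j
print(f"variance with correlation:     {var_with_corr:.3f}")
print(f"variance ignoring correlation: {var_no_corr:.3f}")
# here ignoring a positive correlation overestimates the output variance,
# because the two sensitivities have opposite signs
```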
NASA Astrophysics Data System (ADS)
Qian, Y.; Wang, C.; Huang, M.; Berg, L. K.; Duan, Q.; Feng, Z.; Shrivastava, M. B.; Shin, H. H.; Hong, S. Y.
2016-12-01
This study aims to quantify the relative importance and uncertainties of different physical processes and parameters in affecting simulated surface fluxes and land-atmosphere coupling strength over the Amazon region. We used two-legged coupling metrics, which include both terrestrial (soil moisture to surface fluxes) and atmospheric (surface fluxes to atmospheric state or precipitation) legs, to diagnose the land-atmosphere interaction and coupling strength. Observations made using the Department of Energy's Atmospheric Radiation Measurement (ARM) Mobile Facility during the GoAmazon field campaign together with satellite and reanalysis data are used to evaluate model performance. To quantify the uncertainty in physical parameterizations, we performed a 120-member ensemble of simulations with the WRF model using a stratified experimental design including 6 cloud microphysics, 3 convection, 6 PBL and surface layer, and 3 land surface schemes. A multi-way analysis of variance approach is used to quantitatively analyze the inter- and intra-group (scheme) means and variances. To quantify parameter sensitivity, we conducted an additional 256 WRF simulations in which an efficient sampling algorithm is used to explore the multi-dimensional parameter space. Three uncertainty quantification approaches are applied for sensitivity analysis (SA) of multiple variables of interest to 20 selected parameters in the YSU PBL and MM5 surface layer schemes. Results show consistent parameter sensitivity across different SA methods. We found that 5 out of 20 parameters contribute more than 90% of the total variance, and that first-order effects dominate over interaction effects. Results of this uncertainty quantification study serve as guidance for better understanding the roles of different physical processes in land-atmosphere interactions, quantifying model uncertainties from various sources such as physical processes, parameters and structural errors, and providing insights for improving the model physics parameterizations.
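A minimal sketch of the multi-way ANOVA-style variance decomposition described above is given below, using synthetic scheme labels and a synthetic latent-heat-flux response in place of the actual WRF ensemble output; the factor set is reduced for brevity and does not reproduce the study's 120-member stratified design.

```python
# Minimal sketch of a multi-way ANOVA-style variance decomposition for a physics
# ensemble. The scheme labels and the latent-heat-flux response are synthetic and the
# factor set is reduced; this does not reproduce the study's 120-member design.
import numpy as np
import pandas as pd

rng = np.random.default_rng(2)
micro = [f"MP{i}" for i in range(6)]
cu    = [f"CU{i}" for i in range(3)]
pbl   = [f"PBL{i}" for i in range(6)]

runs = pd.DataFrame([(m, c, p) for m in micro for c in cu for p in pbl],
                    columns=["microphysics", "convection", "pbl"])
# Synthetic response: the PBL scheme matters most here purely by construction.
runs["LH"] = (runs["pbl"].str[-1].astype(int) * 5.0
              + runs["convection"].str[-1].astype(int) * 1.0
              + rng.normal(0.0, 2.0, len(runs)))

total_ss = ((runs["LH"] - runs["LH"].mean()) ** 2).sum()
for factor in ["microphysics", "convection", "pbl"]:
    group_mean = runs.groupby(factor)["LH"].transform("mean")
    factor_ss = ((group_mean - runs["LH"].mean()) ** 2).sum()
    print(f"{factor:12s} explains {100 * factor_ss / total_ss:5.1f}% of ensemble variance")
```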
NASA Astrophysics Data System (ADS)
Malone, A.
2017-12-01
Quantifying mass balance sensitivity to climate change is essential for forecasting glacier evolution and deciphering climate signals embedded in archives of past glacier changes. Ideally, these quantifications result from decades of field measurement, remote sensing, and a hierarchy of modeling approaches, but in data-sparse regions, such as the Himalayas and tropical Andes, regional-scale modeling rooted in first principles provides a first-order picture. Previous regional-scale modeling studies have applied a surface energy and mass balance approach in order to quantify equilibrium line altitude sensitivity to climate change. In this study, an expanded regional-scale surface energy and mass balance model is implemented to quantify glacier-wide mass balance sensitivity to climate change for tropical Andean glaciers. Data from the Randolph Glacier Inventory are incorporated, and additional physical processes are included, such as a dynamic albedo and cloud-dependent atmospheric emissivity. The model output agrees well with the limited mass balance records for tropical Andean glaciers. The dominant climate variables driving interannual mass balance variability differ depending on the climate setting. For wet tropical glaciers (annual precipitation >0.75 m y-1), temperature is the dominant climate variable. Different hypotheses for the processes linking wet tropical glacier mass balance variability to temperature are evaluated. The results support the hypothesis that glacier-wide mass balance on wet tropical glaciers is largely dominated by processes at the lowest elevations, where temperature plays a leading role in energy exchanges. This research also highlights the transient nature of wet tropical glaciers - the vast majority of tropical glaciers and a vital regional water resource - under anthropogenic warming.
NASA Astrophysics Data System (ADS)
Dobre, Mariana; Brooks, Erin; Lew, Roger; Kolden, Crystal; Quinn, Dylan; Elliot, William; Robichaud, Pete
2017-04-01
Soil erosion is a secondary fire effect with great implications for many ecosystem resources. Depending on the burn severity, topography, and the weather immediately after the fire, soil erosion can impact municipal water supplies, degrade water quality, and reduce reservoirs' storage capacity. Scientists and managers use field and remotely sensed data to quickly assess post-fire burn severity in ecologically sensitive areas. From these assessments, mitigation activities are implemented to minimize post-fire flood and soil erosion and to facilitate post-fire vegetation recovery. Alternatively, land managers can use fire behavior and spread models (e.g. FlamMap, FARSITE, FOFEM, or CONSUME) to identify sensitive areas a priori, and apply strategies such as fuel reduction treatments to proactively minimize the risk of wildfire spread and increased burn severity. There is a growing interest in linking fire behavior and spread models with hydrology-based soil erosion models to provide site-specific assessment of mitigation treatments on post-fire runoff and erosion. The challenge remains, however, that many burn severity mapping and modeling products quantify vegetation loss rather than measuring soil burn severity. Wildfire burn severity is spatially heterogeneous and depends on the pre-fire vegetation cover, fuel load, topography, and weather. Severities also differ depending on the variable of interest (e.g. soil, vegetation). In the United States, Burned Area Reflectance Classification (BARC) maps, derived from Landsat satellite images, are used as an initial burn severity assessment. BARC maps are classified from either a Normalized Burn Ratio (NBR) or differenced Normalized Burn Ratio (dNBR) scene into four classes (Unburned, Low, Moderate, and High severity). The development of soil burn severity maps requires further manual field validation efforts to transform the BARC maps into a product more applicable for post-fire soil rehabilitation activities. Alternative spectral indices and modeled output approaches may prove better predictors of soil burn severity and hydrologic effects, but these have not yet been assessed in a model framework. In this project we compare field-verified soil burn severity maps to satellite-derived and modeled burn severity maps. We quantify the extent to which there are systematic differences in these mapping products. We then use the Water Erosion Prediction Project (WEPP) hydrologic soil erosion model to assess sediment delivery from these fires using the predicted and observed soil burn severity maps. Finally, we discuss differences in observed and predicted soil burn severity maps and application to watersheds in the Pacific Northwest to estimate post-fire sediment delivery.
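For reference, the spectral indices behind BARC-style maps follow the standard NBR/dNBR formulas; the sketch below assumes placeholder Landsat NIR and SWIR2 reflectance arrays, and the dNBR class breaks are illustrative rather than the scene-specific thresholds used operationally.

```python
# Standard NBR/dNBR spectral indices behind BARC-style maps, assuming placeholder
# Landsat NIR and SWIR2 reflectance arrays. The dNBR class breaks are illustrative;
# operational BARC thresholds are chosen scene by scene.
import numpy as np

def nbr(nir, swir2):
    return (nir - swir2) / (nir + swir2 + 1e-9)

rng = np.random.default_rng(3)
pre_nir,  pre_swir2  = rng.random((100, 100)), rng.random((100, 100))
post_nir, post_swir2 = rng.random((100, 100)), rng.random((100, 100))

dnbr = nbr(pre_nir, pre_swir2) - nbr(post_nir, post_swir2)

# Four classes: 0 = unburned, 1 = low, 2 = moderate, 3 = high severity
severity = np.digitize(dnbr, bins=[0.1, 0.27, 0.66])
print("pixels per class:", np.bincount(severity.ravel(), minlength=4))
```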
Griffin, Dale W; Harris, Peter R
2011-05-01
Self-affirmation, reflecting on one's defining personal values, increases acceptance of threatening information, but does it do so at the cost of inducing undue alarm in people at low risk of harm? We contrast an alarm model, wherein self-affirmation simply increases response to threat, with a calibration model, wherein self-affirmation increases sensitivity to the self-relevance of health-risk information. Female seafood consumers (N = 165) completed a values self-affirmation or control task before reading a U.S. Food and Drug Administration brochure on mercury in seafood. Findings support the calibration model: Among frequent seafood consumers, self-affirmation generally increased concern (reports of depth of thought, personal message relevance, perceived risk, and negative affect) for those high in defensiveness and reduced it for those low in defensiveness. Among infrequent consumers of seafood, self-affirmation typically reduced concern. Thus, self-affirmation increased the sensitivity with which women at different levels of risk, and at different levels of defensiveness, responded cognitively and affectively to the materials.
NASA Astrophysics Data System (ADS)
Srinivasan, Veena; Gorelick, Steven M.; Goulder, Lawrence
2010-07-01
In this paper, we discuss a challenging water resources problem in a developing world city, Chennai, India. The goal is to reconstruct past system behavior and diagnose the causes of a major water crisis. In order to do this, we develop a hydrologic-engineering-economic model to address the complexity of urban water supply arising from consumers' dependence on multiple interconnected sources of water. We integrate different components of the urban water system: water flowing into the reservoir system; diversion and distribution by the public water utility; groundwater flow in the aquifer beneath the city; supply, demand, and prices in the informal tanker-truck-based water market; and consumer behavior. Both the economic and physical impacts of consumers' dependence on multiple sources of water are quantified. The model is calibrated over the period 2002-2006 using a range of hydrologic and socio-economic data. The model's results highlight the inadequacy of the reservoir system and the buffering role played by the urban aquifer and consumers' coping investments during multiyear droughts.
On the context-dependent scaling of consumer feeding rates.
Barrios-O'Neill, Daniel; Kelly, Ruth; Dick, Jaimie T A; Ricciardi, Anthony; MacIsaac, Hugh J; Emmerson, Mark C
2016-06-01
The stability of consumer-resource systems can depend on the form of feeding interactions (i.e. functional responses). Size-based models predict interactions - and thus stability - based on consumer-resource size ratios. However, little is known about how interaction contexts (e.g. simple or complex habitats) might alter scaling relationships. Addressing this, we experimentally measured interactions between a large size range of aquatic predators (4-6400 mg over 1347 feeding trials) and an invasive prey that transitions among habitats: from the water column (3D interactions) to simple and complex benthic substrates (2D interactions). Simple and complex substrates mediated successive reductions in capture rates - particularly around the unimodal optimum - and promoted prey population stability in model simulations. Many real consumer-resource systems transition between 2D and 3D interactions, and along complexity gradients. Thus, Context-Dependent Scaling (CDS) of feeding interactions could represent an unrecognised aspect of food webs, and quantifying the extent of CDS might enhance predictive ecology. © The Authors. Ecology Letters published by CNRS and John Wiley & Sons Ltd.
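A minimal sketch of how size-based models of this kind map predator mass onto a feeding (functional) response is given below, assuming a Holling type II form with power-law scaling of attack rate and handling time; the exponents and constants are illustrative, not values fitted in the study.

```python
# Sketch of a size-scaled Holling type II functional response, assuming power-law
# (allometric) scaling of attack rate and handling time with predator body mass.
# The exponents and constants are illustrative, not values fitted in the study.
def feeding_rate(prey_density, pred_mass_mg, a0=0.1, h0=2.0, p=0.75, q=-0.75):
    a = a0 * pred_mass_mg ** p          # capture/attack rate rises with predator mass
    h = h0 * pred_mass_mg ** q          # handling time per prey falls with predator mass
    return a * prey_density / (1.0 + a * h * prey_density)

for mass in (4, 400, 6400):             # mg, spanning the experimental size range
    print(f"{mass:5d} mg predator: {feeding_rate(50.0, mass):.2f} prey per unit time")
```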
Propato, Marco; Uber, James G
2004-07-01
Can the spread of infectious disease through water distribution systems be halted by a disinfectant residual? This question is overdue for an answer. Regulatory agencies and water utilities have long been concerned about accidental intrusions of pathogens into distribution system pipelines (i.e., cross-connections) and are increasingly concerned about deliberate pathogen contamination. Here, a simulation framework is developed and used to assess the vulnerability of a water system to microbiological contamination. The risk of delivering contaminated water to consumers is quantified by a network water quality model that includes disinfectant decay and disinfection kinetics. The framework is applied to two example networks under a worst-case deliberate intrusion scenario. Results show that the risk of consumer exposure is affected by the residual maintenance strategy employed. The common regulation that demands a "detectable" disinfectant residual may not provide effective consumer protection against microbial contamination. A chloramine residual, instead of free chlorine, may significantly weaken this final barrier against pathogen intrusions. Moreover, the addition of a booster station at storage tanks may improve consumer protection without requiring excessive disinfectant.
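The kinetic building blocks that such a network water quality model couples can be sketched simply: first-order decay of the disinfectant residual along a flow path and Chick-Watson inactivation of an intruded pathogen. The rate constants, travel times, and lethality coefficient below are illustrative assumptions, not calibrated values from the study.

```python
# Sketch of the two kinetic pieces such a network water-quality model couples:
# first-order decay of the chlorine residual along a flow path and Chick-Watson
# inactivation of an intruded pathogen. All constants are illustrative assumptions.
import numpy as np

C0  = 0.5          # chlorine residual entering the pipe, mg/L
kb  = 0.02         # bulk decay constant, 1/h
lam = 0.01         # Chick-Watson lethality coefficient, L/(mg*min) (pathogen-specific)
N0  = 1e6          # pathogen concentration at the intrusion point, organisms/L

t_h = np.arange(0, 13)                         # travel time to the consumer, hours
C = C0 * np.exp(-kb * t_h)                     # residual along the path
# Chick-Watson: dN/dt = -lam * C * N, so log inactivation grows with the CT integral
CT = np.concatenate([[0.0], np.cumsum(0.5 * (C[1:] + C[:-1]) * 60.0)])  # mg*min/L
N = N0 * np.exp(-lam * CT)
print("log10 reduction vs travel time (h):", np.round(np.log10(N0 / N), 2))
```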
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cappers, Peter; Todd, Annika; Perry, Michael
2013-06-27
This report offers guidelines and protocols for measuring the effects of time-based rates, enabling technology, and various other treatments on customers' levels and patterns of electricity usage. Although the focus is on evaluating consumer behavior studies (CBS) that involve field trials and pilots, the methods can be extended to assessing the large-scale programs that may follow. CBSs are undertaken to resolve uncertainties and ambiguities about how consumers respond to inducements to modify their electricity demand. Those inducements include price structures; feedback and information; and enabling technologies embedded in programs such as: critical peak, time-of-use, and real-time pricing; peak-time rebate or critical peak rebate; home energy reports and in-home displays; and all manner of device controls for appliances and plug loads. Although the focus of this report is on consumer studies—where the subjects are households—the behavioral sciences principles discussed and many of the methods recommended apply equally to studying commercial and industrial customer electricity demand.
A critical analysis of the literature on the Internet and consumer health information.
Powell, J A; Lowe, P; Griffiths, F E; Thorogood, M
2005-01-01
A critical review of the published literature investigating the Internet and consumer health information was undertaken in order to inform further research and policy. A qualitative, narrative method was used, consisting of a three-stage process of identification and collation, thematic coding, and critical analysis. This analysis identified five main themes in the research in this area: (1) the quality of online health information for consumers; (2) consumer use of the Internet for health information; (3) the effect of e-health on the practitioner-patient relationship; (4) virtual communities and online social support and (5) the electronic delivery of information-based interventions. Analysis of these themes revealed more about the concerns of health professionals than about the effect of the Internet on users. Much of the existing work has concentrated on quantifying characteristics of the Internet: for example, measuring the quality of online information, or describing the numbers of users in different health-care settings. There is a lack of qualitative research that explores how citizens are actually using the Internet for health care.
van der Heide, Susan; Garcia Calavia, Paula; Hardwick, Sheila; Hudson, Simon; Wolff, Kim; Russell, David A
2015-05-01
A sensitive and versatile competitive enzyme immunoassay (cEIA) has been developed for the quantitative detection of cocaine in complex forensic samples. Polyclonal anti-cocaine antibody was purified from serum and deposited onto microtiter plates. The concentration of the cocaine antibody adsorbed onto the plates and the dilution of the cocaine-HRP hapten were both studied to achieve an optimised immunoassay. The method was successfully used to quantify cocaine in extracts taken from both paper currency and latent fingermarks. The limit of detection (LOD) of 0.162 ng mL(-1) achieved with the assay compares favourably to that of conventional chromatography-mass spectrometry techniques, with an appropriate sensitivity for the quantification of cocaine at the low concentrations present in some forensic samples. The cEIA was directly compared to LC-MS for the analysis of ten UK banknote samples. The results obtained from both techniques were statistically similar, suggesting that the immunoassay was unaffected by cross-reactivity with potentially interfering compounds. The cEIA was also used for the detection of cocaine in extracts from latent fingermarks. The results obtained were compared to the cocaine concentrations detected in oral fluid sampled from the same individual. Using the cEIA, we have shown, for the first time, that endogenously excreted cocaine can be detected and quantified from a single latent fingermark. Additionally, it has been shown that the presence of cocaine, at similar concentrations, in more than one latent fingermark from the same individual can be linked with those concentrations found in oral fluid. These results show that detection of drugs in latent fingermarks could directly indicate whether an individual has consumed the drug. The specificity and feasibility of measuring low concentrations of cocaine in complex forensic samples demonstrate the effectiveness and robustness of the assay. The immunoassay presents a simple and cost-effective alternative to the current mass spectrometry based techniques for the quantitation of cocaine at forensically significant concentrations. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
Open Source Tools for Numerical Simulation of Urban Greenhouse Gas Emissions
NASA Astrophysics Data System (ADS)
Nottrott, A.; Tan, S. M.; He, Y.
2016-12-01
There is a global movement toward urbanization. Approximately 7% of the global population lives in just 28 megacities, occupying less than 0.1% of the total land area used by human activity worldwide. These cities contribute a significant fraction of the global budget of anthropogenic primary pollutants and greenhouse gases. The 27 largest cities consume 9.9%, 9.3%, 6.7% and 3.0% of global gasoline, electricity, energy and water use, respectively. This impact motivates novel approaches to quantify and mitigate the growing contribution of megacity emissions to global climate change. Cities are characterized by complex topography, inhomogeneous turbulence, and variable pollutant source distributions. These features create a scale separation between local sources and urban-scale emissions estimates known as the Grey-Zone. Modern computational fluid dynamics (CFD) techniques provide a quasi-deterministic, physically based toolset to bridge the scale separation gap between source-level dynamics, local measurements, and urban-scale emissions inventories. CFD has the capability to represent complex building topography and capture detailed 3D turbulence fields in the urban boundary layer. This presentation discusses the application of OpenFOAM to urban CFD simulations of natural gas leaks in cities. OpenFOAM is open-source software for advanced numerical simulation of engineering and environmental fluid flows. When combined with free or low-cost computer-aided drawing and GIS, OpenFOAM generates a detailed, 3D representation of urban wind fields. OpenFOAM was applied to model methane (CH4) emissions from various components of the natural gas distribution system, to investigate the impact of urban meteorology on mobile CH4 measurements. The numerical experiments demonstrate that CH4 concentration profiles are highly sensitive to the relative location of emission sources and buildings. Sources separated by distances of 5-10 meters showed significant differences in vertical dispersion of the plume due to building wake effects. The OpenFOAM flow fields were combined with an inverse, stochastic dispersion model to quantify and visualize the sensitivity of point sensors to upwind sources in various built environments.
Simultaneous ultramicroanalysis of both 17-keto-and 17beta-hydroxy androgens in biological fluids.
Ganjam, V K
1976-11-01
Sensitive methods for quantifying androgens were lacking. Therefore, a relatively simple procedure for separating steroids was combined with highly specific assay methods so that eight androgens could be measured with high accuracy, precision and sensitivity. Semi-automated separations on Sephadex LH-20 columns used heptane:methylene chloride:ethanol:water (50:50:1:0.12) and a flow rate of 17.0 min/ml. The six peaks eluted contained androstenedione; androsterone, epiandrosterone and dihydrotestosterone; testosterone and dehydroepiandrosterone; 3alpha-androstanediol; 3beta-androstanediol; and androstenediol. Androstenedione, dehydroepiandrosterone and androstenediol were quantified using specific antisera (sensitivity less than or equal to 75 pg). Testosterone and dihydrotestosterone were measured by competitive protein-binding assays using rabbit TeBG (sensitivity less than or equal to 150 pg). 3alpha- and 3beta-androstanediol were similarly assayed using human TeBG (sensitivity approximately 150 pg). Androsterone was reduced with NaBH4 and the resulting 3alpha-androstanediol was assayed using human TeBG (sensitivity approximately 200 pg). Inter- and intra-assay variations were less than 10% for radioimmunoassays and less than 16% for competitive protein-binding assays over the entire dose-response curve.
Quantifying cell mono-layer cultures by video imaging.
Miller, K S; Hook, L A
1996-04-01
A method is described in which the relative number of adherent cells in multi-well tissue-culture plates is assayed by staining the cells with Giemsa and capturing the image of the stained cells with a video camera and charge-coupled device. The resultant image is quantified using the associated video imaging software. The method is shown to be sensitive and reproducible and should be useful for studies where quantifying relative cell numbers and/or proliferation in vitro is required.
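A minimal sketch of the quantification step is shown below, assuming a synthetic grayscale frame in place of the Giemsa-stained image: threshold the frame and report the stained-pixel fraction as a relative index of cell number.

```python
# Minimal sketch of the quantification step: threshold a captured frame and report the
# stained-pixel fraction as a relative index of cell number. A synthetic grayscale
# image stands in for the Giemsa-stained well.
import numpy as np

rng = np.random.default_rng(4)
frame = rng.normal(0.2, 0.05, size=(480, 640))     # background intensity
frame[100:300, 200:400] += 0.5                     # simulated stained region

threshold = frame.mean() + 2 * frame.std()         # simple global threshold
stained = frame > threshold
print(f"stained area fraction: {stained.mean():.3f}")
```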
Determinants of choice for pigeons and humans on concurrent-chains schedules of reinforcement.
Belke, T W; Pierce, W D; Powell, R A
1989-09-01
Concurrent-chains schedules of reinforcement were arranged for humans and pigeons. Responses of humans were reinforced with tokens exchangeable for money, and key pecks of 4 birds were reinforced with food. Variable-interval 30-s and 40-s schedules operated in the terminal links of the chains. Condition 1 exposed subjects to variable-interval 90-s and variable-interval 30-s initial links, respectively. Conditions 2 and 3 arranged equal initial-link schedules of 40 s or 120 s. Experimental conditions tested the descriptive adequacy of five equations: reinforcement density, delay reduction, modified delay reduction, matching and maximization. Results based on choice proportions and switch rates during the initial links showed that pigeons behaved in accord with delay-reduction models, whereas humans maximized overall rate of reinforcement. As discussed by Logue and associates in self-control research, different types of reinforcement may affect sensitivity to delay differentially. Pigeons' responses were reinforced with food, a reinforcer that is consumable upon presentation. Humans' responses were reinforced with money, a reinforcer exchanged for consumable reinforcers after it was earned. Reinforcers that are immediately consumed may generate high sensitivity to delay and behavior described as delay reduction. Reinforcers with longer times to consumption may generate low sensitivity to delay and behavior that maximizes overall payoff.
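A simplified numerical reading of the delay-reduction account tested here can be sketched as follows; approximating the expected time to reinforcement as the mean initial-link time plus the mean terminal-link time is an assumption made for illustration, not the authors' exact formulation.

```python
# Simplified numerical reading of the delay-reduction idea tested above. Approximating
# the expected time to food (T) as the mean initial-link time plus the mean
# terminal-link time is an assumption for illustration, not the authors' formulation.
t1, t2 = 30.0, 40.0        # terminal-link VI values, s
initial = 40.0             # equal concurrent initial-link VI value, s (as in Condition 2)

T = initial / 2.0 + (t1 + t2) / 2.0      # rough expected time to food from trial onset
dr1, dr2 = T - t1, T - t2                # delay reduction signalled by each terminal link
print(f"predicted choice proportion for the VI 30-s alternative: {dr1 / (dr1 + dr2):.2f}")
```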
Quantifying nonhomogeneous colors in agricultural materials part I: method development.
Balaban, M O
2008-11-01
Measuring the color of food and agricultural materials using machine vision (MV) has advantages not available with other measurement methods, such as subjective tests or the use of color meters. The perception of consumers may be affected by the nonuniformity of colors. For relatively uniform colors, average color values similar to those given by color meters can be obtained by MV. For nonuniform colors, various image analysis methods (color blocks, contours, and "color change index" [CCI]) can be applied to images obtained by MV. The degree of nonuniformity can be quantified, depending on the level of detail desired. In this article, the development of the CCI concept is presented. For images with a wide range of hue values, the color blocks method quantifies well the nonhomogeneity of colors. For images with a narrow hue range, the CCI method is a better indicator of color nonhomogeneity.
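A sketch of hue-based nonuniformity measures of the kind discussed above is given below; the hue-block fractions and spread statistic are illustrative stand-ins and do not reproduce the authors' color change index.

```python
# Sketch of hue-based nonuniformity measures from a machine-vision image. The hue-block
# fractions and spread statistic below are illustrative stand-ins; the authors' color
# change index (CCI) is not reproduced here.
import colorsys
import numpy as np

rng = np.random.default_rng(5)
rgb = rng.uniform(0.0, 1.0, size=(200, 200, 3))      # placeholder image, RGB in [0, 1]

hsv = np.apply_along_axis(lambda px: colorsys.rgb_to_hsv(*px), 2, rgb)
hue = hsv[..., 0] * 360.0

# Color-blocks view: fraction of pixels in each 30-degree hue block
counts, _ = np.histogram(hue, bins=np.arange(0, 361, 30))
print("hue block fractions:", np.round(counts / hue.size, 2))
# One crude nonuniformity summary: larger spread suggests a less homogeneous color
print("hue standard deviation (deg):", round(float(hue.std()), 1))
```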
Prevalence and effects of multiple chemical sensitivities in Australia.
Steinemann, Anne
2018-06-01
Multiple chemical sensitivities (MCS) is a medical condition associated with exposure to common chemical pollutants. The aims of this study are to assess the prevalence of MCS, its overlaps with asthma and fragrance sensitivity, and its health and societal effects in Australia. Data were collected in June 2016 using an on-line survey with a representative national sample (N = 1098) of adults (ages 18-65) in Australia. Results found that, across the country, 6.5% report medically diagnosed MCS, 18.9% report chemical sensitivity (being unusually sensitive to everyday chemicals and chemically formulated products), and 19.9% report either or both. Among people with MCS, 74.6% also have diagnosed asthma or an asthma-like condition, and 91.5% have fragrance sensitivity, reporting health problems (such as migraine headaches) when exposed to fragranced consumer products (such as air fresheners and cleaning supplies). In addition, among people with MCS, 77.5% are prevented from accessing places because of fragranced products, 52.1% lost workdays or a job in the past year due to fragranced product exposure in the workplace, and 55.4% report health effects considered potentially disabling. Results indicate that MCS is a widespread disease, affecting an estimated 1 million adult Australians, with chemical sensitivity affecting another 2 million. Reducing chemical exposure to problematic sources, such as fragranced consumer products, is critical to reduce adverse effects.
Resilience through adaptation.
Ten Broeke, Guus A; van Voorn, George A K; Ligtenberg, Arend; Molenaar, Jaap
2017-01-01
Adaptation of agents through learning or evolution is an important component of the resilience of Complex Adaptive Systems (CAS). Without adaptation, the flexibility of such systems to cope with outside pressures would be much lower. To study the capabilities of CAS to adapt, social simulations with agent-based models (ABMs) provide a helpful tool. However, the value of ABMs for studying adaptation depends on the availability of methodologies for sensitivity analysis that can quantify resilience and adaptation in ABMs. In this paper we propose a sensitivity analysis methodology that is based on comparing time-dependent probability density functions of output of ABMs with and without agent adaptation. The differences between the probability density functions are quantified by the so-called earth-mover's distance. We use this sensitivity analysis methodology to quantify the probability of occurrence of critical transitions and other long-term effects of agent adaptation. To test the potential of this new approach, it is used to analyse the resilience of an ABM of adaptive agents competing for a common-pool resource. Adaptation is shown to contribute positively to the resilience of this ABM. If adaptation proceeds sufficiently fast, it may delay or avert the collapse of this system.
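The core comparison of the proposed method can be sketched with scipy's one-dimensional Wasserstein (earth-mover's) distance, here applied to synthetic output samples standing in for ABM runs with and without adaptation.

```python
# Sketch of the core comparison: the earth-mover's (1-D Wasserstein) distance between
# output distributions of runs with and without adaptation. The samples here are
# synthetic resource levels, not output of the actual ABM.
import numpy as np
from scipy.stats import wasserstein_distance

rng = np.random.default_rng(6)
for t in (10, 50, 100):                                    # time points, arbitrary units
    no_adapt = rng.normal(100 - 0.8 * t, 5.0, size=1000)   # resource declines faster
    adapt    = rng.normal(100 - 0.3 * t, 5.0, size=1000)   # adaptation slows the decline
    d = wasserstein_distance(no_adapt, adapt)
    print(f"t = {t:3d}: earth-mover's distance = {d:.1f}")
```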
Methane uptake in urban forests and lawns
Peter M. Groffman; Richard V. Pouyat
2009-01-01
The largest natural biological sink for the radiatively active trace gas methane (CH4) is bacteria in soils that consume CH4 as an energy and carbon source. This sink has been shown to be sensitive to nitrogen (N) inputs and alterations of soil physical conditions. Given this sensitivity, conversion of native ecosystems to...
Autonomous characterization of plastic-bonded explosives
NASA Astrophysics Data System (ADS)
Linder, Kim Dalton; DeRego, Paul; Gomez, Antonio; Baumgart, Chris
2006-08-01
Plastic-Bonded Explosives (PBXs) are a newer generation of explosive compositions developed at Los Alamos National Laboratory (LANL). Understanding the micromechanical behavior of these materials is critical. The size of the crystal particles and the porosity within the PBX influence its shock sensitivity. Current methods to characterize the prominent structural characteristics include manual examination by scientists and attempts to use commercially available image processing packages. Both methods are time-consuming and tedious. LANL personnel, recognizing this as a manually intensive process, have worked with the Kansas City Plant / Kirtland Operations to develop a system which utilizes image processing and pattern recognition techniques to characterize PBX material. System hardware consists of a CCD camera, zoom lens, two-dimensional, motorized stage, and coaxial, cross-polarized light. System integration of this hardware with the custom software is at the core of the machine vision system. Fundamental processing steps involve capturing images of the PBX specimen and extracting void, crystal, and binder regions. For crystal extraction, a Quadtree decomposition segmentation technique is employed. Benefits of this system include: (1) reduction of the overall characterization time; (2) a process which is quantifiable and repeatable; (3) utilization of personnel for intelligent review rather than manual processing; and (4) significantly enhanced characterization accuracy.
ddPCRclust - An R package and Shiny app for automated analysis of multiplexed ddPCR data.
Brink, Benedikt G; Meskas, Justin; Brinkman, Ryan R
2018-03-09
Droplet digital PCR (ddPCR) is an emerging technology for quantifying DNA. By partitioning the target DNA into ∼20000 droplets, each serving as its own PCR reaction compartment, a very high sensitivity of DNA quantification can be achieved. However, manual analysis of the data is time-consuming and algorithms for automated analysis of non-orthogonal, multiplexed ddPCR data are unavailable, presenting a major bottleneck for the advancement of ddPCR transitioning from low-throughput to high-throughput. ddPCRclust is an R package for automated analysis of data from Bio-Rad's droplet digital PCR systems (QX100 and QX200). It can automatically analyse and visualise multiplexed ddPCR experiments with up to four targets per reaction. Results are on par with manual analysis, but only take minutes to compute instead of hours. The accompanying Shiny app ddPCRvis provides easy access to the functionalities of ddPCRclust through a web-browser based GUI. R package: https://github.com/bgbrink/ddPCRclust; Interface: https://github.com/bgbrink/ddPCRvis/; Web: https://bibiserv.cebitec.uni-bielefeld.de/ddPCRvis/. bbrink@cebitec.uni-bielefeld.de.
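For context, ddPCR concentrations follow a standard Poisson correction once droplets are classified as positive or negative; the sketch below shows that general formula (not ddPCRclust's clustering step), with a nominal droplet volume that is instrument-specific.

```python
# General Poisson correction behind ddPCR quantification (not ddPCRclust's clustering
# step): with k positive droplets out of n, mean copies per droplet is -ln(1 - k/n).
# The droplet volume is a nominal, instrument-specific assumption.
import math

n_droplets = 20000
k_positive = 3500
droplet_volume_uL = 0.00085        # ~0.85 nL per droplet (nominal)

lam = -math.log(1.0 - k_positive / n_droplets)     # mean copies per droplet
copies_per_uL = lam / droplet_volume_uL
print(f"{copies_per_uL:.0f} copies/uL of reaction mix")
```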
Liang, Kai; Liu, Fei; Fan, Jia; Sun, Dali; Liu, Chang; Lyon, Christopher J.; Bernard, David W.; Li, Yan; Yokoi, Kenji; Katz, Matthew H.; Koay, Eugene J.; Zhao, Zhen; Hu, Ye
2017-01-01
Tumour-derived extracellular vesicles (EVs) are of increasing interest as a resource of diagnostic biomarkers. However, most EV assays require large samples, are time-consuming, low-throughput and costly, and thus impractical for clinical use. Here, we describe a rapid, ultrasensitive and inexpensive nanoplasmon-enhanced scattering (nPES) assay that directly quantifies tumor-derived EVs from as little as 1 μL of plasma. The assay uses the binding of antibody-conjugated gold nanospheres and nanorods to EVs captured by EV-specific antibodies on a sensor chip to produce a local plasmon effect that enhances tumour-derived EV detection sensitivity and specificity. We identified a pancreatic cancer EV biomarker, ephrin type-A receptor 2 (EphA2), and demonstrate that an nPES assay for EphA2-EVs distinguishes pancreatic cancer patients from pancreatitis patients and healthy subjects. EphA2-EVs were also informative in staging tumour progression and in detecting early responses to neoadjuvant therapy, with better performance than a conventional enzyme-linked immunosorbent assay. The nPES assay can be easily refined for clinical use, and readily adapted for diagnosis and monitoring of other conditions with disease-specific EV biomarkers. PMID:28791195
An Evaluation of Aircraft Emissions Inventory Methodology by Comparisons with Reported Airline Data
NASA Technical Reports Server (NTRS)
Daggett, D. L.; Sutkus, D. J.; DuBois, D. P.; Baughcum, S. L.
1999-01-01
This report provides results of work done to evaluate the calculation methodology used in generating aircraft emissions inventories. Results from the inventory calculation methodology are compared to actual fuel consumption data. Results are also presented that show the sensitivity of calculated emissions to aircraft payload factors. Comparisons of departures made, ground track miles flown and total fuel consumed by selected air carriers were made between U.S. Dept. of Transportation (DOT) Form 41 data reported for 1992 and results of simplified aircraft emissions inventory calculations. These comparisons provide an indication of the magnitude of error that may be present in aircraft emissions inventories. To determine some of the factors responsible for the errors quantified in the DOT Form 41 analysis, a comparative study of in-flight fuel flow data for a specific operator's 747-400 fleet was conducted. Fuel consumption differences between the studied aircraft and the inventory calculation results may be attributable to several factors. Among these are longer flight times, greater actual aircraft weight and performance deterioration effects for the in-service aircraft. Results of a parametric study on the variation in fuel use and NOx emissions as a function of aircraft payload for different aircraft types are also presented.
Automatic Identification and Quantification of Extra-Well Fluorescence in Microarray Images.
Rivera, Robert; Wang, Jie; Yu, Xiaobo; Demirkan, Gokhan; Hopper, Marika; Bian, Xiaofang; Tahsin, Tasnia; Magee, D Mitchell; Qiu, Ji; LaBaer, Joshua; Wallstrom, Garrick
2017-11-03
In recent studies involving NAPPA microarrays, extra-well fluorescence is used as a key measure for identifying disease biomarkers because there is evidence to support that it is better correlated with strong antibody responses than statistical analysis involving intraspot intensity. Because this feature is not well quantified by traditional image analysis software, identification and quantification of extra-well fluorescence is performed manually, which is both time-consuming and highly susceptible to variation between raters. A system that could automate this task efficiently and effectively would greatly improve the process of data acquisition in microarray studies, thereby accelerating the discovery of disease biomarkers. In this study, we experimented with different machine learning methods, as well as novel heuristics, for identifying spots exhibiting extra-well fluorescence (rings) in microarray images and assigning each ring a grade of 1-5 based on its intensity and morphology. The sensitivity of our final system for identifying rings was found to be 72% at 99% specificity and 98% at 92% specificity. Our system performs this task significantly faster than a human, while maintaining high performance, and therefore represents a valuable tool for microarray image analysis.
Nursing protects honeybee larvae from secondary metabolites of pollen.
Lucchetti, Matteo A; Kilchenmann, Verena; Glauser, Gaetan; Praz, Christophe; Kast, Christina
2018-03-28
The pollen of many plants contains toxic secondary compounds, sometimes in concentrations higher than those found in the flowers or leaves. The ecological significance of these compounds remains unclear, and their impact on bees is largely unexplored. Here, we studied the impact of pyrrolizidine alkaloids (PAs) found in the pollen of Echium vulgare on honeybee adults and larvae. Echimidine, a PA present in E. vulgare pollen, was isolated and added to the honeybee diets in order to perform toxicity bioassays. While adult bees showed relatively high tolerance to PAs, larvae were much more sensitive. In contrast to other bees, the honeybee larval diet typically contains only traces of pollen and consists predominantly of hypopharyngeal and mandibular secretions produced by nurse bees, which feed on large quantities of pollen-containing bee bread. We quantified the transfer of PAs to nursing secretions produced by bees that had previously consumed bee bread supplemented with PAs. The PA concentration in these secretions was reduced by three orders of magnitude as compared to the PA content in the nurse diet and was well below the toxicity threshold for larvae. Our results suggest that larval nursing protects honeybee larvae from the toxic effect of secondary metabolites of pollen. © 2018 The Authors.
Martins, Cristina; Moreira da Silva, Nadia; Silva, Guilherme; Rozanski, Verena E; Silva Cunha, Joao Paulo
2016-08-01
Hippocampal sclerosis (HS) is the most common cause of temporal lobe epilepsy (TLE) and can be identified in magnetic resonance imaging as hippocampal atrophy and subsequent volume loss. Detecting this kind of abnormality through simple radiological assessment can be difficult, even for experienced radiologists. For that reason, hippocampal volumetry is generally used to support this kind of diagnosis. Manual volumetry is the traditional approach, but it is time-consuming and requires the physician to be familiar with neuroimaging software tools. In this paper, we propose an automated method, written as a script that uses FSL-FIRST, to perform hippocampal segmentation and compute a hippocampal asymmetry index (HAI). We compared the automated detection of HS (left or right) based on the HAI with the agreement of two experts in a group of 19 patients and 15 controls, achieving 84.2% sensitivity, 86.7% specificity and a Cohen's kappa coefficient of 0.704. The proposed method is integrated in the "Advanced Brain Imaging Lab" (ABrIL) cloud neurocomputing platform. The automated procedure is, on average, 77% faster than manual volumetric segmentation performed by an experienced physician.
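A sketch of a volume-based asymmetry index computed from left and right hippocampal volumes (for example, from FSL-FIRST segmentations) is given below; the abstract does not give the exact definition of the authors' HAI, so the normalized difference and the cut-off used here are assumptions shown only to illustrate the idea.

```python
# Sketch of a volume-based asymmetry index from left/right hippocampal volumes (e.g.
# FSL-FIRST segmentations). The abstract does not give the exact HAI definition, so the
# normalized difference and the 0.05 cut-off below are assumptions for illustration.
def asymmetry_index(left_mm3: float, right_mm3: float) -> float:
    return (left_mm3 - right_mm3) / ((left_mm3 + right_mm3) / 2.0)

left, right = 2650.0, 3480.0          # hypothetical volumes, mm^3
ai = asymmetry_index(left, right)
if abs(ai) > 0.05:                    # arbitrary illustrative threshold
    side = "left" if ai < 0 else "right"
    print(f"asymmetry index = {ai:.2f}: suggests {side}-sided volume loss")
else:
    print(f"asymmetry index = {ai:.2f}: within the symmetric range")
```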
Liang, Li-Guo; Kong, Meng-Qi; Zhou, Sherry; Sheng, Ye-Feng; Wang, Ping; Yu, Tao; Inci, Fatih; Kuo, Winston Patrick; Li, Lan-Juan; Demirci, Utkan; Wang, ShuQi
2017-01-01
Extracellular vesicles (EVs), including exosomes and microvesicles, are present in a variety of bodily fluids, and the concentration of these sub-cellular vesicles and their associated biomarkers (proteins, nucleic acids, and lipids) can be used to aid clinical diagnosis. Although ultracentrifugation is commonly used for isolation of EVs, it is highly time-consuming, labor-intensive and instrument-dependent for both research laboratories and clinical settings. Here, we developed an integrated double-filtration microfluidic device that isolated and enriched EVs with a size range of 30–200 nm from urine, and subsequently quantified the EVs via a microchip ELISA. Our results showed that the concentration of urinary EVs was significantly elevated in bladder cancer patients (n = 16) compared to healthy controls (n = 8). Receiver operating characteristic (ROC) analysis demonstrated that this integrated EV double-filtration device had a sensitivity of 81.3% at a specificity of 90% (16 bladder cancer patients and 8 healthy controls). Thus, this integrated device has great potential to be used in conjunction with urine cytology and cystoscopy to improve clinical diagnosis of bladder cancer in clinics and at point-of-care (POC) settings. PMID:28436447
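A minimal sketch of how "sensitivity at 90% specificity" is read off a receiver operating characteristic curve is shown below, using synthetic scores in place of the study's EV measurements.

```python
# Sketch of how "sensitivity at 90% specificity" is read off a ROC curve, using
# synthetic EV-concentration scores in place of the study's patient data.
import numpy as np
from sklearn.metrics import roc_curve

rng = np.random.default_rng(7)
y_true  = np.r_[np.ones(16), np.zeros(8)]                    # 16 patients, 8 controls
y_score = np.r_[rng.normal(2.0, 1.0, 16), rng.normal(0.0, 1.0, 8)]

fpr, tpr, _ = roc_curve(y_true, y_score)
at_spec = (1.0 - fpr) >= 0.90                                # specificity constraint
print(f"sensitivity at >=90% specificity: {tpr[at_spec].max():.2f}")
```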
John Butnor; Brian Roth; Kurt Johnsen
2005-01-01
Tree root systems are commonly evaluated via labor intensive, destructive, time-consuming excavations. Ground-penetrating radar (GPR) can be used to detect and monitor roots if there is sufficient electromagnetic contrast with the surrounding soil matrix. This methodology is commonly used in civil engineering for non-destructive testing of concrete as well as road and...
Scott F. Pearson; Douglas J. Levey; Cathryn H. Greenberg; Carlos Martinez del Rio
2003-01-01
The use of stable isotopes to infer diet requires quantifying the relationship between diet and tissues and, in particular, knowing of how quickly isotopes turnover in different tissues and how isotopic concentrations of different food components change (discriminate) when incorporated into consumer tissues. We used feeding trials with wild-caught yellow-rumped...
2012-01-01
The electric grid in the United States has been suffering from underinvestment for years, and now faces pressing challenges from rising demand and deteriorating infrastructure. High congestion levels in transmission lines are greatly reducing the efficiency of electricity generation and distribution. In this paper, we assess the faults of the current electric grid and quantify the costs of maintaining the current system into the future. While the proposed “smart grid” contains many proposals to upgrade the ailing infrastructure of the electric grid, we argue that smart meter installation in each U.S. household will offer a significant reduction in peak demand on the current system. A smart meter is a device which monitors a household’s electricity consumption in real-time, and has the ability to display real-time pricing in each household. We conclude that these devices will provide short-term and long-term benefits to utilities and consumers. The smart meter will enable utilities to closely monitor electricity consumption in real-time, while also allowing households to adjust electricity consumption in response to real-time price adjustments. PMID:22540990
Bertram, S. M.; Bowen, M.; Kyle, M.; Schade, J. D.
2008-01-01
Heterotrophic organisms must obtain essential elements in sufficient quantities from their food. Because plants naturally exhibit extensive variation in their elemental content, it is important to quantify the within-species stoichiometric variation of consumers. If extensive stoichiometric variation exists, it may help explain consumer variation in life-history strategy and fitness. To date, however, research on stoichiometric variation has focused on interspecific differences and assumed minimal intraspecific differences. Here this assumption is tested. Natural variation is quantified in body stoichiometry of two terrestrial insects: the generalist field cricket, Gryllus texensis Cade and Otte (Orthoptera: Gryllidae) and a specialist curculionid weevil, Sabinia setosa (Le Conte) (Coleoptera: Curculionidae). Both species exhibited extensive intraspecific stoichiometric variation. Cricket body nitrogen content ranged from 8–12% and there was a four-fold difference in body phosphorus content, ranging from 0.32–1.27%. Body size explained half this stoichiometric variation, with larger individuals containing less nitrogen and phosphorus. Weevils exhibited an almost three-fold difference in body phosphorus content, ranging from 0.38–0.97%. Overall, the variation observed within each of these species is comparable to the variation previously observed across almost all terrestrial insect species. PMID:20298114
Individuality in nutritional preferences: a multi-level approach in field crickets.
Han, Chang S; Jäger, Heidi Y; Dingemanse, Niels J
2016-06-30
Selection may favour individuals of the same population to differ consistently in nutritional preference, for example, because optimal diets covary with morphology or personality. We provided Southern field crickets (Gryllus bimaculatus) with two synthetic food sources (carbohydrates and proteins) and quantified repeatedly how much of each macronutrient was consumed by each individual. We then quantified (i) whether individuals were repeatable in carbohydrate and protein intake rate, (ii) whether an individual's average daily intake of carbohydrates was correlated with its average daily intake of protein, and (iii) whether short-term changes in intake of carbohydrates coincided with changes in intake of protein within individuals. Intake rates were individually repeatable for both macronutrients. However, individuals differed in their relative daily intake of carbohydrates versus proteins (i.e., 'nutritional preference'). By contrast, total consumption varied plastically as a function of body weight within individuals. Body weight-but not personality (i.e., aggression, exploration behaviour)-positively predicted nutritional preference at the individual level as large crickets repeatedly consumed a higher carbohydrate to protein ratio compared to small ones. Our finding of level-specific associations between the consumption of distinct nutritional components demonstrates the merit of applying multivariate and multi-level viewpoints to the study of nutritional preference.
Context-Sensitive Spelling Correction of Consumer-Generated Content on Health Care
Chen, Rudan; Zhao, Xianyang; Xu, Wei; Cheng, Wenqing; Lin, Simon
2015-01-01
Background Consumer-generated content, such as postings on social media websites, can serve as an ideal source of information for studying health care from a consumer’s perspective. However, consumer-generated content on health care topics often contains spelling errors, which, if not corrected, will be obstacles for downstream computer-based text analysis. Objective In this study, we proposed a framework with a spelling correction system designed for consumer-generated content and a novel ontology-based evaluation system which was used to efficiently assess the correction quality. Additionally, we emphasized the importance of context sensitivity in the correction process, and demonstrated why correction methods designed for electronic medical records (EMRs) failed to perform well with consumer-generated content. Methods First, we developed our spelling correction system based on Google Spell Checker. The system processed postings acquired from MedHelp, a biomedical bulletin board system (BBS), and saved misspelled words (eg, sertaline) and corresponding corrected words (eg, sertraline) into two separate sets. Second, to reduce the number of words needing manual examination in the evaluation process, we respectively matched the words in the two sets with terms in two biomedical ontologies: RxNorm and Systematized Nomenclature of Medicine -- Clinical Terms (SNOMED CT). The ratio of words which could be matched and appropriately corrected was used to evaluate the correction system’s overall performance. Third, we categorized the misspelled words according to the types of spelling errors. Finally, we calculated the ratio of abbreviations in the postings, which remarkably differed between EMRs and consumer-generated content and could largely influence the overall performance of spelling checkers. Results An uncorrected word and the corresponding corrected word was called a spelling pair, and the two words in the spelling pair were its members. In our study, there were 271 spelling pairs detected, among which 58 (21.4%) pairs had one or two members matched in the selected ontologies. The ratio of appropriate correction in the 271 overall spelling errors was 85.2% (231/271). The ratio of that in the 58 spelling pairs was 86% (50/58), close to the overall ratio. We also found that linguistic errors took up 31.4% (85/271) of all errors detected, and only 0.98% (210/21,358) of words in the postings were abbreviations, which was much lower than the ratio in the EMRs (33.6%). Conclusions We conclude that our system can accurately correct spelling errors in consumer-generated content. Context sensitivity is indispensable in the correction process. Additionally, it can be confirmed that consumer-generated content differs from EMRs in that consumers seldom use abbreviations. Also, the evaluation method, taking advantage of biomedical ontology, can effectively estimate the accuracy of the correction system and reduce manual examination time. PMID:26232246
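For contrast with the context-sensitive approach described above, a minimal context-free dictionary lookup can be sketched as a baseline; the tiny drug lexicon is a placeholder, and this is exactly the kind of correction the authors argue is insufficient on its own for consumer-generated text.

```python
# Context-free dictionary-lookup baseline, for contrast only: this is the kind of
# correction the authors argue is insufficient on its own for consumer-generated text.
# The tiny drug lexicon is a placeholder.
import difflib

lexicon = ["sertraline", "serotonin", "cetirizine", "certain"]
misspelling = "sertaline"

candidates = difflib.get_close_matches(misspelling, lexicon, n=3, cutoff=0.7)
print(candidates)   # a context-sensitive system would rerank these using surrounding words
```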
SCIENTIFIC UNCERTAINTIES IN ATMOSPHERIC MERCURY MODELS II: SENSITIVITY ANALYSIS IN THE CONUS DOMAIN
In this study, we present the response of model results to different scientific treatments in an effort to quantify the uncertainties caused by the incomplete understanding of mercury science and by model assumptions in atmospheric mercury models. Two sets of sensitivity simulati...
Pitt, Tracy J; Becker, Allan B; Chan-Yeung, Moira; Chan, Edmond S; Watson, Wade T A; Chooniedass, Rishma; Azad, Meghan B
2018-02-01
Recent trials have shown that avoiding peanuts during infancy increases the risk of peanut allergy; however, these studies did not address maternal peanut consumption. We sought to investigate the relationship between maternal peanut consumption while breast-feeding, timing of direct peanut introduction, and peanut sensitization at age 7 years. Secondary analysis of a nested cohort within the 1995 Canadian Asthma Primary Prevention Study intervention study was performed. Breast-feeding and maternal and infant peanut consumption were captured by repeated questionnaires during infancy. Skin prick testing for peanut sensitization was performed at age 7 years. Overall, 58.2% of mothers consumed peanuts while breast-feeding and 22.5% directly introduced peanuts to their infant by 12 months. At 7 years, 9.4% of children were sensitized to peanuts. The lowest incidence (1.7%) was observed among children whose mothers consumed peanuts while breast-feeding and directly introduced peanuts before 12 months. Incidence was significantly higher (P < .05) if mothers consumed peanuts while breast-feeding but delayed introducing peanuts to their infant beyond 12 months (15.1%), or if mothers avoided peanuts themselves but directly introduced peanuts by 12 months (17.6%). Interaction analyses controlling for study group and maternal atopy confirmed that maternal peanut consumption while breast-feeding and infant peanut consumption by 12 months were protective in combination, whereas either exposure in isolation was associated with an increased risk of sensitization (P interaction = .003). In this secondary analysis, maternal peanut consumption while breast-feeding paired with direct introduction of peanuts in the first year of life was associated with the lowest risk of peanut sensitization, compared with all other combinations of maternal and infant peanut consumption. Copyright © 2017 American Academy of Allergy, Asthma & Immunology. Published by Elsevier Inc. All rights reserved.
DIETARY SELENIUM PROTECTS AGAINST SELECTED SIGNS OF AGING AND METHYLMERCURY EXPOSURE
Banna, Kelly M.; Reed, Miranda N.; Pesek, Erin F.; Cole, Nathan; Li, Jun; Newland, M. Christopher
2010-01-01
Acute or short-term exposure to high doses of methylmercury (MeHg) causes a well-characterized syndrome that includes sensory and motor deficits. The environmental threat from MeHg, however, comes from chronic, low-level exposure, the consequences of which are poorly understood. Selenium (Se), an essential nutrient, both increases deposition of mercury (Hg) in neurons and mitigates some of MeHg's neurotoxicity in the short term, but it is unclear whether this deposition produces long-term adverse consequences. To investigate these issues, adult Long Evans rats were fed a diet containing 0.06 or 0.6 ppm of Se as sodium selenite. After 100 days on these diets, the subjects began consuming 0.0, 0.5, 5.0, or 15 ppm of Hg as methylmercuric chloride in their drinking water for 16 months. Somatosensory sensitivity, grip strength, hind-limb cross (clasping reflex), flexion, and voluntary wheel-running in overnight sessions were among the measures examined. MeHg caused a dose- and time-dependent impairment in all measures. No effects appeared in rats consuming 0 or 0.5 ppm of Hg. Somatosensory function, grip strength, and flexion were among the earliest signs of exposure. Selenium significantly delayed or blunted MeHg's effects. Selenium also increased running in unexposed animals as they aged, a novel finding that may have important clinical implications. Nerve pathology studies revealed axonal atrophy or mild degeneration in peripheral nerve fibers, which is consistent with abnormal sensorimotor function in chronic MeHg neurotoxicity. Lidocaine challenge reproduced the somatosensory deficits but not hind-limb cross or flexion. Together, these results quantify the neurotoxicity of long-term MeHg exposure, support the safety and efficacy of Se in ameliorating MeHg's neurotoxicity, and demonstrate the potential benefits of Se during aging. PMID:20079371
Yi, Andy Xianliang; Leung, Kenneth M Y; Lam, Michael H W; Lee, Jae-Seong; Giesy, John P
2012-11-01
The state of scientific knowledge regarding analytical methods, environmental fate, ecotoxicity and ecological risk of triphenyltin (TPT) compounds in marine ecosystems as well as their exposure and health hazard to humans was reviewed. Since the 1960s, TPT compounds have been commonly applied as biocides for diverse industrial and agricultural purposes. For instance, they are used as active ingredients in antifouling systems on marine vessels and mariculture facilities, and as fungicides in agriculture. Due to their intensive use, contamination of coastal waters by TPT and its products of transformation has become a worldwide problem. The proportion of quantified TPT to total phenyltin compounds in the marine environment provides evidence that TPT is photodegradable in water and sediment but resistant to biotransformation. Concentrations of TPT in marine biota are consistently greater than concentrations in water and sediment, which implies potential of TPT to bioaccumulate. TPT is toxic to both marine plants and animals. The predicted no effect concentration (PNEC) for TPT, as determined by use of the species sensitivity distribution approach, is 0.64 ng L(-1). In some parts of the world, concentrations of TPT in seawater exceed the PNEC, indicating that TPT can pose risks to marine life. Although there is negligible risk of TPT to average human consumers, TPT has been detected in blood of Finnish people and the concentration was greater in fishermen who ate more seafood. It is, therefore, advocated to initiate regular monitoring of TPT in blood and breast milk of populations that consume greater amounts of seafood. Copyright © 2012 Elsevier Ltd. All rights reserved.
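The PNEC of 0.64 ng L(-1) quoted above comes from a species sensitivity distribution (SSD). The snippet below is a minimal sketch of the general approach, assuming a log-normal SSD fitted to species-level endpoints and an HC5 divided by an assessment factor; the toxicity values and assessment factor are hypothetical placeholders, not the TPT data used in the review.

```python
# Sketch: deriving an HC5/PNEC from a species sensitivity distribution (SSD).
# Endpoints and assessment factor below are hypothetical, not the TPT dataset.
import numpy as np
from scipy import stats

# Chronic endpoints for different marine species (ng/L), hypothetical
endpoints_ng_per_L = np.array([3.2, 5.8, 11.0, 24.0, 55.0, 120.0, 310.0, 800.0])

log_vals = np.log10(endpoints_ng_per_L)
mu, sigma = stats.norm.fit(log_vals)          # log-normal SSD via normal fit on log10 data

hc5 = 10 ** stats.norm.ppf(0.05, mu, sigma)   # concentration protecting 95% of species
assessment_factor = 5.0                       # illustrative; depends on data quality
pnec = hc5 / assessment_factor

print(f"HC5 = {hc5:.2f} ng/L, PNEC = {pnec:.2f} ng/L")
```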
Gaeta, Jereme W; Ahrenstorff, Tyler D; Diana, James S; Fetzer, William W; Jones, Thomas S; Lawson, Zach J; McInerny, Michael C; Santucci, Victor J; Vander Zanden, M Jake
2018-01-01
Body size governs predator-prey interactions, which in turn structure populations, communities, and food webs. Understanding predator-prey size relationships is valuable from a theoretical perspective, in basic research, and for management applications. However, predator-prey size data are limited and costly to acquire. We quantified predator-prey total length and mass relationships for several freshwater piscivorous taxa: crappie (Pomoxis spp.), largemouth bass (Micropterus salmoides), muskellunge (Esox masquinongy), northern pike (Esox lucius), rock bass (Ambloplites rupestris), smallmouth bass (Micropterus dolomieu), and walleye (Sander vitreus). The range of prey total lengths increased with predator total length. The median and maximum ingested prey total length varied with predator taxon and length, but generally ranged from 10-20% and 32-46% of predator total length, respectively. Predators tended to consume larger fusiform prey than laterally compressed prey. With the exception of large muskellunge, predators most commonly consumed prey between 16 and 73 mm. A sensitivity analysis indicated estimates can be very accurate at sample sizes greater than 1,000 diet items and fairly accurate at sample sizes greater than 100. However, sample sizes less than 50 should be evaluated with caution. Furthermore, median log10 predator-prey body mass ratios ranged from 1.9-2.5, nearly 50% lower than values previously reported for freshwater fishes. Managers, researchers, and modelers could use our findings as a tool for numerous predator-prey evaluations from stocking size optimization to individual-based bioenergetics analyses identifying prey size structure. To this end, we have developed a web-based user interface to maximize the utility of our models that can be found at www.LakeEcologyLab.org/pred_prey.
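The reported envelopes (median ingested prey length roughly 10-20% of predator total length, maxima 32-46%, and median log10 predator-prey body mass ratios of 1.9-2.5) lend themselves to quick screening calculations. The sketch below applies those ranges directly; it is only a rough illustration, and the taxon-specific fitted models on the authors' web interface should be preferred for real applications.

```python
# Rough screening of prey sizes from predator total length, using the ranges
# reported in the abstract (10-20% median, 32-46% maximum of predator TL) and
# median log10 predator-prey body mass ratios of 1.9-2.5. Illustrative only.
def prey_length_envelope(predator_tl_mm: float) -> dict:
    return {
        "median_prey_tl_mm": (0.10 * predator_tl_mm, 0.20 * predator_tl_mm),
        "max_prey_tl_mm": (0.32 * predator_tl_mm, 0.46 * predator_tl_mm),
    }

def prey_mass_from_ratio(predator_mass_g: float, log10_ratio: float) -> float:
    # log10(predator mass / prey mass) = ratio  =>  prey mass = predator mass / 10**ratio
    return predator_mass_g / (10 ** log10_ratio)

print(prey_length_envelope(400.0))                                    # e.g., a 400-mm predator
print(prey_mass_from_ratio(800.0, 1.9), prey_mass_from_ratio(800.0, 2.5))
```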
Hapten-specific lymphocyte transformation in humans sensitized with NDMA or DNCB.
Soeberg, B; Andersen, V
1976-01-01
The primary immune response to a contact sensitizing dose of para-N-dimethylnitrosaniline (NDMA) and dinitrochlorobenzene (DNCB) was obtained in humans and measured in vitro by increased thymidine incorporation into sensitized lymphocytes. No cross-reaction was found between these two haptens, and it is thus possible on two separate occasions to quantify and follow the primary cellular immune response in man. PMID:963911
Consumers limit the abundance and dynamics of a perennial shrub with a seed bank
Kauffman, M.J.; Maron, J.L.
2006-01-01
For nearly 30 years, ecologists have argued that predators of seeds and seedlings seldom have population-level effects on plants with persistent seed banks and density-dependent seedling survival. We parameterized stage-based population models that incorporated density dependence and seed dormancy with data from a 5.5-year experiment that quantified how granivorous mice and herbivorous voles influence bush lupine (Lupinus arboreus) demography. We asked how seed dormancy and density-dependent seedling survival mediate the impacts of these consumers in dune and grassland habitats. In dune habitat, mice reduced analytical λ (the intrinsic rate of population growth) by 39%, the equilibrium number of above-ground plants by 90%, and the seed bank by 98%; voles had minimal effects. In adjacent grasslands, mice had minimal effects, but seedling herbivory by voles reduced analytical λ by 15% and reduced both the equilibrium number of aboveground plants and dormant seeds by 63%. A bootstrap analysis demonstrated that these consumer effects were robust to parameter uncertainty. Our results demonstrate that the quantitative strengths of seed dormancy and density-dependent seedling survival, not their mere existence, critically mediate consumer effects. This study suggests that plant population dynamics and distribution may be more strongly influenced by consumers of seeds and seedlings than is currently recognized. © 2006 by The University of Chicago.
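The "analytical λ" values reported here are the dominant eigenvalue of a stage-based projection matrix. A minimal sketch of that calculation with a hypothetical three-stage (seed bank, seedling, adult) matrix; the entries are placeholders, not the parameterized lupine model.

```python
# Sketch: analytical lambda (intrinsic rate of population growth) as the dominant
# eigenvalue of a stage-structured projection matrix. Entries are hypothetical.
import numpy as np

# Stages: seed bank, seedling, adult
A = np.array([
    [0.60, 0.00, 40.0],    # seed survival in the bank; fecundity into the seed bank
    [0.05, 0.00,  0.0],    # germination to seedling
    [0.00, 0.30, 0.85],    # seedling establishment; adult survival
])

eigvals = np.linalg.eigvals(A)
# For a non-negative matrix the spectral radius is itself an eigenvalue (Perron-Frobenius),
# so the dominant eigenvalue equals the largest eigenvalue magnitude.
lam = float(np.max(np.abs(eigvals)))
print(f"lambda = {lam:.3f}")                       # lambda < 1 implies a declining population

# Arithmetic illustration of a consumer that reduces lambda by 39%, as reported for mice:
print(f"lambda with consumer = {lam * (1 - 0.39):.3f}")
```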
Costs and benefits of communicating product safety information to the public via the Internet.
Saoutert, Erwan; Andreasen, Ina
2006-04-01
Procter & Gamble (P&G) developed Science-in-the-Box (SIB; www.scienceinthebox.com) after discussions with their stakeholders as to how the consumer products company could better communicate key environmental performance and safety information to the public. A series of workshops enabled P&G to understand that consumers and other key business decision makers wanted meaningful information about the science behind P&G products. In addition, it was clear that making such information available would produce business benefits by encouraging long-term relationships with decision makers ranging from consumers and retailers to policy makers and nongovernmental organizations. These benefits were not necessarily quantifiable in the short term, but they still had to be balanced by the costs in terms of resource commitment and potential intellectual property issues. Since its inception in September 2002, SIB has successfully reached key target audiences and built improved credibility and confidence in P&G products and approaches. The website is now available in English, French, Spanish, German, and Italian and is used by consumers, journalists, teachers, scientists, and policy makers. Several user surveys carried out during the initial developmental period, together with unsolicited e-mail feedback, have demonstrated that SIB has successfully created a platform for continuous dialogue with consumers and other interested parties.
Maier, Michelle A.; Uchii, Kimiko; Peterson, Tawnya D.
2016-01-01
Lethal parasitism of large phytoplankton by chytrids (microscopic zoosporic fungi) may play an important role in organic matter and nutrient cycling in aquatic environments by shunting carbon away from hosts and into much smaller zoospores, which are more readily consumed by zooplankton. This pathway provides a mechanism to more efficiently retain carbon within food webs and reduce export losses. However, challenges in accurate identification and quantification of chytrids have prevented a robust assessment of the relative importance of parasitism for carbon and energy flows within aquatic systems. The use of molecular techniques has greatly advanced our ability to detect small, nondescript microorganisms in aquatic environments in recent years, including chytrids. We used quantitative PCR (qPCR) to quantify the consumption of zoospores by Daphnia in laboratory experiments using a culture-based comparative threshold cycle (CT) method. We successfully quantified the reduction of zoospores in water samples during Daphnia grazing and confirmed the presence of chytrid DNA inside the daphnid gut. We demonstrate that comparative CT qPCR is a robust and effective method to quantify zoospores and evaluate zoospore grazing by zooplankton and will aid in better understanding how chytrids contribute to organic matter cycling and trophic energy transfer within food webs. IMPORTANCE The study of aquatic fungi is often complicated by the fact that they possess complex life cycles that include a variety of morphological forms. Studies that rely on morphological characteristics to quantify the abundances of all stages of the fungal life cycle face the challenge of correctly identifying and enumerating the nondescript zoospores. These zoospores, however, provide an important trophic link between large colonial phytoplankton and zooplankton: that is, once the carbon is liberated from phytoplankton into the parasitic zoospores, the latter are consumed by zooplankton and carbon is retained in the aquatic food web rather than exported from the system. This study provides a tool to quantify zoospores and evaluate the consumption of zoospores by zooplankton in order to further our understanding of their role in food web dynamics. PMID:27107112
Maier, Michelle A; Uchii, Kimiko; Peterson, Tawnya D; Kagami, Maiko
2016-07-01
Lethal parasitism of large phytoplankton by chytrids (microscopic zoosporic fungi) may play an important role in organic matter and nutrient cycling in aquatic environments by shunting carbon away from hosts and into much smaller zoospores, which are more readily consumed by zooplankton. This pathway provides a mechanism to more efficiently retain carbon within food webs and reduce export losses. However, challenges in accurate identification and quantification of chytrids have prevented a robust assessment of the relative importance of parasitism for carbon and energy flows within aquatic systems. The use of molecular techniques has greatly advanced our ability to detect small, nondescript microorganisms in aquatic environments in recent years, including chytrids. We used quantitative PCR (qPCR) to quantify the consumption of zoospores by Daphnia in laboratory experiments using a culture-based comparative threshold cycle (CT) method. We successfully quantified the reduction of zoospores in water samples during Daphnia grazing and confirmed the presence of chytrid DNA inside the daphnid gut. We demonstrate that comparative CT qPCR is a robust and effective method to quantify zoospores and evaluate zoospore grazing by zooplankton and will aid in better understanding how chytrids contribute to organic matter cycling and trophic energy transfer within food webs. The study of aquatic fungi is often complicated by the fact that they possess complex life cycles that include a variety of morphological forms. Studies that rely on morphological characteristics to quantify the abundances of all stages of the fungal life cycle face the challenge of correctly identifying and enumerating the nondescript zoospores. These zoospores, however, provide an important trophic link between large colonial phytoplankton and zooplankton: that is, once the carbon is liberated from phytoplankton into the parasitic zoospores, the latter are consumed by zooplankton and carbon is retained in the aquatic food web rather than exported from the system. This study provides a tool to quantify zoospores and evaluate the consumption of zoospores by zooplankton in order to further our understanding of their role in food web dynamics. Copyright © 2016, American Society for Microbiology. All Rights Reserved.
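The comparative threshold cycle (CT) approach converts differences in CT between samples into relative zoospore abundance via a standard-curve calibration. The snippet below illustrates only the arithmetic, assuming a hypothetical standard curve and an amplification efficiency near 100%; it does not reproduce the paper's calibration design, and all CT values are invented.

```python
# Sketch: comparative-CT style quantification of zoospores before and after grazing.
# Assumes an amplification efficiency E (E = 1.0 means perfect doubling per cycle)
# and a hypothetical standard curve relating CT to log10(zoospores).
def ct_to_log10_copies(ct, slope=-3.32, intercept=38.0):
    """Standard-curve conversion: CT = slope * log10(N) + intercept (hypothetical fit)."""
    return (ct - intercept) / slope

def fold_change(ct_treated, ct_control, efficiency=1.0):
    """Relative target abundance in treated vs control via (1 + E) ** (CT_control - CT_treated)."""
    return (1.0 + efficiency) ** (ct_control - ct_treated)

ct_before_grazing = 24.1   # water sample before Daphnia grazing (invented)
ct_after_grazing = 26.8    # water sample after grazing (invented)

print("log10 zoospores before:", round(ct_to_log10_copies(ct_before_grazing), 2))
print("log10 zoospores after: ", round(ct_to_log10_copies(ct_after_grazing), 2))
print("fraction remaining:    ", round(fold_change(ct_after_grazing, ct_before_grazing), 2))
```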
Hattingh, Hendrika Laetitia; Knox, Kathy; Fejzic, Jasmina; McConnell, Denise; Fowler, Jane L; Mey, Amary; Kelly, Fiona; Wheeler, Amanda J
2015-02-01
The study aims to explore within the community pharmacy practice context the views of mental health stakeholders on: (1) current and past experiences of privacy, confidentiality and support; and (2) expectations and needs in relation to privacy and confidentiality. In-depth interviews and focus groups were conducted in three states in Australia, namely Queensland, the northern region of New South Wales and Western Australia, between December 2011 and March 2012. There were 98 participants consisting of consumers and carers (n = 74), health professionals (n = 13) and representatives from consumer organisations (n = 11). Participants highlighted a need for improved staff awareness. Consumers indicated a desire to receive information in a way that respects their privacy and confidentiality, in an appropriate space. Areas identified that require improved protection of privacy and confidentiality during pharmacy interactions were the number of staff having access to sensitive information, workflow models causing information exposure and pharmacies' layout not facilitating private discussions. Challenges experienced by carers created feelings of isolation which could impact on care. This study explored mental health stakeholders' experiences and expectations regarding privacy and confidentiality in the Australian community pharmacy context. A need for better pharmacy staff training about the importance of privacy and confidentiality and strategies to enhance compliance with national pharmacy practice requirements was identified. Findings provided insight into privacy and confidentiality needs and will assist in the development of pharmacy staff training material to better support consumers with sensitive conditions. © 2014 Royal Pharmaceutical Society.
Eradication of invasive Tamarix ramosissima along a desert stream increases native fish density
Kennedy, T.A.; Finlay, J.C.; Hobbie, S.E.
2005-01-01
Spring ecosystems of the western United States have high conservation value, particularly because of the highly endemic, and often endangered, fauna that they support. Refuges now protect these habitats from many of the human impacts that once threatened them, but invasive species often persist. Invasive saltcedar is ubiquitous along streams, rivers, and spring ecosystems of the western United States, yet the impact of saltcedar invasion on these ecosystems, or ecosystem response to its removal, has rarely been quantified. Along Jackrabbit Spring, a springbrook in Nevada that supports populations of two endangered fish (Ash Meadows pupfish and Ash Meadows speckled dace) as well as several exotic aquatic consumers, we quantified the response of aquatic consumers to large-scale saltcedar removal and identified the mechanism underlying consumer response to the removal. Clearing saltcedar from the riparian zone increased densities of native pupfish and exotic screw snails, but decreased the density of exotic crayfish. Positive effects of saltcedar removal on pupfish and snails occurred because saltcedar heavily shades the stream, greatly reducing the availability of algae for herbivores. This was confirmed by analyses of potential organic matter sources and consumer δ13C: pupfish and snails, along with native dace and exotic mosquitofish, relied heavily on algae-derived carbon and not saltcedar-derived carbon. By contrast, crayfish δ13C values mirrored algae δ13C during summer, but in winter indicated reliance on allochthonous saltcedar litter that dominated organic inputs in saltcedar reaches and on algae-derived carbon where saltcedar was absent. The seasonal use of saltcedar by crayfish likely explains its negative response to saltcedar removal. Clearing saltcedar effectively restored the springbrook of Jackrabbit Spring to the conditions characteristic of native vegetation sites. Given the high conservation value of spring ecosystems and the potential conservation benefits of saltcedar removal that this research highlights, eradicating saltcedar from spring ecosystems of the western United States should clearly be a management priority. © 2005 by the Ecological Society of America.
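Consumer reliance on algal versus saltcedar carbon can be illustrated with a standard two-source δ13C mixing model. The sketch below is a generic illustration of that calculation, assuming simple linear mixing and a fixed trophic fractionation; the end-member and consumer values are invented, not the Jackrabbit Spring measurements.

```python
# Sketch: two-source stable isotope mixing model for the fraction of consumer
# carbon derived from algae vs. saltcedar litter. All d13C values are invented;
# a fixed trophic fractionation is assumed.
def algal_fraction(d13c_consumer, d13c_algae, d13c_saltcedar, trophic_shift=0.5):
    """Fraction of carbon from algae in a two-source linear mixing model."""
    corrected = d13c_consumer - trophic_shift          # remove assumed trophic enrichment
    f = (corrected - d13c_saltcedar) / (d13c_algae - d13c_saltcedar)
    return min(max(f, 0.0), 1.0)                       # clamp to the feasible range

# Invented values (per mil); the sketch assumes the algal end-member is 13C-enriched
# relative to saltcedar litter so the two sources are distinguishable.
print(algal_fraction(d13c_consumer=-18.0, d13c_algae=-16.0, d13c_saltcedar=-27.0))
```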
Meyerding, Stephan G H
2016-08-01
In many studies, consumer preferences are determined by direct surveys, a method in which social desirability bias is problematic: participants answer in the way they perceive as desired by society, so the stated importance of certain features in these studies is not reflected in real purchasing decisions. Therefore, the aim of the study is to compare consumer preferences measured by a quasi-experiment to those quantified by direct questions. Another objective is to quantify the part-worth utilities of product characteristics such as origin, price and food labels. Part-worth utilities are estimated on an interval scale with an arbitrary origin and are a measure of preferences. The real purchasing situation was simulated in a quasi-experiment using a choice-based conjoint analysis. The part-worth utilities were then compared with the results of a conventional preference assessment (Likert scale). For this purpose, 645 consumers from all over Germany were surveyed in 2014. The participants were on average 44 years old and 63% were women. The results of the conjoint analysis show the highest part-worth utility (2.853) for the lowest price (1.49€), followed by the characteristic "grown locally" (2.157). For the labels, the German organic label shows the highest part-worth utility (0.785) followed by Fairtrade/"A heart for the producer" (0.200). It is noticeable that the carbon footprint labels have negative part-worth utilities compared to tomatoes without a label (-0.130 with CO2 indication, -0.186 without CO2 indication). The price is ranked 12th in the importance of the characteristics of purchasing tomatoes in the survey with a Likert scale, whereas it is first in the evaluation of the quasi-experiment (conjoint analysis), which supports the assumption of a social desirability bias. Copyright © 2016 Elsevier Ltd. All rights reserved.
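Part-worth utilities from a choice-based conjoint analysis translate into predicted choice shares through a multinomial logit rule. The brief sketch below uses the part-worths quoted in the abstract; the product profiles and the additive-utility assumption are illustrative, not the study's design.

```python
# Sketch: converting conjoint part-worth utilities into multinomial-logit choice
# shares for hypothetical tomato profiles. Part-worth values are those quoted in
# the abstract; the profiles themselves are illustrative.
import numpy as np

partworths = {
    "price_1.49": 2.853,
    "grown_locally": 2.157,
    "organic_label": 0.785,
    "fairtrade_label": 0.200,
    "co2_label_with_value": -0.130,
    "co2_label_no_value": -0.186,
}

profiles = {
    "local_organic":      ["price_1.49", "grown_locally", "organic_label"],
    "local_co2_label":    ["price_1.49", "grown_locally", "co2_label_with_value"],
    "imported_fairtrade": ["price_1.49", "fairtrade_label"],
}

utilities = np.array([sum(partworths[a] for a in attrs) for attrs in profiles.values()])
shares = np.exp(utilities) / np.exp(utilities).sum()   # multinomial logit choice rule

for name, share in zip(profiles, shares):
    print(f"{name}: {share:.2%}")
```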
Boonen, Lieke H H M; Donkers, Bas; Schut, Frederik T
2011-01-01
Context To effectively bargain about the price and quality of health services, health insurers need to successfully channel their enrollees. Little is known about consumer sensitivity to different channeling incentives. In particular, the impact of status quo bias, which is expected to differ between different provider types, can play a large role in insurers' channeling ability. Objective To examine consumer sensitivity to channeling strategies and to analyze the impact of status quo bias for different provider types. Data Sources/Study Design With a large-scale discrete choice experiment, we investigate the impact of channeling incentives on choices for pharmacies and general practitioners (GPs). Survey data were obtained among a representative Dutch household panel (n=2,500). Principal Findings Negative financial incentives have a two to three times larger impact on provider choice than positive ones. Positive financial incentives have a relatively small impact on GP choice, while the impact of qualitative incentives is relatively large. Status quo bias has a large impact on provider choice, which is more prominent in the case of GPs than in the case of pharmacies. Conclusion The large impact of the status quo bias makes channeling consumers away from their current providers a daunting task, particularly in the case of GPs. PMID:21029092
Boonen, Lieke H H M; Donkers, Bas; Schut, Frederik T
2011-04-01
To effectively bargain about the price and quality of health services, health insurers need to successfully channel their enrollees. Little is known about consumer sensitivity to different channeling incentives. In particular, the impact of status quo bias, which is expected to differ between different provider types, can play a large role in insurers' channeling ability. To examine consumer sensitivity to channeling strategies and to analyze the impact of status quo bias for different provider types. With a large-scale discrete choice experiment, we investigate the impact of channeling incentives on choices for pharmacies and general practitioners (GPs). Survey data were obtained among a representative Dutch household panel (n = 2,500). Negative financial incentives have a two to three times larger impact on provider choice than positive ones. Positive financial incentives have a relatively small impact on GP choice, while the impact of qualitative incentives is relatively large. Status quo bias has a large impact on provider choice, which is more prominent in the case of GPs than in the case of pharmacies. The large impact of the status quo bias makes channeling consumers away from their current providers a daunting task, particularly in the case of GPs. © Health Research and Educational Trust.
Blake, Stephen; Guézou, Anne; Deem, Sharon L.; Yackulic, Charles B.; Cabrera, Fredy
2015-01-01
The distribution of resources and food selection are fundamental to the ecology, life history, physiology, population dynamics, and conservation of animals. Introduced plants are changing foraging dynamics of herbivores in many ecosystems often with unknown consequences. Galapagos tortoises, like many herbivores, undertake migrations along elevation gradients driven by variability in vegetation productivity which take them into upland areas dominated by introduced plants. We sought to characterize diet composition of two species of Galapagos tortoises, focussing on how the role of introduced forage species changes over space and the implications for tortoise conservation. We quantified the distribution of tortoises with elevation using GPS telemetry. Along the elevation gradient, we quantified the abundance of introduced and native plant species, estimated diet composition by recording foods consumed by tortoises, and assessed tortoise physical condition from body weights and blood parameter values. Tortoises ranged between 0 and 429 m in elevation over which they consumed at least 64 plant species from 26 families, 44 percent of which were introduced species. Cover of introduced species and the proportion of introduced species in tortoise diets increased with elevation. Introduced species were positively selected for by tortoises at all elevations. Tortoise physical condition was either consistent or increased with elevation at the least biologically productive season on Galapagos. Santa Cruz tortoises are generalist herbivores that have adapted their feeding behavior to consume many introduced plant species that has likely made a positive contribution to tortoise nutrition. Some transformed habitats that contain an abundance of introduced forage species are compatible with tortoise conservation.
Quantifying uncertainty and sensitivity in sea ice models
DOE Office of Scientific and Technical Information (OSTI.GOV)
Urrego Blanco, Jorge Rolando; Hunke, Elizabeth Clare; Urban, Nathan Mark
The Los Alamos Sea Ice model has a number of input parameters for which accurate values are not always well established. We conduct a variance-based sensitivity analysis of hemispheric sea ice properties to 39 input parameters. The method accounts for non-linear and non-additive effects in the model.
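Variance-based (Sobol-type) sensitivity analysis apportions output variance among input parameters. Below is a compact, library-free sketch that estimates first-order indices for a toy model by conditioning on binned parameter values; the toy function stands in for the sea ice model and its 39 parameters, which are of course not reproduced here.

```python
# Sketch: first-order variance-based sensitivity indices, S_i = Var(E[Y|X_i]) / Var(Y),
# estimated by binning Monte Carlo samples. The toy model below is a stand-in, not
# the sea ice parameterization.
import numpy as np

rng = np.random.default_rng(0)
n, n_bins = 200_000, 40

# Three uniform inputs standing in for uncertain model parameters
X = rng.uniform(0.0, 1.0, size=(n, 3))
Y = 2.0 * X[:, 0] + 0.5 * X[:, 1] ** 2 + 0.1 * X[:, 2] + 0.3 * X[:, 0] * X[:, 1]

var_y = Y.var()
for i in range(X.shape[1]):
    edges = np.quantile(X[:, i], np.linspace(0, 1, n_bins + 1))
    idx = np.clip(np.digitize(X[:, i], edges[1:-1]), 0, n_bins - 1)
    cond_means = np.array([Y[idx == b].mean() for b in range(n_bins)])
    s_i = cond_means.var() / var_y     # variance over equal-probability bins approximates Var(E[Y|X_i])
    print(f"S_{i + 1} ~= {s_i:.2f}")
```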
Foxall, Gordon R; Oliveira-Castro, Jorge M; Schrezenmaier, Teresa C
2004-06-30
Purchasers of fast-moving consumer goods generally exhibit multi-brand choice, selecting apparently randomly among a small subset or "repertoire" of tried and trusted brands. Their behavior shows both matching and maximization, though it is not clear just what the majority of buyers are maximizing. Each brand attracts, however, a small percentage of consumers who are 100%-loyal to it during the period of observation. Some of these are exclusively buyers of premium-priced brands who are presumably maximizing informational reinforcement because their demand for the brand is relatively price-insensitive or inelastic. Others buy exclusively the cheapest brands available and can be assumed to maximize utilitarian reinforcement since their behavior is particularly price-sensitive or elastic. Between them are the majority of consumers whose multi-brand buying takes the form of selecting a mixture of economy- and premium-priced brands. Based on the analysis of buying patterns of 80 consumers for 9 product categories, the paper examines the continuum of consumers so defined and seeks to relate their buying behavior to the question of how and what consumers maximize.
NASA Astrophysics Data System (ADS)
Sut-Lohmann, Magdalena; Raab, Thomas
2017-04-01
Contaminated sites create a significant risk to human health, by poisoning drinking water, soil, air and as a consequence food. Continuous release of persistent iron-cyanide (Fe-CN) complexes from various industrial sources poses a high hazard to the environment and indicates the necessity to analyze considerable amount of samples. At the present time quantitative determination of Fe-CN concentration in soil usually requires a time consuming two step process: digestion of the sample (e.g., micro distillation system) and its analytical detection performed, e.g., by automated spectrophotometrical flow injection analysis (FIA). In order to determine the feasibility of diffuse reflectance infrared Fourier spectroscopy (DRIFTS) to quantify the Fe-CN complexes in soil matrix, 42 soil samples were collected (8 to 12.520 mg kg-1CN) indicating single symmetrical CN band in the range 2092 - 2084 cm-1. Partial least squares (PLS) calibration-validation model revealed IR response to CNtot exceeding 1268 mg kg-1 (limit of detection, LOD). Subsequently, leave-one-out cross-validation (LOO-CV) was performed on soil samples containing low CNtot (<900 mg kg-1), which improved the sensitivity of the model by reducing the LOD to 154 mg kg-1. Finally, the LOO-CV conducted on the samples with CNtot >900 mg kg-1 resulted in LOD equal to 3494 mg kg-1. Our results indicate that spectroscopic data in combination with PLS statistics can efficiently be used to predict Fe-CN concentrations in soil. We conclude that the protocol applied in this study can strongly reduce the time and costs essential for the spatial and vertical screening of the site affected by complexed Fe-CN.
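The calibration workflow described (PLS regression on DRIFTS spectra, leave-one-out cross-validation, and a limit of detection derived from the calibration statistics) can be sketched with scikit-learn. The spectra below are synthetic placeholders, and the 3.3 × residual-SD convention for the LOD is an assumption for illustration, not the authors' exact protocol.

```python
# Sketch: PLS calibration of total cyanide from IR spectra with leave-one-out
# cross-validation (LOO-CV). Synthetic spectra stand in for the DRIFTS data;
# the 3.3 * sigma LOD convention is an assumption, not the paper's exact method.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import LeaveOneOut, cross_val_predict

rng = np.random.default_rng(1)
n_samples, n_wavenumbers = 42, 300

cn_tot = rng.uniform(10, 12_500, n_samples)                       # mg/kg, synthetic
band = np.exp(-0.5 * ((np.arange(n_wavenumbers) - 150) / 8) ** 2) # synthetic CN band shape
spectra = (np.outer(cn_tot, band) / 12_500
           + 0.02 * rng.standard_normal((n_samples, n_wavenumbers)))

pls = PLSRegression(n_components=3)
y_cv = cross_val_predict(pls, spectra, cn_tot, cv=LeaveOneOut()).ravel()

residual_sd = np.std(cn_tot - y_cv, ddof=1)
lod = 3.3 * residual_sd                        # one common convention for a PLS-based LOD
print(f"LOO-CV residual SD = {residual_sd:.0f} mg/kg, approximate LOD = {lod:.0f} mg/kg")
```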
He, Yang; Al-Abed, Souhail R; Dionysiou, Dionysios D
2017-02-15
Carbon nanotubes (CNTs) have been incorporated into numerous consumer products, and have also been employed in various industrial areas because of their extraordinary properties. The large scale production and wide applications of CNTs make their release into the environment a major concern. Therefore, it is crucial to determine the degree of potential CNT contamination in the environment, which requires a sensitive and accurate technique for selectively detecting and quantifying CNTs in environmental matrices. In this study, a simple device based on utilizing heat generated/temperature increase from CNTs under microwave irradiation was built to quantify single-walled CNTs (SWCNTs), multi-walled CNTs (MWCNTs) and carboxylated CNTs (MWCNT-COOH) in three environmentally relevant matrices (sand, soil and sludge). Linear temperature vs CNT mass relationships were developed for the three environmental matrices spiked with known amounts of different types of CNTs that were then irradiated in a microwave at low energies (70-149W) for a short time (15-30s). MWCNTs had a greater microwave response in terms of heat generated/temperature increase than SWCNTs and MWCNT-COOH. An evaluation of microwave behavior of different carbonaceous materials showed that the microwave measurements of CNTs were not affected even with an excess of other organic, inorganic carbon or carbon based nanomaterials (fullerene, granular activated carbon and graphene oxide), mainly because microwave selectively heats materials such as CNTs that have a higher dielectric loss factor. Quantification limits using this technique for the sand, soil and sludge were determined as low as 18.61, 27.92, 814.4μg/g for MWCNTs at a microwave power of 133W and exposure time of 15s. Published by Elsevier B.V.
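The quantification rests on a linear temperature-rise versus CNT-mass calibration that is then inverted for unknown samples. A tiny sketch of that calibration arithmetic follows; all numbers are made up rather than taken from the study.

```python
# Sketch: linear calibration of microwave-induced temperature rise vs. CNT mass,
# then inverse prediction for an unknown sample. Numbers are made up.
import numpy as np

cnt_mass_ug = np.array([0.0, 25.0, 50.0, 100.0, 200.0])      # spiked CNT mass
delta_T_C = np.array([0.4, 3.1, 5.9, 11.8, 23.5])            # temperature rise after irradiation

slope, intercept = np.polyfit(cnt_mass_ug, delta_T_C, 1)      # least-squares calibration line

def predict_cnt_mass(delta_T: float) -> float:
    return (delta_T - intercept) / slope

print(f"slope = {slope:.3f} C/ug, intercept = {intercept:.2f} C")
print(f"unknown with dT = 8.2 C -> ~{predict_cnt_mass(8.2):.0f} ug CNT")
```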
Okpala, Charles Odilichukwu R.; Bono, Gioacchino; Pipitone, Vito; Vitale, Sergio; Cannizzaro, Leonardo
2016-01-01
Background To date, there seems to be limited-to-zero emphasis about how consumers perceive crustacean products subject to either chemical and or non-chemical preservative treatments. In addition, studies that investigated price comparisons of crustacean products subject to either chemical or chemical-free preservative methods seem unreported. Objective This study focused on providing some foundational knowledge about how consumers perceive traditionally harvested crustaceans that are either chemical-treated and or free of chemicals, incorporating price comparisons using a descriptive approach. Design The study design employed a questionnaire approach via interview using a computer-assisted telephone system and sampled 1,540 participants across five key locations in Italy. To actualize consumer sensitivity, ‘price’ was the focus given its crucial role as a consumption barrier. Prior to this, variables such as demographic characteristics of participants, frequency of purchasing, quality attributes/factors that limit the consumption of crustaceans were equally considered. Results By price comparisons, consumers are likely to favor chemical-free (modified atmosphere packaging) crustacean products amid a price increase of up to 15%. But, a further price increase such as by 25% could markedly damage consumers’ feelings, which might lead to a considerable number opting out in favor of either chemical-treated or other seafood products. Comparing locations, the studied variables showed no statistical differences (p>0.05). On the contrary, the response weightings fluctuated across the studied categories. Both response weightings and coefficient of variation helped reveal more about how responses deviated per variable categories. Conclusions This study has revealed some foundational knowledge about how consumers perceive traditionally harvested crustaceans that were either chemical-treated or subject to chemical-free preservative up to price sensitivity using Italy as a reference case, which is applicable to other parts of the globe. PMID:27799084
Sensitization and Habituation of Motivated Behavior in Overweight and Non-Overweight Children
ERIC Educational Resources Information Center
Epstein, Leonard H.; Robinson, Jodie L.; Temple, Jennifer L.; Roemmich, James N.; Marusewski, Angela; Nadbrzuch, Rachel
2008-01-01
The rate of habituation to food is inversely related to energy intake, and overweight children may habituate slower to food and consume more energy. This study compared patterns of sensitization, as defined by an initial increase in operant or motivated responding for food, and habituation, defined by gradual reduction in responding, for macaroni…
USDA-ARS?s Scientific Manuscript database
Health benefits of whole grains (WG) are well known, yet consumption by Americans falls far short of recommended amounts. Roughly 75% of Americans are sensitive to bitter taste, and WG are known to contain bitter tasting phenolic compounds. It has been reported that individuals with the highest se...
Camp, Meghan J; Shipley, Lisa A; Johnson, Timothy R; Forbey, Jennifer Sorensen; Rachlow, Janet L; Crowell, Miranda M
2015-12-01
When selecting habitats, herbivores must weigh multiple risks, such as predation, starvation, toxicity, and thermal stress, forcing them to make fitness trade-offs. Here, we applied the method of paired comparisons (PC) to investigate how herbivores make trade-offs between habitat features that influence selection of food patches. The method of PC measures utility and the inverse of utility, relative risk, and makes trade-offs and indifferences explicit by forcing animals to make choices between two patches with different types of risks. Using a series of paired-choice experiments to titrate the equivalence curve and find the marginal rate of substitution for one risk over the other, we evaluated how toxin-tolerant (pygmy rabbit Brachylagus idahoensis) and fiber-tolerant (mountain cottontail rabbit Sylviagus nuttallii) herbivores differed in their hypothesized perceived risk of fiber and toxins in food. Pygmy rabbits were willing to consume nearly five times more of the toxin 1,8-cineole in their diets to avoid consuming higher levels of fiber than were mountain cottontails. Fiber posed a greater relative risk for pygmy rabbits than cottontails and cineole a greater risk for cottontails than pygmy rabbits. Our flexible modeling approach can be used to (1) quantify how animals evaluate and trade off multiple habitat attributes when the benefits and risks are difficult to quantify, and (2) integrate diverse risks that influence fitness and habitat selection into a single index of habitat value. This index potentially could be applied to landscapes to predict habitat selection across several scales.
Muscle Glycogen Utilisation during an Australian Rules Football Game.
Routledge, Harry E; Leckey, Jill J; Lee, Matt J; Garnham, Andrew; Graham, Stuart; Burgess, Darren; Burke, Louise M; Erskine, Robert M; Close, Graeme L; Morton, James P
2018-06-12
We aimed to better understand the carbohydrate (CHO) requirement of Australian Football (AF) match play by quantifying muscle glycogen utilisation during an in-season AF match. After a 24 h CHO loading protocol of 8 g/kg and 2 g/kg in the pre-match meal, two elite male forward players had biopsies sampled from m. vastus lateralis before and after participation in a South Australian Football League game. Player A (87.2 kg) consumed water only during match play whereas player B (87.6 kg) consumed 88 g CHO via CHO gels. External load was quantified using global positioning system technology. Player A completed more minutes on the ground (115 vs. 98 min) and covered greater total distance (12.2 vs. 11.2 km) than Player B, though with similar high-speed running (837 vs. 1070 m) and sprinting (135 vs. 138 m), respectively. Muscle glycogen decreased by 66% in Player A (Pre-: 656, Post-: 223 mmol∙kg-1 dw) and 24% in Player B (Pre-: 544, Post-: 416 mmol∙kg-1 dw), respectively. Pre-match CHO loading elevated muscle glycogen concentrations (i.e. >500 mmol∙kg-1 dw), the magnitude of which appears sufficient to meet the metabolic demands of elite AF match play. The glycogen cost of AF match play may be greater than in soccer and rugby, and CHO feeding may also spare muscle glycogen use. Further studies using larger sample sizes are now required to quantify the inter-individual variability of the glycogen cost of match play (including muscle and fibre-type specific responses) as well as examine potential metabolic and ergogenic effects of CHO feeding.
ERIC Educational Resources Information Center
Hoover, Eric C.; Souza, Pamela E.; Gallun, Frederick J.
2012-01-01
Purpose: The benefits of amplitude compression in hearing aids may be limited by distortion resulting from rapid gain adjustment. To evaluate this, it is convenient to quantify distortion by using a metric that is sensitive to the changes in the processed signal that decrease consonant recognition, such as the Envelope Difference Index (EDI;…
C. T. Scott; R. Hernandez; C. Frihart; R. Gleisner; T. Tice
2005-01-01
A new method for quantifying percentage wood failure of an adhesively bonded block-shear specimen has been developed. This method incorporates a laser displacement gage with an automated two-axis positioning system that functions as a highly sensitive profilometer. The failed specimen is continuously scanned across its width to obtain a surface failure profile. The...
Quantifying the Thermal Fatigue of CPV Modules
NASA Astrophysics Data System (ADS)
Bosco, Nick; Kurtz, Sarah
2010-10-01
A method is presented to quantify thermal fatigue in the CPV die-attach from meteorological data. A comparative study between cities demonstrates a significant difference in the accumulated damage. These differences are most sensitive to the number of larger (ΔT) thermal cycles experienced for a location. High frequency data (<1/min) may be required to most accurately employ this method.
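A common way to turn a record of thermal cycles into accumulated die-attach damage is an inverse power law in cycle amplitude (a Coffin-Manson-type life relation) combined with Miner's linear damage rule. The sketch below uses that generic approach with placeholder constants; it is not the specific model or coefficients of the cited work.

```python
# Sketch: accumulate thermal-fatigue damage from a list of thermal cycles using a
# Coffin-Manson-type life model N_f = C * dT**(-m) and Miner's rule. The constants
# C and m are placeholders, not the values used in the CPV study.
def cycles_to_failure(delta_T: float, C: float = 1.0e7, m: float = 2.5) -> float:
    return C * delta_T ** (-m)

def accumulated_damage(cycle_amplitudes_C) -> float:
    """Miner's rule: damage = sum over cycles of 1 / N_f(dT); failure expected near 1.0."""
    return sum(1.0 / cycles_to_failure(dT) for dT in cycle_amplitudes_C if dT > 0)

# One year of daily cycles for two hypothetical sites: a handful of large swings
# dominate the total because of the power-law exponent, consistent with the
# abstract's observation that larger dT cycles drive the differences between cities.
site_mild = [20.0] * 365
site_harsh = [20.0] * 345 + [45.0] * 20

print(f"mild site damage:  {accumulated_damage(site_mild):.2e}")
print(f"harsh site damage: {accumulated_damage(site_harsh):.2e}")
```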
Jing, Rong-Rong; Wang, Hui-Min; Cui, Ming; Fang, Meng-Kang; Qiu, Xiao-Jun; Wu, Xin-Hua; Qi, Jin; Wang, Yue-Guo; Zhang, Lu-Rong; Zhu, Jian-Hua; Ju, Shao-Qing
2011-09-01
Human cell-free circulating DNA (cf-DNA) derived mainly from cell apoptosis and necrosis can be measured by a variety of laboratory techniques, but almost all of these methods require sample preparation. We have developed a branched DNA (bDNA)-based Alu assay for quantifying cf-DNA in myocardial infarction (MI) patients. A total of 82 individuals were included in the study; 22 MI and 60 normal controls. cf-DNA was quantified using a bDNA-based Alu assay. cf-DNA was higher in serum compared to plasma and there was a difference between genders. cf-DNA was significantly higher in MI patients compared to the controls. There was no correlation between cf-DNA and creatine kinase-MB (CK-MB), troponin I (cTnI) or myoglobin (MYO). In serial specimens, cf-DNA was sensitive and peaked earlier than cTnI. The bDNA-based Alu assay is a novel method for quantifying human cf-DNA. Increased cf-DNA in MI patients might complement cTnI, CK-MB and MYO in a multiple marker format. Copyright © 2011 The Canadian Society of Clinical Chemists. All rights reserved.
NASA Astrophysics Data System (ADS)
Liang, Q.; Wu, W.; Zhang, D.; Wei, B.; Sun, W.; Wang, Y.; Ge, Y.
2015-10-01
Roughness, which can represent the trade-off between manufacturing cost and performance of mechanical components, is a critical predictor of cracks, corrosion and fatigue damage. In order to measure polished or super-finished surfaces, a novel touch probe based on three-component force sensor for characterizing and quantifying surface roughness is proposed by using silicon micromachining technology. The sensor design is based on a cross-beam structure, which ensures that the system possesses high sensitivity and low coupling. The results show that the proposed sensor possesses high sensitivity, low coupling error, and temperature compensation function. The proposed system can be used to investigate micromechanical structures with nanometer accuracy.
DOT National Transportation Integrated Search
1997-03-01
The report was written as a part of an on-going research project called "Environmental Impacts of Quantifiable Consumption Patterns." Methods, data and results from an assessment of the arable land use and some greenhouse gas emissions during part of...
ERIC Educational Resources Information Center
Cowan, Logan T.; Van Wagenen, Sarah A.; Brown, Brittany A.; Hedin, Riley J.; Seino-Stephan, Yukiko; Hall, P. Cougar; West, Joshua H.
2013-01-01
Objective. To quantify the presence of health behavior theory constructs in iPhone apps targeting physical activity. Methods. This study used a content analysis of 127 apps from Apple's (App Store) "Health & Fitness" category. Coders downloaded the apps and then used an established theory-based instrument to rate each app's inclusion of…
USDA-ARS?s Scientific Manuscript database
Coffee is one of the most widely consumed drinks worldwide. In this paper, major chlorogenic acids were isolated from three commercially available instant coffees and quantified using HPLC and NMR spectroscopic methods. Also, their anti-oxidant and anti-inflammatory activities were determined using DPPH-radical sca...
Roger D. Ottmar; David V. Sandberg; Cynthia L. Riccardi; Susan J. Prichard
2007-01-01
We present an overview of the Fuel Characteristic Classification System (FCCS), a tool that enables land managers, regulators, and scientists to create and catalog fuelbeds and to classify those fuelbeds for their capacity to support fire and consume fuels. The fuelbed characteristics and fire classification from this tool will provide inputs for current and future...
Ley, S H; Hanley, A J; Sermer, M; Zinman, B; O'Connor, D L
2013-11-01
Beneficial effects of vitamin E on insulin sensitivity have been reported in observational and short-term intervention studies in non-pregnant populations. We aimed to investigate whether dietary vitamin E intake during the second trimester would be associated with glucose metabolism later in pregnancy and whether this association would be influenced by an insulin-sensitizing hormone adiponectin. Women with singleton pregnancies (n=205) underwent a 3-h oral glucose tolerance test at 30 weeks gestation and were asked to recall second trimester dietary intake. Higher dietary vitamin E intake was associated with lower fasting glucose, lower HOMA insulin resistance, and higher Matsuda insulin sensitivity index after covariate adjustment including serum adiponectin among women consuming daily multivitamin supplements (all P≤0.03). Lower dietary vitamin E intake during the second trimester is associated with hyperglycemia and insulin resistance later in pregnancy among women consuming daily multivitamin supplementations. Further, these associations are not influenced by adiponectin.
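For readers unfamiliar with the two indices named here, HOMA insulin resistance and the Matsuda insulin sensitivity index are computed from fasting and OGTT glucose/insulin values with the standard formulas below. The example inputs are invented, and the unit conventions (glucose in mmol/L for this HOMA form, mg/dL for Matsuda) follow common usage rather than the paper's stated protocol.

```python
# Standard formulas for the two insulin indices named in the abstract.
# Example inputs are invented; unit conventions follow common usage.
import math

def homa_ir(fasting_glucose_mmol_L: float, fasting_insulin_uU_mL: float) -> float:
    """HOMA-IR = fasting glucose (mmol/L) * fasting insulin (uU/mL) / 22.5"""
    return fasting_glucose_mmol_L * fasting_insulin_uU_mL / 22.5

def matsuda_index(g0_mg_dL, i0_uU_mL, g_mean_mg_dL, i_mean_uU_mL) -> float:
    """Matsuda ISI = 10000 / sqrt(G0 * I0 * Gmean * Imean), glucose in mg/dL, insulin in uU/mL."""
    return 10_000.0 / math.sqrt(g0_mg_dL * i0_uU_mL * g_mean_mg_dL * i_mean_uU_mL)

print(round(homa_ir(4.8, 9.0), 2))                    # ~1.9, invented example
print(round(matsuda_index(86, 9.0, 118, 55.0), 2))    # invented OGTT mean values
```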
Dynamic modelling of five different phytoplankton groups in the River Thames (UK)
NASA Astrophysics Data System (ADS)
Bussi, Gianbattista; Whitehead, Paul; Bowes, Michael; Read, Daniel; Dadson, Simon
2015-04-01
Phytoplankton play a vital role in fluvial ecosystems, being a major producer of organic carbon, a food source for primary consumers and a relevant source of oxygen for many low-gradient rivers, but also a producer of potentially harmful toxins (e.g. cyanobacteria). For these reasons, the forecast and prevention of algal blooms is fundamental for the safe management of river systems. In this study, we developed a new process-based phytoplankton model for operational management and forecast of algal and cyanobacteria blooms subject to environmental change. The model is based on a mass-balance and it reproduces phytoplankton growth and death, taking into account the controlling effect played by water temperature, solar radiation, self-shading and dissolved phosphorus and silicon concentrations. The model was implemented in five reaches of the River Thames (UK) with a daily time step over a period of three years, and its results were compared to a novel dataset of cytometric data which includes community cell abundance of chlorophytes, diatoms, cyanobacteria, microcystis-like cyanobacteria and picoalgae. The model results were satisfactory in terms of fitting the observed data. A Multi-Objective General Sensitivity Analysis was also carried out in order to quantify model sensitivity to its parameters. It showed that the most influential parameters are phytoplankton growth and death rates, while phosphorus concentration showed little influence on phytoplankton growth, due to the high levels of phosphorus in the River Thames. The model was demonstrated to be a reliable tool to be used in algal bloom forecasting and management.
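The model description (a mass balance with growth modulated by temperature, solar radiation, self-shading and nutrients, minus a death term) maps onto a simple daily-step simulation. The sketch below is a generic illustration of that structure, not the calibrated River Thames model; all rate constants and limitation forms are placeholders.

```python
# Sketch of a daily-step, mass-balance phytoplankton model: growth limited by
# temperature, light (with self-shading), and phosphorus (Monod), minus a death
# term. Generic illustration only; not the calibrated Thames parameterization.
import math

def simulate(days=120, P0=1.0, SRP0=200.0):
    P, SRP = P0, SRP0                     # biomass (ug chl/L) and phosphorus (ug P/L)
    mu_max, death = 0.8, 0.15             # 1/day, placeholder rates
    k_P, yield_P = 5.0, 1.0               # half-saturation (ug P/L); ug P consumed per ug chl
    for d in range(days):
        T = 12.0 + 8.0 * math.sin(2 * math.pi * d / 365.0)       # water temperature, C
        I0 = 250.0 + 150.0 * math.sin(2 * math.pi * d / 365.0)   # surface radiation, W/m2
        f_T = math.exp(-((T - 20.0) / 10.0) ** 2)                # temperature factor
        f_I = (I0 / (I0 + 100.0)) * math.exp(-0.01 * P)          # light with self-shading
        f_N = SRP / (SRP + k_P)   # Monod limitation; ~1 at high SRP, matching the Thames finding
        growth = mu_max * f_T * f_I * f_N * P
        P = max(P + growth - death * P, 0.01)
        SRP = max(SRP - yield_P * growth, 0.0)
    return P, SRP

print(simulate())
```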
González-Pérez, Brenda Karen; Sarma, S S S; Castellanos-Páez, M E; Nandini, S
2018-01-01
Triclosan is a personal care product widely used in North America, Europe and Asia as antimicrobial ingredient in many consumer chemical products. In Mexico concentrations of triclosan have been reported in aquatic systems. However, there is no law regulating the presence of chemicals such as triclosan, in aquatic systems. The scarce data about this chemical has increased concern among ecotoxicologists regarding possible effects on aquatic organisms. Moreover, multigenerational studies are rarely studied and the results vary depending on the contaminant. Rotifers, are a dominant group of zooplankton, and have been used in aquatic risk assessments of personal care products due to their sensitivity and high reproductive rates. Plationus patulus and Brachionus havanaensis are common rotifers distributed in aquatic ecosystems of Mexico and have been used in ecotoxicological bioassays. In this study, the median lethal concentration (LC50, 24h) of P. patulus and B. havanaensis exposed to triclosan was determined. Based on the LC50, we tested three sublethal concentrations of triclosan to quantify the demographic responses of both rotifers for two successive generations (F0, and F1). The 24h LC50 of triclosan for P. patulus and B. havanaensis were 300 and 500µgL -1 respectively. Despite the concentration, triclosan had an adverse effect on both Plationus patulus and Brachionus havanaensis in both generations exposed. Experiments show that P. patulus was more sensitive than B. havanaensis when exposed to triclosan. When exposed to triclosan the parental generation (F0) of P. patulus was far more affected than F1. Copyright © 2017 Elsevier Inc. All rights reserved.
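The 24-h LC50 values reported here are typically obtained by fitting a concentration-mortality curve and reading off the concentration producing 50% mortality. A minimal sketch fitting a two-parameter log-logistic curve with scipy; the mortality data are invented, not the rotifer bioassay results.

```python
# Sketch: estimating a 24-h LC50 by fitting a log-logistic concentration-mortality
# curve. The mortality proportions below are invented, not the rotifer data.
import numpy as np
from scipy.optimize import curve_fit

conc_ug_L = np.array([50, 100, 200, 400, 800, 1600], dtype=float)
mortality = np.array([0.02, 0.08, 0.25, 0.60, 0.90, 0.99])

def log_logistic(c, lc50, slope):
    return 1.0 / (1.0 + (c / lc50) ** (-slope))

(lc50_hat, slope_hat), _ = curve_fit(log_logistic, conc_ug_L, mortality, p0=[300.0, 2.0])
print(f"estimated LC50 ~= {lc50_hat:.0f} ug/L (slope {slope_hat:.2f})")
```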
Perrotta, Alberto; García, Santiago J; Michels, Jasper J; Andringa, Anne-Marije; Creatore, Mariadriana
2015-07-29
Water permeation in inorganic moisture permeation barriers occurs through macroscale defects/pinholes and nanopores, the latter with size approaching the water kinetic diameter (0.27 nm). Both permeation paths can be identified by the calcium test, i.e., a time-consuming and expensive optical method for determining the water vapor transmission rate (WVTR) through barrier layers. Recently, we have shown that ellipsometric porosimetry (i.e., a combination of spectroscopic ellipsometry and isothermal adsorption studies) is a valid method to classify and quantify the nanoporosity and correlate it with the WVTR values. Nevertheless, no information is obtained about the macroscale defects or the kinetics of water permeation through the barrier, both essential in assessing the quality of the barrier layer. In this study, electrochemical impedance spectroscopy (EIS) is shown as a sensitive and versatile method to obtain information on nanoporosity and macroscale defects, water permeation, and diffusivity of moisture barrier layers, complementing the barrier property characterization obtained by means of EP and calcium test. EIS is performed on thin SiO2 barrier layers deposited by plasma enhanced-CVD. It allows the determination of the relative water uptake in the SiO2 layers, found to be in agreement with the nanoporosity content inferred by EP. Furthermore, the kinetics of water permeation is followed by EIS, and the diffusivity (D) is determined and found to be in accordance with literature values. Moreover, differently from EP, EIS data are shown to be sensitive to the presence of local macrodefects, correlated with the barrier failure during the calcium test.
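Relative water uptake from EIS is commonly estimated from the coating capacitance via the Brasher-Kingsbury relation, and an apparent diffusivity from the early-time slope of capacitance change versus the square root of time. The snippet below illustrates those two standard relations with invented capacitance readings; whether this exact treatment matches the paper's analysis of the SiO2 layers is an assumption.

```python
# Sketch: Brasher-Kingsbury water-uptake estimate and a short-time Fickian
# diffusivity fit from coating capacitance during immersion. Values are invented;
# applying this treatment to the SiO2 barrier is an illustrative assumption.
import numpy as np

eps_water = 80.0                      # relative permittivity of water
t_s = np.array([0, 60, 240, 540, 960, 1500], dtype=float)          # immersion time, s
C_t = np.array([1.00, 1.03, 1.06, 1.09, 1.11, 1.12]) * 1e-9        # capacitance, F (invented)

# Brasher-Kingsbury: water volume fraction phi = log(Ct/C0) / log(eps_water)
phi = np.log10(C_t / C_t[0]) / np.log10(eps_water)
print(f"water uptake at end of exposure: {phi[-1]:.1%}")

# Short-time Fickian kinetics: log(Ct/C0) grows roughly linearly with sqrt(t);
# normalizing by the saturation value and using the layer thickness L gives an
# apparent diffusion coefficient D = pi * (slope_norm * L / 4)^2.
L_m = 100e-9                                               # layer thickness, m (invented)
mask = t_s > 0
slope = np.polyfit(np.sqrt(t_s[mask]), np.log10(C_t[mask] / C_t[0]), 1)[0]
sat = np.log10(C_t[-1] / C_t[0])                           # saturation value of the log ratio
D = np.pi * (slope * L_m / (4.0 * sat)) ** 2               # m^2/s
print(f"apparent D ~ {D:.2e} m^2/s")
```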
Detection of Antigenic Variants of Subtype H3 Swine Influenza A Viruses from Clinical Samples
Martin, Brigitte E.; Li, Lei; Nolting, Jacqueline M.; Smith, David R.; Hanson, Larry A.
2017-01-01
A large population of genetically and antigenically diverse influenza A viruses (IAVs) is circulating among the swine population, playing an important role in influenza ecology. Swine IAVs not only cause outbreaks among swine but also can be transmitted to humans, causing sporadic infections and even pandemic outbreaks. Antigenic characterizations of swine IAVs are key to understanding the natural history of these viruses in swine and to selecting strains for effective vaccines. However, influenza outbreaks generally spread rapidly among swine, and the conventional methods for antigenic characterization require virus propagation, a time-consuming process that can significantly reduce the effectiveness of vaccination programs. We developed and validated a rapid, sensitive, and robust method, the polyclonal serum-based proximity ligation assay (polyPLA), to identify antigenic variants of subtype H3N2 swine IAVs. This method utilizes oligonucleotide-conjugated polyclonal antibodies and quantifies antibody-antigen binding affinities by quantitative reverse transcription-PCR (RT-PCR). Results showed the assay can rapidly detect H3N2 IAVs directly from nasal wash or nasal swab samples collected from laboratory-challenged animals or during influenza surveillance at county fairs. In addition, polyPLA can accurately separate the viruses at two contemporary swine IAV antigenic clusters (H3N2 swine IAV-α and H3N2 swine IAV-β) with a sensitivity of 84.9% and a specificity of 100.0%. The polyPLA can be routinely used in surveillance programs to detect antigenic variants of influenza viruses and to select vaccine strains for use in controlling and preventing disease in swine. PMID:28077698
Giacometti, Federica; Bonilauri, Paolo; Amatiste, Simonetta; Arrigoni, Norma; Bianchi, Manila; Losio, Marina Nadia; Bilei, Stefano; Cascone, Giuseppe; Comin, Damiano; Daminelli, Paolo; Decastelli, Lucia; Merialdi, Giuseppe; Mioni, Renzo; Peli, Angelo; Petruzzelli, Annalisa; Tonucci, Franco; Piva, Silvia; Serraino, Andrea
2015-09-01
A quantitative risk assessment (RA) model was developed to describe the risk of campylobacteriosis linked to consumption of raw milk sold in vending machines in Italy. Exposure assessment was based on the official microbiological records of raw milk samples from vending machines monitored by the regional Veterinary Authorities from 2008 to 2011, microbial growth during storage, destruction experiments, consumption frequency of raw milk, serving size, consumption preference and age of consumers. The differential risk considered milk handled under regulation conditions (4°C throughout all phases) and the worst time-temperature field handling conditions detected. Two separate RA models were developed, one for the consumption of boiled milk and the other for the consumption of raw milk, and two different dose-response (D-R) relationships were considered. The RA model predicted no human campylobacteriosis cases per year either in the best (4°C) storage conditions or in the case of thermal abuse in case of boiling raw milk, whereas in case of raw milk consumption the annual estimated campylobacteriosis cases depend on the dose-response relationships used in the model (D-R I or D-R II), the milk time-temperature storage conditions, consumer behaviour and age of consumers, namely young (with two cut-off values of ≤5 or ≤6 years old for the sensitive population) versus adult consumers. The annual estimated cases for young consumers using D-R II for the sensitive population (≤5 years old) ranged between 1013.7/100,000 population and 8110.3/100,000 population and for adult consumers using D-R I between 79.4/100,000 population and 333.1/100,000 population. Quantification of the risks associated with raw milk consumption is necessary from a public health perspective and the proposed RA model represents a useful and flexible tool to perform future RAs based on local consumer habits to support decision-making on safety policies. Further educational programmes for raw milk consumers or potential raw milk consumers are required to encourage consumers to boil milk to reduce the associated risk of illness. Copyright © 2015 Elsevier B.V. All rights reserved.
Hopper, Kenneth D; Strollo, Diane C; Mauger, David T
2002-02-01
The aim of this study was to determine the sensitivity and specificity of cardiac gated electron-beam computed tomography (CT) and ungated helical CT in detecting and quantifying coronary arterial calcification (CAC) by using a working heart phantom and artificial coronary arteries. A working heart phantom simulating normal cardiac motion and providing attenuation equal to that of an adult thorax was used. Thirty tubes with a 3-mm inner diameter were internally coated with pulverized human cortical bone mixed with epoxy glue to simulate minimal (n = 10), mild (n = 10), or severe (n = 10) calcified plaques. Ten additional tubes were not coated and served as normal controls. The tubes were attached to the same location on the phantom heart and scanned with electron-beam CT and helical CT in horizontal and vertical planes. Actual plaque calcium content was subsequently quantified with atomic spectroscopy. Two blinded experienced radiologic imaging teams, one for each CT system, separately measured calcium content in the model vessels by using a Hounsfield unit threshold of 130 or greater. The sensitivity and specificity of electron-beam CT in detecting CAC were 66.1% and 80.0%, respectively. The sensitivity and specificity of helical CT were 96.4% and 95.0%, respectively. Electron-beam CT was less reliable when vessels were oriented vertically (sensitivity and specificity, 71.4% and 70%; 95% CI: 39.0%, 75.0%) versus horizontally (sensitivity and specificity, 60.7% and 90.0%; 95% CI: 48.0%, 82.0%). When a correction factor was applied, the volume of calcified plaque was statistically better quantified with helical CT than with electron-beam CT (P =.004). Ungated helical CT depicts coronary arterial calcium better than does gated electron-beam CT. When appropriate correction factors are applied, helical CT is superior to electron-beam CT in quantifying coronary arterial calcium. Although further work must be done to optimize helical CT grading systems and scanning protocols, the data of this study demonstrated helical CT's inherent advantage over currently commercially available electron-beam CT systems in CAC detection and quantification.
Mertens, Nicole L; Russell, Bayden D; Connell, Sean D
2015-12-01
Ocean warming is anticipated to strengthen the persistence of turf-forming habitat, yet the concomitant elevation of grazer metabolic rates may accelerate per capita rates of consumption to counter turf predominance. Whilst this possibility of strong top-down control is supported by the metabolic theory of ecology (MTE), it assumes that consumer metabolism and consumption keep pace with increasing production. This assumption was tested by quantifying the metabolic rates of turfs and herbivorous gastropods under a series of elevated temperatures in which the ensuing production and consumption were observed. We discovered that as temperature increases towards near-future levels (year 2100), consumption rates of gastropods peak earlier than the rate of growth of producers. Hence, turfs have greater capacity to persist under near-future temperatures than the capacity for herbivores to counter their growth. These results suggest that whilst MTE predicts stronger top-down control, understanding whether consumer-producer responses are synchronous is key to assessing the future strength of top-down control.
Tracing global supply chains to air pollution hotspots
NASA Astrophysics Data System (ADS)
Moran, Daniel; Kanemoto, Keiichiro
2016-09-01
While high-income countries have made significant strides since the 1970s in improving air quality, air pollution continues to rise in many developing countries and the world as a whole. A significant share of the pollution burden in developing countries can be attributed to production for export to consumers in high-income nations. However, it remains a challenge to quantify individual actors’ share of responsibility for pollution, and to involve parties other than primary emitters in cleanup efforts. Here we present a new spatially explicit modeling approach to link SO2, NOx, and PM10 severe emissions hotspots to final consumers via global supply chains. These maps show developed countries reducing their emissions domestically but driving new pollution hotspots in developing countries. This is also the first time a spatially explicit footprint inventory has been established. Linking consumers and supply chains to emissions hotspots creates opportunities for other parties to participate alongside primary emitters and local regulators in pollution abatement efforts.
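Linking final consumers to emissions through global supply chains is done with environmentally extended input-output analysis: footprints follow from the Leontief inverse applied to final demand, scaled by sectoral emissions intensities. A toy sketch of that core calculation follows; the three-sector matrices are placeholders, not the multi-regional tables used in the study.

```python
# Toy sketch of an environmentally extended input-output (Leontief) footprint:
# emissions embodied in final demand = e * (I - A)^-1 y. The 3-sector matrices
# are placeholders, not the multi-regional tables used in the study.
import numpy as np

A = np.array([                      # inter-industry technical coefficients
    [0.10, 0.20, 0.05],
    [0.15, 0.05, 0.10],
    [0.05, 0.10, 0.02],
])
e = np.array([0.8, 0.3, 0.1])       # emissions per unit output by sector (e.g., kt SO2 / $M)
y = np.array([100.0, 50.0, 200.0])  # final demand of the consuming region ($M)

L = np.linalg.inv(np.eye(3) - A)                # Leontief inverse
x = L @ y                                       # total output required to meet this demand
footprint_by_sector = e * x                     # emissions attributed to the consuming region
print(footprint_by_sector, footprint_by_sector.sum())
```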
Mirzazadeh, Ali; Haghdoost, Ali Akbar; Nedjat, Saharnaz; Navadeh, Soodabeh; McFarland, Willi; Mohammad, Kazem
2013-02-01
We quantified discrepancies in reported behaviors of female sex workers (FSW) by comparing 63 face-to-face interviews (FTFI) to in-depth interviews (IDI), with corroboration of the directions and magnitudes of reporting by a panel of psychologists who work with FSW. Sensitivities, specificities, positive and negative predictive values (PPV and NPV) were assessed for FTFI responses using IDI as a "gold standard". Sensitivities were lowest in reporting symptoms of sexually transmitted infections (63.9 %), finding sex partners in venues (52.4 %) and not receiving HIV test results (66.7 %). Specificities (all >83 %) and PPVs (all >74.0 %) were higher than NPV. FSW significantly under-reported number of clients, sexual contacts and non-condom use sex acts with clients and number of days engaging in sex work in the preceding week. This study provides a quantified gauge of reporting biases in FSW behaviors. Such estimates and methods help better understand true HIV risk in marginalized populations and calibrate survey estimates accordingly.
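The four reported quantities come straight from the 2x2 agreement table between face-to-face and in-depth interview responses. A small helper showing the definitions; the counts below are invented, not the study's data.

```python
# Definitions of sensitivity, specificity, PPV and NPV from a 2x2 table comparing
# face-to-face interview (FTFI) responses against in-depth interview (IDI) as the
# reference ("gold standard"). Counts are invented, not the study data.
def diagnostic_metrics(tp: int, fp: int, fn: int, tn: int) -> dict:
    return {
        "sensitivity": tp / (tp + fn),   # FTFI "yes" among true (IDI) "yes"
        "specificity": tn / (tn + fp),   # FTFI "no" among true (IDI) "no"
        "ppv": tp / (tp + fp),           # true "yes" among FTFI "yes"
        "npv": tn / (tn + fn),           # true "no" among FTFI "no"
    }

print(diagnostic_metrics(tp=23, fp=3, fn=13, tn=24))
```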
Martin, Nicole; Carey, Nancy; Murphy, Steven; Kent, David; Bang, Jae; Stubbs, Tim; Wiedmann, Martin; Dando, Robin
2016-06-01
Fluid milk consumption per capita in the United States has been steadily declining since the 1940s. Many factors have contributed to this decline, including the increasing consumption of carbonated beverages and bottled water. To meet the challenge of stemming the decline in consumption of fluid milk, the dairy industry must take a systematic approach to identifying and correcting for factors that negatively affect consumers' perception of fluid milk quality. To that end, samples of fluid milk were evaluated to identify factors, with a particular focus on light-emitting diode (LED) light exposure, which negatively affect the perceived sensory quality of milk, and to quantify their relative effect on the consumer's experience. Fluid milk samples were sourced from 3 processing facilities with varying microbial postprocessing contamination patterns based on historical testing. The effects of fat content, light exposure, age, and microbiological content were assayed across 23 samples of fluid milk, via consumer, descriptive sensory, and instrumental analyses. Most notably, light exposure resulted in a broad negative reaction from consumers, more so than samples with microbiological contamination exceeding 20,000 cfu/mL on days approaching code. The predominant implication of the study is that a component of paramount importance in ensuring the success of the dairy industry would be to protect fluid milk from all sources of light exposure, from processing plant to consumer. Copyright © 2016 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Nanus, L.; Clow, D. W.; Sickman, J. O.
2016-12-01
High-elevation aquatic ecosystems in Yosemite (YOSE) and Sequoia and Kings Canyon (SEKI) National Parks are impacted by atmospheric nitrogen (N) deposition associated with local and regional air pollution. Documented effects include elevated surface water nitrate concentrations, increased algal productivity, and changes in diatom species assemblages. Annual wet inorganic N deposition maps, developed at 1-km resolution for YOSE and SEKI to quantify N deposition to sensitive high-elevation ecosystems, range from 1.0 to over 5.0 kg N ha-1 yr-1. Critical loads of N deposition for nutrient enrichment of aquatic ecosystems were quantified and mapped using a geostatistical approach, with N deposition, topography, vegetation, geology, and climate as potential explanatory variables. Multiple predictive models were created using various combinations of explanatory variables; this approach allowed us to better quantify uncertainty and more accurately identify the areas most sensitive to atmospherically deposited N. The lowest critical loads estimates and highest exceedances identified within YOSE and SEKI occurred in high-elevation basins with steep slopes, sparse vegetation, and areas of neoglacial till and talus. These results are consistent with previous analyses in the Rocky Mountains, and highlight the sensitivity of alpine ecosystems to atmospheric N deposition.
Ultra-sensitive probe of spectral line structure and detection of isotopic oxygen
NASA Astrophysics Data System (ADS)
Garner, Richard M.; Dharamsi, A. N.; Khan, M. Amir
2018-01-01
We discuss a new method of investigating and quantifying the behavior of higher harmonic (> 2f) wavelength modulation spectroscopy (WMS) based on signal structure. It is shown that the spectral structure of higher harmonic WMS signals, quantified by the number of zero crossings and turning points, can have increased sensitivity to ambient conditions or line-broadening effects from changes in temperature, pressure, or optical depth. The structure of WMS signals, characterized by combinations of signal magnitude and the spectral locations of turning points and zero crossings, provides a unique scale that quantifies lineshape parameters and is thus useful in optimizing measurements obtained from multi-harmonic WMS signals. We demonstrate this by detecting weaker rotational-vibrational transitions of isotopic atmospheric oxygen (16O18O) in the near-infrared region, where higher harmonic WMS signals offer greater structural sensitivity despite less favorable signal-to-noise ratios. The proposed approach based on spectral structure provides the ability to investigate and quantify signals not only at the linecenter but also in the wing region of the absorption profile. This formulation is particularly useful in tunable diode laser spectroscopy and ultra-precision laser-based sensors, where the absorption signal profile carries information on quantities of interest such as concentration, velocity, or gas collision dynamics.
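A rough sketch of the structural metrics described here is to count zero crossings and turning points of simulated harmonic WMS signals for an idealized Lorentzian line; laser intensity modulation and phase effects are neglected, so this is not the authors' full formulation.

```python
import numpy as np

def lorentzian_absorbance(nu, nu0=0.0, gamma=1.0, peak=1.0):
    return peak * gamma**2 / ((nu - nu0)**2 + gamma**2)

def wms_harmonic(nu_center, n, mod_amp, gamma=1.0, n_theta=2048):
    """n-th Fourier (cosine) component of the absorbance seen by a laser whose
    frequency is modulated as nu_center + mod_amp*cos(theta). Idealized WMS:
    intensity modulation and phase effects are ignored."""
    theta = np.linspace(0.0, 2.0 * np.pi, n_theta, endpoint=False)
    a = lorentzian_absorbance(nu_center + mod_amp * np.cos(theta), gamma=gamma)
    return (2.0 / n_theta) * np.sum(a * np.cos(n * theta))

def count_structure(signal):
    """Zero crossings and turning points, the structural features of interest."""
    zero_crossings = int(np.sum(np.diff(np.sign(signal)) != 0))
    turning_points = int(np.sum(np.diff(np.sign(np.diff(signal))) != 0))
    return zero_crossings, turning_points

nu_scan = np.linspace(-6, 6, 1201)          # detuning in units of the half-width
for harmonic in (2, 4, 6):
    sig = np.array([wms_harmonic(nu, harmonic, mod_amp=2.2) for nu in nu_scan])
    print(f"{harmonic}f signal: zero crossings / turning points =",
          count_structure(sig))
```

Higher harmonics produce more zero crossings and turning points across the line profile, which is the structural richness the abstract exploits.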
Defining Product Intake Fraction to Quantify and Compare Exposure to Consumer Products.
Jolliet, Olivier; Ernstoff, Alexi S; Csiszar, Susan A; Fantke, Peter
2015-08-04
There is a growing consciousness that exposure studies need to better cover near-field exposure associated with products use. To consistently and quantitatively compare human exposure to chemicals in consumer products, we introduce the concept of product intake fraction, as the fraction of a chemical within a product that is eventually taken in by the human population. This metric enables consistent comparison of exposures during consumer product use for different product-chemical combinations, exposure duration, exposure routes and pathways and for other life cycle stages. We present example applications of the product intake fraction concept, for two chemicals in two personal care products and two chemicals encapsulated in two articles, showing how intakes of these chemicals can primarily occur during product use. We demonstrate the utility of the product intake fraction and its application modalities within life cycle assessment and risk assessment contexts. The product intake fraction helps to provide a clear interface between the life cycle inventory and impact assessment phases, to identify best suited sentinel products and to calculate overall exposure to chemicals in consumer products, or back-calculate maximum allowable concentrations of substances inside products.
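A minimal sketch of the metric's definition follows, assuming invented masses and route-specific intakes: the product intake fraction is total population intake divided by the chemical mass initially present in the product.

```python
# Minimal sketch of the product intake fraction (PiF) concept: the fraction of
# the chemical mass initially in a product that is eventually taken in by the
# exposed population, summed over routes. Numbers are invented for illustration.
mass_chemical_in_product_mg = 50.0     # e.g., mg of an ingredient per product unit

intake_mg_by_route = {                 # population intake attributable to one unit
    "dermal": 1.5,
    "inhalation": 0.2,
    "ingestion_indirect": 0.05,
}

product_intake_fraction = sum(intake_mg_by_route.values()) / mass_chemical_in_product_mg
print(f"PiF = {product_intake_fraction:.3f} "
      f"({product_intake_fraction*1e3:.0f} mg intake per g in product)")
```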
Seawater desalination and serum magnesium concentrations in Israel.
Koren, Gideon; Shlezinger, Meital; Katz, Rachel; Shalev, Varda; Amitai, Yona
2017-04-01
With increasing shortage of fresh water globally, more countries are consuming desalinated seawater (DSW). In Israel >50% of drinking water is now derived from DSW. Desalination removes magnesium, and hypomagnesaemia has been associated with increased cardiac morbidity and mortality. Presently the impact of consuming DSW on body magnesium status has not been established. We quantified changes in serum magnesium in a large population based study (n = 66,764), before and after desalination in regions consuming DSW and in regions where DSW has not been used. In the communities that switched to DSW in 2013, the mean serum magnesium was 2.065 ± 0.19 mg/dl before desalination and fell to 2.057 ± 0.19 mg/dl thereafter (p < 0.0001). In these communities 1.62% of subjects exhibited serum magnesium concentrations ≤1.6 mg/dl between 2010 and 2013. This proportion increased by 24% between 2010-2013 and 2015-2016 to 2.01% (p = 0.0019). In contrast, no such changes were recorded in the communities that did not consume DSW. Due to the emerging evidence of increased cardiac morbidity and mortality associated with hypomagnesaemia, it is vital to consider re-introduction of magnesium to DSW.
Global patterns in the impact of marine herbivores on benthic primary producers.
Poore, Alistair G B; Campbell, Alexandra H; Coleman, Ross A; Edgar, Graham J; Jormalainen, Veijo; Reynolds, Pamela L; Sotka, Erik E; Stachowicz, John J; Taylor, Richard B; Vanderklift, Mathew A; Duffy, J Emmett
2012-08-01
Despite the importance of consumers in structuring communities, and the widespread assumption that consumption is strongest at low latitudes, empirical tests for global scale patterns in the magnitude of consumer impacts are limited. In marine systems, the long tradition of experimentally excluding herbivores in their natural environments allows consumer impacts to be quantified on global scales using consistent methodology. We present a quantitative synthesis of 613 marine herbivore exclusion experiments to test the influence of consumer traits, producer traits and the environment on the strength of herbivore impacts on benthic producers. Across the globe, marine herbivores profoundly reduced producer abundance (by 68% on average), with strongest effects in rocky intertidal habitats and the weakest effects on habitats dominated by vascular plants. Unexpectedly, we found little or no influence of latitude or mean annual water temperature. Instead, herbivore impacts differed most consistently among producer taxonomic and morphological groups. Our results show that grazing impacts on plant abundance are better predicted by producer traits than by large-scale variation in habitat or mean temperature, and that there is a previously unrecognised degree of phylogenetic conservatism in producer susceptibility to consumption. © 2012 Blackwell Publishing Ltd/CNRS.
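The kind of effect size underlying such syntheses can be illustrated with a log response ratio on invented paired-plot data; for reference, the 68% mean reduction quoted above corresponds to an lnRR of roughly -1.1.

```python
import numpy as np

# Effect sizes of the kind used in herbivore-exclusion meta-analyses: the log
# response ratio lnRR = ln(abundance with herbivores / abundance without).
# The plot values below are illustrative only.
with_herbivores = np.array([12.0, 30.0, 8.0])      # producer abundance, grazed plots
without_herbivores = np.array([40.0, 85.0, 31.0])  # paired herbivore-exclusion plots

lnrr = np.log(with_herbivores / without_herbivores)
mean_lnrr = lnrr.mean()
percent_reduction = 100.0 * (1.0 - np.exp(mean_lnrr))
print(f"mean lnRR = {mean_lnrr:.2f}; implied reduction = {percent_reduction:.0f}%")
```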
Becker, Anne E; Hadley Arrindell, Adrienne; Perloe, Alexandra; Fay, Kristen; Striegel-Moore, Ruth H
2010-01-01
Objective: The study aim was to identify and describe health consumer perspectives on social barriers to care for eating disorders in an ethnically diverse sample. Method: We conducted an exploratory secondary analysis of qualitative data comprising transcripts from semi-structured interviews with past and prospective consumers of eating disorder treatment (n = 32). Transcripts were inputted into NVivo 8 for coding, sorting, and quantifying thematic content of interest within strata defined by ethnic minority and non-minority participants. We then examined the influence of key social barriers—including stigma and social stereotypes—on perceived impact on care. Results: The majority of respondents (78%) endorsed at least one social barrier to care for an eating or weight concern. Perceived stigma (or shame) and social stereotyping—identified both within social networks and among clinicians—had adversely impacted care for 59% and 19% of respondents, respectively. Discussion: Social barriers to care for eating and weight related concerns may be prevalent in the U.S. and impact both ethnic minority and non-minority health care consumers. © 2009 by Wiley Periodicals, Inc. (Int J Eat Disord 2010;) PMID:19806607
The Effect of Diet Mixing on a Nonselective Herbivore
2016-01-01
The balanced-diet hypothesis states that a diverse prey community is beneficial to consumers due to resource complementarity among the prey species. Nonselective consumer species cannot differentiate between prey items and are therefore not able to actively regulate their diet intake. We thus wanted to test whether the balanced-diet hypothesis is applicable to nonselective consumers. We conducted a laboratory experiment in which a nonselective model grazer, the freshwater gastropod Lymnaea stagnalis, was fed benthic green algae as single species or as a multi-species mixture and quantified the snails’ somatic growth rates and shell lengths over a seven-week period. Gastropods fed the mixed diet were found to exhibit a higher somatic growth rate than the average of the snails fed single prey species. However, growth on the multi-species mixture did not exceed the growth rate obtained on the best single prey species. Similar results were obtained regarding the animals’ shell height increase over time. The mixed diet did not provide the highest growth rate, which confirms our hypothesis. We thus suggest that the balanced-diet hypothesis is less relevant for non-selective generalist consumers, which needs to be considered in estimates of secondary production. PMID:27391787
Serine Protease Zymography: Low-Cost, Rapid, and Highly Sensitive RAMA Casein Zymography.
Yasumitsu, Hidetaro
2017-01-01
To detect serine protease activity by zymography, casein and CBB stain have been used as the substrate and the detection procedure, respectively. Conventional casein zymography uses a substrate concentration of 1 mg/mL and standard CBB staining. Although ordinary casein zymography provides reproducible results, it has several disadvantages, including being time-consuming and having relatively low sensitivity. An improved casein zymography, RAMA casein zymography, is rapid and highly sensitive: it completes the detection process within 1 h after incubation and increases sensitivity at least tenfold. In addition to serine proteases, the method also detects metalloprotease 7 (MMP7, Matrilysin) with high sensitivity.
RAMA casein zymography: Time-saving and highly sensitive casein zymography for MMP7 and trypsin.
Yasumitsu, Hidetaro; Ozeki, Yasuhiro; Kanaly, Robert A
2016-11-01
To detect metalloproteinase-7 (MMP7), zymography is conducted using a casein substrate and conventional CBB stain. This approach has disadvantages because it is time-consuming and has low sensitivity. A sensitive method to detect MMP7 at levels as low as 30 pg was reported previously; however, it required special substrates and complicated handling. The RAMA casein zymography described herein is rapid, sensitive, and reproducible. By applying high-sensitivity staining under low-substrate conditions, the staining process is completed within 1 h and sensitivity is increased 100-fold. The method can detect 10 pg of MMP7 using commercially available casein without complicated handling. Moreover, it increases detection sensitivity for trypsin. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Cline, Richard R; Mott, David A
2003-01-01
Several proposals for adding a prescription drug benefit to the Medicare program rely on consumer choice and market forces to promote efficiency. However, little information exists regarding: 1) the extent of price sensitivity for such plans among Medicare beneficiaries, or 2) the extent to which drug-only insurance plans using various cost-control mechanisms might experience adverse selection. Using data from a survey of elderly Wisconsin residents regarding their likely choices from a menu of hypothetical drug plans, we show that respondents are likely to be price sensitive with respect to both premiums and out-of-pocket costs but that selection problems may arise in these markets. Outside intervention may be necessary to ensure the feasibility of a market-based approach to a Medicare drug benefit.
NASA Technical Reports Server (NTRS)
Santanello, Joseph A., Jr.; Peters-Lidard, Christa D.; Kumar, Sujay V.
2011-01-01
The inherent coupled nature of Earth's energy and water cycles places significant importance on the proper representation and diagnosis of land-atmosphere (LA) interactions in hydrometeorological prediction models. However, the precise nature of the soil moisture-precipitation relationship at the local scale is largely determined by a series of nonlinear processes and feedbacks that are difficult to quantify. To quantify the strength of the local LA coupling (LoCo), this process chain must be considered both in full and as individual components through their relationships and sensitivities. To address this, recent modeling and diagnostic studies have been extended to 1) quantify the processes governing LoCo utilizing the thermodynamic properties of mixing diagrams, and 2) diagnose the sensitivity of coupled systems, including clouds and moist processes, to perturbations in soil moisture. This work employs NASA's Land Information System (LIS) coupled to the Weather Research and Forecasting (WRF) mesoscale model and simulations performed over the U.S. Southern Great Plains. The behavior of different planetary boundary layer (PBL) and land surface scheme couplings in LIS-WRF is examined in the context of the evolution of thermodynamic quantities that link the surface soil moisture condition to the PBL regime, clouds, and precipitation. Specifically, the tendency toward saturation in the PBL is quantified by the lifting condensation level (LCL) deficit and addressed as a function of time and space. The sensitivity of the LCL deficit to the soil moisture condition is indicative of the strength of LoCo, where both positive and negative feedbacks can be identified. Overall, this methodology can be applied to any model or observations and is a crucial step toward improved evaluation and quantification of LoCo within models, particularly given the advent of next-generation satellite measurements of PBL and land surface properties along with advances in data assimilation schemes.
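A simple way to form an LCL-deficit diagnostic from surface observations is sketched below, using the common dew-point-depression approximation for LCL height; the sign convention and the exact formulation used in the LoCo studies may differ, so treat this as a rough stand-in.

```python
def lcl_height_m(temp_c, dewpoint_c):
    """Approximate lifting condensation level height above the surface using
    the common ~125 m per degC of dew-point depression rule of thumb."""
    return 125.0 * max(temp_c - dewpoint_c, 0.0)

def lcl_deficit_m(pbl_height_m, temp_c, dewpoint_c):
    """With this sign convention, positive values mean the PBL top has grown
    past the LCL, i.e. the boundary layer is tending toward saturation and
    cloud formation."""
    return pbl_height_m - lcl_height_m(temp_c, dewpoint_c)

# Illustrative afternoon values over a moist vs. a dry soil patch.
print("moist soil:", lcl_deficit_m(pbl_height_m=1100.0, temp_c=30.0, dewpoint_c=22.0))
print("dry soil:  ", lcl_deficit_m(pbl_height_m=1900.0, temp_c=34.0, dewpoint_c=16.0))
```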
NASA Astrophysics Data System (ADS)
Adhikari, R.; Nickel, J.; Kallmeyer, J.
2012-12-01
Microbial life is widespread in Earth's subsurface and estimated to represent a significant fraction of Earth's total living biomass. However, very little is known about subsurface microbial activity and its fundamental role in the biogeochemical cycles of carbon and other biologically important elements. Hydrogen is one of the most important elements in subsurface anaerobic microbial metabolism. Heterotrophic and chemoautotrophic microorganisms use hydrogen in their metabolic pathways, either consuming or producing it in the course of ATP synthesis. Hydrogenase (H2ase) is a ubiquitous intracellular enzyme that catalyzes the interconversion of molecular hydrogen into protons and electrons. The protons are used for the synthesis of ATP, thereby coupling energy-generating metabolic processes to electron acceptors such as CO2 or sulfate. Because the H2ase enzyme targets a key compound in cellular metabolism, the assay can be used as a measure of total microbial activity without the need to identify any specific metabolic process. Using the highly sensitive tritium assay, we measured H2ase enzyme activity in the organic-rich sediments of Lake Van, a saline, alkaline lake in eastern Turkey, in marine sediments of the Barents Sea, and in deep subseafloor sediments from the Nankai Trough. H2ase activity could be quantified at all depths of all sites, but the activity distribution varied widely with depth and between sites. At the Lake Van sites H2ase activity ranged from ca. 20 mmol H2 cm-3 d-1 close to the sediment-water interface to 0.5 mmol H2 cm-3 d-1 at a depth of 0.8 m. In samples from the Barents Sea, H2ase activity ranged from 0.1 to 2.5 mmol H2 cm-3 d-1 down to a depth of 1.60 m. At all sites the sulfate reduction rate profile followed the upper part of the H2ase activity profile until sulfate reduction reached the minimum detection limit (ca. 10 pmol cm-3 d-1). H2ase activity could still be quantified after the decline of sulfate reduction, indicating that other microbial processes become quantitatively more important. Similarly, H2ase activity could be quantified at greater depths (ca. 400 mbsf) in Nankai Trough sediments. The Nankai Trough is one of the world's most geologically active accretionary wedges, where the Philippine Plate is subducting under southwest Japan. Due to the transient faulting, huge amounts of energy are liberated that enhance chemical transformations of organic and inorganic matter. An increase in H2ase activity could be observed at greater depth, which suggests that microbial activity is stimulated by the fault activity. Current techniques for the quantification of microbial activity in deep sediments have already reached their physical and technical limits and, in many cases, are still not sensitive enough to quantify extremely low rates of microbial activity. In addition to the quantification of specific processes, estimates of total microbial activity will provide valuable information on energy flux and microbial metabolism in the subsurface biosphere and other low-energy environments, as well as help identify hotspots of microbial activity. The tritium H2ase assay has the potential to become a valuable tool for measuring total subsurface microbial activity.
Thermal Profiling of Residential Energy Use
DOE Office of Scientific and Technical Information (OSTI.GOV)
Albert, A; Rajagopal, R
This work describes a methodology for informing targeted demand-response (DR) and marketing programs that focus on the temperature-sensitive part of residential electricity demand. Our methodology uses data that is becoming readily available at utility companies: hourly energy consumption readings collected from "smart" electricity meters, as well as hourly temperature readings. To decompose individual consumption into a thermal-sensitive part and a base load (non-thermally-sensitive), we propose a model of temperature response that is based on thermal regimes, i.e., unobserved decisions of consumers to use their heating or cooling appliances. We use this model to extract useful benchmarks that compose thermal profiles of individual users, i.e., terse characterizations of the statistics of these users' temperature-sensitive consumption. We present example profiles generated using our model on real consumers, and show its performance on a large sample of residential users. This knowledge may, in turn, inform the DR program by allowing scarce operational and marketing budgets to be spent on the right users (those whose influencing will yield the highest energy reductions) at the right time. We show that such segmentation and targeting of users may offer savings exceeding 100% of a random strategy.
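A minimal stand-in for the base-load versus temperature-sensitive decomposition is a change-point fit of hourly consumption against temperature, sketched on synthetic data below; the paper's regime-based model is richer than this.

```python
import numpy as np

# Toy decomposition of hourly consumption into a base load and a
# temperature-sensitive (cooling) component via a change-point fit:
#   kWh(T) ~= base                       for T <= T_set
#   kWh(T) ~= base + slope * (T - T_set) for T >  T_set
# Synthetic data stand in for smart-meter readings.
rng = np.random.default_rng(0)
temp = rng.uniform(5, 40, 2000)                                   # deg C
true_base, true_slope, true_set = 0.6, 0.08, 24.0
kwh = true_base + true_slope * np.clip(temp - true_set, 0, None) \
      + rng.normal(0, 0.05, temp.size)

def fit_changepoint(temp, kwh, candidates=np.arange(15.0, 32.0, 0.5)):
    best = None
    for t_set in candidates:
        x = np.clip(temp - t_set, 0, None)
        design = np.column_stack([np.ones_like(x), x])
        coef, residuals, *_ = np.linalg.lstsq(design, kwh, rcond=None)
        sse = np.sum((kwh - design @ coef) ** 2)
        if best is None or sse < best[0]:
            best = (sse, t_set, coef[0], coef[1])
    return best[1:]           # (setpoint, base load, thermal slope)

t_set, base, slope = fit_changepoint(temp, kwh)
print(f"setpoint ~ {t_set:.1f} degC, base load ~ {base:.2f} kWh/h, "
      f"thermal sensitivity ~ {slope:.3f} kWh/h per degC")
```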
Ultrafast Fabrication of Flexible Dye-Sensitized Solar Cells by Ultrasonic Spray-Coating Technology
Han, Hyun-Gyu; Weerasinghe, Hashitha C.; Min Kim, Kwang; Soo Kim, Jeong; Cheng, Yi-Bing; Jones, David J.; Holmes, Andrew B.; Kwon, Tae-Hyuk
2015-01-01
This study investigates novel deposition techniques for the preparation of TiO2 electrodes for use in flexible dye-sensitized solar cells. These proposed new methods, namely pre-dye-coating and codeposition ultrasonic spraying, eliminate the conventional need for time-consuming processes such as dye soaking and high-temperature sintering. Power conversion efficiencies of over 4.0% were achieved with electrodes prepared on flexible polymer substrates using this new deposition technology and N719 dye as a sensitizer. PMID:26420466
A field method for soil erosion measurements in agricultural and natural lands
Y.P. Hsieh; K.T. Grant; G.C. Bugna
2009-01-01
Soil erosion is one of the most important watershed processes in nature, yet quantifying it under field conditions remains a challenge. The lack of soil erosion field data is a major factor hindering our ability to predict soil erosion in a watershed. We present here the development of a simple and sensitive field method that quantifies soil erosion and the resulting...
Matthew D. Hurteau; Timothy A. Robards; Donald Stevens; David Saah; Malcolm North; George W. Koch
2014-01-01
Quantifying the impacts of changing climatic conditions on forest growth is integral to estimating future forest carbon balance. We used a growth-and-yield model, modified for climate sensitivity, to quantify the effects of altered climate on mixed-conifer forest growth in the Lake Tahoe Basin, California. Estimates of forest growth and live tree carbon stocks were...
Quantifying landscape-level methane fluxes in subarctic Finland using a multiscale approach.
Hartley, Iain P; Hill, Timothy C; Wade, Thomas J; Clement, Robert J; Moncrieff, John B; Prieto-Blanco, Ana; Disney, Mathias I; Huntley, Brian; Williams, Mathew; Howden, Nicholas J K; Wookey, Philip A; Baxter, Robert
2015-10-01
Quantifying landscape-scale methane (CH4) fluxes from boreal and arctic regions, and determining how they are controlled, is critical for predicting the magnitude of any CH4 emission feedback to climate change. Furthermore, there remains uncertainty regarding the relative importance of small areas of strong methanogenic activity, vs. larger areas with net CH4 uptake, in controlling landscape-level fluxes. We measured CH4 fluxes from multiple microtopographical subunits (sedge-dominated lawns, interhummocks and hummocks) within an aapa mire in subarctic Finland, as well as in drier ecosystems present in the wider landscape, lichen heath and mountain birch forest. An intercomparison was carried out between static chamber fluxes, up-scaled using a high-resolution landcover map derived from aerial photography, and eddy covariance measurements. Strong agreement was observed between the two methodologies, with emission rates greatest in lawns. CH4 fluxes from lawns were strongly related to seasonal fluctuations in temperature, but their floating nature meant that water-table depth was not a key factor in controlling CH4 release. In contrast, chamber measurements identified net CH4 uptake in birch forest soils. An intercomparison between the aerial photography and satellite remote sensing demonstrated that quantifying the distribution of the key CH4 emitting and consuming plant communities was possible from satellite, allowing fluxes to be scaled up to a 100 km(2) area. For the full growing season (May to October), ~ 1.1-1.4 g CH4 m(-2) was released across the 100 km(2) area. This was based on up-scaled lawn emissions of 1.2-1.5 g CH4 m(-2), vs. an up-scaled uptake of 0.07-0.15 g CH4 m(-2) by the wider landscape. Given the strong temperature sensitivity of the dominant lawn fluxes, and the fact that lawns are unlikely to dry out, climate warming may substantially increase CH4 emissions in northern Finland, and in aapa mire regions in general. © 2015 The Authors. Global Change Biology Published by John Wiley & Sons Ltd.
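The up-scaling arithmetic is area weighting of plot-level fluxes by land-cover fraction; the cover fractions and fluxes below are illustrative, not the study's values.

```python
# Area-weighted up-scaling of chamber CH4 fluxes to the landscape, the basic
# arithmetic behind a landscape budget. Cover fractions and fluxes are invented.
landscape = {
    # cover class: (fractional area, seasonal flux in g CH4 m^-2; + emission, - uptake)
    "sedge lawn":        (0.10, 1.3),
    "interhummock":      (0.15, 0.15),
    "hummock":           (0.15, 0.02),
    "lichen heath":      (0.30, -0.05),
    "mtn birch forest":  (0.30, -0.10),
}

assert abs(sum(frac for frac, _ in landscape.values()) - 1.0) < 1e-9
landscape_flux = sum(frac * flux for frac, flux in landscape.values())
print(f"landscape flux ~ {landscape_flux:.2f} g CH4 m^-2 per season")
```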
SCALE 6.2 Continuous-Energy TSUNAMI-3D Capabilities
DOE Office of Scientific and Technical Information (OSTI.GOV)
Perfetti, Christopher M; Rearden, Bradley T
2015-01-01
The TSUNAMI (Tools for Sensitivity and UNcertainty Analysis Methodology Implementation) capabilities within the SCALE code system make use of sensitivity coefficients for an extensive number of criticality safety applications, such as quantifying the data-induced uncertainty in the eigenvalue of critical systems, assessing the neutronic similarity between different systems, quantifying computational biases, and guiding nuclear data adjustment studies. The need to model geometrically complex systems with improved ease of use and fidelity and the desire to extend TSUNAMI analysis to advanced applications have motivated the development of a SCALE 6.2 module for calculating sensitivity coefficients using three-dimensional (3D) continuous-energy (CE) Monte Carlo methods: CE TSUNAMI-3D. This paper provides an overview of the theory, implementation, and capabilities of the CE TSUNAMI-3D sensitivity analysis methods. CE TSUNAMI contains two methods for calculating sensitivity coefficients in eigenvalue sensitivity applications: (1) the Iterated Fission Probability (IFP) method and (2) the Contributon-Linked eigenvalue sensitivity/Uncertainty estimation via Track length importance CHaracterization (CLUTCH) method. This work also presents the GEneralized Adjoint Response in Monte Carlo method (GEAR-MC), a first-of-its-kind approach for calculating adjoint-weighted, generalized response sensitivity coefficients, such as flux responses or reaction rate ratios, in CE Monte Carlo applications. The accuracy and efficiency of the CE TSUNAMI-3D eigenvalue sensitivity methods are assessed from a user perspective in a companion publication, and the accuracy and features of the CE TSUNAMI-3D GEAR-MC methods are detailed in this paper.
NASA Technical Reports Server (NTRS)
Wilson, D. J.
1972-01-01
Time-dependent notch sensitivity of Inconel 718 sheet occurred at 900 to 1200 F when notched specimens were loaded below the yield strength, and tests on smooth specimens showed that small amounts of creep consumed large fractions of creep-rupture life. The severity of the notch sensitivity decreased with decreasing solution treatment temperature and increasing time and/or temperature of the aging treatment. Elimination of the notch sensitivity was correlated with a change in the dislocation mechanism from shearing to by-passing precipitate particles.
Metabolic stoichiometry and the fate of excess carbon and nutrients in consumers.
Anderson, Thomas R; Hessen, Dag O; Elser, James J; Urabe, Jotaro
2005-01-01
Animals encountering nutritionally imbalanced foods should release elements in excess of requirements in order to maintain overall homeostasis. Quantifying these excesses and predicting their fate is, however, problematic. A new model of the stoichiometry of consumers is formulated that incorporates the separate terms in the metabolic budget, namely, assimilation of ingested substrates and associated costs, protein turnover, other basal costs, such as osmoregulation, and the use of remaining substrates for production. The model indicates that release of excess C and nonlimiting nutrients may often be a significant fraction of the total metabolic budget of animals consuming the nutrient-deficient forages that are common in terrestrial and aquatic systems. The cost of maintenance, in terms of not just C but also N and P, is considerable, such that food quality is important even when intake is low. Many generalist consumers experience short-term and unpredictable fluctuations in their diets. Comparison of model output with data for one such consumer, Daphnia, indicates that mechanisms operating postabsorption in the gut are likely the primary means of regulating excess C, N, and P in these organisms, notably respiration decoupled from biochemical or mechanical work and excretion of carbon and nutrients. This stoichiometrically regulated release may often be in organic rather than inorganic form, with important consequences for the balance of autotrophic and heterotrophic processes in ecosystems.
Role of Young Child Formulae and Supplements to Ensure Nutritional Adequacy in U.K. Young Children
Vieux, Florent; Brouzes, Chloé M. C.; Maillot, Matthieu; Briend, André; Hankard, Régis; Lluch, Anne; Darmon, Nicole
2016-01-01
The European Food Safety Authority (EFSA) states that young child formulae (YCFs) “cannot be considered as a necessity to satisfy the nutritional requirements” of children aged 12–36 months. This study quantifies the dietary changes needed to ensure nutritional adequacy in U.K. young children who consume YCFs and/or supplements and in those who do not. Dietary data from 1147 young children (aged 12–18 months) were used to identify, using linear programming models, the minimum changes needed to ensure nutritional adequacy: (i) by changing the quantities of foods initially consumed by each child (repertoire-foods); and (ii) by introducing new foods (non-repertoire-foods). Most of the children consumed neither YCFs nor supplements (61.6%). Nutritional adequacy with repertoire-foods alone was ensured for only one child in this group, against 74.4% of the children consuming YCFs and supplements. When access to all foods was allowed, smaller food changes were required when YCFs and supplements were initially consumed than when they were not. In the total sample, the main dietary shifts needed to ensure nutritional adequacy were an increase in YCF and a decrease in cow’s milk (+226 g/day and −181 g/day, respectively). Increasing YCF and supplement consumption was the shortest way to cover the EFSA nutrient requirements of U.K. children. PMID:27598195
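The linear programming step can be sketched as minimizing the total change from an observed diet subject to nutrient constraints; the foods, nutrient contents and requirements below are invented, and the study's actual models are considerably more detailed.

```python
import numpy as np
from scipy.optimize import linprog

# Minimal diet-optimization sketch: find the smallest total change (in g/day)
# from a child's observed diet that meets nutrient requirements. Foods,
# nutrient contents and requirements are invented for illustration.
foods = ["cow's milk", "YCF", "cereal", "fruit"]
observed = np.array([300.0, 0.0, 40.0, 80.0])             # g/day
# nutrient content per g of food (rows: iron mg, vitamin D ug, energy kcal)
content = np.array([[0.0005, 0.012, 0.003, 0.002],
                    [0.0000, 0.017, 0.000, 0.000],
                    [0.65,   0.70,  3.8,   0.55]])
required = np.array([7.0, 10.0, 900.0])                    # daily requirements

n = len(foods)
# Variables: [x_1..x_n, d_1..d_n] with d_i >= |x_i - observed_i|.
c = np.concatenate([np.zeros(n), np.ones(n)])              # minimise total deviation
A_ub = np.vstack([
    np.hstack([-content, np.zeros_like(content)]),         # content @ x >= required
    np.hstack([ np.eye(n), -np.eye(n)]),                   #  x - d <=  observed
    np.hstack([-np.eye(n), -np.eye(n)]),                   # -x - d <= -observed
])
b_ub = np.concatenate([-required, observed, -observed])
res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * (2 * n), method="highs")
for name, obs, new in zip(foods, observed, res.x[:n]):
    print(f"{name:>10}: {obs:6.0f} -> {new:6.0f} g/day")
```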
Tao, Mengya; Li, Dingsheng; Song, Runsheng; Suh, Sangwon; Keller, Arturo A
2018-03-01
Chemicals in consumer products have become the focus of recent regulatory developments including California's Safer Consumer Products Act. However, quantifying the amount of chemicals released during the use and post-use phases of consumer products is challenging, limiting the ability to understand their impacts. Here we present a comprehensive framework, OrganoRelease, for estimating the release of organic chemicals from the use and post-use of consumer products given limited information. First, a novel Chemical Functional Use Classifier estimates functional uses based on chemical structure. Second, the quantity of chemicals entering different product streams is estimated based on market share data of the chemical functional uses. Third, chemical releases are estimated based on either chemical product categories or functional uses by using the Specific Environmental Release Categories and EU Technological Guidance Documents. OrganoRelease connects 19 unique functional uses and 14 product categories across 4 data sources and provides multiple pathways for chemical release estimation. Available user information can be incorporated in the framework at various stages. The Chemical Functional Use Classifier achieved an average accuracy above 84% for nine functional uses, which enables OrganoRelease to provide release estimates for a chemical using, in most cases, only its molecular structure. The results can be used as input for methods estimating environmental fate and exposure. Copyright © 2017 Elsevier Ltd. All rights reserved.
The Value of Distributed Solar Electric Generation to San Antonio
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jones, Nic; Norris, Ben; Meyer, Lisa
2013-02-14
This report presents an analysis of value provided by grid-connected, distributed PV in San Antonio from a utility perspective. The study quantified six value components, summarized in Table ES-1. These components represent the benefits that accrue to the utility, CPS Energy, in accepting solar onto the grid. This analysis does not treat the compensation of value, policy objectives, or cost-effectiveness from the retail consumer perspective.
Quantifying area changes of internationally important wetlands due to water consumption in LCA.
Verones, Francesca; Pfister, Stephan; Hellweg, Stefanie
2013-09-03
Wetlands harbor diverse species assemblages but are among the world's most threatened ecosystems. Half of their global area was lost during the last century. No approach currently exists in life cycle impact assessment that acknowledges the vulnerability and importance of wetlands globally and provides fate factors for water consumption. We use data from 1184 inland wetlands, all designated as sites of international importance under the Ramsar Convention, to develop regionalized fate factors (FF) for consumptive water use. FFs quantify the change of wetland area caused per m(3)/yr water consumed. We distinguish between surface water-fed and groundwater-fed wetlands and develop FFs for surface water and groundwater consumption. FFs vary over 8 (surface water-fed) and 6 (groundwater-fed) orders of magnitude as a function of the site characteristics, showing the importance of local conditions. Largest FFs for surface water-fed wetlands generally occur in hyper-arid zones and smallest in humid zones, highlighting the dependency on available surface water flows. FFs for groundwater-fed wetlands depend on hydrogeological conditions and vary largely with the total amount of water consumed from the aquifer. Our FFs translate water consumption into wetland area loss and thus become compatible with life cycle assessment methodologies of land use.
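The unit logic of a fate factor, area change per unit of consumptive water use, can be sketched as follows; the linear area-inflow assumption and all numbers are purely illustrative, whereas the paper derives FFs from site-specific hydrological conditions.

```python
# Fate factor (FF) sketch: wetland area change per unit of water consumption,
# here approximated as a linear share of the wetland's water balance. The
# numbers and the linearity assumption are illustrative only.
wetland_area_m2 = 4.0e6            # 400 ha surface-water-fed wetland
annual_inflow_m3_per_yr = 2.0e7    # surface water sustaining the wetland

# Assume area scales with available inflow: dA/dQ ~= A / Q_inflow.
fate_factor = wetland_area_m2 / annual_inflow_m3_per_yr     # m^2 per (m^3/yr)
water_consumed_m3_per_yr = 5.0e4                             # upstream consumptive use
print(f"FF = {fate_factor:.2f} m^2 per (m^3/yr); "
      f"estimated area loss = {fate_factor * water_consumed_m3_per_yr:.0f} m^2")
```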
Degradation of specific aromatic compounds migrating from PEX pipes into drinking water.
Ryssel, Sune Thyge; Arvin, Erik; Lützhøft, Hans-Christian Holten; Olsson, Mikael Emil; Procházková, Zuzana; Albrechtsen, Hans-Jørgen
2015-09-15
Nine specific compounds identified as migrating from polyethylene (PE) and cross-linked polyethylene (PEX) to drinking water were investigated for their degradation in drinking water. Three sample types were studied: field samples (collected at consumer taps), PEX pipe water extractions, and water samples spiked with target compounds. Four compounds were quantified in field samples at concentrations of 0.15-8.0 μg/L. During PEX pipe water extraction, 0.42 ± 0.20 mg NVOC/L was released and five compounds were quantified (0.5-6.1 μg/L). The degradation of these compounds was evaluated in PEX-pipe water extractions and spiked samples. 4-ethylphenol was degraded within 22 days. Eight compounds were, however, only partially degradable under abiotic and biotic conditions within the timeframe of the experiments (2-4 weeks). Neither inhibition nor co-metabolism was observed in the presence of acetate or PEX pipe derived NVOC. Furthermore, the degradation in drinking water from four different locations with three different water works was similar. In conclusion, eight out of the nine compounds studied would, if released from the pipes, reach consumers with only a minor decrease in concentration during water distribution. Copyright © 2015 Elsevier Ltd. All rights reserved.
In-use measurement of activity, energy use, and emissions of a plug-in hybrid electric vehicle.
Graver, Brandon M; Frey, H Christopher; Choi, Hyung-Wook
2011-10-15
Plug-in hybrid electric vehicles (PHEVs) could reduce transportation air emissions and energy use. However, a method is needed for estimating on-road emissions of PHEVs. To develop a framework for quantifying microscale energy use and emissions (EU&E), measurements were conducted on a Toyota Prius retrofitted with a plug-in battery system on eight routes. Measurements were made using the following: (1) a data logger for the hybrid control system; (2) a portable emissions measurement system; and (3) a global positioning system with barometric altimeter. Trends in EU&E are estimated based on vehicle specific power. Energy economy is quantified based on gasoline consumed by the engine and grid energy consumed by the plug-in battery. Emissions from electricity consumption are estimated based on the power generation mix. Fuel use is approximately 30% lower during plug-in battery use. Grid emissions were higher for CO₂, NO(x), SO₂, and PM compared to tailpipe emissions but lower for CO and hydrocarbons. EU&E depends on engine and plug-in battery operation. The use of two energy sources must be addressed in characterizing fuel economy; overall energy economy is 11% lower if including grid energy use than accounting only for fuel consumption.
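Combining the two energy sources into one energy-economy figure amounts to converting gasoline and grid electricity to a common energy unit; the trip values and heating value below are assumptions for illustration, not measurements from the study.

```python
# Combining the two energy sources of a PHEV into a single energy-economy
# figure, as opposed to reporting fuel consumption alone. Trip values invented.
trip_km = 40.0
gasoline_l = 1.1                    # engine fuel used on the trip
grid_kwh = 3.2                      # plug-in battery energy drawn from the grid

MJ_PER_L_GASOLINE = 32.0            # approximate lower heating value of gasoline
MJ_PER_KWH = 3.6

fuel_only_mj_per_km = gasoline_l * MJ_PER_L_GASOLINE / trip_km
total_mj_per_km = (gasoline_l * MJ_PER_L_GASOLINE + grid_kwh * MJ_PER_KWH) / trip_km
print(f"fuel-only:   {fuel_only_mj_per_km:.2f} MJ/km")
print(f"fuel + grid: {total_mj_per_km:.2f} MJ/km "
      f"({100 * (total_mj_per_km / fuel_only_mj_per_km - 1):.0f}% higher)")
```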
Salvo, Alberto; Brito, Joel; Artaxo, Paulo; Geiger, Franz M
2017-07-18
Despite ethanol's penetration into urban transportation, observational evidence quantifying the consequence for the atmospheric particulate burden during actual, not hypothetical, fuel-fleet shifts has been lacking. Here we analyze aerosol, meteorological, traffic, and consumer behavior data and find, empirically, that ambient number concentrations of 7-100-nm diameter particles rise by one-third during the morning commute when higher ethanol prices induce 2 million drivers in the real-world megacity of São Paulo to substitute to gasoline use (95% confidence intervals: +4,154 to +13,272 cm-3). Similarly, concentrations fall when consumers return to ethanol. Changes in larger particle concentrations, including US-regulated PM2.5, are statistically indistinguishable from zero. The prospect of increased biofuel use and mounting evidence on ultrafines' health effects make our result acutely policy relevant, to be weighed against possible ozone increases. The finding motivates further studies in real-world environments. We innovate in using econometrics to quantify a key source of urban ultrafine particles. The biofuel ethanol has been introduced into urban transportation in many countries. Here, by measuring aerosols in São Paulo, the authors find that high ethanol prices coincided with an increase in harmful nanoparticles by a third, as drivers switched from ethanol to cheaper gasoline, showing a benefit of ethanol.
Kim, Min Geun; Alçiçek, Zayde; Balaban, Murat O; Atar, Hasan Huseyin
2014-04-01
Aquacultured green lipped mussel (Perna canaliculus) is the New Zealand export leader of seafood in terms of weight. Different treatments shrink mussel meat differently and affect the consumer perception of half-shelled mussels. In order to quantify this, digital images of half-shelled green lipped mussels subjected to two postharvest treatments (ultrahigh pressure (UHP) and heat treatment (HT)) and raw controls were taken. The ratio of the view area of the meat to that of the shell (labelled as 'visual condition index' (VCI)) was measured using image analysis. A polygonal region of interest was defined on the image to depict the boundary of the meat and to calculate the view area. Raw mussels had a VCI of 85%. HT mussels had a much reduced VCI of 41%, indicating shrinkage of the meat due to heat. UHP treatment used as a shucking method resulted in a VCI of 83%. Since VCI is one measure of quality for the consumer, this quantitative method can be used in the optimization of shucking treatment (HT or UHP). VCI can be used to optimize postharvest treatments to minimize meat shrinkage. This method can also be applied to other shellfish such as oysters and clams. © 2013 Society of Chemical Industry.
Why climate change will invariably alter selection pressures on phenology.
Gienapp, Phillip; Reed, Thomas E; Visser, Marcel E
2014-10-22
The seasonal timing of lifecycle events is closely linked to individual fitness and hence, maladaptation in phenological traits may impact population dynamics. However, few studies have analysed whether and why climate change will alter selection pressures and hence possibly induce maladaptation in phenology. To fill this gap, we here use a theoretical modelling approach. In our models, the phenologies of consumer and resource are (potentially) environmentally sensitive and depend on two different but correlated environmental variables. Fitness of the consumer depends on the phenological match with the resource. Because we explicitly model the dependence of the phenologies on environmental variables, we can test how differential (heterogeneous) versus equal (homogeneous) rates of change in the environmental variables affect selection on consumer phenology. As expected, under heterogeneous change, phenotypic plasticity is insufficient and thus selection on consumer phenology arises. However, even homogeneous change leads to directional selection on consumer phenology. This is because the consumer reaction norm has historically evolved to be flatter than the resource reaction norm, owing to time lags and imperfect cue reliability. Climate change will therefore lead to increased selection on consumer phenology across a broad range of situations. © 2014 The Author(s) Published by the Royal Society. All rights reserved.
77 FR 15757 - Agency Information Collection Activities; Submission for OMB Review; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2012-03-16
... require sellers of consumer commodities to keep records that substantiate ``cents off,'' ``introductory... competitively sensitive information such as costs, sales statistics, inventories, formulas, patterns devices...
Assessing risks to non-target species during poison baiting programs for feral cats.
Buckmaster, Tony; Dickman, Christopher R; Johnston, Michael J
2014-01-01
Poison baiting is used frequently to reduce the impacts of pest species of mammals on agricultural and biodiversity interests. However, baiting may not be appropriate if non-target species are at risk of poisoning. Here we use a desktop decision tree approach to assess the risks to non-target vertebrate species in Australia that arise from using poison baits developed to control feral house cats (Felis catus). These baits are presented in the form of sausages with toxicant implanted in the bait medium within an acid-soluble polymer capsule (hard shell delivery vehicle, or HSDV) that disintegrates after ingestion. Using criteria based on body size, diet and feeding behaviour, we assessed 221 of Australia's 3,769 native vertebrate species as likely to consume cat-baits, with 47 of these likely to ingest implanted HSDVs too. Carnivorous marsupials were judged most likely to consume both the baits and HSDVs, with some large-bodied and ground-active birds and reptiles also consuming them. If criteria were relaxed, a further 269 species were assessed as possibly able to consume baits and 343 as possibly able to consume HSDVs; most of these consumers were birds. One threatened species, the Tasmanian devil (Sarcophilus harrisii) was judged as definitely able to consume baits with implanted HSDVs, whereas five threatened species of birds and 21 species of threatened mammals were rated as possible consumers. Amphibia were not considered to be at risk. We conclude that most species of native Australian vertebrates would not consume surface-laid baits during feral cat control programs, and that significantly fewer would be exposed to poisoning if HSDVs were employed. However, risks to susceptible species should be quantified in field or pen trials prior to the implementation of a control program, and minimized further by applying baits at times and in places where non-target species have little access.
Dermal safety assessment of Arm & Hammer laundry products formulated for sensitive skin.
Frederick, Douglas M; Vorwerk, Linda; Gupta, Archana; Ghassemi, Annahita
2017-09-01
The prevalence of sensitive skin among the general population in industrialized countries is reported to be over 50%. Sensitive skin subjects often report significant reactions to contact with cosmetics, soaps and other consumer products. This paper describes the overall skin compatibility and mildness program for a newly developed, lightly fragranced, colorant free laundry product (i.e. Arm & Hammer™ Sensitive Skin plus Skin-Friendly Fresh Scent), specially formulated for individuals with sensitive skin. The skin mildness of the product was compared to Arm & Hammer™ Free & Clear liquid laundry detergent with no fragrance or colorant, and an established history of safe use by sensitive skin consumers. The test material was a liquid laundry product with a light scent formulated for sensitive skin consumers (Arm & Hammer™ Sensitive Skin plus Skin-Friendly Fresh Scent). The product was compared to commercially marketed products for sensitive skin with a history of skin safety in the marketplace, including: a very similar product formulation (Arm & Hammer™ Free & Clear with no fragrance), and several selected competitors' products. Studies were conducted among individuals with self-assessed sensitive skin (based on a questionnaire) using standard protocols for the Human Repeat Insult Patch Test (HRIPT), 10-Day Cumulative Irritation, the Wrist Band Wear test, and the Safety In-Use testing. Responses in all protocols were evaluated by visual scoring of potential dermatologic reactions, and recording any sensory effects at the time of the examination. In addition, sensory effects collected from panelists' daily diaries were also evaluated. The HRIPT confirmed that neither the fragrance alone, nor the product formulation with fragrance, induced contact sensitization in sensitive skin subjects. The 10-Day cumulative irritation study conducted using sensitive skin subjects showed highly favorable skin compatibility, and the test product was comparable to the control product (Arm & Hammer Free & Clear) and other nonirritant controls. In the Wrist Band Wear test, exposure to laundered fabrics under exaggerated conditions gave similar results for the test and control products, with no objective signs of skin irritation, and no self-reported persistent adverse sensory effects. Very mild, transient and isolated sensory effects were noted in daily diaries by a small proportion of subjects, and were similar for the test and control products. The Safety In-Use tests evaluated 4-week exposure to product and laundered fabrics under realistic use conditions. There were no clinically objective signs of skin irritation, and reports of transitory, mild sensory effects were minimal and similar for the test and controls. A comprehensive skin safety program on a lightly scented sensitive skin laundry formulation (i.e. Arm & Hammer™ Sensitive Skin plus Skin-Friendly Fresh Scent) conducted among panels of self-assessed sensitive skin subjects demonstrated that the presence of a light fragrance did not adversely impact skin compatibility in any of the testing protocols when the product was compared to a similar product with no fragrance. The lightly fragranced product demonstrated overall skin compatibility and mildness when tested in a self-assessed sensitive skin population, and compared favorably to currently marketed sensitive skin products.
Computational Modelling and Optimal Control of Ebola Virus Disease with non-Linear Incidence Rate
NASA Astrophysics Data System (ADS)
Takaidza, I.; Makinde, O. D.; Okosun, O. K.
2017-03-01
The 2014 Ebola outbreak in West Africa has exposed the need to connect modellers and those with relevant data as pivotal to better understanding of how the disease spreads and quantifying the effects of possible interventions. In this paper, we model and analyse the Ebola virus disease with a non-linear incidence rate. The epidemic model created is used to describe how the Ebola virus could potentially evolve in a population. We perform an uncertainty analysis of the basic reproductive number R0 to quantify its sensitivity to other disease-related parameters. We also analyse the sensitivity of the final epidemic size to the time control interventions (education, vaccination, quarantine and safe handling) and provide the cost-effective combination of the interventions.
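One common way to quantify such parameter sensitivity is the normalized forward sensitivity index, estimated here by finite differences for an assumed simple R0 expression; this stand-in formula is not the model analysed in the paper.

```python
# Normalised forward sensitivity index of R0 with respect to each parameter,
# S_p = (p / R0) * dR0/dp, estimated by central finite differences. The R0
# expression below is an assumed SIR-with-demography form used purely for
# illustration.
def r0(params):
    beta, gamma, mu = params["beta"], params["gamma"], params["mu"]
    return beta / (gamma + mu)

baseline = {"beta": 0.35, "gamma": 0.10, "mu": 0.02}

def sensitivity_index(param, h=1e-6):
    up, down = dict(baseline), dict(baseline)
    up[param] += h
    down[param] -= h
    dr0_dp = (r0(up) - r0(down)) / (2 * h)
    return baseline[param] / r0(baseline) * dr0_dp

for p in baseline:
    print(f"S_{p} = {sensitivity_index(p):+.3f}")
```

For this illustrative form the indices come out as +1 for beta and negative for gamma and mu, i.e. R0 rises proportionally with transmission and falls as removal rates increase.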
Cryptic herbivores mediate the strength and form of ungulate impacts on a long-lived savanna tree.
Maclean, Janet E; Goheen, Jacob R; Doak, Daniel F; Palmer, Todd M; Young, Truman P
2011-08-01
Plant populations are regulated by a diverse array of herbivores that impose demographic filters throughout their life cycle. Few studies, however, simultaneously quantify the impacts of multiple herbivore guilds on the lifetime performance or population growth rate of plants. In African savannas, large ungulates (such as elephants) are widely regarded as important drivers of woody plant population dynamics, while the potential impacts of smaller, more cryptic herbivores (such as rodents) have largely been ignored. We combined a large-scale ungulate exclusion experiment with a five-year manipulation of rodent densities to quantify the impacts of three herbivore guilds (wild ungulates, domestic cattle, and rodents) on all life stages of a widespread savanna tree. We utilized demographic modeling to reveal the overall role of each guild in regulating tree population dynamics, and to elucidate the importance of different demographic hurdles in driving population growth under contrasting consumer communities. We found that wild ungulates dramatically reduced population growth, shifting the population trajectory from increase to decline, but that the mechanisms driving these effects were strongly mediated by rodents. The impact of wild ungulates on population growth was predominantly driven by their negative effect on tree reproduction when rodents were excluded, and on adult tree survival when rodents were present. By limiting seedling survival, rodents also reduced population growth; however, this effect was strongly dampened where wild ungulates were present. We suggest that these complex interactions between disparate consumer guilds can have important consequences for the population demography of long-lived species, and that the effects of a single consumer group are often likely to vary dramatically depending on the larger community in which interactions are embedded.
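The demographic modelling referred to here typically reduces to computing the dominant eigenvalue of a stage-structured projection matrix; the sketch below uses invented transition rates chosen only to mimic the qualitative shift from population growth to decline, not the study's estimates.

```python
import numpy as np

# Stage-structured (Lefkovitch) matrix model of a long-lived tree: population
# growth rate lambda is the dominant eigenvalue of the projection matrix.
def lam(matrix):
    return float(np.max(np.real(np.linalg.eigvals(matrix))))

#                         seedling  sapling  adult
no_ungulates = np.array([[0.20,     0.00,    6.0 ],   # reproduction -> seedlings
                         [0.08,     0.55,    0.0 ],   # growth / stasis
                         [0.00,     0.10,    0.97]])  # survival of adults

with_ungulates = no_ungulates.copy()
with_ungulates[0, 2] = 1.5      # reduced reproduction under browsing
with_ungulates[2, 2] = 0.93     # reduced adult survival

print("lambda without wild ungulates:", round(lam(no_ungulates), 3))
print("lambda with wild ungulates:   ", round(lam(with_ungulates), 3))
```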
NASA Astrophysics Data System (ADS)
Belfiore, Laurence A.; Volpato, Fabio Z.; Paulino, Alexandre T.; Belfiore, Carol J.
2011-12-01
The primary objective of this investigation is to establish guidelines for generating significant mammalian cell density in suspension bioreactors when stress-sensitive kinetics enhance the rate of nutrient consumption. Ultra-low-frequency dynamic modulations of the impeller (i.e., 35104 Hz) introduce time-dependent oscillatory shear into this transient analysis of cell proliferation under semi-continuous creeping flow conditions. Greater nutrient consumption is predicted when the amplitude
Development of a time sensitivity score for frequently occurring motor vehicle crash injuries.
Schoell, Samantha L; Doud, Andrea N; Weaver, Ashley A; Talton, Jennifer W; Barnard, Ryan T; Martin, R Shayn; Meredith, J Wayne; Stitzel, Joel D
2015-03-01
Injury severity alone is a poor indicator of the time sensitivity of injuries. The purpose of the study was to quantify the urgency with which the most frequent motor vehicle crash injuries require treatment, according to expert physicians. The time sensitivity was quantified for the top 95% most frequently occurring Abbreviated Injury Scale (AIS) 2+ injuries in the National Automotive Sampling System-Crashworthiness Data System (NASS-CDS) 2000-2011. A Time Sensitivity Score was developed using expert physician survey data in which physicians were asked to determine whether a particular injury should go to a Level I/II trauma center and the urgency with which that injury required treatment. When stratifying by AIS severity, the mean Time Sensitivity Score increased with increasing AIS severity. The mean Time Sensitivity Scores by AIS severity were as follows: 0.50 (AIS 2); 0.78 (AIS 3); 0.92 (AIS 4); 0.97 (AIS 5); and 0.97 (AIS 6). When stratifying by anatomical region, the head, thorax, and abdomen were the most time sensitive. Appropriate triage depends on multiple factors, including the severity of an injury, the urgency with which it requires treatment, and the propensity of a significant injury to be missed. The Time Sensitivity Score did not correlate highly with the widely used AIS severity scores, which highlights the inability of AIS scores to capture all aspects of injury severity. The Time Sensitivity Score can be useful in Advanced Automatic Crash Notification systems for identifying highly time sensitive injuries in motor vehicle crashes requiring prompt treatment at a trauma center. Copyright © 2015 American College of Surgeons. Published by Elsevier Inc. All rights reserved.
Ouabain-sensitive component of brown fat thermogenesis.
NASA Technical Reports Server (NTRS)
Horwitz, B. A.
1973-01-01
The study discussed was undertaken to quantify the amount of energy utilized by the ouabain-sensitive Na(+)-K(+) membrane pump during the norepinephrine-induced thermogenesis of brown adipose tissue. The data obtained indicate that the observed inhibition of the catecholamine-induced increase in brown fat thermogenesis by ouabain does not reflect an inhibition of cyclic AMP synthesis.
Relevance of sensitization to occupational allergy and asthma in the detergent industry.
Basketter, David; Berg, Ninna; Kruszewski, Francis H; Sarlo, Katherine; Concoby, Beth
2012-01-01
There exists considerable historic experience of the relationship between exposure and both the induction of sensitization and the elicitation of respiratory symptoms from industrial enzymes of bacterial and fungal origin used in a wide variety of detergent products. The detergent industry in particular has substantial experience of how the control of exposure leads to limitation of sensitization with low risk of symptoms. However, the experience also shows that there are substantial gaps in knowledge, even when the potential occupational allergy problem is firmly under control, and also that the relationship between exposure and sensitization can be hard to establish. The latter aspect includes a poor appreciation of how peak exposures and low levels of exposure over time contribute to sensitization. Furthermore, while a minority of workers develop specific IgE, essentially none appear to have symptoms, a situation which appears to contradict the allergy dogma that, once sensitized, an individual will react to much lower levels of exposure. For enzymes, the expression of symptoms occurs at similar or higher levels than those that cause induction. In spite of some knowledge gaps, medical surveillance programs and constant air monitoring provide the tools for successful management of enzymes in the occupational setting. Ultimately, the knowledge gained from the occupational setting facilitates the completion of safety assessments for consumer exposure to detergent enzymes. Such assessments have been proven to be correct by the decades of safe use both occupationally and in consumer products.
Ferguson, Ty; Rowlands, Alex V; Olds, Tim; Maher, Carol
2015-03-27
Technological advances have seen a burgeoning industry for accelerometer-based wearable activity monitors targeted at the consumer market. The purpose of this study was to determine the convergent validity of a selection of consumer-level accelerometer-based activity monitors. Twenty-one healthy adults wore seven consumer-level activity monitors (Fitbit One, Fitbit Zip, Jawbone UP, Misfit Shine, Nike Fuelband, Striiv Smart Pedometer and Withings Pulse) and two research-grade accelerometers/multi-sensor devices (BodyMedia SenseWear and ActiGraph GT3X+) for 48 hours. Participants went about their daily life in free-living conditions during data collection. The validity of the consumer-level activity monitors relative to the research devices for step count, moderate to vigorous physical activity (MVPA), sleep and total daily energy expenditure (TDEE) was quantified using Bland-Altman analysis, median absolute difference and Pearson's correlation. All consumer-level activity monitors correlated strongly (r > 0.8) with research-grade devices for step count and sleep time, but only moderately-to-strongly for TDEE (r = 0.74-0.81) and MVPA (r = 0.52-0.91). Median absolute differences were generally modest for sleep and steps (<10% of research device mean values for the majority of devices), moderate for TDEE (<30% of research device mean values), and large for MVPA (26-298%). Across the constructs examined, the Fitbit One, Fitbit Zip and Withings Pulse performed most strongly. In free-living conditions, the consumer-level activity monitors showed strong validity for the measurement of steps and sleep duration, and moderate validity for the measurement of TDEE and MVPA. Validity for each construct ranged widely between devices, with the Fitbit One, Fitbit Zip and Withings Pulse being the strongest performers.
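The validity statistics named above can be reproduced in a few lines; the following Python sketch computes the Bland-Altman bias and limits of agreement, the median absolute difference, and Pearson's r for hypothetical step counts from one consumer monitor against a research device. The numbers are invented for illustration.

```python
# Minimal sketch of the validity statistics named in the abstract (Bland-Altman
# bias and limits of agreement, median absolute difference, Pearson r), applied
# to hypothetical step counts from a consumer monitor and a research device.
import numpy as np

consumer = np.array([10250, 8300, 12110, 6020, 9480], dtype=float)
research = np.array([10010, 8520, 11890, 6240, 9350], dtype=float)

diff = consumer - research
bias = diff.mean()                          # Bland-Altman mean difference
loa = (bias - 1.96 * diff.std(ddof=1),      # 95% limits of agreement
       bias + 1.96 * diff.std(ddof=1))
mad = np.median(np.abs(diff))               # median absolute difference
r = np.corrcoef(consumer, research)[0, 1]   # Pearson correlation

print(f"bias={bias:.1f} steps, LoA={loa[0]:.1f} to {loa[1]:.1f}, "
      f"MAD={mad:.1f}, r={r:.3f}")
```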
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lampert, David J.; Cai, Hao; Wang, Zhichao
The production of all forms of energy consumes water. To meet increased energy demands, it is essential to quantify the amount of water consumed in the production of different forms of energy. By analyzing the water consumed in different technologies, it is possible to identify areas for improvement in water conservation and reduce water stress in energy-producing regions. The transportation sector is a major consumer of energy in the United States. Because of the relationships between water and energy, the sustainability of transportation is tied to management of water resources. Assessment of water consumption throughout the life cycle of a fuel is necessary to understand its water resource implications. To perform a comparative life cycle assessment of transportation fuels, it is necessary first to develop an inventory of the water consumed in each process in each production supply chain. The Greenhouse Gases, Regulated Emissions, and Energy Use in Transportation (GREET) model is an analytical tool that can be used to estimate the full life-cycle environmental impacts of various transportation fuel pathways from wells to wheels. GREET is currently being expanded to include water consumption as a sustainability metric. The purpose of this report was to document data sources and methodologies to estimate water consumption factors (WCF) for the various transportation fuel pathways in GREET. WCFs reflect the quantity of freshwater directly consumed per unit production for various production processes in GREET. These factors do not include consumption of precipitation or low-quality water (e.g., seawater) and reflect only water that is consumed (i.e., not returned to the source from which it was withdrawn). The data in the report can be combined with GREET to compare the life cycle water consumption for different transportation fuels.
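As a minimal illustration of how water consumption factors roll up over a fuel pathway, the sketch below sums per-stage WCFs into a well-to-wheels total. The stage names and values are placeholders, not GREET data, and real GREET pathways involve co-product allocation and many more processes.

```python
# Illustrative sketch (not the GREET implementation): summing water consumption
# factors (WCF, litres of freshwater consumed per unit of fuel energy) across
# the stages of a hypothetical supply chain to obtain a well-to-wheels total.
# Stage names and numbers are placeholders.
stages = {
    "feedstock_recovery": 0.8,    # L water per MJ fuel (assumed)
    "feedstock_transport": 0.1,
    "fuel_production": 1.5,
    "fuel_distribution": 0.05,
}

total_wcf = sum(stages.values())
print(f"Well-to-wheels water consumption: {total_wcf:.2f} L per MJ of fuel")
```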
Attitudes and behaviour towards convenience food and food waste in the United Kingdom.
Mallinson, Lucy J; Russell, Jean M; Barker, Margo E
2016-08-01
Households in the UK discard much food. A reduction in such waste to mitigate environmental impact is part of UK government policy. This study investigated whether household food waste is linked to a lifestyle reliant on convenience food in younger consumers. In a survey, 928 UK residents aged 18-40 years who were responsible for the household food shopping (male n = 278; female n = 650) completed an online questionnaire designed to measure attitudes to convenience food and to quantify household food waste. Cluster analysis of 24 food-related lifestyle factors identified five consumer groups. General linear modelling techniques were used to test relationships between the purchase frequency of convenience food and household food waste. From the cluster analysis, five distinct convenience profiles emerged, comprising 'epicures' (n = 135), 'traditional consumers' (n = 255), 'casual consumers' (n = 246), 'food detached consumers' (n = 151) and 'kitchen evaders' (n = 141). Casual consumers and kitchen evaders were the most reliant on convenience food and notably were the most wasteful. The demographic profile of kitchen evaders matched the population groups currently targeted by UK food waste policy. Casual consumers represent a new and distinct group characterised by "buy a lot and waste a lot" behaviour. Household size, packaging format, price-awareness and marketing all appear to influence levels of food waste. However, it seems that subtle behavioural and sociocultural factors also have impact. Further research is needed to elucidate the factors that mediate the positive association between the purchase of convenience food and reported food waste in order to inform food waste policy and initiatives. Copyright © 2016 Elsevier Ltd. All rights reserved.
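The segmentation step could be implemented along the following lines, shown here as a hedged Python sketch: k-means clustering of respondents on standardized lifestyle factors into five groups. The paper does not state which clustering algorithm was used, and the data below are simulated, so this is only one plausible realization of the analysis.

```python
# Hedged sketch of the segmentation step: k-means clustering of respondents on
# standardised food-related lifestyle factors into five groups, as one plausible
# implementation of the cluster analysis described above (the algorithm actually
# used is not stated). Data are simulated.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
X = rng.normal(size=(928, 24))      # 928 respondents x 24 lifestyle factors (simulated)

X_std = StandardScaler().fit_transform(X)
labels = KMeans(n_clusters=5, n_init=10, random_state=0).fit_predict(X_std)

print(np.bincount(labels))          # size of each consumer segment
```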
NASA Astrophysics Data System (ADS)
Piao, Lin; Fu, Zuntao
2016-11-01
Cross-correlation between pairs of variables has a multi-time-scale character, and it can be entirely different on different time scales (changing from positive correlation to negative), e.g., the associations between mean air temperature and relative humidity over regions to the east of the Taihang mountains in China. Correctly unveiling these correlations on different time scales is therefore of great importance, since we do not know in advance whether the correlation varies with scale. Here, we compare two methods, Detrended Cross-Correlation Analysis (DCCA) and Pearson correlation, in quantifying scale-dependent correlations, applied directly to raw observed records and to artificially generated sequences with known cross-correlation features. The studies show that 1) DCCA-related methods can indeed quantify scale-dependent correlations, whereas the Pearson method cannot; 2) the correlation features from DCCA-related methods are robust to contaminating noise, whereas the results from the Pearson method are sensitive to noise; 3) the scale-dependent correlation results from DCCA-related methods are robust to the amplitude ratio between slow and fast components, while the Pearson method may be sensitive to this ratio. All these features indicate that DCCA-related methods offer advantages in correctly quantifying scale-dependent correlations arising from different physical processes.
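For readers unfamiliar with DCCA, the sketch below gives a minimal Python implementation of the detrended cross-correlation coefficient rho(n) (profile integration, windowed linear detrending, covariance of residuals), applied to a synthetic pair of series whose correlation flips sign between small and large scales. It follows the standard construction and is not the authors' code.

```python
# A minimal sketch of detrended cross-correlation analysis (DCCA) and the
# scale-dependent cross-correlation coefficient rho(n); illustration only.
import numpy as np

def dcca_coefficient(x, y, n):
    """rho_DCCA at window size n (n >= 2) for equally long series x, y."""
    x = np.asarray(x, float); y = np.asarray(y, float)
    X = np.cumsum(x - x.mean())          # integrated profiles
    Y = np.cumsum(y - y.mean())
    t = np.arange(n)
    f_xy, f_xx, f_yy = [], [], []
    for start in range(len(X) - n + 1):  # overlapping windows of length n
        xs, ys = X[start:start + n], Y[start:start + n]
        rx = xs - np.polyval(np.polyfit(t, xs, 1), t)   # linear detrending
        ry = ys - np.polyval(np.polyfit(t, ys, 1), t)
        f_xy.append(np.mean(rx * ry))
        f_xx.append(np.mean(rx * rx))
        f_yy.append(np.mean(ry * ry))
    return np.mean(f_xy) / np.sqrt(np.mean(f_xx) * np.mean(f_yy))

# Example: two series that are anti-correlated at small scales (shared fast
# noise with opposite sign) and positively correlated at large scales.
rng = np.random.default_rng(1)
slow = np.cumsum(rng.normal(size=2000)) * 0.01
fast = rng.normal(size=2000)
temp, humid = slow + fast, slow - fast
for n in (4, 16, 64, 256):
    print(n, round(dcca_coefficient(temp, humid, n), 2))
```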
Langer, Swen; Marshall, Lisa J; Day, Andrea J; Morgan, Michael R A
2011-08-10
Intake of flavanols, a subgroup of dietary polyphenols present in many fruits and vegetables, may be associated with health benefits, particularly with reducing the risk of coronary diseases. Cocoa and chocolate products are rich in flavanol monomers, oligomers, and polymers (procyanidins). This study used normal phase HPLC to detect, identify, and quantify epicatechin, catechin, total monomers, procyanidin oligomers and polymers in 14 commercially available chocolate bars. In addition, methylxanthines (theobromine and caffeine) were also quantified. Nonfat cocoa solids (NFCS) were determined both gravimetrically and by calculation from theobromine contents. The flavanol levels of 12 commonly consumed brands of dark chocolate have been quantified and correlated with % theobromine and % NFCS. Epicatechin comprised the largest fraction of total chocolate flavonoids, with the remainder being catechin and procyanidins. Calculated NFCS did not reflect epicatechin (R(2) = 0.41) or total flavanol contents (R(2) = 0.49). Epicatechin (R(2) = 0.96) was a reliable marker of total flavanols, catechin (R(2) = 0.67) to a lesser extent. All dark chocolate tested contained higher levels of total flavanols (93.5-651.1 mg of epicatechin equiv/100 g of product) than a milk or a white "chocolate" (40.6 and 0.0 mg of epicatechin equiv/100 g, respectively). The amount and integrity of procyanidins often suffer in the manufacturing of chocolate, chiefly due to oxidation and alkalinization. In this study, the labeled cocoa content of the chocolate did not always reflect analyzed levels of flavonoids. Increasingly, high % NFCS is being used commercially to reflect chocolate quality. If the flavanol content of chocolate is accepted to be a key determinant of health benefits, then continued monitoring of flavanol levels in commercially available chocolate products may be essential for consumer assurance.
Calculating the optimum temperature for serving hot beverages.
Brown, Fredericka; Diller, Kenneth R
2008-08-01
Hot beverages such as tea, hot chocolate, and coffee are frequently served at temperatures between 160 degrees F (71.1 degrees C) and 185 degrees F (85 degrees C). Brief exposures to liquids in this temperature range can cause significant scald burns. However, hot beverages must be served at a temperature that is high enough to provide a satisfactory sensation to the consumer. This paper presents an analysis to quantify hot beverage temperatures that balance limiting the potential scald burn hazard and maintaining an acceptable perception of adequate product warmth. A figure of merit is defined that quantifies and combines both effects as a function of beverage temperature and can be optimized. An established mathematical model for simulating burns as a function of applied surface temperature and time of exposure is used to quantify the extent of thermal injury. Recent data from the literature define the consumer-preferred drinking temperature of coffee. A metric accommodates the thermal effects of both scald hazard and product taste to identify an optimal recommended serving temperature. The burn model shows the standard exponential dependence of injury level on temperature. The preferred drinking temperature of coffee is specified in the literature as 140+/-15 degrees F (60+/-8.3 degrees C) for a population of 300 subjects. A linear (with respect to temperature) figure of merit merged the two effects to identify an optimal drinking temperature of approximately 136 degrees F (57.8 degrees C). The analysis points to a reduction in the presently recommended serving temperature of coffee to achieve the combined result of reducing the scald burn hazard and improving customer satisfaction.
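A hedged Python sketch of the kind of optimization described: an assumed Gaussian preference term centred on 140 +/- 15 F is combined with an assumed exponential scald-hazard term into a simple figure of merit, whose maximum is then located on a temperature grid. The functional forms and weights are placeholders chosen for illustration, not the published burn or preference models; with these choices the optimum falls a few degrees below the preference peak, qualitatively consistent with the roughly 136 F reported above.

```python
# Illustrative only: combine an assumed taste-preference term and an assumed
# exponential scald-hazard term into a figure of merit and locate its maximum.
import numpy as np

T = np.linspace(120.0, 185.0, 1301)                    # candidate serving temperatures (F)
preference = np.exp(-0.5 * ((T - 140.0) / 15.0) ** 2)  # assumed taste-satisfaction term
hazard = 0.2 * np.exp(0.15 * (T - 140.0))              # assumed exponential injury term
merit = preference - hazard                            # simple linear combination
print(f"optimum serving temperature ~ {T[np.argmax(merit)]:.1f} F")
```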
Integrated Sensitivity Analysis Workflow
DOE Office of Scientific and Technical Information (OSTI.GOV)
Friedman-Hill, Ernest J.; Hoffman, Edward L.; Gibson, Marcus J.
2014-08-01
Sensitivity analysis is a crucial element of rigorous engineering analysis, but performing such an analysis on a complex model is difficult and time consuming. The mission of the DART Workbench team at Sandia National Laboratories is to lower the barriers to adoption of advanced analysis tools through software integration. The integrated environment guides the engineer in the use of these integrated tools and greatly reduces the cycle time for engineering analysis.
NASA Astrophysics Data System (ADS)
Bell, A.; Hioki, S.; Wang, Y.; Yang, P.; Di Girolamo, L.
2016-12-01
Previous studies found that including ice particle surface roughness in forward light scattering calculations significantly reduces the differences between observed and simulated polarimetric and radiometric observations. While it is suggested that some degree of roughness is desirable, the appropriate degree of surface roughness to assume in operational cloud property retrievals, and the sensitivity of retrieval products to this assumption, remain uncertain. In an effort to resolve this ambiguity, we will present a sensitivity analysis of space-borne multi-angle observations of reflectivity to varying degrees of surface roughness. This process is twofold. First, sampling information and statistics of Multi-angle Imaging SpectroRadiometer (MISR) sensor data aboard the Terra platform will be used to define the most common viewing geometries. Using these defined geometries, reflectivity will be simulated for multiple degrees of roughness using results from adding-doubling radiative transfer simulations. The sensitivity of simulated reflectivity to surface roughness can then be quantified, thus yielding a more robust retrieval system. Second, the sensitivity of the inverse problem will be analyzed. Spherical albedo values will be computed by feeding blocks of MISR data comprising cloudy pixels over ocean into the retrieval system, with assumed values of surface roughness. The sensitivity of spherical albedo to the inclusion of surface roughness can then be quantified, and the accuracy of retrieved parameters can be determined.
Tong, Qing-He; Tao, Tao; Xie, Li-Qi; Lu, Hao-Jie
2016-06-15
Detection of low-abundance proteins and their post-translational modifications (PTMs) remains a great challenge. A conventional enzyme-linked immunosorbent assay (ELISA) is not sensitive enough to detect low-abundance PTMs and suffers from nonspecific detection. Herein, a rapid, highly sensitive and specific platform integrating ELISA with a proximity ligation assay (PLA), termed ELISA-PLA, was developed. Using ELISA-PLA, the specificity was improved by the simultaneous and proximate recognition of targets through multiple probes, and the sensitivity was significantly improved by rolling circle amplification (RCA). For GFP, the limit of detection (LOD) was decreased by two orders of magnitude compared to that of ELISA. Using site-specific phospho-antibody and pan-specific phospho-antibody, ELISA-PLA was successfully applied to quantify the phosphorylation dynamics of ERK1/2 and the overall tyrosine phosphorylation level of ERK1/2, respectively. ELISA-PLA was also used to quantify the O-GlcNAcylation of AKT, c-Fos, CREB and STAT3, which is faster and more sensitive than the conventional immunoprecipitation and western blotting (IP-WB) method. As a result, the sample consumption of ELISA-PLA was reduced 40-fold compared to IP-WB. Therefore, ELISA-PLA could be a promising platform for the rapid, sensitive and specific detection of proteins and PTMs. Copyright © 2016 Elsevier B.V. All rights reserved.
Cobalt recycling in the United States in 1998
Shedd, Kim B.
2002-01-01
This report is one of a series of reports on metals recycling. It defines and quantifies the 1998 flow of cobalt-bearing materials in the United States, from imports and stock releases through consumption and disposition, with particular emphasis on the recycling of industrial scrap (new scrap) and used products (old scrap). Because of cobalt's many and diverse uses, numerous types of scrap were available for recycling by a wide variety of processes. In 1998, an estimated 32 percent of U.S. cobalt supply was derived from scrap. The ratio of cobalt consumed from new scrap to that from old scrap was estimated to be 50:50. Of all the cobalt in old scrap available for recycling, an estimated 68 percent was either consumed in the United States or exported to be recycled.
78 FR 70046 - Agency Information Collection Activities; Submission for OMB Review; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2013-11-22
... Gans, Attorney, Division of Marketing Practices, Bureau of Consumer Protection, Federal Trade... that your comment does not include any sensitive personal information, such as anyone's Social Security...
Lupfer, Gwen; Murphy, Eric S; Merculieff, Zoe; Radcliffe, Kori; Duddleston, Khrystyne N
2015-06-01
Ethanol consumption and sensitivity in many species are influenced by the frequency with which ethanol is encountered in their niches. In Experiment 1, dwarf hamsters (Phodopus campbelli) with ad libitum access to food and water consumed high amounts of unsweetened alcohol solutions. Their consumption of 15%, but not 30%, ethanol was reduced when they were fed a high-fat diet; a high carbohydrate diet did not affect ethanol consumption. In Experiment 2, intraperitoneal injections of ethanol caused significant dose-related motor impairment. Much larger doses administered orally, however, had no effect. In Experiment 3, ryegrass seeds, a common food source for wild dwarf hamsters, supported ethanol fermentation. Results of these experiments suggest that dwarf hamsters may have adapted to consume foods in which ethanol production naturally occurs. Copyright © 2015 Elsevier B.V. All rights reserved.
Angle Performance on Optima XE
DOE Office of Scientific and Technical Information (OSTI.GOV)
David, Jonathan; Satoh, Shu
2011-01-07
Angle control on high energy implanters is important due to shrinking device dimensions, and sensitivity to channeling at high beam energies. On Optima XE, beam-to-wafer angles are controlled in both the horizontal and vertical directions. In the horizontal direction, the beam angle is measured through a series of narrow slits, and any angle adjustment is made by steering the beam with the corrector magnet. In the vertical direction, the beam angle is measured through a high aspect ratio mask, and any angle adjustment is made by slightly tilting the wafer platen during implant. Using a sensitive channeling condition, we were able to quantify the angle repeatability of Optima XE. By quantifying the sheet resistance sensitivity to both horizontal and vertical angle variation, the total angle variation was calculated as 0.04 deg. (1σ). Implants were run over a five week period, with all of the wafers selected from a single boule, in order to control for any crystal cut variation.
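The back-calculation implied above, converting sheet-resistance repeatability into an angle repeatability through a measured sensitivity, is simple error propagation; the Python sketch below uses placeholder numbers, not the Optima XE data.

```python
# Illustrative back-calculation: infer a 1-sigma angle variation from observed
# sheet-resistance scatter and a measured sensitivity dRs/d(angle) under a
# channeling-sensitive implant condition. Numbers are placeholders.
rs_sigma = 0.8          # ohm/sq, assumed 1-sigma sheet-resistance repeatability
drs_dangle = 20.0       # ohm/sq per degree, assumed measured sensitivity
angle_sigma = rs_sigma / drs_dangle
print(f"inferred angle variation: {angle_sigma:.3f} deg (1 sigma)")
```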
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dai, Heng; Chen, Xingyuan; Ye, Ming
Sensitivity analysis is an important tool for quantifying uncertainty in the outputs of mathematical models, especially for complex systems with a high dimension of spatially correlated parameters. Variance-based global sensitivity analysis has gained popularity because it can quantify the relative contribution of uncertainty from different sources. However, its computational cost increases dramatically with the complexity of the considered model and the dimension of model parameters. In this study we developed a hierarchical sensitivity analysis method that (1) constructs an uncertainty hierarchy by analyzing the input uncertainty sources, and (2) accounts for the spatial correlation among parameters at each level of the hierarchy using geostatistical tools. The contribution of the uncertainty source at each hierarchy level is measured by sensitivity indices calculated using the variance decomposition method. Using this methodology, we identified the most important uncertainty sources for a dynamic groundwater flow and solute transport model at the Department of Energy (DOE) Hanford site. The results indicate that boundary conditions and the permeability field contribute the most uncertainty to the simulated head field and tracer plume, respectively. The relative contribution from each source varied spatially and temporally, driven by the dynamic interaction between groundwater and river water at the site. By using a geostatistical approach to reduce the number of realizations needed for the sensitivity analysis, the computational cost of implementing the developed method was reduced to a practically manageable level. The developed sensitivity analysis method is generally applicable to a wide range of hydrologic and environmental problems that deal with high-dimensional, spatially distributed parameters.
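The variance-decomposition step can be illustrated with a generic first-order (Sobol-type) sensitivity index, S_i = Var(E[Y|X_i]) / Var(Y). The Python sketch below estimates S_i by conditioning on quantile bins of each input for a toy three-input model standing in for the groundwater flow and transport model; it is not the hierarchical, geostatistically informed method of the study.

```python
# Generic sketch of first-order variance-based sensitivity indices,
# S_i = Var(E[Y | X_i]) / Var(Y), estimated by binning each input.
# The toy model is a stand-in, not the Hanford-site code.
import numpy as np

rng = np.random.default_rng(42)
n = 200_000
boundary = rng.normal(0.0, 1.0, n)      # stand-ins for uncertainty sources
perm     = rng.normal(0.0, 1.0, n)
recharge = rng.normal(0.0, 1.0, n)

y = 3.0 * boundary + 1.5 * perm**2 + 0.5 * recharge   # toy model response

def first_order_index(x, y, bins=50):
    edges = np.quantile(x, np.linspace(0, 1, bins + 1))
    idx = np.clip(np.digitize(x, edges[1:-1]), 0, bins - 1)
    cond_means = np.array([y[idx == b].mean() for b in range(bins)])
    return cond_means.var() / y.var()   # Var of conditional means over total Var

for name, x in [("boundary", boundary), ("permeability", perm), ("recharge", recharge)]:
    print(name, round(first_order_index(x, y), 3))
```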
Ludlam, J.P.; Magoulick, D.D.
2010-01-01
Benthic consumers influence stream ecosystem structure and function, but these interactions depend on environmental context. We experimentally quantified the effects of central stoneroller minnows (Campostoma anomalum (Rafinesque)) and Meek's crayfish (Orconectes meeki meeki (Faxon)) on benthic communities using electric exclusion quadrats in Little Mulberry Creek before (June) and during (August) seasonal stream drying. Unglazed ceramic tiles were deployed in June and August to measure periphyton and invertebrate abundance, and leafpack decomposition and primary production were also measured in August. Relationships between stoneroller and crayfish density and the size of consumer effects were evaluated with multiple linear regression models. Average chlorophyll a abundance was greater on exposed than exclusion tiles in August, but not in June. Sediment dry mass, periphyton ash-free dry mass (AFDM), and chironomid densities on tiles did not differ among treatments in either period. Leaf packs decayed faster in exposed than exclusion treatments (k_exposed = 0.038 ± 0.013, k_exclusion = 0.007 ± 0.002), but consumer effects were stronger in some pools than others. Leafpack invertebrate biomass and abundance and tile primary productivity did not differ among treatments. Consumer effects on chlorophyll a were related to crayfish and stoneroller density, and effects on chironomid density were related to stoneroller density. These results contrast with a previous exclusion experiment in Little Mulberry Creek that demonstrated strong consumer effects. The influence of stream drying on consumer effects appears to have been reduced by strong spates, underscoring the importance of conducting multi-year studies to determine the magnitude of variability in ecological interactions. © US Government: USGS 2010.
Perera, Ambegoda Liyanage Harini Amalka
2017-01-01
Natural rubber latex (NRL) allergy is caused by the extractable latex proteins in dipped rubber products. It is a major concern for consumers who are sensitive to the allergenic extractable proteins (EP) in products such as NRL gloves. The objective of this research was to develop an economical method to reduce the EP in finished dipped NRL products. In order to reduce the EP levels, two natural proteases, bromelain from pineapple and papain from papaya, were extracted and partially purified using (NH4)2SO4. According to the newly developed method, different glove samples were treated with a 5% solution of each partially purified enzyme for 2 hours at 60°C. Residual amounts of EP in treated samples were quantified using the modified Lowry assay (ASTM D5712-10). Bromelain displayed a 54 (±11)% reduction of the EP from the dipped rubber products, whereas papain achieved 58 (±8)%. These results clearly indicate that the selected natural proteases, bromelain and papain, contribute significantly towards the reduction of the total EP in finished NRL products. Application of bromelain for this purpose has not been reported to date, whereas papain has been used to treat raw NRL to reduce the EP. PMID:28706952
Saha, Monjoy; Chakraborty, Chandan; Arun, Indu; Ahmed, Rosina; Chatterjee, Sanjoy
2017-06-12
Being a non-histone protein, Ki-67 is one of the essential biomarkers for the immunohistochemical assessment of proliferation rate in breast cancer screening and grading. The Ki-67 signature is always sensitive to radiotherapy and chemotherapy. Due to random morphological, color and intensity variations of cell nuclei (immunopositive and immunonegative), manual/subjective assessment of Ki-67 scoring is error-prone and time-consuming. Hence, several machine learning approaches have been reported; nevertheless, none of them has addressed deep-learning-based hotspot detection and proliferation scoring. In this article, we suggest an advanced deep learning model for computerized recognition of candidate hotspots and subsequent proliferation rate scoring by quantifying Ki-67 appearance in breast cancer immunohistochemical images. Unlike existing Ki-67 scoring techniques, our methodology uses a Gamma mixture model (GMM) with Expectation-Maximization for seed point detection and patch selection, and deep learning, comprising a decision layer, for hotspot detection and proliferation scoring. Experimental results provide 93% precision, 88% recall and an F-score of 0.91. The model performance has also been compared with the pathologists' manual annotations and recently published articles. In the future, the proposed deep learning framework will be highly reliable and beneficial to junior and senior pathologists for fast and efficient Ki-67 scoring.
Wang, Li; Carnegie, Graeme K.
2013-01-01
Among methods to study protein-protein interaction inside cells, Bimolecular Fluorescence Complementation (BiFC) is relatively simple and sensitive. BiFC is based on the production of fluorescence using two non-fluorescent fragments of a fluorescent protein (Venus, a Yellow Fluorescent Protein variant, is used here). Non-fluorescent Venus fragments (VN and VC) are fused to two interacting proteins (in this case, AKAP-Lbc and PDE4D3), yielding fluorescence due to VN-AKAP-Lbc-VC-PDE4D3 interaction and the formation of a functional fluorescent protein inside cells. BiFC provides information on the subcellular localization of protein complexes and the strength of protein interactions based on fluorescence intensity. However, BiFC analysis using microscopy to quantify the strength of protein-protein interaction is time-consuming and somewhat subjective due to heterogeneity in protein expression and interaction. By coupling flow cytometric analysis with BiFC methodology, the fluorescent BiFC protein-protein interaction signal can be accurately measured for a large quantity of cells in a short time. Here, we demonstrate an application of this methodology to map regions in PDE4D3 that are required for the interaction with AKAP-Lbc. This high throughput methodology can be applied to screening factors that regulate protein-protein interaction. PMID:23979513
A novel thermometric biosensor for fast surveillance of β-lactamase activity in milk.
Zhou, Shuang; Zhao, Yunfeng; Mecklenburg, Michael; Yang, Dajin; Xie, Bin
2013-11-15
Regulatory restrictions on antibiotic residues in dairy products have resulted in the illegal addition of β-lactamase to lower antibiotic levels in milk in China. Here we demonstrate a fast, sensitive and convenient method based on an enzyme thermistor (ET) for the surveillance of β-lactamase in milk. A fixed amount of penicillin G, which is a specific substrate of β-lactamase, was incubated with the milk sample, and an aliquot of the mixture was directly injected into the ET system to give a temperature change corresponding to the remaining penicillin G. The amount of β-lactamase present in the sample was deduced from the penicillin G consumed during incubation. This method was successfully applied to quantify β-lactamase in milk with a linear range of 1.1-20 U mL(-1) and a detection limit of 1.1 U mL(-1). The recoveries ranged from 93% to 105%, with relative standard deviations (RSDs) below 8%. The stability of the column equipped in the ET was also studied, and only a 5% decrease in activity was observed after 60 days of use. Compared with the conventional culture-based assay, the advantages of high throughput, time saving and accurate quantification make this method an ideal alternative for routine use. Copyright © 2013 Elsevier B.V. All rights reserved.
Granja, Rodrigo H M M; Niño, Alfredo M Montes; Zucchetti, Roberto A M; Niño, Rosario E Montes; Salerno, Alessandro G
2008-01-01
Ethopabate is frequently used in the prophylaxis and treatment of coccidiosis in poultry. Residues of this drug in food present a potential risk to consumers. A simple, rapid, and sensitive column high-performance liquid chromatographic (HPLC) method with UV detection for determination of ethopabate in poultry liver is presented. The drug is extracted with acetonitrile. After evaporation, the residue is dissolved with an acetone-hexane mixture and cleaned up by solid-phase extraction using Florisil columns. The analyte is then eluted with methanol. LC analysis is carried out on a C18 5 microm Gemini column, 15 cm x 4.6 mm. Ethopabate is quantified by means of UV detection at 270 nm. Parameters such as decision limit, detection capability, precision, recovery, ruggedness, and measurement uncertainty were calculated according to method validation guidelines provided in 2002/657/EC and ISO/IEC 17025:2005. Decision limit and detection capability were determined to be 2 and 3 microg/kg, respectively. Average recoveries from poultry samples fortified with 10, 15, and 20 microg/kg levels of ethopabate were 100-105%. A complete statistical analysis was performed on the results obtained, including an estimation of the method uncertainty. The method is to be implemented into Brazil's residue monitoring and control program for ethopabate.
Hormone Profiling in Plant Tissues.
Müller, Maren; Munné-Bosch, Sergi
2017-01-01
Plant hormones have long been known to act as chemical messengers in the regulation of physiological processes during a plant's life cycle, from germination to senescence. Furthermore, plant hormones simultaneously coordinate physiological responses to biotic and abiotic stresses. To study the hormonal regulation of physiological processes, three main approaches have been used: (1) exogenous application of hormones, (2) correlative studies through measurements of endogenous hormone levels, and (3) use of transgenic and/or mutant plants altered in hormone metabolism or signaling. A plant hormone profiling method is useful to unravel cross talk between hormones and helps elucidate the hormonal regulation of physiological processes in studies using any of the aforementioned approaches. However, hormone profiling is still particularly challenging due to the very low abundance of hormones in plant tissues. In this chapter, a sensitive, rapid, and accurate method to quantify all five "classic" classes of plant hormones plus other plant growth regulators, such as jasmonates, salicylic acid, melatonin, and brassinosteroids, is described. The method includes a fast and simple extraction procedure without time-consuming steps such as purification or derivatization, followed by optimized ultrahigh-performance liquid chromatography coupled to electrospray ionization-tandem mass spectrometry (UHPLC-MS/MS) analysis. This protocol facilitates high-throughput hormone profiling and is applicable to different plant tissues.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lin, Zhenhong; Dong, Jing; Liu, Changzheng
2012-01-01
The petroleum and electricity consumptions of plug-in hybrid electric vehicles (PHEVs) are sensitive to the variation of daily vehicle miles traveled (DVMT). Some studies assume DVMT to follow a Gamma distribution, but such a Gamma assumption is yet to be validated. This study finds the Gamma assumption valid in the context of PHEV energy analysis, based on continuous GPS travel data of 382 vehicles, each tracked for at least 183 days. The validity conclusion is based on the small prediction errors, resulting from the Gamma assumption, in PHEV petroleum use, electricity use, and energy cost. The finding that the Gamma distribution is valid and reliable is important. It paves the way for the Gamma distribution to be assumed for analyzing energy uses of PHEVs in the real world. The Gamma distribution can be easily specified with very few pieces of driver information and is relatively easy for mathematical manipulation. Given the validation in this study, the Gamma distribution can now be used with better confidence in a variety of applications, such as improving vehicle consumer choice models, quantifying range anxiety for battery electric vehicles, investigating roles of charging infrastructure, and constructing online calculators that provide personal estimates of PHEV energy use.
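A hedged Python sketch of how the Gamma assumption is used in practice: fit a Gamma distribution to DVMT by the method of moments, then split expected daily miles into charge-depleting (electric) and charge-sustaining (gasoline) portions for an assumed all-electric range. The simulated data and the 35-mile range are illustrative assumptions, not values from the study.

```python
# Illustrative use of the Gamma assumption for PHEV energy analysis:
# method-of-moments fit to DVMT, then expected electric vs gasoline miles
# for an assumed all-electric range. Numbers are placeholders.
import numpy as np

rng = np.random.default_rng(7)
dvmt = rng.gamma(1.6, 25.0, size=365)                # stand-in for observed GPS DVMT

m, v = dvmt.mean(), dvmt.var()
shape, scale = m * m / v, v / m                      # method-of-moments Gamma fit

aer = 35.0                                           # assumed all-electric range (miles)
sim = rng.gamma(shape, scale, size=200_000)          # days drawn from the fitted Gamma
electric_miles = np.minimum(sim, aer).mean()         # expected charge-depleting miles/day
gasoline_miles = np.maximum(sim - aer, 0.0).mean()   # expected charge-sustaining miles/day
print(f"Gamma(shape={shape:.2f}, scale={scale:.1f}); "
      f"electric {electric_miles:.1f} mi/day, gasoline {gasoline_miles:.1f} mi/day")
```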
Quantifying chaos for ecological stoichiometry.
Duarte, Jorge; Januário, Cristina; Martins, Nuno; Sardanyés, Josep
2010-09-01
The theory of ecological stoichiometry considers ecological interactions among species with different chemical compositions. Both experimental and theoretical investigations have shown the importance of species composition in the outcome of the population dynamics. A recent study of a theoretical three-species food chain model considering stoichiometry [B. Deng and I. Loladze, Chaos 17, 033108 (2007)] shows that coexistence between two consumers predating on the same prey is possible via chaos. In this work we study the topological and dynamical measures of the chaotic attractors found in such a model under ecological relevant parameters. By using the theory of symbolic dynamics, we first compute the topological entropy associated with unimodal Poincaré return maps obtained by Deng and Loladze from a dimension reduction. With this measure we numerically prove chaotic competitive coexistence, which is characterized by positive topological entropy and positive Lyapunov exponents, achieved when the first predator reduces its maximum growth rate, as happens at increasing δ1. However, for higher values of δ1 the dynamics become again stable due to an asymmetric bubble-like bifurcation scenario. We also show that a decrease in the efficiency of the predator sensitive to prey's quality (increasing parameter ζ) stabilizes the dynamics. Finally, we estimate the fractal dimension of the chaotic attractors for the stoichiometric ecological model.
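As a stand-in illustration of the dynamical diagnostics mentioned above (positive Lyapunov exponents signalling chaos), the Python sketch below estimates the Lyapunov exponent of the logistic map, a generic unimodal map, as the orbit average of ln|f'(x)|. It is not the stoichiometric food-chain model or its Poincaré return map.

```python
# Stand-in illustration: Lyapunov exponent of a unimodal map (logistic map),
# lambda = <ln|f'(x)|> along an orbit; lambda > 0 indicates chaos.
import math

def lyapunov_logistic(r, x0=0.4, burn=1000, n=100_000):
    x = x0
    for _ in range(burn):                 # discard transient
        x = r * x * (1.0 - x)
    acc = 0.0
    for _ in range(n):
        x = r * x * (1.0 - x)
        acc += math.log(abs(r * (1.0 - 2.0 * x)))   # ln|f'(x)| with f(x)=rx(1-x)
    return acc / n

for r in (3.5, 3.9):                      # periodic vs chaotic regimes
    print(r, round(lyapunov_logistic(r), 3))
```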
Hort, Vincent; Nicolas, Marina; Minvielle, Brice; Maleix, Corentin; Desbourdes, Caroline; Hommet, Frédéric; Dragacci, Sylviane; Dervilly-Pinel, Gaud; Engel, Erwan; Guérin, Thierry
2018-05-28
Consumers generally consider organic products to be healthier and safer, but data regarding the contamination of organic products are scarce. This study evaluated the impact of the farming system on the levels of ochratoxin A (OTA) in the tissues of French pigs (muscle and liver) reared following three different types of production (organic, Label Rouge and conventional). Because OTA is present at trace levels in animal products, a sensitive ultra-high performance liquid chromatography-tandem mass spectrometry method using stable isotope dilution assay was developed and validated. OTA was detected or quantified (LOQ of 0.10 μg kg-1) in 67% (n = 47) of the 70 pig liver samples analysed, with concentrations ranging from <0.10 to 3.65 μg kg-1. The maximum concentration was found in a sample from organic production, but there were no significant differences in the OTA content between farming systems. OTA was above the LOQ in four out of 25 samples of pork muscle. A good agreement was found between OTA levels in muscle and liver (liver concentration = 2.9 × muscle concentration, r = 0.981). Copyright © 2018 Elsevier B.V. All rights reserved.
Molecular DNA-based detection of ionising radiation in meat.
Şakalar, Ergün
2017-05-01
Ionising radiation induces molecular alterations, such as the formation of ions, free radicals, and new stable molecules, and cleavage of the chemical bonds of the molecules present in food. Irradiation-treated meat should be labelled to control the process and to ensure free consumer choice. Therefore, sensitive analytical methods are required to detect the irradiation dose. Meat samples were exposed to radiation doses of 0, 0.272, 0.497, 1.063, 3.64, 8.82 and 17.42 kGy in an industrial 60Co gamma cell. Primers were designed to amplify 998, 498 and 250-base pair (bp) regions of the 18S rRNA gene of nuclear DNA from the irradiated samples. A new DNA-based method was developed to quantify the radiation dose delivered to unstored meat and to meat stored at -20 °C for 3 and 6 months. The method was able to detect irradiation in stored and unstored meat samples with dose limits of 1.063 and 3.64 kGy, respectively. The level of irradiation can be detected using primer pairs that target different-sized sequences for DNA amplification by PCR. This method can be widely used for the analysis of not only meat samples but all biological materials containing DNA. © 2016 Society of Chemical Industry.
Ensemble Solar Forecasting Statistical Quantification and Sensitivity Analysis: Preprint
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cheung, WanYin; Zhang, Jie; Florita, Anthony
2015-12-08
Uncertainties associated with solar forecasts present challenges to maintaining grid reliability, especially at high solar penetrations. This study aims to quantify the errors associated with the day-ahead solar forecast parameters and the theoretical solar power output for a 51-kW solar power plant in a utility area in the state of Vermont, U.S. Forecasts were generated by three numerical weather prediction (NWP) models, including the Rapid Refresh, the High Resolution Rapid Refresh, and the North American Model, and a machine-learning ensemble model. A photovoltaic (PV) performance model was adopted to calculate theoretical solar power generation using the forecast parameters (e.g., irradiance, cell temperature, and wind speed). Errors of the power outputs were quantified using statistical moments and a suite of metrics, such as the normalized root mean squared error (NRMSE). In addition, the PV model's sensitivity to different forecast parameters was quantified and analyzed. Results showed that the ensemble model yielded forecasts in all parameters with the smallest NRMSE. The NRMSE of solar irradiance forecasts of the ensemble NWP model was reduced by 28.10% compared to the best of the three NWP models. Further, the sensitivity analysis indicated that the errors of the forecasted cell temperature attributed only approximately 0.12% to the NRMSE of the power output, as opposed to 7.44% from the forecasted solar irradiance.
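The headline metric is straightforward to compute; the following Python sketch evaluates NRMSE for a forecast-versus-observed series, normalizing here by the 51-kW plant capacity (other normalizations, such as the observed mean or range, are also common, and the abstract does not say which was used). The hourly values are invented for illustration.

```python
# Minimal sketch of the normalized root mean squared error (NRMSE) between
# forecast and observed plant output, normalized by plant capacity here.
import numpy as np

def nrmse(forecast, observed, norm):
    forecast, observed = np.asarray(forecast, float), np.asarray(observed, float)
    return np.sqrt(np.mean((forecast - observed) ** 2)) / norm

forecast_kw = [0.0, 5.2, 18.4, 30.1, 41.0, 35.5, 12.3]   # illustrative hourly values
observed_kw = [0.0, 4.0, 20.1, 33.0, 38.2, 37.0, 10.8]
print(f"NRMSE = {100 * nrmse(forecast_kw, observed_kw, norm=51.0):.1f}% of capacity")
```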
Schaefer, Alexandre; Buratto, Luciano G.; Goto, Nobuhiko; Brotherhood, Emilie V.
2016-01-01
A large body of evidence shows that buying behaviour is strongly determined by consumers’ price expectations and the extent to which real prices violate these expectations. Despite the importance of this phenomenon, little is known regarding its neural mechanisms. Here we show that two patterns of electrical brain activity known to index prediction errors, the Feedback-Related Negativity (FRN) and the feedback-related P300, were sensitive to price offers that were cheaper than participants’ expectations. In addition, we also found that FRN amplitude time-locked to price offers predicted whether a product would be subsequently purchased or not, and further analyses suggest that this result was driven by the sensitivity of the FRN to positive price expectation violations. This finding strongly suggests that ensembles of neurons coding positive prediction errors play a critical role in real-life consumer behaviour. Further, these findings indicate that theoretical models based on the notion of prediction error, such as the Reinforcement Learning Theory, can provide a neurobiologically grounded account of consumer behavior. PMID:27658301
Attwood, A S; Higgs, S; Terry, P
2007-03-01
Individual differences in responsiveness to caffeine occur even within a caffeine-consuming population, but the factors that mediate differential responsiveness remain unclear. We aimed to compare caffeine's effects on performance and mood in groups of high versus moderate consumers of caffeine, and to examine the potential role of subjective awareness of the effects of caffeine in mediating any differential responsiveness. Two groups of regular caffeine consumers (<200 mg/day and >200 mg/day) attended two sessions at which mood and cognitive functions were measured before and 30 min after consumption of 400-mg caffeine or placebo in a capsule. Cognitive tests included visual information processing, match-to-sample visual search (MTS) and simple and choice reaction times. Post-session questionnaires asked participants to describe any perceived effect of capsule consumption. High consumers, but not moderate consumers, demonstrated significantly faster simple and choice reaction times after caffeine relative to placebo. These effects were not attributable to obvious group differences in withdrawal or tolerance because there were no group differences in baseline mood or in reports of negative affect after caffeine. Instead, the high consumers were more likely to report experiencing positive effects of caffeine, whereas the moderate consumers were more likely to report no effect. The sensitivity of caffeine consumers to the mood- and performance-enhancing effects of caffeine is related to their levels of habitual intake. High caffeine consumers are more likely than moderate consumers to perceive broadly positive effects of caffeine, and this may contribute to their levels of use.
ERIC Educational Resources Information Center
Congress of the U.S., Washington, DC. House Committee on Energy and Commerce.
H.R. 3216, the Children's Television Act of 1985--a bill to amend the Communications Act of 1934 to increase the availability of educational and informational television programs for children, deals with establishing a quantifiable children's programming guideline. This bill would establish substantial burdens under the license renewal process for…
2016-05-16
metrics involve regulating automation of complex systems, such as aircraft. Additionally, adaptive management of content in user interfaces has also...both the user and environmental context would aid in deciding how to present the information to the Warfighter. The prototype system currently...positioning system, and rate sensors can provide user-specific context to disambiguate physiologic data. The consumer “quantified self” market has driven
Quantifying swallowing function for healthy adults in different age groups using acoustic analysis
NASA Astrophysics Data System (ADS)
Leung, Man-Yin
Dysphagia is a medical condition that can lead to devastating complications including weight loss, aspiration pneumonia, dehydration, and malnutrition; hence, timely identification is essential. Current dysphagia evaluation tools are either invasive, time consuming, or highly dependent on the experience of an individual clinician. The present study aims to develop a non-invasive, quantitative screening tool for dysphagia identification by capturing acoustic data from swallowing and mastication. The first part of this study explores the feasibility of using acoustic data to quantify swallowing and mastication. This study then further identifies mastication and swallowing trends in a neurotypical adult population. An acoustic capture protocol for dysphagia screening is proposed. Finally, the relationship among speaking, lingual and mastication rates are explored. Results and future directions are discussed.
Natsch, Andreas; Gfeller, Hans
2008-12-01
A key step in the skin sensitization process is the formation of a covalent adduct between skin sensitizers and endogenous proteins and/or peptides in the skin. Based on this mechanistic understanding, there is a renewed interest in in vitro assays to determine the reactivity of chemicals toward peptides in order to predict their sensitization potential. A standardized peptide reactivity assay yielded a promising predictivity. This published assay is based on high-performance liquid chromatography with ultraviolet detection to quantify peptide depletion after incubation with test chemicals. We had observed that peptide depletion may be due to either adduct formation or peptide oxidation. Here we report a modified assay based on both liquid chromatography-mass spectrometry (LC-MS) analysis and detection of free thiol groups. This approach allows simultaneous determination of (1) peptide depletion, (2) peptide oxidation (dimerization), (3) adduct formation, and (4) thiol reactivity and thus generates a more detailed characterization of the reactivity of a molecule. Highly reactive molecules are further discriminated with a kinetic measure. The assay was validated on 80 chemicals. Peptide depletion could accurately be quantified both with LC-MS detection and depletion of thiol groups. The majority of the moderate/strong/extreme sensitizers formed detectable peptide adducts, but many sensitizers were also able to catalyze peptide oxidation. Whereas adduct formation was only observed for sensitizers, this oxidation reaction was also observed for two nonsensitizing fragrance aldehydes, indicating that peptide depletion might not always be regarded as sufficient evidence for rating a chemical as a sensitizer. Thus, this modified assay gives a more informed view of the peptide reactivity of chemicals to better predict their sensitization potential.
Keegan, Conor; Teljeur, Conor; Turner, Brian; Thomas, Steve
2016-09-01
The determinants of consumer mobility in voluntary health insurance markets providing duplicate cover are not well understood. Consumer mobility can have important implications for competition. Consumers should be price-responsive and willing to switch insurer in search of the best-value products. Moreover, although theory suggests low-risk consumers are more likely to switch insurer, this process should not be driven by insurers looking to attract low risks. This study utilizes data on 320,830 VHI healthcare policies due for renewal between August 2013 and June 2014. At the time of renewal, policyholders were categorized as either 'switchers' or 'stayers', and policy information was collected for the prior 12 months. Differences between these groups were assessed by means of logistic regression. The ability of Ireland's risk equalization scheme to account for the relative attractiveness of switchers was also examined. Policyholders were price-sensitive (OR 1.052, p < 0.01); however, price sensitivity declined with age. Age (OR 0.971; p < 0.01) and hospital utilization (OR 0.977; p < 0.01) were both negatively associated with switching. In line with these findings, switchers were less costly than stayers for the 12 months prior to the switch/renew decision for single-person (difference in average cost = €540.64) and multiple-person policies (difference in average cost = €450.74). Some cost differences remained for single-person policies following risk equalization (difference in average cost = €88.12). Consumers appear price-responsive, which is important for competition provided it is based on correct incentives. Risk equalization payments largely eliminated the profitable status of switchers, although further refinements may be required.
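The analytical step can be sketched in Python as a logistic regression of the switch/stay outcome on policy characteristics, with coefficients exponentiated to odds ratios, which is the form in which the results above are reported. The simulated data, chosen covariates, and coefficients are assumptions for illustration, not the VHI data.

```python
# Hedged sketch: logistic regression of switching on policy characteristics,
# reporting exponentiated coefficients as odds ratios. Data are simulated.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
n = 5000
price_increase = rng.normal(50, 20, n)        # euro premium increase at renewal (simulated)
age = rng.uniform(20, 80, n)
hospital_nights = rng.poisson(1.0, n)

logit_p = -2.0 + 0.05 * price_increase - 0.03 * age - 0.02 * hospital_nights
switched = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

X = sm.add_constant(np.column_stack([price_increase, age, hospital_nights]))
fit = sm.Logit(switched, X).fit(disp=False)
odds_ratios = np.exp(fit.params)              # OR > 1 implies higher odds of switching
print(dict(zip(["const", "price", "age", "hosp"], odds_ratios.round(3))))
```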
Sensitive, Selective Test For Hydrazines
NASA Technical Reports Server (NTRS)
Roundbehler, David; Macdonald, Stephen
1993-01-01
Derivatives of hydrazines are formed, then subjected to gas chromatography and detected via chemiluminescence. In this method of detecting and quantifying hydrazine vapors, the vapors are reacted with a dinitro compound to enhance sensitivity and selectivity. Hydrazine (HZ), monomethylhydrazine (MMH), and unsymmetrical dimethylhydrazine (UDMH) are analyzed quantitatively and qualitatively, either alone or in mixtures. Vapors are collected and reacted with 2,4-dinitrobenzaldehyde (DNB), making it possible to concentrate hydrazine in derivative form, thereby increasing sensitivity to low initial concentrations. Selectivity is also increased because only those constituents of the sample that react with DNB are concentrated for analysis.
Artacho, Paulina; Saravia, Julia; Ferrandière, Beatriz Decencière; Perret, Samuel; Le Galliard, Jean-François
2015-01-01
Phenotypic selection is widely accepted as the primary cause of adaptive evolution in natural populations, but selection on complex functional properties linking physiology, behavior, and morphology has been rarely quantified. In ectotherms, correlational selection on thermal physiology, thermoregulatory behavior, and energy metabolism is of special interest because of their potential coadaptation. We quantified phenotypic selection on thermal sensitivity of locomotor performance (sprint speed), thermal preferences, and resting metabolic rate in captive populations of an ectothermic vertebrate, the common lizard, Zootoca vivipara. No correlational selection between thermal sensitivity of performance, thermoregulatory behavior, and energy metabolism was found. A combination of high body mass and resting metabolic rate was positively correlated with survival and negatively correlated with fecundity. Thus, different mechanisms underlie selection on metabolism in lizards with small body mass than in lizards with high body mass. In addition, lizards that selected the near average preferred body temperature grew faster that their congeners. This is one of the few studies that quantifies significant correlational selection on a proxy of energy expenditure and stabilizing selection on thermoregulatory behavior. PMID:26380689
Quantifying serum antibody in bird fanciers' hypersensitivity pneumonitis.
McSharry, Charles; Dye, George M; Ismail, Tengku; Anderson, Kenneth; Spiers, Elizabeth M; Boyd, Gavin
2006-06-26
Detecting serum antibody against inhaled antigens is an important diagnostic adjunct for hypersensitivity pneumonitis (HP). We sought to validate a quantitative fluorimetric assay testing serum from bird fanciers. Antibody activity was assessed in bird fanciers and control subjects using various avian antigens and serological methods, and the titer was compared with symptoms of HP. IgG antibody against pigeon serum antigens, quantified by fluorimetry, provided a good discriminator of disease. Levels below 10 mg/L were insignificant, and increasing titers were associated with disease. The assay was unaffected by total IgG, autoantibodies and antibody to dietary hen's egg antigens. Antigens from pigeon serum seem sufficient to recognize immune sensitivity to most common pet avian species. Increasing antibody titers reflected the likelihood of HP, and decreasing titers confirmed antigen avoidance. Quantifying antibody was rapid, and the increased sensitivity should reduce false-negative reporting and obviate the need for invasive diagnostic procedures. Automated fluorimetry provides a method for the international standardization of HP serology, thereby improving quality control and its suitability as a diagnostic adjunct.
NASA Technical Reports Server (NTRS)
Wilson, D. J.
1971-01-01
Time-dependent notch sensitivity of Inconel 718 sheet was observed at 900 to 1200 °F (482 to 649 °C). It occurred when edge-notched specimens were loaded below the yield strength under conditions in which smooth-specimen tests showed that small amounts of creep consumed large fractions of the rupture life. The severity of the notch sensitivity was reduced by decreasing the solution temperature, increasing the time and/or temperature of aging, and increasing the test temperature to 1400 °F (760 °C). Elimination of time-dependent notch sensitivity correlated with a change in dislocation motion mechanism from shearing to by-passing precipitate particles.
J. Stephen Brewer
2015-01-01
Although repeated fires are generally thought to reduce competition, direct tests of this hypothesis are rare. Furthermore, recent theory predicts that fires can increase the competitive effects of fire-resistant species on fire-sensitive species and thus create stable assemblages dominated by the former. In this study, I quantified competition between saplings of fire-...
Li, Xin; Kaattari, Stephen L; Vogelbein, Mary A; Vadas, George G; Unger, Michael A
2016-03-01
Immunoassays based on monoclonal antibodies (mAbs) are highly sensitive for the detection of polycyclic aromatic hydrocarbons (PAHs) and can be employed to determine concentrations in near real-time. A sensitive generic mAb against PAHs, named 2G8, was developed by a three-step screening procedure. It exhibited nearly uniformly high sensitivity against 3-ring to 5-ring unsubstituted PAHs and their common methylated environmental derivatives, with IC50 values between 1.68 and 31 μg/L (ppb). 2G8 has been successfully applied on the KinExA Inline Biosensor system for quantifying 3-5 ring PAHs in aqueous environmental samples. PAHs were detected at concentrations as low as 0.2 μg/L. Furthermore, the analyses required only 10 min per sample. To evaluate the accuracy of the 2G8-based biosensor, the total PAH concentrations in a series of environmental samples analyzed by the biosensor and by GC-MS were compared. In most cases, the results yielded a good correlation between methods. This indicates that the generic antibody 2G8-based biosensor holds significant promise as a low-cost, rapid method for PAH determination in aqueous samples.
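For readers who want to see how an IC50 of the kind reported above might be recovered from a competitive-immunoassay calibration curve, a minimal sketch follows. It fits a four-parameter logistic (4PL) model to made-up calibration points and inverts the fit to estimate an unknown sample; the data values, the 4PL choice, and the bounds are illustrative assumptions, not taken from the study.

```python
import numpy as np
from scipy.optimize import curve_fit

def four_pl(x, top, bottom, ic50, hill):
    """Four-parameter logistic for a competitive immunoassay:
    the signal falls from `top` to `bottom` as analyte concentration x rises."""
    return bottom + (top - bottom) / (1.0 + (x / ic50) ** hill)

# Illustrative calibration data (PAH standard in ug/L vs. normalized signal);
# these numbers are invented for the sketch, not measured values.
conc   = np.array([0.1, 0.3, 1.0, 3.0, 10.0, 30.0, 100.0])
signal = np.array([0.98, 0.95, 0.80, 0.55, 0.30, 0.15, 0.08])

params, _ = curve_fit(four_pl, conc, signal,
                      p0=[1.0, 0.05, 5.0, 1.0],
                      bounds=([0.5, 0.0, 0.01, 0.2], [2.0, 0.5, 1000.0, 5.0]))
top, bottom, ic50, hill = params
print(f"Estimated IC50 ~ {ic50:.2f} ug/L")

def concentration_from_signal(y):
    # Invert the fitted curve to estimate an unknown sample's concentration.
    return ic50 * ((top - bottom) / (y - bottom) - 1.0) ** (1.0 / hill)

print(f"Sample at signal 0.5 -> ~{concentration_from_signal(0.5):.1f} ug/L")
```

In practice the calibration would be repeated per assay run, since the top and bottom asymptotes drift with reagent lots.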
Hess, Julie M; Slavin, Joanne L
2017-09-01
The objective was to quantify and compare the nutrient density of commonly consumed snacks using two nutrient-density measures, the Nutrient Rich Foods Indices 9.3 (NRF 9.3) and 15.3 (NRF 15.3). The study identified commonly consumed snack categories and individual snack foods, calculated NRF 9.3 and 15.3 scores, ranked snacks by category and by individual food based on nutrient density, and compared the scores generated by the two NRF indices. Outcomes were NRF 9.3 and 15.3 scores, summarized as averages and standard deviations for each snack category. Vegetables and coffee/tea received the highest category scores on both indices. Cakes/cookies/pastries and sweets had the lowest category scores. NRF 9.3 scores for individual snacks ranged from -46 (soda) to 524 (coffee). NRF 15.3 scores ranged from -45 (soda) to 736 (coffee). If added to food labels, NRF scores could help consumers identify more nutritious choices. The differences between NRF 9.3 and 15.3 scores generated for the same foods and the limitations of these indices highlight the need for careful consideration of which nutrient-density measure to include on food labels, as well as consumer education. © 2017 Institute of Food Technologists®.
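The NRF indices are built from percent daily values of nutrients to encourage and nutrients to limit. The sketch below shows one plausible way to compute an NRF 9.3-style score per 100 kcal; the daily values, the 100% cap, and the example snack are assumptions for illustration and may differ from the published scoring algorithm.

```python
# Hypothetical daily reference values; the published NRF indices use specific
# US daily values and capping rules that may differ from this sketch.
DAILY_VALUES = {
    "protein_g": 50, "fiber_g": 25, "vitamin_a_ug": 900, "vitamin_c_mg": 90,
    "vitamin_e_mg": 15, "calcium_mg": 1300, "iron_mg": 18,
    "magnesium_mg": 420, "potassium_mg": 4700,
}
LIMIT_VALUES = {"sat_fat_g": 20, "added_sugar_g": 50, "sodium_mg": 2300}

def nrf9_3(nutrients, kcal):
    """Return an NRF 9.3-style score per 100 kcal: the sum of %DV of the nine
    encouraged nutrients (each capped at 100%) minus the sum of %DV of the
    three nutrients to limit."""
    encourage = sum(min(nutrients.get(k, 0.0) / dv * 100.0, 100.0)
                    for k, dv in DAILY_VALUES.items())
    limit = sum(nutrients.get(k, 0.0) / dv * 100.0
                for k, dv in LIMIT_VALUES.items())
    return (encourage - limit) * 100.0 / kcal

# Example: a hypothetical 150 kcal yogurt snack
snack = {"protein_g": 8, "calcium_mg": 250, "added_sugar_g": 12, "sodium_mg": 80}
print(round(nrf9_3(snack, kcal=150), 1))
```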
Martín, Verónica; Perales, Celia; Fernández-Algar, María; Dos Santos, Helena G; Garrido, Patricia; Pernas, María; Parro, Víctor; Moreno, Miguel; García-Pérez, Javier; Alcamí, José; Torán, José Luis; Abia, David; Domingo, Esteban; Briones, Carlos
2016-01-01
The response of human immunodeficiency virus type 1 (HIV-1) quasispecies to antiretroviral therapy is influenced by the ensemble of mutants that composes the evolving population. Low-abundance subpopulations within HIV-1 quasispecies may determine the viral response to the administered drug combinations. However, routine sequencing assays available to clinical laboratories do not recognize HIV-1 minority variants representing less than 25% of the population. Although several alternative and more sensitive genotyping techniques have been developed, including next-generation sequencing (NGS) methods, they are usually very time-consuming, expensive and require highly trained personnel, thus becoming unrealistic approaches in daily clinical practice. Here we describe the development and testing of an HIV-1 genotyping DNA microarray that detects and quantifies, in majority and minority viral subpopulations, relevant mutations and amino acid insertions in 42 codons of the pol gene associated with drug- and multidrug-resistance to protease (PR) and reverse transcriptase (RT) inhibitors. A customized bioinformatics protocol has been implemented to analyze the microarray hybridization data by including a new normalization procedure and a stepwise filtering algorithm, which resulted in the highly accurate (96.33%) detection of positive/negative signals. This microarray has been tested with 57 subtype B HIV-1 clinical samples extracted from multi-treated patients, showing an overall identification of 95.53% and 89.24% of the queried PR and RT codons, respectively, and sufficient sensitivity to detect minority subpopulations representing as little as 5-10% of the total quasispecies. The developed genotyping platform represents an efficient diagnostic and prognostic tool useful to personalize antiviral treatments in clinical practice.
Bunnak, Phumthep; Allmendinger, Richard; Ramasamy, Sri V.; Lettieri, Paola
2016-01-01
Life-cycle assessment (LCA) is an environmental assessment tool that quantifies the environmental impact associated with a product or a process (e.g., water consumption, energy requirements, and solid waste generation). While LCA is a standard approach in many commercial industries, its application has not been exploited widely in the bioprocessing sector. To contribute toward the design of more cost-efficient, robust and environmentally-friendly manufacturing processes for monoclonal antibodies (mAbs), a framework consisting of an LCA and economic analysis combined with a sensitivity analysis of manufacturing process parameters and a production scale-up study is presented. The efficiency of the framework is demonstrated using a comparative study of the two most commonly used upstream configurations for mAb manufacture, namely fed-batch (FB) and perfusion-based processes. Results obtained by the framework are presented using a range of visualization tools, and indicate that a standard perfusion process (with a pooling duration of 4 days) has a similar cost of goods to a FB process but a larger environmental footprint because it consumed 35% more water, demanded 17% more energy, and emitted 17% more CO2 than the FB process. Water consumption was the most important impact category, especially when scaling up the processes, as energy was required to produce process water and water-for-injection, while CO2 was emitted from energy generation. The sensitivity analysis revealed that the perfusion process can be made more environmentally-friendly than the FB process if the pooling duration is extended to 8 days. © 2016 American Institute of Chemical Engineers Biotechnol. Prog., 32:1324–1335, 2016 PMID:27390260
Bunnak, Phumthep; Allmendinger, Richard; Ramasamy, Sri V; Lettieri, Paola; Titchener-Hooker, Nigel J
2016-09-01
Life-cycle assessment (LCA) is an environmental assessment tool that quantifies the environmental impact associated with a product or a process (e.g., water consumption, energy requirements, and solid waste generation). While LCA is a standard approach in many commercial industries, its application has not been exploited widely in the bioprocessing sector. To contribute toward the design of more cost-efficient, robust and environmentally-friendly manufacturing processes for monoclonal antibodies (mAbs), a framework consisting of an LCA and economic analysis combined with a sensitivity analysis of manufacturing process parameters and a production scale-up study is presented. The efficiency of the framework is demonstrated using a comparative study of the two most commonly used upstream configurations for mAb manufacture, namely fed-batch (FB) and perfusion-based processes. Results obtained by the framework are presented using a range of visualization tools, and indicate that a standard perfusion process (with a pooling duration of 4 days) has a similar cost of goods to a FB process but a larger environmental footprint because it consumed 35% more water, demanded 17% more energy, and emitted 17% more CO2 than the FB process. Water consumption was the most important impact category, especially when scaling up the processes, as energy was required to produce process water and water-for-injection, while CO2 was emitted from energy generation. The sensitivity analysis revealed that the perfusion process can be made more environmentally-friendly than the FB process if the pooling duration is extended to 8 days. © 2016 American Institute of Chemical Engineers Biotechnol. Prog., 32:1324-1335, 2016. © 2016 The Authors Biotechnology Progress published by Wiley Periodicals, Inc. on behalf of American Institute of Chemical Engineers.
Cabreira, Verónica; Pinto, Carla; Pinheiro, Manuela; Lopes, Paula; Peixoto, Ana; Santos, Catarina; Veiga, Isabel; Rocha, Patrícia; Pinto, Pedro; Henrique, Rui; Teixeira, Manuel R
2017-01-01
Lynch syndrome (LS) accounts for up to 4% of all colorectal cancers (CRC). Detection of a pathogenic germline mutation in one of the mismatch repair genes is the definitive criterion for LS diagnosis, but it is time-consuming and expensive. Immunohistochemistry is the most sensitive prescreening test and its predictive value is very high for loss of expression of MSH2, MSH6, and (isolated) PMS2, but not for MLH1. We evaluated whether LS predictive models have a role in improving the molecular testing algorithm in this specific setting by studying 38 individuals referred for molecular testing who were subsequently shown to have loss of MLH1 immunoexpression in their tumors. For each proband we calculated a risk score, which represents the probability that the patient with CRC carries a pathogenic MLH1 germline mutation, using the PREMM1,2,6 and MMRpro predictive models. Of the 38 individuals, 18.4% had a pathogenic MLH1 germline mutation. MMRpro performed better for the purpose of this study, presenting an AUC of 0.83 (95% CI 0.67-0.9; P < 0.001) compared with an AUC of 0.68 (95% CI 0.51-0.82, P = 0.09) for PREMM1,2,6. Considering a threshold of 5%, MMRpro would eliminate unnecessary germline mutation analysis in a significant proportion of cases while keeping very high sensitivity. We conclude that MMRpro is useful to correctly predict who should be screened for a germline MLH1 gene mutation and propose an algorithm to improve the cost-effectiveness of LS diagnosis.
Nicolaou, Nicoletta; Goodacre, Royston
2008-10-01
Microbiological safety plays a very significant part in the quality control of milk and dairy products worldwide. Current methods used in the detection and enumeration of spoilage bacteria in pasteurized milk in the dairy industry, although accurate and sensitive, are time-consuming. FT-IR spectroscopy is a metabolic fingerprinting technique that can potentially be used to deliver results with the same accuracy and sensitivity, within minutes after minimal sample preparation. We tested this hypothesis using attenuated total reflectance (ATR), and high throughput (HT) FT-IR techniques. Three main types of pasteurized milk - whole, semi-skimmed and skimmed - were used and milk was allowed to spoil naturally by incubation at 15 °C. Samples for FT-IR were obtained at frequent, fixed time intervals and pH and total viable counts were also recorded. Multivariate statistical methods, including principal components-discriminant function analysis and partial least squares regression (PLSR), were then used to investigate the relationship between metabolic fingerprints and the total viable counts. FT-IR ATR data for all milks showed reasonable results for bacterial loads above 10^5 cfu ml^-1. By contrast, FT-IR HT provided more accurate results for lower viable bacterial counts down to 10^3 cfu ml^-1 for whole milk and 4 x 10^2 cfu ml^-1 for semi-skimmed and skimmed milk. Using FT-IR with PLSR we were able to acquire a metabolic fingerprint rapidly and quantify the microbial load of milk samples accurately, with very little sample preparation. We believe that metabolic fingerprinting using FT-IR has very good potential for future use in the dairy industry as a rapid method of detection and enumeration.
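Relating spectral fingerprints to viable counts with PLSR can be prototyped in a few lines. The sketch below uses synthetic "spectra" and log10 counts purely as stand-ins for FT-IR data; the component number, cross-validation scheme, and data are assumptions, not the study's actual pipeline.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(0)

# Synthetic stand-in data: 60 "spectra" of 500 wavenumber points and
# matching log10 total viable counts; real FT-IR spectra would replace these.
X = rng.normal(size=(60, 500))
log_counts = rng.uniform(2, 8, size=60)
X[:, 100] += log_counts            # embed an artificial "metabolite band"

pls = PLSRegression(n_components=5)
pred = cross_val_predict(pls, X, log_counts, cv=10).ravel()

rmse = np.sqrt(np.mean((pred - log_counts) ** 2))
print(f"Cross-validated RMSE: {rmse:.2f} log10 cfu/ml")
```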
A Sensitive Luminescent Assay for the Histone Methyltransferase NSD1 and Other SAM-Dependent Enzymes
Drake, Katherine M.; Watson, Venita G.; Kisielewski, Anne; Glynn, Rebecca
2014-01-01
A major focus of our pediatric cancer research is the discovery of chemical probes to further our understanding of the biology of leukemia harboring fusion proteins arising from chromosomal rearrangements, and to develop novel specifically targeted therapies. The NUP98-NSD1 fusion protein occurs in a highly aggressive subtype of acute myeloid leukemia after rearrangement of the genes NUP98 and NSD1. The methyltransferase activity of NSD1 is retained in the fusion, and it gives rise to abnormally high levels of methylation at lysine 36 on histone 3, enforcing oncogene activation. Therefore, inhibition of the methyltransferase activity of NUP98-NSD1 may be considered a viable therapeutic strategy. Here, we report the development and validation of a highly sensitive and robust luminescence-based assay for NSD1 and other methyltransferases that use S-adenosylmethionine (SAM) as a methyl donor. The assay quantifies S-adenosylhomocysteine (SAH), which is produced during methyl transfer from SAM. SAH is converted enzymatically to adenosine monophosphate (AMP); in the process, adenosine triphosphate (ATP) is consumed and the amount of ATP remaining is measured using a luminescent assay kit. The assay was validated by pilot high-throughput screening (HTS), dose-response confirmation of hits, and elimination of artifacts through counterscreening against SAH detection in the absence of NSD1. The known methyltransferase inhibitor suramin was identified, and profiled for selectivity against the histone methyltransferases EZH2, SETD7, and PRMT1. HTS using the luminescent NSD1 assay described here has the potential to deliver selective NSD1 inhibitors that may serve as leads in the development of targeted therapies for NUP98-NSD1-driven leukemias. PMID:24927133
Xu, Ying; Cohen Hubal, Elaine A.; Little, John C.
2010-01-01
Background: Because of the ubiquitous nature of phthalates in the environment and the potential for adverse human health effects, an urgent need exists to identify the most important sources and pathways of exposure. Objectives: Using emissions of di(2-ethylhexyl) phthalate (DEHP) from vinyl flooring (VF) as an illustrative example, we describe a fundamental approach that can be used to identify the important sources and pathways of exposure associated with phthalates in indoor material. Methods: We used a three-compartment model to estimate the emission rate of DEHP from VF and the evolving exposures via inhalation, dermal absorption, and oral ingestion of dust in a realistic indoor setting. Results: A sensitivity analysis indicates that the VF source characteristics (surface area and material-phase concentration of DEHP), as well as the external mass-transfer coefficient and ventilation rate, are important variables that influence the steady-state DEHP concentration and the resulting exposure. In addition, DEHP is sorbed by interior surfaces, and the associated surface area and surface/air partition coefficients strongly influence the time to steady state. The roughly 40-fold range in predicted exposure reveals the inherent difficulty in using biomonitoring to identify specific sources of exposure to phthalates in the general population. Conclusions: The relatively simple dependence on source and chemical-specific transport parameters suggests that the mechanistic modeling approach could be extended to predict exposures arising from other sources of phthalates as well as additional sources of other semivolatile organic compounds (SVOCs) such as biocides and flame retardants. This modeling approach could also provide a relatively inexpensive way to quantify exposure to many of the SVOCs used in indoor materials and consumer products. PMID:20123613
Xu, Ying; Cohen Hubal, Elaine A; Little, John C
2010-02-01
Because of the ubiquitous nature of phthalates in the environment and the potential for adverse human health effects, an urgent need exists to identify the most important sources and pathways of exposure. Using emissions of di(2-ethylhexyl) phthalate (DEHP) from vinyl flooring (VF) as an illustrative example, we describe a fundamental approach that can be used to identify the important sources and pathways of exposure associated with phthalates in indoor material. We used a three-compartment model to estimate the emission rate of DEHP from VF and the evolving exposures via inhalation, dermal absorption, and oral ingestion of dust in a realistic indoor setting. A sensitivity analysis indicates that the VF source characteristics (surface area and material-phase concentration of DEHP), as well as the external mass-transfer coefficient and ventilation rate, are important variables that influence the steady-state DEHP concentration and the resulting exposure. In addition, DEHP is sorbed by interior surfaces, and the associated surface area and surface/air partition coefficients strongly influence the time to steady state. The roughly 40-fold range in predicted exposure reveals the inherent difficulty in using biomonitoring to identify specific sources of exposure to phthalates in the general population. The relatively simple dependence on source and chemical-specific transport parameters suggests that the mechanistic modeling approach could be extended to predict exposures arising from other sources of phthalates as well as additional sources of other semivolatile organic compounds (SVOCs) such as biocides and flame retardants. This modeling approach could also provide a relatively inexpensive way to quantify exposure to many of the SVOCs used in indoor materials and consumer products.
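The paper's three-compartment emission model is not reproduced here, but the same kind of mass-balance reasoning can be illustrated with a single well-mixed compartment in which DEHP is emitted from the flooring at a rate set by an external mass-transfer coefficient and removed by ventilation. All parameter values and the simplified one-compartment structure below are assumptions for illustration only.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Illustrative parameters (not from the study): a single well-mixed room.
A  = 20.0   # flooring area, m^2
h  = 1.0    # external mass-transfer coefficient, m/h
y0 = 1.0    # gas-phase DEHP in equilibrium with the flooring, ug/m^3
Q  = 50.0   # ventilation rate, m^3/h
V  = 50.0   # room volume, m^3

def dCdt(t, C):
    emission = h * A * (y0 - C[0])   # flux from the flooring surface
    removal  = Q * C[0]              # loss via ventilation
    return [(emission - removal) / V]

sol = solve_ivp(dCdt, (0.0, 500.0), [0.0], dense_output=True)
print(f"Analytical steady state: {h*A*y0/(h*A + Q):.3f} ug/m^3")
print(f"Simulated C at t=500 h:  {sol.y[0, -1]:.3f} ug/m^3")
```

Even this toy version reproduces the qualitative sensitivity noted above: the steady-state concentration scales with the source area and mass-transfer coefficient and falls as ventilation increases.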
Automated Morphological Analysis of Microglia After Stroke.
Heindl, Steffanie; Gesierich, Benno; Benakis, Corinne; Llovera, Gemma; Duering, Marco; Liesz, Arthur
2018-01-01
Microglia are the resident immune cells of the brain and react quickly to changes in their environment with transcriptional regulation and morphological changes. Brain tissue injury such as ischemic stroke induces a local inflammatory response encompassing microglial activation. The change in activation status of a microglial cell is reflected in its gradual morphological transformation from a highly ramified into a less ramified or amoeboid cell shape. For this reason, the morphological changes of microglia are widely utilized to quantify microglial activation and to study their involvement in virtually all brain diseases. However, the currently available methods, which are mainly based on manual rating of immunofluorescent microscopic images, are often inaccurate, rater-biased, and highly time-consuming. To address these issues, we created a fully automated image analysis tool that enables the analysis of microglia morphology from a confocal Z-stack and provides up to 59 morphological features. We developed the algorithm on an exploratory dataset of microglial cells from a stroke mouse model and validated the findings on an independent data set. In both datasets, we could demonstrate the ability of the algorithm to sensitively discriminate between the microglia morphology in the peri-infarct and the contralateral, unaffected cortex. Dimensionality reduction by principal component analysis allowed us to generate a highly sensitive compound score for microglial shape analysis. Finally, we tested for concordance of results between the novel automated analysis tool and the conventional manual analysis and found a high degree of correlation. In conclusion, our novel method for the fully automated analysis of microglia morphology shows excellent accuracy and time efficiency compared to traditional analysis methods. This tool, which we make openly available, could find application in studying microglia morphology using fluorescence imaging in a wide range of brain disease models.
Detection of Antigenic Variants of Subtype H3 Swine Influenza A Viruses from Clinical Samples.
Martin, Brigitte E; Bowman, Andrew S; Li, Lei; Nolting, Jacqueline M; Smith, David R; Hanson, Larry A; Wan, Xiu-Feng
2017-04-01
A large population of genetically and antigenically diverse influenza A viruses (IAVs) is circulating among the swine population, playing an important role in influenza ecology. Swine IAVs not only cause outbreaks among swine but also can be transmitted to humans, causing sporadic infections and even pandemic outbreaks. Antigenic characterizations of swine IAVs are key to understanding the natural history of these viruses in swine and to selecting strains for effective vaccines. However, influenza outbreaks generally spread rapidly among swine, and the conventional methods for antigenic characterization require virus propagation, a time-consuming process that can significantly reduce the effectiveness of vaccination programs. We developed and validated a rapid, sensitive, and robust method, the polyclonal serum-based proximity ligation assay (polyPLA), to identify antigenic variants of subtype H3N2 swine IAVs. This method utilizes oligonucleotide-conjugated polyclonal antibodies and quantifies antibody-antigen binding affinities by quantitative reverse transcription-PCR (qRT-PCR). Results showed the assay can rapidly detect H3N2 IAVs directly from nasal wash or nasal swab samples collected from laboratory-challenged animals or during influenza surveillance at county fairs. In addition, polyPLA can accurately separate viruses in two contemporary swine IAV antigenic clusters (H3N2 swine IAV-α and H3N2 swine IAV-β) with a sensitivity of 84.9% and a specificity of 100.0%. The polyPLA can be routinely used in surveillance programs to detect antigenic variants of influenza viruses and to select vaccine strains for use in controlling and preventing disease in swine. Copyright © 2017 American Society for Microbiology.
Nonlinear ultrasonic stimulated thermography for damage assessment in isotropic fatigued structures
NASA Astrophysics Data System (ADS)
Fierro, Gian Piero Malfense; Calla', Danielle; Ginzburg, Dmitri; Ciampa, Francesco; Meo, Michele
2017-09-01
Traditional non-destructive evaluation (NDE) and structural health monitoring (SHM) systems are used to verify that a structure is free of harmful damage. However, these techniques still lack sensitivity to detect the presence of material micro-flaws in the form of fatigue damage and often require time-consuming procedures and expensive equipment. This research work presents a novel "nonlinear ultrasonic stimulated thermography" (NUST) method able to overcome some of the limitations of traditional linear ultrasonic/thermography NDE-SHM systems and to provide a reliable, rapid and cost-effective estimation of fatigue damage in isotropic materials. Such a hybrid imaging approach combines the high sensitivity of nonlinear acoustic/ultrasonic techniques to detect micro-damage, with local defect frequency selection and infrared imaging. When exciting structures with an optimised frequency, nonlinear elastic waves are observed and higher frictional work at the fatigue-damaged area is generated due to clapping and rubbing of the crack faces. This results in heat at the cracked location that can be measured using an infrared camera. A laser vibrometer (LV) was used to evaluate the extent to which individual frequency components contribute to the heating of the damage region by quantifying the out-of-plane velocity associated with the fundamental and second-order harmonic responses. The relationship between a nonlinear ultrasound parameter (β ratio) of the material's nonlinear response and the actual temperature rise near the crack was demonstrated experimentally. These results demonstrated that heat generation at damaged regions could be amplified by exciting at frequencies that provide nonlinear responses, thus improving the imaging of material damage and the reliability of NUST in a quick and reproducible manner.
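The β ratio referred to above is typically estimated from the amplitudes of the fundamental and second-harmonic components of the measured response. The sketch below computes one common form, A2/A1², from a synthetic vibrometer-style signal; the signal, frequencies, and this particular β definition are assumptions for illustration and do not reproduce the authors' processing chain.

```python
import numpy as np

fs = 1.0e6                      # sampling rate, Hz
t = np.arange(0, 0.01, 1 / fs)  # 10 ms record
f0 = 50e3                       # excitation frequency, Hz

# Synthetic out-of-plane velocity with a weak second harmonic standing in for
# clapping/rubbing at a fatigue crack; amplitudes are illustrative.
signal = 1.0 * np.sin(2 * np.pi * f0 * t) + 0.02 * np.sin(2 * np.pi * 2 * f0 * t)

spectrum = np.abs(np.fft.rfft(signal)) / len(signal)
freqs = np.fft.rfftfreq(len(signal), 1 / fs)

A1 = spectrum[np.argmin(np.abs(freqs - f0))]       # fundamental amplitude
A2 = spectrum[np.argmin(np.abs(freqs - 2 * f0))]   # second-harmonic amplitude

beta_ratio = A2 / A1**2   # one common relative nonlinearity parameter
print(f"A1={A1:.4f}, A2={A2:.4f}, beta_ratio={beta_ratio:.4f}")
```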
Wilson, Stephen M; Eriksson, Dana K; Schneck, Sarah M; Lucanie, Jillian M
2018-01-01
This paper describes a quick aphasia battery (QAB) that aims to provide a reliable and multidimensional assessment of language function in about a quarter of an hour, bridging the gap between comprehensive batteries that are time-consuming to administer, and rapid screening instruments that provide limited detail regarding individual profiles of deficits. The QAB is made up of eight subtests, each comprising sets of items that probe different language domains, vary in difficulty, and are scored with a graded system to maximize the informativeness of each item. From the eight subtests, eight summary measures are derived, which constitute a multidimensional profile of language function, quantifying strengths and weaknesses across core language domains. The QAB was administered to 28 individuals with acute stroke and aphasia, 25 individuals with acute stroke but no aphasia, 16 individuals with chronic post-stroke aphasia, and 14 healthy controls. The patients with chronic post-stroke aphasia were tested 3 times each and scored independently by 2 raters to establish test-retest and inter-rater reliability. The Western Aphasia Battery (WAB) was also administered to these patients to assess concurrent validity. We found that all QAB summary measures were sensitive to aphasic deficits in the two groups with aphasia. All measures showed good or excellent test-retest reliability (overall summary measure: intraclass correlation coefficient (ICC) = 0.98), and excellent inter-rater reliability (overall summary measure: ICC = 0.99). Sensitivity and specificity for diagnosis of aphasia (relative to clinical impression) were 0.91 and 0.95 respectively. All QAB measures were highly correlated with corresponding WAB measures where available. Individual patients showed distinct profiles of spared and impaired function across different language domains. In sum, the QAB efficiently and reliably characterized individual profiles of language deficits.
Eriksson, Dana K.; Schneck, Sarah M.; Lucanie, Jillian M.
2018-01-01
This paper describes a quick aphasia battery (QAB) that aims to provide a reliable and multidimensional assessment of language function in about a quarter of an hour, bridging the gap between comprehensive batteries that are time-consuming to administer, and rapid screening instruments that provide limited detail regarding individual profiles of deficits. The QAB is made up of eight subtests, each comprising sets of items that probe different language domains, vary in difficulty, and are scored with a graded system to maximize the informativeness of each item. From the eight subtests, eight summary measures are derived, which constitute a multidimensional profile of language function, quantifying strengths and weaknesses across core language domains. The QAB was administered to 28 individuals with acute stroke and aphasia, 25 individuals with acute stroke but no aphasia, 16 individuals with chronic post-stroke aphasia, and 14 healthy controls. The patients with chronic post-stroke aphasia were tested 3 times each and scored independently by 2 raters to establish test-retest and inter-rater reliability. The Western Aphasia Battery (WAB) was also administered to these patients to assess concurrent validity. We found that all QAB summary measures were sensitive to aphasic deficits in the two groups with aphasia. All measures showed good or excellent test-retest reliability (overall summary measure: intraclass correlation coefficient (ICC) = 0.98), and excellent inter-rater reliability (overall summary measure: ICC = 0.99). Sensitivity and specificity for diagnosis of aphasia (relative to clinical impression) were 0.91 and 0.95 respectively. All QAB measures were highly correlated with corresponding WAB measures where available. Individual patients showed distinct profiles of spared and impaired function across different language domains. In sum, the QAB efficiently and reliably characterized individual profiles of language deficits. PMID:29425241
Rasmussen, Erin B; Reilly, William; Buckley, Jessica; Boomhower, Steven R
2012-02-01
Research on free-food intake suggests that cannabinoids are implicated in the regulation of feeding. Few studies, however, have characterized how environmental factors that affect food procurement interact with cannabinoid drugs that reduce food intake. Demand analysis provides a framework to understand how cannabinoid blockers, such as rimonabant, interact with effort in reducing demand for food. The present study examined the effects of rimonabant on demand for sucrose in obese Zucker rats when the effort required to obtain food varied, and characterized the data using the exponential ("essential value") model of demand. Twenty-nine male (15 lean, 14 obese) Zucker rats lever-pressed under eight fixed ratio (FR) schedules of sucrose reinforcement, in which the number of lever-presses to gain access to a single sucrose pellet varied between 1 and 300. After behavior stabilized under each FR schedule, acute doses of rimonabant (1-10 mg/kg) were administered prior to some sessions. The number of food reinforcers and responses in each condition was averaged and the exponential and linear demand equations were fit to the data. These demand equations quantify the value of a reinforcer by its sensitivity to price (FR) increases. Under vehicle conditions, obese Zucker rats consumed more sucrose pellets than lean rats at smaller fixed ratios; however, they were equally sensitive to price increases with both models of demand. Rimonabant dose-dependently reduced reinforcers and responses for lean and obese rats across all FR schedules. Data from the exponential analysis suggest that rimonabant dose-dependently increased elasticity, i.e., reduced the essential value of sucrose, a finding that is consistent with graphical depictions of normalized demand curves. Copyright © 2011 Elsevier Inc. All rights reserved.
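The exponential ("essential value") demand model referred to above is usually written as log10 Q = log10 Q0 + k(exp(-alpha * Q0 * C) - 1), where Q is consumption, C the price (here the FR requirement), Q0 the demand intensity, and alpha the elasticity parameter. A minimal curve-fitting sketch follows; the consumption data, the fixed k, and the bounds are invented for illustration and are not the study's values.

```python
import numpy as np
from scipy.optimize import curve_fit

# Fixed-ratio "prices" and illustrative reinforcers earned per session.
price = np.array([1, 3, 10, 30, 60, 100, 200, 300], dtype=float)
consumption = np.array([110, 105, 95, 70, 45, 30, 12, 5], dtype=float)

K = 3.0  # span of consumption in log10 units, usually held constant across groups

def exponential_demand(C, Q0, alpha):
    """Exponential demand: log10(Q) = log10(Q0) + K * (exp(-alpha * Q0 * C) - 1)."""
    return np.log10(Q0) + K * (np.exp(-alpha * Q0 * C) - 1.0)

params, _ = curve_fit(exponential_demand, price, np.log10(consumption),
                      p0=[100.0, 1e-4], bounds=([1e-3, 1e-8], [1e4, 1.0]))
Q0, alpha = params
print(f"Q0 (demand intensity) ~ {Q0:.1f} pellets, alpha (elasticity) ~ {alpha:.6f}")
# A larger alpha means consumption falls faster with price, i.e. lower essential value.
```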
Siruguri, Vasanthi; Bhat, Ramesh V
2015-01-11
Measurement of dietary intake of spices is gaining significance because of recognition of their health-promoting benefits as well as their use for risk assessment of contaminant exposures. Estimating intake of spices at the individual level presents several challenges, since various spices are used as an integrated part of a prepared food and consumed in amounts much smaller than other dietary components. The objective of the present study was to assess intake of spices at the household and individual level on the basis of the pattern of spice use and the portion size of spice consumed from routinely prepared dishes in Hyderabad city in Southern India. The study was conducted in 100 households in urban areas of Hyderabad city in India with the help of a spice intake questionnaire that was prepared to collect information on the pattern of spice use, frequency, and quantity of spice consumption of 17 spices routinely used in Indian cuisine. The quantity of spice intake was assessed by measuring the portion size of spice consumed from the quantity of i) spices added in routinely prepared dishes and ii) the prepared dish consumed by an individual. Based on the type of dish prepared and the frequency of preparing the dishes, 11 out of 17 spices were found to be consumed by more than 50% of the households. The maximum number of spices was consumed at weekly frequencies. Red chillies and turmeric were the most frequently consumed spices, used by 100% of the households. The mean total intake of spices was observed to be higher through dishes consumed daily (10.4 g/portion) than from those consumed at weekly or monthly frequencies. The highest portion size intake was observed for chillies (mean 3.0 g; range 0.05-20.2 g) and the lowest for nutmeg (mean 0.14 g; range 0.02-0.64 g) and mace (mean 0.21 g; range 0.02-0.6 g). The study suggested that assessment of intake of spices varies with the frequency of use of spices and the type of dish consumed. Estimating the portion sizes of spices consumed and the frequency of consumption of spice-containing dishes facilitates quantification of spice intake at the individual level.
Puig-Junoy, Jaume; Moreno-Torres, Iván
2010-12-01
To assess the impact of competition on the consumer price and the average price paid by the National Health System (NHS) under reference pricing (RP) in the Spanish generic market. Descriptive analysis of the time trend in consumer prices before and after the application of reference pricing for the eight most sold active ingredients from 1997 to 2009. The entry of a generic at a lower consumer price than that of the brand-name pharmaceutical or the first generic does not cause a voluntary reduction in the consumer price of either the brand drug or the first generic, either before or after the application of RP. Generic entry at a lower consumer price than previously existing pharmaceuticals always causes a slight reduction in the average price paid by the NHS; however, the average price paid by the NHS is always notably higher than the lowest price, the difference being greater in relative terms under reference pricing. The Spanish RP system results in very little consumer price competition between generic firms, price reduction thus being limited to regulatory measures. NHS purchases show little sensitivity to price differences between equivalent drugs priced at or below the reference price. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.
Association of Alimentary Factors and Nutritional Status with Caries in Children of Leon, Mexico.
Guizar, Juan Manuel; Muñoz, Nathalie; Amador, Norma; Garcia, Gabriela
To determine the association between the types of food consumed, nutritional status (BMI) and caries in schoolchildren. A cross-sectional study was performed with 224 schoolchildren 6 to 12 years of age. DMFT/dmft indices, level of oral hygiene, nutritional status as quantified by BMI and types of food consumed were determined in all participants. Data were analysed using multiple linear regression with significance set at p < 0.05. Caries prevalence was 36%. In the multiple linear regression analysis adjusted for BMI, variables related to a higher number of caries were younger age and lower intake of vitamin D, calcium and fiber, with higher consumption of phosphorus and carbohydrates (R2 = 0.30; p < 0.0001 for the model). Sweetened soft drinks and chewy candy were risk factors for higher caries prevalence, while consumption of milk and carrots was protective. Caries in schoolchildren is highly prevalent in this community and is related to younger age and lower intake of vitamin D, calcium and fiber, but a higher consumption of phosphorus and carbohydrates. No relationship was found between caries and nutritional status.
Hong, Young-seoub; Ye, Byeong-jin; Kim, Yu-mi; Kim, Byoung-gwon; Kang, Gyeong-hui; Kim, Jeong-jin; Song, Ki-hoon; Kim, Young-hun
2017-01-01
Recent epidemiological studies have reported adverse health effects, including skin cancer, due to low concentrations of arsenic in drinking water. We conducted a study to assess whether ground water contaminated with low concentrations of arsenic affected the health of the residents who consumed it. For precise biomonitoring results, the inorganic (trivalent arsenite (As III) and pentavalent arsenate (As V)) and organic forms (monomethylarsonate (MMA) and dimethylarsinate (DMA)) of arsenic were separately quantified in urine samples by combining high-performance liquid chromatography and inductively coupled plasma mass spectrometry. In conclusion, urinary As III, As V, MMA, and hair arsenic concentrations were significantly higher in residents who consumed arsenic-contaminated ground water than in control participants who consumed tap water. However, most health screening results did not show a statistically significant difference between exposed and control subjects. We presume that the elevated arsenic concentrations may not be sufficient to cause detectable health effects. Consumption of arsenic-contaminated ground water could result in elevated urinary organic and inorganic arsenic concentrations. We recommend immediate discontinuation of the ground water supply in this area for the safety of the residents. PMID:29186890
Bulluck, Heerajnarain; Hammond-Haley, Matthew; Fontana, Marianna; Knight, Daniel S; Sirker, Alex; Herrey, Anna S; Manisty, Charlotte; Kellman, Peter; Moon, James C; Hausenloy, Derek J
2017-08-01
A comprehensive cardiovascular magnetic resonance (CMR) study in reperfused ST-segment elevation myocardial infarction (STEMI) patients can be challenging to perform and can be time-consuming. We aimed to investigate whether native T1-mapping can accurately delineate the edema-based area-at-risk (AAR) and whether post-contrast T1-mapping and synthetic late gadolinium enhancement (LGE) images can quantify MI size at 1.5 T. Conventional LGE imaging and T2-mapping could then be omitted, thereby shortening the scan duration. Twenty-eight STEMI patients underwent a CMR scan at 1.5 T, 3 ± 1 days following primary percutaneous coronary intervention. The AAR was quantified using both native T1 and T2-mapping. MI size was quantified using conventional LGE, post-contrast T1-mapping and synthetic magnitude-reconstructed inversion recovery (MagIR) LGE and synthetic phase-sensitive inversion recovery (PSIR) LGE, derived from the post-contrast T1 maps. Native T1-mapping performed as well as T2-mapping in delineating the AAR (41.6 ± 11.9% of the left ventricle [% LV] versus 41.7 ± 12.2% LV, P = 0.72; R2 = 0.97; ICC 0.986 (0.969-0.993); bias -0.1 ± 4.2% LV). There were excellent correlation and inter-method agreement, with no bias, between MI size by conventional LGE, synthetic MagIR LGE (bias 0.2 ± 2.2% LV, P = 0.35), synthetic PSIR LGE (bias 0.4 ± 2.2% LV, P = 0.060) and post-contrast T1-mapping (bias 0.3 ± 1.8% LV, P = 0.10). The mean scan duration was 58 ± 4 min. Not performing T2 mapping (6 ± 1 min) and conventional LGE (10 ± 1 min) would shorten the CMR study by 15-20 min. T1-mapping can accurately quantify both the edema-based AAR (using native T1 maps) and acute MI size (using post-contrast T1 maps) in STEMI patients without major cardiovascular risk factors. This approach would shorten the duration of a comprehensive CMR study without significantly compromising data acquisition and would obviate the need to perform T2 maps and LGE imaging.
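The bias figures quoted above are the kind of output produced by a Bland-Altman style agreement analysis between two measurement methods. The sketch below computes the mean difference and 95% limits of agreement for two hypothetical infarct-size methods; the paired values are synthetic and not the study's data.

```python
import numpy as np

# Illustrative paired infarct-size measurements (% LV) by two methods.
conventional_lge = np.array([12.0, 25.5, 8.3, 30.1, 18.7, 22.4, 15.0, 27.8])
synthetic_lge    = np.array([12.4, 25.0, 8.9, 30.5, 18.2, 22.9, 15.3, 27.1])

diff = synthetic_lge - conventional_lge
bias = diff.mean()
sd   = diff.std(ddof=1)

# Bland-Altman style summary: mean difference and 95% limits of agreement.
print(f"bias = {bias:+.2f} %LV, limits of agreement = "
      f"[{bias - 1.96*sd:+.2f}, {bias + 1.96*sd:+.2f}] %LV")
```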
A clustering approach to segmenting users of internet-based risk calculators.
Harle, C A; Downs, J S; Padman, R
2011-01-01
Risk calculators are widely available Internet applications that deliver quantitative health risk estimates to consumers. Although these tools are known to have varying effects on risk perceptions, little is known about who will be more likely to accept objective risk estimates. The objective was to identify clusters of online health consumers that help explain variation in individual improvement in risk perceptions from web-based quantitative disease risk information. A secondary analysis was performed on data collected in a field experiment that measured people's pre-diabetes risk perceptions before and after visiting a realistic health promotion website that provided quantitative risk information. K-means clustering was performed on numerous candidate variable sets, and the different segmentations were evaluated based on between-cluster variation in risk perception improvement. Variation in responses to risk information was best explained by clustering on pre-intervention absolute pre-diabetes risk perceptions and an objective estimate of personal risk. Members of a high-risk overestimator cluster showed large improvements in their risk perceptions, but clusters of both moderate-risk and high-risk underestimators were much more muted in improving their optimistically biased perceptions. Cluster analysis provided a unique approach for segmenting health consumers and predicting their acceptance of quantitative disease risk information. These clusters suggest that health consumers were very responsive to good news, but tended not to incorporate much of the bad news into their self-perceptions. These findings help to quantify variation among online health consumers and may inform the targeted marketing of and improvements to risk communication tools on the Internet.
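A k-means segmentation of the sort described above can be sketched in a few lines. The two clustering variables, their scales, and the cluster count below are assumptions standing in for the study's measures, and the data are randomly generated.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)

# Synthetic stand-ins for the two clustering variables used in the study:
# baseline perceived risk and an objective risk estimate (both on a 0-100 scale).
perceived = rng.uniform(0, 100, size=200)
objective = rng.uniform(0, 100, size=200)
X = StandardScaler().fit_transform(np.column_stack([perceived, objective]))

kmeans = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X)
for label in range(3):
    members = kmeans.labels_ == label
    print(f"cluster {label}: n={members.sum()}, "
          f"mean perceived={perceived[members].mean():.1f}, "
          f"mean objective={objective[members].mean():.1f}")
```

In a real analysis the segmentation would then be evaluated by comparing risk-perception improvement across clusters, as the abstract describes.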
Giombi, Kristen C; Kosa, Katherine M; Rains, Carrie; Cates, Sheryl C
2018-03-21
Edible marijuana products have become extremely popular in states that have legalized marijuana for recreational use. The goal of this research was to provide a better understanding of consumer perceptions of edible marijuana products, including why they prefer edibles relative to other forms of marijuana (e.g., smoking) and their concerns regarding the consumption of edibles. We conducted eight focus groups (four groups in Denver, Colorado, and four groups in Seattle, Washington) in February 2016 with 62 adult consumers of edibles. Focus group transcripts were coded in QSR NVivo 10.0 qualitative analysis software, and coding reports identified trends across participants. Most participants preferred edibles to smoking marijuana because there is no smell from smoke and no secondhand smoke. Other reasons participants like edibles included convenience, discreetness, longer-lasting highs, less intense highs, and edibles' ability to aid in relaxation and reduce anxiety more so than smoking marijuana. Concerns and dislikes about edibles included delayed effects, unexpected highs, the unpredictability of the high, and inconsistency of distribution of marijuana in the product. No participants in either location mentioned harmful health effects from consuming edibles as a concern. The present study was qualitative in nature and provides a good starting point for further research to quantify through surveys how consumers understand and use edibles. Such information will help guide policy makers and regulators as they establish regulations for edibles. Also, such research can help inform educational campaigns on proper use of edibles for recreational purposes.
Using TRMM Data To Understand Interannual Variations In the Tropical Water Balance
NASA Technical Reports Server (NTRS)
Robertson, Franklin R.; Fitzjarrald, Dan; Arnold, James E. (Technical Monitor)
2002-01-01
A significant element of the science rationale for TRMM centered on assembling rainfall data needed to validate climate models: climatological estimates of precipitation, its spatial and temporal variability, and vertical modes of latent heat release. Since the launch of TRMM, great interest has emerged in the science community in quantifying interannual variability (IAV) of precipitation and its relationship to sea-surface temperature (SST) changes. The fact that TRMM has sampled one strong warm/cold ENSO couplet, together with the prospect for a mission lifetime approaching ten years, has bolstered interest in these longer time scales. Variability on a regional basis as well as for the tropics as a whole is of concern. Our analysis of TRMM results so far has shown a surprising lack of concordance between various algorithms in quantifying IAV of precipitation. The first objective of this talk is to quantify the sensitivity of tropical precipitation to changes in SSTs. We analyze the performance of the 3A11, 3A25, and 3B31 algorithms and investigate their relationship to scattering-based algorithms constructed from SSM/I and TRMM 85 GHz data. The physical basis for the differences (and similarities) in depicting tropical oceanic and land rainfall will be discussed. We argue that scattering-based estimates of variability constitute a useful upper bound for precipitation variations. These results lead to the second question addressed in this talk: How do TRMM precipitation/SST sensitivities compare to estimates of oceanic evaporation, and what are the implications of these uncertainties in determining interannual changes in large-scale moisture transport? We summarize results of an analysis performed using COADS data supplemented by SSM/I estimates of near-surface variables to assess evaporation sensitivity to SST. The response of nearly 5 W m^-2 K^-1 is compared to various TRMM precipitation sensitivities. Implied moisture convergence over the tropics and its sensitivity to errors in these algorithms is discussed.
Chiu, Bernard; Chen, Weifu; Cheng, Jieyu
2016-12-01
Rapid progression in total plaque area and volume measured from ultrasound images has been shown to be associated with an elevated risk of cardiovascular events. Since atherosclerosis is focal and occurs predominantly at the bifurcation, biomarkers that are able to quantify the spatial distribution of vessel-wall-plus-plaque thickness (VWT) change may allow for more sensitive detection of treatment effect. The goal of this paper is to develop simple and sensitive biomarkers to quantify responsiveness to therapies based on the spatial distribution of VWT-Change on the entire 2D carotid standardized map previously described. Point-wise VWT-Changes computed for each patient were reordered lexicographically into a high-dimensional data node in a graph. A graph-based random walk framework was applied with a novel weighted cosine (WCos) similarity function tailored for quantifying responsiveness to therapy. The converging probability of each data node to the VWT regression template in the random walk process served as a scalar descriptor of VWT responsiveness to treatment. The WCos-based biomarker was 14 times more sensitive than the mean VWT-Change in discriminating responsive and unresponsive subjects based on the p-values obtained in t-tests. The proposed framework was extended to quantify where VWT-Change occurred by including multiple VWT-Change distribution templates representing focal changes at different regions. Experimental results show that the framework was effective in classifying carotid arteries with focal VWT-Change at different locations and may facilitate future investigations to correlate risk of cardiovascular events with the location where focal VWT-Change occurs. Copyright © 2016 Elsevier Ltd. All rights reserved.
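The exact WCos function used in the paper is not reproduced here, but a generic weighted cosine similarity between a patient's point-wise VWT-change map and a regression template might look like the sketch below; the uniform weighting and synthetic data are assumptions for illustration only.

```python
import numpy as np

def weighted_cosine(u, v, w):
    """Cosine similarity with per-vertex weights w (e.g., to emphasise regions
    of the carotid map where VWT change is most informative). This is a generic
    weighted cosine, not necessarily the paper's WCos definition."""
    num = np.sum(w * u * v)
    den = np.sqrt(np.sum(w * u * u)) * np.sqrt(np.sum(w * v * v))
    return num / den

rng = np.random.default_rng(2)
vwt_change_patient  = rng.normal(-0.1, 0.3, size=1000)  # synthetic point-wise VWT change
vwt_change_template = np.full(1000, -0.3)               # synthetic "regression" template
weights             = np.ones(1000)                     # uniform weights for the sketch

sim = weighted_cosine(vwt_change_patient, vwt_change_template, weights)
print(f"WCos-style similarity to the regression template: {sim:.3f}")
```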
Predictive value of auscultation of femoropopliteal arteries.
Kaufmann, Carla; Jacomella, Vincenzo; Kovacicova, Ludmila; Husmann, Marc; Clemens, Robert K; Thalhammer, Christopf; Amann-Vesti, Beatrice R
2013-03-05
Femoropopliteal bruits indicate flow turbulences and increased blood flow velocity, usually caused by an atherosclerotic plaque or stenosis. No data exist on the quality of bruits as a means for quantifying the degree of stenosis. We therefore conducted a prospective observational study to investigate the sensitivity and specificity of femoropopliteal auscultation, differentiated on the basis of bruit quality, to detect and quantify clinically relevant stenoses in patients with symptomatic and asymptomatic peripheral arterial disease (PAD). Patients with known chronic and stable PAD were recruited in the outpatient clinic. We included patients with known PAD and an ankle-brachial index (ABI) <0.90 and/or an ABI ≥0.90 with a history of lower limb revascularisation. Auscultation was performed independently by three investigators with varied clinical experience after a 10-minute period of rest. Femoropopliteal lesions were classified as follows: normal vessel wall or slight wall thickening (<20%), atherosclerotic plaque with below 50% reduction of the vessel lumen, prestenotic/intrastenotic ratio over 2.5 (<70%), over 3.5 (<99%) and complete occlusion (100%). Weighted Cohen's κ coefficients for differentiated auscultation were low in all vascular regions and did not differ between investigators. Sensitivity was low in most areas with an increase after exercise. The highest sensitivity in detecting relevant (>50%) stenosis was found in the common femoral artery (86%). Vascular auscultation is known to be of great use in routine clinical practice in recognising arterial abnormalities. Diagnosis of PAD is based on various diagnostic tools (pulse palpation, ABI measurement) and auscultation can localise relevant stenosis. However, auscultation alone is of limited sensitivity and specificity in grading stenosis in femoropopliteal arteries. Where PAD is clinically suspected further diagnostic tools, especially colour-coded duplex ultrasound, should be employed to quantify the underlying lesion.
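Agreement and diagnostic-accuracy statistics of the kind reported above can be computed as in the sketch below, which derives a weighted Cohen's kappa and the sensitivity/specificity for a relevant (>50%) stenosis cut-off from synthetic ordinal grades; the five-level grading scale and the data are illustrative stand-ins, not the study's values.

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score

# Synthetic example: auscultation grades vs. duplex ultrasound grades
# on an ordinal 0-4 stenosis scale (values are illustrative only).
duplex       = np.array([0, 1, 2, 2, 3, 3, 4, 1, 2, 4, 0, 3])
auscultation = np.array([0, 1, 1, 2, 2, 3, 4, 0, 2, 3, 1, 3])

# Weighted Cohen's kappa for ordinal agreement between the two methods.
kappa = cohen_kappa_score(duplex, auscultation, weights="linear")

# Sensitivity/specificity for detecting a "relevant" stenosis,
# here coded as grade >= 2 on the synthetic scale.
truth = duplex >= 2
test  = auscultation >= 2
sensitivity = (test & truth).sum() / truth.sum()
specificity = (~test & ~truth).sum() / (~truth).sum()
print(f"kappa={kappa:.2f}, sensitivity={sensitivity:.2f}, specificity={specificity:.2f}")
```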
Appelhans, Bradley M.; Woolf, Kathleen; Pagoto, Sherry L.; Schneider, Kristin L.; Whited, Matthew C.; Liebman, Rebecca
2012-01-01
Overeating is believed to result when the appetitive motivation to consume palatable food exceeds an individual’s capacity for inhibitory control of eating. This hypothesis was supported in recent studies involving predominantly normal weight women, but has not been tested in obese populations. The current study tested the interaction between food reward sensitivity and inhibitory control in predicting palatable food intake among energy-replete overweight and obese women (N=62). Sensitivity to palatable food reward was measured with the Power of Food Scale. Inhibitory control was assessed with a computerized choice task that captures the tendency to discount large delayed rewards relative to smaller immediate rewards. Participants completed an eating in the absence of hunger protocol in which homeostatic energy needs were eliminated with a bland preload of plain oatmeal, followed by a bogus laboratory taste test of palatable and bland snacks. The interaction between food reward sensitivity and inhibitory control was a significant predictor of palatable food intake in regression analyses controlling for body mass index and the amount of preload consumed. Probing this interaction indicated that higher food reward sensitivity predicted greater palatable food intake at low levels of inhibitory control, but was not associated with intake at high levels of inhibitory control. As expected, no associations were found in a similar regression analysis predicting intake of bland foods. Findings support a neurobehavioral model of eating behavior in which sensitivity to palatable food reward drives overeating only when accompanied by insufficient inhibitory control. Strengthening inhibitory control could enhance weight management programs. PMID:21475139
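The reward-sensitivity by inhibitory-control interaction described above is a standard moderation model. A minimal regression sketch follows; the synthetic data, invented effect sizes, and variable names are assumptions standing in for the study's measures.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n = 62

# Synthetic stand-ins for the study's measures; effect sizes are invented.
df = pd.DataFrame({
    "reward_sensitivity": rng.normal(0, 1, n),   # e.g., Power of Food Scale (standardized)
    "inhibitory_control": rng.normal(0, 1, n),   # e.g., reversed delay-discounting rate
    "bmi": rng.normal(32, 4, n),
})
df["intake_kcal"] = (200 + 60 * df.reward_sensitivity
                     - 40 * df.reward_sensitivity * df.inhibitory_control
                     + rng.normal(0, 50, n))

# Moderation model: does inhibitory control dampen the reward-sensitivity effect?
model = smf.ols("intake_kcal ~ reward_sensitivity * inhibitory_control + bmi",
                data=df).fit()
print(model.params[["reward_sensitivity",
                    "reward_sensitivity:inhibitory_control"]])
```

Probing such a model at high and low levels of the moderator, as the authors describe, amounts to evaluating the simple slope of reward sensitivity at chosen values of inhibitory control.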
The use of QLF to quantify in vitro whitening in a product testing model.
Pretty, I A; Edgar, W M; Higham, S M
2001-11-24
Professional and consumer interest in whitening products continues to increase against a background of both increased oral health awareness and demand for cosmetic procedures. In the current legal climate, few dentists are providing 'in-office' whitening treatments, and thus many patients turn to home-use products. The most common of these are the whitening toothpastes. Researchers are keen to quantify the effectiveness of such products through clinically relevant trials. Previous studies examining whitening products have employed a variety of stained substrates to monitor stain removal. This study aimed to quantify the removal of stain from human enamel using a new device, quantitative light-induced fluorescence (QLF). The experimental design follows that of a product-testing model. A total of 11 previously extracted molar teeth were coated with transparent nail varnish, leaving an exposed window of enamel. The sound, exposed enamel was subject to a staining regime of human saliva, chlorhexidine and tea. Each of the eleven teeth was subjected to serial exposures of a positive control (Bocasan), a negative control (water) and a test product (Yotuel toothpaste). Following each two-minute exposure, QLF images of the teeth were taken (a total of 5 applications). Following completion of one test solution, the teeth were cleaned, re-stained and the procedure repeated with the next solution. QLF images were stored on a PC and analysed by a blinded single examiner. The ΔQ value at the 5% threshold was reported. ANOVA and paired t-tests were used to analyse the data. The study confirmed the ability of QLF to longitudinally quantify stain reduction from human enamel. The reliability of the technique in relation to positive and negative test controls was proven. The positive control had a significantly (α = 0.05) higher stain removal efficacy than water (p = 0.023) and Yotuel (p = 0.046). Yotuel was more effective than water (p = 0.023). The research community, the practicing clinician and the consumer all require sound product evaluation data. The use of human enamel specimens may offer more relevant clinical data. QLF has been designed as an in vivo device. Further development of the technique should permit in vivo clinical whitening trials.
Sarlo, Katherine; Kirchner, Donald B; Troyano, Esperanza; Smith, Larry A; Carr, Gregory J; Rodriguez, Carlos
2010-05-27
Microbial enzymes have been used in laundry detergent products for several decades. These enzymes have also long been known to have the potential to give rise to occupational type 1 allergic responses. A few cases of allergy among consumers using dusty enzyme detergents were reported in the early 1970s. Encapsulation of the enzymes, along with other formula changes, was undertaken to ensure that consumer exposure levels were sufficiently low that both the induction of IgE antibody (sensitization) and the elicitation of clinical symptoms would be highly improbable. Understanding consumer exposure to the enzymes used in laundry and cleaning products is a key step in the risk management process. Validation of the risk assessment conclusions and the risk management process only comes with practical experience and evidence from the marketplace. In the present work, clinical data from a range of sources collected over the past 40 years have been analysed. These include data from the peer-reviewed literature and enzyme-specific IgE antibody test results in detergent manufacturers' employees and from clinical study subjects. In total, enzyme-specific IgE antibody data were available on 15,765 individuals. There were 37 individuals with IgE antibody. The majority of these cases were from the 1970s, when 23 of 4687 subjects (0.49%) were IgE positive and 15 of the 23 were reported to have symptoms of allergy. The remaining 14 cases were identified post-1977, for a prevalence of 0.126% (14/11,078). No symptoms were reported and no relationship to exposure to laundry and cleaning products was found. There was a significant difference between the pre- and post-1977 cohorts in that the higher rates of sensitization with symptoms were associated with higher exposure to enzyme. The clinical testing revealed that the prevalence of enzyme-specific IgE in the population is very low (0.126% since 1977). This demonstrates that exposure to these strong respiratory allergens via use of laundry and cleaning products does not lead to the development of sensitization and disease. These data confirm that the risk to consumers has been properly assessed and managed and support the concept that thresholds of exposure exist for respiratory allergy. Expansion of enzyme use into new consumer product categories should follow completion of robust risk assessments in order to continue ensuring the safe use of enzymes among consumers.
Managing uncertainty about food risks - Consumer use of food labelling.
Tonkin, Emma; Coveney, John; Meyer, Samantha B; Wilson, Annabelle M; Webb, Trevor
2016-12-01
General consumer knowledge of and engagement with the production of food has declined resulting in increasing consumer uncertainty about, and sensitivity to, food risks. Emphasis is therefore placed on providing information for consumers to reduce information asymmetry regarding food risks, particularly through food labelling. This study examines the role of food labelling in influencing consumer perceptions of food risks. In-depth, 1-h interviews were conducted with 24 Australian consumers. Participants were recruited based on an a priori defined food safety risk scale, and to achieve a diversity of demographic characteristics. The methodological approach used, adaptive theory, was chosen to enable a constant interweaving of theoretical understandings and empirical data throughout the study. Participants discussed perceiving both traditional (food spoilage/microbial contamination) and modern (social issues, pesticide and 'chemical' contamination) risks as present in the food system. Food labelling was a symbol of the food system having managed traditional risks, and a tool for consumers to personally manage perceived modern risks. However, labelling also raised awareness of modern risks not previously considered. The consumer framing of risk presented demonstrates the need for more meaningful consumer engagement in policy decision making to ensure risk communication and management meet public expectations. This research innovatively identifies food labelling as both a symbol of, and a tool for, the management of perceived risks for consumers. Therefore it is imperative that food system actors ensure the authenticity and trustworthiness of all aspects of food labelling, not only those related to food safety. Copyright © 2016 Elsevier Ltd. All rights reserved.
Multiscale contact mechanics model for RF-MEMS switches with quantified uncertainties
NASA Astrophysics Data System (ADS)
Kim, Hojin; Huda Shaik, Nurul; Xu, Xin; Raman, Arvind; Strachan, Alejandro
2013-12-01
We introduce a multiscale model for contact mechanics between rough surfaces and apply it to characterize the force-displacement relationship for a metal-dielectric contact relevant for radio frequency micro-electromechanical system (MEMS) switches. We propose a mesoscale model to describe the history-dependent force-displacement relationships in terms of the surface roughness, the long-range attractive interaction between the two surfaces, and the repulsive interaction between contacting asperities (including elastic and plastic deformation). The inputs to this model are the experimentally determined surface topography and the Hamaker constant as well as the mechanical response of individual asperities obtained from density functional theory calculations and large-scale molecular dynamics simulations. The model captures non-trivial processes including the hysteresis during loading and unloading due to plastic deformation, yet it is computationally efficient enough to enable extensive uncertainty quantification and sensitivity analysis. We quantify how uncertainties and variability in the input parameters, both experimental and theoretical, affect the force-displacement curves during approach and retraction. In addition, a sensitivity analysis quantifies the relative importance of the various input quantities for the prediction of force-displacement during contact closing and opening. The resulting force-displacement curves with quantified uncertainties can be directly used in device-level simulations of micro-switches and enable the incorporation of atomic and mesoscale phenomena in predictive device-scale simulations.
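As a purely illustrative companion to the uncertainty-quantification step, the sketch below propagates sampled uncertainty in two inputs (the Hamaker constant and an effective asperity stiffness) through a toy attractive-plus-repulsive force-displacement law and reports the spread of the predicted pull-in force. The functional form, parameter values, and distributions are assumptions for demonstration, not the mesoscale model described in the abstract.

```python
import numpy as np

rng = np.random.default_rng(0)
n_samples = 5000

# Assumed input uncertainties (illustrative only)
hamaker = rng.normal(2.0e-19, 0.3e-19, n_samples)   # J
stiffness = rng.normal(50.0, 10.0, n_samples)       # N/m, effective asperity stiffness
radius = 1.0e-6                                      # m, nominal asperity radius
d0 = 0.4e-9                                          # m, cutoff separation

gaps = np.linspace(0.1e-9, 20e-9, 400)               # m, surface separation sweep

def force(gap, A, k):
    """Toy contact law: van der Waals attraction plus linear repulsion below d0."""
    attractive = -A * radius / (6.0 * (gap + d0) ** 2)
    repulsive = np.where(gap < d0, k * (d0 - gap), 0.0)
    return attractive + repulsive

# Propagate the sampled inputs through the model
curves = np.array([force(gaps, A, k) for A, k in zip(hamaker, stiffness)])
pull_in = curves.min(axis=1)   # most attractive (minimum) force per sample

print(f"pull-in force: mean = {pull_in.mean():.3e} N, std = {pull_in.std():.3e} N")
```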
Quantifying the water-energy nexus in Greece
NASA Astrophysics Data System (ADS)
Ziogou, Isidoros; Zachariadis, Theodoros
2017-11-01
In this paper we provide an assessment of the water-energy nexus for Greece. More specifically, the amount of freshwater consumed per unit of energy produced is determined: for both conventional (lignite, diesel and fuel oil-fired) and advanced (combined operation of gas turbine) thermal power plants in the electricity generation sector; for extraction and refining activities in the primary energy production sector; and for the production of biodiesel that is used as a blend in the ultimately delivered automotive diesel fuel. In addition, the amount of electricity consumed for the purposes of water supply and sewerage is presented. In view of the expected effects of climate change in the Mediterranean region, the results of this study highlight the need for authorities to prepare a national strategy that will ensure climate resilience in both energy and water sectors of the country.
Food safety and organic meats.
Van Loo, Ellen J; Alali, Walid; Ricke, Steven C
2012-01-01
The organic meat industry in the United States has grown substantially in the past decade in response to consumer demand for nonconventionally produced products. Consumers are often not aware that the United States Department of Agriculture (USDA) organic standards are based only on the methods used for production and processing of the product and not on the product's safety. Food safety hazards associated with organic meats remain unclear because of the limited research conducted to determine the safety of organic meat from farm-to-fork. The objective of this review is to provide an overview of the published results on the microbiological safety of organic meats. In addition, antimicrobial resistance of microbes in organic food animal production is addressed. Determining the food safety risks associated with organic meat production requires systematic longitudinal studies that quantify the risks of microbial and nonmicrobial hazards from farm-to-fork.
Using sensitivity analysis in model calibration efforts
Tiedeman, Claire; Hill, Mary C.
2003-01-01
In models of natural and engineered systems, sensitivity analysis can be used to assess relations among system state observations, model parameters, and model predictions. The model itself links these three entities, and model sensitivities can be used to quantify the links. Sensitivities are defined as the derivatives of simulated quantities (such as simulated equivalents of observations, or model predictions) with respect to model parameters. We present four measures calculated from model sensitivities that quantify the observation-parameter-prediction links and that are especially useful during the calibration and prediction phases of modeling. These four measures are composite scaled sensitivities (CSS), prediction scaled sensitivities (PSS), the value of improved information (VOII) statistic, and the observation prediction (OPR) statistic. These measures can be used to help guide initial calibration of models, collection of field data beneficial to model predictions, and recalibration of models updated with new field information. Once model sensitivities have been calculated, each of the four measures requires minimal computational effort. We apply the four measures to a three-layer MODFLOW-2000 (Harbaugh et al., 2000; Hill et al., 2000) model of the Death Valley regional ground-water flow system (DVRFS), located in southern Nevada and California. D’Agnese et al. (1997, 1999) developed and calibrated the model using nonlinear regression methods. Figure 1 shows some of the observations, parameters, and predictions for the DVRFS model. Observed quantities include hydraulic heads and spring flows. The 23 defined model parameters include hydraulic conductivities, vertical anisotropies, recharge rates, evapotranspiration rates, and pumpage. Predictions of interest for this regional-scale model are advective transport paths from potential contamination sites underlying the Nevada Test Site and Yucca Mountain.
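Sensitivity-based statistics such as the composite scaled sensitivity can be computed directly from the Jacobian of simulated equivalents with respect to parameters, following the usual definition css_j = sqrt((1/ND) * sum_i ((dy_i/db_j) * b_j * sqrt(w_i))^2). The sketch below uses a randomly generated Jacobian, placeholder parameter values, and unit weights; it is a generic illustration of the measure, not the MODFLOW-2000 workflow used for the DVRFS model.

```python
import numpy as np

def composite_scaled_sensitivity(jacobian, params, weights):
    """
    css_j = sqrt( (1/ND) * sum_i ( (dy_i/db_j) * b_j * sqrt(w_i) )^2 )
    jacobian : (ND, NP) derivatives of simulated equivalents w.r.t. parameters
    params   : (NP,) parameter values b_j
    weights  : (ND,) observation weights w_i
    """
    nd = jacobian.shape[0]
    scaled = jacobian * params[np.newaxis, :] * np.sqrt(weights)[:, np.newaxis]
    return np.sqrt((scaled ** 2).sum(axis=0) / nd)

# Placeholder example: 50 observations, 4 parameters
rng = np.random.default_rng(1)
J = rng.normal(size=(50, 4))                # stand-in Jacobian
b = np.array([1e-4, 10.0, 0.25, 3.0])       # illustrative parameter values
w = np.ones(50)                             # unit weights for simplicity

css = composite_scaled_sensitivity(J, b, w)
for name, value in zip(["K1", "K2", "RCH", "ANIV"], css):
    print(f"CSS({name}) = {value:.3g}")
```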
Brian K. Hand; Samuel A. Cushman; Erin L. Landguth; John Lucotch
2014-01-01
Quantifying the effects of landscape change on population connectivity is compounded by uncertainties about population size and distribution and a limited understanding of dispersal ability for most species. In addition, the effects of anthropogenic landscape change and sensitivity to regional climatic conditions interact to strongly affect habitat...
2011-09-30
man-made structures provide artificial reefs that promote habitats for fish and octopus in an otherwise barren seascape (Fig. 2). Furthermore...Triglidae) and large octopus (e.g. Octopus maorum), while others target small rays or sharks (Fig. 3). Furthermore, a significant finding of the results...is that the prey species commonly observed in the video data and representing the greatest biomass being consumed (octopus, gurnards), have not been
Dietary water affects human skin hydration and biomechanics.
Palma, Lídia; Marques, Liliana Tavares; Bujan, Julia; Rodrigues, Luís Monteiro
2015-01-01
It is generally assumed that dietary water might be beneficial for health, especially in dermatological (age-preventing) terms. The present study was designed to quantify the impact of dietary water on major indicators of skin physiology. A total of 49 healthy females (mean 24.5±4.3 years) were selected and characterized in terms of their daily dietary habits, with a particular focus on water consumption, by a Food Frequency Questionnaire. This allowed two groups to be set - Group 1 consuming less than 3,200 mL/day (n=38), and Group 2 consuming more than 3,200 mL/day (n=11). Approximately 2 L of water were added to the daily diet of Group 2 individuals for 1 month to quantify the impact of this surplus on their skin physiology. Measurements involving epidermal superficial and deep hydration, transepidermal water loss, and several biomechanical descriptors were taken at days 0 (T0), 15 (T1), and 30 (T2) in several anatomical sites (face, upper limb, and leg). This stress test (2 L/day for 30 days) significantly modified superficial and deep skin hydration, especially in Group 1. The same impact was registered for the most relevant biomechanical descriptors. Thus, in this study, it is clear that higher water inputs in the regular diet might positively impact normal skin physiology, in particular in individuals with lower daily water consumption.
High Compression Ratio Turbo Gasoline Engine Operation Using Alcohol Enhancement
DOE Office of Scientific and Technical Information (OSTI.GOV)
Heywood, John; Jo, Young Suk; Lewis, Raymond
The overall objective of this project was to quantify the potential for improving the performance and efficiency of gasoline engine technology by use of alcohols to suppress knock. Knock-free operation is obtained by direct injection of a second “anti-knock” fuel such as ethanol, which suppresses knock when, with gasoline fuel, knock would occur. Suppressing knock enables increased turbocharging, engine downsizing, and use of higher compression ratios throughout the engine’s operating map. This project combined engine testing and simulation to define knock onset conditions, with different mixtures of gasoline and alcohol, and with this information quantify the potential for improving the efficiency of turbocharged gasoline spark-ignition engines, and the on-vehicle fuel consumption reductions that could then be realized. The more focused objectives of this project were therefore to: Determine engine efficiency with aggressive turbocharging and downsizing and high compression ratio (up to a compression ratio of 13.5:1) over the engine’s operating range; Determine the knock limits of a turbocharged and downsized engine as a function of engine speed and load; Determine the amount of the knock-suppressing alcohol fuel consumed, through the use of various alcohol-gasoline and alcohol-water gasoline blends, for different driving cycles, relative to the gasoline consumed; Determine implications of using alcohol-boosted engines, with their higher efficiency operation, in both light-duty and medium-duty vehicle sectors.
A consumer-based approach to salt reduction: Case study with bread.
Antúnez, Lucía; Giménez, Ana; Ares, Gastón
2016-12-01
In recent years high sodium intake has raised growing concern worldwide. A widespread reduction of salt concentration in processed foods has been claimed as one of the most effective strategies to achieve a short-term impact on global health. However, one of the major challenges in reducing salt in food products is its potential negative impact on consumer perception. For this reason, gradual salt reduction has been recommended. In this context, the aim of the present work was to present a consumer-based approach to salt reduction, using bread as case study. Two consumer studies with a total of 303 consumers were carried out. In the first study, four sequential difference thresholds were determined through paired-comparison tests, starting at a salt concentration of 2%. In the second study, 99 consumers performed a two-bite evaluation of their sensory and hedonic perception of five bread samples: a control bread containing 2% salt and four samples with reduced salt content according to the difference thresholds determined in the first study. Survival analysis was used to determine average difference thresholds, which ranged from 9.4% to 14.3% of the salt concentration of the control bread. Results showed that salt concentration significantly influenced consumer overall liking of the bread samples. However, large heterogeneity was found in consumer hedonic reaction towards salt reduction: two groups of consumers with different preference and hedonic sensitivity to salt reduction were found. Results from the present work confirm that cumulative series of small salt reductions may be a feasible strategy for reducing the sodium content of bread without affecting consumer hedonic perception and stress the importance of considering consumer perception in the design of gradual salt reduction programmes. Copyright © 2016 Elsevier Ltd. All rights reserved.
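The gradual-reduction logic can be made concrete with a small calculation. The sketch below applies a fixed relative difference threshold to the 2% control concentration over four sequential steps; the 10% step size sits inside the 9.4-14.3% range reported above but is otherwise an arbitrary choice for illustration.

```python
# Cumulative effect of sequential sub-threshold salt reductions in bread
control_salt = 2.0          # % salt in the control formulation
threshold = 0.10            # assumed relative difference threshold (10%)
steps = 4

concentration = control_salt
print(f"step 0: {concentration:.3f}% salt")
for step in range(1, steps + 1):
    concentration *= (1.0 - threshold)
    total_reduction = 100 * (1 - concentration / control_salt)
    print(f"step {step}: {concentration:.3f}% salt "
          f"({total_reduction:.1f}% below control)")
```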
Development of method for quantifying essential tremor using a small optical device.
Chen, Kai-Hsiang; Lin, Po-Chieh; Chen, Yu-Jung; Yang, Bing-Shiang; Lin, Chin-Hsien
2016-06-15
Clinical assessment scales are the most common means used by physicians to assess tremor severity. Some scientific tools that may be able to replace these scales to objectively assess the severity, such as accelerometers, digital tablets, electromyography (EMG) measurement devices, and motion capture cameras, are currently available. However, most of the operational modes of these tools are relatively complex or are only able to capture part of the clinical information; furthermore, using these tools is sometimes time consuming. Currently, there is no tool available for automatically quantifying tremor severity in clinical environments. We aimed to develop a rapid, objective, and quantitative system for measuring the severity of finger tremor using a small portable optical device (Leap Motion). A single test took 15s to conduct, and three algorithms were proposed to quantify the severity of finger tremor. The system was tested with four patients diagnosed with essential tremor. The proposed algorithms were able to quantify different characteristics of tremor in clinical environments, and could be used as references for future clinical assessments. A portable, easy-to-use, small-sized, and noncontact device (Leap Motion) was used to clinically detect and record finger movement, and three algorithms were proposed to describe tremor amplitudes. Copyright © 2016 Elsevier B.V. All rights reserved.
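Although the three algorithms are not specified in the abstract, a common way to summarise a recorded fingertip trajectory is by its root-mean-square amplitude and dominant oscillation frequency. The sketch below does this for a synthetic 15 s position trace; the 100 Hz sampling rate, the 4-12 Hz band often associated with essential tremor, and the signal itself are assumptions for illustration, not the study's algorithms or Leap Motion output.

```python
import numpy as np

fs = 100.0                      # Hz, assumed sampling rate
t = np.arange(0, 15, 1 / fs)    # 15 s test, as in the study protocol

# Synthetic fingertip position: 6 Hz tremor plus slow drift and noise
rng = np.random.default_rng(42)
position = (2.0 * np.sin(2 * np.pi * 6.0 * t)      # mm, tremor component
            + 0.5 * t                               # mm, slow voluntary drift
            + 0.3 * rng.normal(size=t.size))        # mm, measurement noise

# Detrend, then summarise amplitude and dominant frequency
detrended = position - np.polyval(np.polyfit(t, position, 1), t)
rms_amplitude = np.sqrt(np.mean(detrended ** 2))

spectrum = np.abs(np.fft.rfft(detrended)) ** 2
freqs = np.fft.rfftfreq(detrended.size, d=1 / fs)
band = (freqs >= 4.0) & (freqs <= 12.0)             # typical tremor band
dominant_freq = freqs[band][np.argmax(spectrum[band])]

print(f"RMS tremor amplitude: {rms_amplitude:.2f} mm")
print(f"dominant frequency  : {dominant_freq:.2f} Hz")
```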
The influence of lifestyle on health behavior and preference for functional foods.
Szakály, Zoltán; Szente, Viktória; Kövér, György; Polereczki, Zsolt; Szigeti, Orsolya
2012-02-01
The main objective of this survey is to reveal the relationship between lifestyle, health behavior, and the consumption of functional foods on the basis of Grunert's food-related lifestyle model. In order to achieve this objective, a nationwide representative questionnaire-based survey was launched with 1000 participants in Hungary. The results indicate that the typical Hungarian consumer makes rational decisions, seeks bargains, and wants to know whether or not he or she gets good value for money. Further on, various lifestyle segments are defined by the authors: the rational, uninvolved, conservative, careless, and adventurous consumer segments. Among these, consumers with a rational approach provide the primary target group for the functional food market, where health consciousness and moderate price sensitivity can be observed together. Adventurous food consumers stand out because they search for novelty; this makes them an equally important target group. Conservative consumers form another relevant group, one characterized by positive health behavior. According to the findings of the research, there is a significant relationship between lifestyle, health behavior, and the preference for functional food products. Copyright © 2011 Elsevier Ltd. All rights reserved.
Novel handheld x-ray fluorescence spectrometer for routine testing for the presence of lead
NASA Astrophysics Data System (ADS)
Rensing, Noa M.; Tiernan, Timothy C.; Squillante, Michael R.
2011-06-01
RMD is developing a safe, inexpensive, and easy to operate lead detector for retailers and consumers that can reliably detect dangerous levels of lead in toys and other household products. Lead and its compounds have been rated as top chemicals that pose a great threat to human health. However, widespread testing for environmental lead is rarely undertaken until lead poisoning has already been diagnosed. The problem is not due to the accuracy or sensitivity of existing lead detection technology, but rather to the high expense, safety and licensing barriers of available test equipment. An inexpensive and easy to use lead detector would enable the identification of highly contaminated objects and areas and allow for timely and cost effective remediation. The military has similar needs for testing for lead and other heavy elements such as mercury, primarily in the decontamination of former military properties prior to their return to civilian use. RMD's research and development efforts are based on advanced solid-state detectors combined with recently patented lead detection techniques to develop a consumer-oriented lead detector that will be widely available and easy and inexpensive to use. These efforts will result in an instrument that offers: (1) high sensitivity, to identify objects containing dangerous amounts of lead, (2) low cost to encourage widespread testing by consumers and other end users and (3) convenient operation requiring no training or licensing. In contrast, current handheld x-ray fluorescence spectrometers either use a radioactive source requiring licensing and operating training, or use an electronic x-ray source that limits their sensitivity to surface lead.
Lieberman, Harris R; Kellogg, Mark D; Fulgoni, Victor L; Agarwal, Sanjiv
2017-03-01
It is difficult to determine if certain dietary supplements are safe for human consumption. Extracts of leaves of Ginkgo biloba trees are dietary supplements used for various purported therapeutic benefits. However, recent studies reported they increased risk of liver cancer in rodents. Therefore, this study assessed the association between ginkgo consumption and liver function using NHANES 2001-2012 data (N = 29,684). Since alcohol is known to adversely affect liver function, association of its consumption with liver function was also assessed. Alcohol and ginkgo extract intake of adult consumers and clinical markers of liver function (alkaline phosphatase, alanine aminotransferase, aspartate aminotransferase, gamma glutamyl transferase, lactate dehydrogenase, bilirubin) were examined. Moderate consumers of alcohol (0.80 ± 0.02 drinks/day) had higher levels of aspartate aminotransferase and gamma glutamyl transferase than non-consumers (P < 0.001). There was no difference (P > 0.01) in levels of markers of liver function in 616 ginkgo consumers (65.1 ± 4.4 mg/day intake) compared to non-consumers. While moderate alcohol consumption was associated with changes in markers of liver function, ginkgo intake as typically consumed by U.S. adults was not associated with these markers. Biomarkers measured by NHANES may be useful to examine potential adverse effects of dietary supplements for which insufficient human adverse event and toxicity data are available. Not applicable, as this is secondary analysis of publicly released observational data (NHANES 2001-2012). Published by Elsevier Inc.
Irz, Xavier; Leroy, Pascal; Réquillart, Vincent; Soler, Louis-Georges
2016-01-01
Convenience, taste, and prices are the main determinants of food choices. Complying with dietary recommendations therefore imposes a "taste cost" on consumers, potentially hindering adoption of those recommendations. The study presents and applies a new methodology, based on economic theory, to quantify this taste cost and assess the health and welfare effects of different dietary recommendations. Then, by comparison of those effects, we identify socially desirable recommendations that are most compatible with consumer preferences (i.e., that best balance health benefits against "taste cost") and should be prioritized for promotion. The methodology proceeds in three steps: first, an economic-behavioral model simulates how whole diets would change if consumers complied with dietary recommendations; second, an epidemiological model estimates the number of deaths avoided (DA) due to the dietary change; third, an efficiency analysis weighs the health benefits against the taste and policy costs of each recommendation. The empirical model is calibrated using French data. We find that recommendations to reduce consumption of red meat and soft-drinks, or raise consumption of milk products and fish/seafood impose relatively moderate taste costs. By comparison, recommendations related to F&V consumption and, to a lesser extent, butter/cream/cheese, snacks, and all meats impose larger taste costs on consumers. The F&V recommendation is the costliest for consumers to comply with, but it also reduces diet-related mortality the most, so that a large budget could be allocated to promoting F&V consumption while keeping this policy cost-beneficial. We conclude that promotion of most dietary recommendations improves social welfare. Our framework complements the programming models available in nutrition and public health: those models are best used to identify dietary targets, following which our framework identifies cost-beneficial ways of moving towards those targets.
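The third step of the methodology, the efficiency analysis, amounts to weighing monetised health benefits against taste and policy costs. A minimal sketch with entirely hypothetical numbers is shown below; the value per death avoided, the taste cost, and the policy budget are placeholders, not figures from the French calibration.

```python
# Hypothetical efficiency analysis for one dietary recommendation
deaths_avoided = 2000            # per year, from the epidemiological step
value_per_death_avoided = 3.0e6  # EUR, assumed monetary value
taste_cost = 1.5e9               # EUR/year, consumers' compensating variation
policy_cost = 0.5e9              # EUR/year, promotion budget

health_benefit = deaths_avoided * value_per_death_avoided
net_benefit = health_benefit - taste_cost - policy_cost

print(f"health benefit : {health_benefit / 1e9:.2f} bn EUR/year")
print(f"total cost     : {(taste_cost + policy_cost) / 1e9:.2f} bn EUR/year")
print(f"net benefit    : {net_benefit / 1e9:.2f} bn EUR/year "
      f"({'cost-beneficial' if net_benefit > 0 else 'not cost-beneficial'})")
```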
Defect Detection and Segmentation Framework for Remote Field Eddy Current Sensor Data.
Falque, Raphael; Vidal-Calleja, Teresa; Miro, Jaime Valls
2017-10-06
Remote-Field Eddy-Current (RFEC) technology is often used as a Non-Destructive Evaluation (NDE) method to prevent water pipe failures. By analyzing the RFEC data, it is possible to quantify the corrosion present in pipes. Quantifying the corrosion involves detecting defects and extracting their depth and shape. For large sections of pipelines, this can be extremely time-consuming if performed manually. Automated approaches are therefore well motivated. In this article, we propose an automated framework to locate and segment defects in individual pipe segments, starting from raw RFEC measurements taken over large pipelines. The framework relies on a novel feature to robustly detect these defects and a segmentation algorithm applied to the deconvolved RFEC signal. The framework is evaluated using both simulated and real datasets, demonstrating its ability to efficiently segment the shape of corrosion defects.
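Without access to the novel feature described in the paper, the general detect-then-segment idea can still be sketched on a one-dimensional signal: establish a robust baseline with the median and MAD, flag samples that exceed a threshold, and group contiguous flagged samples into defect segments. Everything below, including the synthetic signal and the 3.5-sigma threshold, is an assumption for illustration rather than the authors' framework.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 2000
signal = 0.2 * rng.normal(size=n)          # background variation
signal[400:430] += 2.5                     # synthetic defect 1
signal[1200:1215] += 1.8                   # synthetic defect 2

# Robust threshold from median and MAD
median = np.median(signal)
mad = np.median(np.abs(signal - median))
threshold = median + 3.5 * 1.4826 * mad    # ~3.5 sigma for Gaussian background
flagged = signal > threshold

# Group contiguous flagged samples into segments
padded = np.concatenate(([False], flagged, [False]))
changes = np.flatnonzero(np.diff(padded.astype(int)))
starts, ends = changes[0::2], changes[1::2]

min_length = 5                             # ignore isolated noisy samples
for s, e in zip(starts, ends):
    if e - s < min_length:
        continue
    print(f"defect from sample {s} to {e - 1}: peak amplitude {signal[s:e].max():.2f}")
```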
Variance-based interaction index measuring heteroscedasticity
NASA Astrophysics Data System (ADS)
Ito, Keiichi; Couckuyt, Ivo; Poles, Silvia; Dhaene, Tom
2016-06-01
This work is motivated by the need to deal with models with high-dimensional input spaces of real variables. One way to tackle high-dimensional problems is to identify interaction or non-interaction among input parameters. We propose a new variance-based sensitivity interaction index that can detect and quantify interactions among the input variables of mathematical functions and computer simulations. The computation is very similar to first-order sensitivity indices by Sobol'. The proposed interaction index can quantify the relative importance of input variables in interaction. Furthermore, detection of non-interaction for screening can be done with as few as 4n + 2 function evaluations, where n is the number of input variables. Using the interaction indices based on heteroscedasticity, the original function may be decomposed into a set of lower dimensional functions which may then be analyzed separately.
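For comparison with the proposed heteroscedasticity-based index, the sketch below estimates classical first-order Sobol' indices for the Ishigami test function using the standard pick-freeze (Saltelli) Monte Carlo estimator; it illustrates the variance-based framework the index builds on, not the new interaction index itself, and the sample size is arbitrary.

```python
import numpy as np

def ishigami(x, a=7.0, b=0.1):
    """Standard Ishigami test function on [-pi, pi]^3."""
    return (np.sin(x[:, 0])
            + a * np.sin(x[:, 1]) ** 2
            + b * x[:, 2] ** 4 * np.sin(x[:, 0]))

def first_order_sobol(f, dim, n=100_000, seed=0):
    rng = np.random.default_rng(seed)
    A = rng.uniform(-np.pi, np.pi, size=(n, dim))
    B = rng.uniform(-np.pi, np.pi, size=(n, dim))
    yA, yB = f(A), f(B)
    var_y = np.var(np.concatenate([yA, yB]))
    indices = np.empty(dim)
    for i in range(dim):
        ABi = A.copy()
        ABi[:, i] = B[:, i]          # replace only the i-th input
        # Saltelli (2010) estimator of Var(E[Y | X_i]) / Var(Y)
        indices[i] = np.mean(yB * (f(ABi) - yA)) / var_y
    return indices

S = first_order_sobol(ishigami, dim=3)
print("first-order Sobol' indices:", np.round(S, 3))  # analytic values ~0.31, 0.44, 0.00
```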
The stochastic dynamics of tethered microcantilevers in a viscous fluid
DOE Office of Scientific and Technical Information (OSTI.GOV)
Robbins, Brian A.; Paul, Mark R.; Radiom, Milad
2014-10-28
We explore and quantify the coupled dynamics of a pair of micron scale cantilevers immersed in a viscous fluid that are also directly tethered to one another at their tips by a spring force. The spring force, for example, could represent the molecular stiffness or elasticity of a biomolecule or material tethered between the cantilevers. We use deterministic numerical simulations with the fluctuation-dissipation theorem to compute the stochastic dynamics of the cantilever pair for the conditions of experiment when driven only by Brownian motion. We validate our approach by comparing directly with experimental measurements in the absence of the tether, which shows excellent agreement. Using numerical simulations, we quantify the correlated dynamics of the cantilever pair over a range of tether stiffness. Our results quantify the sensitivity of the auto- and cross-correlations of equilibrium fluctuations in cantilever displacement to the stiffness of the tether. We show that the tether affects the magnitude of the correlations which can be used in a measurement to probe the properties of an attached tethering substance. For the configurations of current interest using micron scale cantilevers in water, we show that the magnitude of the fluid coupling between the cantilevers is sufficiently small such that the influence of the tether can be significant. Our results show that the cross-correlation is more sensitive to tether stiffness than the auto-correlation indicating that a two-cantilever measurement has improved sensitivity when compared with a measurement using a single cantilever.
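The correlation measures discussed above are straightforward to compute from a pair of displacement records. The sketch below generates two noisy signals that share a weak common component, standing in for tether-mediated coupling, and evaluates their normalised auto- and cross-correlation functions; the signal model and coupling strength are assumptions, not the simulations described in the abstract.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 20_000
shared = rng.normal(size=n)                 # common (tether-coupled) fluctuation
x1 = rng.normal(size=n) + 0.3 * shared      # cantilever 1 displacement
x2 = rng.normal(size=n) + 0.3 * shared      # cantilever 2 displacement

def correlation(a, b, max_lag=50):
    """Biased, normalised correlation <a(t) b(t+lag)> for lags 0..max_lag."""
    n_pts = len(a)
    a = a - a.mean()
    b = b - b.mean()
    norm = np.sqrt(np.sum(a * a) * np.sum(b * b))
    return np.array([np.sum(a[: n_pts - lag] * b[lag:]) / norm
                     for lag in range(max_lag + 1)])

auto1 = correlation(x1, x1)
cross = correlation(x1, x2)
print(f"auto-correlation at zero lag : {auto1[0]:.3f}")   # 1 by construction
print(f"cross-correlation at zero lag: {cross[0]:.3f}")   # grows with coupling strength
```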
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wood, E.; Gonder, J.; Lopp, S.
It is widely understood that cold-temperature engine operation negatively impacts vehicle fuel use due to a combination of increased friction (high-viscosity engine oil) and temporary enrichment (accelerated catalyst heating). However, relatively little effort has been dedicated to thoroughly quantifying these impacts across a large number of driving cycles and ambient conditions. This work leverages high-quality dynamometer data collected at various ambient conditions to develop a modeling framework for quantifying engine cold-start fuel penalties over a wide array of real-world usage profiles. Additionally, mitigation strategies including energy retention and exhaust heat recovery are explored with benefits quantified for each approach.
Oldroyd, Rachel A; Morris, Michelle A; Birkin, Mark
2018-06-06
Traditional methods of monitoring foodborne illness are associated with problems of untimeliness and underreporting. In recent years, alternative data sources such as social media data have been used to monitor the incidence of disease in the population (infodemiology and infoveillance). These data sources prove timelier than traditional general practitioner data, they can help to fill the gaps in the reporting process, and they often include additional metadata that is useful for supplementary research. The aim of the study was to identify and formally analyze research papers using consumer-generated data, such as social media data or restaurant reviews, to quantify a disease or public health ailment. Studies of this nature are scarce within the food safety domain; therefore, identification and understanding of transferrable methods in other health-related fields are of particular interest. Structured scoping methods were used to identify and analyze primary research papers using consumer-generated data for disease or public health surveillance. The title, abstract, and keyword fields of 5 databases were searched using predetermined search terms. A total of 5239 papers matched the search criteria, of which 145 were taken to full-text review; 62 papers were deemed relevant and were subjected to data characterization and thematic analysis. The majority of studies (40/62, 65%) focused on the surveillance of influenza-like illness. Only 10 studies (16%) used consumer-generated data to monitor outbreaks of foodborne illness. Twitter data (58/62, 94%) and Yelp reviews (3/62, 5%) were the most commonly used data sources. Studies reporting high correlations against baseline statistics used advanced statistical and computational approaches to calculate the incidence of disease. These include classification and regression approaches, clustering approaches, and lexicon-based approaches. Although they are computationally intensive due to the requirement of training data, studies using classification approaches reported the best performance. By analyzing studies in digital epidemiology, computer science, and public health, this paper has identified and analyzed methods of disease monitoring that can be transferred to foodborne disease surveillance. These methods fall into 4 main categories: basic approach, classification and regression, clustering approaches, and lexicon-based approaches. Although studies using a basic approach to calculate disease incidence generally report good performance against baseline measures, they are sensitive to chatter generated by media reports. More computationally advanced approaches are required to filter spurious messages and protect predictive systems against false alarms. Research using consumer-generated data for monitoring influenza-like illness is expansive; however, research regarding the use of restaurant reviews and social media data in the context of food safety is limited. Considering the advantages reported in this review, methods using consumer-generated data for foodborne disease surveillance warrant further investment. ©Rachel A Oldroyd, Michelle A Morris, Mark Birkin. Originally published in JMIR Public Health and Surveillance (http://publichealth.jmir.org), 06.06.2018.
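Of the four method families identified, the lexicon-based approach is the simplest to illustrate: count posts that mention symptom keywords and use the daily count as an incidence proxy. The sketch below does this for a handful of made-up review snippets; the keyword list and texts are placeholders, not the lexicons or datasets used in the reviewed studies.

```python
from collections import Counter

SYMPTOM_LEXICON = {"vomit", "vomiting", "nausea", "diarrhea",
                   "food poisoning", "stomach ache", "sick"}

posts = [
    ("2018-05-01", "Great burgers, friendly staff."),
    ("2018-05-01", "Got terrible food poisoning after the shrimp."),
    ("2018-05-02", "Felt sick and had nausea all night after eating here."),
    ("2018-05-02", "Lovely decor, will come again."),
]

def mentions_symptom(text, lexicon=SYMPTOM_LEXICON):
    """Flag a post if any lexicon term appears in the lowercased text."""
    lowered = text.lower()
    return any(term in lowered for term in lexicon)

daily_counts = Counter(date for date, text in posts if mentions_symptom(text))
for date in sorted(daily_counts):
    print(f"{date}: {daily_counts[date]} symptom-related post(s)")
```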
NASA Astrophysics Data System (ADS)
Crossman, J.; Futter, M. N.; Whitehead, P. G.; Stainsby, E.; Baulch, H. M.; Jin, L.; Oni, S. K.; Wilby, R. L.; Dillon, P. J.
2014-07-01
Hydrological processes determine the transport of nutrients and passage of diffuse pollution. Consequently, catchments are likely to exhibit individual hydrochemical responses (sensitivities) to climate change, which is expected to alter the timing and amount of runoff, and to impact in-stream water quality. In developing robust catchment management strategies and quantifying plausible future hydrochemical conditions it is therefore equally important to consider the potential for spatial variability in, and causal factors of, catchment sensitivity, as it is to explore future changes in climatic pressures. This study seeks to identify those factors which influence hydrochemical sensitivity to climate change. A perturbed physics ensemble (PPE), derived from a series of Global Climate Model (GCM) variants with specific climate sensitivities was used to project future climate change and uncertainty. Using the Integrated Catchment Model of Phosphorus Dynamics (INCA-P), we quantified potential hydrochemical responses in four neighbouring catchments (with similar land use but varying topographic and geological characteristics) in southern Ontario, Canada. Responses were assessed by comparing a 30 year baseline (1968-1997) to two future periods: 2020-2049 and 2060-2089. Although projected climate change and uncertainties were similar across these catchments, hydrochemical responses (sensitivity) were highly varied. Sensitivity was governed by soil type (influencing flow pathways) and nutrient transport mechanisms. Clay-rich catchments were most sensitive, with total phosphorus (TP) being rapidly transported to rivers via overland flow. In these catchments large annual reductions in TP loads were projected. Sensitivity in the other two catchments, dominated by sandy-loams, was lower due to a larger proportion of soil matrix flow, longer soil water residence times and seasonal variability in soil-P saturation. Here smaller changes in TP loads, predominantly increases, were projected. These results suggest that the clay content of soils could be a good indicator of the sensitivity of catchments to climatic input, and reinforces calls for catchment-specific management plans.
NASA Astrophysics Data System (ADS)
Crossman, J.; Futter, M. N.; Whitehead, P. G.; Stainsby, E.; Baulch, H. M.; Jin, L.; Oni, S. K.; Wilby, R. L.; Dillon, P. J.
2014-12-01
Hydrological processes determine the transport of nutrients and passage of diffuse pollution. Consequently, catchments are likely to exhibit individual hydrochemical responses (sensitivities) to climate change, which are expected to alter the timing and amount of runoff, and to impact in-stream water quality. In developing robust catchment management strategies and quantifying plausible future hydrochemical conditions it is therefore equally important to consider the potential for spatial variability in, and causal factors of, catchment sensitivity, as it is to explore future changes in climatic pressures. This study seeks to identify those factors which influence hydrochemical sensitivity to climate change. A perturbed physics ensemble (PPE), derived from a series of global climate model (GCM) variants with specific climate sensitivities was used to project future climate change and uncertainty. Using the INtegrated CAtchment model of Phosphorus dynamics (INCA-P), we quantified potential hydrochemical responses in four neighbouring catchments (with similar land use but varying topographic and geological characteristics) in southern Ontario, Canada. Responses were assessed by comparing a 30 year baseline (1968-1997) to two future periods: 2020-2049 and 2060-2089. Although projected climate change and uncertainties were similar across these catchments, hydrochemical responses (sensitivities) were highly varied. Sensitivity was governed by quaternary geology (influencing flow pathways) and nutrient transport mechanisms. Clay-rich catchments were most sensitive, with total phosphorus (TP) being rapidly transported to rivers via overland flow. In these catchments large annual reductions in TP loads were projected. Sensitivity in the other two catchments, dominated by sandy loams, was lower due to a larger proportion of soil matrix flow, longer soil water residence times and seasonal variability in soil-P saturation. Here smaller changes in TP loads, predominantly increases, were projected. These results suggest that the clay content of soils could be a good indicator of the sensitivity of catchments to climatic input, and reinforces calls for catchment-specific management plans.
Rablen, Paul R; McLarney, Brett D; Karlow, Brandon J; Schneider, Jean E
2014-02-07
High-level electronic structure calculations, including a continuum treatment of solvent, are employed to elucidate and quantify the effects of alkyl halide structure on the barriers of SN2 and E2 reactions. In cases where such comparisons are available, the results of these calculations show close agreement with solution experimental data. Structural factors investigated include α- and β-methylation, adjacency to unsaturated functionality (allyl, benzyl, propargyl, α to carbonyl), ring size, and α-halogenation and cyanation. While the influence of these factors on SN2 reactivity is mostly well-known, the present study attempts to provide a broad comparison of both SN2 and E2 reactivity across many cases using a single methodology, so as to quantify relative reactivity trends. Despite the fact that most organic chemistry textbooks say far more about how structure affects SN2 reactions than about how it affects E2 reactions, the latter are just as sensitive to structural variation as are the former. This sensitivity of E2 reactions to structure is often underappreciated.
Saito, L.; Redd, C.; Chandra, S.; Atwell, L.; Fritsen, C.H.; Rosen, Michael R.
2007-01-01
Aquatic foodweb models for 2 seasons (relatively high- [March] and low-flow [August] conditions) were constructed for 4 reaches on the Truckee River using δ13C and δ15N data from periphyton, macroinvertebrate, and fish samples collected in 2003 and 2004. The models were constructed with isotope values that included measured periphyton signatures and calculated mean isotope values for detritus and seston as basal food sources of each food web. The pseudo-optimization function in Excel's Solver module was used to minimize the sum of squared error between predicted and observed stable-isotope values while simultaneously solving for diet proportions for all foodweb consumers and estimating δ13C and δ15N trophic enrichment factors. This approach used an underdetermined set of simultaneous linear equations and was tested by running the pseudo-optimization procedure for 500 randomly selected sets of initial conditions. Estimated diet proportions had average standard deviations (SDs) of 0.03 to 0.04‰ and SDs of trophic enrichment factors ranged from <0.005 to 0.05‰ based on the results of the 500 runs, indicating that the modeling approach was very robust. However, sensitivity analysis of calculated detritus and seston δ13C and δ15N values indicated that the robustness of the approach is dependent on having accurate measures of all observed foodweb-component δ13C and δ15N values. Model results from the 500 runs using the mean isotope values for detritus and seston indicated that upstream food webs were the simplest, with fewer feeding groups and trophic interactions (e.g., 21 interactions for 10 feeding groups), whereas food webs for the reach downstream of the Reno-Sparks metropolitan area were the most complex (e.g., 58 interactions for 16 feeding groups). Nonnative crayfish were important omnivores in each reach and drew energy from multiple sources, but appeared to be energetic dead ends because they generally were not consumed. Predatory macroinvertebrate diets varied along the river and affected estimated trophic positions of fish that consumed them. Differences in complexity and composition of the food webs appeared to be related to season, but could also have been caused by interactions with nonnative species, especially invasive crayfish. © 2007 by The North American Benthological Society.
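A much-reduced version of the mixing problem can be written as a small linear system: consumer isotope signatures equal a proportion-weighted average of (source signature + trophic enrichment), with the proportions summing to one. The sketch below solves a two-isotope, three-source case with bounded least squares; all signatures and enrichment factors are invented for illustration, and the simultaneous pseudo-optimisation over many consumers used in the study is not reproduced.

```python
import numpy as np
from scipy.optimize import lsq_linear

# Invented d13C and d15N signatures (per mil) for three basal sources
sources = np.array([
    [-28.0, 2.0],    # detritus
    [-20.0, 5.0],    # periphyton
    [-24.0, 8.0],    # seston
])
enrichment = np.array([0.4, 3.4])        # assumed trophic enrichment (d13C, d15N)
consumer = np.array([-23.5, 8.2])        # measured consumer signature (invented)

# consumer = sum_k p_k * (source_k + enrichment), with sum_k p_k = 1, 0 <= p_k <= 1
A = np.vstack([(sources + enrichment).T,         # two isotope balance equations
               np.ones((1, sources.shape[0]))])  # sum-to-one constraint
b = np.concatenate([consumer, [1.0]])
weights = np.array([1.0, 1.0, 100.0])            # weight the constraint row heavily

result = lsq_linear(A * weights[:, None], b * weights, bounds=(0.0, 1.0))
proportions = result.x / result.x.sum()          # renormalise any tiny residual
for name, p in zip(["detritus", "periphyton", "seston"], proportions):
    print(f"{name:11s}: {p:.2f}")
```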
Calorie-induced ER stress suppresses uroguanylin satiety signaling in diet-induced obesity.
Kim, G W; Lin, J E; Snook, A E; Aing, A S; Merlino, D J; Li, P; Waldman, S A
2016-05-23
The uroguanylin-GUCY2C gut-brain axis has emerged as one component regulating feeding, energy homeostasis, body mass and metabolism. Here, we explore a role for this axis in mechanisms underlying diet-induced obesity (DIO). Intestinal uroguanylin expression and secretion, and hypothalamic GUCY2C expression and anorexigenic signaling, were quantified in mice on high-calorie diets for 14 weeks. The role of endoplasmic reticulum (ER) stress in suppressing uroguanylin in DIO was explored using tunicamycin, an inducer of ER stress, and tauroursodeoxycholic acid (TUDCA), a chemical chaperone that inhibits ER stress. The impact of consumed calories on uroguanylin expression was explored by dietary manipulation. The role of uroguanylin in mechanisms underlying obesity was examined using Camk2a-Cre-ER(T2)-Rosa-STOP(loxP/loxP)-Guca2b mice in which tamoxifen induces transgenic hormone expression in brain. DIO suppressed intestinal uroguanylin expression and eliminated its postprandial secretion into the circulation. DIO suppressed uroguanylin through ER stress, an effect mimicked by tunicamycin and blocked by TUDCA. Hormone suppression by DIO reflected consumed calories, rather than the pathophysiological milieu of obesity, as a diet high in calories from carbohydrates suppressed uroguanylin in lean mice, whereas calorie restriction restored uroguanylin in obese mice. However, hypothalamic GUCY2C, enriched in the arcuate nucleus, produced anorexigenic signals mediating satiety upon exogenous agonist administration, and DIO did not impair these responses. Uroguanylin replacement by transgenic expression in brain repaired the hormone insufficiency and reconstituted satiety responses opposing DIO and its associated comorbidities, including visceral adiposity, glucose intolerance and hepatic steatosis. These studies reveal a novel pathophysiological mechanism contributing to obesity in which calorie-induced suppression of intestinal uroguanylin impairs hypothalamic mechanisms regulating food consumption through loss of anorexigenic endocrine signaling. The correlative therapeutic paradigm suggests that, in the context of hormone insufficiency with preservation of receptor sensitivity, obesity may be prevented or treated by GUCY2C hormone replacement.
Comparative analysis of EPA and DHA in fish oil nutritional capsules by GC-MS.
Yi, Tao; Li, Shuk-Man; Fan, Jia-Yi; Fan, Lan-Lan; Zhang, Zhi-Feng; Luo, Pei; Zhang, Xiao-Jun; Wang, Jian-Gang; Zhu, Lin; Zhao, Zhong-Zhen; Chen, Hu-Biao
2014-12-13
Fish oil is a popular nutritional product consumed in Hong Kong. Eicosapentaenoic acid (EPA) and docosahexaenoic acid (DHA) are the two main bioactive components responsible for the health benefits of fish oil. A market survey in Hong Kong demonstrated that various fish oil capsules with different origins and prices are sold simultaneously. However, these capsules are labelled with the same ingredient levels, namely EPA 180 mg/g and DHA 120 mg/g, which is confusing for consumers. To evaluate the quality of various fish oil capsules, a comparative analysis of the contents of EPA and DHA in fish oil is crucial. A gas chromatography-mass spectrometry (GC-MS) method was developed for identification and determination of EPA and DHA in fish oil capsules. A comprehensive validation of the developed method was conducted. Ten batches of fish oil capsule samples purchased from drugstores in Hong Kong were analyzed using the developed method. The method showed good sensitivity, precision and accuracy. The limits of detection (LOD) for EPA and DHA were 0.08 ng and 0.21 ng, respectively. The relative standard deviation (RSD) values of EPA and DHA in repeatability tests were both less than 1.05%, and the recoveries in the accuracy test for EPA and DHA were 100.50% and 103.83%, respectively. In the ten fish oil samples, the contents of EPA ranged from 39.52 mg/g to 509.16 mg/g, and the contents of DHA ranged from 35.14 mg/g to 645.70 mg/g. The present method is suitable for the quantitative analysis of EPA and DHA in fish oil capsules. There is substantial variation in the contents of the quantified components across fish oil samples, and there is no linear relationship between price and the contents of EPA and DHA. Stricter supervision of the labelling of fish oil capsules is urgently needed.
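The validation metrics quoted above (RSD for repeatability, recovery for accuracy) are simple to compute from replicate injections and spiked samples. The sketch below works through both for made-up replicate EPA measurements and a hypothetical spiked-recovery experiment, then compares the measured content with the 180 mg/g label claim; all numbers are illustrative, not data from the study.

```python
import numpy as np

# Repeatability: replicate EPA determinations of one capsule batch (mg/g)
replicates = np.array([182.1, 180.4, 183.0, 181.2, 179.8, 182.5])
rsd = 100 * replicates.std(ddof=1) / replicates.mean()

# Accuracy: spiked-sample recovery
baseline = 180.9          # mg/g measured before spiking
spike_added = 50.0        # mg/g EPA standard added
spiked_result = 231.3     # mg/g measured after spiking
recovery = 100 * (spiked_result - baseline) / spike_added

label_claim = 180.0       # mg/g EPA stated on the capsule label
deviation = 100 * (replicates.mean() - label_claim) / label_claim

print(f"mean EPA content : {replicates.mean():.1f} mg/g")
print(f"RSD              : {rsd:.2f}%")
print(f"spike recovery   : {recovery:.1f}%")
print(f"vs label claim   : {deviation:+.1f}%")
```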
Dziorny, Adam C; Orlando, Mark S; Strain, J J; Davidson, Philip W; Myers, Gary J
2013-09-01
Determining if associations exist between child neurodevelopment and environmental exposures, especially low level or background ones, is challenging and dependent upon being able to measure specific and sensitive endpoints. Psychometric or behavioral measures of CNS function have traditionally been used in such studies, but do have some limitations. Auditory neurophysiologic measures examine different nervous system structures and mechanisms, have fewer limitations, can more easily be quantified, and might be helpful additions to testing. To date, their use in human epidemiological studies has been limited. We reviewed the use of auditory brainstem responses (ABR) and otoacoustic emissions (OAE) in studies designed to determine the relationship of exposures to methyl mercury (MeHg) and nutrients from fish consumption with neurological development. We included studies of experimental animals and humans in an effort to better understand the possible benefits and risks of fish consumption. We reviewed the literature on the use of ABR and OAE to measure associations with environmental exposures that result from consuming a diet high in fish. We focused specifically on long chain polyunsaturated fatty acids (LCPUFA) and MeHg. We performed a comprehensive review of relevant studies using web-based search tools and appropriate search terms. Gestational exposure to both LCPUFA and MeHg has been reported to influence the developing auditory system. In experimental studies supplemental LCPUFA is reported to prolong ABR latencies and human studies also suggest an association. Experimental studies of acute and gestational MeHg exposure are reported to prolong ABR latencies and impair hair cell function. In humans, MeHg exposure is reported to prolong ABR latencies, but the impact on hair cell function is unknown. The auditory system can provide objective measures and may be useful in studying exposures to nutrients and toxicants and whether they are associated with children's neurodevelopment. Copyright © 2012 Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Provenzano, G.; Vardy, M. E.; Henstock, T.; Zervos, A.
2017-12-01
A quantitative high-resolution physical model of the top 100 meters of the sub-seabed is of key importance for a wide range of shallow geohazard scenarios: identification of potential shallow landsliding, monitoring of gas storage sites, and assessment of offshore structure stability. Currently, engineering-scale sediment characterisation relies heavily on direct sampling of the seabed and in-situ measurements. Such an approach is expensive and time-consuming, as well as liable to alter the sediment properties during the coring process. As opposed to reservoir-scale seismic exploration, ultra-high-frequency (UHF, 0.2-4.0 kHz) multi-channel marine reflection seismic data are most often limited to a semi-quantitative interpretation of the reflection amplitudes and facies geometries, leaving largely unexploited their intrinsic value as a remote characterisation tool. In this work, we develop a seismic inversion methodology to obtain a robust sub-metric resolution elastic model from limited-offset, limited-bandwidth UHF seismic reflection data, with minimal pre-processing and limited a priori information. The Full Waveform Inversion is implemented as a stochastic optimiser based upon a Genetic Algorithm, modified in order to improve the robustness against inaccurate starting model populations. Multiple independent runs are used to create a robust posterior model distribution and quantify the uncertainties on the solution. The methodology has been applied to complex synthetic examples and to real datasets acquired in areas prone to shallow landsliding. The inverted elastic models show a satisfactory match with the ground truths and a good sensitivity to relevant variations in the sediment texture and saturation state. We apply the methodology to a range of synthetic consolidating slopes under different loading conditions and sediment property distributions. Our work demonstrates that the seismic inversion of UHF data has the potential to become an important practical tool for marine ground model building in spatially heterogeneous areas, reducing the reliance on expensive and time-consuming coring campaigns.
Escherichia coli biosensors for environmental, food industry and biological warfare agent detection
NASA Astrophysics Data System (ADS)
Allil, R. C. S. B.; Werneck, M. M.; da Silva-Neto, J. L.; Miguel, M. A. L.; Rodrigues, D. M. C.; Wandermur, G. L.; Rambauske, D. C.
2013-06-01
This work has the objective of researching and developing a plastic optical fiber biosensor, based on taper and mPOF LPG techniques, to detect Escherichia coli through measurements of the refractive index. Generally, cell detection is crucial in the microbiological analysis of clinical, food, water or environmental samples. However, the methods currently employed are time consuming, taking at least 72 hours to produce reliable responses as they depend on sample collection and cell culture under controlled conditions. The delay in obtaining the results of the analysis can result in contamination of a great number of consumers. Plastic optical fiber (POF) biosensors are a viable alternative offering a rapid and inexpensive scheme for cell detection. To study the sensitivity of these sensors for microbiological detection, fiber tapers and long-period gratings (LPG), both in poly-methyl-methacrylate (PMMA), were evaluated as possible candidates to take part in a biosensor system to detect Escherichia coli in water samples. In this work we adopted the immunocapture technique, which consists of quantifying bacteria in a liquid sample by attracting and fixing the bacteria on the surface of the polymer optical fiber through the antigen-antibody reaction. The measurements were obtained with an optical setup in which, on one side of the fiber, an LED is coupled through a POF with the taper in the middle of it; on the other side of the POF a photodetector receives this light, producing a photocurrent. The output voltage is fed into the microcontroller A/D input port and its output data is sent via USB to LabView software running on a microcomputer. The results showed the potential of the POF biosensor to detect E. coli for environmental and food industry applications, and for detecting and identifying biological-warfare agents using a very rapid response sensor, applicable to field detection prototypes.
Smith, Robert A; Gottlieb, Geoffrey S; Anderson, Donovan J; Pyrak, Crystal L; Preston, Bradley D
2008-01-01
Using an indicator cell assay that directly quantifies viral replication, we show that human immunodeficiency virus types 1 and 2 (HIV-1 and HIV-2, respectively) exhibit similar sensitivities to 3'-azido-3'-deoxythymidine (zidovudine) as well as other nucleoside analog inhibitors of reverse transcriptase. These data support the use of nucleoside analogs for antiviral therapy of HIV-2 infection.
X-ray Polarimetry with a Micro-Pattern Gas Detector
NASA Technical Reports Server (NTRS)
Hill, Joe
2005-01-01
Topics covered include: Science drivers for X-ray polarimetry; Previous X-ray polarimetry designs; The photoelectric effect and imaging tracks; Micro-pattern gas polarimeter design concept. Further work includes: Verify results against simulator; Optimize pressure and characterize different gases for a given energy band; Optimize voltages for resolution and sensitivity; Test meshes with 80 micron pitch; Characterize ASIC operation; and Quantify quantum efficiency for optimum polarization sensitivity.
Consumer reporting of adverse events following immunization.
Clothier, Hazel J; Selvaraj, Gowri; Easton, Mee Lee; Lewis, Georgina; Crawford, Nigel W; Buttery, Jim P
2014-01-01
Surveillance of adverse events following immunisation (AEFI) is an essential component of vaccine safety monitoring. The most commonly utilized passive surveillance systems rely predominantly on reporting by health care providers (HCP). We reviewed adverse event reports received in Victoria, Australia, from surveillance commencement in July 2007 to June 2013 (6 years) to ascertain the contribution of consumer (vaccinee or their parent/guardian) reporting to vaccine safety monitoring and to inform future surveillance system development directions. Categorical data included were: reporter type; serious and non-serious AEFI category; and vaccinee age group. Chi-square test and 2-sample test of proportions were used to compare categories; trend changes were assessed using linear regression. Consumer reporting increased over the 6 years, reaching 21% of reports received in 2013 (P<0.001), most commonly for children aged less than 7 years. Consumer reports were 5% more likely to describe serious AEFI than HCP (P=0.018) and 10% more likely to result in specialist clinic attendance (P<0.001). Although online reporting increased to 32% of all reports since its introduction in 2010, 85% of consumers continued to report by phone. Consumer reporting of AEFI is a valuable component of vaccine safety surveillance in addition to HCP reporting. Changes are required to AEFI reporting systems to implement efficient consumer AEFI reporting, but may be justified for their potential impact on signal detection sensitivity.
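The "5% more likely to describe serious AEFI" comparison is a two-sample test of proportions. The sketch below implements the standard pooled z-test by hand for hypothetical report counts (the actual Victorian counts are not given in the abstract), so the numbers are placeholders chosen only to mirror a roughly 5 percentage-point difference.

```python
from math import sqrt
from scipy.stats import norm

# Hypothetical counts: serious AEFI among consumer vs HCP reports
serious = [60, 400]      # consumer, HCP
total = [300, 2700]      # consumer, HCP

p1, p2 = serious[0] / total[0], serious[1] / total[1]
pooled = sum(serious) / sum(total)
se = sqrt(pooled * (1 - pooled) * (1 / total[0] + 1 / total[1]))
z = (p1 - p2) / se
p_value = 2 * norm.sf(abs(z))          # two-sided p-value

print(f"consumer serious proportion: {p1:.3f}")
print(f"HCP serious proportion     : {p2:.3f}")
print(f"z = {z:.2f}, two-sided p = {p_value:.3f}")
```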
Between- and within-lake responses of macrophyte richness metrics to shoreline development
Beck, Marcus W.; Vondracek, Bruce C.; Hatch, Lorin K.
2013-01-01
Aquatic habitat in littoral environments can be affected by residential development of shoreline areas. We evaluated the relationship between macrophyte richness metrics and shoreline development to quantify indicator response at 2 spatial scales for Minnesota lakes. First, the response of total, submersed, and sensitive species to shoreline development was evaluated within lakes to quantify macrophyte response as a function of distance to the nearest dock. Within-lake analyses using generalized linear mixed models focused on 3 lakes of comparable size with a minimal influence of watershed land use. Survey points farther from docks had higher total species richness and presence of species sensitive to disturbance. Second, between-lake effects of shoreline development on total, submersed, emergent-floating, and sensitive species were evaluated for 1444 lakes. Generalized linear models were developed for all lakes and stratified subsets to control for lake depth and watershed land use. Between-lake analyses indicated a clear response of macrophyte richness metrics to increasing shoreline development, such that fewer emergent-floating and sensitive species were correlated with increasing density of docks. These trends were particularly evident for deeper lakes with lower watershed development. Our results provide further evidence that shoreline development is associated with degraded aquatic habitat, particularly by illustrating the response of macrophyte richness metrics across multiple lake types and different spatial scales.
Li, Xin; Kaattari, Stephen L.; Vogelbein, Mary A.; Vadas, George G.; Unger, Michael A.
2016-01-01
Immunoassays based on monoclonal antibodies (mAbs) are highly sensitive for the detection of polycyclic aromatic hydrocarbons (PAHs) and can be employed to determine concentrations in near real-time. A sensitive generic mAb against PAHs, named as 2G8, was developed by a three-step screening procedure. It exhibited nearly uniformly high sensitivity against 3-ring to 5-ring unsubstituted PAHs and their common environmental methylated PAHs, with IC50 values between 1.68–31 μg/L (ppb). 2G8 has been successfully applied on the KinExA Inline Biosensor system for quantifying 3-5 ring PAHs in aqueous environmental samples. PAHs were detected at a concentration as low as 0.2 μg/L. Furthermore, the analyses only required 10 min for each sample. To evaluate the accuracy of the 2G8-based biosensor, the total PAH concentrations in a series of environmental samples analyzed by biosensor and GC-MS were compared. In most cases, the results yielded a good correlation between methods. This indicates that generic antibody 2G8 based biosensor possesses significant promise for a low cost, rapid method for PAH determination in aqueous samples. PMID:26925369
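The IC50 values reported for such competitive immunoassays are typically obtained by fitting a dose-response curve. The sketch below fits a four-parameter logistic model with scipy; the concentrations and responses are made up for illustration and are not the 2G8 data.

```python
# Four-parameter logistic (4PL) fit to estimate an IC50 from synthetic data.
import numpy as np
from scipy.optimize import curve_fit

def four_pl(conc, top, bottom, ic50, hill):
    """Response as a function of analyte concentration (decreasing curve)."""
    return bottom + (top - bottom) / (1.0 + (conc / ic50) ** hill)

conc = np.array([0.1, 0.3, 1, 3, 10, 30, 100])   # ug/L, hypothetical standards
resp = np.array([98, 95, 85, 62, 35, 15, 6])     # % of zero-analyte signal, hypothetical

p0 = [100, 0, 5, 1]                              # initial guesses: top, bottom, IC50, Hill slope
params, _ = curve_fit(four_pl, conc, resp, p0=p0)
print(f"Estimated IC50 ≈ {params[2]:.2f} ug/L")
```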
Effectively managing consumer fuel price driven transit demand.
DOT National Transportation Integrated Search
2013-05-01
This study presents a literature review of transit demand elasticities with respect to gas prices, describes features of a transit service area population that may be more sensitive to fuel prices, identifies where stress points in the family of tran...
Gidlöf, Kerstin; Anikin, Andrey; Lingonblad, Martin; Wallin, Annika
2017-09-01
There is a battle in the supermarket aisle, a battle between what the consumer wants and what the retailer and others want her to see, and subsequently to buy. Product packages and displays contain a number of features and attributes tailored to catch consumers' attention. These are what we call external factors, comprising the visual saliency, the number of facings, and the placement of each product. But a consumer also brings with her a number of goals and interests related to the products and their attributes. These are important internal factors, including brand preferences, price sensitivity, and dietary inclinations. We fit mobile eye trackers to consumers visiting real-life supermarkets in order to investigate to what extent external and internal factors affect consumers' visual attention and purchases. Both external and internal factors influenced what products consumers looked at, with a strong positive interaction between visual saliency and consumer preferences. Consumers appear to take advantage of visual saliency in their decision making, using their knowledge about products' appearance to guide their visual attention towards those that fit their preferences. When it comes to actual purchases, however, visual attention was by far the most important predictor, even after controlling for all other internal and external factors. In other words, the very act of looking longer or repeatedly at a package, for any reason, makes it more likely that this product will be bought. Visual attention is thus crucial for understanding consumer behaviour, even in the cluttered supermarket environment, but it cannot be captured by measurements of visual saliency alone. Copyright © 2017 Elsevier Ltd. All rights reserved.
Acquiring Research-grade ALSM Data in the Commercial Marketplace
NASA Astrophysics Data System (ADS)
Haugerud, R. A.; Harding, D. J.; Latypov, D.; Martinez, D.; Routh, S.; Ziegler, J.
2003-12-01
The Puget Sound Lidar Consortium, working with TerraPoint, LLC, has procured a large volume of ALSM (topographic lidar) data for scientific research. Research-grade ALSM data can be characterized by their completeness, density, and accuracy. Complete data include, at a minimum, X, Y, Z, time, and classification (ground, vegetation, structure, blunder) for each laser reflection. Off-nadir angle and return number for multiple returns are also useful. We began with a pulse density of 1/sq m, and after limited experiments still find this density satisfactory in the dense second-growth forests of western Washington. Lower pulse densities would have produced unacceptably limited sampling in forested areas and aliased some topographic features. Higher pulse densities do not produce markedly better topographic models, in part because of limitations of reproducibility between the overlapping survey swaths used to achieve higher density. Our experience in a variety of forest types demonstrates that the fraction of pulses that produce ground returns varies with vegetation cover, laser beam divergence, laser power, and detector sensitivity, but we have not quantified this relationship. The most significant operational limits on vertical accuracy of ALSM appear to be instrument calibration and the accuracy with which returns are classified as ground or vegetation. TerraPoint has recently implemented in-situ calibration using overlapping swaths (Latypov and Zosse, 2002, see http://www.terrapoint.com/News_damirACSM_ASPRS2002.html). On the consumer side, we routinely perform a similar overlap analysis to produce maps of relative Z error between swaths; we find that in bare, low-slope regions the in-situ calibration has reduced this internal Z error to 6-10 cm RMSE. Comparison with independent ground control points commonly illuminates inconsistencies in how GPS heights have been reduced to orthometric heights. Once these inconsistencies are resolved, it appears that the internal errors are the bulk of the error of the survey. The error maps suggest that with in-situ calibration, minor time-varying errors with a period of circa 1 sec are the largest remaining source of survey error. For forested terrain, limited ground penetration and errors in return classification can severely limit the accuracy of resulting topographic models. Initial work by Haugerud and Harding demonstrated the feasibility of fully-automatic return classification; however, TerraPoint has found that better results can be obtained more effectively with 3rd-party classification software that allows a mix of automated routines and human intervention. Our relationship has been evolving since early 2000. Important aspects of this relationship include close communication between data producer and consumer, a willingness to learn from each other, significant technical expertise and resources on the consumer side, and continued refinement of achievable, quantitative performance and accuracy specifications. Most recently we have instituted a slope-dependent Z accuracy specification that TerraPoint first developed as a heuristic for surveying mountainous terrain in Switzerland. We are now working on quantifying the internal consistency of topographic models in forested areas, using a variant of overlap analysis, and standards for the spatial distribution of internal errors.
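A rough sketch of the kind of swath-overlap analysis described above: grid the ground returns of two overlapping swaths and compute the RMSE of elevation differences where both swaths have data. The arrays, cell size, and synthetic test surface below are assumptions, not the consortium's processing chain.

```python
# Relative Z error between two overlapping lidar swaths, gridded comparison.
import numpy as np

def grid_mean_z(points, cell=2.0):
    """Average z per (ix, iy) grid cell; points is an Nx3 array of (x, y, z)."""
    cells = {}
    for x, y, z in points:
        cells.setdefault((int(x // cell), int(y // cell)), []).append(z)
    return {k: np.mean(v) for k, v in cells.items()}

def swath_overlap_rmse(swath_a, swath_b, cell=2.0):
    za, zb = grid_mean_z(swath_a, cell), grid_mean_z(swath_b, cell)
    dz = [za[k] - zb[k] for k in za.keys() & zb.keys()]   # cells present in both swaths
    return np.sqrt(np.mean(np.square(dz))) if dz else np.nan

# Synthetic example: two noisy samplings of the same gently sloping surface.
rng = np.random.default_rng(0)
xy = rng.uniform(0, 100, size=(5000, 2))
swath_a = np.column_stack([xy, 0.01 * xy[:, 0] + rng.normal(0, 0.05, 5000)])
swath_b = np.column_stack([xy, 0.01 * xy[:, 0] + rng.normal(0, 0.05, 5000)])
print(f"relative Z error ≈ {swath_overlap_rmse(swath_a, swath_b):.3f} m RMSE")
```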
Baudart, J; Guillaume, C; Mercier, A; Lebaron, P; Binet, M
2015-05-01
To develop a rapid and sensitive method to quantify viable Legionella spp. in cooling tower water samples. A rapid, culture-based method capable of quantifying as few as 600 Legionella microcolonies per litre within 2 days in industrial waters was developed. The method combines a short cultivation step of microcolonies on GVPC agar plate, specific detection of Legionella cells by a fluorescent in situ hybridization (FISH) approach, and a sensitive enumeration using a solid-phase cytometer. Following optimization of the cultivation conditions, the qualitative and quantitative performance of the method was assessed and the method was applied to 262 nuclear power plant cooling water samples. The performance of this method was in accordance with the culture method (NF-T 90-431) for Legionella enumeration. The rapid detection of viable Legionella in water is a major concern to the effective monitoring of this pathogenic bacterium in the main water sources involved in the transmission of legionellosis infection (Legionnaires' disease). The new method proposed here appears to be a robust, efficient and innovative means for rapidly quantifying cultivable Legionella in cooling tower water samples within 48 h. © 2015 The Society for Applied Microbiology.
NASA Astrophysics Data System (ADS)
Lougheed, Bryan; van der Lubbe, Jeroen; Davies, Gareth
2016-04-01
Accurate geochronologies are crucial for reconstructing the sensitivity of brackish and estuarine environments to rapidly changing past external impacts. A common geochronological method used for such studies is radiocarbon (14C) dating, but its application in brackish environments is severely limited by an inability to quantify spatiotemporal variations in 14C reservoir age, or R(t), due to dynamic interplay between river runoff and marine water. Additionally, old carbon effects and species-specific behavioural processes also influence 14C ages. Using the world's largest brackish water body (the estuarine Baltic Sea) as a test-bed, combined with a comprehensive approach that objectively excludes both old carbon and species-specific effects, we demonstrate that it is possible to use 87Sr/86Sr ratios to quantify R(t) in ubiquitous mollusc shell material, leading to almost one order of magnitude increase in Baltic Sea 14C geochronological precision over the current state-of-the-art. We propose that this novel proxy method can be developed for other brackish water bodies worldwide, thereby improving geochronological control in these climate sensitive, near-coastal environments.
El-Zakhem Naous, Ghada; Merhi, Areej; Abboud, Martine I; Mroueh, Mohamad; Taleb, Robin I
2018-06-06
The present study aims to quantify acrylamide in caffeinated beverages including American coffee, Lebanese coffee, espresso, instant coffee and hot chocolate, and to determine their carcinogenic and neurotoxic risks. A survey was carried out for this purpose, whereby 78% of the Lebanese population was found to consume at least one type of caffeinated beverage. Gas chromatography-mass spectrometry analysis revealed that the average acrylamide level in caffeinated beverages is 29,176 μg/kg sample. The daily consumption of acrylamide from Lebanese coffee (10.9 μg/kg-bw/day), hot chocolate (1.2 μg/kg-bw/day) and espresso (7.4 μg/kg-bw/day) was found to be higher than the risk intake for carcinogenicity and neurotoxicity as set by the World Health Organization (WHO; 0.3-2 μg/kg-bw/day) at both the mean (average consumers) and high (high consumers) dietary exposures. On the other hand, American coffee (0.37 μg/kg-bw/day) was shown to pose no carcinogenic or neurotoxic risks among the Lebanese community for consumers with a mean dietary exposure. The study shows alarming results that call for regulating the caffeinated product industry by setting legislation and standard protocols for product preparation in order to limit the acrylamide content and protect consumers. In order to avoid carcinogenic and neurotoxic risks, we propose that WHO/FAO set acrylamide levels in caffeinated beverages to 7000 μg acrylamide/kg sample, a value which is 4-fold lower than the average acrylamide level of 29,176 μg/kg sample found in caffeinated beverages sold in the Lebanese market. Alternatively, consumers of caffeinated products, especially Lebanese coffee and espresso, would have to lower their daily consumption to 0.3-0.4 cups/day. Copyright © 2018 Elsevier Ltd. All rights reserved.
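The exposure figures above follow simple dose arithmetic (concentration in the beverage x mass consumed / body weight). A worked sketch is shown below; the cup mass and body weight are assumptions for illustration, not the study's survey values.

```python
# Daily acrylamide intake in ug per kg body weight per day.
def daily_intake_ug_per_kg_bw(acrylamide_ug_per_kg_sample, grams_consumed_per_day, body_weight_kg):
    kg_consumed = grams_consumed_per_day / 1000.0
    return acrylamide_ug_per_kg_sample * kg_consumed / body_weight_kg

# Hypothetical: 29,176 ug/kg sample, 25 g of prepared coffee per day, 70 kg body weight.
intake = daily_intake_ug_per_kg_bw(29176, 25, 70)
print(f"{intake:.1f} ug/kg-bw/day vs WHO range 0.3-2 ug/kg-bw/day")
```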
Consumer exposure scenarios: development, challenges and possible solutions.
Van Engelen, J G M; Heinemeyer, G; Rodriguez, C
2007-12-01
Exposure scenarios (ES) under REACH (Registration, Evaluation, and Authorisation of Chemicals; new EU legislation) aim to describe safe conditions of product and substance use. Both operational conditions and risk management measures (RMMs) are part of the ES. For consumer use of chemicals, one of the challenges will be to identify all of the consumer uses of a given chemical and then quantify the exposure derived from each of them. Product use categories can be established to identify in a systematic fashion how products are used. These product categories comprise products that are used similarly (e.g. paints, adhesives). They deliver information about product use characteristics, and provide an easy-to-handle tool for exchanging standardised information. For practical reasons, broad ES will have to be developed, which cover a wide range of products and use. The challenge will be to define them broadly, but not in a way that they provide such an overestimation of exposure that a next iteration or a more complex model is always needed. Tiered and targeted approaches for estimation of exposure at the right level of detail may offer the best solution. RMMs relevant for consumers include those inherent to product design (controllable) and those that are communicated to consumers as directions for use (non-controllable). Quantification of the effect of non-controllable RMMs on consumer exposure can prove to be difficult. REACH requires aggregation of exposure from all relevant identified sources. Development of appropriate methodology for realistic aggregation of exposure will be no small challenge and will likely require probabilistic approaches and comprehensive databases on populations' habits, practices and behaviours. REACH regulation aims at controlling the use of chemicals so that exposure to every chemical can be demonstrated to be safe for consumers, workers, and the environment when considered separately, but also when considered in an integrated way. This integration will be another substantial challenge for the future.
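A minimal sketch (not REACH guidance) of how aggregate consumer exposure across several product uses might be estimated probabilistically, as the abstract above suggests; the product list, use amounts, weight fractions, and the 100% absorption assumption are all invented for illustration.

```python
# Monte Carlo aggregation of exposure over multiple hypothetical product uses.
import numpy as np

rng = np.random.default_rng(42)
n = 10_000                                              # Monte Carlo iterations
body_weight = rng.normal(70, 12, n).clip(40, 120)       # kg, assumed population

products = {
    # product: (amount used per day [g], weight fraction of the substance)
    "paint":    (rng.lognormal(1.0, 0.5, n), 0.02),
    "adhesive": (rng.lognormal(0.0, 0.7, n), 0.05),
}

# Aggregate exposure in mg per kg body weight per day (100% absorption assumed).
exposure_mg_per_kg = sum(amount * frac for amount, frac in products.values()) * 1000 / body_weight
print(f"median {np.median(exposure_mg_per_kg):.2f}, "
      f"95th percentile {np.percentile(exposure_mg_per_kg, 95):.2f} mg/kg bw/day")
```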
DEVELOPMENT OF MONOCLONAL ANTIBODIES AGAINST FATHEAD MINNOW (PIMEPHALES PROMELAS) VITELLOGENIN
We have obtained a panel of monoclonal antibodies directed against fathead minnow vitellogenin (Vtg) for use in sensitive ELISAs to quantify the response of exposure in vivo to estrogen or estrogen mimics.
Discord as a quantum resource for bi-partite communication
NASA Astrophysics Data System (ADS)
Chrzanowski, Helen M.; Gu, Mile; Assad, Syed M.; Symul, Thomas; Modi, Kavan; Ralph, Timothy C.; Vedral, Vlatko; Lam, Ping Koy
2014-12-01
Coherent interactions that generate negligible entanglement can still exhibit unique quantum behaviour. This observation has motivated a search beyond entanglement for a complete description of all quantum correlations. Quantum discord is a promising candidate. Here, we experimentally demonstrate that under certain measurement constraints, discord between bipartite systems can be consumed to encode information that can only be accessed by coherent quantum interactions. The inability to access this information by any other means allows us to use discord to directly quantify this `quantum advantage'.
Life Cycle Assessment to support the quantification of the environmental impacts of an event
DOE Office of Scientific and Technical Information (OSTI.GOV)
Toniolo, Sara; Mazzi, Anna; Fedele, Andrea
In recent years, several tools have been used to define and quantify the environmental impacts associated with an event; however, a lack of uniform approaches for conducting environmental evaluations has been revealed. The aim of this paper is to evaluate whether the Life Cycle Assessment methodology, which is rarely applied to an event, can be an appropriate tool for calculating the environmental impacts associated with the assembly, disassembly, and use phase of an event, analysing in particular the components and the displays used to establish the exhibits. The aim is also to include the issues reported by ISO 20121:2012 involving the interested parties that can be monitored but also affected by the event owner, namely the event organiser, the workforce and the supply chain. A small event held in Northern Italy was selected as the subject of the research. The results obtained show that the main contributors are energy consumption for lighting and heating and the use of aluminium materials, such as bars for supporting the spotlights, carpet and the electronic equipment. A sensitivity analysis for estimating the effects of the impact assessment method chosen has also been conducted and an uncertainty analysis has been performed using the Monte Carlo technique. This study highlighted the importance of the energy consumed by heating and lighting on the environmental implications, and indicated that the preparation and assembly should always be considered when quantifying the environmental profile of an event. - Highlights: • LCA methodology, developed for products and services, is applied to an event. • A small event held in Northern Italy is analysed. • The main contributors are energy consumption and the use of aluminium and carpet. • Exhibition site preparation can have important environmental implications. • This study demonstrates the importance of the assembly, disassembly and use phase.
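A simplified sketch of the kind of Monte Carlo uncertainty analysis mentioned above: lognormal uncertainty on inventory amounts propagated through fixed characterisation factors. All amounts and factors are placeholders, not the study's inventory.

```python
# Monte Carlo propagation of inventory uncertainty to a single impact total.
import numpy as np

rng = np.random.default_rng(1)
n = 20_000

# inventory item: (mean amount, geometric std dev, kg CO2-eq per unit) -- all hypothetical
inventory = {
    "electricity_kWh": (5000, 1.2, 0.45),
    "heat_MJ":         (30000, 1.3, 0.07),
    "aluminium_kg":    (400, 1.1, 8.0),
    "carpet_m2":       (600, 1.1, 3.5),
}

totals = np.zeros(n)
for mean, gsd, cf in inventory.values():
    amounts = rng.lognormal(np.log(mean), np.log(gsd), n)
    totals += amounts * cf

print(f"GWP: median {np.median(totals):.0f} kg CO2-eq, "
      f"2.5-97.5% range {np.percentile(totals, 2.5):.0f}-{np.percentile(totals, 97.5):.0f}")
```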
Sut-Lohmann, Magdalena; Raab, Thomas
2017-08-01
The continuous release of persistent iron-cyanide (Fe-CN) complexes from various industrial sources poses a high hazard to the environment and indicates the necessity to analyze a considerable amount of samples. Conventional flow injection analysis (FIA) is a time- and cost-consuming method for cyanide (CN) determination. Thus, a rapid and economic alternative needs to be developed to quantify the Fe-CN complexes. 52 soil samples were collected at a former Manufactured Gas Plant (MGP) site in order to determine the feasibility of diffuse reflectance infrared Fourier transform spectroscopy (DRIFTS). Soil analysis revealed CN concentrations in a range from 8 to 14,809 mg kg⁻¹, where 97% was in the solid form (Fe4[Fe(CN)6]3), which is characterized by a single symmetrical CN band in the range 2092-2084 cm⁻¹. The partial least squares (PLS) calibration-validation model revealed an IR response to CNtot exceeding 2306 mg kg⁻¹ (limit of detection, LOD). Leave-one-out cross-validation (LOO-CV) was performed on soil samples which contained low CNtot (<900 mg kg⁻¹). This improved the sensitivity of the model by reducing the LOD to 154 mg kg⁻¹. Finally, the LOO-CV conducted on the samples with CNtot > 900 mg kg⁻¹ resulted in an LOD equal to 3751 mg kg⁻¹. It was found that FTIR spectroscopy provides information concerning the different CN species in the soil samples. Additionally, it is suitable for quantifying Fe-CN species in matrices with CNtot > 154 mg kg⁻¹. Thus, FTIR spectroscopy, in combination with the statistical approach applied here, seems to be a feasible and quick method for screening of contaminated sites. Copyright © 2017 Elsevier Ltd. All rights reserved.
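A small sketch of a PLS calibration with leave-one-out cross-validation of the kind described above, using synthetic "spectra" in place of DRIFTS absorbances; the data generation and scikit-learn tooling are assumptions, not the study's workflow.

```python
# PLS regression with leave-one-out cross-validation on synthetic spectra.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import LeaveOneOut, cross_val_predict

rng = np.random.default_rng(0)
n_samples, n_wavenumbers = 52, 300
concentration = rng.uniform(10, 15000, n_samples)                 # mg/kg, synthetic references
band = np.exp(-0.5 * ((np.arange(n_wavenumbers) - 150) / 8) ** 2) # one CN-like band
X = np.outer(concentration, band) + rng.normal(0, 50, (n_samples, n_wavenumbers))
y = concentration

pls = PLSRegression(n_components=3)
y_cv = cross_val_predict(pls, X, y, cv=LeaveOneOut())
rmsecv = np.sqrt(np.mean((y - y_cv.ravel()) ** 2))
print(f"RMSECV ≈ {rmsecv:.0f} mg/kg")
```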
Demšar, Urška; Çöltekin, Arzu
2017-01-01
Eye movements provide insights into what people pay attention to, and therefore are commonly included in a variety of human-computer interaction studies. Eye movement recording devices (eye trackers) produce gaze trajectories, that is, sequences of gaze location on the screen. Despite recent technological developments that have enabled more affordable hardware, gaze data are still costly and time consuming to collect; therefore, some propose using mouse movements instead. These are easy to collect automatically and on a large scale. If and how these two movement types are linked, however, is less clear and highly debated. We address this problem in two ways. First, we introduce a new movement analytics methodology to quantify the level of dynamic interaction between the gaze and the mouse pointer on the screen. Our method uses a volumetric representation of movement, the space-time densities, which allows us to calculate interaction levels between two physically different types of movement. We describe the method and compare the results with existing dynamic interaction methods from movement ecology. The sensitivity to method parameters is evaluated on simulated trajectories where we can control interaction levels. Second, we perform an experiment with eye and mouse tracking to generate real data with real levels of interaction, to apply and test our new methodology on a real case. Further, as our experiment task mimics route-tracing when using a map, it is more than a data collection exercise and it simultaneously allows us to investigate the actual connection between the eye and the mouse. We find that there seems to be a natural coupling when the eyes are not under conscious control, but that this coupling breaks down when people are instructed to move their eyes intentionally. Based on these observations, we tentatively suggest that for natural tracing tasks, mouse tracking could potentially provide similar information as eye tracking and therefore be used as a proxy for attention. However, more research is needed to confirm this.
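For illustration only, and explicitly not the authors' space-time density method: a much simpler proxy for dynamic interaction is the fraction of synchronous samples in which gaze and mouse positions lie within a distance threshold. The trajectories and threshold below are synthetic assumptions.

```python
# Proximity-based interaction index between two synchronous 2D trajectories.
import numpy as np

def proximity_interaction(gaze_xy, mouse_xy, threshold_px=100):
    """Fraction of time steps where gaze and mouse are within threshold_px of each other."""
    d = np.linalg.norm(gaze_xy - mouse_xy, axis=1)
    return float(np.mean(d < threshold_px))

rng = np.random.default_rng(3)
t = np.linspace(0, 10, 600)                              # 60 Hz, 10 s of synthetic tracking
gaze = np.column_stack([400 + 200 * np.sin(t), 300 + 150 * np.cos(t)])
mouse = gaze + rng.normal(0, 60, gaze.shape)             # mouse loosely follows gaze
print(f"interaction index ≈ {proximity_interaction(gaze, mouse):.2f}")
```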
An intercept study to measure the extent to which New Zealand university students pre-game.
Riordan, Benjamin C; Conner, Tamlin S; Flett, Jayde A M; Droste, Nic; Cody, Louise; Brookie, Kate L; Riordan, Jessica K; Scarf, Damian
2018-02-01
We aimed to quantify the degree to which students pre-gamed in New Zealand, using self-report and breathalysers. A total of 569 New Zealand undergraduate students (men = 45.2%; first year = 81.4%) were interviewed as they entered three university-run concerts. We asked participants to report how many drinks they had consumed, their self-reported intoxication and the duration of their pre-gaming session. We then recorded participants' Breath Alcohol Concentration (BrAC; µg/L) and the time they arrived at the event. The number of participants who reported consuming alcohol before the event was 504 (88.6%) and the number of standard drinks consumed was high (M=6.9; median=6.0). A total of 237 (41.7%) participants could not have their BrAC recorded because they had consumed alcohol ≤10 minutes before the interview. The remaining 332 participants (57.3%) recorded a mean BrAC of 288.8 µg/L (median=280.0 µg/L). Gender, off-campus accommodation, length of pre-gaming drinking session, and time of arrival at the event were all associated with increased pre-gaming. Conclusion and implications for public health: Pre-gaming was the norm for students. Universities must take pre-gaming into account; policy implications include earlier start times of events and limiting students' access to alcohol prior to events. © 2017 The Authors.
Nicod, Elena; Jackson, Timothy L; Grimaccia, Federico; Angelis, Aris; Costen, Marc; Haynes, Richard; Hughes, Edward; Pringle, Edward; Zambarakji, Hadi; Kanavos, Panos
2016-11-01
The direct cost to the National Health Service (NHS) in England of pars plana vitrectomy (PPV) is unknown since a bottom-up costing exercise has not been undertaken. Healthcare resource group (HRG) costing relies on a top-down approach. We aimed to quantify the direct cost of intermediate complexity PPV. Five NHS vitreoretinal units prospectively recorded all consumables, equipment and staff salaries during PPV undertaken for vitreomacular traction, epiretinal membrane and macular hole. Out-of-surgery costs between admission and discharge were estimated using a representative accounting method. The average patient time in theatre for 57 PPVs was 72 min. The average in-surgery cost for staff was £297, consumables £619, and equipment £82 (total £997). The average out-of-surgery costs were £260, including nursing and medical staff, other consumables, eye drops and hospitalisation. The total cost was therefore £1634, including 30 % overheads. This cost estimate was an under-estimate because it did not include out-of-theatre consumables or equipment. The average reimbursed HRG tariff was £1701. The cost of undertaking PPV of intermediate complexity is likely to be higher than the reimbursed tariff, except for hospitals with high throughput, where amortisation costs benefit from economies of scale. Although this research was set in England, the methodology may provide a useful template for other countries.
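The cost aggregation described above can be reproduced with simple arithmetic. The sketch below uses the reported average subtotals and the 30% overhead uplift from the text.

```python
# Bottom-up cost aggregation for intermediate-complexity PPV, using reported averages.
in_surgery_total = 997        # staff £297 + consumables £619 + equipment £82 (as reported)
out_of_surgery = 260          # nursing/medical staff, other consumables, drops, hospitalisation
overhead_rate = 0.30

total = (in_surgery_total + out_of_surgery) * (1 + overhead_rate)
print(f"total cost incl. overheads ≈ £{total:.0f}")     # ≈ £1634, vs average HRG tariff £1701
```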
DeLeire, Thomas; Chappel, Andre; Finegold, Kenneth; Gee, Emily
2017-12-01
The Affordable Care Act (ACA) provides assistance to low-income consumers through both premium subsidies and cost-sharing reductions (CSRs). Low-income consumers' lack of health insurance literacy or information regarding CSRs may lead them to not take up CSR benefits for which they are eligible. We use administrative data from 2014 to 2016 on roughly 22 million health insurance plan choices of low-income individuals enrolled in ACA Marketplace coverage to assess whether they behave in a manner consistent with being aware of the availability of CSRs. We take advantage of discontinuous changes in the schedule of CSR benefits to show that consumers are highly sensitive to the value of CSRs when selecting insurance plans and that a very low percentage select dominated plans. These findings suggest that CSR subsidies are salient to consumers and that the program is well designed to account for any lack of health insurance literacy among the low-income population it serves. Copyright © 2017 Elsevier B.V. All rights reserved.
Failure of enzyme encapsulation to prevent sensitization of workers in the dry bleach industry.
Liss, G M; Kominsky, J R; Gallagher, J S; Melius, J; Brooks, S M; Bernstein, I L
1984-03-01
BDE added to dry bleach have been associated with immunologic sensitization and development of clinical allergic disease in detergent workers and occasionally in consumers. However, improved dust control and modification of the manufacturing process through encapsulation of enzyme were believed to have reduced or eliminated these problems. To determine whether or not immunologic sensitization could still develop in the detergent industry, we studied employees of a dry bleach manufacturing plant that incorporated encapsulated BDE into a consumer product. We performed air sampling for enzyme dust and total particulates, and administered questionnaires, physical examinations, and spirometry in 13 currently exposed, two previously exposed, and nine nonexposed employees. To assess sensitization status, RAST and ELISA were performed. Air concentrations of enzyme dust ranged from 0.002 to 1.57 micrograms/m3; all of these levels were below the TLV of 3.9 micrograms/m3. Positive BDE-specific RAST results (3.4%, 4.4%, and 8.0% binding) were obtained in three of 12 currently exposed workers. Results of personal breathing-zone air sampling indicated that these workers had high dust-exposure levels. Specificity of RAST was verified by RAST inhibition with BDE. BDE-RAST binding was not significantly elevated in the nonexposed workers (range: 0.6% to 1.4% binding). Positive results for specific IgG by ELISA were obtained in four of 12 currently exposed and in one of two previously exposed workers but in none of the nonexposed workers. We conclude that immunologic sensitization can develop after occupational exposure to encapsulated BDE in the dry bleach industry. We have not proved, however, that this immunologic reactivity is related to clinical sensitivity.
Berezowska, Aleksandra; Fischer, Arnout R H; Ronteltap, Amber; Kuznesof, Sharron; Macready, Anna; Fallaize, Rosalind; van Trijp, Hans C M
2014-01-01
Personalised nutrition (PN) may provide major health benefits to consumers. A potential barrier to the uptake of PN is consumers' reluctance to disclose sensitive information upon which PN is based. This study adopts the privacy calculus to explore how PN service attributes contribute to consumers' privacy risk and personalisation benefit perceptions. Sixteen focus groups (n = 124) were held in 8 EU countries and discussed 9 PN services that differed in terms of personal information, communication channel, service provider, advice justification, scope, frequency, and customer lock-in. Transcripts were content analysed. The personal information that underpinned PN contributed to both privacy risk perception and personalisation benefit perception. Disclosing information face-to-face mitigated the perception of privacy risk and amplified the perception of personalisation benefit. PN provided by a qualified expert and justified by scientific evidence increased participants' value perception. Enhancing convenience, offering regular face-to-face support, and employing customer lock-in strategies were perceived as beneficial. This study suggests that to encourage consumer adoption, PN has to account for face-to-face communication, expert advice providers, support, a lifestyle-change focus, and customised offers. The results provide an initial insight into service attributes that influence consumer adoption of PN. © 2014 S. Karger AG, Basel.
The water footprint of humanity
NASA Astrophysics Data System (ADS)
Mekonnen, M. M.; Hoekstra, A. Y.
2011-12-01
This study quantifies and maps the water footprint (WF) of humanity at a high spatial resolution level. It reports on consumptive use of rainwater (green WF) and ground and surface water (blue WF) and volumes of water polluted (grey WF). Water footprints are estimated per nation from both a production and consumption perspective. International virtual water flows are estimated based on trade in agricultural and industrial commodities. The global WF in the period 1996-2005 was 9087 Gm3/yr (74% green, 11% blue, 15% grey). Agricultural production contributes 92%. About one fifth of the global WF relates to production for export. The total volume of international virtual water flows related to trade in agricultural and industrial products was 2320 Gm3/yr (68% green, 13% blue, 19% grey). The WF of the global average consumer was 1385 m3/yr. The average consumer in the US has a WF of 2842 m3/yr, while the average citizens in China and India have WFs of 1071 m3/yr and 1089 m3/yr, respectively. Consumption of cereal products gives the largest contribution to the WF of the average consumer (27%), followed by meat (22%) and milk products (7%). The volume and pattern of consumption and the WF per ton of product of the products consumed are the main factors determining the WF of a consumer. The study illustrates the global dimension of water consumption and pollution by showing that several countries heavily rely on foreign water resources and that many countries have significant impacts on water consumption and pollution elsewhere.
Chen, Allison J; Linakis, James G; Mello, Michael J; Greenberg, Paul B
2013-06-01
To quantify and characterize eye injuries related to consumer products in the infant population (0-12 months) treated in United States hospital emergency departments during the period from 2001 to 2008. This study is a descriptive analysis of consumer-product related eye injury data derived from the National Electronic Injury Surveillance System, a probability sample of 100 hospitals nationwide with 24-hour emergency departments. Narrative data were used to assign each case with the consumer products (CPs) causing the eye injury. The proportions of eye injury visits were calculated by age, sex, diagnosis, disposition, locale of incident, and CP categories. We examined the US Consumer Product Safety Commission National Electronic Injury Surveillance System data for all nonfatal eye injuries (853 cases) in the infant population (0-12 months) treated in US emergency departments from 2001 to 2008. These data can be used to project national, annual, weighted estimates of nonfatal injury treated in US emergency departments. There were an estimated 21,271 visits to US emergency departments by patients aged 0-12 months for CP-related eye injuries during the study period. Of these, 63% involved infants aged 9-12 months and 54% involved male patients; 78% of all injuries occurred at home. The CPs causing the most eye injuries belonged to the categories of chemical (46%) and household items (24%). Contusions and abrasions were the leading eye injuries diagnoses (37%). This study suggests that most CP-related infant eye injuries in the United States occur at home and are predominantly caused by chemicals and household products. Published by Mosby, Inc.
The water footprint of humanity.
Hoekstra, Arjen Y; Mekonnen, Mesfin M
2012-02-28
This study quantifies and maps the water footprint (WF) of humanity at a high spatial resolution. It reports on consumptive use of rainwater (green WF) and ground and surface water (blue WF) and volumes of water polluted (gray WF). Water footprints are estimated per nation from both a production and consumption perspective. International virtual water flows are estimated based on trade in agricultural and industrial commodities. The global annual average WF in the period 1996-2005 was 9,087 Gm(3)/y (74% green, 11% blue, 15% gray). Agricultural production contributes 92%. About one-fifth of the global WF relates to production for export. The total volume of international virtual water flows related to trade in agricultural and industrial products was 2,320 Gm(3)/y (68% green, 13% blue, 19% gray). The WF of the global average consumer was 1,385 m(3)/y. The average consumer in the United States has a WF of 2,842 m(3)/y, whereas the average citizens in China and India have WFs of 1,071 and 1,089 m(3)/y, respectively. Consumption of cereal products gives the largest contribution to the WF of the average consumer (27%), followed by meat (22%) and milk products (7%). The volume and pattern of consumption and the WF per ton of product of the products consumed are the main factors determining the WF of a consumer. The study illustrates the global dimension of water consumption and pollution by showing that several countries heavily rely on foreign water resources and that many countries have significant impacts on water consumption and pollution elsewhere.
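A toy calculation in the spirit of the consumer water footprint figures above: the WF of a consumer is the sum over consumed products of annual consumption (tonnes) times the WF per tonne. The product list and values below are illustrative placeholders, not the study's data.

```python
# Consumer water footprint from consumption volumes and WF per tonne (illustrative values).
wf_per_tonne_m3 = {"cereals": 1600, "beef": 15400, "milk": 1000}       # assumed values
annual_consumption_t = {"cereals": 0.15, "beef": 0.03, "milk": 0.09}   # assumed values

wf_consumer = sum(annual_consumption_t[p] * wf_per_tonne_m3[p] for p in wf_per_tonne_m3)
print(f"food-related consumer WF ≈ {wf_consumer:.0f} m3/yr")
```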
Uncertainty quantification and global sensitivity analysis of the Los Alamos sea ice model
Urrego-Blanco, Jorge Rolando; Urban, Nathan Mark; Hunke, Elizabeth Clare; ...
2016-04-01
Changes in the high-latitude climate system have the potential to affect global climate through feedbacks with the atmosphere and connections with midlatitudes. Sea ice and climate models used to understand these changes have uncertainties that need to be characterized and quantified. We present a quantitative way to assess uncertainty in complex computer models, which is a new approach in the analysis of sea ice models. We characterize parametric uncertainty in the Los Alamos sea ice model (CICE) in a standalone configuration and quantify the sensitivity of sea ice area, extent, and volume with respect to uncertainty in 39 individual model parameters. Unlike common sensitivity analyses conducted in previous studies where parameters are varied one at a time, this study uses a global variance-based approach in which Sobol' sequences are used to efficiently sample the full 39-dimensional parameter space. We implement a fast emulator of the sea ice model whose predictions of sea ice extent, area, and volume are used to compute the Sobol' sensitivity indices of the 39 parameters. Main effects and interactions among the most influential parameters are also estimated by a nonparametric regression technique based on generalized additive models. A ranking based on the sensitivity indices indicates that model predictions are most sensitive to snow parameters such as snow conductivity and grain size, and the drainage of melt ponds. Lastly, it is recommended that research be prioritized toward more accurately determining these most influential parameter values by observational studies or by improving parameterizations in the sea ice model.
Uncertainty quantification and global sensitivity analysis of the Los Alamos sea ice model
NASA Astrophysics Data System (ADS)
Urrego-Blanco, Jorge R.; Urban, Nathan M.; Hunke, Elizabeth C.; Turner, Adrian K.; Jeffery, Nicole
2016-04-01
Changes in the high-latitude climate system have the potential to affect global climate through feedbacks with the atmosphere and connections with midlatitudes. Sea ice and climate models used to understand these changes have uncertainties that need to be characterized and quantified. We present a quantitative way to assess uncertainty in complex computer models, which is a new approach in the analysis of sea ice models. We characterize parametric uncertainty in the Los Alamos sea ice model (CICE) in a standalone configuration and quantify the sensitivity of sea ice area, extent, and volume with respect to uncertainty in 39 individual model parameters. Unlike common sensitivity analyses conducted in previous studies where parameters are varied one at a time, this study uses a global variance-based approach in which Sobol' sequences are used to efficiently sample the full 39-dimensional parameter space. We implement a fast emulator of the sea ice model whose predictions of sea ice extent, area, and volume are used to compute the Sobol' sensitivity indices of the 39 parameters. Main effects and interactions among the most influential parameters are also estimated by a nonparametric regression technique based on generalized additive models. A ranking based on the sensitivity indices indicates that model predictions are most sensitive to snow parameters such as snow conductivity and grain size, and the drainage of melt ponds. It is recommended that research be prioritized toward more accurately determining these most influential parameter values by observational studies or by improving parameterizations in the sea ice model.
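A minimal illustration of a variance-based Sobol' analysis like the one described above, using the SALib package (an assumed tooling choice) and a toy three-parameter function standing in for the CICE emulator; this is not the study's code or its 39-parameter setup.

```python
# Sobol' sensitivity indices for a toy model via SALib.
import numpy as np
from SALib.sample import saltelli
from SALib.analyze import sobol

problem = {
    "num_vars": 3,
    "names": ["snow_conductivity", "snow_grain_size", "pond_drainage"],  # illustrative names
    "bounds": [[0.1, 0.5], [0.5, 2.0], [0.0, 1.0]],
}

X = saltelli.sample(problem, 1024)           # Sobol'-sequence-based sampling design
# Toy "model": a nonlinear response standing in for emulated sea ice volume.
Y = np.sin(X[:, 0] * 6) + 0.5 * X[:, 1] ** 2 + 0.2 * X[:, 0] * X[:, 2]

Si = sobol.analyze(problem, Y)
for name, s1, st in zip(problem["names"], Si["S1"], Si["ST"]):
    print(f"{name:20s} S1={s1:5.2f}  ST={st:5.2f}")
```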
Quantification of Carbohydrates in Grape Tissues Using Capillary Zone Electrophoresis
Zhao, Lu; Chanon, Ann M.; Chattopadhyay, Nabanita; Dami, Imed E.; Blakeslee, Joshua J.
2016-01-01
Soluble sugars play an important role in freezing tolerance in both herbaceous and woody plants, functioning in both the reduction of freezing-induced dehydration and the cryoprotection of cellular constituents. The quantification of soluble sugars in plant tissues is, therefore, essential in understanding freezing tolerance. While a number of analytical techniques and methods have been used to quantify sugars, most of these are expensive and time-consuming due to complex sample preparation procedures which require the derivatization of the carbohydrates being analyzed. Analysis of soluble sugars using capillary zone electrophoresis (CZE) under alkaline conditions with direct UV detection has previously been used to quantify simple sugars in fruit juices. However, it was unclear whether CZE-based methods could be successfully used to quantify the broader range of sugars present in complex plant extracts. Here, we present the development of an optimized CZE method capable of separating and quantifying mono-, di-, and tri-saccharides isolated from plant tissues. This optimized CZE method employs a column electrolyte buffer containing 130 mM NaOH, pH 13.0, creating a current of 185 μA when a separation voltage of 10 kV is employed. The optimized CZE method provides limits-of-detection (an average of 1.5 ng/μL) for individual carbohydrates comparable or superior to those obtained using gas chromatography–mass spectrometry, and allows resolution of non-structural sugars and cell wall components (structural sugars). The optimized CZE method was successfully used to quantify sugars from grape leaves and buds, and is a robust tool for the quantification of plant sugars found in vegetative and woody tissues. The increased analytical efficiency of this CZE method makes it ideal for use in high-throughput metabolomics studies designed to quantify plant sugars. PMID:27379118
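A short sketch of the calibration arithmetic behind a limit of detection such as those quoted above: fit a linear calibration of peak area versus concentration and take LOD ≈ 3.3 x (standard deviation of the blank response) / slope. The standards, areas, and blank variability below are invented.

```python
# Linear calibration and LOD estimate from synthetic CZE peak areas.
import numpy as np

conc = np.array([0, 5, 10, 25, 50, 100])                  # ng/uL standards, hypothetical
area = np.array([0.2, 11.0, 20.5, 51.0, 101.5, 203.0])    # arbitrary peak-area units

slope, intercept = np.polyfit(conc, area, 1)
blank_sd = 0.5                                            # sd of replicate blanks (assumed)
lod = 3.3 * blank_sd / slope
print(f"slope {slope:.2f}, LOD ≈ {lod:.2f} ng/uL")
```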
Evangelista, Dennis J.; Ray, Dylan D.; Hedrick, Tyson L.
2016-01-01
Ecological, behavioral and biomechanical studies often need to quantify animal movement and behavior in three dimensions. In laboratory studies, a common tool to accomplish these measurements is the use of multiple, calibrated high-speed cameras. Until very recently, the complexity, weight and cost of such cameras have made their deployment in field situations risky; furthermore, such cameras are not affordable to many researchers. Here, we show how inexpensive, consumer-grade cameras can adequately accomplish these measurements both within the laboratory and in the field. Combined with our methods and open source software, the availability of inexpensive, portable and rugged cameras will open up new areas of biological study by providing precise 3D tracking and quantification of animal and human movement to researchers in a wide variety of field and laboratory contexts. PMID:27444791
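The core operation behind multi-camera 3D tracking is triangulation from calibrated views. Below is a bare-bones linear (DLT-style) triangulation sketch; the projection matrices and pixel observations are synthetic, and this is not the authors' software.

```python
# Linear triangulation of a 3D point from two calibrated camera views.
import numpy as np

def triangulate(P1, P2, uv1, uv2):
    """Return the 3D point minimising the algebraic error for two views."""
    A = np.vstack([
        uv1[0] * P1[2] - P1[0],
        uv1[1] * P1[2] - P1[1],
        uv2[0] * P2[2] - P2[0],
        uv2[1] * P2[2] - P2[1],
    ])
    _, _, vt = np.linalg.svd(A)
    X = vt[-1]
    return X[:3] / X[3]

# Synthetic example: two cameras 0.5 m apart observing the point (1, 2, 10).
K = np.array([[800, 0, 320], [0, 800, 240], [0, 0, 1]], float)
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = K @ np.hstack([np.eye(3), np.array([[-0.5], [0.0], [0.0]])])
Xw = np.array([1.0, 2.0, 10.0, 1.0])
uv1 = (P1 @ Xw)[:2] / (P1 @ Xw)[2]
uv2 = (P2 @ Xw)[:2] / (P2 @ Xw)[2]
print(triangulate(P1, P2, uv1, uv2))      # ≈ [1, 2, 10]
```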
Incorrect interpretation of carbon mass balance biases global vegetation fire emission estimates.
Surawski, N C; Sullivan, A L; Roxburgh, S H; Meyer, C P Mick; Polglase, P J
2016-05-05
Vegetation fires are a complex phenomenon in the Earth system with many global impacts, including influences on global climate. Estimating carbon emissions from vegetation fires relies on a carbon mass balance technique that has evolved with two different interpretations. Databases of global vegetation fire emissions use an approach based on 'consumed biomass', which is an approximation to the biogeochemically correct 'burnt carbon' approach. Here we show that applying the 'consumed biomass' approach to global emissions from vegetation fires leads to annual overestimates of carbon emitted to the atmosphere by 4.0% or 100 Tg compared with the 'burnt carbon' approach. The required correction is significant and represents ∼9% of the net global forest carbon sink estimated annually. Vegetation fire emission studies should use the 'burnt carbon' approach to quantify and understand the role of this burnt carbon, which is not emitted to the atmosphere, as a sink enriched in carbon.
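A schematic illustration (made-up numbers) of the difference between the two interpretations: the 'consumed biomass' approach assigns all carbon in the biomass consumed by fire to the atmosphere, whereas the 'burnt carbon' approach excludes the carbon retained in char and ash, which is not emitted.

```python
# Contrast of the two carbon mass balance interpretations, with illustrative values.
biomass_consumed_tC = 100.0      # carbon in biomass consumed by fire (hypothetical)
char_fraction = 0.04             # fraction of that carbon retained as char/ash (assumed)

emissions_consumed_biomass = biomass_consumed_tC
emissions_burnt_carbon = biomass_consumed_tC * (1 - char_fraction)

overestimate = emissions_consumed_biomass / emissions_burnt_carbon - 1
print(f"overestimate ≈ {overestimate:.1%}")   # same order as the ~4% reported above
```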
Sugimoto, Masahiro; Obiya, Shinichi; Kaneko, Miku; Enomoto, Ayame; Honma, Mayu; Wakayama, Masataka; Soga, Tomoyoshi; Tomita, Masaru
2017-01-18
Dry-cured hams are popular among consumers. To increase the attractiveness of the product, objective analytical methods and algorithms to evaluate the relationship between observable properties and consumer acceptability are required. In this study, metabolomics, which is used for quantitative profiling of hundreds of small molecules, was applied to 12 kinds of dry-cured hams from Japan and Europe. In total, 203 charged metabolites, including amino acids, organic acids, nucleotides, and peptides, were successfully identified and quantified. Metabolite profiles were compared for the samples with different countries of origin and processing methods (e.g., smoking or use of a starter culture). Principal component analysis of the metabolite profiles with sensory properties revealed significant correlations for redness, homogeneity, and fat whiteness. This approach could be used to design new ham products by objective evaluation of various features.
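A compact sketch of the analysis pattern described above: principal component analysis of a metabolite concentration matrix, followed by correlation of the leading component with a sensory attribute. The data here are random placeholders, not the ham profiles.

```python
# PCA of metabolite profiles and correlation of PC1 with a sensory score.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(7)
n_hams, n_metabolites = 12, 203
metabolites = rng.lognormal(0, 1, (n_hams, n_metabolites))   # placeholder concentrations
redness = rng.uniform(1, 9, n_hams)                          # placeholder sensory scores

scores = PCA(n_components=2).fit_transform(StandardScaler().fit_transform(metabolites))
r = np.corrcoef(scores[:, 0], redness)[0, 1]
print(f"PC1 vs redness: r = {r:.2f}")
```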
Hosios, Aaron M.; Hecht, Vivian C.; Danai, Laura V.; Johnson, Marc O.; Rathmell, Jeffrey C.; Steinhauser, Matthew L.; Manalis, Scott R.; Vander Heiden, Matthew G.
2016-01-01
Cells must duplicate their mass in order to proliferate. Glucose and glutamine are the major nutrients consumed by proliferating mammalian cells, but the extent to which these and other nutrients contribute to cell mass is unknown. We quantified the fraction of cell mass derived from different nutrients and find that the majority of carbon mass in cells is derived from other amino acids, which are consumed at much lower rates than glucose and glutamine. While glucose carbon has diverse fates, glutamine contributes most to protein, and this suggests that glutamine’s ability to replenish TCA cycle intermediates (anaplerosis) is primarily used for amino acid biosynthesis. These findings demonstrate that rates of nutrient consumption are indirectly associated with mass accumulation and suggest that high rates of glucose and glutamine consumption support rapid cell proliferation beyond providing carbon for biosynthesis. PMID:26954548
An Air Revitalization Model (ARM) for Regenerative Life Support Systems (RLSS)
NASA Technical Reports Server (NTRS)
Hart, Maxwell M.
1990-01-01
The primary objective of the air revitalization model (ARM) is to determine the minimum buffer capacities that would be necessary for long-duration space missions. Several observations are supported by the current configuration sizes: the baseline values for each gas and the day-to-day or month-to-month fluctuations that are allowed. The baseline values depend on the minimum safety tolerances and the quantities of life support consumables necessary to survive the worst-case scenarios within those tolerances. Most, if not all, of these quantities can easily be determined by ARM once these tolerances are set. The day-to-day fluctuations also require a command decision. It is already apparent from the current configuration of ARM that the tighter these fluctuations are controlled, the more energy is used, the more nonregenerable hydrazine is consumed, and the larger the required capacities for the various gas generators. All of these relationships could clearly be quantified by one operational ARM.
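A toy sketch of the buffer-sizing idea described above, not the ARM itself: step through a timeline of gas production and consumption and take the largest peak-to-trough drawdown of the cumulative balance as the minimum buffer capacity. All rates and the duty cycle are invented.

```python
# Minimum O2 buffer sizing from a worst-case production/consumption timeline.
import numpy as np

hours = np.arange(24 * 30)                                  # one month, hourly steps
crew_o2_use = np.full_like(hours, 0.035, dtype=float)       # kg O2/h consumed (assumed)
o2_production = np.where((hours % 24) < 16, 0.055, 0.0)     # generator on 16 h/day (assumed)

balance = np.cumsum(o2_production - crew_o2_use)            # cumulative net O2 (kg)
drawdown = np.maximum.accumulate(balance) - balance         # peak-to-trough deficit
print(f"minimum O2 buffer ≈ {drawdown.max():.2f} kg")
```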
Stevens, Laura J; Burgess, John R; Stochelski, Mateusz A; Kuczek, Thomas
2014-02-01
Artificial food colors (AFCs) are widely used to color foods and beverages. The amount of AFCs the Food and Drug Administration has certified has increased more than 5-fold, from 12 mg/capita/day in 1950 to 68 mg/capita/day in 2012. Over the past 38 years, there have been studies of adverse behavioral reactions, such as hyperactivity in children, to double-blind challenges with AFCs. Studies that used 50 mg or more of AFCs as the challenge showed a greater negative effect on more children than those that used less. The study reported here is the first to quantify the amounts of AFCs in foods (specifically in beverages) commonly consumed by children in the United States. Consumption data for all foods would be helpful in the design of more challenge studies. The data summarized here should help clinicians advise parents about AFCs and beverage consumption.
Pantoja-Lima, Jackson; Aride, Paulo H R; de Oliveira, Adriano T; Félix-Silva, Daniely; Pezzuti, Juarez C B; Rebêlo, George H
2014-01-27
Consumption of turtles by natives and settlers in the Amazon and Orinoco has been widely studied in scientific communities. Accepted cultural customs and the local dietary and monetary needs need to be taken into account in conservation programs, and when implementing federal laws related to consumption and fishing methods. This study was conducted around the Purus River, a region known for the consumption and illegal trade of turtles. The objective of this study was to quantify the illegal turtle trade in Tapauá and to understand its effect on the local economy. This study was conducted in the municipality of Tapauá in the state of Amazonas, Brazil. To estimate turtle consumption, interviews were conducted over 2 consecutive years (2006 and 2007) in urban areas and isolated communities. The experimental design was randomized with respect to type of household. To study the turtle fishery and trade chain, we used snowball sampling methodology. During our study period, 100% of respondents reported consuming at least three species of turtles (Podocnemis spp.). Our estimates indicate that about 34 tons of animals are consumed annually in Tapauá along the margins of a major fishing river in the Amazon. At least five components of the chain of commercialization of turtles on the Purus River are identified: (1) Indigenous Apurinã; (2) residents of bordering villages (communities); (3) local smugglers, who buy and sell turtles to the community in exchange for manufactured goods; (4) regional smugglers, who buy in Tapauá, Lábrea, and Beruri to sell in Manaus and Manacapuru; and finally, (5) professional fishermen. We quantify the full impact of turtle consumption and advocate the conservation of the region's turtle populations. The Brazilian government should initiate a new turtle consumption management program which involves the opinions of consumers. With these measures, the conservation of freshwater turtles in the Brazilian Amazon is possible.
Benke, Arthur C
2018-03-31
The majority of food web studies are based on connectivity, top-down impacts, bottom-up flows, or trophic position (TP), and ecologists have argued for decades which is best. Rarely have any two been considered simultaneously. The present study uses a procedure that integrates the last three approaches based on taxon-specific secondary production and gut analyses. Ingestion flows are quantified to create a flow web and the same data are used to quantify TP for all taxa. An individual predator's impacts also are estimated using the ratio of its ingestion (I) of each prey to prey production (P) to create an I/P web. This procedure was applied to 41 invertebrate taxa inhabiting submerged woody habitat in a southeastern U.S. river. A complex flow web starting with five basal food resources had 462 flows >1 mg·m -2 ·yr -1 , providing far more information than a connectivity web. Total flows from basal resources to primary consumers/omnivores were dominated by allochthonous amorphous detritus and ranged from 1 to >50,000 mg·m -2 ·yr -1 . Most predator-prey flows were much lower (<50 mg·m -2 ·yr -1 ), but some were >1,000 mg·m -2 ·yr -1 . The I/P web showed that 83% of individual predator impacts were weak (<10%), whereas total predator impacts were often strong (e.g., 35% of prey sustained an impact >90%). Quantitative estimates of TP ranged from 2 to 3.7, contrasting sharply with seven integer-based trophic levels based on longest feeding chain. Traditional omnivores (TP = 2.4-2.9) played an important role by consuming more prey and exerting higher impacts on primary consumers than strict predators (TP ≥ 3). This study illustrates how simultaneous quantification of flow pathways, predator impacts, and TP together provide an integrated characterization of natural food webs. © 2018 by the Ecological Society of America.
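A small sketch of the trophic position calculation implied above: TP_i = 1 + sum_j d_ij x TP_j, where d_ij is the fraction of taxon i's diet coming from food j and basal resources have TP = 1. The toy three-taxon web and diet fractions below are invented, not the study's 41-taxon data.

```python
# Trophic position from a diet-fraction matrix, solved as a linear system.
import numpy as np

taxa = ["collector", "omnivore", "predator"]
d_basal = np.array([1.0, 0.6, 0.0])     # diet fraction from detritus (TP = 1)
M = np.array([                          # diet fractions on the consumers themselves
    [0.0, 0.0, 0.0],                    # collector eats no consumers
    [0.4, 0.0, 0.0],                    # omnivore: 40% collectors
    [0.7, 0.3, 0.0],                    # predator: 70% collectors, 30% omnivores
])

tp = np.linalg.solve(np.eye(3) - M, 1.0 + d_basal)
for name, t in zip(taxa, tp):
    print(f"{name:10s} TP = {t:.2f}")   # 2.00, 2.40, 3.12

# Predator impact on a prey as an I/P ratio: ingestion of that prey / prey production.
ingestion_mg_m2_yr, prey_production_mg_m2_yr = 120.0, 400.0   # hypothetical values
print(f"I/P = {ingestion_mg_m2_yr / prey_production_mg_m2_yr:.0%}")
```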
Bonny, S P F; Gardner, G E; Pethick, D W; Allen, P; Legrand, I; Wierzbicki, J; Farmer, L J; Polkinghorne, R J; Hocquette, J-F
2017-08-01
The beef industry must become more responsive to the changing marketplace and consumer demands. An essential part of this is quantifying a consumer's perception of the eating quality of beef and their willingness to pay for that quality, across a broad range of demographics. Over 19,000 consumers from Northern Ireland, Poland, Ireland and France each tasted seven beef samples and scored them for tenderness, juiciness, flavour liking and overall liking. These scores were weighted and combined to create a fifth score, termed the Meat Quality 4 score (MQ4) (0.3×tenderness, 0.1×juiciness, 0.3×flavour liking and 0.3×overall liking). They also allocated the beef samples into one of four quality grades that best described the sample: unsatisfactory, good-every-day, better-than-every-day or premium. After the completion of the tasting panel, consumers were then asked to detail, in their own currency, their willingness to pay for these four categories, which was subsequently converted to a proportion relative to the good-every-day category (P-WTP). Consumers also answered a short demographic questionnaire. The four sensory scores, the MQ4 score and the P-WTP were analysed separately, as dependent variables in linear mixed effects models. The answers from the demographic questionnaire were included in the model as fixed effects. Overall, there were only small differences in consumer scores and P-WTP between demographic groups. Consumers who preferred their beef cooked medium or well-done scored beef higher, except in Poland, where the opposite trend was found. This may be because Polish consumers were more likely to prefer their beef cooked well-done, but samples were cooked medium for this group. There was a small positive relationship with the importance of beef in the diet, increasing sensory scores by about 4% in Poland and Northern Ireland. Men also scored beef about 2% higher than women for most sensory scores in most countries. In most countries, consumers were willing to pay between 150 and 200% more for premium beef, and there was a 50% penalty in value for unsatisfactory beef. After quality grade, by far the greatest influence on P-WTP was country of origin. Consumer age also had a small negative relationship with P-WTP. The results indicate that a single quality score could reliably describe the eating quality experienced by all consumers. In addition, if reliable quality information is delivered to consumers they will pay more for better quality beef, which would add value to the beef industry and encourage improvements in quality.
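Because MQ4 is a fixed weighted sum of the four sensory scores, it can be reproduced directly from the weights given in the abstract. A minimal sketch follows; the example scores are invented.

```python
def mq4(tenderness, juiciness, flavour_liking, overall_liking):
    """Meat Quality 4 (MQ4) composite score using the weights reported above."""
    return (0.3 * tenderness + 0.1 * juiciness
            + 0.3 * flavour_liking + 0.3 * overall_liking)

# Hypothetical consumer scores on a 0-100 scale for one beef sample.
print(mq4(tenderness=62, juiciness=70, flavour_liking=58, overall_liking=60))  # ≈ 61.0
```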
Remote identification of individual volunteer cotton plants
USDA-ARS?s Scientific Manuscript database
Although airborne multispectral remote sensing can identify fields of small cotton plants, improvements to detection sensitivity are needed to identify individual or small clusters of plants that can similarly provide habitat for boll weevils. However, when consumer-grade cameras are used, each pix...
76 FR 79275 - Truth in Savings (Regulation DD)
Federal Register 2010, 2011, 2012, 2013, 2014
2011-12-21
... disclosure. Sensitive personal information, such as account numbers or social security numbers, should not be... improved, and consumers' ability to make informed decisions regarding deposit accounts would be... regulations, while making information on the other regulations available. The Bureau expects to conduct...
78 FR 46578 - Agency Information Collection Activities: Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2013-08-01
... , including any personal information provided. Sensitive personal information, such as account numbers or... to understand the impact of bundled products and services on the financial decision- making of... BUREAU OF CONSUMER FINANCIAL PROTECTION [Docket No CFPB-2013-0024] Agency Information Collection...
Cook, Jesse D; Prairie, Michael L; Plante, David T
2018-04-30
To evaluate the ability of a multisensory fitness tracker, the Jawbone UP3 (JB3), to quantify and classify sleep in patients with suspected central disorders of hypersomnolence. This study included 43 patients who completed polysomnography (PSG) and a Multiple Sleep Latency Test (MSLT) with concurrent wrist-worn JB3 and Actiwatch 2 (AW2) recordings for comparison. Mean differences in nocturnal sleep architecture variables were compared using Bland-Altman analysis. Sensitivity, specificity, and accuracy were derived for both devices relative to PSG. Ability of the JB3 to detect sleep onset rapid eye movement periods (SOREMPs) during MSLT naps was also quantified. JB3 demonstrated a significant overestimation of total sleep time (39.6 min, P < .0001) relative to PSG, but performed comparably to AW2. Although the ability of the JB3 to detect epochs of sleep was relatively good (sensitivity = 0.97), its ability to distinguish light, deep, and REM sleep was poor. Similarly, the JB3 did not correctly identify a single SOREMP during any MSLT nap opportunity. The JB3 did not accurately quantify or classify sleep in patients with suspected central disorders of hypersomnolence, and was particularly poor at identifying REM sleep. Thus, this device cannot be used as a surrogate for PSG or MSLT in the assessment of patients with suspected central disorders of hypersomnolence. Copyright © 2018 American Academy of Sleep Medicine. All rights reserved.
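Device-versus-PSG sensitivity, specificity, and accuracy of this kind come from an epoch-by-epoch 2×2 comparison, with PSG treated as ground truth and the device output collapsed to sleep/wake. The sketch below illustrates the calculation on an invented epoch series; it is not the study's scoring pipeline.

```python
import numpy as np

def epoch_agreement(psg_sleep, device_sleep):
    """Epoch-by-epoch agreement of a wearable against PSG (1 = sleep, 0 = wake),
    with PSG treated as ground truth."""
    psg = np.asarray(psg_sleep, dtype=bool)
    dev = np.asarray(device_sleep, dtype=bool)
    tp = np.sum(psg & dev)     # sleep correctly scored as sleep
    tn = np.sum(~psg & ~dev)   # wake correctly scored as wake
    fp = np.sum(~psg & dev)    # wake scored as sleep (drives TST overestimation)
    fn = np.sum(psg & ~dev)    # sleep scored as wake
    return {"sensitivity": float(tp / (tp + fn)),
            "specificity": float(tn / (tn + fp)),
            "accuracy": float((tp + tn) / psg.size)}

# Hypothetical 10-epoch example: the device over-scores sleep, so sensitivity is
# high while specificity suffers.
print(epoch_agreement([1, 1, 1, 0, 0, 1, 1, 0, 1, 1],
                      [1, 1, 1, 1, 0, 1, 1, 1, 1, 1]))
```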
Detection of potentially skin sensitizing hydroperoxides of linalool in fragranced products.
Kern, Susanne; Dkhil, Hafida; Hendarsa, Prisca; Ellis, Graham; Natsch, Andreas
2014-10-01
On prolonged exposure to air, linalool can form sensitizing hydroperoxides. Positive hydroperoxide patch tests in dermatitis patients have frequently been reported, but their relevance has not been established. Owing to a lack of analytical methods and data, it is unclear from which sources the public might be exposed to sufficient quantities of hydroperoxides for induction of sensitization to occur. To address this knowledge gap, we developed analytical methods and performed stability studies for fine fragrances and deodorants/antiperspirants. In parallel, products recalled from consumers were analysed to investigate exposure to products used in everyday life. Liquid chromatography-mass spectrometry with high mass resolution was found to be optimal for the selective and sensitive detection of the organic hydroperoxide in the complex product matrix. Linalool hydroperoxide was detected in natural linalool, but the amount was not elevated by storage in a perfume formulation exposed to air. No indication of hydroperoxide formation in fine fragrances was found in stability studies. Aged fine fragrances recalled from consumers contained a geometric mean linalool concentration of 1,888 μg/g and, corrected for matrix effects, linalool hydroperoxide at a concentration of around 14 μg/g. In antiperspirants, we detected no oxidation products. In conclusion, very low levels of linalool hydroperoxide in fragranced products may originate from raw materials, but we found no evidence for oxidation during storage of products. The levels detected are orders of magnitude below the levels inducing sensitization in experimental animals, and these results therefore do not substantiate a causal link between potential hydroperoxide formation in cosmetics and positive results of patch tests.
Microscopic lymph node tumor burden quantified by macroscopic dual-tracer molecular imaging
Tichauer, Kenneth M.; Samkoe, Kimberley S.; Gunn, Jason R.; Kanick, Stephen C.; Hoopes, P. Jack; Barth, Richard J.; Kaufman, Peter A.; Hasan, Tayyaba; Pogue, Brian W.
2014-01-01
Lymph node biopsy (LNB) is employed in many cancer surgeries to identify metastatic disease and stage the cancer, yet morbidity and diagnostic delays associated with LNB could be avoided if non-invasive imaging of nodal involvement was reliable. Molecular imaging has potential in this regard; however, variable delivery and nonspecific uptake of imaging tracers has made conventional approaches ineffective clinically. A method of correcting for non-specific uptake with injection of a second untargeted tracer is presented, allowing tumor burden in lymph nodes to be quantified. The approach was confirmed in an athymic mouse model of metastatic human breast cancer targeting epidermal growth factor receptor, a cell surface receptor overexpressed by many cancers. A significant correlation was observed between in vivo (dual-tracer) and ex vivo measures of tumor burden (r = 0.97, p < 0.01), with an ultimate sensitivity of approximately 200 cells (potentially more sensitive than conventional LNB). PMID:25344739
DOE Office of Scientific and Technical Information (OSTI.GOV)
Peace, Gerald; Goering, Timothy James; Miller, Mark Laverne
2007-01-01
A probabilistic performance assessment has been conducted to evaluate the fate and transport of radionuclides (americium-241, cesium-137, cobalt-60, plutonium-238, plutonium-239, radium-226, radon-222, strontium-90, thorium-232, tritium, uranium-238), heavy metals (lead and cadmium), and volatile organic compounds (VOCs) at the Mixed Waste Landfill (MWL). Probabilistic analyses were performed to quantify uncertainties inherent in the system and models for a 1,000-year period, and sensitivity analyses were performed to identify parameters and processes that were most important to the simulated performance metrics. Comparisons between simulated results and measured values at the MWL were made to gain confidence in the models and perform calibrations when data were available. In addition, long-term monitoring requirements and triggers were recommended based on the results of the quantified uncertainty and sensitivity analyses.
Wan, Haibao; Umstot, Edward S; Szeto, Hazel H; Schiller, Peter W; Desiderio, Dominic M
2004-04-15
The synthetic opioid peptide analog Dmt-D-Arg-Phe-Lys-NH(2) ([Dmt(1)]DALDA; Dmt = 2',6'-dimethyltyrosine) is a highly potent and selective mu opioid-receptor agonist. A very sensitive and robust capillary liquid chromatography/nanospray ion-trap (IT) mass spectrometry method has been developed to quantify [Dmt(1)]DALDA in ovine plasma, using deuterated [Dmt(1)]DALDA as the internal standard. The standard MS/MS spectra of d(0)- and d(5)-[Dmt(1)]DALDA were obtained, and the collision energy was experimentally optimized to 25%. The product ion [M + 2H - NH(3)](2+) (m/z 312.2) was used to identify and to quantify the synthetic opioid peptide analog in ovine plasma samples. The MS/MS detection sensitivity for [Dmt(1)]DALDA was 625 amol. A calibration curve was constructed, and quantitative analysis was performed on a series of ovine plasma samples.
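Quantification against a calibration curve of this sort amounts to a linear fit of peak area against spiked amount, followed by back-calculation for unknowns. The sketch below shows only that arithmetic; the standards, peak areas, and the unknown sample are hypothetical.

```python
import numpy as np

# Hypothetical calibration standards: amount of peptide spiked into plasma (fmol)
# versus LC-MS/MS peak area of the m/z 312.2 product ion.
amount = np.array([1, 5, 10, 50, 100, 500], dtype=float)
area = np.array([210, 1050, 2090, 10400, 20800, 104500], dtype=float)

slope, intercept = np.polyfit(amount, area, 1)   # linear calibration curve
r2 = np.corrcoef(amount, area)[0, 1] ** 2

unknown_area = 6300.0                            # peak area of a hypothetical sample
estimated_amount = (unknown_area - intercept) / slope
print(f"slope={slope:.1f}, intercept={intercept:.1f}, r2={r2:.4f}, "
      f"unknown ≈ {estimated_amount:.1f} fmol")
```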
Alam, Maksudul; Deng, Xinwei; Philipson, Casandra; Bassaganya-Riera, Josep; Bisset, Keith; Carbo, Adria; Eubank, Stephen; Hontecillas, Raquel; Hoops, Stefan; Mei, Yongguo; Abedi, Vida; Marathe, Madhav
2015-01-01
Agent-based models (ABM) are widely used to study immune systems, providing a procedural and interactive view of the underlying system. The interaction of components and the behavior of individual objects is described procedurally as a function of the internal states and the local interactions, which are often stochastic in nature. Such models typically have complex structures and consist of a large number of modeling parameters. Determining the key modeling parameters which govern the outcomes of the system is very challenging. Sensitivity analysis plays a vital role in quantifying the impact of modeling parameters in massively interacting systems, including large complex ABM. The high computational cost of executing simulations impedes running experiments with exhaustive parameter settings. Existing techniques of analyzing such a complex system typically focus on local sensitivity analysis, i.e. one parameter at a time, or a close “neighborhood” of particular parameter settings. However, such methods are not adequate to measure the uncertainty and sensitivity of parameters accurately because they overlook the global impacts of parameters on the system. In this article, we develop novel experimental design and analysis techniques to perform both global and local sensitivity analysis of large-scale ABMs. The proposed method can efficiently identify the most significant parameters and quantify their contributions to outcomes of the system. We demonstrate the proposed methodology for ENteric Immune SImulator (ENISI), a large-scale ABM environment, using a computational model of immune responses to Helicobacter pylori colonization of the gastric mucosa. PMID:26327290
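The local-versus-global distinction the authors draw can be illustrated on a cheap toy function standing in for an expensive ABM run: one-at-a-time finite differences at a nominal point versus a crude variance-based first-order index estimated from random samples. Everything in the sketch below (the toy model, the uniform parameter ranges, the binning estimator) is an illustrative assumption, not the ENISI workflow.

```python
import numpy as np

rng = np.random.default_rng(0)

def model(x):
    """Cheap toy stand-in for an expensive ABM outcome (non-linear, with an interaction)."""
    x1, x2, x3 = x[..., 0], x[..., 1], x[..., 2]
    return np.sin(x1) + 0.3 * x2 ** 2 + 2.0 * x1 * x3

# Local (one-at-a-time) sensitivity: central finite differences at a nominal point.
x0, h = np.array([0.5, 0.5, 0.5]), 1e-4
local = np.array([(model(x0 + h * np.eye(3)[i]) - model(x0 - h * np.eye(3)[i])) / (2 * h)
                  for i in range(3)])

# Crude global first-order indices: variance of bin-wise conditional means / total variance.
X = rng.uniform(-1.0, 1.0, size=(20000, 3))
Y = model(X)

def first_order(xi, y, bins=20):
    edges = np.quantile(xi, np.linspace(0, 1, bins + 1))
    idx = np.clip(np.digitize(xi, edges[1:-1]), 0, bins - 1)
    cond_means = np.array([y[idx == b].mean() for b in range(bins)])
    return cond_means.var() / y.var()

global_s1 = [first_order(X[:, i], Y) for i in range(3)]
print("local slopes     :", np.round(local, 3))
print("global 1st-order :", np.round(global_s1, 3))
# The two analyses can rank parameters differently, which is the argument for
# complementing local screening with a global design.
```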
Heden, Timothy D; Liu, Ying; Kearney, Monica L; Kanaley, Jill A
2014-05-01
Obesity and high-fructose corn syrup (HFCS)-sweetened beverages are associated with an increased risk of chronic disease, but it is not clear whether obese (Ob) individuals are more susceptible to the detrimental effects of HFCS-sweetened beverages. The purpose of this study was to examine the endocrine and metabolic effects of consuming HFCS-sweetened beverages, and whether weight classification (normal weight (NW) vs. Ob) influences these effects. Ten NW and 10 Ob men and women who habitually consumed ≤355 mL per day of sugar-sweetened beverages were included in this study. Initially, the participants underwent a 4-h mixed-meal test after a 12-h overnight fast to assess insulin sensitivity, pancreatic and gut endocrine responses, insulin secretion and clearance, and glucose, triacylglycerol, and cholesterol responses. Next, the participants consumed their normal diet ad libitum, with 1065 mL per day (117 g·day(-1)) of HFCS-sweetened beverages added for 2 weeks. After the intervention, the participants repeated the mixed-meal test. HFCS-sweetened beverages did not significantly alter body weight, insulin sensitivity, insulin secretion or clearance, or endocrine, glucose, lipid, or cholesterol responses in either NW or Ob individuals. Regardless of previous diet, Ob individuals, compared with NW individuals, had ∼28% lower physical activity levels, 6%-9% lower insulin sensitivity, 12%-16% lower fasting high-density-lipoprotein cholesterol concentrations, 84%-144% greater postprandial triacylglycerol concentrations, and 46%-79% greater postprandial insulin concentrations. Greater insulin responses were associated with reduced insulin clearance, and there were no differences in insulin secretion. These findings suggest that weight classification does not influence the short-term endocrine and metabolic effects of HFCS-sweetened beverages.
Food Waste in the Food-Energy-Water Nexus: Energy and Water Footprints of Wasted Food
NASA Astrophysics Data System (ADS)
Kibler, K. M.; Sarker, T.; Reinhart, D.
2016-12-01
The impact of wasted food on the food-energy-water (FEW) nexus is not well conceptualized or quantified, and is thus poorly understood. While improved understanding of water and energy requirements for food production may be applied to estimate costs associated with production of wasted food, the post-disposal costs of food waste to energy and water sectors are unknown. We apply both theoretical methods and direct observation of landfill leachate composition to quantify the net energy and water impact of food waste that is disposed in landfills. We characterize necessary energy inputs and biogas production to compute net impact to the energy sector. With respect to water, we quantify the volumes of water needed to attain permitted discharge concentrations of treated leachate, as well as the gray water footprint necessary for waste assimilation to the ambient regulatory standard. We find that approximately three times the energy produced as biogas (4.6E+8 kWh) is consumed in managing food waste and treating contamination from wasted food (1.3E+9 kWh). This energy requirement represents around 3% of the energy consumed in food production. The water requirement for leachate treatment and assimilation may exceed the amount of water needed to produce food. While not a consumptive use, the existence and replenishment of sufficient quantities of water in the environment for waste assimilation is an ecosystem service of the hydrosphere. This type of analysis may be applied to create water quality-based standards for necessary instream flows to perform the ecosystem service of waste assimilation. Clearer perception of wasted food as a source/sink for energy and water within the FEW nexus could be a powerful approach towards reducing the quantities of wasted food and more efficiently managing food that is wasted. For instance, comparative analysis of FEW impact across waste management strategies (e.g. landfilling, composting, anaerobic digestion) may assist local governments in developing integrated waste and water management strategies.
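The headline energy figures allow a quick back-of-envelope check of the reported ratio; the snippet below simply re-derives the "approximately three times" statement from the two totals given in the abstract.

```python
# Back-of-envelope check of the reported energy balance (kWh per year).
biogas_recovered = 4.6e8           # energy produced as landfill biogas
management_and_treatment = 1.3e9   # energy spent managing waste and treating leachate

net_deficit = management_and_treatment - biogas_recovered
ratio = management_and_treatment / biogas_recovered
print(f"net deficit: {net_deficit:.2e} kWh/yr, consumed/produced ratio: {ratio:.1f}")
# -> ratio ≈ 2.8, consistent with "approximately three times".
```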
Phthalate Metabolites, Consumer Habits and Health Effects
Wallner, Peter; Kundi, Michael; Hohenblum, Philipp; Scharf, Sigrid; Hutter, Hans-Peter
2016-01-01
Phthalates are multifunctional chemicals used in a wide variety of consumer products. The aim of this study was to investigate whether levels of urinary phthalate metabolites in urine samples of Austrian mothers and their children were associated with consumer habits and health indicators. Within an Austrian biomonitoring survey, urine samples from 50 mother-child pairs of five communities (two-stage random stratified sampling) were analysed. The concentrations of 14 phthalate metabolites were determined, and a questionnaire was administered. Monoethyl phthalate (MEP), mono-n-butyl phthalate (MnBP), mono-isobutyl phthalate (MiBP), monobenzyl phthalate (MBzP), mono-(2-ethylhexyl) phthalate (MEHP), mono-(2-ethyl-5-hydroxyhexyl) phthalate (5OH-MEHP), mono-(2-ethyl-5-oxohexyl) phthalate (5oxo-MEHP), mono-(5-carboxy-2-ethylpentyl) phthalate (5cx-MEPP), and 3-carboxy-mono-propyl phthalate (3cx-MPP) could be quantified in the majority of samples. Significant correlations were found between the use of hair mousse, hair dye, makeup, chewing gum, polyethylene terephthalate (PET) bottles and the diethyl phthalate (DEP) metabolite MEP. With regard to health effects, significant associations of MEP in urine with headache, repeated coughing, diarrhoea, and hormonal problems were observed. MBzP was associated with repeated coughing and MEHP was associated with itching. PMID:27428989
The protozooplankton-ichthyoplankton trophic link: an overlooked aspect of aquatic food webs.
Montagnes, David J S; Dower, John F; Figueiredo, Gisela M
2010-01-01
Since the introduction of the microbial loop concept, awareness of the role played by protozooplankton in marine food webs has grown. By consuming bacteria, and then being consumed by metazooplankton, protozoa form a trophic link that channels dissolved organic material into the "classic" marine food chain. Beyond enhancing energy transfer to higher trophic levels, protozoa play a key role in improving the food quality of metazooplankton. Here, we consider a third role played by protozoa, but one that has received comparatively little attention: that as prey items for ichthyoplankton. For >100 years it has been known that fish larvae consume protozoa. Despite this, fisheries scientists and biological oceanographers still largely ignore protozoa when assessing the foodweb dynamics that regulate the growth and survival of larval fish. We review evidence supporting the importance of the protozooplankton-ichthyoplankton link, including examples from the amateur aquarium trade, the commercial aquaculture industry, and contemporary studies of larval fish. We then consider why this potentially important link continues to receive very little attention. We conclude by offering suggestions for quantifying the importance of the protozooplankton-ichthyoplankton trophic link, using both existing methods and new technologies.
Effects of food processing on pesticide residues in fruits and vegetables: a meta-analysis approach.
Keikotlhaile, B M; Spanoghe, P; Steurbaut, W
2010-01-01
Pesticides are widely used in food production to increase food security despite the fact that they can have negative health effects on consumers. Pesticide residues have been found in various fruits and vegetables; both raw and processed. One of the most common routes of pesticide exposure in consumers is via food consumption. Most foods are consumed after passing through various culinary and processing treatments. A few literature reviews have indicated the general trend of reduction or concentration of pesticide residues by certain methods of food processing for a particular active ingredient. However, no review has focused on combining the obtained results from different studies on different active ingredients with differences in experimental designs, analysts and analysis equipment. In this paper, we present a meta-analysis of response ratios as a possible method of combining and quantifying effects of food processing on pesticide residue levels. Reduction of residue levels was indicated by blanching, boiling, canning, frying, juicing, peeling and washing of fruits and vegetables with an average response ratio ranging from 0.10 to 0.82. Baking, boiling, canning and juicing indicated both reduction and increases for the 95% and 99.5% confidence intervals. Copyright 2009 Elsevier Ltd. All rights reserved.
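The response-ratio approach amounts to pooling log-ratios across studies with inverse-variance weights. Below is a minimal fixed-effect sketch; the ratios and standard errors are invented placeholders, not values from the meta-analysis, and a random-effects model would add a between-study variance term without changing the pooling logic.

```python
import numpy as np

# Hypothetical per-study inputs: response ratio (residue after / before processing)
# and an approximate standard error of ln(RR) for each study.
rr = np.array([0.35, 0.60, 0.82, 0.10, 0.45])
se_log = np.array([0.10, 0.08, 0.12, 0.15, 0.09])

log_rr = np.log(rr)
w = 1.0 / se_log ** 2                       # inverse-variance weights
pooled = np.sum(w * log_rr) / np.sum(w)     # fixed-effect pooled ln(RR)
se_pooled = np.sqrt(1.0 / np.sum(w))
ci = np.exp(pooled + np.array([-1.96, 1.96]) * se_pooled)

print(f"pooled RR = {np.exp(pooled):.2f}, 95% CI = {ci.round(2)}")
# RR < 1 indicates residue reduction by the processing step.
```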
Sleep and the use of energy products in a combat environment.
Waits, Wendi M; Ganz, Michael B; Schillreff, Theresa; Dell, Peter J
2014-01-01
The use of energy products appears to be widespread among deployed personnel, presumably to combat fatigue and sleep deprivation. However, these products have been associated with unpleasant side effects and adverse events, including insomnia, mood swings, fatigue, cardiac arrest, and even death. To quantify the sleep habits and energy products used among deployed service members in Afghanistan from 2010-2011. Participants completed an anonymous survey querying their demographic information, sleep habits, combat exposure, and energy product use. Respondent data: 83% experienced some degree of insomnia; 28% were using a prescription or over-the-counter sleep aid; 81% reported using at least one energy product daily. The most frequently consumed energy products were caffeinated coffee and soda. Only 4 energy products were used more frequently during deployment than prior to deployment: Rip-It, Tiger, Hydroxycut, and energy drink powders. On average, respondents who increased their use consumed only 2 more servings per week during deployment than they had prior to deployment. Only degree of combat exposure, not quantity of energy products consumed, predicted degree of insomnia. Energy product consumption by service members during deployment was not dramatically different than predeployment and was not associated with insomnia.
Community trait overdispersion due to trophic interactions: concerns for assembly process inference
Petchey, Owen L.
2016-01-01
The expected link between competitive exclusion and community trait overdispersion has been used to infer competition in local communities, and trait clustering has been interpreted as habitat filtering. Such community assembly process inference has received criticism for ignoring trophic interactions, as competition and trophic interactions might create similar trait patterns. While other theoretical studies have generally demonstrated the importance of predation for coexistence, ours provides the first quantitative demonstration of such effects on assembly process inference, using a trait-based ecological model to simulate the assembly of a competitive primary consumer community with and without the influence of trophic interactions. We quantified and contrasted trait dispersion/clustering of the competitive communities with the absence and presence of secondary consumers. Trophic interactions most often decreased trait clustering (i.e. increased dispersion) in the competitive communities due to evenly distributed invasions of secondary consumers and subsequent competitor extinctions over trait space. Furthermore, effects of trophic interactions were somewhat dependent on model parameters and clustering metric. These effects create considerable problems for process inference from trait distributions; one potential solution is to use more process-based and inclusive models in inference. PMID:27733548
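Clustering versus overdispersion is typically scored as a standardized effect size of a trait-dispersion metric against a null model. The sketch below uses mean nearest-neighbour distance on one trait axis with a uniform-random null; the community values, the metric, and the null model are all illustrative choices rather than those used in the study.

```python
import numpy as np

rng = np.random.default_rng(1)

def nn_dispersion(traits):
    """Mean nearest-neighbour distance along a single trait axis; larger values
    suggest overdispersion, smaller values suggest clustering."""
    t = np.sort(np.asarray(traits, dtype=float))
    gaps = np.diff(t)
    nn = np.minimum(np.r_[gaps, np.inf], np.r_[np.inf, gaps])  # nearest gap per species
    return nn.mean()

community = [0.05, 0.30, 0.33, 0.61, 0.90]   # hypothetical trait values for 5 species
null = [nn_dispersion(rng.uniform(0, 1, len(community))) for _ in range(999)]
ses = (nn_dispersion(community) - np.mean(null)) / np.std(null)
print(f"SES of mean NN distance: {ses:.2f}  (positive -> dispersed, negative -> clustered)")
```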
The Extent of Consumer Product Involvement in Paediatric Injuries
Catchpoole, Jesani; Walker, Sue; Vallmuur, Kirsten
2016-01-01
A challenge in utilising health sector injury data for Product Safety purposes is that clinically coded data have limited ability to inform regulators about product involvement in injury events, given data entry is bound by a predefined set of codes. Text narratives collected in emergency departments can potentially address this limitation by providing relevant product information with additional accompanying context. This study aims to identify and quantify consumer product involvement in paediatric injuries recorded in emergency department-based injury surveillance data. A total of 7743 paediatric injuries were randomly selected from Queensland Injury Surveillance Unit database and associated text narratives were manually reviewed to determine product involvement in the injury event. A Product Involvement Factor classification system was used to categorise these injury cases. Overall, 44% of all reviewed cases were associated with consumer products, with proximity factor (25%) being identified as the most common involvement of a product in an injury event. Only 6% were established as being directly due to the product. The study highlights the importance of utilising injury data to inform product safety initiatives where text narratives can be used to identify the type and involvement of products in injury cases. PMID:27399744