Sample records for the query "quantify previously inaccessible"

  1. Emperor penguins nesting on Inaccessible Island

    USGS Publications Warehouse

    Jonkel, G.M.; Llano, G.A.

    1975-01-01

    Emperor penguins were observed nesting on Inaccessible I. during the 1973 winter. This is the southernmost nesting of emperor penguins thus far recorded; it also could be the first record of emperors attempting to start a new rookery. This site, however, may have been used by emperors in the past. The closest reported nesting of these penguins to Inaccessible I. is on the Ross Ice Shelf east of Cape Crozier. With the exception of the Inaccessible I. record, there is little evidence that emperor penguins breed in McMurdo Sound proper.

  2. Out of Reach, Out of Mind? Infants' Comprehension of References to Hidden Inaccessible Objects.

    PubMed

    Osina, Maria A; Saylor, Megan M; Ganea, Patricia A

    2017-09-01

    This study investigated the nature of infants' difficulty understanding references to hidden inaccessible objects. Twelve-month-old infants (N = 32) responded to the mention of objects by looking at, pointing at, or approaching them when the referents were visible or accessible, but not when they were hidden and inaccessible (Experiment I). Twelve-month-olds (N = 16) responded robustly when a container with the hidden referent was moved from a previously inaccessible position to an accessible position before the request, but failed to respond when the reverse occurred (Experiment II). This suggests that infants might be able to track the hidden object's dislocations and update its accessibility as it changes. Knowing the hidden object is currently inaccessible inhibits their responding. Older, 16-month-old (N = 17) infants' performance was not affected by object accessibility. © 2016 The Authors. Child Development © 2016 Society for Research in Child Development, Inc.

  3. Cytoskeleton reorganization/disorganization is a key feature of induced inaccessibility for defence to successive pathogen attacks.

    PubMed

    Moral, Juan; Montilla-Bascón, Gracia; Canales, Francisco J; Rubiales, Diego; Prats, Elena

    2017-06-01

    In this work, we investigated the involvement of the long-term dynamics of cytoskeletal reorganization on the induced inaccessibility phenomenon by which cells that successfully defend against a previous fungal attack become highly resistant to subsequent attacks. This was performed on pea through double inoculation experiments using inappropriate (Blumeria graminis f. sp. avenae, Bga) and appropriate (Erysiphe pisi, Ep) powdery mildew fungi. Pea leaves previously inoculated with Bga showed a significant reduction of later Ep infection relative to leaves inoculated only with Ep, indicating that cells had developed induced inaccessibility. This reduction in Ep infection was higher when the time interval between Bga and Ep inoculation ranged between 18 and 24 h, although increased penetration resistance in co-infected cells was observed even with time intervals of 24 days between inoculations. Interestingly, this increase in resistance to Ep following successful defence to the inappropriate Bga was associated with an increase in actin microfilament density that reached a maximum at 18-24 h after Bga inoculation and very slowly decreased afterwards. The putative role of cytoskeleton reorganization/disorganization leading to inaccessibility is supported by the suppression of the induced resistance mediated by specific actin (cytochalasin D, latrunculin B) or general protein (cycloheximide) inhibitors. © 2016 BSPP AND JOHN WILEY & SONS LTD.

  4. Asian elephants acquire inaccessible food by blowing.

    PubMed

    Mizuno, Kaori; Irie, Naoko; Hiraiwa-Hasegawa, Mariko; Kutsukake, Nobuyuki

    2016-01-01

    Many animals acquire otherwise inaccessible food with the aid of sticks and occasionally water. As an exception, some reports suggest that elephants manipulate breathing through their trunks to acquire inaccessible food. Here, we report on two female Asian elephants (Elephas maximus) in Kamine Zoo, Japan, who regularly blew to drive food within their reach. We experimentally investigated this behaviour by placing foods in inaccessible places. The elephants blew the food until it came within accessible range. Once the food was within range, the elephants were increasingly less likely to blow as the distance to the food became shorter. One subject manipulated her blowing duration based on food distance: longer when the food was distant. These results suggest that the elephants used their breath to achieve goals: that is, they used it not only to retrieve the food but also to fine-tune the food position for easy grasping. We also observed individual differences in the elephants' aptitude for this technique, which altered the efficiency of food acquisition. Thus, we added a new example of spontaneous behaviour for achieving a goal in animals. The use of breath to drive food is probably unique to elephants, with their dexterous trunks and familiarity with manipulating the act of blowing, which is commonly employed for self-comfort and acoustic communication.

  5. 49 CFR 214.327 - Inaccessible track.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... the control of the switch, and (ii) The control operator has notified the roadway worker who has..., DEPARTMENT OF TRANSPORTATION RAILROAD WORKPLACE SAFETY Roadway Worker Protection § 214.327 Inaccessible track... effective securing device by the roadway worker in charge of the working limits; (3) A discontinuity in the...

  6. Atomic resolution mechanism of ligand binding to a solvent inaccessible cavity in T4 lysozyme

    PubMed Central

    Ahalawat, Navjeet; Pandit, Subhendu; Kay, Lewis E.

    2018-01-01

    Ligand binding sites in proteins are often localized to deeply buried cavities, inaccessible to bulk solvent. Yet, in many cases binding of cognate ligands occurs rapidly. An intriguing system is presented by the L99A cavity mutant of T4 Lysozyme (T4L L99A), which rapidly binds benzene (~10⁶ M⁻¹ s⁻¹). Although the protein has long served as a model system for protein thermodynamics, and crystal structures of both free and benzene-bound T4L L99A are available, the kinetic pathways by which benzene reaches its solvent-inaccessible binding cavity remain elusive. The current work, using extensive molecular dynamics simulation, achieves this by capturing the complete process of spontaneous recognition of benzene by T4L L99A at atomistic resolution. A series of multi-microsecond unbiased molecular dynamics simulation trajectories unequivocally reveal how benzene, starting in bulk solvent, diffuses to the protein and spontaneously reaches the solvent-inaccessible cavity of T4L L99A. The simulated and high-resolution X-ray derived bound structures are in excellent agreement. A robust four-state Markov model, developed using cumulative 60 μs trajectories, identifies and quantifies multiple ligand binding pathways with low activation barriers. Interestingly, none of these identified binding pathways required large conformational changes for ligand access to the buried cavity. Rather, they involve transient but crucial opening of a channel to the cavity via subtle displacements in the positions of key helices (helix4/helix6, helix7/helix9) leading to rapid binding. Free energy simulations further elucidate that these channel-opening events would have been unfavorable in wild-type T4L. Taken together with results from experiments, these simulations provide unprecedented mechanistic insights into the complete ligand recognition process in a buried cavity, illustrating the power of subtle helix movements in opening up multiple pathways for ligand access.
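    For orientation, the four-state Markov model mentioned above is the kind of object that can be estimated from a clustered (discretized) trajectory in a few lines. The sketch below is a generic illustration only, not the authors' analysis pipeline; the toy state sequence, the lag time, and the symmetrization choice are all hypothetical assumptions.

      import numpy as np

      def estimate_msm(dtraj, n_states, lag):
          """Estimate a Markov state model from a discretized trajectory.

          dtraj    : 1-D integer array of state labels over time
          n_states : number of discrete (clustered) states
          lag      : lag time in frames
          """
          # Count transitions i -> j separated by `lag` frames
          counts = np.zeros((n_states, n_states))
          for i, j in zip(dtraj[:-lag], dtraj[lag:]):
              counts[i, j] += 1
          # Symmetrize the counts to enforce detailed balance (a simple, common choice)
          counts = 0.5 * (counts + counts.T)
          # Row-normalize to obtain the transition probability matrix
          T = counts / counts.sum(axis=1, keepdims=True)
          # Stationary distribution = left eigenvector of T with eigenvalue ~1
          evals, evecs = np.linalg.eig(T.T)
          pi = np.real(evecs[:, np.argmax(np.real(evals))])
          pi /= pi.sum()
          # Implied timescales from the remaining eigenvalues
          lam = np.sort(np.real(np.linalg.eigvals(T)))[::-1][1:]
          timescales = -lag / np.log(np.clip(lam, 1e-12, 1 - 1e-12))
          return T, pi, timescales

      # Toy 4-state trajectory standing in for clustered MD frames
      rng = np.random.default_rng(0)
      toy_dtraj = rng.integers(0, 4, size=5000)
      T, pi, ts = estimate_msm(toy_dtraj, n_states=4, lag=10)
      print(T.round(2), pi.round(3), ts.round(1))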

  7. 30 CFR 75.389 - Mining into inaccessible areas.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 30 Mineral Resources 1 2010-07-01 2010-07-01 false Mining into inaccessible areas. 75.389 Section 75.389 Mineral Resources MINE SAFETY AND HEALTH ADMINISTRATION, DEPARTMENT OF LABOR COAL MINE SAFETY AND HEALTH MANDATORY SAFETY STANDARDS-UNDERGROUND COAL MINES Ventilation § 75.389 Mining into...

  8. 30 CFR 75.389 - Mining into inaccessible areas.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 30 Mineral Resources 1 2011-07-01 2011-07-01 false Mining into inaccessible areas. 75.389 Section 75.389 Mineral Resources MINE SAFETY AND HEALTH ADMINISTRATION, DEPARTMENT OF LABOR COAL MINE SAFETY AND HEALTH MANDATORY SAFETY STANDARDS-UNDERGROUND COAL MINES Ventilation § 75.389 Mining into...

  9. Monitoring small reservoirs' storage with satellite remote sensing in inaccessible areas

    NASA Astrophysics Data System (ADS)

    Avisse, Nicolas; Tilmant, Amaury; François Müller, Marc; Zhang, Hua

    2017-12-01

    In river basins with water storage facilities, the availability of regularly updated information on reservoir level and capacity is of paramount importance for the effective management of those systems. However, for the vast majority of reservoirs around the world, storage levels are either not measured or not readily available due to financial, political, or legal considerations. This paper proposes a novel approach using Landsat imagery and digital elevation models (DEMs) to retrieve information on storage variations in any inaccessible region. Unlike existing approaches, the method does not require any in situ measurement and is appropriate for monitoring small, and often undocumented, irrigation reservoirs. It consists of three recovery steps: (i) a 2-D dynamic classification of Landsat spectral band information to quantify the surface area of water, (ii) a statistical correction of DEM data to characterize the topography of each reservoir, and (iii) a 3-D reconstruction algorithm to correct for clouds and Landsat 7 Scan Line Corrector failure. The method is applied to quantify reservoir storage in the Yarmouk basin in southern Syria, where ground monitoring is impeded by the ongoing civil war. It is validated against available in situ measurements in neighbouring Jordanian reservoirs. Coefficients of determination range from 0.69 to 0.84, and the normalized root-mean-square error from 10 to 16% for storage estimations on six Jordanian reservoirs with maximal water surface areas ranging from 0.59 to 3.79 km².
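    As a rough illustration of the area-to-storage step in the approach above, the sketch below converts a binary water mask (as would come from a Landsat classification) into a surface area and interpolates storage from a reservoir-specific area-volume curve. The pixel size, mask, and curve are hypothetical placeholders; the paper's dynamic classification, DEM correction, and 3-D reconstruction steps are not reproduced here.

      import numpy as np

      def water_surface_area(water_mask, pixel_size_m=30.0):
          """Surface area (km^2) from a binary water mask (e.g., a classified Landsat scene)."""
          return water_mask.sum() * (pixel_size_m ** 2) / 1e6

      def storage_from_area(area_km2, curve_area_km2, curve_volume_mcm):
          """Interpolate storage (million m^3) from a reservoir-specific
          area-volume curve, e.g., one derived from a bias-corrected DEM."""
          return np.interp(area_km2, curve_area_km2, curve_volume_mcm)

      # Hypothetical example: 30 m pixels and a small reservoir's area-volume curve
      mask = np.zeros((200, 200), dtype=bool)
      mask[50:100, 60:120] = True                           # stands in for classified water pixels
      curve_area = np.array([0.0, 1.0, 2.0, 3.0, 4.0])      # km^2
      curve_volume = np.array([0.0, 1.5, 4.0, 8.0, 13.0])   # million m^3
      area = water_surface_area(mask)
      print(round(area, 2), round(float(storage_from_area(area, curve_area, curve_volume)), 2))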

  10. Inaccessibility of reinforcement increases persistence and signaling behavior in the fox squirrel (Sciurus niger).

    PubMed

    Delgado, Mikel M; Jacobs, Lucia F

    2016-05-01

    Under natural conditions, wild animals encounter situations where previously rewarded actions do not lead to reinforcement. In the laboratory, a surprising omission of reinforcement induces behavioral and emotional responses described as frustration. Frustration can lead to aggressive behaviors and to the persistence of noneffective responses, but it may also lead to new behavioral responses to a problem, a potential adaptation. We assessed the responses to inaccessible reinforcement in free-ranging fox squirrels (Sciurus niger). We trained squirrels to open a box to obtain food reinforcement, a piece of walnut. After 9 training trials, squirrels were tested in 1 of 4 conditions: a control condition with the expected reward, an alternative reinforcement (a piece of dried corn), an empty box, or a locked box. We measured the presence of signals suggesting arousal (e.g., tail flags and tail twitches) and found that squirrels performed fewer of these behaviors in the control condition and increased certain behaviors (tail flags, biting box) in the locked box condition, compared to other experimental conditions. When faced with nonreinforcement, that is, frustration, squirrels increased the number of interactions with the apparatus and spent more time interacting with the apparatus. This study of frustration responses in a free-ranging animal extends the conclusions of captive studies to the field and demonstrates that fox squirrels show short-term negatively valenced responses to the inaccessibility, omission, and change of reinforcement. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  11. Buccal swab as a reliable predictor for X inactivation ratio in inaccessible tissues.

    PubMed

    de Hoon, Bas; Monkhorst, Kim; Riegman, Peter; Laven, Joop S E; Gribnau, Joost

    2015-11-01

    As a result of the epigenetic phenomenon of X chromosome inactivation (XCI) every woman is a mosaic of cells with either an inactive paternal X chromosome or an inactive maternal X chromosome. The ratio between inactive paternal and maternal X chromosomes is different for every female individual, and can influence an X-encoded trait or disease. A multitude of X linked conditions is known, and for many of them it is recognised that the phenotype in affected female carriers of the causative mutation is modulated by the XCI ratio. To predict disease severity an XCI ratio is usually determined in peripheral blood samples. However, the correlation between XCI ratios in peripheral blood and disease affected tissues, that are often inaccessible, is poorly understood. Here, we tested several tissues obtained from autopsies of 12 female individuals for patch size and XCI ratio. XCI ratios were analysed using methyl-sensitive PCR-based assays for the AR, PCSK1N and SLITRK4 loci. XCI patch size was analysed by testing the XCI ratio of tissue samples with decreasing size. XCI patch size was analysed for liver, muscle, ovary and brain samples and was found too small to confound testing for XCI ratio in these tissues. XCI ratios were determined in the easily accessible tissues, blood, buccal epithelium and hair follicle, and compared with ratios in several inaccessible tissues. Buccal epithelium is preferable over peripheral blood for predicting XCI ratios of inaccessible tissues. Ovary is the only inaccessible tissue showing a poor correlation to blood and buccal epithelium, but has a good correlation to hair follicle instead. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.

  12. Simulated in vivo Electrophysiology Experiments Provide Previously Inaccessible Insights into Visual Physiology

    PubMed Central

    Quiroga, Maria del Mar; Price, Nicholas SC

    2016-01-01

    Lecture content and practical laboratory classes are ideally complementary. However, the types of experiments that have led to our detailed understanding of sensory neuroscience are often not amenable to classroom experimentation as they require expensive equipment, time-consuming surgeries, specialized experimental techniques, and the use of animals. While sometimes feasible in small group teaching, these experiments are not suitable for large cohorts of students. Previous attempts to expose students to sensory neuroscience experiments include: the use of electrophysiology preparations in invertebrates, data-driven simulations that do not replicate the experience of conducting an experiment, or simply observing an experiment in a research laboratory. We developed an online simulation of a visual neuroscience experiment in which extracellular recordings are made from a motion sensitive neuron. Students have control over stimulation parameters (direction and contrast) and can see and hear the action potential responses to stimuli as they are presented. The simulation provides an intuitive way for students to gain insight into neurophysiology, including experimental design, data collection and data analysis. Our simulation allows large cohorts of students to cost-effectively “experience” the results of animal research without ethical concerns, to be exposed to realistic data variability, and to develop their understanding of how sensory neuroscience experiments are conducted. PMID:27980465

  13. 16 CFR 1500.87 - Children's products containing lead: inaccessible component parts.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... child. (c) Section 101(b)(2)(B) of the CPSIA directs the Commission to promulgate by August 14, 2009... components will be considered to be inaccessible. (d) The accessibility probes specified for sharp points or... regulations at 16 CFR 1500.50 and 16 CFR 1500.51 (excluding the bite test of § 1500.51(c)), will be used to...

  14. Salvage of inaccessible arteriovenous fistulas in obese patients: a review of 132 brachiocephalic fistulas.

    PubMed

    Stoikes, Nathaniel; Nezakatgoo, Nosratollah; Fischer, Peter; Bahr, Michael; Magnotti, Louis

    2009-08-01

    The two main factors leading to a functional fistula are maturity and accessibility. The aim of this review was to describe a technique of superficialization for inaccessible brachiocephalic fistulas, and to identify the patients that benefit from superficialization. One hundred and thirty-two brachiocephalic arteriovenous fistulas developed from November 2003 to December 2006 were reviewed for primary maturation. In the mature group, patients were evaluated for fistula accessibility. Inaccessible fistulas were selected for superficialization via our technique of vein mobilization using small skip incisions. Analysis of superficialized and nonsuperficialized groups included age, demographics, and comorbidities. Ninety-nine patients were in the mature group, and 33 in the immature group; primary nonmaturation was 25 per cent. Analysis within the mature group was between nonsuperficialized (n = 81) and superficialized (n = 18) patients. The superficialized group had less hypertension (83% vs 98%, P < 0.05), significantly higher BMI (31 vs 27, P < 0.05), and was mostly female (78% vs 49%, P < 0.05). All superficialized fistulas accommodated successful hemodialysis postoperatively. To conclude, patients with mature but inaccessible fistulas were salvaged by superficialization. This population had significantly higher BMI, less hypertension, and female prevalence. Identifying these patients is important because salvage of their fistula can prevent premature progression to alternate autogenous arteriovenous access procedures.

  15. 77 FR 45297 - Children's Toys and Child Care Articles Containing Phthalates; Proposed Guidance on Inaccessible...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-31

    ... CONSUMER PRODUCT SAFETY COMMISSION 16 CFR Part 1199 [Docket No. CPSC-2012-0040] Children's Toys... containing phthalates does not apply to any component part of children's toys or child care articles that is... guidance on inaccessible component parts in children's toys or child care articles subject to section 108...

  16. 78 FR 10503 - Children's Toys and Child Care Articles Containing Phthalates; Final Guidance on Inaccessible...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-02-14

    ... and Child Care Articles Containing Phthalates; Final Guidance on Inaccessible Component Parts AGENCY... does not apply to any component part of children's toys or child care articles that is not accessible... parts in children's toys or child care articles subject to section 108 of the CPSIA. DATES: This rule is...

  17. 14 CFR 382.57 - What services must carriers provide if their automated kiosks are inaccessible?

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 14 Aeronautics and Space 4 2010-01-01 2010-01-01 false What services must carriers provide if their automated kiosks are inaccessible? 382.57 Section 382.57 Aeronautics and Space OFFICE OF THE... BASIS OF DISABILITY IN AIR TRAVEL Accessibility of Airport Facilities § 382.57 What services must...

  18. 14 CFR 382.57 - What services must carriers provide if their automated kiosks are inaccessible?

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 14 Aeronautics and Space 4 2011-01-01 2011-01-01 false What services must carriers provide if their automated kiosks are inaccessible? 382.57 Section 382.57 Aeronautics and Space OFFICE OF THE... BASIS OF DISABILITY IN AIR TRAVEL Accessibility of Airport Facilities § 382.57 What services must...

  19. 14 CFR 382.57 - What services must carriers provide if their automated kiosks are inaccessible?

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 14 Aeronautics and Space 4 2013-01-01 2013-01-01 false What services must carriers provide if their automated kiosks are inaccessible? 382.57 Section 382.57 Aeronautics and Space OFFICE OF THE... BASIS OF DISABILITY IN AIR TRAVEL Accessibility of Airport Facilities § 382.57 What services must...

  20. 14 CFR 382.57 - What services must carriers provide if their automated kiosks are inaccessible?

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 14 Aeronautics and Space 4 2012-01-01 2012-01-01 false What services must carriers provide if their automated kiosks are inaccessible? 382.57 Section 382.57 Aeronautics and Space OFFICE OF THE... BASIS OF DISABILITY IN AIR TRAVEL Accessibility of Airport Facilities § 382.57 What services must...

  21. Investigating flood susceptible areas in inaccessible regions using remote sensing and geographic information systems.

    PubMed

    Lim, Joongbin; Lee, Kyoo-Seock

    2017-03-01

    Every summer, North Korea (NK) suffers from floods, resulting in decreased agricultural production and huge economic losses. Besides meteorological reasons, several factors can accelerate flood damage. Environmental studies about NK are difficult because NK is inaccessible due to the division of Korea. Remote sensing (RS) can be used to delineate flood-inundated areas in inaccessible regions such as NK. The objective of this study was to investigate the spatial characteristics of flood susceptible areas (FSAs) using multi-temporal RS data and digital elevation model data. Such a study will provide basic information for restoring FSAs after reunification. Defining FSAs at the study site revealed that rice paddies with low elevation and low slope were the areas most susceptible to flooding in NK. Numerous sediments from upper streams, especially streams through crop field areas on steeply sloped hills, might have been transported and deposited into stream channels, thus disturbing water flow. In conclusion, NK floods may have occurred not only due to meteorological factors but also due to inappropriate land use for flood management. In order to mitigate NK flood damage, reforestation is needed for terraced crop fields. In addition, the drainage capacity of middle stream channels near rice paddies should be improved.

  22. Increasing value and reducing waste: addressing inaccessible research.

    PubMed

    Chan, An-Wen; Song, Fujian; Vickers, Andrew; Jefferson, Tom; Dickersin, Kay; Gøtzsche, Peter C; Krumholz, Harlan M; Ghersi, Davina; van der Worp, H Bart

    2014-01-18

    The methods and results of health research are documented in study protocols, full study reports (detailing all analyses), journal reports, and participant-level datasets. However, protocols, full study reports, and participant-level datasets are rarely available, and journal reports are available for only half of all studies and are plagued by selective reporting of methods and results. Furthermore, information provided in study protocols and reports varies in quality and is often incomplete. When full information about studies is inaccessible, billions of dollars in investment are wasted, bias is introduced, and research and care of patients are detrimentally affected. To help to improve this situation at a systemic level, three main actions are warranted. First, academic institutions and funders should reward investigators who fully disseminate their research protocols, reports, and participant-level datasets. Second, standards for the content of protocols and full study reports and for data sharing practices should be rigorously developed and adopted for all types of health research. Finally, journals, funders, sponsors, research ethics committees, regulators, and legislators should endorse and enforce policies supporting study registration and wide availability of journal reports, full study reports, and participant-level datasets. Copyright © 2014 Elsevier Ltd. All rights reserved.

  23. Increasing value and reducing waste: addressing inaccessible research

    PubMed Central

    Chan, An-Wen; Song, Fujian; Vickers, Andrew; Jefferson, Tom; Dickersin, Kay; Gøtzsche, Peter C.; Krumholz, Harlan M.; Ghersi, Davina; van der Worp, H. Bart

    2015-01-01

    The study protocol, publications, full study report detailing all analyses, and participant-level dataset constitute the main documentation of methods and results for health research. However, journal publications are available for only half of all studies and are plagued by selective reporting of methods and results. The protocol, full study report, and participant-level dataset are rarely available. The quality of information provided in study protocols and reports is variable and often incomplete. Inaccessibility of full information for the vast majority of studies wastes billions of dollars, introduces bias, and has a detrimental impact on patient care and research. To help improve this situation at a systemic level, three main actions are warranted. Firstly, it is important that academic institutions and funders reward investigators who fully disseminate their research protocols, reports, and participant-level datasets. Secondly, standards for the content of protocols, full study reports, and data sharing practices should be rigorously developed and adopted for all types of health research. Finally, journals, funders, sponsors, research ethics committees, regulators, and legislators should implement and enforce policies supporting study registration and availability of journal publications, full study reports, and participant-level datasets. PMID:24411650

  24. GENE EXPRESSION PROFILING OF ACCESSIBLE SURROGATE TISSUES TO MONITOR MOLECULAR CHANGES IN INACCESSIBLE TARGET TISSUES FOLLOWING TOXICANT EXPOSURE

    EPA Science Inventory

    Gene Expression Profiling Of Accessible Surrogate Tissues To Monitor Molecular Changes In Inaccessible Target Tissues Following Toxicant Exposure
    John C. Rockett, Chad R. Blystone, Amber K. Goetz, Rachel N. Murrell, Judith E. Schmid and David J. Dix
    Reproductive Toxicology ...

  25. Nano-optical scan probes: Opening doors to previously-inaccessible parameter spaces

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schuck, James

    2014-06-08

    I will discuss recent progress on new near-field probe geometries, including the “campanile” geometry, which has been used in recent hyperspectral imaging experiments, providing nanoscale spectral information distinct from what is obtained with other methods. Article not available.

  26. The Tip-of-the-Tongue Heuristic: How Tip-of-the-Tongue States Confer Perceptibility on Inaccessible Words

    ERIC Educational Resources Information Center

    Cleary, Anne M.; Claxton, Alexander B.

    2015-01-01

    This study shows that the presence of a tip-of-the-tongue (TOT) state--the sense that a word is in memory when its retrieval fails--is used as a heuristic for inferring that an inaccessible word has characteristics that are consistent with greater word perceptibility. When reporting a TOT state, people judged an unretrieved word as more likely to…

  27. Impact of inaccessible spaces on community participation of people with mobility limitations in Zambia

    PubMed Central

    Nitz, Jennifer C.; de Jonge, Desleigh

    2014-01-01

    Background: The study investigated the perspective of people with mobility limitations (PWML) in Zambia, firstly of their accessibility to public buildings and spaces, and secondly of how their capacity to participate in a preferred lifestyle has been affected. Objectives: Firstly, to provide insight into the participation experiences of PWML in the social, cultural, economic, political and civic life areas and the relationship of these with disability in Zambia. Secondly, to establish how the Zambian disability context shapes the experiences of participation by PWML. Method: A qualitative design was used to gather data from 75 PWML in five of the nine provinces of Zambia. Focus group discussions and personal interviews were used to examine the accessibility of the built environment and how this impacted on the whole family's participation experiences. The nominal group technique was utilised to rank inaccessible buildings and facilities which posed barriers to opportunities in life areas and how this interfered with the whole family's lifestyle. Results: Inaccessibility of education institutions, workplaces and spaces has contributed to reduced participation, with negative implications for personal, family, social and economic aspects of the lives of participants. Government buildings, service buildings, and transportation were universally identified as most important but least accessible. Conclusion: Zambians with mobility limitations have been disadvantaged in accessing services and facilities provided to the public, depriving them and their dependants of full and equitable life participation because of reduced economic capacity. This study will assist in informing government of the need to improve environmental access to enable equal rights for all citizens. PMID:28729994

  28. SURROGATE TISSUE ANALYSIS: MONITORING TOXICANT EXPOSURE AND HEALTH STATUS OF INACCESSIBLE TISSUES THROUGH THE ANALYSIS OF ACCESSIBLE TISSUES AND CELLS

    EPA Science Inventory

    Surrogate Tissue Analysis: Monitoring Toxicant Exposure And Health Status Of Inaccessible Tissues Through The Analysis Of Accessible Tissues And Cells
    John C. Rockett, Michael E. Burczynski, Albert J. Fornace, Jr., Paul C. Herrmann, Stephen A. Krawetz, and David J. Dix...

  29. The tip-of-the-tongue heuristic: How tip-of-the-tongue states confer perceptibility on inaccessible words.

    PubMed

    Cleary, Anne M; Claxton, Alexander B

    2015-09-01

    This study shows that the presence of a tip-of-the-tongue (TOT) state--the sense that a word is in memory when its retrieval fails--is used as a heuristic for inferring that an inaccessible word has characteristics that are consistent with greater word perceptibility. When reporting a TOT state, people judged an unretrieved word as more likely to have previously appeared darker and clearer (Experiment 1a), and larger (Experiment 1b). They also judged an unretrieved word as more likely to be a high frequency word (Experiment 2). This was not because greater fluency or word perceptibility at encoding led to later TOT states: Increased fluency or perceptibility of a word at encoding did not increase the likelihood of a TOT state for it when its retrieval later failed; moreover, the TOT state was not diagnostic of an unretrieved word's fluency or perceptibility when it was last seen. Results instead suggest that TOT states themselves are used as a heuristic for inferring the likely characteristics of unretrieved words. During the uncertainty of retrieval failure, TOT states are a source of information on which people rely in reasoning about the likely characteristics of the unretrieved information, choosing characteristics that are consistent with greater fluency of processing. (c) 2015 APA, all rights reserved).

  30. A Hyperspectral Based Method to Detect Cannabis Plantation in Inaccessible Areas

    NASA Astrophysics Data System (ADS)

    Houmi, M.; Mohamadi, B.; Balz, T.

    2018-04-01

    The increase in drug use worldwide has led to sophisticated illegal planting methods. Most countries depend on helicopters and local knowledge to identify such illegal plantations. However, remote sensing techniques can provide special advantages for monitoring the extent of illegal drug production. This paper sought to assess the ability of satellite remote sensing to detect Cannabis plantations. This was achieved in two stages: (1) preprocessing of EO-1 hyperspectral data and testing the capability to collect the spectral signature of Cannabis at different sites in the study area (Morocco) from well-known Cannabis plantation fields; (2) applying the Spectral Angle Mapper (SAM) method with a specific angle threshold to EO-1 Hyperion data over well-known Cannabis plantation sites, and over sites without Cannabis plantations in another study area (Algeria), to avoid any false Cannabis detection using these spectra. This study emphasizes the benefits of hyperspectral remote sensing data as an effective detection tool for illegal Cannabis plantations in inaccessible areas, based on the SAM classification method with a maximum angle of less than 0.03 radians.
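    The core of the SAM step described above is a per-pixel spectral angle against a reference spectrum, thresholded here at the 0.03 rad limit quoted in the abstract. This is a generic sketch with random data standing in for a Hyperion scene, not the authors' processing chain; the array shapes and band count are hypothetical.

      import numpy as np

      def spectral_angle(pixel, reference):
          """Spectral angle (radians) between a pixel spectrum and a reference spectrum."""
          cos_t = np.dot(pixel, reference) / (np.linalg.norm(pixel) * np.linalg.norm(reference))
          return np.arccos(np.clip(cos_t, -1.0, 1.0))

      def sam_classify(cube, reference, max_angle=0.03):
          """Flag pixels whose angle to the reference spectrum is below max_angle.

          cube      : (rows, cols, bands) hyperspectral array
          reference : (bands,) target spectrum, e.g., a field-collected signature
          """
          rows, cols, bands = cube.shape
          flat = cube.reshape(-1, bands)
          angles = np.array([spectral_angle(p, reference) for p in flat])
          return (angles < max_angle).reshape(rows, cols)

      # Hypothetical data: a 10 x 10 pixel cube with 198 bands
      rng = np.random.default_rng(1)
      cube = rng.random((10, 10, 198))
      reference = cube[5, 5] + 0.001 * rng.random(198)   # nearly identical spectrum
      print(int(sam_classify(cube, reference).sum()))    # pixels under the 0.03 rad threshold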

  31. Tumor volumetric measurements in surgically inaccessible pediatric low-grade glioma.

    PubMed

    Kilday, John-Paul; Branson, Helen; Rockel, Conrad; Laughlin, Suzanne; Mabbott, Donald; Bouffet, Eric; Bartels, Ute

    2015-01-01

    Tumor measurement is important in unresectable pediatric low-grade gliomas (pLGGs) to determine the need for treatment or to assess response. Standard methods measure the product of the largest 2 lengths from the transverse, anterior-posterior, and cranio-caudal dimensions (SM, cm²). This single-institution study evaluated tumor volume measurements (VM, cm³) in such pLGGs. Of 50 patients treated with chemotherapy for surgically inaccessible pLGG, 8 met the inclusion criteria of having 2 or more sequential MRI studies with T1-weighted Fast-Spoiled Gradient Recalled acquisition. SM and VM were performed by 2 independent neuroradiologists. Associations of measurement methods with defined therapeutic response criteria and patient clinical status were assessed. The mean tumor size at the first MRI scan was 20 cm² and 398 cm³ according to SM and VM, respectively. VM results did not differ significantly from SM-derived spherical volume calculations (Pearson correlation, P<0.0001), with high interrater reliability. Both methods were concordant in defining tumor response according to the current criteria, although radiologic progressive disease was not associated with clinical status (SM: P=0.491, VM: P=0.208). In this limited experience, volumetric analysis of unresectable pLGGs did not seem superior to the standard linear measurements for defining tumor response.

  32. Medical performance and the 'inaccessible' experience of illness: an exploratory study.

    PubMed

    Weitkamp, Emma; Mermikides, Alex

    2016-09-01

    We report a survey of audience members' responses (147 questionnaires collected at seven performances) and 10 in-depth interviews (five former patients and two family members, three medical practitioners) to bloodlines, a medical performance exploring the experience of haematopoietic stem-cell transplant as treatment for acute leukaemia. Performances took place in 2014 and 2015. The article argues that performances that are created through interdisciplinary collaboration can convey otherwise 'inaccessible' illness experiences in ways that audience members with personal experience recognise as familiar, and find emotionally affecting. In particular such performances are adept at interweaving 'objectivist' (objective, medical) and 'subjectivist' (subjective, emotional) perspectives of the illness experience, and indeed, at challenging such distinctions. We suggest that reflecting familiar yet hard-to-articulate experiences may be beneficial for the ongoing emotional recovery of people who have survived serious disease, particularly in relation to the isolation that they experience during and as a consequence of their treatment. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/

  33. Incremental comprehension of spoken quantifier sentences: Evidence from brain potentials.

    PubMed

    Freunberger, Dominik; Nieuwland, Mante S

    2016-09-01

    Do people incrementally incorporate the meaning of quantifier expressions to understand an unfolding sentence? Most previous studies concluded that quantifiers do not immediately influence how a sentence is understood based on the observation that online N400-effects differed from offline plausibility judgments. Those studies, however, used serial visual presentation (SVP), which involves unnatural reading. In the current ERP-experiment, we presented spoken positive and negative quantifier sentences ("Practically all/practically no postmen prefer delivering mail, when the weather is good/bad during the day"). Different from results obtained in a previously reported SVP-study (Nieuwland, 2016) sentence truth-value N400 effects occurred in positive and negative quantifier sentences alike, reflecting fully incremental quantifier comprehension. This suggests that the prosodic information available during spoken language comprehension supports the generation of online predictions for upcoming words and that, at least for quantifier sentences, comprehension of spoken language may proceed more incrementally than comprehension during SVP reading. Copyright © 2016 The Authors. Published by Elsevier B.V. All rights reserved.

  34. Repair of Inaccessible Ventral Dural Defect in Thoracic Spine: Double Layered Duraplasty

    PubMed Central

    Lee, Dong-Hyun; Park, Jeong-Ill; Park, Ki-Su; Cho, Dae-Chul; Sung, Joo-Kyung

    2016-01-01

    We propose a double-layered (intradural and epidural patch) duraplasty that utilizes Lyoplant and Duraseal. We examined a 47-year-old woman after decompression for thoracic ossification of the posterior longitudinal ligament was performed at another hospital. On postoperative day 7, she complained of weakness in both legs. Postoperative magnetic resonance imaging (MRI) showed cerebrospinal fluid (CSF) collection with cord compression. In the operative field, we found 2 large dural defects on the ventral dura mater. We performed a conventional fat graft with fibrin glue. However, the patient exhibited neurologic deterioration, and a postoperative MRI again showed CSF collection. We performed a dorsal midline durotomy and inserted an intradural and an epidural Lyoplant patch. She immediately experienced diminishing back pain postoperatively. Her visual analog scale score and motor power improved markedly. Postoperative MRIs performed at 2 and 16 months showed no spinal cord compression or CSF leakage into the epidural space. We describe a new technique for double-layered duraplasty. Although we do not recommend this technique for all dural repairs, double-layered duraplasty may be useful for repairing large inaccessible dural tears in cases of persistent CSF leakage refractory to conventional management. PMID:27437022

  35. IMPAIRED VERBAL COMPREHENSION OF QUANTIFIERS IN CORTICOBASAL SYNDROME

    PubMed Central

    Troiani, Vanessa; Clark, Robin; Grossman, Murray

    2011-01-01

    Objective: Patients with Corticobasal Syndrome (CBS) have atrophy in posterior parietal cortex. This region of atrophy has been previously linked with their quantifier comprehension difficulty, but previous studies used visual stimuli, making it difficult to account for potential visuospatial deficits in CBS patients. The current study evaluated comprehension of generalized quantifiers using strictly verbal materials. Method: CBS patients, a brain-damaged control group (consisting of Alzheimer's Disease and frontotemporal dementia), and age-matched controls participated in this study. We assessed familiar temporal, spatial, and monetary domains of verbal knowledge comparatively. Judgment accuracy was only evaluated in statements for which patients demonstrated accurate factual knowledge about the target domain. Results: We found that patients with CBS are significantly impaired in their ability to evaluate quantifiers compared to healthy seniors and a brain-damaged control group, even in this strictly verbal task. This impairment was seen in the vast majority of individual CBS patients. Conclusions: These findings offer additional evidence of quantifier impairment in CBS patients and emphasize that this impairment cannot be attributed to potential spatial processing impairments in patients with parietal disease. PMID:21381823

  36. Templated deprotonative metalation of polyaryl systems: Facile access to simple, previously inaccessible multi-iodoarenes

    PubMed Central

    Martínez-Martínez, Antonio J.; Justice, Stephen; Fleming, Ben J.; Kennedy, Alan R.; Oswald, Iain D. H.; O’Hara, Charles T.

    2017-01-01

    The development of new methodologies to effect non–ortho-functionalization of arenes has emerged as a globally important arena for research, which is key to both fundamental studies and applied technologies. A range of simple arene feedstocks (namely, biphenyl, meta-terphenyl, para-terphenyl, 1,3,5-triphenylbenzene, and biphenylene) is transformed to hitherto unobtainable multi-iodoarenes via an s-block metal sodium magnesiate templated deprotonative approach. These iodoarenes have the potential to be used in a whole host of high-impact transformations, as precursors to key materials in the pharmaceutical, molecular electronic, and nanomaterials industries. To prove the concept, we transformed biphenyl to 3,5-bis(N-carbazolyl)-1,1′-biphenyl, a novel isomer of 4,4′-bis(N-carbazolyl)-1,1′-biphenyl (CBP), a compound which is currently widely used as a host material for organic light-emitting diodes. PMID:28695201

  37. Mechanistic pathways of recognition of a solvent-inaccessible cavity of protein by a ligand

    NASA Astrophysics Data System (ADS)

    Mondal, Jagannath; Pandit, Subhendu; Dandekar, Bhupendra; Vallurupalli, Pramodh

    One of the puzzling questions in the realm of protein-ligand recognition is how a solvent-inaccessible hydrophobic cavity of a protein gets recognized by a ligand. We address the topic by simulating, for the first time, the complete binding process of benzene from aqueous media to the well-known buried cavity of L99A T4 Lysozyme at atomistic resolution. Our multiple unbiased microsecond-long trajectories, which were completely blind to the location of the target binding site, unequivocally identify the kinetic pathways along which the benzene molecule meanders across solvent and protein and ultimately, spontaneously, recognizes the deeply buried cavity of L99A T4 Lysozyme with high precision. Our simulations, combined with analysis based on a Markov state model and free energy calculations, reveal more than one distinct ligand binding pathway. Intriguingly, each of the identified pathways involves the transient opening of a channel of the protein prior to ligand binding. The work also deciphers rich mechanistic details of the ligand's unbinding kinetics, obtained from enhanced sampling techniques.

  38. Quantifying quantum coherence with quantum Fisher information.

    PubMed

    Feng, X N; Wei, L F

    2017-11-14

    Quantum coherence is one of the oldest yet most important concepts in quantum mechanics, and it is now regarded as a necessary resource for quantum information processing and quantum metrology. However, the question of how to quantify quantum coherence has received attention only recently (see, e.g., Baumgratz et al., Phys. Rev. Lett. 113, 140401 (2014)). In this paper we verify that the well-known quantum Fisher information (QFI) can be utilized to quantify quantum coherence, as it satisfies monotonicity under typical incoherent operations and convexity under mixing of quantum states. Differing from most purely axiomatic methods, quantifying quantum coherence by QFI could be experimentally testable, as the bound of the QFI is practically measurable. The validity of our proposal is demonstrated with the typical phase-damping and depolarizing evolution processes of a generic single-qubit state, and also by comparing it with other previously proposed quantification methods.
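    For reference, one standard closed form of the QFI for a state with spectral decomposition ρ = Σᵢ λᵢ|i⟩⟨i| and a fixed Hermitian generator A is given below; the monotonicity and convexity properties cited in the abstract are properties of this quantity. The precise coherence quantifier constructed from it in the paper may differ in detail.

      % Quantum Fisher information for \rho = \sum_i \lambda_i |i\rangle\langle i|
      % with respect to a fixed Hermitian generator A
      F_Q(\rho, A) \;=\; 2 \sum_{\substack{i,j \\ \lambda_i + \lambda_j > 0}}
          \frac{(\lambda_i - \lambda_j)^2}{\lambda_i + \lambda_j}\,
          \bigl|\langle i \mid A \mid j \rangle\bigr|^2 .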

  39. Endoscopic ultrasound-guided pancreaticobiliary intervention in patients with surgically altered anatomy and inaccessible papillae: A review of current literature

    PubMed Central

    Martin, Aaron; Kistler, Charles Andrew; Wrobel, Piotr; Yang, Juliana F.; Siddiqui, Ali A.

    2016-01-01

    The management of pancreaticobiliary disease in patients with surgically altered anatomy is a growing problem for gastroenterologists today. Over the years, endoscopic ultrasound (EUS) has emerged as an important diagnostic and therapeutic modality in the treatment of pancreaticobiliary disease. Patient anatomy has become increasingly complex due to advances in surgical resection of pancreaticobiliary disease, and EUS has emerged as the therapy of choice when cannulation during endoscopic retrograde cholangiopancreatography fails or when the papilla is inaccessible, such as in gastric or duodenal obstruction. The current article gives a comprehensive review of the current literature on EUS-guided intervention in the pancreaticobiliary tract in patients with surgically altered anatomy. PMID:27386471

  40. Anaphoric Reference to Quantified Antecedents: An Event-Related Brain Potential Study

    ERIC Educational Resources Information Center

    Filik, Ruth; Leuthold, Hartmut; Moxey, Linda M.; Sanford, Anthony J.

    2011-01-01

    We report an event-related brain potential (ERP) study examining how readers process sentences containing anaphoric reference to quantified antecedents. Previous studies indicate that positive (e.g. "many") and negative (e.g. "not many") quantifiers cause readers to focus on different sets of entities. For example in "Many of the fans attended the…

  41. Septipyridines as conformationally controlled substitutes for inaccessible bis(terpyridine)-derived oligopyridines in two-dimensional self-assembly

    PubMed Central

    Caterbow, Daniel; Künzel, Daniela; Mavros, Michael G; Groß, Axel; Landfester, Katharina

    2011-01-01

    Summary The position of the peripheral nitrogen atoms in bis(terpyridine)-derived oligopyridines (BTPs) has a strong impact on their self-assembly behavior at the liquid/HOPG (highly oriented pyrolytic graphite) interface. The intermolecular hydrogen bonding interactions in these peripheral pyridine units show specific 2D structures for each BTP isomer. From nine possible constitutional isomers only four have been described in the literature. The synthesis and self-assembling behavior of an additional isomer is presented here, but the remaining four members of the series are synthetically inaccessible. The self-assembling properties of three of the missing four BTP isomers can be mimicked by making use of the energetically preferred N–C–C–N transoid conformation between 2,2'-bipyridine subunits in a new class of so-called septipyridines. The structures are investigated by scanning tunneling microscopy (STM) and a combination of force-field and first-principles electronic structure calculations. PMID:22003448

  42. Balloon Blocking Technique (BBT) for Superselective Catheterization of Inaccessible Arteries with Conventional and Modified Techniques

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Morishita, Hiroyuki; Takeuchi, Yoshito; Ito, Takaaki

    2016-06-15

    Purpose: The purpose of the study was to retrospectively evaluate the efficacy and safety of the balloon blocking technique (BBT). Materials and Methods: The BBT was performed in six patients (all males, mean 73.5 years) in whom superselective catheterization for transcatheter arterial embolization by conventional microcatheter techniques had failed due to anatomical difficulty, including targeted arteries originating steeply or hooked from parent arteries. All BBT procedures were performed using Seldinger’s transfemoral method. Occlusive balloons were deployed and inflated at the distal side of the target artery branching site in the parent artery via transfemoral access. A microcatheter was delivered from a 5-F catheter via another femoral access and was advanced over the microguidewire into the target artery, under balloon blockage of advancement of the microguidewire into non-target branches. After the balloon catheter was deflated and withdrawn, optimal interventions were performed through the microcatheter. Results: After successful access to the targeted artery by BBT, optimal interventions were accomplished in all patients with no complications other than vasovagal hypotension, which responded to nominal therapy. Conclusion: The BBT may be useful for superselective catheterization of arteries that are inaccessible due to anatomical difficulties.

  43. Children's interpretations of general quantifiers, specific quantifiers, and generics

    PubMed Central

    Gelman, Susan A.; Leslie, Sarah-Jane; Was, Alexandra M.; Koch, Christina M.

    2014-01-01

    Recently, several scholars have hypothesized that generics are a default mode of generalization, and thus that young children may at first treat quantifiers as if they were generic in meaning. To address this issue, the present experiment provides the first in-depth, controlled examination of the interpretation of generics compared to both general quantifiers ("all Xs", "some Xs") and specific quantifiers ("all of these Xs", "some of these Xs"). We provided children (3 and 5 years) and adults with explicit frequency information regarding properties of novel categories, to chart when "some", "all", and generics are deemed appropriate. The data reveal three main findings. First, even 3-year-olds distinguish generics from quantifiers. Second, when children make errors, they tend to be in the direction of treating quantifiers like generics. Third, children were more accurate when interpreting specific versus general quantifiers. We interpret these data as providing evidence for the position that generics are a default mode of generalization, especially when reasoning about kinds. PMID:25893205

  44. Metal–Organic Frameworks Stabilize Solution-Inaccessible Cobalt Catalysts for Highly Efficient Broad-Scope Organic Transformations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Teng; Manna, Kuntal; Lin, Wenbin

    New and active earth-abundant metal catalysts are critically needed to replace precious metal-based catalysts for sustainable production of commodity and fine chemicals. We report here the design of highly robust, active, and reusable cobalt-bipyridine- and cobalt-phenanthroline-based metal–organic framework (MOF) catalysts for alkene hydrogenation and hydroboration, aldehyde/ketone hydroboration, and arene C–H borylation. In alkene hydrogenation, the MOF catalysts tolerated a variety of functional groups and displayed unprecedentedly high turnover numbers of ~2.5 × 10⁶ and turnover frequencies of ~1.1 × 10⁵ h⁻¹. Structural, computational, and spectroscopic studies show that site isolation of the highly reactive (bpy)Co(THF)₂ species in the MOFs prevents intermolecular deactivation and stabilizes solution-inaccessible catalysts for broad-scope organic transformations. Computational, spectroscopic, and kinetic evidence further support a hitherto unknown (bpy•–)CoI(THF)₂ ground state that coordinates to alkene and dihydrogen and then undergoes σ-complex-assisted metathesis to form (bpy)Co(alkyl)(H). Reductive elimination of alkane followed by alkene binding completes the catalytic cycle. MOFs thus provide a novel platform for discovering new base-metal molecular catalysts and exhibit enormous potential in sustainable chemical catalysis.

  45. Energetic arousal and language: predictions from the computational theory of quantifiers processing.

    PubMed

    Zajenkowski, Marcin

    2013-10-01

    The author examines the relationship between energetic arousal (EA) and the processing of sentences containing natural-language quantifiers. Previous studies and theories have shown that energy may differentially affect various cognitive functions. Recent investigations devoted to quantifiers strongly support the theory that various types of quantifiers involve different cognitive functions in the sentence-picture verification task. In the present study, 201 students were presented with a sentence-picture verification task consisting of simple propositions containing a quantifier that referred to the color of a car on display. Color pictures of cars accompanied the propositions. In addition, the level of participants' EA was measured before and after the verification task. It was found that EA and performance on proportional quantifiers (e.g., "More than half of the cars are red") are in an inverted U-shaped relationship. This result may be explained by the fact that proportional sentences engage working memory to a high degree, and previous models of EA-cognition associations have been based on the assumption that tasks that require parallel attentional and memory processes are best performed when energy is moderate. The research described in the present article has several applications, as it shows the optimal human conditions for verbal comprehension. For instance, it may be important in workplace design to control the level of arousal experienced by office staff when work is mostly related to the processing of complex texts. Energy level may be influenced by many factors, such as noise, time of day, or thermal conditions.

  46. Quantifiers more or less quantify online: ERP evidence for partial incremental interpretation

    PubMed Central

    Urbach, Thomas P.; Kutas, Marta

    2010-01-01

    Event-related brain potentials were recorded during RSVP reading to test the hypothesis that quantifier expressions are incrementally interpreted fully and immediately. In sentences tapping general knowledge (Farmers grow crops/worms as their primary source of income), Experiment 1 found larger N400s for atypical (worms) than typical objects (crops). Experiment 2 crossed object typicality with non-logical subject-noun phrase quantifiers (most, few). Off-line plausibility ratings exhibited the crossover interaction predicted by full quantifier interpretation: Most farmers grow crops and Few farmers grow worms were rated more plausible than Most farmers grow worms and Few farmers grow crops. Object N400s, although modulated in the expected direction, did not reverse. Experiment 3 replicated these findings with adverbial quantifiers (Farmers often/rarely grow crops/worms). Interpretation of quantifier expressions thus is neither fully immediate nor fully delayed. Furthermore, object atypicality was associated with a frontal slow positivity in few-type/rarely quantifier contexts, suggesting systematic processing differences among quantifier types. PMID:20640044

  47. Using field inversion to quantify functional errors in turbulence closures

    NASA Astrophysics Data System (ADS)

    Singh, Anand Pratap; Duraisamy, Karthik

    2016-04-01

    A data-informed approach is presented with the objective of quantifying errors and uncertainties in the functional forms of turbulence closure models. The approach creates modeling information from higher-fidelity simulations and experimental data. Specifically, a Bayesian formalism is adopted to infer discrepancies in the source terms of transport equations. A key enabling idea is the transformation of the functional inversion procedure (which is inherently infinite-dimensional) into a finite-dimensional problem in which the distribution of the unknown function is estimated at discrete mesh locations in the computational domain. This allows for the use of an efficient adjoint-driven inversion procedure. The output of the inversion is a full-field of discrepancy that provides hitherto inaccessible modeling information. The utility of the approach is demonstrated by applying it to a number of problems including channel flow, shock-boundary layer interactions, and flows with curvature and separation. In all these cases, the posterior model correlates well with the data. Furthermore, it is shown that even if limited data (such as surface pressures) are used, the accuracy of the inferred solution is improved over the entire computational domain. The results suggest that, by directly addressing the connection between physical data and model discrepancies, the field inversion approach materially enhances the value of computational and experimental data for model improvement. The resulting information can be used by the modeler as a guiding tool to design more accurate model forms, or serve as input to machine learning algorithms to directly replace deficient modeling terms.
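    The finite-dimensional inversion described above can be sketched as a maximum a posteriori estimate of a discrepancy field defined at mesh locations. The toy below uses a hypothetical linear forward model in place of the flow solver and finite-difference gradients in place of the adjoint; the names, dimensions, and noise levels are illustrative assumptions only.

      import numpy as np
      from scipy.optimize import minimize

      def map_discrepancy(forward, data, beta_prior, sigma_obs=0.05, sigma_prior=0.5):
          """MAP estimate of a cell-wise discrepancy field beta.

          forward    : callable mapping beta (n_cells,) to predicted observables
          data       : observed quantities (e.g., surface pressures)
          beta_prior : prior mean of the multiplicative discrepancy (typically ones)
          """
          def neg_log_posterior(beta):
              misfit = forward(beta) - data
              return (np.sum(misfit ** 2) / (2 * sigma_obs ** 2)
                      + np.sum((beta - beta_prior) ** 2) / (2 * sigma_prior ** 2))
          # L-BFGS-B with finite-difference gradients; the paper uses an adjoint instead
          return minimize(neg_log_posterior, beta_prior, method="L-BFGS-B").x

      # Toy stand-in: a linear "solver" G mapping 20 cell-wise corrections to 5 observables
      rng = np.random.default_rng(2)
      G = rng.random((5, 20))
      beta_true = 1.0 + 0.3 * np.sin(np.linspace(0.0, np.pi, 20))
      data = G @ beta_true
      beta_map = map_discrepancy(lambda b: G @ b, data, beta_prior=np.ones(20))
      print(np.round(beta_map[:5], 2))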

  8. Quantifying coordination among the rearfoot, midfoot, and forefoot segments during running.

    PubMed

    Takabayashi, Tomoya; Edama, Mutsuaki; Yokoyama, Erika; Kanaya, Chiaki; Kubo, Masayoshi

    2018-03-01

    Because previous studies have suggested that there is a relationship between injury risk and inter-segment coordination, quantifying coordination between the segments is essential. Even though the midfoot and forefoot segments play important roles in dynamic tasks, previous studies have mostly focused on coordination between the shank and rearfoot segments. This study aimed to quantify coordination among the rearfoot, midfoot, and forefoot segments during running. Eleven healthy young men ran on a treadmill. The coupling angle, representing inter-segment coordination, was calculated using a modified vector coding technique and categorised into four coordination patterns. During the absorption phase, rearfoot-midfoot coordination in the frontal plane was mostly in-phase (rearfoot and midfoot eversion with similar amplitudes). The present study found that the eversion of the midfoot with respect to the rearfoot was comparable in magnitude to the eversion of the rearfoot with respect to the shank. A previous study has suggested that disruption of the coordination between internal rotation of the shank and eversion of the rearfoot leads to running injuries such as anterior knee pain. Thus, these data might be used in the future as a reference for comparison with individuals with foot deformities or running injuries.
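    The following is a minimal sketch of a modified vector coding calculation of the kind referred to above: coupling angles are computed from frame-to-frame changes of two segment angles and binned into four coordination patterns. The bin edges and the synthetic rearfoot/midfoot angle traces are common conventions and illustrative values, not the study's exact implementation or data.

```python
# Minimal sketch (assumed, not the authors' exact implementation) of a
# modified vector coding computation: coupling angles between two segment
# angle time series, binned into four coordination patterns.
import numpy as np

def coupling_angles(proximal, distal):
    """Coupling angle (deg, 0-360) from frame-to-frame changes of two segment angles."""
    gamma = np.degrees(np.arctan2(np.diff(distal), np.diff(proximal)))
    return np.mod(gamma, 360.0)

def classify(gamma):
    """Common four-bin classification; bin edges may differ from the study's."""
    bins = {"in_phase": 0, "anti_phase": 0, "proximal_phase": 0, "distal_phase": 0}
    for g in gamma:
        if 22.5 <= g < 67.5 or 202.5 <= g < 247.5:
            bins["in_phase"] += 1          # segments rotate in the same direction
        elif 112.5 <= g < 157.5 or 292.5 <= g < 337.5:
            bins["anti_phase"] += 1        # segments rotate in opposite directions
        elif g < 22.5 or 337.5 <= g or 157.5 <= g < 202.5:
            bins["proximal_phase"] += 1    # proximal segment dominates
        else:
            bins["distal_phase"] += 1      # distal segment dominates
    return bins

t = np.linspace(0, 1, 101)
rearfoot = 5 * np.sin(2 * np.pi * t)       # synthetic frontal-plane angles (deg)
midfoot = 4 * np.sin(2 * np.pi * t)
print(classify(coupling_angles(rearfoot, midfoot)))
```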

  9. Resolving and quantifying overlapped chromatographic bands by transmutation

    PubMed

    Malinowski

    2000-09-15

    A new chemometric technique called "transmutation" is developed for the purpose of sharpening overlapped chromatographic bands in order to quantify the components. The "transmutation function" is created from the chromatogram of the pure component of interest, obtained from the same instrument, operating under the same experimental conditions used to record the unresolved chromatogram of the sample mixture. The method is used to quantify mixtures containing toluene, ethylbenzene, m-xylene, naphthalene, and biphenyl from unresolved chromatograms previously reported. The results are compared to those obtained using window factor analysis, rank annihilation factor analysis, and matrix regression analysis. Unlike the latter methods, the transmutation method is not restricted to two-dimensional arrays of data, such as those obtained from HPLC/DAD, but is also applicable to chromatograms obtained from single detector experiments. Limitations of the method are discussed.

  10. Quantifying resilience

    USGS Publications Warehouse

    Allen, Craig R.; Angeler, David G.

    2016-01-01

    Several frameworks to operationalize resilience have been proposed. A decade ago, a special feature focused on quantifying resilience was published in the journal Ecosystems (Carpenter, Westley & Turner 2005). The approach there was towards identifying surrogates of resilience, but few of the papers proposed quantifiable metrics. Consequently, many ecological resilience frameworks remain vague and difficult to quantify, a problem that this special feature aims to address. However, considerable progress has been made during the last decade (e.g. Pope, Allen & Angeler 2014). Although some argue that resilience is best kept as an unquantifiable, vague concept (Quinlan et al. 2016), to be useful for managers, there must be concrete guidance regarding how and what to manage and how to measure success (Garmestani, Allen & Benson 2013; Spears et al. 2015). Ideas such as ‘resilience thinking’ have utility in helping stakeholders conceptualize their systems, but provide little guidance on how to make resilience useful for ecosystem management, other than suggesting an ambiguous, Goldilocks approach of being just right (e.g. diverse, but not too diverse; connected, but not too connected). Here, we clarify some prominent resilience terms and concepts, introduce and synthesize the papers in this special feature on quantifying resilience and identify core unanswered questions related to resilience.

  11. Quantifying Water Stress Using Total Water Volumes and GRACE

    NASA Astrophysics Data System (ADS)

    Richey, A. S.; Famiglietti, J. S.; Druffel-Rodriguez, R.

    2011-12-01

    Water will follow oil as the next critical resource leading to unrest and uprisings globally. To better manage this threat, an improved understanding of the distribution of water stress is required today. This study builds upon previous efforts to characterize water stress by improving both the quantification of human water use and the definition of water availability. Current statistics on human water use are often outdated or inaccurately reported nationally, especially for groundwater. This study improves these estimates by defining human water use in two ways. First, we use NASA's Gravity Recovery and Climate Experiment (GRACE) to isolate the anthropogenic signal in water storage anomalies, which we equate to water use. Second, we quantify an ideal water demand by using average water requirements for the domestic, industrial, and agricultural water use sectors. Water availability has traditionally been limited to "renewable" water, which ignores large, stored water sources that humans use. We compare water stress estimates derived using either renewable water or the total volume of water globally. We use the best-available data to quantify total aquifer and surface water volumes, as compared to groundwater recharge and surface water runoff from land-surface models. The work presented here should provide a more realistic image of water stress by explicitly quantifying groundwater, defining water availability as total water supply, and using GRACE to more accurately quantify water use.

  12. A novel approach to quantify cybersecurity for electric power systems

    NASA Astrophysics Data System (ADS)

    Kaster, Paul R., Jr.

    Electric Power grid cybersecurity is a topic gaining increased attention in academia, industry, and government circles, yet a method of quantifying and evaluating a system's security is not yet commonly accepted. In order to be useful, a quantification scheme must be able to accurately reflect the degree to which a system is secure, simply determine the level of security in a system using real-world values, model a wide variety of attacker capabilities, be useful for planning and evaluation, allow a system owner to publish information without compromising the security of the system, and compare relative levels of security between systems. Published attempts at quantifying cybersecurity fail at one or more of these criteria. This document proposes a new method of quantifying cybersecurity that meets those objectives. This dissertation evaluates the current state of cybersecurity research, discusses the criteria mentioned previously, proposes a new quantification scheme, presents an innovative method of modeling cyber attacks, demonstrates that the proposed quantification methodology meets the evaluation criteria, and proposes a line of research for future efforts.

  13. Deep Learning Methods for Quantifying Invasive Benthic Species in the Great Lakes

    NASA Astrophysics Data System (ADS)

    Billings, G.; Skinner, K.; Johnson-Roberson, M.

    2017-12-01

    In recent decades, invasive species such as the round goby and dreissenid mussels have greatly impacted the Great Lakes ecosystem. It is critical to monitor these species, model their distribution, and quantify the impacts on the native fisheries and surrounding ecosystem in order to develop an effective management response. However, data collection in underwater environments is challenging and expensive. Furthermore, the round goby is typically found in rocky habitats, which are inaccessible to standard survey techniques such as bottom trawling. In this work we propose a robotic system for visual data collection to automatically detect and quantify invasive round gobies and mussels in the Great Lakes. Robotic platforms equipped with cameras can perform efficient, cost-effective, low-bias benthic surveys. This data collection can be further optimized through automatic detection and annotation of the target species. Deep learning methods have shown success in image recognition tasks. However, these methods often rely on a labelled training dataset, with up to millions of labelled images. Hand labeling large numbers of images is expensive and often impracticable. Furthermore, data collected in the field may be sparse when only considering images that contain the objects of interest. It is easier to collect dense, clean data in controlled lab settings, but this data is not a realistic representation of real field environments. In this work, we propose a deep learning approach to generate a large set of labelled training data realistic of underwater environments in the field. To generate these images, first we draw random sample images of individual fish and mussels from a library of images captured in a controlled lab environment. Next, these randomly drawn samples will be automatically merged into natural background images. Finally, we will use a generative adversarial network (GAN) that incorporates constraints of the physical model of underwater light propagation
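    A hedged sketch of the compositing step described above follows: randomly drawn specimen cut-outs with alpha masks are pasted onto seafloor background images to produce synthetic scenes with bounding-box labels. The directory names are hypothetical, and the subsequent GAN-based adjustment for underwater light propagation is not shown.

```python
# Minimal sketch (hypothetical file layout) of building synthetic labelled
# training scenes: lab specimen cut-outs composited onto natural backgrounds.
import random
from pathlib import Path
from PIL import Image

SPECIMENS = list(Path("lab_cutouts").glob("*.png"))             # hypothetical RGBA cut-outs
BACKGROUNDS = list(Path("seafloor_backgrounds").glob("*.jpg"))  # hypothetical backgrounds

def synth_scene(n_specimens=5):
    """Compose one synthetic scene and return it with bounding-box labels."""
    scene = Image.open(random.choice(BACKGROUNDS)).convert("RGB")
    boxes = []
    for _ in range(n_specimens):
        cutout = Image.open(random.choice(SPECIMENS)).convert("RGBA")
        if cutout.width >= scene.width or cutout.height >= scene.height:
            continue                                  # skip cut-outs larger than the scene
        x = random.randint(0, scene.width - cutout.width)
        y = random.randint(0, scene.height - cutout.height)
        scene.paste(cutout, (x, y), mask=cutout)      # alpha-composite the specimen
        boxes.append((x, y, x + cutout.width, y + cutout.height))  # bounding-box label
    return scene, boxes
```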

  14. Neural basis for generalized quantifier comprehension.

    PubMed

    McMillan, Corey T; Clark, Robin; Moore, Peachie; Devita, Christian; Grossman, Murray

    2005-01-01

    Generalized quantifiers like "all cars" are semantically well understood, yet we know little about their neural representation. Our model of quantifier processing includes a numerosity device, operations that combine number elements and working memory. Semantic theory posits two types of quantifiers: first-order quantifiers identify a number state (e.g. "at least 3") and higher-order quantifiers additionally require maintaining a number state actively in working memory for comparison with another state (e.g. "less than half"). We used BOLD fMRI to test the hypothesis that all quantifiers recruit inferior parietal cortex associated with numerosity, while only higher-order quantifiers recruit prefrontal cortex associated with executive resources like working memory. Our findings showed that first-order and higher-order quantifiers both recruit right inferior parietal cortex, suggesting that a numerosity component contributes to quantifier comprehension. Moreover, only probes of higher-order quantifiers recruited right dorsolateral prefrontal cortex, suggesting involvement of executive resources like working memory. We also observed activation of thalamus and anterior cingulate that may be associated with selective attention. Our findings are consistent with a large-scale neural network centered in frontal and parietal cortex that supports comprehension of generalized quantifiers.

  15. Characterization of autoregressive processes using entropic quantifiers

    NASA Astrophysics Data System (ADS)

    Traversaro, Francisco; Redelico, Francisco O.

    2018-01-01

    The aim of this contribution is to introduce a novel information plane, the causal-amplitude informational plane. As previous works seem to indicate, the Bandt and Pompe methodology for estimating entropy does not distinguish between probability distributions, which could be fundamental for simulation or probability analysis purposes. Once a time series is identified as stochastic by the causal complexity-entropy informational plane, the novel causal-amplitude plane gives a deeper understanding of the time series, quantifying both the autocorrelation strength and the probability distribution of the data extracted from the generating processes. Two examples are presented, one from a climate change model and the other from financial markets.
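    For orientation, the sketch below computes the standard Bandt-Pompe permutation (ordinal-pattern) entropy that underlies complexity-entropy planes of this kind; the amplitude-aware causal-amplitude plane proposed in the record is not reproduced, and the embedding dimension and test signal are illustrative choices.

```python
# Minimal sketch: normalized Bandt-Pompe permutation entropy (embedding
# dimension d, delay tau), the basic ingredient of complexity-entropy planes.
import math
import random
from collections import Counter

def permutation_entropy(series, d=3, tau=1):
    """Normalized Shannon entropy of ordinal patterns of length d."""
    counts = Counter()
    for i in range(len(series) - (d - 1) * tau):
        window = series[i:i + d * tau:tau]
        pattern = tuple(sorted(range(d), key=lambda k: window[k]))  # ordinal pattern
        counts[pattern] += 1
    total = sum(counts.values())
    h = -sum((c / total) * math.log(c / total) for c in counts.values())
    return h / math.log(math.factorial(d))     # normalize to [0, 1]

random.seed(0)
white_noise = [random.gauss(0, 1) for _ in range(5000)]
print(round(permutation_entropy(white_noise), 3))   # near 1 for uncorrelated noise
```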

  16. Quantifier Comprehension in Corticobasal Degeneration

    ERIC Educational Resources Information Center

    McMillan, Corey T.; Clark, Robin; Moore, Peachie; Grossman, Murray

    2006-01-01

    In this study, we investigated patients with focal neurodegenerative diseases to examine a formal linguistic distinction between classes of generalized quantifiers, like "some X" and "less than half of X." Our model of quantifier comprehension proposes that number knowledge is required to understand both first-order and higher-order quantifiers.…

  17. Reliability and validity of quantifying absolute muscle hardness using ultrasound elastography.

    PubMed

    Chino, Kentaro; Akagi, Ryota; Dohi, Michiko; Fukashiro, Senshi; Takahashi, Hideyuki

    2012-01-01

    Muscle hardness is a mechanical property that represents transverse muscle stiffness. A quantitative method that uses ultrasound elastography for quantifying absolute human muscle hardness has been previously devised; however, its reliability and validity have not been completely verified. This study aimed to verify the reliability and validity of this quantitative method. The Young's moduli of seven tissue-mimicking materials (in vitro; Young's modulus range, 20-80 kPa; increments of 10 kPa) and the human medial gastrocnemius muscle (in vivo) were quantified using ultrasound elastography. On the basis of the strain/Young's modulus ratio of two reference materials, one hard and one soft (Young's moduli of 7 and 30 kPa, respectively), the Young's moduli of the tissue-mimicking materials and medial gastrocnemius muscle were calculated. The intra- and inter-investigator reliability of the method was confirmed on the basis of acceptably low coefficients of variation (≤6.9%) and substantially high intraclass correlation coefficients (≥0.77) obtained from all measurements. The correlation coefficient between the Young's moduli of the tissue-mimicking materials obtained using a mechanical method and ultrasound elastography was 0.996, which was equivalent to values previously obtained using magnetic resonance elastography. The Young's moduli of the medial gastrocnemius muscle obtained using ultrasound elastography were within the range of values previously obtained using magnetic resonance elastography. The reliability and validity of the quantitative method for measuring absolute muscle hardness using ultrasound elastography were thus verified.
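    One plausible reading of the two-reference calibration is sketched below, under the assumption that strain is inversely proportional to Young's modulus at a common applied stress: the two reference materials fix the stress, and the unknown modulus follows from the measured strains. This is not necessarily the authors' exact formula, and the strain values are hypothetical.

```python
# Minimal sketch of a two-reference strain calibration (an assumed reading,
# not necessarily the study's exact procedure).
E_REF = (7.0, 30.0)            # reference Young's moduli, kPa

def youngs_modulus(strain_target, strain_refs):
    """Estimate target modulus (kPa) from measured strains of target and references."""
    # Each reference gives an estimate of the common applied stress: sigma = E * strain.
    sigma = sum(e * s for e, s in zip(E_REF, strain_refs)) / len(E_REF)
    return sigma / strain_target

# Hypothetical strains read off the elastography strain image (dimensionless).
print(round(youngs_modulus(strain_target=0.010, strain_refs=(0.050, 0.0117)), 1))
```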

  18. Diesel Emissions Quantifier (DEQ)

    EPA Pesticide Factsheets

    The Diesel Emissions Quantifier (Quantifier) is an interactive tool to estimate emission reductions and cost effectiveness. Publications: EPA-420-F-13-008a (420f13008a), EPA-420-B-10-035 (420b10023), EPA-420-B-10-034 (420b10034)

  19. Quantifying Transmission.

    PubMed

    Woolhouse, Mark

    2017-07-01

    Transmissibility is the defining characteristic of infectious diseases. Quantifying transmission matters for understanding infectious disease epidemiology and designing evidence-based disease control programs. Tracing individual transmission events can be achieved by epidemiological investigation coupled with pathogen typing or genome sequencing. Individual infectiousness can be estimated by measuring pathogen loads, but few studies have directly estimated the ability of infected hosts to transmit to uninfected hosts. Individuals' opportunities to transmit infection are dependent on behavioral and other risk factors relevant given the transmission route of the pathogen concerned. Transmission at the population level can be quantified through knowledge of risk factors in the population or phylogeographic analysis of pathogen sequence data. Mathematical model-based approaches require estimation of the per capita transmission rate and basic reproduction number, obtained by fitting models to case data and/or analysis of pathogen sequence data. Heterogeneities in infectiousness, contact behavior, and susceptibility can have substantial effects on the epidemiology of an infectious disease, so estimates of only mean values may be insufficient. For some pathogens, super-shedders (infected individuals who are highly infectious) and super-spreaders (individuals with more opportunities to transmit infection) may be important. Future work on quantifying transmission should involve integrated analyses of multiple data sources.
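    As a small illustration of the model-fitting route mentioned above, the sketch below estimates the basic reproduction number from the early exponential growth of case counts using the SIR-model relation R0 ≈ 1 + rD; the case series and infectious period are invented for illustration.

```python
# Minimal sketch (illustrative assumptions, not from the review): estimating R0
# from early exponential growth of incidence, with R0 = 1 + r * D for an SIR model.
import numpy as np

cases = np.array([3, 4, 6, 9, 14, 20, 31, 46, 68, 102])  # hypothetical daily incidence
days = np.arange(cases.size)

r = np.polyfit(days, np.log(cases), 1)[0]   # exponential growth rate per day
D = 5.0                                     # assumed mean infectious period (days)
print(f"growth rate r = {r:.3f}/day, R0 estimate = {1 + r * D:.2f}")
```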

  20. Quantifying Protein Concentrations Using Smartphone Colorimetry: A New Method for an Established Test

    ERIC Educational Resources Information Center

    Gee, Clifford T.; Kehoe, Eric; Pomerantz, William C. K.; Penn, R. Lee

    2017-01-01

    Proteins are involved in nearly every biological process, which makes them of interest to a range of scientists. Previous work has shown that hand-held cameras can be used to determine the concentration of colored analytes in solution, and this paper extends the approach to reactions involving a color change in order to quantify protein…

  1. Surgical cannulation of the superior ophthalmic vein for the treatment of previously embolized cavernous sinus dural arteriovenous fistulas: serial studies and angiographic follow-up.

    PubMed

    Luo, Bin; Zhang, Xin; Duan, Chuan-Zhi; He, Xu-Ying; Li, Xi-Feng; Karuna, Tamrakar; Gu, Da-Qun; Long, Xiao-Ao; Li, Tie-Lin; Zhang, Shi-Zhong; Ke, Yi-Quan; Jiang, Xiao-Dan

    2013-04-01

    The purpose of this study was to evaluate the safety and efficacy of transorbital puncture for the retreatment of previously embolized cavernous sinus dural arteriovenous fistulas (DAVFs) via a superior ophthalmic vein (SOV) approach. During a 12-year period, 9 consecutive patients with previously embolized cavernous sinus DAVFs underwent retreatment via the transorbital SOV approach. All nine cases of previously embolized cavernous sinus DAVFs were successfully embolized. Clinical follow-up was conducted in all nine cases over a period of 17-141 months (61.22 ± 39.13 months). No recanalization occurred during the follow-up period. A subtle ptosis appeared in two patients and disappeared in one of the two cases after a 4-year follow-up. One patient suffered from paroxysmal positional vertigo and bruit for nearly 2 years after the treatment, but the follow-up angiography demonstrated no recurrence. One patient had persistent visual impairment caused by the initial venous stasis retinopathy. In one patient, a procedure-related transient decrease in visual acuity returned to normal. The remaining four cases had clear improvement in the ocular symptoms and became completely asymptomatic during the follow-up period. No patient worsened or developed new symptoms. Surgical cannulation of the SOV for the retreatment of previously embolized cavernous sinus DAVFs proved feasible and effective, especially when the transarterial and transfemoral venous approaches were inaccessible. However, if the SOV is not dilated enough or is located deeply in the orbit, transorbital venous puncture access may not be possible.

  2. 61. The World-Wide Inaccessible Web, Part 2: Internet Routes

    ERIC Educational Resources Information Center

    Baggaley, Jon; Batpurev, Batchuluun; Klaas, Jim

    2007-01-01

    In the previous report in this series, Web browser loading times were measured in 12 Asian countries, and were found to be up to four times slower than commonly prescribed as acceptable. Failure of webpages to load at all was frequent. The current follow-up study compares these loading times with the complexity of the Internet routes linking the…

  3. Reliability and Validity of Quantifying Absolute Muscle Hardness Using Ultrasound Elastography

    PubMed Central

    Chino, Kentaro; Akagi, Ryota; Dohi, Michiko; Fukashiro, Senshi; Takahashi, Hideyuki

    2012-01-01

    Muscle hardness is a mechanical property that represents transverse muscle stiffness. A quantitative method that uses ultrasound elastography for quantifying absolute human muscle hardness has been previously devised; however, its reliability and validity have not been completely verified. This study aimed to verify the reliability and validity of this quantitative method. The Young’s moduli of seven tissue-mimicking materials (in vitro; Young’s modulus range, 20–80 kPa; increments of 10 kPa) and the human medial gastrocnemius muscle (in vivo) were quantified using ultrasound elastography. On the basis of the strain/Young’s modulus ratio of two reference materials, one hard and one soft (Young’s moduli of 7 and 30 kPa, respectively), the Young’s moduli of the tissue-mimicking materials and medial gastrocnemius muscle were calculated. The intra- and inter-investigator reliability of the method was confirmed on the basis of acceptably low coefficients of variation (≤6.9%) and substantially high intraclass correlation coefficients (≥0.77) obtained from all measurements. The correlation coefficient between the Young’s moduli of the tissue-mimicking materials obtained using a mechanical method and ultrasound elastography was 0.996, which was equivalent to values previously obtained using magnetic resonance elastography. The Young’s moduli of the medial gastrocnemius muscle obtained using ultrasound elastography were within the range of values previously obtained using magnetic resonance elastography. The reliability and validity of the quantitative method for measuring absolute muscle hardness using ultrasound elastography were thus verified. PMID:23029231

  4. Quantifying construction and demolition waste: an analytical review.

    PubMed

    Wu, Zezhou; Yu, Ann T W; Shen, Liyin; Liu, Guiwen

    2014-09-01

    Quantifying construction and demolition (C&D) waste generation is regarded as a prerequisite for the implementation of successful waste management. In literature, various methods have been employed to quantify the C&D waste generation at both regional and project levels. However, an integrated review that systemically describes and analyses all the existing methods has yet to be conducted. To bridge this research gap, an analytical review is conducted. Fifty-seven papers are retrieved based on a set of rigorous procedures. The characteristics of the selected papers are classified according to the following criteria - waste generation activity, estimation level and quantification methodology. Six categories of existing C&D waste quantification methodologies are identified, including site visit method, waste generation rate method, lifetime analysis method, classification system accumulation method, variables modelling method and other particular methods. A critical comparison of the identified methods is given according to their characteristics and implementation constraints. Moreover, a decision tree is proposed for aiding the selection of the most appropriate quantification method in different scenarios. Based on the analytical review, limitations of previous studies and recommendations of potential future research directions are further suggested. Copyright © 2014 Elsevier Ltd. All rights reserved.

  5. Quantifying Gold Nanoparticle Concentration in a Dietary Supplement Using Smartphone Colorimetry and Google Applications

    ERIC Educational Resources Information Center

    Campos, Antonio R.; Knutson, Cassandra M.; Knutson, Theodore R.; Mozzetti, Abbie R.; Haynes, Christy L.; Penn, R. Lee

    2016-01-01

    Spectrophotometry and colorimetry experiments are common in high school and college chemistry courses, and nanotechnology is increasingly common in every day products and new devices. Previous work has demonstrated that handheld camera devices can be used to quantify the concentration of a colored analyte in solution in place of traditional…

  6. The Fallacy of Quantifying Risk

    DTIC Science & Technology

    2012-09-01

    David E. Frick, Ph.D. (Defense AT&L, September–October 2012). Frick is a 35-year veteran of the Department of... a key to risk analysis was “choosing the right technique” of quantifying risk. The weakness in this argument stems not from the assertion that one... of information about the enemy), yet achieving great outcomes. Attempts at quantifying risk are not, in and of themselves, objectionable. Prudence...

  7. Formulation of multifunctional oil-in-water nanosized emulsions for active and passive targeting of drugs to otherwise inaccessible internal organs of the human body.

    PubMed

    Tamilvanan, Shunmugaperumal

    2009-10-20

    Oil-in-water (o/w) type nanosized emulsions (NE) have been widely investigated as vehicles/carrier for the formulation and delivery of drugs with a broad range of applications. A comprehensive summary is presented on how to formulate the multifunctional o/w NE for active and passive targeting of drugs to otherwise inaccessible internal organs of the human body. The NE is classified into three generations based on its development over the last couple of decades to make ultimately a better colloidal carrier for a target site within the internal and external organs/parts of the body, thus allowing site-specific drug delivery and/or enhanced drug absorption. The third generation NE has tremendous application for drug absorption enhancement and for 'ferrying' compounds across cell membranes in comparison to its first and second generation counterparts. Furthermore, the third generation NE provides an interesting opportunity for use as drug delivery vehicles for numerous therapeutics that can range in size from small molecules to macromolecules.

  8. How the brain learns how few are “many”: An fMRI study of the flexibility of quantifier semantics

    PubMed Central

    Heim, Stefan; McMillan, Corey T.; Clark, Robin; Baehr, Laura; Ternes, Kylie; Olm, Christopher; Min, Nam Eun; Grossman, Murray

    2015-01-01

    Previous work has shown that the meaning of a quantifier such as “many” or “few” depends in part on quantity. However, the meaning of a quantifier may vary depending on the context, e.g. in the case of common entities such as “many ants” (perhaps several thousand) compared to endangered species such as “many pandas” (perhaps a dozen). In a recent study (Heim et al. 2015 Front. Psychol.) we demonstrated that the relative meaning of “many” and “few” may be changed experimentally. In a truth value judgment task, displays with 40% of circles in a named color initially had a low probability of being labeled “many”. After a training phase, the likelihood of accepting 40% as “many” increased. Moreover, the semantic learning effect generalized to the related quantifier “few”, which had not been mentioned in the training phase. Thus, fewer 40% arrays were considered “few.” In the present study, we tested the hypothesis that this semantic adaptation effect was supported by cytoarchitectonic Brodmann area (BA) 45 in Broca’s region, which may contribute to semantic evaluation in the context of language and quantification. In an event-related fMRI study, 17 healthy volunteers performed the same paradigm as in the previous behavioral study. We found a relative signal increase when comparing the critical, trained proportion to untrained proportions. This specific effect was found in left BA 45 for the trained quantifier “many”, and in left BA 44 for both quantifiers, reflecting the semantic adjustment for the untrained but related quantifier “few.” These findings demonstrate the neural basis for processing the flexible meaning of a quantifier, and illustrate the neuroanatomical structures that contribute to variable meanings that can be associated with a word when used in different contexts. PMID:26481678

  9. Quantifying dispersal rates and distances in North American martens: a test of enriched isotope labeling

    Treesearch

    Jonathan N. Pauli; Winston P. Smith; Merav Ben-David

    2012-01-01

    Advances in the application of stable isotopes have allowed the quantitative evaluation of previously cryptic ecological processes. In particular, researchers have utilized the predictable spatial patterning in natural abundance of isotopes to better understand animal dispersal and migration. However, quantifying dispersal via natural abundance alone has proven to be...

  10. Quantifying Evaporation in a Permeable Pavement System ...

    EPA Pesticide Factsheets

    Studies quantifying evaporation from permeable pavement systems are limited to a few laboratory studies and one field application. This research quantifies evaporation for a larger-scale field application by measuring the water balance from lined permeable pavement sections. The U.S. Environmental Protection Agency (USEPA) constructed a 0.4-ha parking lot in Edison, NJ, that incorporated three different permeable pavement types in the parking lanes – permeable interlocking concrete pavers (PICP), pervious concrete (PC), and porous asphalt (PA). An impermeable liner installed 0.4 m below the driving surface in four 11.6-m by 4.74-m sections per each pavement type captures all infiltrating water and routes it to collection tanks that can contain events up to 38 mm. Each section has a design impervious area to permeable pavement area ratio of 0.66:1. Pressure transducers installed in the underdrain collection tanks measured water level for 24 months. Level was converted to volume using depth-to-volume ratios for individual collection tanks. Using a water balance approach, the measured infiltrate volume was compared to rainfall volume on an event-basis to determine the rainfall retained in the pavement strata and underlying aggregate. Evaporation since the previous event created additional storage in the pavement and aggregate layers. Events were divided into three groups based on antecedent dry period (ADP) and three, four-month categories of potential e

  11. Quantifying resistance to isoxaflutole and mesotrione and investigating their interaction with metribuzin applied postemergence in Amaranthus tuberculatus

    USDA-ARS?s Scientific Manuscript database

    Previous research reported resistance to mesotrione (MES) and other 4-hydroxyphenylpyruvate dioxygenase (HPPD)-inhibiting herbicides in waterhemp (Amaranthus tuberculatus). Experiments were conducted to quantify resistance levels to MES and isoxaflutole (IFT) in NEB (for Nebraska HPPD-resistant) and...

  12. When Remembering Causes Forgetting: Retrieval-Induced Forgetting as Recovery Failure

    ERIC Educational Resources Information Center

    Bauml, Karl-Heinz; Zellner, Martina; Vilimek, Roman

    2005-01-01

    Retrieval practice on a subset of previously learned material can cause forgetting of the unpracticed material and make it inaccessible to consciousness. Such inaccessibility may arise because the material is no longer sampled from the set of to-be-recalled items, or, though sampled, its representation is not complete enough to be recovered into…

  13. Quantifying the Effects of Biofilm on the Hydraulic Properties of Unsaturated Soils

    NASA Astrophysics Data System (ADS)

    Volk, E.; Iden, S.; Furman, A.; Durner, W.; Rosenzweig, R.

    2017-12-01

    Quantifying the effects of biofilms on hydraulic properties of unsaturated soils is necessary for predicting water and solute flow in soil with extensive microbial presence. This can be relevant to bioremediation processes, soil aquifer treatment and effluent irrigation. Previous works showed a reduction in the hydraulic conductivity and an increase in water content due to the addition of biofilm analogue materials. The objective of this research is to quantify soil hydraulic properties of unsaturated soil (water retention and hydraulic conductivity) using real soil biofilm. In this work, Hamra soil was incubated with Luria Broth (LB) and biofilm-producing bacteria (Pseudomonas putida F1). Hydraulic conductivity and water retention were measured by the evaporation method, the Dewpoint method and a constant head permeameter. Biofilm was quantified using viable counts and the deficit of TOC. The results show that the presence of biofilms increases soil retention in the 'dry' range of the curve and reduces the hydraulic conductivity. This research shows that biofilms may have a non-negligible effect on flow and transport in unsaturated soils. These findings contribute to modeling water flow in biofilm-amended soil.

  14. Chimpanzees (Pan troglodytes) and bonobos (Pan paniscus) quantify split solid objects.

    PubMed

    Cacchione, Trix; Hrubesch, Christine; Call, Josep

    2013-01-01

    Recent research suggests that gorillas' and orangutans' object representations survive cohesion violations (e.g., a split of a solid object into two halves), but that their processing of quantities may be affected by them. We assessed chimpanzees' (Pan troglodytes) and bonobos' (Pan paniscus) reactions to various fission events in the same series of action tasks modelled after infant studies previously run on gorillas and orangutans (Cacchione and Call in Cognition 116:193-203, 2010b). Results showed that all four non-human great ape species managed to quantify split objects but that their performance varied as a function of the non-cohesiveness produced in the splitting event. Spatial ambiguity and shape invariance had the greatest impact on apes' ability to represent and quantify objects. Further, we observed species differences with gorillas performing lower than other species. Finally, we detected a substantial age effect, with ape infants below 6 years of age being outperformed by both juvenile/adolescent and adult apes.

  15. Quantifying errors without random sampling.

    PubMed

    Phillips, Carl V; LaPole, Luwanna M

    2003-06-12

    All quantifications of mortality, morbidity, and other health measures involve numerous sources of error. The routine quantification of random sampling error makes it easy to forget that other sources of error can and should be quantified. When a quantification does not involve sampling, error is almost never quantified and results are often reported in ways that dramatically overstate their precision. We argue that the precision implicit in typical reporting is problematic and sketch methods for quantifying the various sources of error, building up from simple examples that can be solved analytically to more complex cases. There are straightforward ways to partially quantify the uncertainty surrounding a parameter that is not characterized by random sampling, such as limiting reported significant figures. We present simple methods for doing such quantifications, and for incorporating them into calculations. More complicated methods become necessary when multiple sources of uncertainty must be combined. We demonstrate that Monte Carlo simulation, using available software, can estimate the uncertainty resulting from complicated calculations with many sources of uncertainty. We apply the method to the current estimate of the annual incidence of foodborne illness in the United States. Quantifying uncertainty from systematic errors is practical. Reporting this uncertainty would more honestly represent study results, help show the probability that estimated values fall within some critical range, and facilitate better targeting of further research.
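    A minimal sketch of the Monte Carlo approach described above: judgment-based distributions are placed on the inputs of a simple calculation and propagated by simulation. The input distributions below are illustrative only, not those used for the foodborne-illness estimate.

```python
# Minimal sketch of Monte Carlo propagation of non-sampling (systematic)
# uncertainty through a simple calculation.
import numpy as np

rng = np.random.default_rng(1)
n = 100_000

# Suppose annual incidence = reported cases * under-reporting multiplier,
# with judgment-based uncertainty on both inputs (illustrative distributions).
reported = rng.normal(50_000, 5_000, n)        # reported cases (measurement error)
multiplier = rng.triangular(10, 25, 40, n)     # expert range for under-reporting

incidence = reported * multiplier
lo, mid, hi = np.percentile(incidence, [2.5, 50, 97.5])
print(f"median ~ {mid:,.0f}, 95% uncertainty interval ~ ({lo:,.0f}, {hi:,.0f})")
```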

  16. An immunological method for quantifying antibacterial activity in Salmo salar (Linnaeus, 1758) skin mucus.

    PubMed

    Narvaez, Edgar; Berendsen, Jorge; Guzmán, Fanny; Gallardo, José A; Mercado, Luis

    2010-01-01

    Antimicrobial peptides (AMPs) are a pivotal component of innate immunity in lower vertebrates. The aim of this study was to develop an immunological method for quantifying AMPs in Salmo salar skin mucus. A known antimicrobial peptide derived from histone H1 previously purified and described from S. salar skin mucus (SAMP H1) was chemically synthesized and used to obtain antibodies for the quantification of the molecule via ELISA. Using skin mucus samples, a correlation of bacterial growth inhibition versus SAMP H1 concentration (ELISA) was established. The results provide the first evidence for quantifying the presence of active AMPs in the skin mucus of S. salar through the use of an immunological method. Copyright 2009 Elsevier Ltd. All rights reserved.

  17. Exploring multicriteria decision strategies in GIS with linguistic quantifiers: A case study of residential quality evaluation

    NASA Astrophysics Data System (ADS)

    Malczewski, Jacek; Rinner, Claus

    2005-06-01

    Commonly used GIS combination operators such as Boolean conjunction/disjunction and weighted linear combination can be generalized to the ordered weighted averaging (OWA) family of operators. This multicriteria evaluation method allows decision-makers to define a decision strategy on a continuum between pessimistic and optimistic strategies. Recently, OWA has been introduced to GIS-based decision support systems. We propose to extend a previous implementation of OWA with linguistic quantifiers to simplify the definition of decision strategies and to facilitate an exploratory analysis of multiple criteria. The linguistic quantifier-guided OWA procedure is illustrated using a dataset for evaluating residential quality of neighborhoods in London, Ontario.
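    The sketch below illustrates quantifier-guided OWA aggregation using Yager's RIM quantifier Q(r) = r^alpha, which traces out decision strategies between optimistic and pessimistic extremes; the criterion scores and alpha values are illustrative, and the published implementation may parameterize the linguistic quantifiers differently.

```python
# Minimal sketch of quantifier-guided OWA aggregation with a RIM quantifier
# Q(r) = r**alpha: alpha < 1 leans optimistic ("a few criteria suffice"),
# alpha > 1 leans pessimistic ("most criteria must be good").
def owa(scores, alpha):
    n = len(scores)
    ordered = sorted(scores, reverse=True)                 # descending order
    weights = [(i / n) ** alpha - ((i - 1) / n) ** alpha for i in range(1, n + 1)]
    return sum(w * s for w, s in zip(weights, ordered))

neighborhood = [0.9, 0.7, 0.4, 0.2]   # illustrative standardized criterion scores (0-1)
for alpha, label in [(0.5, "'a few' (optimistic)"),
                     (1.0, "'half' (neutral, equal weights)"),
                     (2.0, "'most' (pessimistic)")]:
    print(f"{label}: {owa(neighborhood, alpha):.3f}")
```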

  18. Talker-specificity and adaptation in quantifier interpretation

    PubMed Central

    Yildirim, Ilker; Degen, Judith; Tanenhaus, Michael K.; Jaeger, T. Florian

    2015-01-01

    Linguistic meaning has long been recognized to be highly context-dependent. Quantifiers like many and some provide a particularly clear example of context-dependence. For example, the interpretation of quantifiers requires listeners to determine the relevant domain and scale. We focus on another type of context-dependence that quantifiers share with other lexical items: talker variability. Different talkers might use quantifiers with different interpretations in mind. We used a web-based crowdsourcing paradigm to study participants’ expectations about the use of many and some based on recent exposure. We first established that the mapping of some and many onto quantities (candies in a bowl) is variable both within and between participants. We then examined whether and how listeners’ expectations about quantifier use adapts with exposure to talkers who use quantifiers in different ways. The results demonstrate that listeners can adapt to talker-specific biases in both how often and with what intended meaning many and some are used. PMID:26858511

  19. Quantifying global dust devil occurrence from meteorological analyses

    PubMed Central

    Jemmett-Smith, Bradley C; Marsham, John H; Knippertz, Peter; Gilkeson, Carl A

    2015-01-01

    Dust devils and nonrotating dusty plumes are effective uplift mechanisms for fine particles, but their contribution to the global dust budget is uncertain. By applying known bulk thermodynamic criteria to European Centre for Medium-Range Weather Forecasts (ECMWF) operational analyses, we provide the first global hourly climatology of potential dust devil and dusty plume (PDDP) occurrence. In agreement with observations, activity is highest from late morning into the afternoon. Combining PDDP frequencies with dust source maps and typical emission values gives the best estimate of global contributions of 3.4% (uncertainty 0.9–31%), 1 order of magnitude lower than the only estimate previously published. Total global hours of dust uplift by dry convection are ∼0.002% of the dust-lifting winds resolved by ECMWF, consistent with dry convection making a small contribution to global uplift. Reducing uncertainty requires better knowledge of factors controlling PDDP occurrence, source regions, and dust fluxes induced by dry convection. Key Points: (1) global potential dust devil occurrence is quantified from meteorological analyses; (2) the climatology shows a realistic diurnal cycle and geographical distribution; (3) the best estimate of the global contribution (3.4%) is 10 times smaller than the previous estimate. PMID:26681815

  20. Comparison of methods for quantifying surface sublimation over seasonally snow-covered terrain

    USGS Publications Warehouse

    Sexstone, Graham A.; Clow, David W.; Stannard, David I.; Fassnacht, Steven R.

    2016-01-01

    Snow sublimation can be an important component of the snow-cover mass balance, and there is considerable interest in quantifying the role of this process within the water and energy balance of snow-covered regions. In recent years, robust eddy covariance (EC) instrumentation has been used to quantify snow sublimation over snow-covered surfaces in complex mountainous terrain. However, EC can be challenging for monitoring turbulent fluxes in snow-covered environments because of intensive data, power, and fetch requirements, and alternative methods of estimating snow sublimation are often relied upon. To evaluate the relative merits of methods for quantifying surface sublimation, fluxes calculated by the EC, Bowen ratio–energy balance (BR), bulk aerodynamic flux (BF), and aerodynamic profile (AP) methods and their associated uncertainty were compared at two forested openings in the Colorado Rocky Mountains. Biases between methods are evaluated over a range of environmental conditions, and limitations of each method are discussed. Mean surface sublimation rates from both sites ranged from 0.33 to 0.36 mm day−1, 0.14 to 0.37 mm day−1, 0.10 to 0.17 mm day−1, and 0.03 to 0.10 mm day−1 for the EC, BR, BF and AP methods, respectively. The EC and/or BF methods are concluded to be superior for estimating surface sublimation in snow-covered forested openings. The surface sublimation rates quantified in this study are generally smaller in magnitude compared with previously published studies in this region and help to refine sublimation estimates for forested openings in the Colorado Rocky Mountains.
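    For reference, the bulk aerodynamic (BF-type) estimate can be sketched as E = rho_a * C_E * U * (q_s - q_a); the transfer coefficient, air pressure, and meteorological values below are assumed for illustration and are not the study's site data or exact formulation.

```python
# Minimal sketch of a bulk aerodynamic estimate of surface sublimation.
RHO_AIR = 1.1          # air density, kg m^-3 (assumed)
C_E = 1.5e-3           # bulk transfer coefficient for moisture (assumed)

def saturation_vapour_pressure_ice(t_c):
    """Saturation vapour pressure over ice (Pa), Magnus-type formula."""
    return 611.15 * 10 ** (9.5 * t_c / (265.5 + t_c))

def specific_humidity(e_pa, p_pa=70_000.0):
    """Specific humidity from vapour pressure and air pressure."""
    return 0.622 * e_pa / (p_pa - 0.378 * e_pa)

# Snow surface at -5 C (saturated w.r.t. ice), air at -3 C with 60% RH, wind 3 m/s.
q_surf = specific_humidity(saturation_vapour_pressure_ice(-5.0))
q_air = specific_humidity(0.60 * saturation_vapour_pressure_ice(-3.0))
flux = RHO_AIR * C_E * 3.0 * (q_surf - q_air)   # kg m^-2 s^-1 (= mm s^-1 of water)
print(f"sublimation ~ {flux * 86400:.2f} mm/day")
```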

  1. Quantifying arm nonuse in individuals poststroke.

    PubMed

    Han, Cheol E; Kim, Sujin; Chen, Shuya; Lai, Yi-Hsuan; Lee, Jeong-Yoon; Osu, Rieko; Winstein, Carolee J; Schweighofer, Nicolas

    2013-06-01

    Arm nonuse, defined as the difference between what the individual can do when constrained to use the paretic arm and what the individual does when given a free choice to use either arm, has not yet been quantified in individuals poststroke. The objectives were (1) to quantify nonuse poststroke and (2) to develop and test a novel, simple, objective, reliable, and valid instrument, the Bilateral Arm Reaching Test (BART), to quantify arm use and nonuse poststroke. First, we quantify nonuse with the Quality of Movement (QOM) subscale of the Actual Amount of Use Test (AAUT) by subtracting the AAUT QOM score in the spontaneous use condition from the AAUT QOM score in a subsequent constrained use condition. Second, we quantify arm use and nonuse with BART by comparing reaching performance to visual targets projected over a 2D horizontal hemi-workspace in a spontaneous-use condition (in which participants are free to use either arm at each trial) with reaching performance in a constrained-use condition. All participants (N = 24) with chronic stroke and with mild to moderate impairment exhibited nonuse with the AAUT QOM. Nonuse with BART had excellent test-retest reliability and good external validity. BART is the first instrument that can be used repeatedly and practically in the clinic to quantify the effects of neurorehabilitation on arm use and nonuse and in the laboratory for advancing theoretical knowledge about the recovery of arm use and the development of nonuse and "learned nonuse" after stroke.
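    A minimal sketch of the AAUT-based nonuse score defined above (constrained-use QOM minus spontaneous-use QOM); the scores are hypothetical.

```python
# Minimal sketch of the nonuse operationalization described above. Scores are
# hypothetical AAUT Quality of Movement (QOM) values.
def aaut_nonuse(qom_constrained, qom_spontaneous):
    """Nonuse = what the paretic arm can do minus what it actually does."""
    return qom_constrained - qom_spontaneous

participants = [(3.4, 2.1), (4.0, 3.7), (2.8, 1.5)]   # (constrained, spontaneous) QOM
scores = [aaut_nonuse(c, s) for c, s in participants]
print("nonuse scores:", scores, "mean:", round(sum(scores) / len(scores), 2))
```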

  2. Quantifying the influence of previously burned areas on suppression effectiveness and avoided exposure: A case study of the Las Conchas Fire

    Treesearch

    Matthew P. Thompson; Patrick Freeborn; Jon D. Rieck; Dave Calkin; Julie W. Gilbertson-Day; Mark A. Cochrane; Michael S. Hand

    2016-01-01

    We present a case study of the Las Conchas Fire (2011) to explore the role of previously burned areas (wildfires and prescribed fires) on suppression effectiveness and avoided exposure. Methodological innovations include characterisation of the joint dynamics of fire growth and suppression activities, development of a fire line effectiveness framework, and...

  3. Quantifying risks with exact analytical solutions of derivative pricing distribution

    NASA Astrophysics Data System (ADS)

    Zhang, Kun; Liu, Jing; Wang, Erkang; Wang, Jin

    2017-04-01

    Derivative (i.e. option) pricing is essential for modern financial instruments. Despite previous efforts, the exact analytical forms of the derivative pricing distributions are still challenging to obtain. In this study, we established a quantitative framework using path integrals to obtain the exact analytical solutions of the statistical distribution for bond and bond option pricing for the Vasicek model. We discuss the importance of statistical fluctuations away from the expected option pricing characterized by the distribution tail and their association with value at risk (VaR). The framework established here is general and can be applied to other financial derivatives for quantifying the underlying statistical distributions.
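    For context, the sketch below evaluates the textbook closed-form Vasicek zero-coupon bond price, the quantity whose full pricing distribution the record characterizes; it does not reproduce the path-integral derivation, and the parameters are illustrative.

```python
# Minimal sketch (textbook result, not the paper's derivation): closed-form
# Vasicek zero-coupon bond price under dr = a(b - r) dt + sigma dW.
import math

def vasicek_bond_price(r0, tau, a=0.15, b=0.05, sigma=0.01):
    """Price P(0, tau) of a zero-coupon bond maturing at time tau."""
    B = (1.0 - math.exp(-a * tau)) / a
    A = math.exp((b - sigma**2 / (2 * a**2)) * (B - tau) - sigma**2 * B**2 / (4 * a))
    return A * math.exp(-B * r0)

price = vasicek_bond_price(r0=0.03, tau=5.0)
print(f"5-year zero-coupon price: {price:.4f}, yield: {-math.log(price) / 5:.4%}")
```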

  4. Processing of Numerical and Proportional Quantifiers

    ERIC Educational Resources Information Center

    Shikhare, Sailee; Heim, Stefan; Klein, Elise; Huber, Stefan; Willmes, Klaus

    2015-01-01

    Quantifier expressions like "many" and "at least" are part of a rich repository of words in language representing magnitude information. The role of numerical processing in comprehending quantifiers was studied in a semantic truth value judgment task, asking adults to quickly verify sentences about visual displays using…

  5. Scalar Quantifiers: Logic, Acquisition, and Processing

    ERIC Educational Resources Information Center

    Geurts, Bart; Katsos, Napoleon; Cummins, Chris; Moons, Jonas; Noordman, Leo

    2010-01-01

    Superlative quantifiers ("at least 3", "at most 3") and comparative quantifiers ("more than 2", "fewer than 4") are traditionally taken to be interdefinable: the received view is that "at least n" and "at most n" are equivalent to "more than n-1" and "fewer than n+1",…

  6. Investigation of Flood Risk Assessment in Inaccessible Regions using Multiple Remote Sensing and Geographic Information Systems

    NASA Astrophysics Data System (ADS)

    Lim, J.; Lee, K. S.

    2017-12-01

    Flooding is extremely dangerous when a river overflows and inundates an urban area. From 1995 to 2016, North Korea (NK) experienced extensive damage to life and property almost every year due to levee breaches resulting from typhoons and heavy rainfall during the summer monsoon season. Recently, Hoeryeong City (2016) experienced heavy rainfall during typhoon Lionrock, and the resulting flood killed or injured many people (68,900) and destroyed numerous buildings and settlements (11,600). The NK state media described it as the biggest national disaster since 1945. The nearly annual recurrence of damaging floods in NK makes it necessary to determine the extent of flooding in order to restore the damaged environment. In addition, traditional hydrological models are impractical for delineating Flood Damaged Areas (FDAs) in NK because of the region's inaccessibility. Under such circumstances, multiple optical Remote Sensing (RS) and radar RS data along with a Geographic Information System (GIS)-based spatial analysis were utilized in this study (1) to develop FDA delineation modelling using multiple RS and GIS methods and (2) to conduct a flood risk assessment in NK. Interpretation of high-resolution web-based satellite imagery was also used to confirm the results of the study. The study found that (1) on August 30th, 2016, an area of 117.2 km2 (8.6%) of Hoeryeong City was inundated, with most flooding occurring in flat areas along lower- and middle-order streams. (2) In the binary logistic regression model applied in this study, the distance from the nearest stream and the landform map are important variables for delineating FDAs, because these two factors reflect the heterogeneous mountainous topography of NK. (3) The total annual flood risk of the study area is estimated at ₩454.13 million NKW ($504,417.24 USD; ₩576.53 million SKW), and the risk at the confluence of the Tumen River and Hoeryeong stream appears to be the highest. (4) High resolution
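    A hedged sketch of the binary logistic regression step follows, with synthetic cells described by distance to the nearest stream and slope standing in for the study's GIS-derived predictors; the coefficients and data are illustrative only.

```python
# Minimal sketch (illustrative data, not the study's) of logistic regression for
# classifying flood-damaged vs undamaged cells from GIS-derived predictors.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 2000
dist_to_stream = rng.uniform(0, 3000, n)     # metres
slope = rng.uniform(0, 30, n)                # degrees
# Synthetic "truth": flooding more likely near streams and on flat ground.
logit = 2.0 - 0.003 * dist_to_stream - 0.15 * slope
flooded = rng.random(n) < 1 / (1 + np.exp(-logit))

X = np.column_stack([dist_to_stream, slope])
model = LogisticRegression().fit(X, flooded)
print("coefficients (distance, slope):", model.coef_[0])
print("P(flooded) 100 m from stream on flat ground:",
      model.predict_proba([[100.0, 1.0]])[0, 1].round(2))
```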

  7. Quantifying the uncertainty in heritability.

    PubMed

    Furlotte, Nicholas A; Heckerman, David; Lippert, Christoph

    2014-05-01

    The use of mixed models to determine narrow-sense heritability and related quantities such as SNP heritability has received much recent attention. Less attention has been paid to the inherent variability in these estimates. One approach for quantifying variability in estimates of heritability is a frequentist approach, in which heritability is estimated using maximum likelihood and its variance is quantified through an asymptotic normal approximation. An alternative approach is to quantify the uncertainty in heritability through its Bayesian posterior distribution. In this paper, we develop the latter approach, make it computationally efficient and compare it to the frequentist approach. We show theoretically that, for a sufficiently large sample size and intermediate values of heritability, the two approaches provide similar results. Using the Atherosclerosis Risk in Communities cohort, we show empirically that the two approaches can give different results and that the variance/uncertainty can remain large.
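    As a toy illustration of the Bayesian route described above, the sketch below computes a grid posterior over narrow-sense heritability for a zero-mean linear mixed model, fixing the total phenotypic variance for simplicity; the relatedness matrix and data are simulated, and the paper's efficient implementation is not reproduced.

```python
# Minimal sketch (toy data): grid posterior over heritability h2 for
# y ~ N(0, sigma_p^2 * (h2 * K + (1 - h2) * I)) with a flat prior on h2.
import numpy as np
from scipy.stats import multivariate_normal

rng = np.random.default_rng(3)
n = 200
A = rng.normal(size=(n, n // 2))
K = A @ A.T / (n // 2)                       # toy genetic relatedness matrix (PSD)

h2_true, sigma_p2 = 0.6, 1.0                 # sigma_p2 assumed known for simplicity
cov_true = sigma_p2 * (h2_true * K + (1 - h2_true) * np.eye(n))
y = rng.multivariate_normal(np.zeros(n), cov_true)

h2_grid = np.linspace(0.01, 0.99, 99)
logpost = np.array([multivariate_normal.logpdf(y, cov=sigma_p2 * (h * K + (1 - h) * np.eye(n)))
                    for h in h2_grid])
post = np.exp(logpost - logpost.max())
post /= post.sum()
mean = (h2_grid * post).sum()
sd = np.sqrt(((h2_grid - mean) ** 2 * post).sum())
print(f"posterior mean h2 ~ {mean:.2f} +/- {sd:.2f}")
```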

  8. Quantifying the uncertainty in heritability

    PubMed Central

    Furlotte, Nicholas A; Heckerman, David; Lippert, Christoph

    2014-01-01

    The use of mixed models to determine narrow-sense heritability and related quantities such as SNP heritability has received much recent attention. Less attention has been paid to the inherent variability in these estimates. One approach for quantifying variability in estimates of heritability is a frequentist approach, in which heritability is estimated using maximum likelihood and its variance is quantified through an asymptotic normal approximation. An alternative approach is to quantify the uncertainty in heritability through its Bayesian posterior distribution. In this paper, we develop the latter approach, make it computationally efficient and compare it to the frequentist approach. We show theoretically that, for a sufficiently large sample size and intermediate values of heritability, the two approaches provide similar results. Using the Atherosclerosis Risk in Communities cohort, we show empirically that the two approaches can give different results and that the variance/uncertainty can remain large. PMID:24670270

  9. Quantifying ubiquitin signaling.

    PubMed

    Ordureau, Alban; Münch, Christian; Harper, J Wade

    2015-05-21

    Ubiquitin (UB)-driven signaling systems permeate biology, and are often integrated with other types of post-translational modifications (PTMs), including phosphorylation. Flux through such pathways is dictated by the fractional stoichiometry of distinct modifications and protein assemblies as well as the spatial organization of pathway components. Yet, we rarely understand the dynamics and stoichiometry of rate-limiting intermediates along a reaction trajectory. Here, we review how quantitative proteomic tools and enrichment strategies are being used to quantify UB-dependent signaling systems, and to integrate UB signaling with regulatory phosphorylation events, illustrated with the PINK1/PARKIN pathway. A key feature of ubiquitylation is that the identity of UB chain linkage types can control downstream processes. We also describe how proteomic and enzymological tools can be used to identify and quantify UB chain synthesis and linkage preferences. The emergence of sophisticated quantitative proteomic approaches will set a new standard for elucidating biochemical mechanisms of UB-driven signaling systems. Copyright © 2015 Elsevier Inc. All rights reserved.

  10. Quantifying Ubiquitin Signaling

    PubMed Central

    Ordureau, Alban; Münch, Christian; Harper, J. Wade

    2015-01-01

    Ubiquitin (UB)-driven signaling systems permeate biology, and are often integrated with other types of post-translational modifications (PTMs), most notably phosphorylation. Flux through such pathways is typically dictated by the fractional stoichiometry of distinct regulatory modifications and protein assemblies as well as the spatial organization of pathway components. Yet, we rarely understand the dynamics and stoichiometry of rate-limiting intermediates along a reaction trajectory. Here, we review how quantitative proteomic tools and enrichment strategies are being used to quantify UB-dependent signaling systems, and to integrate UB signaling with regulatory phosphorylation events. A key regulatory feature of ubiquitylation is that the identity of UB chain linkage types can control downstream processes. We also describe how proteomic and enzymological tools can be used to identify and quantify UB chain synthesis and linkage preferences. The emergence of sophisticated quantitative proteomic approaches will set a new standard for elucidating biochemical mechanisms of UB-driven signaling systems. PMID:26000850

  11. Using ultrasound to quantify tongue shape and movement characteristics.

    PubMed

    Zharkova, Natalia

    2013-01-01

    Objective: Previous experimental studies have demonstrated abnormal lingual articulatory patterns characterizing cleft palate speech. Most articulatory information to date has been collected using electropalatography, which records the location and size of tongue-palate contact but not the tongue shape. The latter type of data can be provided by ultrasound. The present paper aims to describe ultrasound tongue imaging as a potential tool for quantitative analysis of tongue function in speakers with cleft palate. A description of the ultrasound technique as applied to analyzing tongue movements is given, followed by the requirements for quantitative analysis. Several measures are described, and example calculations are provided. Measures: Two measures aim to quantify overuse of tongue dorsum in cleft palate articulations. Crucially for potential clinical applications, these measures do not require head-to-transducer stabilization because both are based on a single tongue curve. The other three measures compare sets of tongue curves, with the aim to quantify the dynamics of tongue displacement, token-to-token variability in tongue position, and the extent of separation between tongue curves for different speech sounds. Conclusions: All measures can be used to compare tongue function in speakers with cleft palate before and after therapy, as well as to assess their performance against that in typical speakers and to help in selecting more effective treatments.

  12. Quantifying renewable groundwater stress with GRACE

    NASA Astrophysics Data System (ADS)

    Richey, Alexandra S.; Thomas, Brian F.; Lo, Min-Hui; Reager, John T.; Famiglietti, James S.; Voss, Katalyn; Swenson, Sean; Rodell, Matthew

    2015-07-01

    Groundwater is an increasingly important water supply source globally. Understanding the amount of groundwater used versus the volume available is crucial to evaluate future water availability. We present a groundwater stress assessment to quantify the relationship between groundwater use and availability in the world's 37 largest aquifer systems. We quantify stress according to a ratio of groundwater use to availability, which we call the Renewable Groundwater Stress ratio. The impact of quantifying groundwater use based on nationally reported groundwater withdrawal statistics is compared to a novel approach to quantify use based on remote sensing observations from the Gravity Recovery and Climate Experiment (GRACE) satellite mission. Four characteristic stress regimes are defined: Overstressed, Variable Stress, Human-dominated Stress, and Unstressed. The regimes are a function of the sign of use (positive or negative) and the sign of groundwater availability, defined as mean annual recharge. The ability to mitigate and adapt to stressed conditions, where use exceeds sustainable water availability, is a function of economic capacity and land use patterns. Therefore, we qualitatively explore the relationship between stress and anthropogenic biomes. We find that estimates of groundwater stress based on withdrawal statistics are unable to capture the range of characteristic stress regimes, especially in regions dominated by sparsely populated biome types with limited cropland. GRACE-based estimates of use and stress can holistically quantify the impact of groundwater use on stress, resulting in both greater magnitudes of stress and more variability of stress between regions.
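    A minimal sketch of the Renewable Groundwater Stress ratio follows, with one simplified reading of the GRACE-based use estimate (use approximated as recharge minus the observed storage trend); the numbers are illustrative, not values from the study.

```python
# Minimal sketch of the Renewable Groundwater Stress ratio (use / availability),
# under a simplified reading of the GRACE-based use estimate.
def rgs(use_mm_yr, recharge_mm_yr):
    """Renewable Groundwater Stress ratio = groundwater use / mean annual recharge."""
    return use_mm_yr / recharge_mm_yr

def grace_based_use(recharge_mm_yr, storage_trend_mm_yr):
    """If storage is declining (negative trend), use exceeds recharge."""
    return recharge_mm_yr - storage_trend_mm_yr

recharge = 20.0      # mm/yr, assumed mean annual recharge
trend = -15.0        # mm/yr, hypothetical GRACE groundwater storage trend
use = grace_based_use(recharge, trend)
print(f"use ~ {use:.0f} mm/yr, RGS ~ {rgs(use, recharge):.2f}  (>1 suggests overstressed)")
```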

  13. Quantifying renewable groundwater stress with GRACE

    PubMed Central

    Richey, Alexandra S.; Thomas, Brian F.; Lo, Min‐Hui; Reager, John T.; Voss, Katalyn; Swenson, Sean; Rodell, Matthew

    2015-01-01

    Abstract Groundwater is an increasingly important water supply source globally. Understanding the amount of groundwater used versus the volume available is crucial to evaluate future water availability. We present a groundwater stress assessment to quantify the relationship between groundwater use and availability in the world's 37 largest aquifer systems. We quantify stress according to a ratio of groundwater use to availability, which we call the Renewable Groundwater Stress ratio. The impact of quantifying groundwater use based on nationally reported groundwater withdrawal statistics is compared to a novel approach to quantify use based on remote sensing observations from the Gravity Recovery and Climate Experiment (GRACE) satellite mission. Four characteristic stress regimes are defined: Overstressed, Variable Stress, Human‐dominated Stress, and Unstressed. The regimes are a function of the sign of use (positive or negative) and the sign of groundwater availability, defined as mean annual recharge. The ability to mitigate and adapt to stressed conditions, where use exceeds sustainable water availability, is a function of economic capacity and land use patterns. Therefore, we qualitatively explore the relationship between stress and anthropogenic biomes. We find that estimates of groundwater stress based on withdrawal statistics are unable to capture the range of characteristic stress regimes, especially in regions dominated by sparsely populated biome types with limited cropland. GRACE‐based estimates of use and stress can holistically quantify the impact of groundwater use on stress, resulting in both greater magnitudes of stress and more variability of stress between regions. PMID:26900185

  14. Quantifying Qualitative Learning.

    ERIC Educational Resources Information Center

    Bogus, Barbara

    1995-01-01

    A teacher at an alternative school for at-risk students discusses the development of student assessment that increases students' self-esteem, convinces students that learning is fun, and prepares students to return to traditional school settings. She found that allowing students to participate in the assessment process successfully quantified the…

  15. Loschmidt echo as a robust decoherence quantifier for many-body systems

    NASA Astrophysics Data System (ADS)

    Zangara, Pablo R.; Dente, Axel D.; Levstein, Patricia R.; Pastawski, Horacio M.

    2012-07-01

    We employ the Loschmidt echo, i.e., the signal recovered after the reversal of an evolution, to identify and quantify the processes contributing to decoherence. This procedure, which has been extensively used in single-particle physics, is employed here in a spin ladder. The isolated chains have 1/2 spins with XY interaction and their excitations would sustain a one-body-like propagation. One of them constitutes the controlled system S whose reversible dynamics is degraded by the weak coupling with the uncontrolled second chain, i.e., the environment E. The perturbative SE coupling is swept through arbitrary combinations of XY and Ising-like interactions, that contain the standard Heisenberg and dipolar ones. Different time regimes are identified for the Loschmidt echo dynamics in this perturbative configuration. In particular, the exponential decay scales as a Fermi golden rule, where the contributions of the different SE terms are individually evaluated and analyzed. Comparisons with previous analytical and numerical evaluations of decoherence based on the attenuation of specific interferences show that the Loschmidt echo is an advantageous decoherence quantifier at any time, regardless of the S internal dynamics.
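    A toy numerical illustration of the quantity named above: the Loschmidt echo M(t) = |⟨ψ₀| U₀†(t) U(t) |ψ₀⟩|², where the forward evolution includes a weak perturbation Σ and the reversed reference evolution does not. The three-spin chain, couplings, and initial state below are assumptions for illustration, not the spin-ladder system of the paper.

    ```python
    import numpy as np
    from scipy.linalg import expm

    # Spin-1/2 operators
    sx = np.array([[0, 1], [1, 0]], dtype=complex) / 2
    sy = np.array([[0, -1j], [1j, 0]], dtype=complex) / 2
    sz = np.array([[1, 0], [0, -1]], dtype=complex) / 2
    I2 = np.eye(2, dtype=complex)

    def embed(op, site, n):
        """Place a single-spin operator at `site` in an n-spin Hilbert space."""
        out = np.array([[1.0 + 0j]])
        for k in range(n):
            out = np.kron(out, op if k == site else I2)
        return out

    n = 3
    J, eps = 1.0, 0.1  # XY coupling and weak Ising-like perturbation strength
    H0 = sum(J * (embed(sx, k, n) @ embed(sx, k + 1, n)
                  + embed(sy, k, n) @ embed(sy, k + 1, n)) for k in range(n - 1))
    Sigma = sum(eps * embed(sz, k, n) @ embed(sz, k + 1, n) for k in range(n - 1))

    psi0 = np.zeros(2 ** n, dtype=complex)
    psi0[0] = 1.0  # all spins up

    for t in np.linspace(0.0, 5.0, 6):
        U = expm(-1j * (H0 + Sigma) * t)   # forward evolution (with perturbation)
        U0 = expm(-1j * H0 * t)            # "reversed" reference evolution
        amplitude = psi0.conj() @ (U0.conj().T @ (U @ psi0))
        print(f"t = {t:4.1f}   M(t) = {abs(amplitude) ** 2:.4f}")
    ```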

  16. Quantifiers are incrementally interpreted in context, more than less

    PubMed Central

    Urbach, Thomas P.; DeLong, Katherine A.; Kutas, Marta

    2015-01-01

    Language interpretation is often assumed to be incremental. However, our studies of quantifier expressions in isolated sentences found N400 event-related brain potential (ERP) evidence for partial but not full immediate quantifier interpretation (Urbach & Kutas, 2010). Here we tested similar quantifier expressions in pragmatically supporting discourse contexts (Alex was an unusual toddler. Most/Few kids prefer sweets/vegetables…) while participants made plausibility judgments (Experiment 1) or read for comprehension (Experiment 2). Control Experiments 3A (plausibility) and 3B (comprehension) removed the discourse contexts. Quantifiers always modulated typical and/or atypical word N400 amplitudes. However, only the real-time N400 effects in Experiment 2 mirrored the offline quantifier and typicality crossover interaction effects for plausibility ratings and cloze probabilities. We conclude that quantifier expressions can be interpreted fully and immediately, though pragmatic and task variables appear to impact the speed and/or depth of quantifier interpretation. PMID:26005285

  17. Quantifying evenly distributed states in exclusion and nonexclusion processes

    NASA Astrophysics Data System (ADS)

    Binder, Benjamin J.; Landman, Kerry A.

    2011-04-01

    Spatial-point data sets, generated from a wide range of physical systems and mathematical models, can be analyzed by counting the number of objects in equally sized bins. We find that the bin counts are related to the Pólya distribution. New measures are developed which indicate whether or not a spatial data set, generated from an exclusion process, is at its most evenly distributed state, the complete spatial randomness (CSR) state. To this end, we define an index in terms of the variance between the bin counts. Limiting values of the index are determined when objects have access to the entire domain and when there are subregions of the domain that are inaccessible to objects. Using three case studies (Lagrangian fluid particles in chaotic laminar flows, cellular automata agents in discrete models, and biological cells within colonies), we calculate the indexes and verify that our theoretical CSR limit accurately predicts the state of the system. These measures should prove useful in many biological applications.
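    A minimal sketch of the bin-count idea described above, with a simple variance-based index; the paper's exact index definition and limiting values are not reproduced, so treat the normalization below as an assumption for illustration.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def evenness_index(positions, n_bins, domain=(0.0, 1.0)):
        """Variance of bin counts relative to a binomial (CSR-like) expectation."""
        counts, _ = np.histogram(positions, bins=n_bins, range=domain)
        n = counts.sum()
        p = 1.0 / n_bins
        csr_var = n * p * (1 - p)          # variance of a bin count under CSR
        return counts.var() / csr_var      # ~1 near CSR, <1 more even, >1 clustered

    uniform_pts = rng.random(500)                      # close to CSR
    clustered_pts = rng.normal(0.5, 0.05, 500) % 1.0   # strongly clustered
    print("uniform  :", round(evenness_index(uniform_pts, 20), 2))
    print("clustered:", round(evenness_index(clustered_pts, 20), 2))
    ```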

  18. Quantifying construction and demolition waste: An analytical review

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wu, Zezhou; Yu, Ann T.W., E-mail: bsannyu@polyu.edu.hk; Shen, Liyin

    2014-09-15

    Highlights: • Prevailing C and D waste quantification methodologies are identified and compared. • One specific methodology cannot fulfill all waste quantification scenarios. • A relevance tree for appropriate quantification methodology selection is proposed. • More attention should be paid to civil and infrastructural works. • Classified information is suggested for making an effective waste management plan. - Abstract: Quantifying construction and demolition (C and D) waste generation is regarded as a prerequisite for the implementation of successful waste management. In the literature, various methods have been employed to quantify the C and D waste generation at both regional and project levels. However, an integrated review that systemically describes and analyses all the existing methods has yet to be conducted. To bridge this research gap, an analytical review is conducted. Fifty-seven papers are retrieved based on a set of rigorous procedures. The characteristics of the selected papers are classified according to the following criteria - waste generation activity, estimation level and quantification methodology. Six categories of existing C and D waste quantification methodologies are identified, including site visit method, waste generation rate method, lifetime analysis method, classification system accumulation method, variables modelling method and other particular methods. A critical comparison of the identified methods is given according to their characteristics and implementation constraints. Moreover, a decision tree is proposed for aiding the selection of the most appropriate quantification method in different scenarios. Based on the analytical review, limitations of previous studies and recommendations of potential future research directions are further suggested.
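    As a concrete illustration of one of the categories named above, the following sketch applies a per-floor-area waste generation rate; the rates are hypothetical placeholders, not values from the review.

    ```python
    # Waste generation rate method: waste = gross floor area x per-area rate,
    # broken down by material. Rates below are illustrative only.
    generation_rates_kg_per_m2 = {
        "concrete": 18.0,
        "brick/block": 9.5,
        "timber": 3.2,
        "metal": 1.1,
    }

    def estimate_cd_waste(gross_floor_area_m2, rates=generation_rates_kg_per_m2):
        """Return estimated waste per material (tonnes) and the total."""
        per_material = {m: gross_floor_area_m2 * r / 1000.0 for m, r in rates.items()}
        return per_material, sum(per_material.values())

    breakdown, total_t = estimate_cd_waste(12_000)
    for material, tonnes in breakdown.items():
        print(f"{material:12s} {tonnes:7.1f} t")
    print(f"{'total':12s} {total_t:7.1f} t")
    ```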

  19. Quantifying Evaporation in a Permeable Pavement System

    EPA Science Inventory

    Studies quantifying evaporation from permeable pavement systems are limited to a few laboratory studies and one field application. This research quantifies evaporation for a larger-scale field application by measuring the water balance from lined permeable pavement sections. Th...

  20. Two complementary approaches to quantify variability in heat resistance of spores of Bacillus subtilis.

    PubMed

    den Besten, Heidy M W; Berendsen, Erwin M; Wells-Bennik, Marjon H J; Straatsma, Han; Zwietering, Marcel H

    2017-07-17

    Realistic prediction of microbial inactivation in food requires quantitative information on variability introduced by the microorganisms. Bacillus subtilis forms heat resistant spores and in this study the impact of strain variability on spore heat resistance was quantified using 20 strains. In addition, experimental variability was quantified by using technical replicates per heat treatment experiment, and reproduction variability was quantified by using two biologically independent spore crops for each strain that were heat treated on different days. The fourth-decimal reduction times and z-values were estimated by a one-step and two-step model fitting procedure. Grouping of the 20 B. subtilis strains into two statistically distinguishable groups could be confirmed based on their spore heat resistance. The reproduction variability was higher than experimental variability, but both variabilities were much lower than strain variability. The model fitting approach did not significantly affect the quantification of variability. Remarkably, when strain variability in spore heat resistance was quantified using only the strains producing low-level heat resistant spores, this strain variability was comparable with the previously reported strain variability in heat resistance of vegetative cells of Listeria monocytogenes, although in a totally different temperature range. Strains that produced spores with high-level heat resistance showed a temperature range for growth similar to that of strains that produced spores with low-level heat resistance. Strain variability affected heat resistance of spores most, and therefore integration of this variability factor in modelling of spore heat resistance will make predictions more realistic. Copyright © 2017. Published by Elsevier B.V.
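    A minimal two-step fitting sketch with synthetic survivor data, in the spirit of the procedure described above: log-linear fits give a decimal reduction time D at each temperature, and the slope of log10(D) versus temperature gives the z-value. The counts, temperatures, and noise levels are invented for illustration.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    true_D = {100: 12.0, 105: 4.0, 110: 1.3}   # minutes, hypothetical
    times = np.linspace(0, 20, 9)              # heating times (min)

    log_D = {}
    for T, D in true_D.items():
        log_n = 6.0 - times / D + rng.normal(0, 0.05, times.size)  # log10 survivors
        slope, _ = np.polyfit(times, log_n, 1)                      # slope = -1/D
        log_D[T] = np.log10(-1.0 / slope)

    temps = np.array(sorted(log_D))
    slope_z, _ = np.polyfit(temps, [log_D[T] for T in temps], 1)    # slope = -1/z
    print("fitted D-values (min):", {T: round(10 ** v, 2) for T, v in log_D.items()})
    print("z-value (deg C):", round(-1.0 / slope_z, 2))
    ```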

  1. A compact clinical instrument for quantifying suppression.

    PubMed

    Black, Joanne M; Thompson, Benjamin; Maehara, Goro; Hess, Robert F

    2011-02-01

    We describe a compact and convenient clinical apparatus for the measurement of suppression based on a previously reported laboratory-based approach. In addition, we report and validate a novel, rapid psychophysical method for measuring suppression using this apparatus, which makes the technique more applicable to clinical practice. By using a Z800 dual pro head-mounted display driven by a Mac laptop, we provide dichoptic stimulation. Global motion stimuli composed of arrays of moving dots are presented to each eye. One set of dots moves in a coherent direction (termed signal) whereas another set of dots moves in a random direction (termed noise). To quantify performance, we measure the signal/noise ratio corresponding to a direction-discrimination threshold. Suppression is quantified by assessing the extent to which it matters which eye sees the signal and which eye sees the noise. A space-saving, head-mounted display using current video technology offers an ideal solution for clinical practice. In addition, our optimized psychophysical method provided results that were in agreement with those produced using the original technique. We made measures of suppression on a group of nine adult amblyopic participants using this apparatus with both the original and new psychophysical paradigms. All participants had measurable suppression ranging from mild to severe. The two different psychophysical methods gave a strong correlation for the strength of suppression (rho = -0.83, p = 0.006). Combining the new apparatus and new psychophysical method creates a convenient and rapid technique for parametric measurement of interocular suppression. In addition, this apparatus constitutes an ideal platform for allowing observers with suppression to combine information between their eyes in a similar way to binocularly normal people. This provides a convenient way for clinicians to implement the newly proposed binocular treatment of amblyopia that is based on antisuppression training.

  2. Four-time 7Li stimulated-echo spectroscopy for the study of dynamic heterogeneities: Application to lithium borate glass.

    PubMed

    Storek, M; Tilly, J F; Jeffrey, K R; Böhmer, R

    2017-09-01

    To study the nature of the nonexponential ionic hopping in solids, a pulse sequence was developed that yields four-time stimulated-echo functions of previously inaccessible spin-3/2 nuclei such as 7Li. It exploits combined Zeeman and octupolar order as longitudinal carrier state. Higher-order correlation functions were successfully generated for natural-abundance and isotopically-enriched lithium diborate glasses. Four-time 7Li measurements are presented and compared with two-time correlation functions. The results are discussed with reference to approaches devised to quantify the degree of nonexponentiality in glass forming systems, and evidence for the occurrence of dynamic heterogeneities and dynamic exchange was found. Additional experiments using the 6Li species illustrate the challenge posed by subensemble selection when the dipolar interactions are not very much smaller than the quadrupolar ones. Copyright © 2017 Elsevier Inc. All rights reserved.

  3. Quantifying uncertainty in carbon and nutrient pools of coarse woody debris

    NASA Astrophysics Data System (ADS)

    See, C. R.; Campbell, J. L.; Fraver, S.; Domke, G. M.; Harmon, M. E.; Knoepp, J. D.; Woodall, C. W.

    2016-12-01

    Woody detritus constitutes a major pool of both carbon and nutrients in forested ecosystems. Estimating coarse wood stocks relies on many assumptions, even when full surveys are conducted. Researchers rarely report error in coarse wood pool estimates, despite the importance to ecosystem budgets and modelling efforts. To date, no study has attempted a comprehensive assessment of error rates and uncertainty inherent in the estimation of this pool. Here, we use Monte Carlo analysis to propagate the error associated with the major sources of uncertainty present in the calculation of coarse wood carbon and nutrient (i.e., N, P, K, Ca, Mg, Na) pools. We also evaluate individual sources of error to identify the importance of each source of uncertainty in our estimates. We quantify sampling error by comparing the three most common field methods used to survey coarse wood (two transect methods and a whole-plot survey). We quantify the measurement error associated with length and diameter measurement, and technician error in species identification and decay class using plots surveyed by multiple technicians. We use previously published values of model error for the four most common methods of volume estimation: Smalian's, conical frustum, conic paraboloid, and average-of-ends. We also use previously published values for error in the collapse ratio (cross-sectional height/width) of decayed logs that serves as a surrogate for the volume remaining. We consider sampling error in chemical concentration and density for all decay classes, using distributions from both published and unpublished studies. Analytical uncertainty is calculated using standard reference plant material from the National Institute of Standards. Our results suggest that technician error in decay classification can have a large effect on uncertainty, since many of the error distributions included in the calculation (e.g. density, chemical concentration, volume-model selection, collapse ratio) are decay
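    A minimal Monte Carlo sketch of the error-propagation approach described above, for a single log; every distribution below is a placeholder standing in for the measurement, model, density, and concentration uncertainties discussed in the abstract.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    n_draws = 100_000

    # Carbon in one piece of coarse wood: volume x remaining-volume fraction
    # x wood density x carbon fraction, with uncertainty on every term.
    length = rng.normal(8.0, 0.05, n_draws)          # m, tape-measure error
    diam_mid = rng.normal(0.30, 0.01, n_draws)       # m, caliper error
    volume = np.pi * (diam_mid / 2) ** 2 * length    # simple cylinder stand-in for a volume model
    collapse = rng.beta(8, 2, n_draws)               # collapse-ratio surrogate for decayed logs
    density = rng.normal(0.35, 0.06, n_draws)        # Mg m^-3, decay-class density
    c_frac = rng.normal(0.48, 0.02, n_draws)         # carbon fraction

    carbon = volume * collapse * density * c_frac    # Mg C per log
    lo, mid, hi = np.percentile(carbon, [2.5, 50, 97.5])
    print(f"carbon pool: {mid:.3f} Mg C (95% interval {lo:.3f}-{hi:.3f})")
    ```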

  4. Analyzing complex networks evolution through Information Theory quantifiers

    NASA Astrophysics Data System (ADS)

    Carpi, Laura C.; Rosso, Osvaldo A.; Saco, Patricia M.; Ravetti, Martín Gómez

    2011-01-01

    A methodology to analyze dynamical changes in complex networks based on Information Theory quantifiers is proposed. The square root of the Jensen-Shannon divergence, a measure of dissimilarity between two probability distributions, and the MPR Statistical Complexity are used to quantify states in the network evolution process. Three cases are analyzed, the Watts-Strogatz model, a gene network during the progression of Alzheimer's disease and a climate network for the Tropical Pacific region to study the El Niño/Southern Oscillation (ENSO) dynamic. We find that the proposed quantifiers are able not only to capture changes in the dynamics of the processes but also to quantify and compare states in their evolution.
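    A minimal sketch of the first quantifier named above, the square root of the Jensen-Shannon divergence, applied to two illustrative probability distributions (e.g., network degree histograms at two points in the evolution); the MPR Statistical Complexity is not reproduced here.

    ```python
    import numpy as np

    def js_distance(p, q):
        """Square root of the Jensen-Shannon divergence (base-2 logs, range [0, 1])."""
        p = np.asarray(p, dtype=float); p = p / p.sum()
        q = np.asarray(q, dtype=float); q = q / q.sum()
        m = 0.5 * (p + q)

        def kl(a, b):
            mask = a > 0
            return np.sum(a[mask] * np.log2(a[mask] / b[mask]))

        return np.sqrt(0.5 * kl(p, m) + 0.5 * kl(q, m))

    p = [0.50, 0.30, 0.15, 0.05]   # e.g., degree histogram at time t1
    q = [0.25, 0.25, 0.25, 0.25]   # e.g., degree histogram at time t2
    print(round(js_distance(p, q), 4))
    ```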

  5. Quantifying the sensitivity of post-glacial sea level change to laterally varying viscosity

    NASA Astrophysics Data System (ADS)

    Crawford, Ophelia; Al-Attar, David; Tromp, Jeroen; Mitrovica, Jerry X.; Austermann, Jacqueline; Lau, Harriet C. P.

    2018-05-01

    We present a method for calculating the derivatives of measurements of glacial isostatic adjustment (GIA) with respect to the viscosity structure of the Earth and the ice sheet history. These derivatives, or kernels, quantify the linearised sensitivity of measurements to the underlying model parameters. The adjoint method is used to enable efficient calculation of theoretically exact sensitivity kernels within laterally heterogeneous earth models that can have a range of linear or non-linear viscoelastic rheologies. We first present a new approach to calculate GIA in the time domain, which, in contrast to the more usual formulation in the Laplace domain, is well suited to continuously varying earth models and to the use of the adjoint method. Benchmarking results show excellent agreement between our formulation and previous methods. We illustrate the potential applications of the kernels calculated in this way through a range of numerical calculations relative to a spherically symmetric background model. The complex spatial patterns of the sensitivities are not intuitive, and this is the first time that such effects are quantified in an efficient and accurate manner.

  6. Gains and Pitfalls of Quantifier Elimination as a Teaching Tool

    ERIC Educational Resources Information Center

    Oldenburg, Reinhard

    2015-01-01

    Quantifier Elimination is a procedure that allows simplification of logical formulas that contain quantifiers. Many mathematical concepts are defined in terms of quantifiers and especially in calculus their use has been identified as an obstacle in the learning process. The automatic deduction provided by quantifier elimination thus allows…

  7. Quantifying the measurement uncertainty of results from environmental analytical methods.

    PubMed

    Moser, J; Wegscheider, W; Sperka-Gottlieb, C

    2001-07-01

    The Eurachem-CITAC Guide Quantifying Uncertainty in Analytical Measurement was put into practice in a public laboratory devoted to environmental analytical measurements. In doing so due regard was given to the provisions of ISO 17025 and an attempt was made to base the entire estimation of measurement uncertainty on available data from the literature or from previously performed validation studies. Most environmental analytical procedures laid down in national or international standards are the result of cooperative efforts and put into effect as part of a compromise between all parties involved, public and private, that also encompasses environmental standards and statutory limits. Central to many procedures is the focus on the measurement of environmental effects rather than on individual chemical species. In this situation it is particularly important to understand the measurement process well enough to produce a realistic uncertainty statement. Environmental analytical methods will be examined as far as necessary, but reference will also be made to analytical methods in general and to physical measurement methods where appropriate. This paper describes ways and means of quantifying uncertainty for frequently practised methods of environmental analysis. It will be shown that operationally defined measurands are no obstacle to the estimation process as described in the Eurachem/CITAC Guide if it is accepted that the dominating component of uncertainty comes from the actual practice of the method as a reproducibility standard deviation.
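    A minimal sketch of the bottom-up combination the Guide prescribes, with placeholder components; in line with the abstract, a method reproducibility standard deviation is treated as the dominant contribution.

    ```python
    import math

    # Relative standard uncertainties of independent components (fractions);
    # values are hypothetical, not taken from any particular validation study.
    components = {
        "reproducibility (method)": 0.070,
        "calibration standard":     0.015,
        "sample volume":            0.010,
        "recovery":                 0.025,
    }

    u_rel = math.sqrt(sum(u ** 2 for u in components.values()))   # combine in quadrature
    result = 12.4                       # measured concentration, e.g. ug/L (hypothetical)
    U = 2 * u_rel * result              # expanded uncertainty, coverage factor k = 2
    print(f"combined relative u = {u_rel:.3f}; result = {result} +/- {U:.2f} ug/L (k=2)")
    ```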

  8. Deaf Learners' Knowledge of English Universal Quantifiers

    ERIC Educational Resources Information Center

    Berent, Gerald P.; Kelly, Ronald R.; Porter, Jeffrey E.; Fonzi, Judith

    2008-01-01

    Deaf and hearing students' knowledge of English sentences containing universal quantifiers was compared through their performance on a 50-item, multiple-picture task that required students to decide whether each of five pictures represented a possible meaning of a target sentence. The task assessed fundamental knowledge of quantifier sentences,…

  9. Quantifying the costs and benefits of occupational health and safety interventions at a Bangladesh shipbuilding company

    PubMed Central

    Thiede, Irene; Thiede, Michael

    2015-01-01

    Background: This study is the first cost–benefit analysis (CBA) of occupational health and safety (OHS) in a low-income country. It focuses on one of the largest shipbuilding companies in Bangladesh, where globally recognised Occupational Health and Safety Advisory Services (OHSAS) 18001 certification was achieved in 2012. Objectives: The study examines the relative costs of implementing OHS measures against qualitative and quantifiable benefits of implementation in order to determine whether OHSAS measures are economically advantageous. Methods: Quantifying past costs and benefits and discounting future ones, this study looks at the returns of OHS measures at Western Marine Shipbuilding Company Ltd. Results: Costs included investments in workplace and environmental safety, a new clinic that also serves the community, and personal protective equipment (PPE) and training. The results are impressive: previously high injury statistics dropped to close to zero. Conclusions: OHS measures decrease injuries, increase efficiency, and bring income security to workers’ families. Certification has proven a competitive edge for the shipyard, resulting in access to greater markets. Intangible benefits such as trust, motivation and security are deemed crucial in the CBA, and this study finds the high investments made are difficult to offset with quantifiable benefits alone. PMID:25589369

  10. Quantifying the costs and benefits of occupational health and safety interventions at a Bangladesh shipbuilding company.

    PubMed

    Thiede, Irene; Thiede, Michael

    2015-01-01

    This study is the first cost-benefit analysis (CBA) of occupational health and safety (OHS) in a low-income country. It focuses on one of the largest shipbuilding companies in Bangladesh, where globally recognised Occupational Health and Safety Advisory Services (OHSAS) 18001 certification was achieved in 2012. The study examines the relative costs of implementing OHS measures against qualitative and quantifiable benefits of implementation in order to determine whether OHSAS measures are economically advantageous. Quantifying past costs and benefits and discounting future ones, this study looks at the returns of OHS measures at Western Marine Shipbuilding Company Ltd. Costs included investments in workplace and environmental safety, a new clinic that also serves the community, and personal protective equipment (PPE) and training. The results are impressive: previously high injury statistics dropped to close to zero. OHS measures decrease injuries, increase efficiency, and bring income security to workers' families. Certification has proven a competitive edge for the shipyard, resulting in access to greater markets. Intangible benefits such as trust, motivation and security are deemed crucial in the CBA, and this study finds the high investments made are difficult to offset with quantifiable benefits alone.
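    A minimal sketch of the discounting arithmetic mentioned in both records above; the cash flows and discount rate are hypothetical and do not reflect the shipyard's figures.

    ```python
    def present_value(cashflows, rate):
        """cashflows: (year_offset, amount) pairs; positive = benefit, negative = cost."""
        return sum(amount / (1 + rate) ** year for year, amount in cashflows)

    ohs_cashflows = [
        (0, -250_000),   # workplace and environmental safety investment
        (0, -80_000),    # clinic construction
        (1, -30_000),    # PPE and training (one recurring year shown)
        (1, 90_000),     # fewer injuries: avoided lost days and compensation
        (2, 110_000),    # efficiency gains and wider market access
        (3, 120_000),
    ]

    npv = present_value(ohs_cashflows, rate=0.08)
    print(f"net present value of the OHS programme: {npv:,.0f} (currency units)")
    ```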

  11. LAPAROSCOPY AFTER PREVIOUS LAPAROTOMY

    PubMed Central

    Godinjak, Zulfo; Idrizbegović, Edin; Begić, Kerim

    2006-01-01

    Following abdominal surgery, extensive adhesions often occur and can cause difficulties during laparoscopic operations. However, previous laparotomy is not considered to be a contraindication for laparoscopy. The aim of this study is to show that insertion of a Veres needle in the region of the umbilicus is a safe method for creating a pneumoperitoneum for laparoscopic operations after previous laparotomy. In the last three years, we have performed 144 laparoscopic operations in patients who had previously undergone one or two laparotomies. Pathology of the digestive system or genital organs, Cesarean section, and abdominal war injuries were the most common reasons for the previous laparotomy. We experienced no complications during entry into the abdominal cavity or during the operations themselves, although in 7 patients we converted to laparotomy following diagnostic laparoscopy. In all patients, the Veres needle and trocar were inserted in the umbilical region, i.e., a closed laparoscopy technique was used. In no patient were adhesions found in the region of the umbilicus, and no abdominal organs were injured. PMID:17177649

  12. Cross-linguistic patterns in the acquisition of quantifiers.

    PubMed

    Katsos, Napoleon; Cummins, Chris; Ezeizabarrena, Maria-José; Gavarró, Anna; Kuvač Kraljević, Jelena; Hrzica, Gordana; Grohmann, Kleanthes K; Skordi, Athina; Jensen de López, Kristine; Sundahl, Lone; van Hout, Angeliek; Hollebrandse, Bart; Overweg, Jessica; Faber, Myrthe; van Koert, Margreet; Smith, Nafsika; Vija, Maigi; Zupping, Sirli; Kunnari, Sari; Morisseau, Tiffany; Rusieshvili, Manana; Yatsushiro, Kazuko; Fengler, Anja; Varlokosta, Spyridoula; Konstantzou, Katerina; Farby, Shira; Guasti, Maria Teresa; Vernice, Mirta; Okabe, Reiko; Isobe, Miwa; Crosthwaite, Peter; Hong, Yoonjee; Balčiūnienė, Ingrida; Ahmad Nizar, Yanti Marina; Grech, Helen; Gatt, Daniela; Cheong, Win Nee; Asbjørnsen, Arve; Torkildsen, Janne von Koss; Haman, Ewa; Miękisz, Aneta; Gagarina, Natalia; Puzanova, Julia; Anđelković, Darinka; Savić, Maja; Jošić, Smiljana; Slančová, Daniela; Kapalková, Svetlana; Barberán, Tania; Özge, Duygu; Hassan, Saima; Chan, Cecilia Yuet Hung; Okubo, Tomoya; van der Lely, Heather; Sauerland, Uli; Noveck, Ira

    2016-08-16

    Learners of most languages are faced with the task of acquiring words to talk about number and quantity. Much is known about the order of acquisition of number words as well as the cognitive and perceptual systems and cultural practices that shape it. Substantially less is known about the acquisition of quantifiers. Here, we consider the extent to which systems and practices that support number word acquisition can be applied to quantifier acquisition and conclude that the two domains are largely distinct in this respect. Consequently, we hypothesize that the acquisition of quantifiers is constrained by a set of factors related to each quantifier's specific meaning. We investigate competence with the expressions for "all," "none," "some," "some…not," and "most" in 31 languages, representing 11 language types, by testing 768 5-y-old children and 536 adults. We found a cross-linguistically similar order of acquisition of quantifiers, explicable in terms of four factors relating to their meaning and use. In addition, exploratory analyses reveal that language- and learner-specific factors, such as negative concord and gender, are significant predictors of variation.

  13. SU-E-J-159: Intra-Patient Deformable Image Registration Uncertainties Quantified Using the Distance Discordance Metric

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Saleh, Z; Thor, M; Apte, A

    2014-06-01

    Purpose: The quantitative evaluation of deformable image registration (DIR) is currently challenging due to lack of a ground truth. In this study we test a new method proposed for quantifying multiple-image based DIR-related uncertainties, for DIR of pelvic images. Methods: 19 patients were analyzed, each with 6 CT scans, who previously had radiotherapy for prostate cancer. Manually delineated structures for rectum and bladder, which served as ground truth structures, were delineated on the planning CT and each subsequent scan. For each patient, voxel-by-voxel DIR-related uncertainties were evaluated, following B-spline based DIR, by applying a previously developed metric, the distance discordance metric (DDM; Saleh et al., PMB (2014) 59:733). The DDM map was superimposed on the first acquired CT scan and DDM statistics were assessed, also relative to two metrics estimating the agreement between the propagated and the manually delineated structures. Results: The highest DDM values which correspond to greatest spatial uncertainties were observed near the body surface and in the bowel due to the presence of gas. The mean rectal and bladder DDM values ranged from 1.1–11.1 mm and 1.5–12.7 mm, respectively. There was a strong correlation in the DDMs between the rectum and bladder (Pearson R = 0.68 for the max DDM). For both structures, DDM was correlated with the ratio between the DIR-propagated and manually delineated volumes (R = 0.74 for the max rectal DDM). The maximum rectal DDM was negatively correlated with the Dice Similarity Coefficient between the propagated and the manually delineated volumes (R= −0.52). Conclusion: The multiple-image based DDM map quantified considerable DIR variability across different structures and among patients. Besides using the DDM for quantifying DIR-related uncertainties it could potentially be used to adjust for uncertainties in DIR-based accumulated dose distributions.

  14. Using time reversal to detect entanglement and spreading of quantum information

    NASA Astrophysics Data System (ADS)

    Gaerttner, Martin

    2017-04-01

    Characterizing and understanding the states of interacting quantum systems and their non-equilibrium dynamics is the goal of quantum simulation. For this it is crucial to find experimentally feasible means for quantifying how entanglement and correlation build up and spread. The ability of analog quantum simulators to reverse the unitary dynamics of quantum many-body systems provides new tools in this quest. One such tool is the multiple-quantum coherence (MQC) spectrum previously used in NMR spectroscopy, which can now be studied in previously inaccessible parameter regimes near zero temperature in highly controllable environments. I present recent progress in relating the MQC spectrum to established entanglement witnesses such as quantum Fisher information. Recognizing the MQC as out-of-time-order correlation functions, which quantify the spreading, or scrambling, of quantum information, allows us to establish a connection between these quantities and multi-partite entanglement. I will show recent experimental results obtained with a trapped ion quantum simulator and a spinor BEC illustrating the power of time reversal protocols. Supported by: JILA-NSF-PFC-1125844, NSF-PHY-1521080, ARO, AFOSR, AFOSR-MURI, DARPA, NIST.

  15. Quantifying the Effects of Historical Land Cover Conversion Uncertainty on Global Carbon and Climate Estimates

    NASA Astrophysics Data System (ADS)

    Di Vittorio, A. V.; Mao, J.; Shi, X.; Chini, L.; Hurtt, G.; Collins, W. D.

    2018-01-01

    Previous studies have examined land use change as a driver of global change, but the translation of land use change into land cover conversion has been largely unconstrained. Here we quantify the effects of land cover conversion uncertainty on the global carbon and climate system using the integrated Earth System Model. Our experiments use identical land use change data and vary land cover conversions to quantify associated uncertainty in carbon and climate estimates. Land cover conversion uncertainty is large, constitutes a 5 ppmv range in estimated atmospheric CO2 in 2004, and generates carbon uncertainty that is equivalent to 80% of the net effects of CO2 and climate and 124% of the effects of nitrogen deposition during 1850-2004. Additionally, land cover uncertainty generates differences in local surface temperature of over 1°C. We conclude that future studies addressing land use, carbon, and climate need to constrain and reduce land cover conversion uncertainties.

  16. Quantifying the Effects of Historical Land Cover Conversion Uncertainty on Global Carbon and Climate Estimates

    DOE PAGES

    Di Vittorio, A. V.; Mao, J.; Shi, X.; ...

    2018-01-03

    Previous studies have examined land use change as a driver of global change, but the translation of land use change into land cover conversion has been largely unconstrained. In this paper, we quantify the effects of land cover conversion uncertainty on the global carbon and climate system using the integrated Earth System Model. Our experiments use identical land use change data and vary land cover conversions to quantify associated uncertainty in carbon and climate estimates. Land cover conversion uncertainty is large, constitutes a 5 ppmv range in estimated atmospheric CO2 in 2004, and generates carbon uncertainty that is equivalent to 80% of the net effects of CO2 and climate and 124% of the effects of nitrogen deposition during 1850–2004. Additionally, land cover uncertainty generates differences in local surface temperature of over 1°C. Finally, we conclude that future studies addressing land use, carbon, and climate need to constrain and reduce land cover conversion uncertainties.

  17. Quantifying the Effects of Historical Land Cover Conversion Uncertainty on Global Carbon and Climate Estimates

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Di Vittorio, A. V.; Mao, J.; Shi, X.

    Previous studies have examined land use change as a driver of global change, but the translation of land use change into land cover conversion has been largely unconstrained. In this paper, we quantify the effects of land cover conversion uncertainty on the global carbon and climate system using the integrated Earth System Model. Our experiments use identical land use change data and vary land cover conversions to quantify associated uncertainty in carbon and climate estimates. Land cover conversion uncertainty is large, constitutes a 5 ppmv range in estimated atmospheric CO2 in 2004, and generates carbon uncertainty that is equivalent to 80% of the net effects of CO2 and climate and 124% of the effects of nitrogen deposition during 1850–2004. Additionally, land cover uncertainty generates differences in local surface temperature of over 1°C. Finally, we conclude that future studies addressing land use, carbon, and climate need to constrain and reduce land cover conversion uncertainties.

  18. Quantifying effects of retinal illuminance on frequency doubling perimetry.

    PubMed

    Swanson, William H; Dul, Mitchell W; Fischer, Susan E

    2005-01-01

    To measure and quantify effects of variation in retinal illuminance on frequency doubling technology (FDT) perimetry. A Zeiss-Humphrey/Welch Allyn FDT perimeter was used with the threshold N-30 strategy. Study 1, quantifying adaptation: 11 eyes of 11 subjects (24-46 years old) were tested with natural pupils, and then retested after stable pupillary dilation with neutral density filters of 0.0, 0.6, 1.2, and 1.6 log unit in front of the subject's eye. Study 2, predicting effect of reduced illuminance: 17 eyes of 17 subjects (26-61 years old) were tested with natural pupils, and then retested after stable pupillary miosis (assessed with an infrared camera). A quantitative adaptation model was fit to results of Study 1; the mean adaptation parameter was used to predict change in Study 2. Study 1: Mean defect (MD) decreased by 10 dB over a 1.6 log unit range of retinal illuminances; model fits for all subjects had r² > 95%. Study 2: Change in MD (ΔMD) ranged from -7.3 dB to +0.8 dB. The mean adaptation parameter from Study 1 accounted for 69% of the variance in ΔMD (P < 0.0005), and accuracy of the model was independent of the magnitude of ΔMD (r² < 1%, P > 0.75). The results confirmed previous findings that FDT perimetry can be dramatically affected by variations in retinal illuminance. Application of a quantitative adaptation model provided guidelines for estimating effects of pupil diameter and lens density on FDT perimetry.
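    A minimal curve-fitting sketch in the spirit of the adaptation model described above; the saturating functional form and the data points are assumptions for illustration, since the abstract does not give the model's equation.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def adaptation_model(log_rel_illum, md_max, half_sat_log):
        """MD approaches md_max at high illuminance and falls as illuminance drops."""
        illum = 10.0 ** log_rel_illum
        half = 10.0 ** half_sat_log
        return md_max + 10.0 * np.log10(illum / (illum + half))

    log_filters = np.array([0.0, -0.6, -1.2, -1.6])   # ND filter attenuation (log units)
    md_observed = np.array([0.5, -1.8, -5.9, -9.4])   # dB, synthetic MD values

    params, _ = curve_fit(adaptation_model, log_filters, md_observed, p0=[1.0, -1.0])
    md_max, half_sat_log = params
    print(f"plateau MD = {md_max:.2f} dB, half-saturation at {half_sat_log:.2f} log units")

    # Predicted MD change for a miotic pupil cutting retinal illuminance by 0.5 log units
    delta_md = adaptation_model(-0.5, *params) - adaptation_model(0.0, *params)
    print(f"predicted delta-MD: {delta_md:.2f} dB")
    ```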

  19. Quantifying the Nonlinear, Anisotropic Material Response of Spinal Ligaments

    NASA Astrophysics Data System (ADS)

    Robertson, Daniel J.

    Spinal ligaments may be a significant source of chronic back pain, yet they are often disregarded by the clinical community due to a lack of information with regards to their material response, and innervation characteristics. The purpose of this dissertation was to characterize the material response of spinal ligaments and to review their innervation characteristics. Review of relevant literature revealed that all of the major spinal ligaments are innervated. They cause painful sensations when irritated and provide reflexive control of the deep spinal musculature. As such, including the neurologic implications of iatrogenic ligament damage in the evaluation of surgical procedures aimed at relieving back pain will likely result in more effective long-term solutions. The material response of spinal ligaments has not previously been fully quantified due to limitations associated with standard soft tissue testing techniques. The present work presents and validates a novel testing methodology capable of overcoming these limitations. In particular, the anisotropic, inhomogeneous material constitutive properties of the human supraspinous ligament are quantified and methods for determining the response of the other spinal ligaments are presented. In addition, a method for determining the anisotropic, inhomogeneous pre-strain distribution of the spinal ligaments is presented. The multi-axial pre-strain distributions of the human anterior longitudinal ligament, ligamentum flavum and supraspinous ligament were determined using this methodology. Results from this work clearly demonstrate that spinal ligaments are not uniaxial structures, and that finite element models which account for pre-strain and incorporate ligament's complex material properties may provide increased fidelity to the in vivo condition.

  20. Quantifying the quality of hand movement in stroke patients through three-dimensional curvature.

    PubMed

    Osu, Rieko; Ota, Kazuko; Fujiwara, Toshiyuki; Otaka, Yohei; Kawato, Mitsuo; Liu, Meigen

    2011-10-31

    To more accurately evaluate rehabilitation outcomes in stroke patients, movement irregularities should be quantified. Previous work in stroke patients has revealed a reduction in the trajectory smoothness and segmentation of continuous movements. Clinically, the Stroke Impairment Assessment Set (SIAS) evaluates the clumsiness of arm movements using an ordinal scale based on the examiner's observations. In this study, we focused on three-dimensional curvature of hand trajectory to quantify movement, and aimed to establish a novel measurement that is independent of movement duration. We compared the proposed measurement with the SIAS score and the jerk measure representing temporal smoothness. Sixteen stroke patients with SIAS upper limb proximal motor function (Knee-Mouth test) scores ranging from 2 (incomplete performance) to 4 (mild clumsiness) were recruited. Nine healthy participants with a SIAS score of 5 (normal) also participated. Participants were asked to grasp a plastic glass and repetitively move it from the lap to the mouth and back at a comfortable speed for 30 s, during which the hand movement was measured using OPTOTRAK. The position data was numerically differentiated and the three-dimensional curvature was computed. To compare against a previously proposed measure, the mean squared jerk normalized by its minimum value was computed. Age-matched healthy participants were instructed to move the glass at three different movement speeds. There was an inverse relationship between the curvature of the movement trajectory and the patient's SIAS score. The median of the -log of curvature (MedianLC) correlated well with the SIAS score, upper extremity subsection of Fugl-Meyer Assessment, and the jerk measure in the paretic arm. When the healthy participants moved slowly, the increase in the jerk measure was comparable to the paretic movements with a SIAS score of 2 to 4, while the MedianLC was distinguishable from paretic movements. Measurement based on
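    A minimal sketch of the curvature summary described above, computed for a synthetic trajectory: κ = |r′ × r″| / |r′|³ from numerically differentiated positions, summarized as the median of −log(κ) (MedianLC). Sampling rate, noise level, and trajectories are invented for illustration.

    ```python
    import numpy as np

    def median_lc(positions, dt):
        """Median of -log of the three-dimensional curvature along a trajectory."""
        r = np.asarray(positions, dtype=float)
        v = np.gradient(r, dt, axis=0)           # first derivative (velocity)
        a = np.gradient(v, dt, axis=0)           # second derivative (acceleration)
        cross = np.cross(v, a)
        speed = np.linalg.norm(v, axis=1)
        kappa = np.linalg.norm(cross, axis=1) / np.clip(speed ** 3, 1e-9, None)
        return np.median(-np.log(np.clip(kappa, 1e-12, None)))

    t = np.linspace(0, 2 * np.pi, 400)
    dt = t[1] - t[0]
    smooth = np.column_stack([np.cos(t), np.sin(t), 0.1 * t])           # smooth arc
    jittery = smooth + np.random.default_rng(3).normal(0, 0.01, smooth.shape)

    print("smooth  MedianLC:", round(median_lc(smooth, dt), 2))
    print("jittery MedianLC:", round(median_lc(jittery, dt), 2))
    ```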

  1. Quantifying the quality of hand movement in stroke patients through three-dimensional curvature

    PubMed Central

    2011-01-01

    Background: To more accurately evaluate rehabilitation outcomes in stroke patients, movement irregularities should be quantified. Previous work in stroke patients has revealed a reduction in the trajectory smoothness and segmentation of continuous movements. Clinically, the Stroke Impairment Assessment Set (SIAS) evaluates the clumsiness of arm movements using an ordinal scale based on the examiner's observations. In this study, we focused on three-dimensional curvature of hand trajectory to quantify movement, and aimed to establish a novel measurement that is independent of movement duration. We compared the proposed measurement with the SIAS score and the jerk measure representing temporal smoothness. Methods: Sixteen stroke patients with SIAS upper limb proximal motor function (Knee-Mouth test) scores ranging from 2 (incomplete performance) to 4 (mild clumsiness) were recruited. Nine healthy participants with a SIAS score of 5 (normal) also participated. Participants were asked to grasp a plastic glass and repetitively move it from the lap to the mouth and back at a comfortable speed for 30 s, during which the hand movement was measured using OPTOTRAK. The position data was numerically differentiated and the three-dimensional curvature was computed. To compare against a previously proposed measure, the mean squared jerk normalized by its minimum value was computed. Age-matched healthy participants were instructed to move the glass at three different movement speeds. Results: There was an inverse relationship between the curvature of the movement trajectory and the patient's SIAS score. The median of the -log of curvature (MedianLC) correlated well with the SIAS score, upper extremity subsection of Fugl-Meyer Assessment, and the jerk measure in the paretic arm. When the healthy participants moved slowly, the increase in the jerk measure was comparable to the paretic movements with a SIAS score of 2 to 4, while the MedianLC was distinguishable from paretic movements

  2. Influence on Busilvex pharmacokinetics of clonazepam compared to previous phenytoin historical data.

    PubMed

    Carreras, E; Cahn, J Y; Puozzo, C; Kröger, N; Sanz, G; Buzyn, A; Bacigalupo, A; Vernant, J P

    2010-07-01

    This study investigated the effect of seizure prophylaxis on busulfan (Bu) plasma exposure. Twenty-four adult patients received an intravenous Bu-cyclophosphamide conditioning regimen prior to bone marrow transplantation. Busilvex (0.8 mg/kg) was administered every six hours during four consecutive days. Clonazepam (0.025 to 0.03 mg/kg/day as a continuous 12-h i.v. infusion) was administered at least 12 hours prior to i.v. Bu dosing and continued until 24 hours after the last dose. Pharmacokinetic (PK) data were compared with those previously collected in patients (n=127) treated with phenytoin for seizure prophylaxis. Through population PK analysis, a 10% average increase (coefficient of variation, RSE=5.35%) in total clearance of Bu was quantified when Bu was associated with clonazepam as compared to phenytoin, which was considered as not being clinically relevant. The suspected induction of Bu metabolism by phenytoin should have resulted in the opposite effect. The patient efficacy and safety profiles were comparable between the two cohorts.

  3. A framework for quantifying net benefits of alternative prognostic models.

    PubMed

    Rapsomaniki, Eleni; White, Ian R; Wood, Angela M; Thompson, Simon G

    2012-01-30

    New prognostic models are traditionally evaluated using measures of discrimination and risk reclassification, but these do not take full account of the clinical and health economic context. We propose a framework for comparing prognostic models by quantifying the public health impact (net benefit) of the treatment decisions they support, assuming a set of predetermined clinical treatment guidelines. The change in net benefit is more clinically interpretable than changes in traditional measures and can be used in full health economic evaluations of prognostic models used for screening and allocating risk reduction interventions. We extend previous work in this area by quantifying net benefits in life years, thus linking prognostic performance to health economic measures; by taking full account of the occurrence of events over time; and by considering estimation and cross-validation in a multiple-study setting. The method is illustrated in the context of cardiovascular disease risk prediction using an individual participant data meta-analysis. We estimate the number of cardiovascular-disease-free life years gained when statin treatment is allocated based on a risk prediction model with five established risk factors instead of a model with just age, gender and region. We explore methodological issues associated with the multistudy design and show that cost-effectiveness comparisons based on the proposed methodology are robust against a range of modelling assumptions, including adjusting for competing risks. Copyright © 2011 John Wiley & Sons, Ltd.

  4. A framework for quantifying net benefits of alternative prognostic models‡

    PubMed Central

    Rapsomaniki, Eleni; White, Ian R; Wood, Angela M; Thompson, Simon G

    2012-01-01

    New prognostic models are traditionally evaluated using measures of discrimination and risk reclassification, but these do not take full account of the clinical and health economic context. We propose a framework for comparing prognostic models by quantifying the public health impact (net benefit) of the treatment decisions they support, assuming a set of predetermined clinical treatment guidelines. The change in net benefit is more clinically interpretable than changes in traditional measures and can be used in full health economic evaluations of prognostic models used for screening and allocating risk reduction interventions. We extend previous work in this area by quantifying net benefits in life years, thus linking prognostic performance to health economic measures; by taking full account of the occurrence of events over time; and by considering estimation and cross-validation in a multiple-study setting. The method is illustrated in the context of cardiovascular disease risk prediction using an individual participant data meta-analysis. We estimate the number of cardiovascular-disease-free life years gained when statin treatment is allocated based on a risk prediction model with five established risk factors instead of a model with just age, gender and region. We explore methodological issues associated with the multistudy design and show that cost-effectiveness comparisons based on the proposed methodology are robust against a range of modelling assumptions, including adjusting for competing risks. Copyright © 2011 John Wiley & Sons, Ltd. PMID:21905066

  5. Quantifying Functional Group Interactions that Determine Urea Effects on Nucleic Acid Helix Formation

    PubMed Central

    Guinn, Emily J.; Schwinefus, Jeffrey J.; Cha, Hyo Keun; McDevitt, Joseph L.; Merker, Wolf E.; Ritzer, Ryan; Muth, Gregory W.; Engelsgjerd, Samuel W.; Mangold, Kathryn E.; Thompson, Perry J.; Kerins, Michael J.; Record, Thomas

    2013-01-01

    Urea destabilizes helical and folded conformations of nucleic acids and proteins, as well as protein-nucleic acid complexes. To understand these effects, extend previous characterizations of interactions of urea with protein functional groups, and thereby develop urea as a probe of conformational changes in protein and nucleic acid processes, we obtain chemical potential derivatives (μ₂₃ = dμ₂/dm₃) quantifying interactions of urea (component 3) with nucleic acid bases, base analogs, nucleosides and nucleotide monophosphates (component 2) using osmometry and hexanol-water distribution assays. Dissection of these μ₂₃ yields interaction potentials quantifying interactions of urea with unit surface areas of nucleic acid functional groups (heterocyclic aromatic ring, ring methyl, carbonyl and phosphate O, amino N, sugar (C,O)); urea interacts favorably with all these groups, relative to interactions with water. Interactions of urea with heterocyclic aromatic rings and attached methyl groups (as on thymine) are particularly favorable, as previously observed for urea-homocyclic aromatic ring interactions. Urea m-values determined for double helix formation by DNA dodecamers near 25°C are in the range 0.72 to 0.85 kcal mol⁻¹ m⁻¹ and exhibit little systematic dependence on nucleobase composition (17–42% GC). Interpretation of these results using the urea interaction potentials indicates that extensive (60–90%) stacking of nucleobases in the separated strands in the transition region is required to explain the m-value. Results for RNA and DNA dodecamers obtained at higher temperatures, and literature data, are consistent with this conclusion. This demonstrates the utility of urea as a quantitative probe of changes in surface area (ΔASA) in nucleic acid processes. PMID:23510511

  6. Quantifying noise in optical tweezers by Allan variance.

    PubMed

    Czerwinski, Fabian; Richardson, Andrew C; Oddershede, Lene B

    2009-07-20

    Much effort is put into minimizing noise in optical tweezers experiments because noise and drift can mask fundamental behaviours of, e.g., single molecule assays. Various initiatives have been taken to reduce or eliminate noise but it has been difficult to quantify their effect. We propose to use Allan variance as a simple and efficient method to quantify noise in optical tweezers setups. We apply the method to determine the optimal measurement time, frequency, and detection scheme, and quantify the effect of acoustic noise in the lab. The method can also be used on-the-fly for determining optimal parameters of running experiments.
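    A minimal sketch of the proposed quantifier: a simple (non-overlapping) Allan variance of a bead-position time series, evaluated over a range of averaging times; the synthetic signal and its noise parameters are illustrative only.

    ```python
    import numpy as np

    def allan_variance(x, fs, taus):
        """Non-overlapping Allan variance of signal x (sampling rate fs) at times taus."""
        x = np.asarray(x, dtype=float)
        out = []
        for tau in taus:
            m = max(1, int(round(tau * fs)))        # samples per averaging block
            n_blocks = x.size // m
            if n_blocks < 3:
                out.append(np.nan)
                continue
            block_means = x[: n_blocks * m].reshape(n_blocks, m).mean(axis=1)
            out.append(0.5 * np.mean(np.diff(block_means) ** 2))
        return np.array(out)

    fs = 10_000.0                                    # Hz
    t = np.arange(0, 10, 1 / fs)
    rng = np.random.default_rng(7)
    signal = rng.normal(0, 5e-9, t.size) + 2e-9 * t  # white noise + slow drift (m)

    taus = np.logspace(-3, 0.5, 8)
    for tau, av in zip(taus, allan_variance(signal, fs, taus)):
        print(f"tau = {tau:7.3f} s   Allan deviation = {np.sqrt(av):.2e} m")
    ```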

  7. Flame-Vortex Studies to Quantify Markstein Numbers Needed to Model Flame Extinction Limits

    NASA Technical Reports Server (NTRS)

    Driscoll, James F.; Feikema, Douglas A.

    2003-01-01

    This work has quantified a database of Markstein numbers for unsteady flames; future work will quantify a database of flame extinction limits for unsteady conditions. Unsteady extinction limits have not been documented previously; both a stretch rate and a residence time must be measured, since extinction requires that the stretch rate be sufficiently large for a sufficiently long residence time. The Markstein number (Ma) was measured for an inwardly-propagating flame (IPF), which is negatively stretched, under microgravity conditions. Computations also were performed using RUN-1DL to explain the measurements. The Markstein number of an inwardly-propagating flame, for both the microgravity experiment and the computations, is significantly larger than that of an outwardly-propagating flame (OPF). The computed profiles of the various species within the flame suggest reasons. Computed hydrogen concentrations build up ahead of the IPF but not the OPF. Understanding was gained by running the computations for both simplified and full-chemistry conditions. Numerical Simulations: To explain the experimental findings, numerical simulations of both inwardly and outwardly propagating spherical flames (with complex chemistry) were generated using the RUN-1DL code, which includes 16 species and 46 reactions.

  8. Quantifying tasks, ergonomic exposures and injury rates among school custodial workers.

    PubMed

    Village, J; Koehoorn, M; Hossain, S; Ostry, A

    2009-06-01

    A job exposure matrix of ergonomics risk factors was constructed for school custodial workers in one large school district in the province of British Columbia using 100 h of 1-min fixed-interval observations, participatory worker consensus on task durations and existing employment and school characteristic data. Significant differences in ergonomics risk factors were found by tasks and occupations. Cleaning and moving furniture, handling garbage, cleaning washrooms and cleaning floors were associated with the most physical risks and the exposure was often higher during the summer vs. the school year. Injury rates over a 4-year period showed the custodian injury rate was four times higher than the overall injury rate across all occupations in the school district. Injury rates were significantly higher in the school year compared with summer (12.2 vs. 7.0 per 100 full-time equivalents per year, p < 0.05). Custodial workers represent a considerable proportion of the labour force and have high injury rates, yet ergonomic studies are disproportionately few. Previous studies that quantified risk factors in custodial workers tended to focus on a few tasks or specific risk factors. This study, using participatory ergonomics and observational methods, systematically quantifies the broad range of musculoskeletal risk factors across multiple tasks performed by custodial workers in schools, adding considerably to the methodological literature.

  9. Quantifiers More or Less Quantify On-Line: ERP Evidence for Partial Incremental Interpretation

    ERIC Educational Resources Information Center

    Urbach, Thomas P.; Kutas, Marta

    2010-01-01

    Event-related brain potentials were recorded during RSVP reading to test the hypothesis that quantifier expressions are incrementally interpreted fully and immediately. In sentences tapping general knowledge ("Farmers grow crops/worms as their primary source of income"), Experiment 1 found larger N400s for atypical ("worms") than typical objects…

  10. Health information exchange and healthcare utilization.

    PubMed

    Vest, Joshua R

    2009-06-01

    Health information exchange (HIE) makes previously inaccessible data available to clinicians, resulting in more complete information. This study tested the hypotheses that HIE information access reduced emergency room visits and inpatient hospitalizations for ambulatory care sensitive conditions among medically indigent adults. HIE access was quantified by how frequently system users' accessed patients' data. Encounter counts were modeled using zero inflated binomial regression. HIE was not accessed for 43% of individuals. Patient factors associated with accessed data included: prior utilization, chronic conditions, and age. Higher levels of information access were significantly associated with increased counts of all encounter types. Results indicate system users were more likely to access HIE for patients for whom the information might be considered most beneficial. Ultimately, these results imply that HIE information access did not transform care in the ways many would expect. Expectations in utilization reductions, however logical, may have to be reevaluated or postponed.

  11. Skin collagen can be accurately quantified through noninvasive optical method: Validation on a swine study.

    PubMed

    Tzeng, S-Y; Kuo, T-Y; Hu, S-B; Chen, Y-W; Lin, Y-L; Chu, K-Y; Tseng, S-H

    2018-02-01

    Diffuse reflectance spectroscopy (DRS) is a noninvasive optical technology characterized by relatively low system cost and high efficiency. In our previous study, we quantified the relative collagen concentration in individual keloid patients; however, without absolute collagen values the reliability of collagen detection by our DRS system could not be confirmed. Skin-mimicking phantoms were prepared using different collagen and coffee concentrations, and their chromophore concentrations were quantified with the DRS system to analyze the influence of collagen and other chromophores. Moreover, we used an animal study to compare the DRS system with collagen evaluation of biopsy sections by second-harmonic generation (SHG) microscopy at four different skin sites. In the phantom study, the coffee chromophore did not severely interfere with collagen concentration recovery. In the animal study, a positive correlation (r = .902) was found between the DRS system and collagen evaluation with SHG microscopy. We have demonstrated that the DRS system can quantify actual collagen concentrations and exclude the interference of other chromophores in skin-mimicking phantoms. Furthermore, a high positive correlation with SHG microscopy was found in the animal study. We consider DRS a promising technique for evaluating skin condition objectively. © 2017 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  12. Using isotopes to quantify evaporation and non-stationary transit times distributions in lake water budgets

    NASA Astrophysics Data System (ADS)

    Smith, A. A.; Tetzlaff, D.; Soulsby, C.

    2017-12-01

    Evaporative fluxes from northern lakes are essential components of catchment water balances, providing large supplies of water to the atmosphere, and affecting downstream water availability. However, measurement of lake evaporation is difficult in many catchments due to remoteness and inaccessibility. Evaporative flux may also influence mean transit times of lakes and catchments, identified through water- and tracer mass-balance. We combined stable water isotopes (δ2H and δ18O), transit, and residence time distributions in a non-stationary transit time model to estimate the evaporative flux from two lakes in the Scottish Highlands. The lakes were in close proximity to each other (~2 km) and shallow (mean depth, 1.5 m), with one large (0.88 km2) and one small (0.4 km2). Model calibration used measurements of precipitation, air temperature, water level, and isotopic stream compositions of lake inflow and outflows. Evaporation flux was identified using lake fractionation of δ2H and δ18O. Mixing patterns of the lakes and their respective outlet isotopic compositions were accounted for by comparing three probability distributions for discharge and evaporation. We found that the evaporation flux was strongly influenced by these discharge and evaporation distributions. Decreased mixing within the lake resulted in greater evaporation fluxes. One of the three distributions yielded similar mean daily evaporation and uncertainty for both lakes (max 5 mm/day), while evaporation using the other two distributions was inconsistent between the lakes. Importantly, our approach also estimated distributions of evaporation age, which were significantly different between the lakes, reflecting a combination of inflow stream magnitude and the mixing regimes. The mean evaporation flux age of the large lake was 160 days, and 14 days for the small lake. Our integrated approach combining stable isotopes with time-variant transit time distributions has proven to be a useful tool for quantifying evaporative
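
    The isotope-based separation of evaporation can be illustrated with a steady-state mass balance: combining I = Q + E with I·δI = Q·δL + E·δE gives E/I = (δI − δL)/(δE − δL). A minimal sketch follows, assuming the isotopic composition of the evaporating vapour (δE) is supplied, e.g. from a Craig-Gordon model; the δ18O values are invented.

      def evaporation_fraction(delta_in, delta_lake, delta_evap):
          """E/I from the steady-state balance I*delta_in = Q*delta_lake + E*delta_evap with I = Q + E."""
          return (delta_in - delta_lake) / (delta_evap - delta_lake)

      # illustrative (made-up) delta-18O values in per mil; delta_evap would come from a
      # Craig-Gordon evaporation model driven by humidity and temperature in practice
      print(evaporation_fraction(delta_in=-9.5, delta_lake=-7.0, delta_evap=-20.0))  # ~0.19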

  13. Analysis to Quantify Significant Contribution

    EPA Pesticide Factsheets

    This Technical Support Document provides information that supports EPA’s analysis to quantify upwind state emissions that significantly contribute to nonattainment or interfere with maintenance of National Ambient Air Quality Standards in downwind states.

  14. Deployment of paired pushnets from jet-propelled kayaks to sample ichthyoplankton

    USGS Publications Warehouse

    Acre, Matthew R.; Grabowski, Timothy B.

    2015-01-01

    Accessing and effectively sampling the off-channel habitats that are considered crucial for early life stages of freshwater fishes constitute a difficult challenge when common ichthyoplankton survey methods, such as push nets, are used. We describe a new method of deploying push nets from jet-propelled kayaks to enable the sampling of previously inaccessible off-channel habitats. The described rig is also functional in more open and accessible habitats, such as the main channel of rivers or reservoirs. Although further evaluation is necessary to ensure that results are comparable across studies, the described push-net system offers a statistically rigorous methodology that generates replicate samples from a wide range of freshwater habitats that were previously inaccessible to this gear type.

  15. Quantifying the Contribution of Urban-Industrial Efficiency and Symbiosis to Deep Decarbonization: Impact of 637 Chinese Cities

    NASA Astrophysics Data System (ADS)

    Ramaswami, A.; Tong, K.; Fang, A.; Lal, R.; Nagpure, A.; Li, Y.; Yu, H.; Jiang, D.; Russell, A. G.; Shi, L.; Chertow, M.; Wang, Y.; Wang, S.

    2016-12-01

    Urban activities in China contribute significantly to global greenhouse gas (GHG) emissions and to local air pollution-related health risks. Co-location analysis can help inform the potential for energy- and material-exchanges across homes, businesses, infrastructure and industries co-located in cities. Such co-location dependent urban-industrial symbiosis strategies offer a new pathway toward urban energy efficiency and health that have not previously been quantified. Key examples include the use of waste industrial heat in other co-located industries, and in residential-commercial district heating-cooling systems of cities. To quantify the impact of these strategies: (1) We develop a new data-set of 637 Chinese cities to assess the potential for efficiency and symbiosis across co-located homes, businesses, industries and the energy and construction sectors in the different cities. (2) A multi-scalar urban systems model quantifies trans-boundary CO2 impacts as well as local health benefits of these uniquely urban, co-location-dependent strategies. (3) CO2 impacts are aggregated across the 637 Chinese cities (home to 701 million people) to quantify national CO2 mitigation potential. (4) The local health benefits are modeled specific to each city and mapped geospatially to identify areas where co-benefits between GHG mitigation and health are maximized. Results: A first-order conservative analysis of co-location dependent urban symbiosis indicates potential for reducing 6% of China's national total CO2 emissions in a relatively short time period, yielding a new pathway not previously considered in China's energy futures models. These reductions (6%) were similar in magnitude to sector-specific industrial, power sector and buildings efficiency strategies that together contributed 9% CO2 reduction aggregated across the nation. CO2 reductions mapped to the 637 cities ranged from <1% to 40%, depending upon co-location patterns, climate and other

  16. Sunburn and sun-protective behaviors among adults with and without previous nonmelanoma skin cancer: a population-based study

    PubMed Central

    Fischer, Alexander H.; Wang, Timothy S.; Yenokyan, Gayane; Kang, Sewon; Chien, Anna L.

    2016-01-01

    Background: Individuals with previous nonmelanoma skin cancer (NMSC) are at increased risk for subsequent skin cancer, and should therefore limit UV exposure. Objective: To determine whether individuals with previous NMSC engage in better sun protection than those with no skin cancer history. Methods: We pooled self-reported data (2005 and 2010 National Health Interview Surveys) from US non-Hispanic white adults (758 with and 34,161 without previous NMSC). We calculated adjusted prevalence odds ratios (aPOR) and 95% confidence intervals (95% CI), taking into account the complex survey design. Results: Individuals with previous NMSC versus no history of NMSC had higher rates of frequent use of shade (44.3% versus 27.0%; aPOR=1.41; 1.16–1.71), long sleeves (20.5% versus 7.7%; aPOR=1.55; 1.21–1.98), a wide-brimmed hat (26.1% versus 10.5%; aPOR=1.52; 1.24–1.87), and sunscreen (53.7% versus 33.1%; aPOR=2.11; 95% CI=1.73–2.59), but did not have significantly lower odds of recent sunburn (29.7% versus 40.7%; aPOR=0.95; 0.77–1.17). Among subjects with previous NMSC, recent sunburn was inversely associated with age, sun avoidance, and shade but not sunscreen. Limitations: Self-reported cross-sectional data and unavailable information quantifying regular sun exposure. Conclusion: Physicians should emphasize sunburn prevention when counseling patients with previous NMSC, especially younger adults, focusing on shade and sun avoidance over sunscreen. PMID:27198078

  17. The use of QLF to quantify in vitro whitening in a product testing model.

    PubMed

    Pretty, I A; Edgar, W M; Higham, S M

    2001-11-24

    Professional and consumer interest in whitening products continues to increase against a background of both increased oral health awareness and demand for cosmetic procedures. In the current legal climate, few dentists are providing 'in-office' whitening treatments, and thus many patients turn to home-use products. The most common of these are the whitening toothpastes. Researchers are keen to quantify the effectiveness of such products through clinically relevant trials. Previous studies examining whitening products have employed a variety of stained substrates to monitor stain removal. This study aimed to quantify the removal of stain from human enamel using a new device, quantitative light-induced fluorescence (QLF). The experimental design follows that of a product-testing model. A total of 11 previously extracted molar teeth were coated with transparent nail varnish leaving an exposed window of enamel. The sound, exposed enamel was subject to a staining regime of human saliva, chlorhexidine and tea. Each of the eleven teeth was subjected to serial exposures of a positive control (Bocasan), a negative control (water) and a test product (Yotuel toothpaste). Following each two-minute exposure QLF images of the teeth were taken (a total of 5 applications). Following completion of one test solution, the teeth were cleaned, re-stained and the procedure repeated with the next solution. QLF images were stored on a PC and analysed by a blinded single examiner. The deltaQ value at 5% threshold was reported. ANOVA and paired t-tests were used to analyse the data. The study confirmed the ability of QLF to longitudinally quantify stain reduction from human enamel. The reliability of the technique in relation to positive and negative test controls was proven. The positive control had a significantly (alpha = 0.05) higher stain removal efficacy than water (p = 0.023) and Yotuel (p = 0.046). Yotuel was more effective than water (p = 0.023). The research community, the

  18. A Bayesian model for quantifying the change in mortality associated with future ozone exposures under climate change.

    PubMed

    Alexeeff, Stacey E; Pfister, Gabriele G; Nychka, Doug

    2016-03-01

    Climate change is expected to have many impacts on the environment, including changes in ozone concentrations at the surface level. A key public health concern is the potential increase in ozone-related summertime mortality if surface ozone concentrations rise in response to climate change. Although ozone formation depends partly on summertime weather, which exhibits considerable inter-annual variability, previous health impact studies have not incorporated the variability of ozone into their prediction models. A major source of uncertainty in the health impacts is the variability of the modeled ozone concentrations. We propose a Bayesian model and Monte Carlo estimation method for quantifying health effects of future ozone. An advantage of this approach is that we include the uncertainty in both the health effect association and the modeled ozone concentrations. Using our proposed approach, we quantify the expected change in ozone-related summertime mortality in the contiguous United States between 2000 and 2050 under a changing climate. The mortality estimates show regional patterns in the expected degree of impact. We also illustrate the results when using a common technique in previous work that averages ozone to reduce the size of the data, and contrast these findings with our own. Our analysis yields more realistic inferences, providing clearer interpretation for decision making regarding the impacts of climate change. © 2015, The International Biometric Society.
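
    As a simplified illustration of propagating uncertainty in both the health-effect association and the modeled ozone change, the sketch below runs a Monte Carlo draw through a standard log-linear health impact function; the coefficient, ozone change, and baseline mortality values are invented, and this is not the authors' full Bayesian model.

      import numpy as np

      rng = np.random.default_rng(2)
      n = 10_000
      beta = rng.normal(5.2e-4, 1.3e-4, n)   # assumed ozone-mortality log-RR per ppb (uncertain association)
      d_ozone = rng.normal(4.0, 2.0, n)      # assumed change in summertime ozone (ppb), incl. model variability
      baseline_deaths = 150_000              # assumed baseline summertime deaths in the study domain

      # standard log-linear health impact function: delta_M = y0 * (1 - exp(-beta * delta_C))
      delta_m = baseline_deaths * (1.0 - np.exp(-beta * d_ozone))
      print(np.percentile(delta_m, [2.5, 50, 97.5]))   # central estimate with an uncertainty interval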

  19. Gradient approach to quantify the gradation smoothness for output media

    NASA Astrophysics Data System (ADS)

    Kim, Youn Jin; Bang, Yousun; Choh, Heui-Keun

    2010-01-01

    We aim to quantify the perception of color gradation smoothness using objectively measurable properties. We propose a model to compute the smoothness of hardcopy color-to-color gradations. It is a gradient-based method in which smoothness is computed as a function of the 95th percentile of the second derivative (the tone-jump estimator) and the 5th percentile of the first derivative (the tone-clipping estimator). The performance of the model and of a previously suggested method was evaluated psychophysically, and their prediction accuracies were compared. Our model showed a stronger Pearson correlation with the corresponding visual data, reaching up to 0.87; its statistical significance was verified through analysis of variance. Color variations of representative memory colors (blue sky, green grass, and Caucasian skin) were rendered as gradational scales and used as the test stimuli.
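
    A minimal numpy sketch of the two estimators named above, applied to a one-dimensional lightness ramp; taking absolute derivatives and the way the two percentiles would be combined into a final smoothness score are assumptions, since the abstract does not give the full model.

      import numpy as np

      def smoothness_estimators(lightness):
          """Tone-jump and tone-clipping estimators from a 1-D gradation profile (e.g. CIE L* along a printed ramp)."""
          d1 = np.gradient(lightness)
          d2 = np.gradient(d1)
          tone_jump = np.percentile(np.abs(d2), 95)   # 95th percentile of the second derivative
          tone_clip = np.percentile(np.abs(d1), 5)    # 5th percentile of the first derivative
          return tone_jump, tone_clip

      ramp = np.linspace(20, 90, 256) + np.random.default_rng(3).normal(0, 0.2, 256)
      print(smoothness_estimators(ramp))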

  20. Repeat immigration: A previously unobserved source of heterogeneity?

    PubMed

    Aradhya, Siddartha; Scott, Kirk; Smith, Christopher D

    2017-07-01

    Register data allow for nuanced analyses of heterogeneities between sub-groups which are not observable in other data sources. One heterogeneity for which register data is particularly useful is in identifying unique migration histories of immigrant populations, a group of interest across disciplines. Years since migration is a commonly used measure of integration in studies seeking to understand the outcomes of immigrants. This study constructs detailed migration histories to test whether misclassified migrations may mask important heterogeneities. In doing so, we identify a previously understudied group of migrants called repeat immigrants, and show that they differ systematically from permanent immigrants. In addition, we quantify the degree to which migration information is misreported in the registers. The analysis is carried out in two steps. First, we estimate income trajectories for repeat immigrants and permanent immigrants to understand the degree to which they differ. Second, we test data validity by cross-referencing migration information with changes in income to determine whether there are inconsistencies indicating misreporting. From the first part of the analysis, the results indicate that repeat immigrants systematically differ from permanent immigrants in terms of income trajectories. Furthermore, income trajectories differ based on the way in which years since migration is calculated. The second part of the analysis suggests that misreported migration events, while present, are negligible. Repeat immigrants differ in terms of income trajectories, and may differ in terms of other outcomes as well. Furthermore, this study underlines that Swedish registers provide a reliable data source to analyze groups which are unidentifiable in other data sources.

  1. Identifying and Quantifying the Intermediate Processes during Nitrate-Dependent Iron(II) Oxidation.

    PubMed

    Jamieson, James; Prommer, Henning; Kaksonen, Anna H; Sun, Jing; Siade, Adam J; Yusov, Anna; Bostick, Benjamin

    2018-05-15

    Microbially driven nitrate-dependent iron (Fe) oxidation (NDFO) in subsurface environments has been intensively studied. However, the extent to which Fe(II) oxidation is biologically catalyzed remains unclear because no neutrophilic iron-oxidizing and nitrate-reducing autotroph has been isolated to confirm the existence of an enzymatic pathway. While mixotrophic NDFO bacteria have been isolated, understanding the process is complicated by simultaneous abiotic oxidation due to nitrite produced during denitrification. In this study, the relative contributions of biotic and abiotic processes during NDFO were quantified through the compilation and model-based interpretation of previously published experimental data. The kinetics of chemical denitrification by Fe(II) (chemodenitrification) were assessed, and compelling evidence was found for the importance of organic ligands, specifically exopolymeric substances secreted by bacteria, in enhancing abiotic oxidation of Fe(II). However, nitrite alone could not explain the observed magnitude of Fe(II) oxidation, with 60-75% of overall Fe(II) oxidation attributed to an enzymatic pathway for investigated strains: Acidovorax (A.) strain BoFeN1, 2AN, A. ebreus strain TPSY, Paracoccus denitrificans Pd 1222, and Pseudogulbenkiania sp. strain 2002. By rigorously quantifying the intermediate processes, this study eliminated the potential for abiotic Fe(II) oxidation to be exclusively responsible for NDFO and verified the key contribution from an additional, biological Fe(II) oxidation process catalyzed by NDFO bacteria.

  2. Cross-linguistic patterns in the acquisition of quantifiers

    PubMed Central

    Cummins, Chris; Gavarró, Anna; Kuvač Kraljević, Jelena; Hrzica, Gordana; Grohmann, Kleanthes K.; Skordi, Athina; Jensen de López, Kristine; Sundahl, Lone; van Hout, Angeliek; Hollebrandse, Bart; Overweg, Jessica; Faber, Myrthe; van Koert, Margreet; Smith, Nafsika; Vija, Maigi; Zupping, Sirli; Kunnari, Sari; Morisseau, Tiffany; Rusieshvili, Manana; Yatsushiro, Kazuko; Fengler, Anja; Varlokosta, Spyridoula; Konstantzou, Katerina; Farby, Shira; Guasti, Maria Teresa; Vernice, Mirta; Okabe, Reiko; Isobe, Miwa; Crosthwaite, Peter; Hong, Yoonjee; Balčiūnienė, Ingrida; Ahmad Nizar, Yanti Marina; Grech, Helen; Gatt, Daniela; Cheong, Win Nee; Asbjørnsen, Arve; Torkildsen, Janne von Koss; Haman, Ewa; Miękisz, Aneta; Gagarina, Natalia; Puzanova, Julia; Anđelković, Darinka; Savić, Maja; Jošić, Smiljana; Slančová, Daniela; Kapalková, Svetlana; Barberán, Tania; Özge, Duygu; Hassan, Saima; Chan, Cecilia Yuet Hung; Okubo, Tomoya; van der Lely, Heather; Sauerland, Uli; Noveck, Ira

    2016-01-01

    Learners of most languages are faced with the task of acquiring words to talk about number and quantity. Much is known about the order of acquisition of number words as well as the cognitive and perceptual systems and cultural practices that shape it. Substantially less is known about the acquisition of quantifiers. Here, we consider the extent to which systems and practices that support number word acquisition can be applied to quantifier acquisition and conclude that the two domains are largely distinct in this respect. Consequently, we hypothesize that the acquisition of quantifiers is constrained by a set of factors related to each quantifier’s specific meaning. We investigate competence with the expressions for “all,” “none,” “some,” “some…not,” and “most” in 31 languages, representing 11 language types, by testing 768 5-y-old children and 536 adults. We found a cross-linguistically similar order of acquisition of quantifiers, explicable in terms of four factors relating to their meaning and use. In addition, exploratory analyses reveal that language- and learner-specific factors, such as negative concord and gender, are significant predictors of variation. PMID:27482119

  3. Bayesian focalization: quantifying source localization with environmental uncertainty.

    PubMed

    Dosso, Stan E; Wilmut, Michael J

    2007-05-01

    This paper applies a Bayesian formulation to study ocean acoustic source localization as a function of uncertainty in environmental properties (water column and seabed) and of data information content [signal-to-noise ratio (SNR) and number of frequencies]. The approach follows that of the optimum uncertain field processor [A. M. Richardson and L. W. Nolte, J. Acoust. Soc. Am. 89, 2280-2284 (1991)], in that localization uncertainty is quantified by joint marginal probability distributions for source range and depth integrated over uncertain environmental properties. The integration is carried out here using Metropolis Gibbs' sampling for environmental parameters and heat-bath Gibbs' sampling for source location to provide efficient sampling over complicated parameter spaces. The approach is applied to acoustic data from a shallow-water site in the Mediterranean Sea where previous geoacoustic studies have been carried out. It is found that reliable localization requires a sufficient combination of prior (environmental) information and data information. For example, sources can be localized reliably for single-frequency data at low SNR (-3 dB) only with small environmental uncertainties, whereas successful localization with large environmental uncertainties requires higher SNR and/or multifrequency data.
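
    To illustrate the idea of quantifying localization uncertainty by integrating over uncertain environmental properties, the toy sketch below marginalizes a single-path travel-time likelihood over an uncertain sound speed on a range-depth grid; it is not the paper's Metropolis/heat-bath Gibbs sampler, and all numbers are invented.

      import numpy as np

      rng = np.random.default_rng(0)
      ranges = np.linspace(1e3, 10e3, 91)        # candidate source ranges (m)
      depths = np.linspace(5, 95, 46)            # candidate source depths (m)
      c_samples = rng.normal(1500.0, 5.0, 200)   # uncertain water sound speed (m/s)

      def arrival_time(r, d, c):
          return np.sqrt(r**2 + d**2) / c        # toy single-path travel time

      sigma = 1e-4                               # travel-time noise (s)
      obs = arrival_time(6e3, 40.0, 1500.0) + rng.normal(0, sigma)

      R, D = np.meshgrid(ranges, depths, indexing="ij")
      post = np.zeros_like(R)
      for c in c_samples:                        # marginalize the likelihood over the environment
          post += np.exp(-0.5 * ((arrival_time(R, D, c) - obs) / sigma) ** 2)
      post /= post.sum()

      marg_range, marg_depth = post.sum(axis=1), post.sum(axis=0)
      print("MAP range (m):", ranges[marg_range.argmax()], "MAP depth (m):", depths[marg_depth.argmax()])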

  4. Quantifying uncertainties in precipitation measurement

    NASA Astrophysics Data System (ADS)

    Chen, H. Z. D.

    2017-12-01

    The scientific community has a long history of using precipitation data in climate model design. However, precipitation records and the models built on them contain more uncertainty than their temperature counterparts. The literature shows that precipitation measurements are highly influenced by the surrounding environment, and weather stations are traditionally situated in open areas and subject to various limitations. This restriction limits the ability of the scientific community to fully close the loop on the water cycle. Horizontal redistribution has been shown to be a major factor influencing precipitation measurements, and efforts have been made to reduce its effect on the monitoring apparatus. However, the factors contributing to this uncertainty are numerous and difficult to fully capture, so the noise in precipitation data remains high. This study aims to quantify the uncertainties in precipitation data by factoring out horizontal redistribution and measuring it directly. The horizontal contribution to precipitation will be quantified by measuring precipitation at different heights, with one gauge directly shadowing the other. The upper collection represents traditional precipitation data, whereas the lower measurements sum up the overall error term at a given location. Measurements will be recorded and correlated with the nearest available wind measurements to quantify wind's impact on the traditional precipitation record. Collections at different locations will also be compared to see whether this phenomenon is location specific or whether a general trend can be derived. We aim to demonstrate a new way to isolate the noise component in traditional precipitation data via empirical measurements, and in doing so improve the overall quality of the historic precipitation record and provide more accurate information for the design and calibration of large-scale climate models.

  5. Acute fatal hemorrhage from previously undiagnosed cerebral arteriovenous malformations in children: a single-center experience.

    PubMed

    Riordan, Coleman P; Orbach, Darren B; Smith, Edward R; Scott, R Michael

    2018-06-01

    OBJECTIVE The most significant adverse outcome of intracranial hemorrhage from an arteriovenous malformation (AVM) is death. This study reviews a single-center experience with pediatric AVMs to quantify the incidence and characterize clinical and radiographic factors associated with sudden death from the hemorrhage of previously undiagnosed AVMs in children. METHODS A single-center database review of the period from 2006 to 2017 identified all patients with a first-time intracranial hemorrhage from a previously undiagnosed AVM. Clinical and radiographic data were collected and compared between patients who survived to hospital discharge and those who died at presentation. RESULTS A total of 57 patients (average age 10.8 years, range 0.1-19 years) presented with first-time intracranial hemorrhage from a previously undiagnosed AVM during the study period. Of this group, 7/57 (12%) patients (average age 11.5 years, range 6-16 years) suffered hemorrhages that led directly to their deaths. Compared to the cohort of patients who survived their hemorrhage, patients who died were 4 times more likely to have an AVM in the posterior fossa. No clear pattern of antecedent triggering activity (sports, trauma, etc.) was identified, and 3/7 (43%) experienced cardiac arrest in the prehospital setting. Surviving patients were ultimately treated with resection of the AVM in 42/50 (84%) of cases. CONCLUSIONS Children who present with hemorrhage from a previously undiagnosed intracranial AVM had a 12% chance of sudden death in our single-institution series of pediatric cerebrovascular cases. Clinical triggers of hemorrhage are unpredictable, but subsequent radiographic evidence of a posterior fossa AVM was present in 57% of fatal cases, and all fatal cases were in locations with high risk of potential herniation. These data support a proactive, aggressive approach toward definitive treatment of AVMs in children.

  6. THE SEGUE K GIANT SURVEY. III. QUANTIFYING GALACTIC HALO SUBSTRUCTURE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Janesh, William; Morrison, Heather L.; Ma, Zhibo

    2016-01-10

    We statistically quantify the amount of substructure in the Milky Way stellar halo using a sample of 4568 halo K giant stars at Galactocentric distances ranging over 5–125 kpc. These stars have been selected photometrically and confirmed spectroscopically as K giants from the Sloan Digital Sky Survey’s Sloan Extension for Galactic Understanding and Exploration project. Using a position–velocity clustering estimator (the 4distance) and a model of a smooth stellar halo, we quantify the amount of substructure in the halo, divided by distance and metallicity. Overall, we find that the halo as a whole is highly structured. We also confirm earlier work using blue horizontal branch (BHB) stars which showed that there is an increasing amount of substructure with increasing Galactocentric radius, and additionally find that the amount of substructure in the halo increases with increasing metallicity. Comparing to resampled BHB stars, we find that K giants and BHBs have similar amounts of substructure over equivalent ranges of Galactocentric radius. Using a friends-of-friends algorithm to identify members of individual groups, we find that a large fraction (∼33%) of grouped stars are associated with Sgr, and identify stars belonging to other halo star streams: the Orphan Stream, the Cetus Polar Stream, and others, including previously unknown substructures. A large fraction of sample K giants (more than 50%) are not grouped into any substructure. We find also that the Sgr stream strongly dominates groups in the outer halo for all except the most metal-poor stars, and suggest that this is the source of the increase of substructure with Galactocentric radius and metallicity.
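
    The friends-of-friends grouping step mentioned above can be sketched with off-the-shelf tools: link every pair of stars separated by less than a chosen linking length and take connected components of the resulting graph. The 4-D toy coordinates and linking length below are placeholders, not the survey's actual 4distance metric.

      import numpy as np
      from sklearn.neighbors import radius_neighbors_graph
      from scipy.sparse.csgraph import connected_components

      def friends_of_friends(points, linking_length):
          """Group points whose chained pairwise separations stay below the linking length."""
          graph = radius_neighbors_graph(points, radius=linking_length, mode="connectivity")
          n_groups, labels = connected_components(graph, directed=False)
          return n_groups, labels

      rng = np.random.default_rng(4)
      coords = rng.normal(size=(1000, 4))        # toy 4-D position-velocity coordinates
      n_groups, labels = friends_of_friends(coords, linking_length=0.4)
      print(n_groups, "groups;", np.bincount(labels).max(), "stars in the largest group")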

  7. Integrating SANS and fluid-invasion methods to characterize pore structure of typical American shale oil reservoirs.

    PubMed

    Zhao, Jianhua; Jin, Zhijun; Hu, Qinhong; Jin, Zhenkui; Barber, Troy J; Zhang, Yuxiang; Bleuel, Markus

    2017-11-13

    An integration of small-angle neutron scattering (SANS), low-pressure N2 physisorption (LPNP), and mercury injection capillary pressure (MICP) methods was employed to study the pore structure of four oil shale samples from the leading Niobrara, Wolfcamp, Bakken, and Utica Formations in the USA. Porosity values obtained from SANS are higher than those from the two fluid-invasion methods, due to the ability of neutrons to probe pore spaces inaccessible to N2 and mercury. However, the SANS and LPNP methods exhibit a similar pore-size distribution, and both methods (which measure total pore volume) yield porosities and pore-size distributions that differ from those obtained by the MICP method (which quantifies pore throats). Multi-scale (five pore-diameter intervals) porosity inaccessible to N2 was determined using SANS and LPNP data. Overall, a large value of inaccessible porosity occurs at pore diameters <10 nm, which we attribute to low connectivity of organic matter-hosted and clay-associated pores in these shales. While each method probes a unique aspect of the complex pore structure of shale, the discrepancy between pore structure results from different methods is explained with respect to their differences in measurable ranges of pore diameter, pore space, pore type, sample size and associated pore connectivity, as well as theoretical basis and interpretation.

  8. Visual Attention and Quantifier-Spreading in Heritage Russian Bilinguals

    ERIC Educational Resources Information Center

    Sekerina, Irina A.; Sauermann, Antje

    2015-01-01

    It is well established in language acquisition research that monolingual children and adult second language learners misinterpret sentences with the universal quantifier "every" and make quantifier-spreading errors that are attributed to a preference for a match in number between two sets of objects. The present Visual World eye-tracking…

  9. Dynamic kinematic responses of female volunteers in rear impacts and comparison to previous male volunteer tests.

    PubMed

    Carlsson, Anna; Linder, Astrid; Davidsson, Johan; Hell, Wolfram; Schick, Sylvia; Svensson, Mats

    2011-08-01

    The objective was to quantify dynamic responses of 50th percentile females in rear impacts and compare to those from similar tests with males. The results will serve as a basis for future work with models, criteria, and safety systems. A rear impact sled test series with 8 female volunteers was performed at velocity changes of 5 and 7 km/h. The following dynamic response corridors were generated for the head, T1 (first thoracic vertebra) and head relative to T1: (1) accelerations in posterior-anterior direction, (2) horizontal and vertical displacements, (3) angular displacements for 6 females close to the 50th percentile in size. Additionally, the head-to-head restraint distance and contact time and neck injury criterion (NIC) were extracted from the data set. These data were compared to results from previously performed male volunteer tests, representing the 50th percentile male, in equivalent test conditions. T-tests were performed with the statistical significance level of .05 to quantify the significance of the parameter value differences for the males and females. At 7 km/h, the females showed 29 percent earlier head-to-head restraint contact time (p = .0072); 27 percent shorter horizontal rearward head displacement (p = .0017); 36 percent narrower head extension angle (p = .0281); and 52 percent lower NIC value (p = .0239) than the males in previous tests. This was mainly due to 35 percent shorter initial head-to-head restraint distance for the females (p = .0125). The peak head acceleration in the posterior-anterior direction was higher and occurred earlier for the females. The overall result indicated differences in the dynamic response for the female and male volunteers. The results could be used in developing and evaluating a mechanical and/or mathematical average-sized female dummy model for rear impact safety assessment. These models can be used as a tool in the design of protective systems and for further development and evaluation of injury criteria.
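
    A brief sketch of the kind of two-sample comparison reported above (a Welch t-test at the 0.05 level); the NIC values are invented placeholders rather than the study's measurements.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(5)
      female_nic = rng.normal(8.0, 2.0, 8)     # invented NIC values (m^2/s^2) for the 8 female volunteers
      male_nic = rng.normal(16.0, 4.0, 9)      # invented values for the earlier male series

      t, p = stats.ttest_ind(female_nic, male_nic, equal_var=False)   # Welch's t-test
      print(f"t = {t:.2f}, p = {p:.4f}, significant at alpha = 0.05: {p < 0.05}")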

  10. Pituitary-adrenocortical adjustments to transport stress in horses with previous different handling and transport conditions

    PubMed Central

    Fazio, E.; Medica, P.; Cravana, C.; Ferlazzo, A.

    2016-01-01

    Aim: Activation of the hypothalamic-pituitary-adrenal (HPA) axis in response to long-distance transportation results in increased adrenocorticotropic hormone (ACTH) and cortisol levels. The purpose of the study was to quantify the effect of short-term road transport stress on circulating ACTH and cortisol concentrations in relation to the previous handling and transport experience of horses. Materials and Methods: The study was performed on 56 healthy horses after short-term road transport of 30 km. The horses were divided into four groups, Groups A, B, C, and D, with respect to handling quality: good (Groups A and B), bad (Group D), and minimal handling (Group C) conditions. According to their previous transport experience, horses were divided as follows: horses of Groups A and D had experienced long-distance transportation before; horses of Groups B and C had limited experience of transportation. Results: One-way RM-ANOVA showed significant effects of transport on ACTH changes in Groups B and C and on cortisol changes in both Groups A and B. Groups A and B showed lower baseline ACTH and cortisol values than Groups C and D; Groups A and B showed lower post-transport ACTH values than Groups C and D. Groups A, B, and C showed lower post-transport cortisol values than Group D. Only horses of Groups A and B showed an adequate capacity for a stress response to transportation. Conclusion: Previous transport experience and quality of handling could influence the physiological HPA axis responses of horses after short-term road transport. PMID:27651674

  11. Increasing the Reliability of Circulation Model Validation: Quantifying Drifter Slip to See how Currents are Actually Moving

    NASA Astrophysics Data System (ADS)

    Anderson, T.

    2016-02-01

    Ocean circulation forecasts can help answer questions regarding larval dispersal, passive movement of injured sea animals, oil spill mitigation, and search and rescue efforts. Circulation forecasts are often validated with GPS-tracked drifter paths, but how accurately do these drifters actually move with ocean currents? Drifters are not only moved by water, but are also forced by wind and waves acting on the exposed buoy and transmitter; this imperfect movement is referred to as drifter slip. The quantification and further understanding of drifter slip will allow scientists to differentiate between drifter imperfections and actual computer model error when comparing trajectory forecasts with actual drifter tracks. This will avoid falsely accrediting all discrepancies between a trajectory forecast and an actual drifter track to computer model error. During multiple deployments of drifters in Nantucket Sound and using observed wind and wave data, we attempt to quantify the slip of drifters developed by the Northeast Fisheries Science Center's (NEFSC) Student Drifters Program. While similar studies have been conducted previously, very few have directly attached current meters to drifters to quantify drifter slip. Furthermore, none have quantified slip of NEFSC drifters relative to the oceanographic-standard "CODE" drifter. The NEFSC drifter archive has over 1000 drifter tracks primarily off the New England coast. With a better understanding of NEFSC drifter slip, modelers can reliably use these tracks for model validation.

  12. Increasing the Reliability of Circulation Model Validation: Quantifying Drifter Slip to See how Currents are Actually Moving

    NASA Astrophysics Data System (ADS)

    Anderson, T.

    2015-12-01

    Ocean circulation forecasts can help answer questions regarding larval dispersal, passive movement of injured sea animals, oil spill mitigation, and search and rescue efforts. Circulation forecasts are often validated with GPS-tracked drifter paths, but how accurately do these drifters actually move with ocean currents? Drifters are not only moved by water, but are also forced by wind and waves acting on the exposed buoy and transmitter; this imperfect movement is referred to as drifter slip. The quantification and further understanding of drifter slip will allow scientists to differentiate between drifter imperfections and actual computer model error when comparing trajectory forecasts with actual drifter tracks. This will avoid falsely accrediting all discrepancies between a trajectory forecast and an actual drifter track to computer model error. During multiple deployments of drifters in Nantucket Sound and using observed wind and wave data, we attempt to quantify the slip of drifters developed by the Northeast Fisheries Science Center's (NEFSC) Student Drifters Program. While similar studies have been conducted previously, very few have directly attached current meters to drifters to quantify drifter slip. Furthermore, none have quantified slip of NEFSC drifters relative to the oceanographic-standard "CODE" drifter. The NEFSC drifter archive has over 1000 drifter tracks primarily off the New England coast. With a better understanding of NEFSC drifter slip, modelers can reliably use these tracks for model validation.

  13. Quantifying collagen orientation in breast tissue biopsies using SLIM (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Majeed, Hassaan; Okoro, Chukwuemeka; Balla, Andre; Toussaint, Kimani C.; Popescu, Gabriel

    2017-02-01

    Breast cancer is a major public health problem worldwide, being the most common type of cancer among women according to the World Health Organization (WHO). The WHO has further stressed the importance of an early determination of the disease course through prognostic markers. Recent studies have shown that the alignment of collagen fibers in tumor-adjacent stroma correlates with poorer health outcomes in patients. Such studies have typically been carried out using Second-Harmonic Generation (SHG) microscopy. SHG images are very useful for quantifying collagen fiber orientation due to their specificity to non-centrosymmetric structures in tissue, leading to high contrast in collagen-rich areas. However, the imaging throughput in SHG microscopy is limited by its point scanning geometry. In this work, we show that SLIM, a wide-field, high-throughput quantitative phase imaging (QPI) technique, can be used to obtain the same information on collagen fiber orientation as is obtainable through SHG microscopy. We imaged a tissue microarray containing both benign and malignant cores using both SHG microscopy and SLIM. The cellular (non-collagenous) structures in the SLIM images were next segmented out using an algorithm developed in-house. Using the previously published Fourier Transform Second Harmonic Generation (FT-SHG) tool, the fiber orientations in SHG and segmented SLIM images were then quantified. The resulting histograms of fiber orientation angles showed that both SHG and SLIM generate similar measurements of collagen fiber orientation. The SLIM modality, however, can generate these results at much higher throughput due to its wide-field, whole-slide scanning capabilities.

  14. Estimating Regional Mass Balance of Himalayan Glaciers Using Hexagon Imagery: An Automated Approach

    NASA Astrophysics Data System (ADS)

    Maurer, J. M.; Rupper, S.

    2013-12-01

    Currently there is much uncertainty regarding the present and future state of Himalayan glaciers, which supply meltwater for river systems vital to more than 1.4 billion people living throughout Asia. Previous assessments of regional glacier mass balance in the Himalayas using various remote sensing and field-based methods give inconsistent results, and most assessments are over relatively short (e.g., single decade) timescales. This study aims to quantify multi-decadal changes in volume and extent of Himalayan glaciers through efficient use of the large database of declassified 1970-80s era Hexagon stereo imagery. Automation of the DEM extraction process provides an effective workflow for many images to be processed and glacier elevation changes quantified with minimal user input. The tedious procedure of manual ground control point selection necessary for block-bundle adjustment (as ephemeris data are not available for the declassified images) is automated using the Maximally Stable Extremal Regions algorithm, which matches image elements between raw Hexagon images and georeferenced Landsat 15-m panchromatic images. Additional automated Hexagon DEM processing, co-registration, and bias correction allow for direct comparison with modern ASTER and SRTM elevation data, thus quantifying glacier elevation and area changes over several decades across largely inaccessible mountainous regions. As consistent methodology is used for all glaciers, results will likely reveal significant spatial and temporal patterns in regional ice mass balance. Ultimately, these findings could have important implications for future water resource management in light of environmental change.
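
    A rough sketch of automated tie-point generation in the spirit described above, pairing OpenCV's MSER detector with SIFT descriptors and a ratio test; the synthetic images, thresholds, and matching strategy are assumptions rather than the authors' actual pipeline.

      import cv2
      import numpy as np

      # synthetic stand-ins for a raw Hexagon frame and a georeferenced Landsat panchromatic band
      rng = np.random.default_rng(8)
      base = (rng.random((400, 400)) * 40).astype(np.uint8)
      for cx, cy in [(80, 100), (200, 250), (320, 150)]:
          cv2.circle(base, (cx, cy), 18, 255, -1)           # bright "landmarks"
      hexagon = base
      landsat = np.roll(base, shift=(12, -7), axis=(0, 1))  # same scene, offset

      mser = cv2.MSER_create()
      sift = cv2.SIFT_create()

      # detect maximally stable extremal regions as keypoints, then describe and match them
      kp1, kp2 = mser.detect(hexagon), mser.detect(landsat)
      kp1, des1 = sift.compute(hexagon, kp1)
      kp2, des2 = sift.compute(landsat, kp2)

      matches = cv2.BFMatcher(cv2.NORM_L2).knnMatch(des1, des2, k=2)
      good = [m for m, n in matches if m.distance < 0.7 * n.distance]   # Lowe ratio test

      # matched point pairs would seed ground control points for block-bundle adjustment
      pairs = [(kp1[m.queryIdx].pt, kp2[m.trainIdx].pt) for m in good]
      print(len(pairs), "candidate tie points")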

  15. Quantifying torso deformity in scoliosis

    NASA Astrophysics Data System (ADS)

    Ajemba, Peter O.; Kumar, Anish; Durdle, Nelson G.; Raso, V. James

    2006-03-01

    Scoliosis affects the alignment of the spine and the shape of the torso. Most scoliosis patients and their families are more concerned about the effect of scoliosis on the torso than its effect on the spine. There is a need to develop robust techniques for quantifying torso deformity based on full torso scans. In this paper, deformation indices obtained from orthogonal maps of full torso scans are used to quantify torso deformity in scoliosis. 'Orthogonal maps' are obtained by applying orthogonal transforms to 3D surface maps. (An 'orthogonal transform' maps a cylindrical coordinate system to a Cartesian coordinate system.) The technique was tested on 361 deformed computer models of the human torso and on 22 scans of volunteers (8 normal and 14 scoliosis). Deformation indices from the orthogonal maps correctly classified up to 95% of the volunteers with a specificity of 1.00 and a sensitivity of 0.91. In addition to classifying scoliosis, the system gives a visual representation of the entire torso in one view and is viable for use in a clinical environment for managing scoliosis.

  16. Quantifying the ice-albedo feedback through decoupling

    NASA Astrophysics Data System (ADS)

    Kravitz, B.; Rasch, P. J.

    2017-12-01

    The ice-albedo feedback involves numerous individual components, whereby warming induces sea ice melt, inducing reduced surface albedo, inducing increased surface shortwave absorption, causing further warming. Here we attempt to quantify the sea ice albedo feedback using an analogue of the "partial radiative perturbation" method, but where the governing mechanisms are directly decoupled in a climate model. As an example, we can isolate the insulating effects of sea ice on surface energy and moisture fluxes by allowing sea ice thickness to change but fixing Arctic surface albedo, or vice versa. Here we present results from such idealized simulations using the Community Earth System Model in which individual components are successively fixed, effectively decoupling the ice-albedo feedback loop. We isolate the different components of this feedback, including temperature change, sea ice extent/thickness, and air-sea exchange of heat and moisture. We explore the interactions between these different components, as well as the strengths of the total feedback in the decoupled feedback loop, to quantify contributions from individual pieces. We also quantify the non-additivity of the effects of the components as a means of investigating the dominant sources of nonlinearity in the ice-albedo feedback.

  17. Quantifying the transmission potential of pandemic influenza

    NASA Astrophysics Data System (ADS)

    Chowell, Gerardo; Nishiura, Hiroshi

    2008-03-01

    This article reviews quantitative methods to estimate the basic reproduction number of pandemic influenza, a key threshold quantity to help determine the intensity of interventions required to control the disease. Although it is difficult to assess the transmission potential of a probable future pandemic, historical epidemiologic data are readily available from previous pandemics, and as a reference quantity for future pandemic planning, mathematical and statistical analyses of historical data are crucial. In particular, because many historical records tend to document only the temporal distribution of cases or deaths (i.e. the epidemic curve), our review focuses on methods to maximize the utility of time-evolution data and to clarify the detailed mechanisms of the spread of influenza. First, we highlight structured epidemic models and their parameter estimation methods, which can quantify detailed disease dynamics, including those we cannot observe directly. Duration-structured epidemic systems are subsequently presented, offering a firm understanding of the definitions of the basic and effective reproduction numbers. When the initial growth phase of an epidemic is investigated, the distribution of the generation time is key statistical information for appropriately estimating the transmission potential using the intrinsic growth rate. Applications of stochastic processes are also highlighted to estimate the transmission potential using similar data. Critically important characteristics of influenza data are subsequently summarized, followed by our conclusions, which suggest potential future methodological improvements.
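
    The link between an epidemic's initial growth rate and its reproduction number, highlighted above, can be written as R = 1/M(−r), a standard relation in which M is the moment generating function of the generation-time distribution. A short sketch for a gamma-distributed generation time follows; the case counts and generation-time parameters are invented for illustration.

      import numpy as np

      def reproduction_number(r, mean_gt, shape):
          """R = 1/M(-r) for a gamma-distributed generation time with the given mean (days) and shape."""
          return (1.0 + r * mean_gt / shape) ** shape

      # toy exponential-growth fit to early (invented) daily case counts
      cases = np.array([3, 4, 6, 9, 13, 20, 29, 44])
      days = np.arange(len(cases))
      r = np.polyfit(days, np.log(cases), 1)[0]        # intrinsic growth rate per day
      print(reproduction_number(r, mean_gt=3.0, shape=2.0))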

  18. Sunburn and sun-protective behaviors among adults with and without previous nonmelanoma skin cancer (NMSC): A population-based study.

    PubMed

    Fischer, Alexander H; Wang, Timothy S; Yenokyan, Gayane; Kang, Sewon; Chien, Anna L

    2016-08-01

    Individuals with previous nonmelanoma skin cancer (NMSC) are at increased risk for subsequent skin cancer, and should therefore limit ultraviolet exposure. We sought to determine whether individuals with previous NMSC engage in better sun protection than those with no skin cancer history. We pooled self-reported data (2005 and 2010 National Health Interview Surveys) from US non-Hispanic white adults (758 with and 34,161 without previous NMSC). We calculated adjusted prevalence odds ratios (aPOR) and 95% confidence intervals (CI), taking into account the complex survey design. Individuals with previous NMSC versus no history of NMSC had higher rates of frequent use of shade (44.3% vs 27.0%; aPOR 1.41; 95% CI 1.16-1.71), long sleeves (20.5% vs 7.7%; aPOR 1.55; 95% CI 1.21-1.98), a wide-brimmed hat (26.1% vs 10.5%; aPOR 1.52; 95% CI 1.24-1.87), and sunscreen (53.7% vs 33.1%; aPOR 2.11; 95% CI 1.73-2.59), but did not have significantly lower odds of recent sunburn (29.7% vs 40.7%; aPOR 0.95; 95% CI 0.77-1.17). Among those with previous NMSC, recent sunburn was inversely associated with age, sun avoidance, and shade but not sunscreen. Self-reported cross-sectional data and unavailable information quantifying regular sun exposure are limitations. Physicians should emphasize sunburn prevention when counseling patients with previous NMSC, especially younger adults, focusing on shade and sun avoidance over sunscreen. Copyright © 2016 American Academy of Dermatology, Inc. Published by Elsevier Inc. All rights reserved.

  19. Quantifying edge significance on maintaining global connectivity

    PubMed Central

    Qian, Yuhua; Li, Yebin; Zhang, Min; Ma, Guoshuai; Lu, Furong

    2017-01-01

    Global connectivity is an important issue for networks: the failure of a few key edges may lead to breakdown of the whole system, and finding them provides a better understanding of system robustness. Based on topological information, we propose an approach named LE (link entropy) to quantify edge significance for maintaining global connectivity. We then compare LE with six acknowledged indices of edge significance: edge betweenness centrality, degree product, bridgeness, diffusion importance, topological overlap, and k-path edge centrality. Experimental results show that the LE approach outperforms these indices in quantifying edge significance for maintaining global connectivity. PMID:28349923
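
    The LE index itself is not specified in the abstract, but two of the ingredients it is compared against, edge betweenness centrality and a direct check of how edge removal fragments the graph, are easy to sketch with networkx; the example network and the ranking heuristic below are illustrative only.

      import networkx as nx

      G = nx.karate_club_graph()                 # stand-in network; the paper's LE index is not reproduced here

      # one comparison baseline from the abstract: edge betweenness centrality
      eb = nx.edge_betweenness_centrality(G)

      def fragmentation_on_removal(G, edge):
          """How many extra connected components appear if this edge is removed."""
          H = G.copy()
          H.remove_edge(*edge)
          return nx.number_connected_components(H) - nx.number_connected_components(G)

      ranked = sorted(G.edges(), key=lambda e: (fragmentation_on_removal(G, e), eb[e]), reverse=True)
      print(ranked[:5])                          # edges whose loss most threatens global connectivity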

  20. Quantifying cell mono-layer cultures by video imaging.

    PubMed

    Miller, K S; Hook, L A

    1996-04-01

    A method is described in which the relative number of adherent cells in multi-well tissue-culture plates is assayed by staining the cells with Giemsa and capturing the image of the stained cells with a video camera and charge-coupled device (CCD). The resultant image is quantified using the associated video imaging software. The method is shown to be sensitive and reproducible and should be useful for studies where quantifying relative cell numbers and/or proliferation in vitro is required.

  1. 76 FR 23538 - Notice of Intent To Reinstate a Previously Approved Information Collection.

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-04-27

    ... management, and conservation practices in order to quantify and assess current impacts of farming practices... (USDA) in 2003 as a multi-agency effort to quantify the environmental effects of conservation practices...) conducted on-site interviews with farmers during 2003-2006 to document tillage and irrigation practices...

  2. Non-Markovianity quantifier of an arbitrary quantum process

    NASA Astrophysics Data System (ADS)

    Debarba, Tiago; Fanchini, Felipe F.

    2017-12-01

    Calculating the degree of non-Markovianity of a quantum process for a high-dimensional system is a difficult task, given the complex maximization problems involved. Focusing on the entanglement-based measure of non-Markovianity, we propose a numerically feasible quantifier for finite-dimensional systems. We define the non-Markovianity measure in terms of a class of entanglement quantifiers named witnessed entanglement, which allows us to write several entanglement-based measures of non-Markovianity in a single formalism. In this formalism, we show that non-Markovianity in a given time interval can be witnessed by calculating the expectation value of an observable, making it attractive for experimental investigations. Building on this property, we introduce a quantifier based on the entanglement witness over an interval of time and show that it is a bona fide measure of non-Markovianity. In our example, we use the generalized robustness of entanglement, an entanglement measure that can be readily calculated by a semidefinite programming method, to study impurity atoms coupled to a Bose-Einstein condensate.

  3. Quantifying functional mobility progress for chronic disease management.

    PubMed

    Boyle, Justin; Karunanithi, Mohan; Wark, Tim; Chan, Wilbur; Colavitti, Christine

    2006-01-01

    A method for quantifying improvements in functional mobility is presented based on patient-worn accelerometer devices. For patients with cardiovascular, respiratory, or other chronic disease, increasing the amount of functional mobility is a large component of rehabilitation programs. We have conducted an observational trial on the use of accelerometers for quantifying mobility improvements in a small group of chronic disease patients (n=15, 48 - 86 yrs). Cognitive impairments precluded complex instrumentation of patients, and movement data was obtained from a single 2-axis accelerometer device worn at the hip. In our trial, movement data collected from accelerometer devices was classified into Lying vs Sitting/Standing vs Walking/Activity movements. This classification enabled the amount of walking to be quantified and graphically presented to clinicians and carers for feedback on exercise efficacy. Presenting long term trends in this data to patients also provides valuable feedback for self managed care and assisting with compliance.
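
    A toy sketch of the Lying vs Sitting/Standing vs Walking/Activity classification described above, using simple window statistics from a 2-axis hip accelerometer; the sampling rate, thresholds, and axis convention are assumptions, not the trial's actual classifier.

      import numpy as np

      def classify_activity(ax, ay, fs=20.0, win_s=5.0, walk_std=0.15, lying_tilt=0.5):
          """Window-by-window labels from a 2-axis hip accelerometer (units of g)."""
          n = int(fs * win_s)
          labels = []
          for i in range(0, len(ax) - n + 1, n):
              wx, wy = ax[i:i + n], ay[i:i + n]
              dynamic = np.std(np.hypot(wx, wy))     # movement intensity within the window
              vertical = abs(np.mean(wy))            # assumed gravity component on the "vertical" axis
              if dynamic > walk_std:
                  labels.append("walking/activity")
              elif vertical < lying_tilt:
                  labels.append("lying")
              else:
                  labels.append("sitting/standing")
          return labels

      rng = np.random.default_rng(7)
      ax = rng.normal(0.0, 0.3, 1200)                # 60 s of synthetic data at 20 Hz
      ay = 1.0 + rng.normal(0.0, 0.3, 1200)
      print(classify_activity(ax, ay))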

  4. Quantifying and minimizing entropy generation in AMTEC cells

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hendricks, T.J.; Huang, C.

    1997-12-31

    Entropy generation in an AMTEC cell represents inherent power loss to the AMTEC cell. Minimizing cell entropy generation directly maximizes cell power generation and efficiency. An internal project is on-going at AMPS to identify, quantify and minimize entropy generation mechanisms within an AMTEC cell, with the goal of determining cost-effective design approaches for maximizing AMTEC cell power generation. Various entropy generation mechanisms have been identified and quantified. The project has investigated several cell design techniques in a solar-driven AMTEC system to minimize cell entropy generation and produce maximum power cell designs. In many cases, various sources of entropy generation are interrelated such that minimizing entropy generation requires cell and system design optimization. Some of the tradeoffs between various entropy generation mechanisms are quantified and explained and their implications on cell design are discussed. The relationship between AMTEC cell power and efficiency and entropy generation is presented and discussed.

  5. Characterizing within-subject variability in quantified measures of balance control: A cohort study.

    PubMed

    Worthen-Chaudhari, Lise C; Monfort, Scott M; Bland, Courtney; Pan, Xueliang; Chaudhari, Ajit M W

    2018-06-02

    To longitudinally assess individuals using quantified measures, we must characterize within-subject variability (WSV) of the measures. What is the natural within-subject variability (WSV) that can be expected in postural control over 3+ days? Thirteen individuals without orthopedic or neurologic impairment (mean(SD) = 55 (9) years; 76 (18) kg; 11 females/2 males) were recruited from a community workplace and consented to participate. Participants stood quietly with eyes closed (QEC) on a force platform (5 × 1 min × 6 days) in two stances: comfortable and narrow. We recorded center of pressure (COP) and calculated COP-based balance parameters. To analyze variance components, we applied a linear mixed model for repeated measures, calculating within-subject standard deviation (SDws) from the pooled variance not attributable to between-subject variability. To estimate WSV, we scaled SDws by a confidence interval (CI) factor (e.g. WSV at the 95% CI = WSV95 = SDws * 1.96) and report WSV95 for a range of conditions previously reported in the literature and the following measures previously found sensitive to or predictive of health: (primary) WSV95 of root-mean-square amplitude of medial-lateral COP during QEC (RMSml); (secondary) WSV95 of COP ellipse area (COPa); (secondary) WSV95 of mean medial-lateral COP velocity (COPvml) during QEC. WSV95 was estimated at RMSml = 0.8 mm, COPa = 99 mm², and COPvml = 1.1 mm/s among healthy, middle-aged participants standing comfortably for one recommended data duration (4 × 30 s trials). A look-up table provides values for alternate protocols that have been suggested in the literature and might prove relevant for clinical translation. This work advances longitudinal assessment of individuals using quantified measures of postural control. Results enable practitioners/researchers to assess an individual's progress, maintenance, or decline relative to WSV at a defined CI level. Copyright © 2018
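
    The scaling step described above (WSV95 = SDws × 1.96) can be sketched from long-format trial data; the example below pools per-subject variances as a simplified stand-in for the paper's linear mixed model, and the data frame and values are simulated.

      import numpy as np
      import pandas as pd

      # long-format table: one row per trial; subject IDs and RMS_ML values are simulated
      rng = np.random.default_rng(6)
      df = pd.DataFrame({
          "subject": np.repeat(np.arange(13), 24),                 # 13 participants x 24 trials
          "rms_ml": rng.normal(3.0, 0.5, 13 * 24),                 # medial-lateral RMS amplitude (mm)
      })

      # pooled within-subject variance, approximated by averaging per-subject variances
      sd_ws = np.sqrt(df.groupby("subject")["rms_ml"].var(ddof=1).mean())
      wsv95 = 1.96 * sd_ws                                         # WSV at the 95% confidence level
      print(round(wsv95, 2), "mm")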

  6. Quantifying falsifiability of scientific theories

    NASA Astrophysics Data System (ADS)

    Nemenman, Ilya

    I argue that the notion of falsifiability, a key concept in defining a valid scientific theory, can be quantified using Bayesian Model Selection, which is a standard tool in modern statistics. This relates falsifiability to the quantitative version of the statistical Occam's razor, and allows transforming some long-running arguments about validity of scientific theories from philosophical discussions to rigorous mathematical calculations.

  7. Development of a Method for Selecting Optimum Sites for the Automatic Mountain Meteorology Observation Station (AMOS) to Improve Predictability of Forest Fires in Inaccessible Area

    NASA Astrophysics Data System (ADS)

    Yoon, S.; Won, M.; Jang, K.; Lim, J.

    2016-12-01

    As there has been a recent increase in the case of forest fires in North Korea descending southward through the De-Militarized Zone (DMZ), ensuring proper response to such events has been a challenge. Therefore, in order to respond and manage these forest fires appropriately, an improvement in the forest fire predictability through integration of mountain weather information observed at the most optimal site is necessary. This study is a proactive case in which a spatial analysis and an on-site assessment method were developed for selecting an optimum site for a mountain weather observation in national forest. For spatial analysis, the class 1 and 2 forest fire danger areas for the past 10 years, accessibility maximum 100m, Automatic Weather Station (AWS) redundancy within 2.5km, and mountain terrains higher than 200m were analyzed. A final overlay analysis was performed to select the candidates for the field assessment. The sites selected through spatial analysis were quantitatively evaluated based on the optimal meteorological environment, forest and hiking trail accessibility, AWS redundancy, and supply of wireless communication and solar powered electricity. The sites with total score of 70 and higher were accepted as adequate. At the final selected sites, an AMOS was established, and integration of mountain and Korea Meteorological Administration (KMA) weather data improved the forest fire predictability in South Korea by 10%. Given these study results, we expect that establishing an automatic mountain meteorology observation station at the optimal sites in inaccessible area and integrating mountain weather data will improve the predictability of forest fires.

  8. Quantifying Semantic Linguistic Maturity in Children.

    PubMed

    Hansson, Kristina; Bååth, Rasmus; Löhndorf, Simone; Sahlén, Birgitta; Sikström, Sverker

    2016-10-01

    We propose a method to quantify semantic linguistic maturity (SELMA) based on a high dimensional semantic representation of words created from the co-occurrence of words in a large text corpus. The method was applied to oral narratives from 108 children aged 4;0-12;10. By comparing the SELMA measure with maturity ratings made by human raters we found that SELMA predicted the rating of semantic maturity made by human raters over and above the prediction made using a child's age and number of words produced. We conclude that the semantic content of narratives changes in a predictable pattern with children's age and argue that SELMA is a measure quantifying semantic linguistic maturity. The study opens up the possibility of using quantitative measures for studying the development of semantic representation in children's narratives, and emphasizes the importance of word co-occurrences for understanding the development of meaning.

  9. Quantifying Groundwater Model Uncertainty

    NASA Astrophysics Data System (ADS)

    Hill, M. C.; Poeter, E.; Foglia, L.

    2007-12-01

    Groundwater models are characterized by the (a) processes simulated, (b) boundary conditions, (c) initial conditions, (d) method of solving the equation, (e) parameterization, and (f) parameter values. Models are related to the system of concern using data, some of which form the basis of observations used most directly, through objective functions, to estimate parameter values. Here we consider situations in which parameter values are determined by minimizing an objective function. Other methods of model development are not considered because their ad hoc nature generally prohibits clear quantification of uncertainty. Quantifying prediction uncertainty ideally includes contributions from (a) to (f). The parameter values of (f) tend to be continuous with respect to both the simulated equivalents of the observations and the predictions, while many aspects of (a) through (e) are discrete. This fundamental difference means that there are options for evaluating the uncertainty related to parameter values that generally do not exist for other aspects of a model. While the methods available for (a) to (e) can be used for the parameter values (f), the inferential methods uniquely available for (f) generally are less computationally intensive and often can be used to considerable advantage. However, inferential approaches require calculation of sensitivities. Whether the numerical accuracy and stability of the model solution required for accurate sensitivities is more broadly important to other model uses is an issue that needs to be addressed. Alternative global methods can require 100 or even 1,000 times the number of runs needed by inferential methods, though methods of reducing the number of needed runs are being developed and tested. Here we present three approaches for quantifying model uncertainty and investigate their strengths and weaknesses. (1) Represent more aspects as parameters so that the computationally efficient methods can be broadly applied. This

  10. TU-EF-304-09: Quantifying the Biological Effects of Therapeutic Protons by LET Spectrum Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Guan, F; Bronk, L; Kerr, M

    2015-06-15

    Purpose: To correlate in vitro cell kill with linear energy transfer (LET) spectra using Monte Carlo simulations and knowledge obtained from previous high-throughput in vitro proton relative biological effectiveness (RBE) measurements. Methods: The Monte Carlo simulation toolkit Geant4 was used to design the experimental setups and perform the dose, dose-averaged LET, and LET spectra calculations. The clonogenic assay was performed using the H460 lung cancer cell line in standard 6-well plates. Using two different experimental setups, the same dose and dose-averaged LET (12.6 keV/µm) was delivered to the cell layer; however, each respective energy or LET spectrum was different. We quantified the dose contributions from high-LET (≥10 keV/µm, threshold determined by previous RBE measurements) events in the LET spectra separately for these two setups as 39% and 53%. Eight dose levels with 1 Gy increments were delivered. The photon reference irradiation was performed using 6 MV x-rays from a LINAC. Results: The survival curves showed that both proton irradiations demonstrated an increased RBE compared to the reference photon irradiation. Within the proton-irradiated cells, the setup with 53% dose contribution from high-LET events exhibited the higher biological effectiveness. Conclusion: The experimental results indicate that the dose-averaged LET may not be an appropriate indicator to quantify the biological effects of protons when the LET spectrum is broad enough to contain both low- and high-LET events. Incorporating the LET spectrum distribution into robust intensity-modulated proton therapy optimization planning may provide a more accurate biological dose distribution than using the dose-averaged LET. NIH Program Project Grant 2U19CA021239-35.
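
    For orientation only, the sketch below shows how a dose-averaged LET and the dose fraction above the 10 keV/µm threshold might be computed from a binned LET spectrum; the example spectrum is invented and is not the Geant4 output used in the study.

```python
import numpy as np

# Hypothetical binned LET spectrum: bin-centre LET values (keV/um) and the
# dose deposited in each bin (arbitrary units); not the study's Geant4 output.
let_bins = np.array([2.0, 5.0, 8.0, 12.0, 20.0, 35.0])
dose_per_bin = np.array([0.10, 0.15, 0.20, 0.30, 0.15, 0.10])

# Dose-averaged LET: LET_d = sum(d_i * L_i) / sum(d_i)
let_d = np.sum(dose_per_bin * let_bins) / np.sum(dose_per_bin)

# Fraction of dose delivered by high-LET (>= 10 keV/um) events.
high_let_fraction = dose_per_bin[let_bins >= 10.0].sum() / dose_per_bin.sum()

print(f"dose-averaged LET ~ {let_d:.1f} keV/um")
print(f"high-LET dose fraction ~ {high_let_fraction:.0%}")
```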

  11. The comprehension and production of quantifiers in isiXhosa-speaking Grade 1 learners

    PubMed Central

    Southwood, Frenette

    2016-01-01

    Background Quantifiers form part of the discourse-internal linguistic devices that children need to access and produce narratives and other classroom discourse. Little is known about the development - especially the production - of quantifiers in child language, specifically among speakers of an African language. Objectives The study aimed to ascertain how well Grade 1 isiXhosa first language (L1) learners perform at the beginning and at the end of Grade 1 on quantifier comprehension and production tasks. Method Two low socioeconomic groups of L1 isiXhosa learners with either isiXhosa or English as language of learning and teaching (LOLT) were tested in February and November of their Grade 1 year with tasks targeting several quantifiers. Results The isiXhosa LOLT group comprehended no/none, any and all fully by either February or November of Grade 1, and they produced all assessed quantifiers in February of Grade 1. For the English LOLT group, neither the comprehension nor the production of quantifiers was mastered by the end of Grade 1, although there was a significant increase in both their comprehension and production scores. Conclusion The English LOLT group made significant progress in the comprehension and production of quantifiers, but still performed worse than peers who had their L1 as LOLT. Generally, children with no or very little prior knowledge of the LOLT need either (1) more deliberate exposure to quantifier-rich language or (2) longer exposure to general classroom language before quantifiers can be expected to be mastered sufficiently to allow access to quantifier-related curriculum content. PMID:27245132

  12. Quantifying, Visualizing, and Monitoring Lead Optimization.

    PubMed

    Maynard, Andrew T; Roberts, Christopher D

    2016-05-12

    Although lead optimization (LO) is by definition a process, process-centric analysis and visualization of this important phase of pharmaceutical R&D has been lacking. Here we describe a simple statistical framework to quantify and visualize the progression of LO projects so that the vital signs of LO convergence can be monitored. We refer to the resulting visualizations generated by our methodology as the "LO telemetry" of a project. These visualizations can be automated to provide objective, holistic, and instantaneous analysis and communication of LO progression. This enhances the ability of project teams to more effectively drive LO process, while enabling management to better coordinate and prioritize LO projects. We present the telemetry of five LO projects comprising different biological targets and different project outcomes, including clinical compound selection, termination due to preclinical safety/tox, and termination due to lack of tractability. We demonstrate that LO progression is accurately captured by the telemetry. We also present metrics to quantify LO efficiency and tractability.

  13. Quantifying capital goods for biological treatment of organic waste.

    PubMed

    Brogaard, Line K; Petersen, Per H; Nielsen, Peter D; Christensen, Thomas H

    2015-02-01

    Materials and energy used for construction of anaerobic digestion (AD) and windrow composting plants were quantified in detail. The two technologies were quantified in collaboration with consultants and producers of the parts used to construct the plants. The composting plants were quantified based on the different sizes for the three different types of waste (garden and park waste, food waste and sludge from wastewater treatment) in amounts of 10,000 or 50,000 tonnes per year. The AD plant was quantified for a capacity of 80,000 tonnes per year. Concrete and steel for the tanks were the main materials for the AD plant. For the composting plants, gravel and concrete slabs for the pavement were used in large amounts. To frame the quantification, environmental impact assessments (EIAs) showed that the steel used for tanks at the AD plant and the concrete slabs at the composting plants made the highest contribution to Global Warming. The total impact on Global Warming from the capital goods compared to the operation reported in the literature on the AD plant showed an insignificant contribution of 1-2%. For the composting plants, the capital goods accounted for 10-22% of the total impact on Global Warming from composting. © The Author(s) 2015.

  14. Microvascular remodelling in preeclampsia: quantifying capillary rarefaction accurately and independently predicts preeclampsia.

    PubMed

    Antonios, Tarek F T; Nama, Vivek; Wang, Duolao; Manyonda, Isaac T

    2013-09-01

    Preeclampsia is a major cause of maternal and neonatal mortality and morbidity. The incidence of preeclampsia seems to be rising because of the increased prevalence of predisposing disorders, such as essential hypertension, diabetes, and obesity, and there is increasing evidence of widespread microcirculatory abnormalities before the onset of preeclampsia. We hypothesized that quantifying capillary rarefaction could be helpful in the clinical prediction of preeclampsia. We measured skin capillary density according to a well-validated protocol at 5 consecutive predetermined visits in 322 consecutive white women, of whom 16 developed preeclampsia. We found that structural capillary rarefaction at 20-24 weeks of gestation yielded a sensitivity of 0.87 and a specificity of 0.50 at a cutoff of 2 capillaries/field, with an area under the receiver operating characteristic curve (AUC) of 0.70, whereas capillary rarefaction at 27-32 weeks of gestation yielded a sensitivity of 0.75 and a higher specificity of 0.77 at a cutoff of 8 capillaries/field, with an AUC of 0.82. Combining capillary rarefaction with the uterine artery Doppler pulsatility index increased the sensitivity and specificity of the prediction. Multivariable analysis shows that the odds of preeclampsia are increased in women with a previous history of preeclampsia or chronic hypertension and in those with an increased uterine artery Doppler pulsatility index, but the most powerful and independent predictor of preeclampsia was capillary rarefaction at 27-32 weeks. Quantifying structural rarefaction of skin capillaries in pregnancy is a potentially useful clinical marker for the prediction of preeclampsia.
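
    A minimal sketch of the kind of calculation behind the quoted sensitivity/specificity and AUC figures: classify subjects by a capillary-rarefaction cutoff and summarize the trade-off with the area under the ROC curve. The cohort below is simulated, not the study's data, and scikit-learn is used only for convenience.

```python
import numpy as np
from sklearn.metrics import roc_auc_score

def sens_spec_at_cutoff(capillary_loss, outcome, cutoff):
    """Sensitivity and specificity when 'loss >= cutoff' predicts preeclampsia."""
    predicted = capillary_loss >= cutoff
    tp = np.sum(predicted & (outcome == 1))
    fn = np.sum(~predicted & (outcome == 1))
    tn = np.sum(~predicted & (outcome == 0))
    fp = np.sum(predicted & (outcome == 0))
    return tp / (tp + fn), tn / (tn + fp)

# Simulated cohort: capillary rarefaction (capillaries lost per field) and outcome.
rng = np.random.default_rng(3)
outcome = np.r_[np.ones(16), np.zeros(306)].astype(int)
loss = np.r_[rng.normal(8, 3, 16), rng.normal(4, 3, 306)]

sens, spec = sens_spec_at_cutoff(loss, outcome, cutoff=8)
print(f"cutoff 8/field: sensitivity {sens:.2f}, specificity {spec:.2f}")
print(f"AUC ~ {roc_auc_score(outcome, loss):.2f}")
```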

  15. COMPLEXITY & APPROXIMABILITY OF QUANTIFIED & STOCHASTIC CONSTRAINT SATISFACTION PROBLEMS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hunt, H. B.; Marathe, M. V.; Stearns, R. E.

    2001-01-01

    Let D be an arbitrary (not necessarily finite) nonempty set, let C be a finite set of constant symbols denoting arbitrary elements of D, and let S and T be an arbitrary finite set of finite-arity relations on D. We denote the problem of determining the satisfiability of finite conjunctions of relations in S applied to variables (to variables and symbols in C) by SAT(S) (by SATc(S)). Here, we study simultaneously the complexity of decision, counting, maximization and approximate maximization problems, for unquantified, quantified and stochastically quantified formulas. We present simple yet general techniques to characterize simultaneously the complexity or efficient approximability of a number of versions/variants of the problems SAT(S), Q-SAT(S), S-SAT(S), MAX-Q-SAT(S) etc., for many different such D, C, S, T. These versions/variants include decision, counting, maximization and approximate maximization problems, for unquantified, quantified and stochastically quantified formulas. Our unified approach is based on the following two basic concepts: (i) strongly-local replacements/reductions and (ii) relational/algebraic representability. Some of the results extend the earlier results in [Pa85, LMP99, CF+93, CF+94]. Our techniques and results reported here also provide significant steps towards obtaining dichotomy theorems for a number of the problems above, including the problems MAX-Q-SAT(S) and MAX-S-SAT(S). The discovery of such dichotomy theorems, for unquantified formulas, has received significant recent attention in the literature [CF+93, CF+94, Cr95, KSW97].

  16. The Emergence of the Quantified Child

    ERIC Educational Resources Information Center

    Smith, Rebecca

    2017-01-01

    Using document analysis, this paper examines the historical emergence of the quantified child, revealing how the collection and use of data has become normalized through legitimizing discourses. First, following in the traditions of Foucault's genealogy and studies examining the sociology of numbers, this paper traces the evolution of data…

  17. Quantifying lost information due to covariance matrix estimation in parameter inference

    NASA Astrophysics Data System (ADS)

    Sellentin, Elena; Heavens, Alan F.

    2017-02-01

    Parameter inference with an estimated covariance matrix systematically loses information due to the remaining uncertainty of the covariance matrix. Here, we quantify this loss of precision and develop a framework to hypothetically restore it, which allows one to judge how far a given analysis is from the ideal case of a known covariance matrix. We point out that it is insufficient to estimate this loss by debiasing the Fisher matrix, as previously done, due to a fundamental inequality that describes how biases arise in non-linear functions. We therefore develop direct estimators for parameter credibility contours and the figure of merit, finding that significantly fewer simulations than previously thought are sufficient to reach satisfactory precisions. We apply our results to DES Science Verification weak lensing data, detecting a 10 per cent loss of information that increases their credibility contours. No significant loss of information is found for KiDS. For a Euclid-like survey with about 10 nuisance parameters, we find that 2900 simulations are sufficient to limit the systematically lost information to 1 per cent, with an additional uncertainty of about 2 per cent. Without any nuisance parameters, 1900 simulations are sufficient to lose only 1 per cent of information. We further derive estimators for all quantities needed for forecasting with estimated covariance matrices. Our formalism allows one to determine the sweet spot between running sophisticated simulations to reduce the number of nuisance parameters, and running as many fast simulations as possible.
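
    As background to the loss the authors quantify, this sketch shows the baseline procedure they improve upon: estimating a covariance matrix from a finite set of simulations and applying the standard Hartlap factor to debias its inverse, a Fisher-level correction the abstract argues is insufficient on its own. All dimensions and numbers are illustrative.

```python
import numpy as np

def debiased_precision(sim_data):
    """Estimate a covariance from simulations and debias its inverse.

    Applies the Hartlap factor (n_s - p - 2) / (n_s - 1), the standard
    correction for the bias of an inverse sample covariance matrix.
    """
    n_s, p = sim_data.shape
    cov_hat = np.cov(sim_data, rowvar=False)
    hartlap = (n_s - p - 2) / (n_s - 1)
    return hartlap * np.linalg.inv(cov_hat)

# Illustrative setup: 2900 simulations of a 100-dimensional data vector.
rng = np.random.default_rng(4)
true_cov = np.diag(np.linspace(1.0, 2.0, 100))
sims = rng.multivariate_normal(np.zeros(100), true_cov, size=2900)
precision = debiased_precision(sims)
print("mean diagonal of debiased precision:", round(float(precision.diagonal().mean()), 3))
```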

  18. Plantar pressure and daily cumulative stress in persons affected by leprosy with current, previous and no previous foot ulceration.

    PubMed

    van Schie, Carine H M; Slim, Frederik J; Keukenkamp, Renske; Faber, William R; Nollet, Frans

    2013-03-01

    Not only plantar pressure but also weight-bearing activity affects accumulated mechanical stress to the foot and may be related to foot ulceration. To date, activity has not been accounted for in leprosy. The purpose was to compare barefoot pressure, in-shoe pressure and daily cumulative stress between persons affected by leprosy with and without previous or current foot ulceration. Nine persons with current plantar ulceration were compared to 15 with previous and 15 without previous ulceration. Barefoot peak pressure (EMED-X), in-shoe peak pressure (Pedar-X) and daily cumulative stress (in-shoe forefoot pressure time integral × mean daily strides (Stepwatch™ Activity Monitor)) were measured. Barefoot peak pressure was increased in persons with current and previous compared to no previous foot ulceration (mean±SD = 888±222 and 763±335 vs 465±262 kPa, p<0.05). In-shoe peak pressure was only increased in persons with current compared to without previous ulceration (mean±SD = 412±145 vs 269±70 kPa, p<0.05). Daily cumulative stress was not different between groups, although persons with current and previous foot ulceration were less active. Although barefoot peak pressure was increased in people with current and previous plantar ulceration, it did not discriminate between these groups. While in-shoe peak pressure was increased in persons with current ulceration, they were less active, resulting in no difference in daily cumulative stress. Increased in-shoe peak pressure suggests insufficient pressure-reducing footwear in persons with current ulceration, highlighting the importance of the pressure-reducing qualities of footwear. Copyright © 2012 Elsevier B.V. All rights reserved.

  19. Quantifying ecological thresholds from response surfaces

    Treesearch

    Heather E. Lintz; Bruce McCune; Andrew N. Gray; Katherine A. McCulloh

    2011-01-01

    Ecological thresholds are abrupt changes of ecological state. While an ecological threshold is a widely accepted concept, most empirical methods detect them in time or across geographic space. Although useful, these approaches do not quantify the direct drivers of threshold response. Causal understanding of thresholds detected empirically requires their investigation...

  20. Quantifying the impacts of global disasters

    NASA Astrophysics Data System (ADS)

    Jones, L. M.; Ross, S.; Wilson, R. I.; Borrero, J. C.; Brosnan, D.; Bwarie, J. T.; Geist, E. L.; Hansen, R. A.; Johnson, L. A.; Kirby, S. H.; Long, K.; Lynett, P. J.; Miller, K. M.; Mortensen, C. E.; Perry, S. C.; Porter, K. A.; Real, C. R.; Ryan, K. J.; Thio, H. K.; Wein, A. M.; Whitmore, P.; Wood, N. J.

    2012-12-01

    The US Geological Survey, National Oceanic and Atmospheric Administration, California Geological Survey, and other entities are developing a Tsunami Scenario, depicting a realistic outcome of a hypothetical but plausible large tsunami originating in the eastern Aleutian Arc, affecting the west coast of the United States, including Alaska and Hawaii. The scenario includes earth-science effects, damage and restoration of the built environment, and social and economic impacts. Like the earlier ShakeOut and ARkStorm disaster scenarios, the purpose of the Tsunami Scenario is to apply science to quantify the impacts of natural disasters in a way that can be used by decision makers in the affected sectors to reduce the potential for loss. Most natural disasters are local. A major hurricane can destroy a city or damage a long swath of coastline while mostly sparing inland areas. The largest earthquake on record caused strong shaking along 1500 km of Chile, but left the capital relatively unscathed. Previous scenarios have used the local nature of disasters to focus interaction with the user community. However, the capacity for global disasters is growing with the interdependency of the global economy. Earthquakes have disrupted global computer chip manufacturing and caused stock market downturns. Tsunamis, however, can be global in their extent and direct impact. Moreover, the vulnerability of seaports to tsunami damage can increase the global consequences. The Tsunami Scenario is trying to capture the widespread effects while maintaining the close interaction with users that has been one of the most successful features of the previous scenarios. The scenario tsunami occurs in the eastern Aleutians with a source similar to the 2011 Tohoku event. Geologic similarities support the argument that a Tohoku-like source is plausible in Alaska. It creates a major nearfield tsunami in the Aleutian arc and peninsula, a moderate tsunami in the US Pacific Northwest, large but not the

  1. Quantifying Arabia-Eurasia convergence accommodated in the Greater Caucasus by paleomagnetic reconstruction

    NASA Astrophysics Data System (ADS)

    van der Boon, A.; van Hinsbergen, D. J. J.; Rezaeian, M.; Gürer, D.; Honarmand, M.; Pastor-Galán, D.; Krijgsman, W.; Langereis, C. G.

    2018-01-01

    Since the late Eocene, convergence and subsequent collision between Arabia and Eurasia were accommodated both in the overriding Eurasian plate, forming the Greater Caucasus orogen and the Iranian plateau, and by subduction and accretion of the Neotethys and Arabian margin, forming the East Anatolian plateau and the Zagros. To quantify how much Arabia-Eurasia convergence was accommodated in the Greater Caucasus region, we here provide new paleomagnetic results from 97 volcanic sites (∼500 samples) in the Talysh Mountains of NW Iran, which show ∼15° net clockwise rotation relative to Eurasia since the Eocene. We apply a first-order kinematic restoration of the northward-convex orocline that formed to the south of the Greater Caucasus, integrating our new data with previously published constraints on rotations of the Eastern Pontides and Lesser Caucasus. This suggests that north of the Talysh ∼120 km of convergence must have been accommodated. North of the Eastern Pontides and Lesser Caucasus this is significantly more: 200-280 km. Our reconstruction independently confirms previous Caucasus convergence estimates. Moreover, we show for the first time a sharp contrast in convergence between the Lesser Caucasus and the Talysh. This implies that the ancient Paleozoic-Mesozoic transform plate boundary, preserved between the Iranian and East-Anatolian plateaus, was likely reactivated as a right-lateral transform fault since late Eocene time.

  2. Quantifying on- and off-target genome editing.

    PubMed

    Hendel, Ayal; Fine, Eli J; Bao, Gang; Porteus, Matthew H

    2015-02-01

    Genome editing with engineered nucleases is a rapidly growing field thanks to transformative technologies that allow researchers to precisely alter genomes for numerous applications including basic research, biotechnology, and human gene therapy. While the ability to make precise and controlled changes at specified sites throughout the genome has grown tremendously in recent years, we still lack a comprehensive and standardized battery of assays for measuring the different genome editing outcomes created at endogenous genomic loci. Here we review the existing assays for quantifying on- and off-target genome editing and describe their utility in advancing the technology. We also highlight unmet assay needs for quantifying on- and off-target genome editing outcomes and discuss their importance for the genome editing field. Copyright © 2014 Elsevier Ltd. All rights reserved.

  3. Quantifying the Benefits of NEC

    DTIC Science & Technology

    2005-04-01

    Georgia Court and Lynda CM Sharp, Dstl/UK MoD. The transformation of UK forces is dependent on exploiting the benefits of Network Enabled Capability (NEC). The white paper notes that NEC, through... benefit to defence; what can be traded off to pay for it; what changes are required to processes, structures, equipment etc., to deliver the

  4. A fluorescent imaging technique for quantifying spray deposits on plant leaves

    USDA-ARS?s Scientific Manuscript database

    Because of the unique characteristics of electrostatically-charged sprays, use of traditional methods to quantify deposition from these sprays has been challenging. A new fluorescent imaging technique was developed to quantify spray deposits from electrostatically-charged sprays on natural plant lea...

  5. Quantifying the Reuse of Learning Objects

    ERIC Educational Resources Information Center

    Elliott, Kristine; Sweeney, Kevin

    2008-01-01

    This paper reports the findings of one case study from a larger project, which aims to quantify the claimed efficiencies of reusing learning objects to develop e-learning resources. The case study describes how an online inquiry project "Diabetes: A waste of energy" was developed by searching for, evaluating, modifying and then…

  6. Quantifying the Urban and Rural Nutrient Fluxes to Lake Erie Using a Paired Watershed Approach

    NASA Astrophysics Data System (ADS)

    Hopkins, M.; Beck, M.; Rossi, E.; Luh, N.; Allen-King, R. M.; Lowry, C.

    2016-12-01

    Excess nutrients have a detrimental impact on the water quality of Lake Erie, specifically nitrate and phosphate, which can lead to toxic algae blooms. Algae blooms have negatively impacted Lake Erie, which is the main source of drinking water for many coastal Great Lake communities. In 2014 the city of Toledo, Ohio was forced to shut down its water treatment plant due to these toxic algae blooms. The objective of this research is to quantify surface water nutrient fluxes to the eastern basin of Lake Erie using a paired watershed approach. Three different western New York watersheds that feed Lake Erie were chosen based on land use and areal extent: one small urban, one small rural, and one large rural. These paired watersheds were chosen to represent a range of sources of potential nutrient loading to the lake. Biweekly water samples were taken from the streams during the 2015-2016 winter to summer seasonal transition to quantify springtime snowmelt effects on nutrient fluxes. These results were compared to the previous year's samples, collected over the summer of 2015, which represented wetter conditions. Phosphorus levels were assessed using the ascorbic acid colorimetric assay, while nitrate was analyzed by anion-exchange chromatography. Stream gaging was used to obtain flow measurements and establish a rating curve, which was incorporated to quantify seasonal nutrient fluxes entering the lake. Patterns in the nutrient levels show higher levels of nutrients in the rural watersheds, with a decrease in concentration over the winter to spring transition. However, the urban stream shows relatively constant nutrient fluxes, independent of seasonal transition or stream discharge. A comparison of wet and dry seasons shows higher nutrient concentrations during summers with greater rainfall. By identifying the largest contributors of each nutrient, we can better allocate limited attenuation resources.

  7. Quantifying Ballistic Armor Performance: A Minimally Invasive Approach

    NASA Astrophysics Data System (ADS)

    Holmes, Gale; Kim, Jaehyun; Blair, William; McDonough, Walter; Snyder, Chad

    2006-03-01

    Theoretical and non-dimensional analyses suggest a critical link between the performance of ballistic resistant armor and the fundamental mechanical properties of the polymeric materials that comprise them. Therefore, a test methodology that quantifies these properties without compromising an armored vest that is exposed to the industry standard V-50 ballistic performance test is needed. Currently, there is considerable speculation about the impact that competing degradation mechanisms (e.g., mechanical, humidity, ultraviolet) may have on ballistic resistant armor. We report on the use of a new test methodology that quantifies the mechanical properties of ballistic fibers and how each proposed degradation mechanism may impact a vest's ballistic performance.

  8. The extended Price equation quantifies species selection on mammalian body size across the Palaeocene/Eocene Thermal Maximum.

    PubMed

    Rankin, Brian D; Fox, Jeremy W; Barrón-Ortiz, Christian R; Chew, Amy E; Holroyd, Patricia A; Ludtke, Joshua A; Yang, Xingkai; Theodor, Jessica M

    2015-08-07

    Species selection, covariation of species' traits with their net diversification rates, is an important component of macroevolution. Most studies have relied on indirect evidence for its operation and have not quantified its strength relative to other macroevolutionary forces. We use an extension of the Price equation to quantify the mechanisms of body size macroevolution in mammals from the latest Palaeocene and earliest Eocene of the Bighorn and Clarks Fork Basins of Wyoming. Dwarfing of mammalian taxa across the Palaeocene/Eocene Thermal Maximum (PETM), an intense, brief warming event that occurred at approximately 56 Ma, has been suggested to reflect anagenetic change and the immigration of small-bodied mammals, but might also be attributable to species selection. Using previously reconstructed ancestor-descendant relationships, we partitioned change in mean mammalian body size into three distinct mechanisms: species selection operating on resident mammals, anagenetic change within resident mammalian lineages, and change due to immigrants. The remarkable decrease in mean body size across the warming event occurred through anagenetic change and immigration. Species selection also was strong across the PETM but, intriguingly, favoured larger-bodied species, implying some unknown mechanism(s) by which warming events affect macroevolution. © 2015 The Author(s).
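
    A heavily simplified, hypothetical sketch of the kind of partition an extended Price equation provides: mean trait change split into species selection among residents (covariance of trait with a fitness proxy), within-lineage anagenetic change, and an immigration term. The weighting and toy numbers below are invented and are not the authors' exact formulation.

```python
import numpy as np

# Hypothetical resident lineages: ancestral log body mass z, descendant log
# body mass z_prime, and w = number of descendant species (a fitness proxy).
z       = np.array([2.1, 2.4, 2.8, 3.0, 3.5])
z_prime = np.array([2.0, 2.2, 2.7, 3.1, 3.6])
w       = np.array([1.0, 3.0, 2.0, 1.0, 1.0])

# Hypothetical immigrant species (log body masses) arriving across the PETM.
z_imm = np.array([1.8, 1.9])

w_bar = w.mean()
selection  = np.cov(w, z, bias=True)[0, 1] / w_bar     # species selection term
anagenesis = np.mean(w * (z_prime - z)) / w_bar        # within-lineage change

f_imm = len(z_imm) / (w.sum() + len(z_imm))            # immigrant fraction
immigration = f_imm * (z_imm.mean() - z.mean())

total = (1 - f_imm) * (selection + anagenesis) + immigration
z_after = (np.sum(w * z_prime) + z_imm.sum()) / (w.sum() + len(z_imm))
print(f"selection {selection:+.3f}  anagenesis {anagenesis:+.3f}  immigration {immigration:+.3f}")
print(f"partition total {total:+.3f}  vs direct change in mean {z_after - z.mean():+.3f}")
```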

  9. Leveraging Distant Relatedness to Quantify Human Mutation and Gene-Conversion Rates

    PubMed Central

    Palamara, Pier Francesco; Francioli, Laurent C.; Wilton, Peter R.; Genovese, Giulio; Gusev, Alexander; Finucane, Hilary K.; Sankararaman, Sriram; Sunyaev, Shamil R.; de Bakker, Paul I.W.; Wakeley, John; Pe’er, Itsik; Price, Alkes L.

    2015-01-01

    The rate at which human genomes mutate is a central biological parameter that has many implications for our ability to understand demographic and evolutionary phenomena. We present a method for inferring mutation and gene-conversion rates by using the number of sequence differences observed in identical-by-descent (IBD) segments together with a reconstructed model of recent population-size history. This approach is robust to, and can quantify, the presence of substantial genotyping error, as validated in coalescent simulations. We applied the method to 498 trio-phased sequenced Dutch individuals and inferred a point mutation rate of 1.66 × 10−8 per base per generation and a rate of 1.26 × 10−9 for <20 bp indels. By quantifying how estimates varied as a function of allele frequency, we inferred the probability that a site is involved in non-crossover gene conversion as 5.99 × 10−6. We found that recombination does not have observable mutagenic effects after gene conversion is accounted for and that local gene-conversion rates reflect recombination rates. We detected a strong enrichment of recent deleterious variation among mismatching variants found within IBD regions and observed summary statistics of local sharing of IBD segments to closely match previously proposed metrics of background selection; however, we found no significant effects of selection on our mutation-rate estimates. We detected no evidence of strong variation of mutation rates in a number of genomic annotations obtained from several recent studies. Our analysis suggests that a mutation-rate estimate higher than that reported by recent pedigree-based studies should be adopted in the context of DNA-based demographic reconstruction. PMID:26581902

  10. A Synthesis and Comparison of Approaches for Quantifying Coral Reef Structure

    NASA Astrophysics Data System (ADS)

    Duvall, M. S.; Hench, J. L.

    2016-02-01

    The complex physical structures of coral reefs provide substrate for benthic organisms, surface area for material fluxes, and have been used as a predictor of reef-fish biomass and biodiversity. Coral reef topography has a first order effect on reef hydrodynamics by imposing drag forces and increasing momentum and scalar dispersion. Despite its importance, quantifying reef topography remains a challenge, as it is patchy and discontinuous while also varying over orders of magnitude in spatial scale. Previous studies have quantified reef structure using a range of 1D and 2D metrics that estimate vertical roughness, which is the departure from a flat geometric profile or surface. However, there is no general mathematical or conceptual framework by which to apply or compare these roughness metrics. While the specific calculations of different metrics vary, we propose that they can be classified into four categories based on: 1) vertical relief relative to a reference height; 2) gradients in vertical relief; 3) surface contour distance; or 4) variations in roughness with scale. We apply metrics from these four classes to idealized reef topography as well as natural reef topography data from Moorea, French Polynesia. Through the use of idealized profiles, we demonstrate the potential for reefs with different morphologies to possess the same value for some scale-dependent metrics (i.e. classes 1-3). Due to the superposition of variable-scale roughness elements in reef topography, we find that multi-scale metrics (i.e. class 4) can better characterize structural complexity by capturing surface roughness across a range of spatial scales. In particular, we provide evidence of the ability of 1D continuous wavelet transforms to detect changes in dominant roughness scales on idealized topography as well as within real reef systems.
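
    To make the metric classes concrete, the sketch below computes one representative of class 1 (RMS deviation from a linear trend, i.e. vertical relief) and one of class 3 (linear rugosity, contour length over chord length) for a synthetic 1D depth profile; the profile and sampling interval are invented.

```python
import numpy as np

def rms_roughness(z):
    """Class-1 metric: RMS deviation of elevation from its linear trend."""
    x = np.arange(len(z))
    trend = np.polyval(np.polyfit(x, z, 1), x)
    return np.sqrt(np.mean((z - trend) ** 2))

def linear_rugosity(z, dx):
    """Class-3 metric: contour (chain) length divided by straight chord length."""
    contour = np.sum(np.hypot(np.diff(z), dx))
    chord = dx * (len(z) - 1)
    return contour / chord

# Synthetic reef profile: metre-scale bommies plus centimetre-scale roughness.
dx = 0.05                                    # 5 cm sampling interval (m)
x = np.arange(0, 50, dx)
z = 0.6 * np.sin(2 * np.pi * x / 8) + 0.05 * np.random.default_rng(5).normal(size=x.size)

print(f"RMS roughness ~ {rms_roughness(z):.2f} m, rugosity ~ {linear_rugosity(z, dx):.2f}")
```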

  11. Quantifying and measuring cyber resiliency

    NASA Astrophysics Data System (ADS)

    Cybenko, George

    2016-05-01

    Cyber resiliency has become an increasingly attractive research and operational concept in cyber security. While several metrics have been proposed for quantifying cyber resiliency, a considerable gap remains between those metrics and operationally measurable and meaningful concepts that can be empirically determined in a scientific manner. This paper describes a concrete notion of cyber resiliency that can be tailored to meet specific needs of organizations that seek to introduce resiliency into their assessment of their cyber security posture.

  12. Quantifying Bell nonlocality with the trace distance

    NASA Astrophysics Data System (ADS)

    Brito, S. G. A.; Amaral, B.; Chaves, R.

    2018-02-01

    Measurements performed on distant parts of an entangled quantum state can generate correlations incompatible with classical theories respecting the assumption of local causality. This is the phenomenon known as quantum nonlocality that, apart from its fundamental role, can also be put to practical use in applications such as cryptography and distributed computing. Clearly, developing ways of quantifying nonlocality is an important primitive in this scenario. Here, we propose to quantify the nonlocality of a given probability distribution via its trace distance to the set of classical correlations. We show that this measure is a monotone under the free operations of a resource theory and, furthermore, that it can be computed efficiently with a linear program. We put our framework to use in a variety of relevant Bell scenarios also comparing the trace distance to other standard measures in the literature.
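
    A minimal sketch of the linear-program idea described above for the CHSH scenario: minimize the total variation between a target behaviour (here the Tsirelson-bound quantum correlations) and the convex hull of the 16 local deterministic strategies. The normalization of the final trace-distance value is an assumption, not necessarily the paper's convention.

```python
import numpy as np
from scipy.optimize import linprog

# Local deterministic strategies: Alice outputs a = A[x], Bob outputs b = B[y].
strategies = [(A, B) for A in [(0, 0), (0, 1), (1, 0), (1, 1)]
                     for B in [(0, 0), (0, 1), (1, 0), (1, 1)]]

# The 16 behaviour entries p(a, b | x, y) of the CHSH scenario.
entries = [(a, b, x, y) for x in (0, 1) for y in (0, 1)
                        for a in (0, 1) for b in (0, 1)]

# D[i, j] = 1 if deterministic strategy j reproduces entry i.
D = np.array([[1.0 if (A[x] == a and B[y] == b) else 0.0
               for (A, B) in strategies] for (a, b, x, y) in entries])

# Target: quantum behaviour saturating the Tsirelson bound (CHSH = 2*sqrt(2)).
p = np.array([0.25 * (1 + (-1) ** (a ^ b ^ (x * y)) / np.sqrt(2))
              for (a, b, x, y) in entries])

n_e, n_s = D.shape
c = np.concatenate([np.zeros(n_s), np.ones(n_e)])     # minimise sum of slacks t
A_ub = np.block([[ D, -np.eye(n_e)],                  #  D w - t <=  p
                 [-D, -np.eye(n_e)]])                 # -D w - t <= -p
b_ub = np.concatenate([p, -p])
A_eq = np.concatenate([np.ones(n_s), np.zeros(n_e)])[None, :]   # weights sum to 1
res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0], bounds=(0, None))

# One plausible normalisation: half the total variation, averaged over the 4 inputs.
print("trace-distance nonlocality ~", round(0.5 * res.fun / 4, 4))
```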

  13. Classifying and quantifying basins of attraction

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sprott, J. C.; Xiong, Anda

    2015-08-15

    A scheme is proposed to classify the basins for attractors of dynamical systems in arbitrary dimensions. There are four basic classes depending on their size and extent, and each class can be further quantified to facilitate comparisons. The calculation uses a Monte Carlo method and is applied to numerous common dissipative chaotic maps and flows in various dimensions.

  14. Subtleties of Hidden Quantifiers in Implication

    ERIC Educational Resources Information Center

    Shipman, Barbara A.

    2016-01-01

    Mathematical conjectures and theorems are most often of the form P(x) ⇒ Q(x), meaning ∀x, P(x) ⇒ Q(x). The hidden quantifier ∀x is crucial in understanding the implication as a statement with a truth value. Here P(x) and Q(x) alone are only predicates, without truth values, since they contain unquantified variables. But standard textbook…

  15. Quantifying Semantic Linguistic Maturity in Children

    ERIC Educational Resources Information Center

    Hansson, Kristina; Bååth, Rasmus; Löhndorf, Simone; Sahlén, Birgitta; Sikström, Sverker

    2016-01-01

    We propose a method to quantify "semantic linguistic maturity" (SELMA) based on a high dimensional semantic representation of words created from the co-occurrence of words in a large text corpus. The method was applied to oral narratives from 108 children aged 4;0-12;10. By comparing the SELMA measure with maturity ratings made by human…

  16. Diagnosis and characterization of mania: Quantifying increased energy and activity in the human behavioral pattern monitor

    PubMed Central

    Perry, William; McIlwain, Meghan; Kloezeman, Karen; Henry, Brook L.; Minassian, Arpi

    2016-01-01

    Increased energy or activity is now an essential feature of the mania of Bipolar Disorder (BD) according to DSM-5. This study examined whether objective measures of increased energy can differentiate manic BD individuals and provide greater diagnostic accuracy compared to rating scales, extending the work of previous studies with smaller samples. We also tested the relationship between objective measures of energy and rating scales. 50 hospitalized manic BD patients were compared to healthy subjects (HCS, n=39) in the human Behavioral Pattern Monitor (hBPM) which quantifies motor activity and goal-directed behavior in an environment containing novel stimuli. Archival hBPM data from 17 schizophrenia patients were used in sensitivity and specificity analyses. Manic BD patients exhibited higher motor activity than HCS and higher novel object interactions. hBPM activity measures were not correlated with observer-rated symptoms, and hBPM activity was more sensitive in accurately classifying hospitalized BD subjects than observer ratings. Although the findings can only be generalized to inpatient populations, they suggest that increased energy, particularly specific and goal-directed exploration, is a distinguishing feature of BD mania and is best quantified by objective measures of motor activity. A better understanding is needed of the biological underpinnings of this cardinal feature. PMID:27138818

  17. Repeat haematinic requests in patients with previous normal results: the scale of the problem in elderly patients at a district general hospital.

    PubMed

    Ganiyu-Dada, Z; Bowcock, S

    2011-12-01

    Repeating normal laboratory tests can waste resources. This study aimed to quantify unnecessary repeat haematinic tests taken from elderly patients in a district general hospital. Haematinic tests (ferritin, B12, serum folate) from patients aged ≥ 70 years were reviewed for repeat tests during an 8-week period. Questionnaires were given to doctors to establish when they considered repeating a 'borderline low normal' result to be clinically justifiable. Overall, 7.7% of all haematinic tests were repeat tests, and of these the majority (83%) were performed following a previously normal result. Thirteen of 24 doctors believed repeating a normal result at the bottom of the normal range ('borderline low normal') was justifiable. After excluding 'borderline low normal' results, at least 6.0% of repeat tests were performed following a previous normal result and were unnecessary. This audit showed that a significant number of unnecessary repeat haematinic tests are being performed. © 2011 Blackwell Publishing Ltd.

  18. 14 CFR 60.17 - Previously qualified FSTDs.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 14 Aeronautics and Space 2 2014-01-01 2014-01-01 false Previously qualified FSTDs. 60.17 Section 60.17 Aeronautics and Space FEDERAL AVIATION ADMINISTRATION, DEPARTMENT OF TRANSPORTATION (CONTINUED) AIRMEN FLIGHT SIMULATION TRAINING DEVICE INITIAL AND CONTINUING QUALIFICATION AND USE § 60.17 Previously...

  19. 14 CFR 60.17 - Previously qualified FSTDs.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 14 Aeronautics and Space 2 2012-01-01 2012-01-01 false Previously qualified FSTDs. 60.17 Section 60.17 Aeronautics and Space FEDERAL AVIATION ADMINISTRATION, DEPARTMENT OF TRANSPORTATION (CONTINUED) AIRMEN FLIGHT SIMULATION TRAINING DEVICE INITIAL AND CONTINUING QUALIFICATION AND USE § 60.17 Previously...

  20. 14 CFR 60.17 - Previously qualified FSTDs.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 14 Aeronautics and Space 2 2011-01-01 2011-01-01 false Previously qualified FSTDs. 60.17 Section 60.17 Aeronautics and Space FEDERAL AVIATION ADMINISTRATION, DEPARTMENT OF TRANSPORTATION (CONTINUED) AIRMEN FLIGHT SIMULATION TRAINING DEVICE INITIAL AND CONTINUING QUALIFICATION AND USE § 60.17 Previously...

  1. 14 CFR 60.17 - Previously qualified FSTDs.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 14 Aeronautics and Space 2 2013-01-01 2013-01-01 false Previously qualified FSTDs. 60.17 Section 60.17 Aeronautics and Space FEDERAL AVIATION ADMINISTRATION, DEPARTMENT OF TRANSPORTATION (CONTINUED) AIRMEN FLIGHT SIMULATION TRAINING DEVICE INITIAL AND CONTINUING QUALIFICATION AND USE § 60.17 Previously...

  2. 14 CFR 60.17 - Previously qualified FSTDs.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 14 Aeronautics and Space 2 2010-01-01 2010-01-01 false Previously qualified FSTDs. 60.17 Section 60.17 Aeronautics and Space FEDERAL AVIATION ADMINISTRATION, DEPARTMENT OF TRANSPORTATION (CONTINUED) AIRMEN FLIGHT SIMULATION TRAINING DEVICE INITIAL AND CONTINUING QUALIFICATION AND USE § 60.17 Previously...

  3. 49 CFR 173.23 - Previously authorized packaging.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 49 Transportation 2 2010-10-01 2010-10-01 false Previously authorized packaging. 173.23 Section... REQUIREMENTS FOR SHIPMENTS AND PACKAGINGS Preparation of Hazardous Materials for Transportation § 173.23 Previously authorized packaging. (a) When the regulations specify a packaging with a specification marking...

  4. Quantifying Stock Return Distributions in Financial Markets.

    PubMed

    Botta, Federico; Moat, Helen Susannah; Stanley, H Eugene; Preis, Tobias

    2015-01-01

    Being able to quantify the probability of large price changes in stock markets is of crucial importance in understanding financial crises that affect the lives of people worldwide. Large changes in stock market prices can arise abruptly, within a matter of minutes, or develop across much longer time scales. Here, we analyze a dataset comprising the stocks forming the Dow Jones Industrial Average at a second-by-second resolution in the period from January 2008 to July 2010 in order to quantify the distribution of changes in market prices at a range of time scales. We find that the tails of the distributions of logarithmic price changes, or returns, exhibit power law decays for time scales ranging from 300 seconds to 3600 seconds. For larger time scales, we find that the distributions' tails exhibit exponential decay. Our findings may inform the development of models of market behavior across varying time scales.
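
    A rough sketch of the kind of tail analysis described above: compute logarithmic returns at a given time scale and estimate the tail exponent with a Hill estimator. The price series below is a simulated stand-in, and the 5% tail fraction is an arbitrary choice.

```python
import numpy as np

def log_returns(prices, dt):
    """Logarithmic returns over a time scale of dt samples."""
    prices = np.asarray(prices, dtype=float)
    return np.log(prices[dt:]) - np.log(prices[:-dt])

def hill_tail_exponent(returns, tail_fraction=0.05):
    """Hill estimator of the tail exponent alpha, with P(|r| > x) ~ x^(-alpha)."""
    x = np.sort(np.abs(returns))[::-1]          # descending order statistics
    k = max(int(tail_fraction * len(x)), 10)
    return k / np.sum(np.log(x[:k] / x[k]))

# Hypothetical second-by-second price series (heavy-tailed random walk stand-in).
rng = np.random.default_rng(1)
prices = 100 * np.exp(np.cumsum(rng.standard_t(df=3, size=50_000) * 1e-4))
for dt in (300, 3600):
    r = log_returns(prices, dt)
    print(f"dt = {dt:5d} s  tail exponent ~ {hill_tail_exponent(r):.2f}")
```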

  5. 22 CFR 40.91 - Certain aliens previously removed.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 22 Foreign Relations 1 2012-04-01 2012-04-01 false Certain aliens previously removed. 40.91... IMMIGRANTS UNDER THE IMMIGRATION AND NATIONALITY ACT, AS AMENDED Aliens Previously Removed § 40.91 Certain aliens previously removed. (a) 5-year bar. An alien who has been found inadmissible, whether as a result...

  6. 22 CFR 40.91 - Certain aliens previously removed.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 22 Foreign Relations 1 2013-04-01 2013-04-01 false Certain aliens previously removed. 40.91... IMMIGRANTS UNDER THE IMMIGRATION AND NATIONALITY ACT, AS AMENDED Aliens Previously Removed § 40.91 Certain aliens previously removed. (a) 5-year bar. An alien who has been found inadmissible, whether as a result...

  7. 22 CFR 40.91 - Certain aliens previously removed.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 22 Foreign Relations 1 2014-04-01 2014-04-01 false Certain aliens previously removed. 40.91... IMMIGRANTS UNDER THE IMMIGRATION AND NATIONALITY ACT, AS AMENDED Aliens Previously Removed § 40.91 Certain aliens previously removed. (a) 5-year bar. An alien who has been found inadmissible, whether as a result...

  8. 22 CFR 40.91 - Certain aliens previously removed.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 22 Foreign Relations 1 2010-04-01 2010-04-01 false Certain aliens previously removed. 40.91... IMMIGRANTS UNDER THE IMMIGRATION AND NATIONALITY ACT, AS AMENDED Aliens Previously Removed § 40.91 Certain aliens previously removed. (a) 5-year bar. An alien who has been found inadmissible, whether as a result...

  9. 22 CFR 40.91 - Certain aliens previously removed.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 22 Foreign Relations 1 2011-04-01 2011-04-01 false Certain aliens previously removed. 40.91... IMMIGRANTS UNDER THE IMMIGRATION AND NATIONALITY ACT, AS AMENDED Aliens Previously Removed § 40.91 Certain aliens previously removed. (a) 5-year bar. An alien who has been found inadmissible, whether as a result...

  10. A stochastic approach for quantifying immigrant integration: the Spanish test case

    NASA Astrophysics Data System (ADS)

    Agliari, Elena; Barra, Adriano; Contucci, Pierluigi; Sandell, Richard; Vernia, Cecilia

    2014-10-01

    We apply stochastic process theory to the analysis of immigrant integration. Using a unique and detailed data set from Spain, we study the relationship between local immigrant density and two social and two economic immigration quantifiers for the period 1999-2010. As opposed to the classic time-series approach, by letting immigrant density play the role of ‘time’ and the quantifier the role of ‘space,’ it becomes possible to analyse the behavior of the quantifiers by means of continuous time random walks. Two classes of results are then obtained. First, we show that social integration quantifiers evolve following diffusion law, while the evolution of economic quantifiers exhibits ballistic dynamics. Second, we make predictions of best- and worst-case scenarios taking into account large local fluctuations. Our stochastic process approach to integration lends itself to interesting forecasting scenarios which, in the hands of policy makers, have the potential to improve political responses to integration problems. For instance, estimating the standard first-passage time and maximum-span walk reveals local differences in integration performance for different immigration scenarios. Thus, by recognizing the importance of local fluctuations around national means, this research constitutes an important tool to assess the impact of immigration phenomena on municipal budgets and to set up solid multi-ethnic plans at the municipal level as immigration pressures build.
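
    A schematic of the diffusive-versus-ballistic distinction and the first-passage idea mentioned above, with immigrant density playing the role of 'time' and the quantifier the role of position; the drift, noise level, target and truncation are all invented for illustration.

```python
import numpy as np

def mean_first_passage(drift, sigma, target, n_walks=2000, max_steps=5000, seed=6):
    """Monte Carlo estimate of the mean first-passage 'time' to reach `target`.

    Walks that never reach the target within `max_steps` are excluded, which
    truncates the estimate; a full analysis would treat these more carefully.
    """
    rng = np.random.default_rng(seed)
    steps = drift + sigma * rng.normal(size=(n_walks, max_steps))
    paths = np.cumsum(steps, axis=1)
    hit = paths >= target
    reached = hit.any(axis=1)
    first = hit[reached].argmax(axis=1) + 1     # index of first crossing per walk
    return first.mean()

# 'Time' is local immigrant density; the quantifier plays the role of position.
print("diffusive (social)   MFPT ~", round(mean_first_passage(0.0, 1.0, target=20), 1))
print("ballistic (economic) MFPT ~", round(mean_first_passage(0.1, 1.0, target=20), 1))
```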

  11. Divers models of divalent cation interaction to calcium-binding proteins: techniques and anthology.

    PubMed

    Cox, Jos A

    2013-01-01

    Intracellular Ca(2+)-binding proteins (CaBPs) are sensors of the calcium signal and several of them even shape the signal. Most of them are equipped with at least two EF-hand motifs designed to bind Ca(2+). Their affinities are very variable, can display cooperative effects, and can be modulated by physiological Mg(2+) concentrations. These binding phenomena are monitored by four major techniques: equilibrium dialysis, fluorimetry with fluorescent Ca(2+) indicators, flow dialysis, and isothermal titration calorimetry. In the last quarter of the twentieth century reports on the ion-binding characteristics of several abundant wild-type CaBPs were published. With the advent of recombinant CaBPs it became possible to determine these properties on previously inaccessible proteins. Here I report on studies by our group carried out in the last decade on eight families of recombinant CaBPs, their mutants, or truncated domains. Moreover this chapter deals with the currently used methods for quantifying the binding of Ca(2+) and Mg(2+) to CaBPs.

  12. Quantifying parametric uncertainty in the Rothermel model

    Treesearch

    S. Goodrick

    2008-01-01

    The purpose of the present work is to quantify parametric uncertainty in the Rothermel wildland fire spread model (implemented in software such as fire spread models in the United States. This model consists of a non-linear system of equations that relates environmental variables (input parameter groups...

  13. 18 CFR 154.302 - Previously submitted material.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 18 Conservation of Power and Water Resources 1 2014-04-01 2014-04-01 false Previously submitted material. 154.302 Section 154.302 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY... Filed With Changes § 154.302 Previously submitted material. (a) If all, or any portion, of the...

  14. 18 CFR 154.302 - Previously submitted material.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 18 Conservation of Power and Water Resources 1 2012-04-01 2012-04-01 false Previously submitted material. 154.302 Section 154.302 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY... Filed With Changes § 154.302 Previously submitted material. (a) If all, or any portion, of the...

  15. 18 CFR 154.302 - Previously submitted material.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 18 Conservation of Power and Water Resources 1 2011-04-01 2011-04-01 false Previously submitted material. 154.302 Section 154.302 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY... Filed With Changes § 154.302 Previously submitted material. (a) If all, or any portion, of the...

  16. 18 CFR 154.302 - Previously submitted material.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 18 Conservation of Power and Water Resources 1 2013-04-01 2013-04-01 false Previously submitted material. 154.302 Section 154.302 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY... Filed With Changes § 154.302 Previously submitted material. (a) If all, or any portion, of the...

  17. Quantifying Factors That Impact Riverbed Dynamic Permeability at a Riverbank Filtration Facility

    NASA Astrophysics Data System (ADS)

    Ulrich, C.; Hubbard, S. S.; Florsheim, J. L.; Rosenberry, D. O.; Borglin, S. E.; Zhang, Y.; Seymour, D.; Trotta, M.

    2012-12-01

    Previous modeling studies of the Wohler riverbank filtration system on the Russian River, California suggested that riverbed and aquifer permeability both influence the development of a pumping-induced unsaturated zone below the riverbed, which affects water produced through large radial water-supply collector wells that extend beneath and adjacent to the river. In particular, previous work suggests that riverbed permeability is influenced by interaction between pumping and river stage that is controlled by a downstream temporary inflatable dam during the summer low-flow period. We hypothesize that raising the dam may instead lead to deposition of fine-grained sediment and/or accumulation of biota, both of which decrease riverbed permeability in the vicinity of the collector wells. To test this hypothesis, we are monitoring streambed permeability and seepage as a function of river stage and dam operation. We are using multiple methods to monitor the hydrological, sedimentological and geomorphic dynamics, including: seepage meters, sediment traps, cryogenic coring, ground penetrating radar, electrical resistance tomography, riverbed topography, piezometers, and thermistors. Here we discuss the use of this novel suite of methods to quantify dynamic riverbed permeability, relate it to dam operation, and determine the key controls on permeability (i.e., biotic or abiotic). These results are expected to improve the overall understanding of riverbed permeability dynamics associated with riverbank filtration. The results are also expected to be transferable to the project sponsors, the Sonoma County Water Agency, toward the development of an optimal pumping and dam operation schedule.

  18. New measurements quantify atmospheric greenhouse effect

    NASA Astrophysics Data System (ADS)

    Bhattacharya, Atreyee

    2012-10-01

    In spite of a large body of existing measurements of incoming short-wave solar radiation and outgoing long-wave terrestrial radiation at the surface of the Earth and, more recently, in the upper atmosphere, there are few observations documenting how radiation profiles change through the atmosphere—information that is necessary to fully quantify the greenhouse effect of Earth's atmosphere. Through the use of existing technology but employing improvements in observational techniques it may now be possible not only to quantify but also to understand how different components of the atmosphere (e.g., concentration of gases, cloud cover, moisture, and aerosols) contribute to the greenhouse effect. Using weather balloons equipped with radiosondes, Philipona et al. continuously measured radiation fluxes from the surface of Earth up to altitudes of 35 kilometers in the upper stratosphere. Combining data from flights conducted during both day and night with continuous 24-hour measurements made at the surface of the Earth, the researchers created radiation profiles of all four components necessary to fully capture the radiation budget of Earth, namely, the upward and downward short-wave and long-wave radiation as a function of altitude.

  19. Quantifying meta-correlations in financial markets

    NASA Astrophysics Data System (ADS)

    Kenett, Dror Y.; Preis, Tobias; Gur-Gershgoren, Gitit; Ben-Jacob, Eshel

    2012-08-01

    Financial markets are modular multi-level systems, in which the relationships between the individual components are not constant in time. Sudden changes in these relationships significantly affect the stability of the entire system, and vice versa. Our analysis is based on historical daily closing prices of the 30 components of the Dow Jones Industrial Average (DJIA) from March 15th, 1939 until December 31st, 2010. We quantify the correlation among these components by determining Pearson correlation coefficients, to investigate whether mean correlation of the entire portfolio can be used as a precursor for changes in the index return. To this end, we quantify the meta-correlation - the correlation of mean correlation and index return. We find that changes in index returns are significantly correlated with changes in mean correlation. Furthermore, we study the relationship between the index return and correlation volatility - the standard deviation of correlations for a given time interval. This parameter provides further evidence of the effect of the index on market correlations and their fluctuations. Our empirical findings provide new information and quantification of the index leverage effect, and have implications to risk management, portfolio optimization, and to the increased stability of financial markets.
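
    A minimal sketch of the meta-correlation computation described above: the Pearson correlation between the rolling mean pairwise correlation of the components and the index return over the same windows. The window length and simulated returns below are placeholders, not the DJIA data.

```python
import numpy as np
import pandas as pd

def meta_correlation(returns, index_returns, window=22):
    """Correlation between rolling mean pairwise correlation and index return.

    `returns`: DataFrame of daily component returns (columns = stocks).
    `index_returns`: Series of daily index returns on the same dates.
    """
    mean_corr, idx_ret = [], []
    for end in range(window, len(returns) + 1):
        win = returns.iloc[end - window:end]
        corr = win.corr().to_numpy()
        iu = np.triu_indices_from(corr, k=1)          # upper triangle, no diagonal
        mean_corr.append(corr[iu].mean())
        idx_ret.append(index_returns.iloc[end - window:end].mean())
    return np.corrcoef(mean_corr, idx_ret)[0, 1]

# Hypothetical data: 30 correlated components over 500 trading days.
rng = np.random.default_rng(2)
common = rng.normal(size=(500, 1))
rets = pd.DataFrame(0.5 * common + rng.normal(size=(500, 30)))
index = rets.mean(axis=1)
print(f"meta-correlation ~ {meta_correlation(rets, index):.2f}")
```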

  20. Quantifying the Impact of Scenic Environments on Health

    NASA Astrophysics Data System (ADS)

    Seresinhe, Chanuki Illushka; Preis, Tobias; Moat, Helen Susannah

    2015-11-01

    Few people would deny an intuitive sense of increased wellbeing when spending time in beautiful locations. Here, we ask: can we quantify the relationship between environmental aesthetics and human health? We draw on data from Scenic-Or-Not, a website that crowdsources ratings of “scenicness” for geotagged photographs across Great Britain, in combination with data on citizen-reported health from the Census for England and Wales. We find that inhabitants of more scenic environments report better health, across urban, suburban and rural areas, even when taking core socioeconomic indicators of deprivation into account, such as income, employment and access to services. Our results provide evidence in line with the striking hypothesis that the aesthetics of the environment may have quantifiable consequences for our wellbeing.

  1. Developing a molecular picture of soil organic matter–mineral interactions by quantifying organo–mineral binding

    DOE PAGES

    Newcomb, C. J.; Qafoku, N. P.; Grate, J. W.; ...

    2017-08-30

    Long residence times of soil organic matter have been attributed to reactive mineral surface sites that sorb organic species and cause inaccessibility due to isolation and chemical stabilization at the organic-mineral interface. Instrumentation for probing this interface is limited. As a result, much of the micron- and molecular-scale knowledge about organic-mineral interactions remains largely qualitative. We report the use of force spectroscopy to directly measure the binding between organic ligands with known chemical functionalities to soil minerals in aqueous environments. By systematically studying the role of organic functional group chemistry with model minerals, we demonstrate that the chemistry of both the organic ligand and mineral contribute to values of binding free energy and that changes in pH and ionic strength produce significant differences in binding energies. These direct measurements of molecular binding provide mechanistic insights into organo-mineral interactions, which could potentially inform land-carbon models that explicitly include mineral-bound C pools.

  2. Developing a molecular picture of soil organic matter–mineral interactions by quantifying organo–mineral binding

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Newcomb, C. J.; Qafoku, N. P.; Grate, J. W.

    Long residence times of soil organic matter have been attributed to reactive mineral surface sites that sorb organic species and cause inaccessibility due to isolation and chemical stabilization at the organic-mineral interface. Instrumentation for probing this interface is limited. As a result, much of the micron- and molecular-scale knowledge about organic-mineral interactions remains largely qualitative. We report the use of force spectroscopy to directly measure the binding between organic ligands with known chemical functionalities to soil minerals in aqueous environments. By systematically studying the role of organic functional group chemistry with model minerals, we demonstrate that the chemistry of both the organic ligand and mineral contribute to values of binding free energy and that changes in pH and ionic strength produce significant differences in binding energies. These direct measurements of molecular binding provide mechanistic insights into organo-mineral interactions, which could potentially inform land-carbon models that explicitly include mineral-bound C pools.

  3. Accessibility Evaluation of Top-Ranking University Websites in World, Oceania, and Arab Categories for Home, Admission, and Course Description Webpages

    ERIC Educational Resources Information Center

    Alahmadi, Tahani; Drew, Steve

    2017-01-01

    Evaluating accessibility is an important equity step in assessing the effectiveness and usefulness of online learning materials for students with disabilities such as visual or hearing impairments. Previous studies in this area have indicated that, over time, university websites have become gradually more inaccessible. This paper relates findings…

  4. Quantifying Stock Return Distributions in Financial Markets

    PubMed Central

    Botta, Federico; Moat, Helen Susannah; Stanley, H. Eugene; Preis, Tobias

    2015-01-01

    Being able to quantify the probability of large price changes in stock markets is of crucial importance in understanding financial crises that affect the lives of people worldwide. Large changes in stock market prices can arise abruptly, within a matter of minutes, or develop across much longer time scales. Here, we analyze a dataset comprising the stocks forming the Dow Jones Industrial Average at a second-by-second resolution in the period from January 2008 to July 2010 in order to quantify the distribution of changes in market prices at a range of time scales. We find that the tails of the distributions of logarithmic price changes, or returns, exhibit power law decays for time scales ranging from 300 seconds to 3600 seconds. For larger time scales, we find that the distributions' tails exhibit exponential decay. Our findings may inform the development of models of market behavior across varying time scales. PMID:26327593
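
    A rough sketch of how the tail of the return distribution at a given time scale might be examined, using a Hill estimator as one common tail-exponent estimate. The price array, time scale, and tail fraction are hypothetical, and the original analysis may have used a different estimator.

    ```python
    # Illustrative sketch of a tail-exponent estimate (assumptions: `prices` is a
    # hypothetical 1-second price series as a NumPy array; the time scale and
    # tail fraction are arbitrary choices, not the paper's).
    import numpy as np

    def tail_exponent(prices: np.ndarray, scale_seconds: int = 300,
                      tail_frac: float = 0.05) -> float:
        """Hill estimate of the power-law exponent of the absolute-return tail."""
        log_p = np.log(prices)
        returns = log_p[scale_seconds:] - log_p[:-scale_seconds]  # returns at the chosen scale
        tail = np.sort(np.abs(returns))[::-1]                     # largest absolute returns first
        k = max(int(tail_frac * len(tail)), 2)                    # number of tail observations
        return 1.0 / np.mean(np.log(tail[:k] / tail[k]))          # Hill estimator of alpha
    ```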

  5. Quantifying temporal glucose variability in diabetes via continuous glucose monitoring: mathematical methods and clinical application.

    PubMed

    Kovatchev, Boris P; Clarke, William L; Breton, Marc; Brayman, Kenneth; McCall, Anthony

    2005-12-01

    Continuous glucose monitors (CGMs) collect detailed blood glucose (BG) time series, which carry significant information about the dynamics of BG fluctuations. In contrast, the methods for analysis of CGM data remain those developed for infrequent BG self-monitoring. As a result, important information about the temporal structure of the data is lost during the translation of raw sensor readings into clinically interpretable statistics and images. The following mathematical methods are introduced into the field of CGM data interpretation: (1) analysis of BG rate of change; (2) risk analysis using previously reported Low/High BG Indices and Poincare (lag) plot of risk associated with temporal BG variability; and (3) spatial aggregation of the process of BG fluctuations and its Markov chain visualization. The clinical application of these methods is illustrated by analysis of data of a patient with Type 1 diabetes mellitus who underwent islet transplantation and with data from clinical trials. Normative data [12,025 reference (YSI device, Yellow Springs Instruments, Yellow Springs, OH) BG determinations] in patients with Type 1 diabetes mellitus who underwent insulin and glucose challenges suggest that the 90%, 95%, and 99% confidence intervals of BG rate of change that could be maximally sustained over 15-30 min are [-2,2], [-3,3], and [-4,4] mg/dL/min, respectively. BG dynamics and risk parameters clearly differentiated the stages of transplantation and the effects of medication. Aspects of treatment were clearly visualized by graphs of BG rate of change and Low/High BG Indices, by a Poincare plot of risk for rapid BG fluctuations, and by a plot of the aggregated Markov process. Advanced analysis and visualization of CGM data allow for evaluation of dynamical characteristics of diabetes and reveal clinical information that is inaccessible via standard statistics, which do not take into account the temporal structure of the data. The use of such methods improves the
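
    A minimal sketch of the BG rate-of-change computation underlying the analysis above. The input arrays are hypothetical CGM readings; only the confidence bands quoted in the abstract come from the text.

    ```python
    # Sketch of the BG rate-of-change analysis described above (assumptions:
    # `times_min` and `bg_mgdl` are hypothetical CGM timestamps in minutes and
    # glucose readings in mg/dL; band values follow the abstract's intervals).
    import numpy as np

    def bg_rate_of_change(times_min: np.ndarray, bg_mgdl: np.ndarray) -> np.ndarray:
        """Local BG rate of change (mg/dL/min) between consecutive CGM readings."""
        return np.diff(bg_mgdl) / np.diff(times_min)

    def fraction_outside_band(rates: np.ndarray, band: float = 2.0) -> float:
        """Fraction of rates falling outside [-band, band] mg/dL/min (e.g. 2, 3, or 4)."""
        return float(np.mean(np.abs(rates) > band))
    ```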

  6. Quantifying fibrosis in head and neck cancer treatment: An overview.

    PubMed

    Moloney, Emma C; Brunner, Markus; Alexander, Ashlin J; Clark, Jonathan

    2015-08-01

    Fibrosis is a common late complication of radiotherapy and/or surgical treatment for head and neck cancers. Fibrosis is difficult to quantify and formal methods of measure are not well recognized. The purpose of this review was to summarize the methods available to quantify neck fibrosis. A PubMed search of articles was carried out using key words "neck" and "fibrosis." Many methods have been used to assess fibrosis, however, there is no preferred methodology. Specific to neck fibrosis, most studies have relied upon hand palpation rating scales. Indentation and suction techniques have been used to mechanically quantify neck fibrosis. There is scope to develop applications of ultrasound, dielectric, bioimpedance, and MRI techniques for use in the neck region. Quantitative assessment of neck fibrosis is sought after in order to compare treatment regimens and improve quality of life outcomes in patients with head and neck cancer. © 2014 Wiley Periodicals, Inc.

  7. Global Distribution of Extreme Precipitation and High-Impact Landslides in 2010 Relative to Previous Years

    NASA Technical Reports Server (NTRS)

    Kirschbaum, Dalia; Adler, Robert; Adler, David; Peters-Lidard, Christa; Huffman, George

    2012-01-01

    It is well known that extreme or prolonged rainfall is the dominant trigger of landslides worldwide. While research has evaluated the spatiotemporal distribution of extreme rainfall and landslides at local or regional scales using in situ data, few studies have mapped rainfall-triggered landslide distribution globally due to the dearth of landslide data and consistent precipitation information. This study uses a newly developed Global Landslide Catalog (GLC) and a 13-year satellite-based precipitation record from TRMM data. For the first time, these two unique products provide the foundation to quantitatively evaluate the co-occurrence of precipitation and landslides globally. Evaluation of the GLC indicates that 2010 had a large number of high-impact landslide events relative to previous years. This study considers how variations in extreme and prolonged satellite-based rainfall are related to the distribution of landslides over the same time scales for three active landslide areas: Central America, the Himalayan Arc, and central-eastern China. Several test statistics confirm that TRMM rainfall generally scales with the observed increase in landslide reports and fatal events for 2010 and previous years over each region. These findings suggest that the co-occurrence of satellite precipitation and landslide reports may serve as a valuable indicator for characterizing the spatiotemporal distribution of landslide-prone areas in order to establish a global rainfall-triggered landslide climatology. This study characterizes the variability of satellite precipitation data and reported landslide activity at the global scale in order to improve landslide cataloging and forecasting and to quantify potential triggering sources at daily, monthly, and yearly time scales.

  8. Quantifying pollen-vegetation relationships to reconstruct ancient forests using 19th-century forest composition and pollen data

    USGS Publications Warehouse

    Dawson, Andria; Paciorek, Christopher J.; McLachlan, Jason S.; Goring, Simon; Williams, John W.; Jackson, Stephen T.

    2016-01-01

    Mitigation of climate change and adaptation to its effects relies partly on how effectively land-atmosphere interactions can be quantified. Quantifying composition of past forest ecosystems can help understand processes governing forest dynamics in a changing world. Fossil pollen data provide information about past forest composition, but rigorous interpretation requires development of pollen-vegetation models (PVMs) that account for interspecific differences in pollen production and dispersal. Widespread and intensified land-use over the 19th and 20th centuries may have altered pollen-vegetation relationships. Here we use STEPPS, a Bayesian hierarchical spatial PVM, to estimate key process parameters and associated uncertainties in the pollen-vegetation relationship. We apply alternate dispersal kernels, and calibrate STEPPS using a newly developed Euro-American settlement-era calibration data set constructed from Public Land Survey data and fossil pollen samples matched to the settlement-era using expert elicitation. Models based on the inverse power-law dispersal kernel outperformed those based on the Gaussian dispersal kernel, indicating that pollen dispersal kernels are fat tailed. Pine and birch have the highest pollen productivities. Pollen productivity and dispersal estimates are generally consistent with previous understanding from modern data sets, although source area estimates are larger. Tests of model predictions demonstrate the ability of STEPPS to predict regional compositional patterns.

  9. Quantifying pollen-vegetation relationships to reconstruct ancient forests using 19th-century forest composition and pollen data

    NASA Astrophysics Data System (ADS)

    Dawson, Andria; Paciorek, Christopher J.; McLachlan, Jason S.; Goring, Simon; Williams, John W.; Jackson, Stephen T.

    2016-04-01

    Mitigation of climate change and adaptation to its effects relies partly on how effectively land-atmosphere interactions can be quantified. Quantifying composition of past forest ecosystems can help understand processes governing forest dynamics in a changing world. Fossil pollen data provide information about past forest composition, but rigorous interpretation requires development of pollen-vegetation models (PVMs) that account for interspecific differences in pollen production and dispersal. Widespread and intensified land-use over the 19th and 20th centuries may have altered pollen-vegetation relationships. Here we use STEPPS, a Bayesian hierarchical spatial PVM, to estimate key process parameters and associated uncertainties in the pollen-vegetation relationship. We apply alternate dispersal kernels, and calibrate STEPPS using a newly developed Euro-American settlement-era calibration data set constructed from Public Land Survey data and fossil pollen samples matched to the settlement-era using expert elicitation. Models based on the inverse power-law dispersal kernel outperformed those based on the Gaussian dispersal kernel, indicating that pollen dispersal kernels are fat tailed. Pine and birch have the highest pollen productivities. Pollen productivity and dispersal estimates are generally consistent with previous understanding from modern data sets, although source area estimates are larger. Tests of model predictions demonstrate the ability of STEPPS to predict regional compositional patterns.
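
    A small sketch contrasting the two dispersal-kernel shapes discussed above. The functional forms are the generic Gaussian and inverse power-law kernels with illustrative parameters, not the exact STEPPS parameterization.

    ```python
    # Compare a thin-tailed Gaussian kernel with a fat-tailed inverse power-law
    # kernel (assumptions: parameter values and distances are illustrative only).
    import numpy as np

    def gaussian_kernel(r_km: np.ndarray, psi_km: float = 50.0) -> np.ndarray:
        """Thin-tailed kernel: density falls off as exp(-(r/psi)^2)."""
        return np.exp(-(r_km / psi_km) ** 2)

    def inverse_power_law_kernel(r_km: np.ndarray, a_km: float = 50.0,
                                 b: float = 2.5) -> np.ndarray:
        """Fat-tailed kernel: density falls off as (1 + r/a)^(-b)."""
        return (1.0 + r_km / a_km) ** (-b)

    r = np.linspace(0.0, 500.0, 6)
    print(gaussian_kernel(r))            # essentially zero beyond a few hundred km
    print(inverse_power_law_kernel(r))   # retains weight at long distances (fat tail)
    ```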

  10. Neutrophil alveolitis following endotoxemia. Enhancement by previous exposure to hyperoxia.

    PubMed

    Rinaldo, J E; Dauber, J H; Christman, J; Rogers, R M

    1984-12-01

    We injected Escherichia coli endotoxin, 2.5 mg/kg, intraperitoneally in rats, sequentially quantified alveolar inflammation during a 6-day period by several techniques, and observed the effect of previous exposure to hyperoxia on the intensity of alveolitis in this model. As noted in other models of endotoxemia, we found intravascular sequestration of leukocytes and an increase in the retention of 125I albumin in the lung 4 to 6 h after the injection of endotoxin. Bronchoalveolar lavage fluid (BALF) obtained at this time only slightly stimulated the migration of neutrophils in vitro, and the numbers and types of cells recovered by lavage were normal. Fifteen h after the injection of endotoxin, however, bronchoalveolar lavage fluid stimulated both random and directed migration of neutrophils in vitro, although recovery of neutrophils by lavage was increased only slightly. By 24 h, 125I albumin retention had returned to normal levels, but the chemotactic activity of BALF remained high, and the percentage and absolute number of neutrophils recovered by lung lavage were increased markedly. The recovery of neutrophils remained significantly elevated for 3 days but declined to control levels by 6 days, whereas the recovery of alveolar macrophages was increased at this time. Exposure to 100% O2 for 36 h prior to endotoxemia accelerated and intensified neutrophil influx into the lung and increased the stimulatory effect of BALF on neutrophil migration in vitro. We conclude that a single episode of endotoxemia in the rat causes a multi-phasic alveolar inflammatory response, and that this response is accelerated and intensified by prior, mild exposure to hyperoxia.(ABSTRACT TRUNCATED AT 250 WORDS)

  11. Study Quantifies Physical Demands of Yoga in Seniors

    MedlinePlus

    A recent NCCAM-funded study measured the ... performance of seven standing poses commonly taught in senior yoga classes: Chair, Wall Plank, Tree, Warrior II, ...

  12. Quantifying Heuristic Bias: Anchoring, Availability, and Representativeness.

    PubMed

    Richie, Megan; Josephson, S Andrew

    2018-01-01

    Construct: Authors examined whether a new vignette-based instrument could isolate and quantify heuristic bias. Heuristics are cognitive shortcuts that may introduce bias and contribute to error. There is no standardized instrument available to quantify heuristic bias in clinical decision making, limiting future study of educational interventions designed to improve calibration of medical decisions. This study presents validity data to support a vignette-based instrument quantifying bias due to the anchoring, availability, and representativeness heuristics. Participants completed questionnaires requiring assignment of probabilities to potential outcomes of medical and nonmedical scenarios. The instrument randomly presented scenarios in one of two versions: Version A, encouraging heuristic bias, and Version B, worded neutrally. The primary outcome was the difference in probability judgments for Version A versus Version B scenario options. Of 167 participants recruited, 139 enrolled. Participants assigned significantly higher mean probability values to Version A scenario options (M = 9.56, SD = 3.75) than Version B (M = 8.98, SD = 3.76), t(1801) = 3.27, p = .001. This result remained significant analyzing medical scenarios alone (Version A, M = 9.41, SD = 3.92; Version B, M = 8.86, SD = 4.09), t(1204) = 2.36, p = .02. Analyzing medical scenarios by heuristic revealed a significant difference between Version A and B for availability (Version A, M = 6.52, SD = 3.32; Version B, M = 5.52, SD = 3.05), t(404) = 3.04, p = .003, and representativeness (Version A, M = 11.45, SD = 3.12; Version B, M = 10.67, SD = 3.71), t(396) = 2.28, p = .02, but not anchoring. Stratifying by training level, students maintained a significant difference between Version A and B medical scenarios (Version A, M = 9.83, SD = 3.75; Version B, M = 9.00, SD = 3.98), t(465) = 2.29, p = .02, but not residents or attendings. Stratifying by heuristic and training level, availability maintained

  13. Quantifying hypoxia in human cancers using static PET imaging.

    PubMed

    Taylor, Edward; Yeung, Ivan; Keller, Harald; Wouters, Bradley G; Milosevic, Michael; Hedley, David W; Jaffray, David A

    2016-11-21

    Compared to FDG, the signal of 18F-labelled hypoxia-sensitive tracers in tumours is low. This means that in addition to the presence of hypoxic cells, transport properties contribute significantly to the uptake signal in static PET images. This sensitivity to transport must be minimized in order for static PET to provide a reliable standard for hypoxia quantification. A dynamic compartmental model based on a reaction-diffusion formalism was developed to interpret tracer pharmacokinetics and applied to static images of FAZA in twenty patients with pancreatic cancer. We use our model to identify tumour properties-well-perfused without substantial necrosis or partitioning-for which static PET images can reliably quantify hypoxia. Normalizing the measured activity in a tumour voxel by the value in blood leads to a reduction in the sensitivity to variations in 'inter-corporal' transport properties-blood volume and clearance rate-as well as imaging study protocols. Normalization thus enhances the correlation between static PET images and the FAZA binding rate K3, a quantity which quantifies hypoxia in a biologically significant way. The ratio of FAZA uptake in spinal muscle and blood can vary substantially across patients due to long muscle equilibration times. Normalized static PET images of hypoxia-sensitive tracers can reliably quantify hypoxia for homogeneously well-perfused tumours with minimal tissue partitioning. The ideal normalizing reference tissue is blood, either drawn from the patient before PET scanning or imaged using PET. If blood is not available, uniform, homogeneously well-perfused muscle can be used. For tumours that are not homogeneously well-perfused or for which partitioning is significant, only an analysis of dynamic PET scans can reliably quantify hypoxia.

  14. Quantifying hypoxia in human cancers using static PET imaging

    NASA Astrophysics Data System (ADS)

    Taylor, Edward; Yeung, Ivan; Keller, Harald; Wouters, Bradley G.; Milosevic, Michael; Hedley, David W.; Jaffray, David A.

    2016-11-01

    Compared to FDG, the signal of 18F-labelled hypoxia-sensitive tracers in tumours is low. This means that in addition to the presence of hypoxic cells, transport properties contribute significantly to the uptake signal in static PET images. This sensitivity to transport must be minimized in order for static PET to provide a reliable standard for hypoxia quantification. A dynamic compartmental model based on a reaction-diffusion formalism was developed to interpret tracer pharmacokinetics and applied to static images of FAZA in twenty patients with pancreatic cancer. We use our model to identify tumour properties—well-perfused without substantial necrosis or partitioning—for which static PET images can reliably quantify hypoxia. Normalizing the measured activity in a tumour voxel by the value in blood leads to a reduction in the sensitivity to variations in ‘inter-corporal’ transport properties—blood volume and clearance rate—as well as imaging study protocols. Normalization thus enhances the correlation between static PET images and the FAZA binding rate K3, a quantity which quantifies hypoxia in a biologically significant way. The ratio of FAZA uptake in spinal muscle and blood can vary substantially across patients due to long muscle equilibration times. Normalized static PET images of hypoxia-sensitive tracers can reliably quantify hypoxia for homogeneously well-perfused tumours with minimal tissue partitioning. The ideal normalizing reference tissue is blood, either drawn from the patient before PET scanning or imaged using PET. If blood is not available, uniform, homogeneously well-perfused muscle can be used. For tumours that are not homogeneously well-perfused or for which partitioning is significant, only an analysis of dynamic PET scans can reliably quantify hypoxia.
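
    A minimal sketch of the blood-normalization step described above. The voxel activities, blood value, and tumour-to-blood threshold are hypothetical placeholders.

    ```python
    # Sketch of normalizing static-PET tumour voxel activity by blood activity
    # (assumptions: inputs and the threshold below are illustrative, not the
    # study's calibration).
    import numpy as np

    def tumour_to_blood_ratio(tumour_voxels: np.ndarray, blood_activity: float) -> np.ndarray:
        """Normalize voxel activity by blood activity to reduce sensitivity to transport."""
        return tumour_voxels / blood_activity

    def hypoxic_fraction(tbr: np.ndarray, threshold: float = 1.2) -> float:
        """Fraction of voxels above a (hypothetical) tumour-to-blood ratio threshold."""
        return float(np.mean(tbr > threshold))
    ```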

  15. Basinsoft, a computer program to quantify drainage basin characteristics

    USGS Publications Warehouse

    Harvey, Craig A.; Eash, David A.

    2001-01-01

    In 1988, the USGS began developing a program called Basinsoft. The initial program quantified 16 selected drainage basin characteristics from three source-data layers that were manually digitized from topographic maps using the versions of ARC/INFO, Fortran programs, and Prime system Command Programming Language (CPL) programs available in 1988 (Majure and Soenksen, 1991). By 1991, Basinsoft was enhanced to quantify 27 selected drainage-basin characteristics from three source-data layers automatically generated from digital elevation model (DEM) data using a set of Fortran programs (Majure and Eash, 1991; Jenson and Dominique, 1988). Due to edge-matching problems encountered in 1991 with the preprocessing

  16. Diagnosis and characterization of mania: Quantifying increased energy and activity in the human behavioral pattern monitor.

    PubMed

    Perry, William; McIlwain, Meghan; Kloezeman, Karen; Henry, Brook L; Minassian, Arpi

    2016-06-30

    Increased energy or activity is now an essential feature of the mania of Bipolar Disorder (BD) according to DSM-5. This study examined whether objective measures of increased energy can differentiate manic BD individuals and provide greater diagnostic accuracy compared to rating scales, extending the work of previous studies with smaller samples. We also tested the relationship between objective measures of energy and rating scales. 50 hospitalized manic BD patients were compared to healthy subjects (HCS, n=39) in the human Behavioral Pattern Monitor (hBPM) which quantifies motor activity and goal-directed behavior in an environment containing novel stimuli. Archival hBPM data from 17 schizophrenia patients were used in sensitivity and specificity analyses. Manic BD patients exhibited higher motor activity than HCS and higher novel object interactions. hBPM activity measures were not correlated with observer-rated symptoms, and hBPM activity was more sensitive in accurately classifying hospitalized BD subjects than observer ratings. Although the findings can only be generalized to inpatient populations, they suggest that increased energy, particularly specific and goal-directed exploration, is a distinguishing feature of BD mania and is best quantified by objective measures of motor activity. A better understanding is needed of the biological underpinnings of this cardinal feature. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  17. Quantifying cadherin mechanotransduction machinery assembly/disassembly dynamics using fluorescence covariance analysis.

    PubMed

    Vedula, Pavan; Cruz, Lissette A; Gutierrez, Natasha; Davis, Justin; Ayee, Brian; Abramczyk, Rachel; Rodriguez, Alexis J

    2016-06-30

    Quantifying multi-molecular complex assembly in specific cytoplasmic compartments is crucial to understand how cells use assembly/disassembly of these complexes to control function. Currently, biophysical methods like Fluorescence Resonance Energy Transfer and Fluorescence Correlation Spectroscopy provide quantitative measurements of direct protein-protein interactions, while traditional biochemical approaches such as sub-cellular fractionation and immunoprecipitation remain the main approaches used to study multi-protein complex assembly/disassembly dynamics. In this article, we validate and quantify multi-protein adherens junction complex assembly in situ using light microscopy and Fluorescence Covariance Analysis. Utilizing specific fluorescently-labeled protein pairs, we quantified various stages of adherens junction complex assembly, the multiprotein complex regulating epithelial tissue structure and function following de novo cell-cell contact. We demonstrate: minimal cadherin-catenin complex assembly in the perinuclear cytoplasm and subsequent localization to the cell-cell contact zone, assembly of adherens junction complexes, acto-myosin tension-mediated anchoring, and adherens junction maturation following de novo cell-cell contact. Finally applying Fluorescence Covariance Analysis in live cells expressing fluorescently tagged adherens junction complex proteins, we also quantified adherens junction complex assembly dynamics during epithelial monolayer formation.

  18. Road to the Future: Strategies for Wildlife Crossings and Youth Empowerment to Improve Wildlife Habitat in Roaded Landscapes

    ERIC Educational Resources Information Center

    Tanner, Dawn Renee

    2010-01-01

    As the footprint of human society expands upon the earth, habitat loss and landscape fragmentation is an increasing global problem. That problem includes loss of native habitats as these areas are harvested, converted to agricultural crops, and occupied by human settlement. Roads increase human access to previously inaccessible areas, encourage…

  19. Crossing Thresholds: Identifying Conceptual Transitions in Postsecondary Teaching

    ERIC Educational Resources Information Center

    Wilcox, Susan; Leger, Andy B.

    2013-01-01

    In this paper we report on research we conducted to begin the process of identifying threshold concepts in the field of postsecondary teaching. Meyer & Land (2006) propose that within all disciplinary fields there seem to be particular "threshold concepts" that serve as gateways, opening up new and previously inaccessible ways of…

  20. Quantifying utricular stimulation during natural behavior

    PubMed Central

    Rivera, Angela R. V.; Davis, Julian; Grant, Wally; Blob, Richard W.; Peterson, Ellengene; Neiman, Alexander B.; Rowe, Michael

    2012-01-01

    The use of natural stimuli in neurophysiological studies has led to significant insights into the encoding strategies used by sensory neurons. To investigate these encoding strategies in vestibular receptors and neurons, we have developed a method for calculating the stimuli delivered to a vestibular organ, the utricle, during natural (unrestrained) behaviors, using the turtle as our experimental preparation. High-speed digital video sequences are used to calculate the dynamic gravito-inertial (GI) vector acting on the head during behavior. X-ray computed tomography (CT) scans are used to determine the orientation of the otoconial layer (OL) of the utricle within the head, and the calculated GI vectors are then rotated into the plane of the OL. Thus, the method allows us to quantify the spatio-temporal structure of stimuli to the OL during natural behaviors. In the future, these waveforms can be used as stimuli in neurophysiological experiments to understand how natural signals are encoded by vestibular receptors and neurons. We provide one example of the method which shows that turtle feeding behaviors can stimulate the utricle at frequencies higher than those typically used in vestibular studies. This method can be adapted to other species, to other vestibular end organs, and to other methods of quantifying head movements. PMID:22753360
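
    An illustrative sketch of projecting a gravito-inertial vector into the plane of the otoconial layer, as the method above describes. The OL normal, gravity, and acceleration values are hypothetical; the study derives these quantities from high-speed video and CT data.

    ```python
    # Project a gravito-inertial (GI) vector into the otoconial-layer (OL) plane
    # (assumptions: the OL plane is given by a hypothetical unit normal in head
    # coordinates; GI = gravity minus linear acceleration, one common convention).
    import numpy as np

    def project_into_plane(gi_head: np.ndarray, ol_normal: np.ndarray) -> np.ndarray:
        """Remove the component along the OL normal, leaving the in-plane (shear) part."""
        n = ol_normal / np.linalg.norm(ol_normal)
        return gi_head - np.dot(gi_head, n) * n

    # Example: gravity plus a small forward acceleration, OL plane tilted 30 degrees.
    g = np.array([0.0, 0.0, -9.81])
    accel = np.array([0.5, 0.0, 0.0])
    normal = np.array([np.sin(np.radians(30)), 0.0, np.cos(np.radians(30))])
    shear = project_into_plane(g - accel, normal)
    ```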

  1. Land Use in LCA: Including Regionally Altered Precipitation to Quantify Ecosystem Damage.

    PubMed

    Lathuillière, Michael J; Bulle, Cécile; Johnson, Mark S

    2016-11-01

    The incorporation of soil moisture regenerated by precipitation, or green water, into life cycle assessment has been of growing interest given the global importance of this resource for terrestrial ecosystems and food production. This paper proposes a new impact assessment model to relate land and water use in seasonally dry, semiarid, and arid regions where precipitation and evapotranspiration are closely coupled. We introduce the Precipitation Reduction Potential midpoint impact representing the change in downwind precipitation as a result of a land transformation and occupation activity. Then, our end-point impact model quantifies terrestrial ecosystem damage as a function of precipitation loss using a relationship between woody plant species richness, water and energy regimes. We then apply the midpoint and end-point models to the production of soybean in Southeastern Amazonia which has resulted from the expansion of cropland into tropical forest, with noted effects on local precipitation. Our proposed cause-effect chain represents a complementary approach to previous contributions which have focused on water consumption impacts and/or have represented evapotranspiration as a loss to the water cycle.

  2. Quantifying fall migration of Ross's gulls (Rhodostethia rosea) past Point Barrow, Alaska

    USGS Publications Warehouse

    Uher-Koch, Brian D.; Davis, Shanti E.; Maftei, Mark; Gesmundo, Callie; Suydam, R.S.; Mallory, Mark L.

    2014-01-01

    The Ross's gull (Rhodostethia rosea) is a poorly known seabird of the circumpolar Arctic. The only place in the world where Ross's gulls are known to congregate is in the near-shore waters around Point Barrow, Alaska where they undertake an annual passage in late fall. Ross's gulls seen at Point Barrow are presumed to originate from nesting colonies in Siberia, but neither their origin nor their destination has been confirmed. Current estimates of the global population of Ross's gulls are based largely on expert opinion, and the only reliable population estimate is derived from extrapolations from previous counts conducted at Point Barrow, but these data are now over 25 years old. In order to update and clarify the status of this species in Alaska, our study quantified the timing, number, and flight direction of Ross's gulls passing Point Barrow in 2011. We recorded up to two-thirds of the estimated global population of Ross's gulls (≥ 27,000 individuals) over 39 days with numbers peaking on 16 October when we observed over 7,000 birds during a three-hour period.

  3. Visualizing and quantifying protein secretion using a Renilla luciferase-GFP fusion protein.

    PubMed

    Liu, J; Wang, Y; Szalay, A A; Escher, A

    2000-01-01

    We have shown previously that an engineered form of Renilla luciferase (SRUC) can be secreted as a functional enzyme by mammalian cells, and that fusing wild-type Renilla luciferase with the green fluorescent protein from Aequorea victoria (GFP) yields a chimeric protein retaining light-emission properties similar to that of unfused Renilla luciferase and GFP. In the work presented here, SRUC was fused with GFP to determine whether it could be used to both visualize and quantify protein secretion in mammalian cells. Simian COS-7 and Chinese hamster ovary (CHO) cells were transiently transfected with gene constructs encoding a secreted or an intracellular version of a Renilla luciferase-GFP fusion protein. Renilla luciferase activity was measured from COS-7 cell lysates and culture media, and GFP activity was detected in CHO cells using fluorescence microscopy. Data indicated that the SRUC-GFP fusion protein was secreted as a chimeric protein that had both Renilla luciferase and GFP activity. This fusion protein could be a useful marker for the study of protein secretion in mammalian cells. Copyright 2000 John Wiley & Sons, Ltd.

  4. Working Memory Mechanism in Proportional Quantifier Verification

    ERIC Educational Resources Information Center

    Zajenkowski, Marcin; Szymanik, Jakub; Garraffa, Maria

    2014-01-01

    The paper explores the cognitive mechanisms involved in the verification of sentences with proportional quantifiers (e.g. "More than half of the dots are blue"). The first study shows that the verification of proportional sentences is more demanding than the verification of sentences such as: "There are seven blue and eight yellow…

  5. Quantifying why urea is a protein denaturant, whereas glycine betaine is a protein stabilizer

    PubMed Central

    Guinn, Emily J.; Pegram, Laurel M.; Capp, Michael W.; Pollock, Michelle N.; Record, M. Thomas

    2011-01-01

    To explain the large, opposite effects of urea and glycine betaine (GB) on stability of folded proteins and protein complexes, we quantify and interpret preferential interactions of urea with 45 model compounds displaying protein functional groups and compare with a previous analysis of GB. This information is needed to use urea as a probe of coupled folding in protein processes and to tune molecular dynamics force fields. Preferential interactions between urea and model compounds relative to their interactions with water are determined by osmometry or solubility and dissected using a unique coarse-grained analysis to obtain interaction potentials quantifying the interaction of urea with each significant type of protein surface (aliphatic, aromatic hydrocarbon (C); polar and charged N and O). Microscopic local-bulk partition coefficients Kp for the accumulation or exclusion of urea in the water of hydration of these surfaces relative to bulk water are obtained. Kp values reveal that urea accumulates moderately at amide O and weakly at aliphatic C, whereas GB is excluded from both. These results provide both thermodynamic and molecular explanations for the opposite effects of urea and glycine betaine on protein stability, as well as deductions about strengths of amide NH—amide O and amide NH—amide N hydrogen bonds relative to hydrogen bonds to water. Interestingly, urea, like GB, is moderately accumulated at aromatic C surface. Urea m-values for protein folding and other protein processes are quantitatively interpreted and predicted using these urea interaction potentials or Kp values. PMID:21930943

  6. Quantifying why urea is a protein denaturant, whereas glycine betaine is a protein stabilizer.

    PubMed

    Guinn, Emily J; Pegram, Laurel M; Capp, Michael W; Pollock, Michelle N; Record, M Thomas

    2011-10-11

    To explain the large, opposite effects of urea and glycine betaine (GB) on stability of folded proteins and protein complexes, we quantify and interpret preferential interactions of urea with 45 model compounds displaying protein functional groups and compare with a previous analysis of GB. This information is needed to use urea as a probe of coupled folding in protein processes and to tune molecular dynamics force fields. Preferential interactions between urea and model compounds relative to their interactions with water are determined by osmometry or solubility and dissected using a unique coarse-grained analysis to obtain interaction potentials quantifying the interaction of urea with each significant type of protein surface (aliphatic, aromatic hydrocarbon (C); polar and charged N and O). Microscopic local-bulk partition coefficients K(p) for the accumulation or exclusion of urea in the water of hydration of these surfaces relative to bulk water are obtained. K(p) values reveal that urea accumulates moderately at amide O and weakly at aliphatic C, whereas GB is excluded from both. These results provide both thermodynamic and molecular explanations for the opposite effects of urea and glycine betaine on protein stability, as well as deductions about strengths of amide NH--amide O and amide NH--amide N hydrogen bonds relative to hydrogen bonds to water. Interestingly, urea, like GB, is moderately accumulated at aromatic C surface. Urea m-values for protein folding and other protein processes are quantitatively interpreted and predicted using these urea interaction potentials or K(p) values.

  7. An attempt to quantify aerosol-cloud effects in fields of precipitating trade wind cumuli

    NASA Astrophysics Data System (ADS)

    Seifert, Axel; Heus, Thijs

    2015-04-01

    Aerosol indirect effects are notoriously difficult to understand and quantify. Using large-eddy simulations (LES) we attempt to quantify the impact of aerosols on the albedo and the precipitation formation in trade wind cumulus clouds. Having performed a set of large-domain Giga-LES runs we are able to capture the mesoscale self-organization of the cloud field. Our simulations show that self-organization is intrinsically tied to precipitation formation in this cloud regime, making previous studies that did not consider cloud organization questionable. We find that aerosols, here modeled just as a perturbation in cloud droplet number concentration, have a significant impact on the transient behavior, i.e., how fast rain is formed and self-organization of the cloud field takes place. However, for longer integration times, all simulations approach the same radiative-convective equilibrium and aerosol effects become small. The sensitivity to aerosols becomes even smaller when we include explicit cloud-radiation interaction as this leads to a much faster and more vigorous response of the cloud layer. Overall we find that aerosol-cloud interactions, like cloud lifetime effects etc., are small or even negative when the cloud field is close to equilibrium. Consequently, the Twomey effect already provides an upper bound on the albedo effects of aerosol perturbations. Our analysis also highlights that current parameterizations that predict only the grid-box mean of the cloud field and do not take into account cloud organization are not able to describe aerosol indirect effects correctly, but overestimate them due to the lack of cloud dynamical and mesoscale buffering.

  8. Quantifying sources of methane using light alkanes in the Los Angeles basin, California

    NASA Astrophysics Data System (ADS)

    Peischl, J.; Ryerson, T. B.; Brioude, J.; Aikin, K. C.; Andrews, A. E.; Atlas, E.; Blake, D.; Daube, B. C.; de Gouw, J. A.; Dlugokencky, E.; Frost, G. J.; Gentner, D. R.; Gilman, J. B.; Goldstein, A. H.; Harley, R. A.; Holloway, J. S.; Kofler, J.; Kuster, W. C.; Lang, P. M.; Novelli, P. C.; Santoni, G. W.; Trainer, M.; Wofsy, S. C.; Parrish, D. D.

    2013-05-01

    Methane (CH4), carbon dioxide (CO2), carbon monoxide (CO), and C2-C5 alkanes were measured throughout the Los Angeles (L.A.) basin in May and June 2010. We use these data to show that the emission ratios of CH4/CO and CH4/CO2 in the L.A. basin are larger than expected from population-apportioned bottom-up state inventories, consistent with previously published work. We use experimentally determined CH4/CO and CH4/CO2 emission ratios in combination with annual State of California CO and CO2 inventories to derive a yearly emission rate of CH4 to the L.A. basin. We further use the airborne measurements to directly derive CH4 emission rates from dairy operations in Chino, and from the two largest landfills in the L.A. basin, and show these sources are accurately represented in the California Air Resources Board greenhouse gas inventory for CH4. We then use measurements of C2-C5 alkanes to quantify the relative contribution of other CH4 sources in the L.A. basin, with results differing from those of previous studies. The atmospheric data are consistent with the majority of CH4 emissions in the region coming from fugitive losses from natural gas in pipelines and urban distribution systems and/or geologic seeps, as well as landfills and dairies. The local oil and gas industry also provides a significant source of CH4 in the area. The addition of CH4 emissions from natural gas pipelines and urban distribution systems and/or geologic seeps and from the local oil and gas industry is sufficient to account for the differences between the top-down and bottom-up CH4 inventories identified in previously published work.
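
    A worked sketch of the emission-ratio scaling logic described above. The CH4/CO molar ratio and annual CO inventory below are placeholder numbers, not values from the study; only the general approach (measured ratio times annual inventory, with a molar-to-mass conversion) follows the text.

    ```python
    # Scale an annual CO inventory by a measured CH4/CO enhancement ratio
    # (assumptions: the ratio and inventory are hypothetical placeholders).
    M_CH4 = 16.04   # g/mol
    M_CO = 28.01    # g/mol

    ch4_to_co_molar_ratio = 0.6        # hypothetical ambient enhancement ratio (mol/mol)
    annual_co_emissions_gg = 500.0     # hypothetical basin CO inventory, Gg CO per year

    # Convert the molar ratio to a mass basis, then scale the CO inventory.
    annual_ch4_emissions_gg = annual_co_emissions_gg * ch4_to_co_molar_ratio * (M_CH4 / M_CO)
    print(f"Estimated CH4 emissions: {annual_ch4_emissions_gg:.0f} Gg/yr")
    ```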

  9. The Urban Forest Effects (UFORE) model: quantifying urban forest structure and functions

    Treesearch

    David J. Nowak; Daniel E. Crane

    2000-01-01

    The Urban Forest Effects (UFORE) computer model was developed to help managers and researchers quantify urban forest structure and functions. The model quantifies species composition and diversity, diameter distribution, tree density and health, leaf area, leaf biomass, and other structural characteristics; hourly volatile organic compound emissions (emissions that...

  10. The early postpartum experience of previously infertile mothers.

    PubMed

    Ladores, Sigrid; Aroian, Karen

    2015-01-01

    To explore the lived experience of becoming a new mother from the unique perspectives of previously infertile women. A descriptive phenomenological design was used to extract the fundamental structure of the postpartum experience of previously infertile mothers. Central Florida. Twelve first-time, previously infertile mothers age 27 to 43 years. Face-to-face interviews were conducted twice with each participant. Recorded interviews were transcribed verbatim and analyzed using Colaizzi's approach. Two main themes emerged that described the early postpartum experience of first-time, previously infertile mothers: (a) lingering identity as infertile and (b) gratitude for the gift of motherhood. Participants reported that their lingering identities as infertile and immense gratitude for the gift of motherhood propelled them to establish unrealistic expectations to be perfect mothers. When they were unable to live up this expectation, they censored their feelings of inadequacy, guilt, and shame. Findings from this study may help to sensitize health care providers to the difficulties faced by previously infertile women during their transition to motherhood. © 2015 AWHONN, the Association of Women's Health, Obstetric and Neonatal Nurses.

  11. Quantifying Modern Recharge to the Nubian Sandstone Aquifer System: Inferences from GRACE and Land Surface Models

    NASA Astrophysics Data System (ADS)

    Mohamed, A.; Sultan, M.; Ahmed, M.; Yan, E.

    2014-12-01

    The Nubian Sandstone Aquifer System (NSAS) is shared by Egypt, Libya, Chad, and Sudan and is one of the largest (area: ~2 × 10⁶ km²) groundwater systems in the world. Despite its importance to the population of these countries, major hydrological parameters such as modern recharge and extraction rates remain poorly investigated given: (1) the large extent of the NSAS, (2) the absence of comprehensive monitoring networks, (3) the general inaccessibility of many of the NSAS regions, (4) difficulties in collecting background information, largely included in unpublished governmental reports, and (5) limited local funding to support the construction of monitoring networks and/or collection of field and background datasets. Data from monthly Gravity Recovery and Climate Experiment (GRACE) gravity solutions were processed (Gaussian smoothed: 100 km; rescaled) and used to quantify the modern recharge to the NSAS during the period from January 2003 to December 2012. To isolate the groundwater component in GRACE data, the soil moisture and river channel storages were removed using the outputs from the most recent Community Land Model version 4.5 (CLM4.5). GRACE-derived recharge calculations were performed over the southern NSAS outcrops (area: 835 × 10³ km²) in Sudan and Chad that receive average annual precipitation of 65 km³ (77.5 mm). GRACE-derived recharge rates were estimated at 2.79 ± 0.98 km³/yr (3.34 ± 1.17 mm/yr). If we take into account the total annual extraction rates (~0.4 km³; CEDARE, 2002) from Chad and Sudan, the average annual recharge rate for the NSAS could reach up to ~3.20 ± 1.18 km³/yr (3.84 ± 1.42 mm/yr). Our recharge rate estimates are similar to those calculated using (1) groundwater flow modelling in the Central Sudan Rift Basins (4-8 mm/yr; Abdalla, 2008), (2) the WaterGAP global-scale groundwater recharge model (< 5 mm/yr, Döll and Fiedler, 2008), and (3) chloride tracer in Sudan (3.05 mm/yr; Edmunds et al. 1988). Given the available global
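
    A worked example of the unit conversions behind the recharge estimate above, using only figures quoted in the abstract (2.79 km³/yr of GRACE-derived storage gain, ~0.4 km³/yr of extraction, and an outcrop area of 835 × 10³ km²); small differences from the quoted 3.20 km³/yr and 3.84 mm/yr reflect rounding.

    ```python
    # Convert the GRACE-derived volumetric recharge into an equivalent depth
    # over the recharge-receiving outcrop area (figures from the abstract).
    outcrop_area_km2 = 835e3
    grace_storage_gain_km3 = 2.79        # net groundwater storage increase, km^3/yr
    extraction_km3 = 0.4                 # reported annual abstraction, km^3/yr

    recharge_km3 = grace_storage_gain_km3 + extraction_km3            # total recharge, km^3/yr
    recharge_mm = recharge_km3 * 1e9 / (outcrop_area_km2 * 1e6) * 1e3  # km^3 -> m^3, km^2 -> m^2, m -> mm
    print(f"Recharge: {recharge_km3:.2f} km^3/yr ({recharge_mm:.2f} mm/yr)")
    ```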

  12. Quantifying Safety Performance of Driveways on State Highways

    DOT National Transportation Integrated Search

    2012-08-01

    This report documents a research effort to quantify the safety performance of driveways in the State of Oregon. In particular, this research effort focuses on driveways located adjacent to principal arterial state highways with urban or rural des...

  13. No Previous Public Services Required

    ERIC Educational Resources Information Center

    Taylor, Kelley R.

    2009-01-01

    In 2007, the Supreme Court heard a case that involved the question of whether a school district could be required to reimburse parents who unilaterally placed their child in private school when the child had not previously received special education and related services in a public institution ("Board of Education v. Tom F."). The…

  14. Properties and relative measure for quantifying quantum synchronization

    NASA Astrophysics Data System (ADS)

    Li, Wenlin; Zhang, Wenzhao; Li, Chong; Song, Heshan

    2017-07-01

    Although quantum synchronization phenomena and corresponding measures have been widely discussed recently, it is still an open question how to characterize directly the influence of nonlocal correlation, which is the key distinction for identifying classical and quantum synchronizations. In this paper, we present basic postulates for quantifying quantum synchronization based on the related theory in Mari's work [Phys. Rev. Lett. 111, 103605 (2013), 10.1103/PhysRevLett.111.103605], and we give a general formula of a quantum synchronization measure with clear physical interpretations. By introducing Pearson's parameter, we show that the obvious characteristics of our measure are the relativity and monotonicity. As an example, the measure is applied to describe synchronization among quantum optomechanical systems under a Markovian bath. We also show the potential by quantifying generalized synchronization and discrete variable synchronization with this measure.

  15. Quantifying temporal change in biodiversity: challenges and opportunities

    PubMed Central

    Dornelas, Maria; Magurran, Anne E.; Buckland, Stephen T.; Chao, Anne; Chazdon, Robin L.; Colwell, Robert K.; Curtis, Tom; Gaston, Kevin J.; Gotelli, Nicholas J.; Kosnik, Matthew A.; McGill, Brian; McCune, Jenny L.; Morlon, Hélène; Mumby, Peter J.; Øvreås, Lise; Studeny, Angelika; Vellend, Mark

    2013-01-01

    Growing concern about biodiversity loss underscores the need to quantify and understand temporal change. Here, we review the opportunities presented by biodiversity time series, and address three related issues: (i) recognizing the characteristics of temporal data; (ii) selecting appropriate statistical procedures for analysing temporal data; and (iii) inferring and forecasting biodiversity change. With regard to the first issue, we draw attention to defining characteristics of biodiversity time series—lack of physical boundaries, uni-dimensionality, autocorrelation and directionality—that inform the choice of analytic methods. Second, we explore methods of quantifying change in biodiversity at different timescales, noting that autocorrelation can be viewed as a feature that sheds light on the underlying structure of temporal change. Finally, we address the transition from inferring to forecasting biodiversity change, highlighting potential pitfalls associated with phase-shifts and novel conditions. PMID:23097514

  16. Quantifying Tropical Glacier Mass Balance Sensitivity to Climate Change Through Regional-Scale Modeling and The Randolph Glacier Inventory

    NASA Astrophysics Data System (ADS)

    Malone, A.

    2017-12-01

    Quantifying mass balance sensitivity to climate change is essential for forecasting glacier evolution and deciphering climate signals embedded in archives of past glacier changes. Ideally, these quantifications result from decades of field measurement, remote sensing, and a hierarchical modeling approach, but in data-sparse regions, such as the Himalayas and tropical Andes, regional-scale modeling rooted in first principles provides a first-order picture. Previous regional-scale modeling studies have applied a surface energy and mass balance approach in order to quantify equilibrium line altitude sensitivity to climate change. In this study, an expanded regional-scale surface energy and mass balance model is implemented to quantify glacier-wide mass balance sensitivity to climate change for tropical Andean glaciers. Data from the Randolph Glacier Inventory are incorporated, and additional physical processes are included, such as a dynamic albedo and cloud-dependent atmospheric emissivity. The model output agrees well with the limited mass balance records for tropical Andean glaciers. The dominant climate variables driving interannual mass balance variability differ depending on the climate setting. For wet tropical glaciers (annual precipitation >0.75 m y⁻¹), temperature is the dominant climate variable. Different hypotheses for the processes linking wet tropical glacier mass balance variability to temperature are evaluated. The results support the hypothesis that glacier-wide mass balance on wet tropical glaciers is largely dominated by processes at the lowest elevation where temperature plays a leading role in energy exchanges. This research also highlights the transient nature of wet tropical glaciers - the vast majority of tropical glaciers and a vital regional water resource - in an anthropogenic warming world.

  17. Quantifying and tuning entanglement for quantum systems

    NASA Astrophysics Data System (ADS)

    Xu, Qing

    A 2D Ising model with transverse field on a triangular lattice is studied using exact diagonalization. The quantum entanglement of the system is quantified by the entanglement of formation. The ground state property of the system is studied, and the quantified entanglement is shown to be closely related to the ground state wavefunction, while the singularity in the entanglement as a function of the transverse field is a reasonable indicator of the quantum phase transition. In order to tune the entanglement, one can either include an impurity of tunable strength in the otherwise homogeneous system, or one can vary the external transverse field as a tuner. The latter kind of tuning involves complicated dynamical properties of the system. From the study of the dynamics on a comparatively smaller system, we provide ways to tune the entanglement without triggering any decoherence. The finite temperature effect is also discussed. Besides the physical results above, the realization of the trace-minimization method in our system is provided; the scalability of such a method to larger systems is argued.
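
    For reference, a sketch of the standard two-qubit entanglement of formation via Wootters' concurrence, the quantity such studies typically evaluate on reduced two-site density matrices; the work above obtains its states from exact diagonalization of the lattice model, and the input matrix here is hypothetical.

    ```python
    # Two-qubit entanglement of formation from Wootters' concurrence (assumption:
    # `rho` is a hypothetical 4x4 two-qubit density matrix in the computational basis).
    import numpy as np

    def entanglement_of_formation(rho: np.ndarray) -> float:
        sy = np.array([[0, -1j], [1j, 0]])
        flip = np.kron(sy, sy)
        rho_tilde = flip @ rho.conj() @ flip
        eigvals = np.sort(np.real(np.linalg.eigvals(rho @ rho_tilde)))[::-1]
        lam = np.sqrt(np.clip(eigvals, 0.0, None))
        c = max(0.0, lam[0] - lam[1] - lam[2] - lam[3])      # concurrence
        if c == 0.0:
            return 0.0
        x = (1.0 + np.sqrt(1.0 - c ** 2)) / 2.0
        return -x * np.log2(x) - (1.0 - x) * np.log2(1.0 - x)  # binary entropy h(x)
    ```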

  18. Quantifying Scheduling Challenges for Exascale System Software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mondragon, Oscar; Bridges, Patrick G.; Jones, Terry R

    2015-01-01

    The move towards high-performance computing (HPC) applications comprised of coupled codes and the need to dramatically reduce data movement is leading to a reexamination of time-sharing vs. space-sharing in HPC systems. In this paper, we discuss and begin to quantify the performance impact of a move away from strict space-sharing of nodes for HPC applications. Specifically, we examine the potential performance cost of time-sharing nodes between application components, we determine whether a simple coordinated scheduling mechanism can address these problems, and we research how suitable simple constraint-based optimization techniques are for solving scheduling challenges in this regime. Our results demonstrate that current general-purpose HPC system software scheduling and resource allocation systems are subject to significant performance deficiencies which we quantify for six representative applications. Based on these results, we discuss areas in which additional research is needed to meet the scheduling challenges of next-generation HPC systems.

  19. Quantifying site-specific physical heterogeneity within an estuarine seascape

    USGS Publications Warehouse

    Kennedy, Cristina G.; Mather, Martha E.; Smith, Joseph M.

    2017-01-01

    Quantifying physical heterogeneity is essential for meaningful ecological research and effective resource management. Spatial patterns of multiple, co-occurring physical features are rarely quantified across a seascape because of methodological challenges. Here, we identified approaches that measured total site-specific heterogeneity, an often overlooked aspect of estuarine ecosystems. Specifically, we examined 23 metrics that quantified four types of common physical features: (1) river and creek confluences, (2) bathymetric variation including underwater drop-offs, (3) land features such as islands/sandbars, and (4) major underwater channel networks. Our research at 40 sites throughout Plum Island Estuary (PIE) provided solutions to two problems. The first problem was that individual metrics that measured heterogeneity of a single physical feature showed different regional patterns. We solved this first problem by combining multiple metrics for a single feature using a within-physical feature cluster analysis. With this approach, we identified sites with four different types of confluences and three different types of underwater drop-offs. The second problem was that when multiple physical features co-occurred, new patterns of total site-specific heterogeneity were created across the seascape. This pattern of total heterogeneity has potential ecological relevance to structure-oriented predators. To address this second problem, we identified sites with similar types of total physical heterogeneity using an across-physical feature cluster analysis. Then, we calculated an additive heterogeneity index, which integrated all physical features at a site. Finally, we tested if site-specific additive heterogeneity index values differed for across-physical feature clusters. In PIE, the sites with the highest additive heterogeneity index values were clustered together and corresponded to sites where a fish predator, adult striped bass (Morone saxatilis), aggregated in a
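
    A minimal sketch of one way to build an additive heterogeneity index from site-level metrics, assuming a hypothetical `site_metrics` table. Z-scoring and a plain sum are illustrative choices; the authors' exact formulation may differ.

    ```python
    # Combine multiple physical-feature metrics into a single site-level index
    # (assumption: `site_metrics` is a hypothetical DataFrame, one row per site,
    # one column per metric such as confluence count or drop-off relief).
    import pandas as pd

    def additive_heterogeneity_index(site_metrics: pd.DataFrame) -> pd.Series:
        """Standardize each metric across sites and sum them into a single score."""
        z = (site_metrics - site_metrics.mean()) / site_metrics.std(ddof=0)
        return z.sum(axis=1).rename("additive_heterogeneity_index")
    ```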

  20. Oxygen self-diffusion mechanisms in monoclinic ZrO2 revealed and quantified by density functional theory, random walk analysis, and kinetic Monte Carlo calculations

    NASA Astrophysics Data System (ADS)

    Yang, Jing; Youssef, Mostafa; Yildiz, Bilge

    2018-01-01

    In this work, we quantify oxygen self-diffusion in monoclinic-phase zirconium oxide as a function of temperature and oxygen partial pressure. The migration barrier of each type of oxygen defect was obtained by first-principles calculations. Random walk theory was used to quantify the diffusivities of oxygen interstitials by using the calculated migration barriers. Kinetic Monte Carlo simulations were used to calculate diffusivities of oxygen vacancies by distinguishing the threefold- and fourfold-coordinated lattice oxygen. By combining the equilibrium defect concentrations obtained in our previous work together with the herein calculated diffusivity of each defect species, we present the resulting oxygen self-diffusion coefficients and the corresponding atomistically resolved transport mechanisms. The predicted effective migration barriers and diffusion prefactors are in reasonable agreement with the experimentally reported values. This work provides insights into oxygen diffusion engineering in ZrO2-related devices and parametrization for continuum transport modeling.
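
    A sketch of the random-walk diffusivity estimate that this kind of analysis builds on, D = f a^2 nu exp(-Em/kT)/(2d). The jump distance, attempt frequency, barrier, and correlation factor below are illustrative placeholders, not values from the paper.

    ```python
    # Random-walk diffusivity from a first-principles migration barrier
    # (assumptions: all numerical parameters below are illustrative).
    import numpy as np

    K_B = 8.617333e-5   # Boltzmann constant, eV/K

    def hop_diffusivity(jump_distance_m: float, attempt_freq_hz: float,
                        barrier_ev: float, temperature_k: float,
                        correlation_factor: float = 1.0, dimensionality: int = 3) -> float:
        """Random-walk diffusivity D = f * a^2 * nu * exp(-Em / kT) / (2d), in m^2/s."""
        rate = attempt_freq_hz * np.exp(-barrier_ev / (K_B * temperature_k))
        return correlation_factor * jump_distance_m**2 * rate / (2 * dimensionality)

    # Example: ~2.6 Angstrom hop, 1e13 Hz attempt frequency, 1.0 eV barrier, 1500 K.
    d_oxygen = hop_diffusivity(2.6e-10, 1e13, 1.0, 1500.0)
    ```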

  1. Quantifying aquatic invasion patterns through space and time

    EPA Science Inventory

    The objective of my study was to quantify the apparent spatio-temporal relationship between anthropogenic introduction pathway intensity and non-native aquatic species presence throughout the Laurentian Great Lakes. Non-native aquatic species early detection programs are based pr...

  2. Determining root correspondence between previously and newly detected objects

    DOEpatents

    Paglieroni, David W.; Beer, N Reginald

    2014-06-17

    A system that applies attribute and topology based change detection to networks of objects that were detected on previous scans of a structure, roadway, or area of interest. The attributes capture properties or characteristics of the previously detected objects, such as location, time of detection, size, elongation, orientation, etc. The topology of the network of previously detected objects is maintained in a constellation database that stores attributes of previously detected objects and implicitly captures the geometrical structure of the network. A change detection system detects change by comparing the attributes and topology of new objects detected on the latest scan to the constellation database of previously detected objects.

  3. Quantifying edge effect extent and its impacts on carbon stocks across a degraded landscape in the Amazon using airborne lidar.

    NASA Astrophysics Data System (ADS)

    dos-Santos, M. N.; Keller, M.; Morton, D. C.; Longo, M.; Scaranello, M. A., Sr.; Pinagé, E. R.; Correa Pabon, R.

    2017-12-01

    Ongoing tropical forest degradation and forest fragmentation increase forest edge area. Forest edges experience hotter, drier, and windier conditions and greater exposure to fires compared to interior areas, which elevate rates of tree mortality. Previous studies have suggested that forests within 100 m of the edge may lose 36% of biomass during the first two decades following fragmentation, although such estimates are based on a limited number of experimental plots. Degraded forests behave differently from intact forests, and quantifying edge effect extent in a degraded forest landscape is more challenging than in experimental studies. To overcome these limitations, we used airborne lidar data to quantify changes in forest structure near 91 edges in a heavily degraded tropical forest in Paragominas Municipality, eastern Brazilian Amazon. Paragominas was a center of timber production in the 1990s. Today, the landscape is a mosaic of different agricultural uses and degraded, secondary, and unmanaged forests. A total of 3000 ha of high density (mean density of 17.9 points/m2) lidar data were acquired in August/September 2013 and June/July 2014 over 30 transects (200 x 5000 m), systematically distributed over the study area, using the Optech Orion M-200 laser scanning system. We adopted lidar-measured forest heights as the edge effect criterion and found that the mean extent of the edge effect was highly variable across degraded forests (150 ± 354 m) and secondary forest fragments (265 ± 365 m). We related the extent of forest edges to the historical disturbances identified in Landsat imagery since 1984. Contrary to previous studies, we found that carbon stocks along forest edges were not significantly lower than forest core biomass when edges were defined by the previously estimated ranges of 100 and 300 m. In frontier forests, the ecological edge effect may be masked by the cumulative impact of historic forest degradation - an anthropogenic edge effect that extends beyond the


  4. Quantifying redox-induced Schottky barrier variations in memristive devices via in operando spectromicroscopy with graphene electrodes

    PubMed Central

    Baeumer, Christoph; Schmitz, Christoph; Marchewka, Astrid; Mueller, David N.; Valenta, Richard; Hackl, Johanna; Raab, Nicolas; Rogers, Steven P.; Khan, M. Imtiaz; Nemsak, Slavomir; Shim, Moonsub; Menzel, Stephan; Schneider, Claus Michael; Waser, Rainer; Dittmann, Regina

    2016-01-01

    The continuing revolutionary success of mobile computing and smart devices calls for the development of novel, cost- and energy-efficient memories. Resistive switching is attractive because of, inter alia, increased switching speed and device density. On electrical stimulus, complex nanoscale redox processes are suspected to induce a resistance change in memristive devices. Quantitative information about these processes, which has been experimentally inaccessible so far, is essential for further advances. Here we use in operando spectromicroscopy to verify that redox reactions drive the resistance change. A remarkable agreement between experimental quantification of the redox state and device simulation reveals that changes in donor concentration by a factor of 2–3 at electrode-oxide interfaces cause a modulation of the effective Schottky barrier and lead to >2 orders of magnitude change in device resistance. These findings allow realistic device simulations, opening a route to less empirical and more predictive design of future memory cells. PMID:27539213
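
    The scale of the reported resistance change can be rationalized with a thermionic-emission picture, in which the low-bias resistance grows roughly as exp(q * dPhi_B / kB T) with the effective barrier height. The numbers below are illustrative only; the paper's device simulations are far more detailed.

    # Back-of-envelope: in a thermionic-emission picture the low-bias
    # resistance scales roughly as exp(q*dPhi_B / (kB*T)), so a barrier
    # shift of ~0.1-0.15 eV already spans >2 orders of magnitude.
    # Values are illustrative, not taken from the paper.
    import math

    KT_EV = 0.02585                        # kB*T at 300 K, in eV

    def resistance_ratio(barrier_shift_ev):
        return math.exp(barrier_shift_ev / KT_EV)

    for dphi in (0.05, 0.10, 0.15):
        print(f"barrier shift {dphi:.2f} eV -> resistance ratio ~{resistance_ratio(dphi):.0f}")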

  5. Quantifying redox-induced Schottky barrier variations in memristive devices via in operando spectromicroscopy with graphene electrodes.

    PubMed

    Baeumer, Christoph; Schmitz, Christoph; Marchewka, Astrid; Mueller, David N; Valenta, Richard; Hackl, Johanna; Raab, Nicolas; Rogers, Steven P; Khan, M Imtiaz; Nemsak, Slavomir; Shim, Moonsub; Menzel, Stephan; Schneider, Claus Michael; Waser, Rainer; Dittmann, Regina

    2016-08-19

    The continuing revolutionary success of mobile computing and smart devices calls for the development of novel, cost- and energy-efficient memories. Resistive switching is attractive because of, inter alia, increased switching speed and device density. On electrical stimulus, complex nanoscale redox processes are suspected to induce a resistance change in memristive devices. Quantitative information about these processes, which has been experimentally inaccessible so far, is essential for further advances. Here we use in operando spectromicroscopy to verify that redox reactions drive the resistance change. A remarkable agreement between experimental quantification of the redox state and device simulation reveals that changes in donor concentration by a factor of 2-3 at electrode-oxide interfaces cause a modulation of the effective Schottky barrier and lead to >2 orders of magnitude change in device resistance. These findings allow realistic device simulations, opening a route to less empirical and more predictive design of future memory cells.

  6. Quantifying redox-induced Schottky barrier variations in memristive devices via in operando spectromicroscopy with graphene electrodes

    NASA Astrophysics Data System (ADS)

    Baeumer, Christoph; Schmitz, Christoph; Marchewka, Astrid; Mueller, David N.; Valenta, Richard; Hackl, Johanna; Raab, Nicolas; Rogers, Steven P.; Khan, M. Imtiaz; Nemsak, Slavomir; Shim, Moonsub; Menzel, Stephan; Schneider, Claus Michael; Waser, Rainer; Dittmann, Regina

    2016-08-01

    The continuing revolutionary success of mobile computing and smart devices calls for the development of novel, cost- and energy-efficient memories. Resistive switching is attractive because of, inter alia, increased switching speed and device density. On electrical stimulus, complex nanoscale redox processes are suspected to induce a resistance change in memristive devices. Quantitative information about these processes, which has been experimentally inaccessible so far, is essential for further advances. Here we use in operando spectromicroscopy to verify that redox reactions drive the resistance change. A remarkable agreement between experimental quantification of the redox state and device simulation reveals that changes in donor concentration by a factor of 2-3 at electrode-oxide interfaces cause a modulation of the effective Schottky barrier and lead to >2 orders of magnitude change in device resistance. These findings allow realistic device simulations, opening a route to less empirical and more predictive design of future memory cells.

  7. Cone Penetration Testing, a new approach to quantify coastal-deltaic land subsidence by peat consolidation

    NASA Astrophysics Data System (ADS)

    Koster, Kay; Erkens, Gilles; Zwanenburg, Cor

    2016-04-01

    accountable for 20 - 75 % of the volume loss, and 0.4 - 0.7 MPa for 75 - 95 % volume loss. This indicates that a large amount of volume must be lost through the dissipation of water before peat shows a serious increase in strength, which then continues to increase with only small additional volume losses. To demonstrate the robustness of our approach to the international field of land subsidence, we applied the obtained empirical relations to previously published CPT logs from the peat-rich San Joaquin-Sacramento delta and the Kalimantan peatlands, and found volume losses that correspond with previously published results. Furthermore, we used the obtained results to predict the maximum surface lowering by consolidation for these areas. In conclusion, these promising results, together with the worldwide popularity of CPT and the large datasets it yields, open the door for CPT as a generic method to contribute to quantifying the imminent threat of coastal-deltaic land subsidence.

  8. The Canon, the Web, and the Long Tail

    ERIC Educational Resources Information Center

    Sanderhoff, Merete

    2017-01-01

    This article argues that releasing images of artworks into the public domain creates a new possibility for the public to challenge the canon or create their own, based on access to previously inaccessible images. Through the dissemination of openly licensed artworks across the Internet, museums can support the public in expanding their engagement…

  9. Quantifiable Lateral Flow Assay Test Strips

    NASA Technical Reports Server (NTRS)

    2003-01-01

    As easy to read as a home pregnancy test, three Quantifiable Lateral Flow Assay (QLFA) strips used to test water for E. coli show different results. The brightly glowing control line on the far right of each strip indicates that all three tests ran successfully. But the glowing test lines on the middle and bottom strips reveal that their samples were contaminated with E. coli bacteria at two different concentrations. The color intensity correlates with the concentration of contamination.

  10. Quantifying camouflage: how to predict detectability from appearance.

    PubMed

    Troscianko, Jolyon; Skelhorn, John; Stevens, Martin

    2017-01-06

    Quantifying the conspicuousness of objects against particular backgrounds is key to understanding the evolution and adaptive value of animal coloration, and to designing effective camouflage. Quantifying detectability can reveal how colour patterns affect survival, how animals' appearances influence habitat preferences, and how receiver visual systems work. Advances in calibrated digital imaging are enabling the capture of objective visual information, but it remains unclear which methods are best for measuring detectability. Numerous descriptions and models of appearance have been used to infer the detectability of animals, but these models are rarely empirically validated or directly compared to one another. We compared the performance of human 'predators' to a bank of contemporary methods for quantifying the appearance of camouflaged prey. Background matching was assessed using several established methods, including sophisticated feature-based pattern analysis, granularity approaches and a range of luminance and contrast difference measures. Disruptive coloration is a further camouflage strategy where high contrast patterns disrupt the prey's tell-tale outline, making it more difficult to detect. Disruptive camouflage has been studied intensely over the past decade, yet defining and measuring it have proven far more problematic. We assessed how well existing disruptive coloration measures predicted capture times. Additionally, we developed a new method for measuring edge disruption based on an understanding of sensory processing and the way in which false edges are thought to interfere with animal outlines. Our novel measure of disruptive coloration was the best predictor of capture times overall, highlighting the importance of false edges in concealment over and above pattern or luminance matching. The efficacy of our new method for measuring disruptive camouflage together with its biological plausibility and computational efficiency represents a substantial

  11. Incorporating both physical and kinetic limitations in quantifying dissolved oxygen flux to aquatic sediments

    USGS Publications Warehouse

    O'Connor, B.L.; Hondzo, Miki; Harvey, J.W.

    2009-01-01

    Traditionally, dissolved oxygen (DO) fluxes have been calculated using the thin-film theory with DO microstructure data in systems characterized by fine sediments and low velocities. However, recent experimental evidence of fluctuating DO concentrations near the sediment-water interface suggests that turbulence and coherent motions control the mass transfer, and the surface renewal theory gives a more mechanistic model for quantifying fluxes. Both models involve quantifying the mass transfer coefficient (k) and the relevant concentration difference (ΔC). This study compared several empirical models for quantifying k based on both thin-film and surface renewal theories, and presents a new method for quantifying ΔC (dynamic approach) that is consistent with the observed DO concentration fluctuations near the interface. Data were used from a series of flume experiments that includes both physical and kinetic uptake limitations of the flux. Results indicated that methods for quantifying k and ΔC using the surface renewal theory better estimated the DO flux across a range of fluid-flow conditions. © 2009 ASCE.
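
    Both theories reduce to a flux of the form J = k * ΔC and differ in how k is parameterized; under the Danckwerts surface-renewal model, k = sqrt(D * s) with renewal rate s. The sketch below evaluates that relation with placeholder values for s and the concentrations, not the flume data.

    # Minimal flux sketch, J = k * dC, with k from the Danckwerts
    # surface-renewal model (k = sqrt(D * s)). Renewal rate and DO
    # concentrations are placeholders, not the flume measurements.
    import math

    D_O2 = 2.1e-9        # molecular diffusivity of O2 in water, m^2/s
    s = 0.5              # surface renewal rate, 1/s (placeholder)
    c_bulk = 8.0         # bulk DO, g/m^3 (equivalent to mg/L)
    c_iface = 2.0        # DO at the sediment-water interface, g/m^3 (placeholder)

    k = math.sqrt(D_O2 * s)            # mass-transfer coefficient, m/s
    flux = k * (c_bulk - c_iface)      # g O2 m^-2 s^-1
    print(k, flux, flux * 86400.0)     # the last value is g O2 m^-2 d^-1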

  12. SOME IS NOT ENOUGH: QUANTIFIER COMPREHENSION IN CORTICOBASAL SYNDROME AND BEHAVIORAL VARIANT FRONTOTEMPORAL DEMENTIA

    PubMed Central

    Morgan, Brianna; Gross, Rachel; Clark, Robin; Dreyfuss, Michael; Boller, Ashley; Camp, Emily; Liang, Tsao-Wei; Avants, Brian; McMillan, Corey; Grossman, Murray

    2011-01-01

    Quantifiers are very common in everyday speech, but we know little about their cognitive basis or neural representation. The present study examined comprehension of three classes of quantifiers that depend on different cognitive components in patients with focal neurodegenerative diseases. Patients evaluated the truth-value of a sentence containing a quantifier relative to a picture illustrating a small number of familiar objects, and performance was related to MRI grey matter atrophy using voxel-based morphometry. We found that patients with corticobasal syndrome (CBS) and posterior cortical atrophy (PCA) are significantly impaired in their comprehension of Cardinal Quantifiers (e.g. “At least three birds are on the branch”), due in part to their deficit in quantity knowledge. MRI analyses related this deficit to temporal-parietal atrophy found in CBS/PCA. We also found that patients with behavioral variant frontotemporal dementia (bvFTD) are significantly impaired in their comprehension of Logical Quantifiers (e.g. “Some of the birds are on the branch”), associated with a simple form of perceptual logic, and this correlated with their deficit on executive measures. This deficit was related to disease in rostral prefrontal cortex in bvFTD. These patients were also impaired in their comprehension of Majority Quantifiers (e.g. “At least half of the birds are on the branch”), and this too was correlated with their deficit on executive measures. This was related to disease in the basal ganglia interrupting a frontal-striatal loop critical for executive functioning. These findings suggest that a large-scale frontal-parietal neural network plays a crucial role in quantifier comprehension, and that comprehension of specific classes of quantifiers may be selectively impaired in patients with focal neurodegenerative conditions in these areas. PMID:21930136

  13. Quantifying Ecological Memory of Plant and Ecosystem Processes in Variable Environments

    NASA Astrophysics Data System (ADS)

    Ogle, K.; Barron-Gafford, G. A.; Bentley, L.; Cable, J.; Lucas, R.; Huxman, T. E.; Loik, M. E.; Smith, S. D.; Tissue, D.

    2010-12-01

    Precipitation, soil water, and other factors affect plant and ecosystem processes at multiple time scales. A common assumption is that water availability at a given time directly affects processes at that time. Recent work, especially in pulse-driven, semiarid systems, shows that antecedent water availability, averaged over several days to a couple weeks, can be just as or more important than current water status. Precipitation patterns of previous seasons or past years can also impact plant and ecosystem functioning in many systems. However, we lack an analytical framework for quantifying the importance of and time-scale over which past conditions affect current processes. This study explores the ecological memory of a variety of plant and ecosystem processes. We use memory as a metaphor to describe the time-scale over which antecedent conditions affect the current process. Existing approaches for incorporating antecedent effects arbitrarily select the antecedent integration period (e.g., the past 2 weeks) and the relative importance of past conditions (e.g., assign equal or linearly decreasing weights to past events). In contrast, we utilize a hierarchical Bayesian approach to integrate field data with process-based models, yielding posterior distributions for model parameters, including the duration of the ecological memory (integration period) and the relative importance of past events (weights) to this memory. We apply our approach to data spanning diverse temporal scales and four semiarid sites in the western US: leaf-level stomatal conductance (gs, sub-hourly scale), soil respiration (Rs, hourly to daily scale), and net primary productivity (NPP) and tree-ring widths (annual scale). For gs, antecedent factors (daily rainfall and temperature, hourly vapor pressure deficit) and current soil water explained up to 72% of the variation in gs in the Chihuahuan Desert, with a memory of 10 hours for a grass and 4 days for a shrub. Antecedent factors (past soil water
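
    The central construct is an antecedent covariate: a weighted average of past conditions in which the weights, and therefore the effective memory length, are estimated within the hierarchical Bayesian model. The sketch below builds such a covariate from a hypothetical soil-water record with fixed, made-up weights, just to show the bookkeeping; in the actual approach the weights are posterior quantities.

    # Antecedent ("memory") covariate: weighted average of the previous
    # `lags` days of soil water. Weights here are fixed placeholders; in
    # the hierarchical Bayesian model they are estimated.
    import numpy as np

    rng = np.random.default_rng(1)
    soil_water = rng.random(60)               # hypothetical daily record

    lags = 14                                 # candidate memory length, days
    raw = np.exp(-0.3 * np.arange(lags))      # placeholder decaying importance
    weights = raw / raw.sum()                 # normalize to sum to 1

    antecedent = np.array([
        np.dot(weights, soil_water[t - lags:t][::-1])   # most recent day first
        for t in range(lags, soil_water.size)
    ])
    print(antecedent[:5].round(3))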

  14. A novel method for quantifying arm motion similarity.

    PubMed

    Zhi Li; Hauser, Kris; Roldan, Jay Ryan; Milutinovic, Dejan; Rosen, Jacob

    2015-08-01

    This paper proposes a novel task-independent method for quantifying arm motion similarity that can be applied to any kinematic/dynamic variable of interest. Given two arm motions for the same task, not necessarily with the same completion time, it plots the time-normalized curves against one another and generates four real-valued features. To validate these features we apply them to quantify the relationship between healthy and paretic arm motions of chronic stroke patients. Studying both unimanual and bimanual arm motions of eight chronic stroke patients, we find that inter-arm coupling, which tends to synchronize the motions of both arms in bimanual motions, has a stronger effect at task-relevant joints than at task-irrelevant joints. The analysis also revealed that the paretic arm suppresses the shoulder flexion of the non-paretic arm, while the latter encourages the shoulder rotation of the former.
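
    The comparison rests on resampling both motions onto a common normalized time base before extracting features. The sketch below shows that time normalization and two illustrative stand-in features (a bias term and a correlation); the paper's four features are not reproduced here.

    # Time-normalize two trajectories of the same task and compare them.
    # The two features below (bias, correlation) are illustrative
    # stand-ins, not the paper's four features.
    import numpy as np

    def time_normalize(signal, n=101):
        """Resample a 1-D trajectory onto n evenly spaced points of task time."""
        t_old = np.linspace(0.0, 1.0, len(signal))
        return np.interp(np.linspace(0.0, 1.0, n), t_old, signal)

    healthy = np.sin(np.linspace(0.0, np.pi, 250))        # hypothetical joint angle
    paretic = 0.8 * np.sin(np.linspace(0.0, np.pi, 180))  # slower, smaller motion

    a, b = time_normalize(healthy), time_normalize(paretic)
    bias = float(np.mean(b - a))                  # systematic offset between curves
    corr = float(np.corrcoef(a, b)[0, 1])         # similarity of curve shape
    print(round(bias, 3), round(corr, 3))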

  15. Quantifying landscape resilience using vegetation indices

    NASA Astrophysics Data System (ADS)

    Eddy, I. M. S.; Gergel, S. E.

    2014-12-01

    Landscape resilience refers to the ability of systems to adapt to and recover from disturbance. In pastoral landscapes, degradation can be measured in terms of increased desertification and/or shrub encroachment. In many countries across Central Asia, the use and resilience of pastoral systems has changed markedly over the past 25 years, influenced by centralized Soviet governance, private property rights and, recently, communal resource governance. In Kyrgyzstan, recent governance reforms were in response to the increasing degradation of pastures attributed to livestock overgrazing. Our goal is to examine and map the landscape-level factors that influence overgrazing throughout successive governance periods. Here, we map and examine some of the spatial factors influencing landscape resilience in agro-pastoral systems in the Kyrgyz Republic, where pastures occupy >50% of the country's area. We ask three questions: 1) Which mechanisms of pasture degradation (desertification vs. shrub encroachment) are detectable using remote sensing vegetation indices? 2) Are these degraded pastures associated with landscape features that influence herder mobility and accessibility (e.g., terrain, distance to other pastures)? 3) Have these patterns changed through successive governance periods? Using a chronosequence of Landsat imagery (1999-2014), NDVI and other VIs were used to identify trends in pasture condition during the growing season. Least-cost path distances as well as graph theoretic indices were derived from topographic factors to assess landscape connectivity (from villages to pastures and among pastures). Fieldwork was used to assess the feasibility and accuracy of this approach using the most recent imagery. Previous research concluded that low herder mobility hindered pasture use; thus, we expect the distance from pasture to village to be an important predictor of pasture condition. This research will quantify the magnitude of pastoral degradation and test

  16. Quantifying aboveground forest carbon pools and fluxes from repeat LiDAR surveys

    Treesearch

    Andrew T. Hudak; Eva K. Strand; Lee A. Vierling; John C. Byrne; Jan U. H. Eitel; Sebastian Martinuzzi; Michael J. Falkowski

    2012-01-01

    Sound forest policy and management decisions to mitigate rising atmospheric CO2 depend upon accurate methodologies to quantify forest carbon pools and fluxes over large tracts of land. LiDAR remote sensing is a rapidly evolving technology for quantifying aboveground biomass and thereby carbon pools; however, little work has evaluated the efficacy of repeat LiDAR...

  17. Quantifying the underlying landscape and paths of cancer

    PubMed Central

    Li, Chunhe; Wang, Jin

    2014-01-01

    Cancer is a disease regulated by the underlying gene networks. The emergence of normal and cancer states as well as the transformation between them can be thought of as a result of the gene network interactions and associated changes. We developed a global potential landscape and path framework to quantify cancer and associated processes. We constructed a cancer gene regulatory network based on the experimental evidences and uncovered the underlying landscape. The resulting tristable landscape characterizes important biological states: normal, cancer and apoptosis. The landscape topography in terms of barrier heights between stable state attractors quantifies the global stability of the cancer network system. We propose two mechanisms of cancerization: one is by the changes of landscape topography through the changes in regulation strengths of the gene networks. The other is by the fluctuations that help the system to go over the critical barrier at fixed landscape topography. The kinetic paths from least action principle quantify the transition processes among normal state, cancer state and apoptosis state. The kinetic rates provide the quantification of transition speeds among normal, cancer and apoptosis attractors. By the global sensitivity analysis of the gene network parameters on the landscape topography, we uncovered some key gene regulations determining the transitions between cancer and normal states. This can be used to guide the design of new anti-cancer tactics, through cocktail strategy of targeting multiple key regulation links simultaneously, for preventing cancer occurrence or transforming the early cancer state back to normal state. PMID:25232051
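
    The landscape construction maps the steady-state probability distribution of the stochastic gene-network dynamics to a quasi-potential, U = -ln(P_ss), whose basins correspond to the normal, cancer, and apoptosis attractors. The toy sketch below applies the same mapping to a one-variable bistable circuit with made-up parameters, which is far simpler than the paper's network but shows the mechanics.

    # Toy quasi-potential U(x) = -ln(P_ss(x)) for a one-variable bistable
    # circuit (self-activation plus degradation). Dynamics and parameters
    # are generic placeholders, not the paper's cancer network.
    import numpy as np

    rng = np.random.default_rng(2)

    def drift(x):
        return 0.2 + 2.0 * x**4 / (1.0 + x**4) - x   # two stable states

    dt, steps, noise = 0.01, 200_000, 0.3
    x, samples = 1.0, []
    for _ in range(steps):
        x += drift(x) * dt + noise * dt**0.5 * rng.standard_normal()
        x = max(x, 0.0)
        samples.append(x)

    hist, edges = np.histogram(samples, bins=60, density=True)
    centers = 0.5 * (edges[:-1] + edges[1:])
    U = -np.log(hist + 1e-12)            # quasi-potential, up to a constant
    print(centers[np.argmin(U)])         # location of the deepest basin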

  18. Evaluation of two methods for quantifying passeriform lice

    PubMed Central

    Koop, Jennifer A. H.; Clayton, Dale H.

    2013-01-01

    Two methods commonly used to quantify ectoparasites on live birds are visual examination and dust-ruffling. Visual examination provides an estimate of ectoparasite abundance based on an observer’s timed inspection of various body regions on a bird. Dust-ruffling involves application of insecticidal powder to feathers that are then ruffled to dislodge ectoparasites onto a collection surface where they can then be counted. Despite the common use of these methods in the field, the proportion of actual ectoparasites they account for has only been tested with Rock Pigeons (Columba livia), a relatively large-bodied species (238–302 g) with dense plumage. We tested the accuracy of the two methods using European Starlings (Sturnus vulgaris; ~75 g). We first quantified the number of lice (Brueelia nebulosa) on starlings using visual examination, followed immediately by dust-ruffling. Birds were then euthanized and the proportion of lice accounted for by each method was compared to the total number of lice on each bird as determined with a body-washing method. Visual examination and dust-ruffling each accounted for a relatively small proportion of total lice (14% and 16%, respectively), but both were still significant predictors of abundance. The number of lice observed by visual examination accounted for 68% of the variation in total abundance. Similarly, the number of lice recovered by dust-ruffling accounted for 72% of the variation in total abundance. Our results show that both methods can be used to reliably quantify the abundance of lice on European Starlings and other similar-sized passerines. PMID:24039328

  19. A new index quantifying the precipitation extremes

    NASA Astrophysics Data System (ADS)

    Busuioc, Aristita; Baciu, Madalina; Stoica, Cerasela

    2015-04-01

    Events of extreme precipitation have a great impact on society. They are associated with flooding, erosion and landslides. Various indices have been proposed to quantify these extreme events, and they are mainly related to daily precipitation amount, which is usually available for long periods in many places over the world. The climate signal related to changes in the characteristics of precipitation extremes differs over various regions and depends on the season and the index used to quantify the precipitation extremes. The climate model simulations and empirical evidence suggest that warmer climates, due to increased water vapour, lead to more intense precipitation events, even when the total annual precipitation is slightly reduced. It has been suggested that there is a shift in the nature of precipitation events towards more intense and less frequent rains, and increases in heavy rains are expected to occur in most places, even when the mean precipitation is not increasing. This conclusion was also confirmed for the Romanian territory in a recent study, showing a significant increasing trend in the frequency of rain showers in the warm season over the entire country, despite no significant changes in the seasonal amount and the daily extremes. The shower events counted in that paper refer to all convective rains, including torrential ones that give high rainfall amounts in a very short time. The problem is to find an appropriate index to quantify such events in terms of their highest intensity in order to extract the maximum climate signal. In the present paper, a new index is proposed to quantify the maximum precipitation intensity in an extreme precipitation event, which can be directly related to torrential rain intensity. This index is tested at nine Romanian stations (representing various physical-geographical conditions) and it is based on the continuous rainfall records derived from the graphical registrations (pluviograms) available at National
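
    One plausible way to express "maximum intensity within an event", consistent with continuous pluviogram records, is the largest rainfall depth accumulated over a short sliding window, converted to mm/h. The sketch below does exactly that on a synthetic one-minute series; the window length and the data are assumptions, not the index definition from the paper.

    # Sliding-window maximum intensity of a rainfall event, in mm/h.
    # The synthetic 1-min series and the 15-min window are assumptions.
    import numpy as np

    rng = np.random.default_rng(3)
    rain_mm_per_min = np.zeros(180)                    # 3-hour event, 1-min steps
    rain_mm_per_min[60:90] = rng.gamma(2.0, 0.4, 30)   # a convective burst

    def max_intensity(series_mm_per_min, window_min=15):
        depth = np.convolve(series_mm_per_min, np.ones(window_min), mode="valid")
        return depth.max() * 60.0 / window_min          # mm per hour

    print(round(max_intensity(rain_mm_per_min), 1), "mm/h over a 15-min window")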

  20. Reasons for Placement of Restorations on Previously Unrestored Tooth Surfaces by Dental PBRN Dentists

    PubMed Central

    Nascimento, Marcelle M.; Gordan, Valeria V.; Qvist, Vibeke; Litaker, Mark S.; Rindal, D. Brad; Williams, O.D.; Fellows, Jeffrey L.; Ritchie, Lloyd K.; Mjör, Ivar A.; McClelland, Jocelyn; Gilbert, Gregg H.

    2010-01-01

    Objective To identify and quantify the reasons for placing restorations on unrestored permanent tooth surfaces and the dental materials used by Dental Practice-Based Research Network (DPBRN; www.DentalPBRN.org) dentists. Methods A total of 229 DPBRN practitioner-investigators collected data on 9,890 consecutive restorations from 5,810 patients. Information included: (1) reasons for restoring; (2) tooth and surfaces restored; and (3) restorative materials employed. Results Primary caries (85%) and non-carious defects (15%), which included abrasion/ abfraction/ erosion lesions and tooth fracture, were the main reasons for placement of restorations. Restorations due to caries were frequently placed on occlusal surfaces (49%), followed by distal, mesial, buccal/facial, lingual/palatal, and incisal surfaces. Amalgam was used for 46% of the molar and 45% of the premolar restorations. Directly placed resin-based composite (RBC) was used for 48% of the molar, 49% of the premolar, and 92% of the anterior restorations. Conclusion Dental caries on occlusal and proximal surfaces of molar teeth are the main reasons for placing restorations on previously unrestored tooth surfaces by DPBRN practitioner-investigators. RBC is the material most commonly used for occlusal and anterior restorations. Amalgam remains the material of choice to restore proximal caries in posterior teeth, although there are significant differences by DPBRN region. PMID:20354094

  1. Pendulum Underwater--An Approach for Quantifying Viscosity

    ERIC Educational Resources Information Center

    Leme, José Costa; Oliveira, Agostinho

    2017-01-01

    The purpose of the experiment presented in this paper is to quantify the viscosity of a liquid. Viscous effects are important in the flow of fluids in pipes, in the bloodstream, in the lubrication of engine parts, and in many other situations. In the present paper, the authors explore the oscillations of a physical pendulum in the form of a long…

  2. Quantifying the Thermal Fatigue of CPV Modules

    NASA Astrophysics Data System (ADS)

    Bosco, Nick; Kurtz, Sarah

    2010-10-01

    A method is presented to quantify thermal fatigue in the CPV die-attach from meteorological data. A comparative study between cities demonstrates a significant difference in the accumulated damage. These differences are most sensitive to the number of large-ΔT thermal cycles experienced at a location. High-frequency data (finer than 1-minute resolution) may be required to employ this method most accurately.
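
    Turning a temperature history into accumulated die-attach damage typically means counting thermal cycles and weighting each by a power of its range, in the spirit of Coffin-Manson models. The sketch below uses a crude peak-to-trough count and an assumed exponent; the paper's actual counting procedure and material constants are not reproduced.

    # Crude damage accumulation from a temperature history: weight each
    # peak-to-trough swing by dT**n (Coffin-Manson style). The exponent,
    # temperatures, and half-cycle weighting are assumptions; a full
    # treatment would use rainflow counting.
    import numpy as np

    temps_c = np.array([20, 55, 30, 62, 25, 48, 22], dtype=float)  # hypothetical extrema
    n_exponent = 2.0

    ranges = np.abs(np.diff(temps_c))               # dT of each half-cycle
    damage = 0.5 * np.sum(ranges ** n_exponent)     # half weight per half-cycle
    print(ranges, round(damage, 1))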

  3. 49 CFR 236.1031 - Previously approved PTC systems.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 49 Transportation 4 2010-10-01 2010-10-01 false Previously approved PTC systems. 236.1031 Section 236.1031 Transportation Other Regulations Relating to Transportation (Continued) FEDERAL RAILROAD....1005 and 236.1007 and otherwise conform to this subpart. (d) Previous approval or recognition of a...

  4. Suitability of ANSI standards for quantifying communication satellite system performance

    NASA Technical Reports Server (NTRS)

    Cass, Robert D.

    1988-01-01

    A study on the application of American National Standards X3.102 and X3.141 to various classes of communication satellite systems, from the simple analog bent-pipe to NASA's Advanced Communications Technology Satellite (ACTS), is discussed. These standards are proposed as a means of quantifying the end-to-end communication system performance of communication satellite systems. An introductory overview of the two standards is given, followed by a review of the characteristics, applications, and advantages of using X3.102 and X3.141 for performance quantification, and by a description of the application of these standards to ACTS.

  5. A graph-theoretic method to quantify the airline route authority

    NASA Technical Reports Server (NTRS)

    Chan, Y.

    1979-01-01

    The paper introduces a graph-theoretic method to quantify the legal statements in a route certificate, which specifies the airline's routing restrictions. All the authorized nonstop and multistop routes, including the shortest-time routes, can be obtained, and the method suggests profitable route-structure alternatives to airline analysts. This method of quantifying the C.A.B. route authority was programmed in a software package, Route Improvement Synthesis and Evaluation, and demonstrated in a case study with a commercial airline. The study showed the utility of this technique in suggesting route alternatives and the possibility of improvements in the U.S. route system.
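
    The essence of the method is that once the authorized nonstop segments are encoded as a weighted directed graph, authorized multistop routings and shortest-time routes come out of standard graph algorithms. The sketch below uses hypothetical cities and block times with Dijkstra's algorithm; it is not the Route Improvement Synthesis and Evaluation package.

    # Authorized nonstop segments as a weighted digraph (weights = block
    # times in minutes, hypothetical); shortest-time authorized routing
    # via Dijkstra's algorithm.
    import heapq

    segments = {
        "DEN": {"ORD": 150, "DFW": 110},
        "ORD": {"JFK": 135},
        "DFW": {"JFK": 200, "ORD": 140},
        "JFK": {},
    }

    def shortest_time(graph, src, dst):
        heap, best = [(0, src, [src])], {src: 0}
        while heap:
            t, node, path = heapq.heappop(heap)
            if node == dst:
                return t, path
            for nxt, minutes in graph[node].items():
                nt = t + minutes
                if nt < best.get(nxt, float("inf")):
                    best[nxt] = nt
                    heapq.heappush(heap, (nt, nxt, path + [nxt]))
        return None

    print(shortest_time(segments, "DEN", "JFK"))   # (285, ['DEN', 'ORD', 'JFK'])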

  6. Complexity and approximability of quantified and stochastic constraint satisfaction problems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hunt, H. B.; Stearns, R. L.; Marathe, M. V.

    2001-01-01

    Let D be an arbitrary (not necessarily finite) nonempty set, let C be a finite set of constant symbols denoting arbitrary elements of D, and let S be an arbitrary finite set of finite-arity relations on D. We denote the problem of determining the satisfiability of finite conjunctions of relations in S applied to variables (to variables and symbols in C) by SAT(S) (by SAT_c(S)). Here, we study simultaneously the complexity of, and the existence of efficient approximation algorithms for, a number of variants of the problems SAT(S) and SAT_c(S), and for many different D, C, and S. These problem variants include decision and optimization problems for formulas, quantified formulas, and stochastically-quantified formulas. We denote these problems by Q-SAT(S), MAX-Q-SAT(S), S-SAT(S), MAX-S-SAT(S), MAX-NSF-Q-SAT(S) and MAX-NSF-S-SAT(S). The main contribution is the development of a unified predictive theory for characterizing the complexity of these problems. Our unified approach is based on two basic concepts: (i) strongly-local replacements/reductions and (ii) relational/algebraic representability. Let k ≥ 2, and let S be a finite set of finite-arity relations on Σ_k with the following condition on S: all finite-arity relations on Σ_k can be represented as finite existentially-quantified conjunctions of relations in S applied to variables (to variables and constant symbols in C). Then we prove the following new results: (1) The problems SAT(S) and SAT_c(S) are both NQL-complete and ≤^bw_logn-complete for NP. (2) The problems Q-SAT(S) and Q-SAT_c(S) are PSPACE-complete. Letting k = 2, the problems S-SAT(S) and S-SAT_c(S) are PSPACE-complete. (3) There exists ε > 0 for which approximating the problem MAX-Q-SAT(S) within ε times optimum is PSPACE-hard. Letting k = 2, there exists ε > 0 for which approximating the problem MAX-S-SAT(S) within ε times optimum is PSPACE-hard.
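
    The PSPACE membership of the quantified variants comes from the fact that a quantified formula can be evaluated by recursing over the quantifier prefix, using space proportional to the number of variables. The toy evaluator below illustrates this for quantified Boolean formulas; it is an illustration of the problem class, not of the paper's reductions.

    # Brute-force evaluator for quantified Boolean formulas, illustrating
    # the recursion that puts Q-SAT in PSPACE (depth = number of variables).
    def eval_qbf(prefix, matrix, assignment=None):
        """prefix: list of ('forall' | 'exists', var); matrix: dict -> bool."""
        assignment = dict(assignment or {})
        if not prefix:
            return matrix(assignment)
        quant, var = prefix[0]
        branches = (eval_qbf(prefix[1:], matrix, {**assignment, var: value})
                    for value in (False, True))
        return all(branches) if quant == "forall" else any(branches)

    # forall x exists y . (x or y) and (not x or not y)  -- satisfied by y = not x
    formula = lambda a: (a["x"] or a["y"]) and (not a["x"] or not a["y"])
    print(eval_qbf([("forall", "x"), ("exists", "y")], formula))   # True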

  7. Quantifying fish swimming behavior in response to acute exposure of aqueous copper using computer assisted video and digital image analysis

    USGS Publications Warehouse

    Calfee, Robin D.; Puglis, Holly J.; Little, Edward E.; Brumbaugh, William G.; Mebane, Christopher A.

    2016-01-01

    Behavioral responses of aquatic organisms to environmental contaminants can be precursors of other effects such as survival, growth, or reproduction. However, these responses may be subtle, and measurement can be challenging. Using juvenile white sturgeon (Acipenser transmontanus) with copper exposures, this paper illustrates techniques used for quantifying behavioral responses using computer assisted video and digital image analysis. In previous studies severe impairments in swimming behavior were observed among early life stage white sturgeon during acute and chronic exposures to copper. Sturgeon behavior was rapidly impaired and to the extent that survival in the field would be jeopardized, as fish would be swept downstream, or readily captured by predators. The objectives of this investigation were to illustrate protocols to quantify swimming activity during a series of acute copper exposures to determine time to effect during early lifestage development, and to understand the significance of these responses relative to survival of these vulnerable early lifestage fish. With mortality being on a time continuum, determining when copper first affects swimming ability helps us to understand the implications for population level effects. The techniques used are readily adaptable to experimental designs with other organisms and stressors.

  8. Quantifying Fish Swimming Behavior in Response to Acute Exposure of Aqueous Copper Using Computer Assisted Video and Digital Image Analysis

    PubMed Central

    Calfee, Robin D.; Puglis, Holly J.; Little, Edward E.; Brumbaugh, William G.; Mebane, Christopher A.

    2016-01-01

    Behavioral responses of aquatic organisms to environmental contaminants can be precursors of other effects such as survival, growth, or reproduction. However, these responses may be subtle, and measurement can be challenging. Using juvenile white sturgeon (Acipenser transmontanus) with copper exposures, this paper illustrates techniques used for quantifying behavioral responses using computer assisted video and digital image analysis. In previous studies severe impairments in swimming behavior were observed among early life stage white sturgeon during acute and chronic exposures to copper. Sturgeon behavior was rapidly impaired and to the extent that survival in the field would be jeopardized, as fish would be swept downstream, or readily captured by predators. The objectives of this investigation were to illustrate protocols to quantify swimming activity during a series of acute copper exposures to determine time to effect during early lifestage development, and to understand the significance of these responses relative to survival of these vulnerable early lifestage fish. With mortality being on a time continuum, determining when copper first affects swimming ability helps us to understand the implications for population level effects. The techniques used are readily adaptable to experimental designs with other organisms and stressors. PMID:26967350

  9. Quantifying facial expression signal and intensity use during development.

    PubMed

    Rodger, Helen; Lao, Junpeng; Caldara, Roberto

    2018-06-12

    Behavioral studies investigating facial expression recognition during development have applied various methods to establish by which age emotional expressions can be recognized. Most commonly, these methods employ static images of expressions at their highest intensity (apex) or morphed expressions of different intensities, but they have not previously been compared. Our aim was to (a) quantify the intensity and signal use for recognition of six emotional expressions from early childhood to adulthood and (b) compare both measures and assess their functional relationship to better understand the use of different measures across development. Using a psychophysical approach, we isolated the quantity of signal necessary to recognize an emotional expression at full intensity and the quantity of expression intensity (using neutral expression image morphs of varying intensities) necessary for each observer to recognize the six basic emotions while maintaining performance at 75%. Both measures revealed that fear and happiness were the most difficult and easiest expressions to recognize across age groups, respectively, a pattern already stable during early childhood. The quantity of signal and intensity needed to recognize sad, angry, disgust, and surprise expressions decreased with age. Using a Bayesian update procedure, we then reconstructed the response profiles for both measures. This analysis revealed that intensity and signal processing are similar only during adulthood and, therefore, cannot be straightforwardly compared during development. Altogether, our findings offer novel methodological and theoretical insights and tools for the investigation of the developing affective system. Copyright © 2018 Elsevier Inc. All rights reserved.

  10. Quantifying Evaporation and Evaluating Runoff Estimation Methods in a Permeable Pavement System - abstract

    EPA Science Inventory

    Studies on quantifying evaporation in permeable pavement systems are limited to few laboratory studies that used a scale to weigh evaporative losses and a field application with a tunnel-evaporation gauge. A primary objective of this research was to quantify evaporation for a la...

  11. 44 CFR 402.5 - Forwarding commodities previously shipped.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... previously shipped. 402.5 Section 402.5 Emergency Management and Assistance DEPARTMENT OF COMMERCE AND DEPARTMENT OF TRANSPORTATION SHIPMENTS ON AMERICAN FLAG SHIPS AND AIRCRAFT (T-1, INT. 1) § 402.5 Forwarding commodities previously shipped. Order T-1 applies to transportation on or discharge from ships documented...

  12. 44 CFR 402.5 - Forwarding commodities previously shipped.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... previously shipped. 402.5 Section 402.5 Emergency Management and Assistance DEPARTMENT OF COMMERCE AND DEPARTMENT OF TRANSPORTATION SHIPMENTS ON AMERICAN FLAG SHIPS AND AIRCRAFT (T-1, INT. 1) § 402.5 Forwarding commodities previously shipped. Order T-1 applies to transportation on or discharge from ships documented...

  13. 44 CFR 402.5 - Forwarding commodities previously shipped.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... previously shipped. 402.5 Section 402.5 Emergency Management and Assistance DEPARTMENT OF COMMERCE AND DEPARTMENT OF TRANSPORTATION SHIPMENTS ON AMERICAN FLAG SHIPS AND AIRCRAFT (T-1, INT. 1) § 402.5 Forwarding commodities previously shipped. Order T-1 applies to transportation on or discharge from ships documented...

  14. 44 CFR 402.5 - Forwarding commodities previously shipped.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... previously shipped. 402.5 Section 402.5 Emergency Management and Assistance DEPARTMENT OF COMMERCE AND DEPARTMENT OF TRANSPORTATION SHIPMENTS ON AMERICAN FLAG SHIPS AND AIRCRAFT (T-1, INT. 1) § 402.5 Forwarding commodities previously shipped. Order T-1 applies to transportation on or discharge from ships documented...

  15. A Generalizable Methodology for Quantifying User Satisfaction

    NASA Astrophysics Data System (ADS)

    Huang, Te-Yuan; Chen, Kuan-Ta; Huang, Polly; Lei, Chin-Laung

    Quantifying user satisfaction is essential, because the results can help service providers deliver better services. In this work, we propose a generalizable methodology, based on survival analysis, to quantify user satisfaction in terms of session times, i.e., the length of time users stay with an application. Unlike subjective human surveys, our methodology is based solely on passive measurement, which is more cost-efficient and better able to capture subconscious reactions. Furthermore, by using session times, rather than a specific performance indicator, such as the level of distortion of voice signals, the effects of other factors like loudness and sidetone can also be captured by the developed models. Like survival analysis, our methodology is characterized by low complexity and a simple model-developing process. The feasibility of our methodology is demonstrated through case studies of ShenZhou Online, a commercial MMORPG in Taiwan, and the most prevalent VoIP application in the world, namely Skype. Through the model development process, we can also identify the most significant performance factors and their impacts on user satisfaction and discuss how they can be exploited to improve user experience and optimize resource allocation.
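
    Treating session time as a survival time means the central object is a survival curve over session durations, with sessions still open at the end of measurement handled as right-censored. The sketch below is a hand-rolled Kaplan-Meier estimate on made-up (duration, observed) pairs, not the models fitted in the paper.

    # Kaplan-Meier estimate of the session-time survival curve from
    # hypothetical (duration_minutes, observed) pairs; observed=False
    # marks right-censored sessions.
    sessions = [(5, True), (12, True), (12, False), (20, True), (35, False), (40, True)]

    event_times = sorted({t for t, observed in sessions if observed})
    at_risk = lambda t: sum(1 for d, _ in sessions if d >= t)
    events = lambda t: sum(1 for d, observed in sessions if d == t and observed)

    survival, s = {}, 1.0
    for t in event_times:
        s *= 1.0 - events(t) / at_risk(t)
        survival[t] = round(s, 3)
    print(survival)   # probability a session lasts beyond each event time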

  16. Detecting and Quantifying Topography in Neural Maps

    PubMed Central

    Yarrow, Stuart; Razak, Khaleel A.; Seitz, Aaron R.; Seriès, Peggy

    2014-01-01

    Topographic maps are an often-encountered feature in the brains of many species, yet there are no standard, objective procedures for quantifying topography. Topographic maps are typically identified and described subjectively, but in cases where the scale of the map is close to the resolution limit of the measurement technique, identifying the presence of a topographic map can be a challenging subjective task. In such cases, an objective topography detection test would be advantageous. To address these issues, we assessed seven measures (Pearson distance correlation, Spearman distance correlation, Zrehen's measure, topographic product, topological correlation, path length and wiring length) by quantifying topography in three classes of cortical map model: linear, orientation-like, and clusters. We found that all but one of these measures were effective at detecting statistically significant topography even in weakly-ordered maps, based on simulated noisy measurements of neuronal selectivity and sparse sampling of the maps. We demonstrate the practical applicability of these measures by using them to examine the arrangement of spatial cue selectivity in pallid bat A1. This analysis shows that significantly topographic arrangements of interaural intensity difference and azimuth selectivity exist at the scale of individual binaural clusters. PMID:24505279

  17. Torque wrench allows readings from inaccessible locations

    NASA Technical Reports Server (NTRS)

    De Barnardo, M.

    1966-01-01

    Torque wrench with an adjustable drive shaft permits indicator to remain in view when used on sections of equipment with limited access. The shaft is capable of protruding from either side of the wrench head by means of spring loaded balls.

  18. Virological Sampling of Inaccessible Wildlife with Drones.

    PubMed

    Geoghegan, Jemma L; Pirotta, Vanessa; Harvey, Erin; Smith, Alastair; Buchmann, Jan P; Ostrowski, Martin; Eden, John-Sebastian; Harcourt, Robert; Holmes, Edward C

    2018-06-02

    There is growing interest in characterizing the viromes of diverse mammalian species, particularly in the context of disease emergence. However, little is known about virome diversity in aquatic mammals, in part due to difficulties in sampling. We characterized the virome of the exhaled breath (or blow) of the Eastern Australian humpback whale ( Megaptera novaeangliae ). To achieve an unbiased survey of virome diversity, a meta-transcriptomic analysis was performed on 19 pooled whale blow samples collected via a purpose-built Unmanned Aerial Vehicle (UAV, or drone) approximately 3 km off the coast of Sydney, Australia during the 2017 winter annual northward migration from Antarctica to northern Australia. To our knowledge, this is the first time that UAVs have been used to sample viruses. Despite the relatively small number of animals surveyed in this initial study, we identified six novel virus species from five viral families. This work demonstrates the potential of UAVs in studies of virus disease, diversity, and evolution.

  19. Quantifying a Negative: How Homeland Security Adds Value

    DTIC Science & Technology

    2015-12-01

    access to future victims. The law enforcement agency could then identify and quantify the value of future crimes. For example, if a serial killer is captured with evidence of the next victim or an established pattern of victimization, network theory could be used to identify the next

  20. Thermophoresis in nanoliter droplets to quantify aptamer binding.

    PubMed

    Seidel, Susanne A I; Markwardt, Niklas A; Lanzmich, Simon A; Braun, Dieter

    2014-07-21

    Biomolecule interactions are central to pharmacology and diagnostics. These interactions can be quantified by thermophoresis, the directed molecule movement along a temperature gradient. It is sensitive to binding induced changes in size, charge, or conformation. Established capillary measurements require at least 0.5 μL per sample. We cut down sample consumption by a factor of 50, using 10 nL droplets produced with acoustic droplet robotics (Labcyte). Droplets were stabilized in an oil-surfactant mix and locally heated with an IR laser. Temperature increase, Marangoni flow, and concentration distribution were analyzed by fluorescence microscopy and numerical simulation. In 10 nL droplets, we quantified AMP-aptamer affinity, cooperativity, and buffer dependence. Miniaturization and the 1536-well plate format make the method high-throughput and automation friendly. This promotes innovative applications for diagnostic assays in human serum or label-free drug discovery screening. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
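
    The affinity itself comes from titrating the ligand and fitting the normalized thermophoretic signal to a binding isotherm. The sketch below fits synthetic data to a simple 1:1 isotherm (ligand in excess) with scipy; the Kd, signal levels, and noise are made up, and droplet-specific corrections are not modeled.

    # Fit a thermophoresis titration to a 1:1 binding isotherm to recover
    # Kd. Synthetic data; Kd, signal levels, and noise are placeholders.
    import numpy as np
    from scipy.optimize import curve_fit

    def isotherm(ligand_m, kd_m, s_free, s_bound):
        fraction_bound = ligand_m / (kd_m + ligand_m)
        return s_free + (s_bound - s_free) * fraction_bound

    rng = np.random.default_rng(4)
    ligand = np.logspace(-7, -3, 12)                            # mol/L
    signal = isotherm(ligand, 2e-5, 1.00, 0.85) + rng.normal(0.0, 0.005, ligand.size)

    popt, _ = curve_fit(isotherm, ligand, signal, p0=(1e-5, 1.0, 0.9))
    print(f"fitted Kd ~ {popt[0]:.1e} M")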

  1. Using multilevel models to quantify heterogeneity in resource selection

    USGS Publications Warehouse

    Wagner, Tyler; Diefenbach, Duane R.; Christensen, Sonja; Norton, Andrew S.

    2011-01-01

    Models of resource selection are being used increasingly to predict or model the effects of management actions rather than simply quantifying habitat selection. Multilevel, or hierarchical, models are an increasingly popular method to analyze animal resource selection because they impose a relatively weak stochastic constraint to model heterogeneity in habitat use and also account for unequal sample sizes among individuals. However, few studies have used multilevel models to model coefficients as a function of predictors that may influence habitat use at different scales or quantify differences in resource selection among groups. We used an example with white-tailed deer (Odocoileus virginianus) to illustrate how to model resource use as a function of distance to road that varies among deer by road density at the home range scale. We found that deer avoidance of roads decreased as road density increased. Also, we used multilevel models with sika deer (Cervus nippon) and white-tailed deer to examine whether resource selection differed between species. We failed to detect differences in resource use between these two species and showed how information-theoretic and graphical measures can be used to assess how resource use may have differed. Multilevel models can improve our understanding of how resource selection varies among individuals and provides an objective, quantifiable approach to assess differences or changes in resource selection.

  2. Bleed-through correction for rendering and correlation analysis in multi-colour localization microscopy

    PubMed Central

    Kim, Dahan; Curthoys, Nikki M.; Parent, Matthew T.; Hess, Samuel T.

    2015-01-01

    Multi-colour localization microscopy has enabled sub-diffraction studies of colocalization between multiple biological species and quantification of their correlation at length scales previously inaccessible with conventional fluorescence microscopy. However, bleed-through, or misidentification of probe species, creates false colocalization and artificially increases certain types of correlation between two imaged species, affecting the reliability of information provided by colocalization and quantified correlation. Despite the potential risk of these artefacts of bleed-through, neither the effect of bleed-through on correlation nor methods of its correction in correlation analyses has been systematically studied at typical rates of bleed-through reported to affect multi-colour imaging. Here, we present a reliable method of bleed-through correction applicable to image rendering and correlation analysis of multi-colour localization microscopy. Application of our bleed-through correction shows our method accurately corrects the artificial increase in both types of correlations studied (Pearson coefficient and pair correlation), at all rates of bleed-through tested, in all types of correlations examined. In particular, anti-correlation could not be quantified without our bleed-through correction, even at rates of bleed-through as low as 2%. Demonstrated with dichroic-based multi-colour FPALM here, our presented method of bleed-through correction can be applied to all types of localization microscopy (PALM, STORM, dSTORM, GSDIM, etc.), including both simultaneous and sequential multi-colour modalities, provided the rate of bleed-through can be reliably determined. PMID:26185614
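
    In its simplest form, the correction amounts to subtracting the expected misassigned fraction of one channel from the other before any correlation is computed. The per-pixel sketch below assumes a known one-way bleed-through rate and synthetic count images; the paper's procedure for determining the rate and handling both directions is not reproduced.

    # Per-pixel bleed-through correction for two-colour localization
    # counts: a fraction `alpha` of species-A localizations is misassigned
    # to channel B, so subtract alpha * A from B before correlating.
    # Images and alpha are synthetic placeholders.
    import numpy as np

    rng = np.random.default_rng(5)
    alpha = 0.02                                   # assumed bleed-through rate

    counts_a = rng.poisson(5.0, (64, 64)).astype(float)
    true_b = rng.poisson(3.0, (64, 64)).astype(float)
    observed_b = true_b + alpha * counts_a         # channel B contaminated by A

    corrected_b = np.clip(observed_b - alpha * counts_a, 0.0, None)

    pearson = lambda x, y: np.corrcoef(x.ravel(), y.ravel())[0, 1]
    print(round(pearson(counts_a, observed_b), 3), round(pearson(counts_a, corrected_b), 3))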

  3. Bleed-through correction for rendering and correlation analysis in multi-colour localization microscopy.

    PubMed

    Kim, Dahan; Curthoys, Nikki M; Parent, Matthew T; Hess, Samuel T

    2013-09-01

    Multi-colour localization microscopy has enabled sub-diffraction studies of colocalization between multiple biological species and quantification of their correlation at length scales previously inaccessible with conventional fluorescence microscopy. However, bleed-through, or misidentification of probe species, creates false colocalization and artificially increases certain types of correlation between two imaged species, affecting the reliability of information provided by colocalization and quantified correlation. Despite the potential risk of these artefacts of bleed-through, neither the effect of bleed-through on correlation nor methods of its correction in correlation analyses has been systematically studied at typical rates of bleed-through reported to affect multi-colour imaging. Here, we present a reliable method of bleed-through correction applicable to image rendering and correlation analysis of multi-colour localization microscopy. Application of our bleed-through correction shows our method accurately corrects the artificial increase in both types of correlations studied (Pearson coefficient and pair correlation), at all rates of bleed-through tested, in all types of correlations examined. In particular, anti-correlation could not be quantified without our bleed-through correction, even at rates of bleed-through as low as 2%. Demonstrated with dichroic-based multi-colour FPALM here, our presented method of bleed-through correction can be applied to all types of localization microscopy (PALM, STORM, dSTORM, GSDIM, etc.), including both simultaneous and sequential multi-colour modalities, provided the rate of bleed-through can be reliably determined.

  4. Tools for quantifying isotopic niche space and dietary variation at the individual and population level.

    USGS Publications Warehouse

    Newsome, Seth D.; Yeakel, Justin D.; Wheatley, Patrick V.; Tinker, M. Tim

    2012-01-01

    Ecologists are increasingly using stable isotope analysis to inform questions about variation in resource and habitat use from the individual to community level. In this study we investigate data sets from 2 California sea otter (Enhydra lutris nereis) populations to illustrate the advantages and potential pitfalls of applying various statistical and quantitative approaches to isotopic data. We have subdivided these tools, or metrics, into 3 categories: IsoSpace metrics, stable isotope mixing models, and DietSpace metrics. IsoSpace metrics are used to quantify the spatial attributes of isotopic data that are typically presented in bivariate (e.g., δ13C versus δ15N) 2-dimensional space. We review IsoSpace metrics currently in use and present a technique by which uncertainty can be included to calculate the convex hull area of consumers or prey, or both. We then apply a Bayesian-based mixing model to quantify the proportion of potential dietary sources to the diet of each sea otter population and compare this to observational foraging data. Finally, we assess individual dietary specialization by comparing a previously published technique, variance components analysis, to 2 novel DietSpace metrics that are based on mixing model output. As the use of stable isotope analysis in ecology continues to grow, the field will need a set of quantitative tools for assessing isotopic variance at the individual to community level. Along with recent advances in Bayesian-based mixing models, we hope that the IsoSpace and DietSpace metrics described here will provide another set of interpretive tools for ecologists.
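
    One IsoSpace metric mentioned above, the convex hull area, can carry analytical uncertainty by Monte Carlo resampling each δ13C/δ15N point from its measurement error and recomputing the hull. The sketch below does that with synthetic isotope values and an assumed per-measurement standard deviation; it is not the sea otter data or the authors' implementation.

    # Convex-hull area with uncertainty: resample each (d13C, d15N) point
    # from an assumed analytical SD and recompute the hull area. In 2-D,
    # scipy's ConvexHull.volume is the enclosed area. Synthetic values.
    import numpy as np
    from scipy.spatial import ConvexHull

    rng = np.random.default_rng(6)
    points = np.array([[-17.0, 12.5], [-15.2, 14.1], [-13.8, 11.0],
                       [-16.1, 10.2], [-14.5, 13.3]])
    sd = 0.2                                    # assumed analytical SD, per mil

    areas = [ConvexHull(points + rng.normal(0.0, sd, points.shape)).volume
             for _ in range(1000)]
    print(round(float(np.mean(areas)), 2), np.percentile(areas, [2.5, 97.5]).round(2))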

  5. Are termite mounds biofilters for methane? - Challenges and new approaches to quantify methane oxidation in termite mounds

    NASA Astrophysics Data System (ADS)

    Nauer, Philipp A.; Hutley, Lindsay B.; Bristow, Mila; Arndt, Stefan K.

    2015-04-01

    Methane emissions from termites contribute around 3% to global methane in the atmosphere, although the total source estimate for termites is the most uncertain among all sources. In tropical regions, the relative source contribution of termites can be far higher due to the high biomass and relative importance of termites in plant decomposition. Past research focused on net emission measurements and their variability, but little is known about underlying processes governing these emissions. In particular, microbial oxidation of methane (MOX) within termite mounds has rarely been investigated. In well-studied ecosystems featuring an oxic matrix above an anoxic methane-producing habitat (e.g. landfills or sediments), the fraction of oxidized methane (fox) can reach up to 90% of gross production. However, conventional mass-balance approaches to apportion production and consumption processes can be challenging to apply in the complex-structured and almost inaccessible environment of a termite mound. In effect, all field-based data on termite-mound MOX is based on one study that measured isotopic shifts in produced and emitted methane. In this study a closed-system isotope fractionation model was applied and estimated fox ranged from 10% to almost 100%. However, it is shown here that by applying an open-system isotope-pool model, the measured isotopic shifts can also be explained by physical transport of methane alone. Different field-based methods to quantify MOX in termite mounds are proposed which do not rely on assumptions of physical gas transport. A simple approach is the use of specific inhibitors for MOX, e.g. difluoromethane (CH2F2), combined with chamber-based flux measurements before and after their application. Data is presented on the suitability of different inhibitors and first results of their application in the field. Alternatively, gas-tracer methods allow the quantification of methane oxidation and reaction kinetics without knowledge of physical gas

  6. 2 CFR 225.45 - Relationship to previous issuance.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 2 Grants and Agreements 1 2011-01-01 2011-01-01 false Relationship to previous issuance. 225.45 Section 225.45 Grants and Agreements Office of Management and Budget Guidance for Grants and Agreements... INDIAN TRIBAL GOVERNMENTS (OMB CIRCULAR A-87) § 225.45 Relationship to previous issuance. (a) The...

  7. 2 CFR 230.45 - Relationship to previous issuance.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 2 Grants and Agreements 1 2010-01-01 2010-01-01 false Relationship to previous issuance. 230.45 Section 230.45 Grants and Agreements OFFICE OF MANAGEMENT AND BUDGET CIRCULARS AND GUIDANCE Reserved COST PRINCIPLES FOR NON-PROFIT ORGANIZATIONS (OMB CIRCULAR A-122) § 230.45 Relationship to previous issuance. (a...

  8. A robust, sensitive assay for genomic uracil determination by LC/MS/MS reveals lower levels than previously reported.

    PubMed

    Galashevskaya, Anastasia; Sarno, Antonio; Vågbø, Cathrine B; Aas, Per A; Hagen, Lars; Slupphaug, Geir; Krokan, Hans E

    2013-09-01

    Considerable progress has been made in understanding the origins of genomic uracil and its role in genome stability and host defense; however, the main question concerning the basal level of uracil in DNA remains disputed. Results from assays designed to quantify genomic uracil vary by almost three orders of magnitude. To address the issues leading to this inconsistency, we explored possible shortcomings with existing methods and developed a sensitive LC/MS/MS-based method for the absolute quantification of genomic 2'-deoxyuridine (dUrd). To this end, DNA was enzymatically hydrolyzed to 2'-deoxyribonucleosides and dUrd was purified in a preparative HPLC step and analyzed by LC/MS/MS. The standard curve was linear over four orders of magnitude with a quantification limit of 5 fmol dUrd. Control samples demonstrated high inter-experimental accuracy (94.3%) and precision (CV 9.7%). An alternative method that employed UNG2 to excise uracil from DNA for LC/MS/MS analysis gave similar results, but the intra-assay variability was significantly greater. We quantified genomic dUrd in Ung(+/+) and Ung(-/-) mouse embryonic fibroblasts and human lymphoblastoid cell lines carrying UNG mutations. DNA-dUrd is 5-fold higher in Ung(-/-) than in Ung(+/+) fibroblasts and 11-fold higher in UNG2 dysfunctional than in UNG2 functional lymphoblastoid cells. We report approximately 400-600 dUrd per human or murine genome in repair-proficient cells, which is lower than results using other methods and suggests that genomic uracil levels may have previously been overestimated. Copyright © 2013 The Authors. Published by Elsevier B.V. All rights reserved.

  9. 2 CFR 1.215 - Relationship to previous issuances.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 2 Grants and Agreements 1 2013-01-01 2013-01-01 false Relationship to previous issuances. 1.215... ABOUT TITLE 2 OF THE CODE OF FEDERAL REGULATIONS AND SUBTITLE A Introduction to Subtitle A § 1.215... title 2 of the CFR. Specifically: Guidance in * * * On * * * Previously was in * * * (a) Chapter I, part...

  10. 2 CFR 1.215 - Relationship to previous issuances.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 2 Grants and Agreements 1 2014-01-01 2014-01-01 false Relationship to previous issuances. 1.215... ABOUT TITLE 2 OF THE CODE OF FEDERAL REGULATIONS AND SUBTITLE A Introduction to Subtitle A § 1.215... title 2 of the CFR. Specifically: Guidance in * * * On * * * Previously was in * * * (a) Chapter I, part...

  11. 2 CFR 1.215 - Relationship to previous issuances.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 2 Grants and Agreements 1 2012-01-01 2012-01-01 false Relationship to previous issuances. 1.215... ABOUT TITLE 2 OF THE CODE OF FEDERAL REGULATIONS AND SUBTITLE A Introduction to Subtitle A § 1.215... title 2 of the CFR. Specifically: Guidance in * * * On * * * Previously was in * * * (a) Chapter I, part...

  12. 2 CFR 220.40 - Relationship to previous issuance.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 2 Grants and Agreements 1 2011-01-01 2011-01-01 false Relationship to previous issuance. 220.40 Section 220.40 Grants and Agreements Office of Management and Budget Guidance for Grants and Agreements... INSTITUTIONS (OMB CIRCULAR A-21) § 220.40 Relationship to previous issuance. (a) The guidance in this part...

  13. 2 CFR 230.45 - Relationship to previous issuance.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 2 Grants and Agreements 1 2011-01-01 2011-01-01 false Relationship to previous issuance. 230.45 Section 230.45 Grants and Agreements Office of Management and Budget Guidance for Grants and Agreements... ORGANIZATIONS (OMB CIRCULAR A-122) § 230.45 Relationship to previous issuance. (a) The guidance in this part...

  14. 2 CFR 1.215 - Relationship to previous issuances.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 2 Grants and Agreements 1 2010-01-01 2010-01-01 false Relationship to previous issuances. 1.215 Section 1.215 Grants and Agreements ABOUT TITLE 2 OF THE CODE OF FEDERAL REGULATIONS AND SUBTITLE A Introduction to Subtitle A § 1.215 Relationship to previous issuances. Although some of the guidance was...

  15. Interpreting Quantifier Scope Ambiguity: Evidence of Heuristic First, Algorithmic Second Processing

    PubMed Central

    Dwivedi, Veena D.

    2013-01-01

    The present work suggests that sentence processing requires both heuristic and algorithmic processing streams, where the heuristic processing strategy precedes the algorithmic phase. This conclusion is based on three self-paced reading experiments in which the processing of two-sentence discourses was investigated, where context sentences exhibited quantifier scope ambiguity. Experiment 1 demonstrates that such sentences are processed in a shallow manner. Experiment 2 uses the same stimuli as Experiment 1 but adds questions to ensure deeper processing. Results indicate that reading times are consistent with a lexical-pragmatic interpretation of number associated with context sentences, but responses to questions are consistent with the algorithmic computation of quantifier scope. Experiment 3 shows the same pattern of results as Experiment 2, despite using stimuli with different lexical-pragmatic biases. These effects suggest that language processing can be superficial, and that deeper processing, which is sensitive to structure, only occurs if required. Implications for recent studies of quantifier scope ambiguity are discussed. PMID:24278439

  16. Quantifying soil respiration at landscape scales. Chapter 11

    Treesearch

    John B. Bradford; Michael G. Ryan

    2008-01-01

    Soil CO2 efflux, or soil respiration, represents a substantial component of carbon cycling in terrestrial ecosystems. Consequently, quantifying soil respiration over large areas and long time periods is an increasingly important goal. However, soil respiration rates vary dramatically in space and time in response to both environmental conditions...

  17. Dissociation of quantifiers and object nouns in speech in focal neurodegenerative disease.

    PubMed

    Ash, Sharon; Ternes, Kylie; Bisbing, Teagan; Min, Nam Eun; Moran, Eileen; York, Collin; McMillan, Corey T; Irwin, David J; Grossman, Murray

    2016-08-01

    Quantifiers such as many and some are thought to depend in part on the conceptual representation of number knowledge, while object nouns such as cookie and boy appear to depend in part on visual feature knowledge associated with object concepts. Further, number knowledge is associated with a frontal-parietal network while object knowledge is related in part to anterior and ventral portions of the temporal lobe. We examined the cognitive and anatomic basis for the spontaneous speech production of quantifiers and object nouns in non-aphasic patients with focal neurodegenerative disease associated with corticobasal syndrome (CBS, n=33), behavioral variant frontotemporal degeneration (bvFTD, n=54), and semantic variant primary progressive aphasia (svPPA, n=19). We recorded a semi-structured speech sample elicited from patients and healthy seniors (n=27) during description of the Cookie Theft scene. We observed a dissociation: CBS and bvFTD were significantly impaired in the production of quantifiers but not object nouns, while svPPA were significantly impaired in the production of object nouns but not quantifiers. MRI analysis revealed that quantifier production deficits in CBS and bvFTD were associated with disease in a frontal-parietal network important for number knowledge, while impaired production of object nouns in all patient groups was related to disease in inferior temporal regions important for representations of visual feature knowledge of objects. These findings imply that partially dissociable representations in semantic memory may underlie different segments of the lexicon. Copyright © 2016 Elsevier Ltd. All rights reserved.

  18. Quantifying microwear on experimental Mistassini quartzite scrapers: preliminary results of exploratory research using LSCM and scale-sensitive fractal analysis.

    PubMed

    Stemp, W James; Lerner, Harry J; Kristant, Elaine H

    2013-01-01

    Although previous use-wear studies involving quartz and quartzite have been undertaken by archaeologists, these are comparatively few in number. Moreover, there has been relatively little effort to quantify use-wear on stone tools made from quartzite. The purpose of this article is to determine the effectiveness of a measurement system, laser scanning confocal microscopy (LSCM), to document the surface roughness or texture of experimental Mistassini quartzite scrapers used on two different contact materials (fresh and dry deer hide). As in previous studies using LSCM on chert, flint, and obsidian, this exploratory study incorporates a mathematical algorithm that permits the discrimination of surface roughness based on comparisons at multiple scales. Specifically, we employ measures of relative area (RelA) coupled with the F-test to discriminate used from unused stone tool surfaces, as well as surfaces of quartzite scrapers used on dry and fresh deer hide. Our results further demonstrate the effect of raw material variation on use-wear formation and its documentation using LSCM and RelA. © Wiley Periodicals, Inc.

  19. Quantifying Proportional Variability

    PubMed Central

    Heath, Joel P.; Borowski, Peter

    2013-01-01

    Real quantities can undergo such a wide variety of dynamics that the mean is often a meaningless reference point for measuring variability. Despite their widespread application, techniques like the Coefficient of Variation are not truly proportional and exhibit pathological properties. The non-parametric measure Proportional Variability (PV) [1] resolves these issues and provides a robust way to summarize and compare variation in quantities exhibiting diverse dynamical behaviour. Instead of being based on deviation from an average value, variation is simply quantified by comparing the numbers to each other, requiring no assumptions about central tendency or underlying statistical distributions. While PV has been introduced before and has already been applied in various contexts to population dynamics, here we present a deeper analysis of this new measure, derive analytical expressions for the PV of several general distributions and present new comparisons with the Coefficient of Variation, demonstrating cases in which PV is the more favorable measure. We show that PV provides an easily interpretable approach for measuring and comparing variation that can be applied generally throughout the sciences, in contexts ranging from stock market stability to climate variation. PMID:24386334
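
    The PV measure above is commonly computed as the average, over all pairs of observations, of one minus the ratio of the smaller to the larger value. The short Python sketch below implements that pairwise definition under the assumption of strictly positive data; it is a reading of the published definition, not the authors' code.

        import numpy as np
        from itertools import combinations

        def proportional_variability(z):
            """PV: mean over all pairs (z_i, z_j) of 1 - min/max; assumes z > 0."""
            z = np.asarray(z, dtype=float)
            d = [1.0 - min(a, b) / max(a, b) for a, b in combinations(z, 2)]
            return float(np.mean(d))

        # PV is 0 for a constant series and grows toward 1 for highly uneven series
        print(proportional_variability([10, 10, 10]))    # 0.0
        print(proportional_variability([1, 10, 100]))    # close to 1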

  20. 2 CFR 1.215 - Relationship to previous issuances.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 2 Grants and Agreements 1 2011-01-01 2011-01-01 false Relationship to previous issuances. 1.215... ABOUT TITLE 2 OF THE CODE OF FEDERAL REGULATIONS AND SUBTITLE A Introduction to Subtitle A § 1.215... title 2 of the CFR. Specifically: Guidance in * * * On * * * Previously was in * * * (a) Chapter I, part...

  1. 44 CFR 402.5 - Forwarding commodities previously shipped.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 44 Emergency Management and Assistance 1 2010-10-01 2010-10-01 false Forwarding commodities previously shipped. 402.5 Section 402.5 Emergency Management and Assistance DEPARTMENT OF COMMERCE AND DEPARTMENT OF TRANSPORTATION SHIPMENTS ON AMERICAN FLAG SHIPS AND AIRCRAFT (T-1, INT. 1) § 402.5 Forwarding commodities previously shipped. Order T-1...

  2. Quantifying the tibiofemoral joint space using x-ray tomosynthesis.

    PubMed

    Kalinosky, Benjamin; Sabol, John M; Piacsek, Kelly; Heckel, Beth; Gilat Schmidt, Taly

    2011-12-01

    Digital x-ray tomosynthesis (DTS) has the potential to provide 3D information about the knee joint in a load-bearing posture, which may improve diagnosis and monitoring of knee osteoarthritis compared with projection radiography, the current standard of care. Manually quantifying and visualizing the joint space width (JSW) from 3D tomosynthesis datasets may be challenging. This work developed a semiautomated algorithm for quantifying the 3D tibiofemoral JSW from reconstructed DTS images. The algorithm was validated through anthropomorphic phantom experiments and applied to three clinical datasets. A user-selected volume of interest within the reconstructed DTS volume was enhanced with 1D multiscale gradient kernels. The edge-enhanced volumes were divided by polarity into tibial and femoral edge maps and combined across kernel scales. A 2D connected components algorithm was performed to determine candidate tibial and femoral edges. A 2D JSW map was constructed to represent the 3D tibiofemoral joint space. To quantify the algorithm accuracy, an adjustable knee phantom was constructed, and eleven posterior-anterior (PA) and lateral DTS scans were acquired with the medial minimum JSW of the phantom set to 0-5 mm in 0.5 mm increments (VolumeRad™, GE Healthcare, Chalfont St. Giles, United Kingdom). The accuracy of the algorithm was quantified by comparing the minimum JSW in a region of interest in the medial compartment of the JSW map to the measured phantom setting for each trial. In addition, the algorithm was applied to DTS scans of a static knee phantom and the JSW map was compared to values estimated from a manually segmented computed tomography (CT) dataset. The algorithm was also applied to three clinical DTS datasets of osteoarthritic patients. The algorithm segmented the JSW and generated a JSW map for all phantom and clinical datasets. For the adjustable phantom, the estimated minimum JSW values were plotted against the measured values for all
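
    The following toy sketch illustrates only the last step of a pipeline like the one described above: turning detected femoral and tibial edge heights into a 2D JSW map and extracting the minimum JSW inside a medial region of interest. The arrays, grid dimensions, and millimetre values are invented for illustration and do not reproduce the published algorithm.

        import numpy as np

        def jsw_map(femur_z, tibia_z):
            """2-D joint-space-width map: element-wise gap (mm) between the detected
            femoral edge height and tibial edge height at each (x, y) column."""
            return femur_z - tibia_z

        def minimum_jsw(jsw, roi):
            """Minimum JSW inside a boolean region-of-interest mask, ignoring NaNs."""
            return float(np.nanmin(jsw[roi]))

        # Synthetic 100 x 100 grid with the medial half narrowed to a 3 mm gap
        femur = np.full((100, 100), 8.0)
        tibia = np.full((100, 100), 3.0)
        tibia[:, :50] += 2.0                     # medial columns: gap = 8 - 5 = 3 mm
        roi = np.zeros((100, 100), dtype=bool)
        roi[:, :50] = True
        print(minimum_jsw(jsw_map(femur, tibia), roi))   # 3.0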

  3. User guide : process for quantifying the benefits of research.

    DOT National Transportation Integrated Search

    2017-07-01

    The Minnesota Department of Transportation Research Services has adopted a process for quantifying the monetary benefits of research projects, such as the dollar value of particular ideas when implemented across the state's transportation system. T...

  4. Flexible Electronics-Based Transformers for Extreme Environments

    NASA Technical Reports Server (NTRS)

    Quadrelli, Marco B.; Stoica, Adrian; Ingham, Michel; Thakur, Anubhav

    2015-01-01

    This paper provides a survey of the use of modular multifunctional systems, called Flexible Transformers, to facilitate the exploration of extreme and previously inaccessible environments. A novel dynamics and control model of a modular algorithm for assembly, folding, and unfolding of these innovative structural systems is also described, together with the control model and the simulation results.

  5. Quantifying Structural and Compositional Changes in Forest Cover in NW Yunnan, China

    NASA Astrophysics Data System (ADS)

    Hakkenberg, C.

    2012-12-01

    NW Yunnan, China is a region renowned for high levels of biodiversity, endemism and genetically distinct refugial plant populations. It is also a focal area for China's national reforestation efforts like the Natural Forest Protection Program (NFPP), intended to control erosion in the Upper Yangtze watershed. As part of a larger project to investigate the role of reforestation programs in facilitating the emergence of increasingly species-rich forest communities on a previously degraded and depauperate land mosaic in montane SW China, this study uses a series of Landsat TM images to quantify the spatial pattern and rate of structural and compositional change in forests recovering from medium to large-scale disturbances in the area over the past 25 years. Beyond the fundamental need to assess the outcomes of one of the world's largest reforestation programs, this research offers approaches to confronting two critical methodological issues: (1) techniques for characterizing subtle changes in the nature of vegetation cover, and (2) reducing change detection uncertainty due to persistent cloud cover and shadow. To address difficulties in accurately assessing the structure and composition of vegetative regrowth, a biophysical model was parameterized with over 300 ground-truthed canopy cover assessment points to determine pattern and rate of long-term vegetation changes. To combat pervasive shadow and cloud cover, an interactive generalized additive model (GAM) based on topographic and spatial predictors was used to overcome some of the constraints of satellite image analysis in Himalayan regions characterized by extreme topography and extensive cloud cover during the summer monsoon. The change detection is assessed for accuracy using ground-truthed observations in a variety of forest cover types and topographic positions. Results indicate effectiveness in reducing the areal extent of unclassified regions and increasing total change detection accuracy. In addition

  6. Quantifying rock mass strength degradation in coastal rock cliffs

    NASA Astrophysics Data System (ADS)

    Brain, Matthew; Lim, Michael; Rosser, Nick; Petley, David; Norman, Emma; Barlow, John

    2010-05-01

    Although rock cliffs are generally perceived to evolve through undercutting and cantilever collapse of material, the recent application of high-resolution three-dimensional monitoring techniques has suggested that the volumetric losses recorded from layers above the intertidal zone produce an equally significant contribution to cliff behaviour. It is therefore important to understand the controls on rockfalls in such layers. Here we investigate the progressive influence of subaerial exposure and weathering on the geotechnical properties of the rocks encountered within the geologically complex coastal cliffs of the northeast coast of England, UK. Through a program of continuous in situ monitoring of local environmental and tidal conditions and laboratory rock strength testing, we aim to better quantify the relationships between environmental processes and the geotechnical response of the cliff materials. We have cut fresh (not previously exposed) samples from the three main rock types (sandstone, mudstone and shale) found within the cliff to uniform size, shape and volume, thus minimizing variability and removing previous surface weathering effects. In order to characterise the intact strength of the rocks, we have undertaken unconfined compressive strength and triaxial strength tests using high pressure (400 kN maximum axial load; 64 MPa maximum cell pressure) triaxial testing apparatus. The results outline the peak strength characteristics of the unweathered materials. We then divided the samples of each lithology into different experimental groups. The first set of samples remained in the laboratory at constant temperature and humidity; this group provides our control. Samples from each of the three rock types were located at heights on the cliff face corresponding with the different lithologies: at the base (mudstone), in the mid cliff (shale) and at the top of the cliff (sandstone). This subjected them to the same conditions experienced by the in situ cliff

  7. Quantifying protein-protein interactions in high throughput using protein domain microarrays.

    PubMed

    Kaushansky, Alexis; Allen, John E; Gordus, Andrew; Stiffler, Michael A; Karp, Ethan S; Chang, Bryan H; MacBeath, Gavin

    2010-04-01

    Protein microarrays provide an efficient way to identify and quantify protein-protein interactions in high throughput. One drawback of this technique is that proteins show a broad range of physicochemical properties and are often difficult to produce recombinantly. To circumvent these problems, we have focused on families of protein interaction domains. Here we provide protocols for constructing microarrays of protein interaction domains in individual wells of 96-well microtiter plates, and for quantifying domain-peptide interactions in high throughput using fluorescently labeled synthetic peptides. As specific examples, we will describe the construction of microarrays of virtually every human Src homology 2 (SH2) and phosphotyrosine binding (PTB) domain, as well as microarrays of mouse PDZ domains, all produced recombinantly in Escherichia coli. For domains that mediate high-affinity interactions, such as SH2 and PTB domains, equilibrium dissociation constants (K_D values) for their peptide ligands can be measured directly on arrays by obtaining saturation binding curves. For weaker binding domains, such as PDZ domains, arrays are best used to identify candidate interactions, which are then retested and quantified by fluorescence polarization. Overall, protein domain microarrays provide the ability to rapidly identify and quantify protein-ligand interactions with minimal sample consumption. Because entire domain families can be interrogated simultaneously, they provide a powerful way to assess binding selectivity on a proteome-wide scale and provide an unbiased perspective on the connectivity of protein-protein interaction networks.
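
    For the saturation-binding step mentioned above, an equilibrium dissociation constant can be estimated by fitting a one-site binding curve to spot intensities measured across a peptide titration. The sketch below is a generic SciPy curve fit; the concentrations and signals are invented, and the simple one-site model is an assumption rather than the authors' exact fitting procedure.

        import numpy as np
        from scipy.optimize import curve_fit

        def one_site_binding(conc, f_max, kd):
            """Saturation binding: signal = f_max * [L] / (Kd + [L])."""
            return f_max * conc / (kd + conc)

        # Hypothetical titration of one SH2 domain spot with a labeled phosphopeptide
        conc = np.array([0.01, 0.03, 0.1, 0.3, 1.0, 3.0, 10.0])        # peptide (uM)
        signal = np.array([0.03, 0.09, 0.27, 0.70, 1.48, 2.28, 2.70])  # arbitrary units

        (f_max, kd), cov = curve_fit(one_site_binding, conc, signal, p0=(3.0, 1.0))
        print(f"Kd approximately {kd:.2f} uM")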

  8. Spatial dynamics of ecosystem service flows: a comprehensive approach to quantifying actual services

    USGS Publications Warehouse

    Bagstad, Kenneth J.; Johnson, Gary W.; Voigt, Brian; Villa, Ferdinando

    2013-01-01

    Recent ecosystem services research has highlighted the importance of spatial connectivity between ecosystems and their beneficiaries. Despite this need, a systematic approach to ecosystem service flow quantification has not yet emerged. In this article, we present such an approach, which we formalize as a class of agent-based models termed “Service Path Attribution Networks” (SPANs). These models, developed as part of the Artificial Intelligence for Ecosystem Services (ARIES) project, expand on ecosystem services classification terminology introduced by other authors. Conceptual elements needed to support flow modeling include a service's rivalness, its flow routing type (e.g., through hydrologic or transportation networks, lines of sight, or other approaches), and whether the benefit is supplied by an ecosystem's provision of a beneficial flow to people or by absorption of a detrimental flow before it reaches them. We describe our implementation of the SPAN framework for five ecosystem services and discuss how to generalize the approach to additional services. SPAN model outputs include maps of ecosystem service provision, use, depletion, and flows under theoretical, possible, actual, inaccessible, and blocked conditions. We highlight how these different ecosystem service flow maps could be used to support various types of decision making for conservation and resource management planning.

  9. Quantifying South East Asia's forest degradation using latest generation optical and radar satellite remote sensing

    NASA Astrophysics Data System (ADS)

    Broich, M.; Tulbure, M. G.; Wijaya, A.; Weisse, M.; Stolle, F.

    2017-12-01

    Deforestation and forest degradation form the 2nd largest source of anthropogenic CO2 emissions. While deforestation is being globally mapped with satellite image time series, degradation remains insufficiently quantified. Previous studies quantified degradation for small scale, local sites. A method suitable for accurate mapping across large areas has not yet been developed due to the variability of the low magnitude and short-lived degradation signal and the absence of data with suitable resolution properties. Here we use a combination of newly available streams of free optical and radar image time series acquired by NASA and ESA, and HPC-based data science algorithms to innovatively quantify degradation consistently across Southeast Asia (SEA). We used Sentinel1 c-band radar data and NASA's new Harmonized Landsat8 (L8) Sentinel2 (S2) product (HLS) for cloud free optical images. Our results show that dense time series of cloud penetrating Sentinel 1 c-band radar can provide degradation alarm flags, while the HLS product of cloud-free optical images can unambiguously confirm degradation alarms. The detectability of degradation differed across SEA. In the seasonal forest of continental SEA the reliability of our radar-based alarm flags increased as the variability in landscape moisture decreases in the dry season. We reliably confirmed alarms with optical image time series during the late dry season, where degradation in open canopy forests becomes detectable once the undergrowth vegetation has died down. Conversely, in insular SEA landscape moisture is low, the radar time series generated degradation alarms flags with moderate to high reliability throughout the year, further confirmed with the HLS product. Based on the HLS product we can now confirm degradation within < 6 months on average as opposed to 1 year when using either L8 or S2 alone. In contrast to continental SEA, across insular SEA our degradation maps are not suitable to provide annual maps of total

  10. Clinical methods to quantify trunk mobility in an elite male surfing population.

    PubMed

    Furness, James; Climstein, Mike; Sheppard, Jeremy M; Abbott, Allan; Hing, Wayne

    2016-05-01

    Thoracic mobility in the sagittal and horizontal planes are key requirements in the sport of surfing; however to date the normal values of these movements have not yet been quantified in a surfing population. To develop a reliable method to quantify thoracic mobility in the sagittal plane; to assess the reliability of an existing thoracic rotation method, and quantify thoracic mobility in an elite male surfing population. Clinical Measurement, reliability and comparative study. A total of 30 subjects were used to determine the reliability component. 15 elite surfers were used as part of a comparative analysis with age and gender matched controls. Intraclass correlation coefficient values ranged between 0.95-0.99 (95% CI; 0.89-0.99) for both thoracic methods. The elite surfing group had significantly (p ≤ 0.05) greater rotation than the comparative group (mean rotation 63.57° versus 40.80°, respectively). This study has illustrated reliable methods to assess the thoracic spine in the sagittal plane and thoracic rotation. It has also quantified ROM in a surfing cohort; identifying thoracic rotation as a key movement. This information may provide clinicians, coaches and athletic trainers with imperative information regarding the importance of maintaining adequate thoracic rotation. Copyright © 2015 Elsevier Ltd. All rights reserved.

  11. NREL, Johns Hopkins SAIS Develop Method to Quantify Life Cycle Land Use of

    Science.gov Websites

    News Release, October 2, 2017: NREL, Johns Hopkins SAIS Develop Method to Quantify Life Cycle Land Use of Electricity from Natural Gas. A case study provides quantifiable information on the life cycle land use of generating electricity from

  12. Quantifying chemical reactions by using mixing analysis.

    PubMed

    Jurado, Anna; Vázquez-Suñé, Enric; Carrera, Jesús; Tubau, Isabel; Pujades, Estanislao

    2015-01-01

    This work is motivated by a sound understanding of the chemical processes that affect the organic pollutants in an urban aquifer. We propose an approach to quantify such processes using mixing calculations. The methodology consists of the following steps: (1) identification of the recharge sources (end-members) and selection of the species (conservative and non-conservative) to be used, (2) identification of the chemical processes and (3) evaluation of mixing ratios including the chemical processes. This methodology has been applied in the Besòs River Delta (NE Barcelona, Spain), where the River Besòs is the main aquifer recharge source. A total number of 51 groundwater samples were collected from July 2007 to May 2010 during four field campaigns. Three river end-members were necessary to explain the temporal variability of the River Besòs: one river end-member is from the wet periods (W1) and two are from dry periods (D1 and D2). This methodology has proved to be useful not only to compute the mixing ratios but also to quantify processes such as calcite and magnesite dissolution, aerobic respiration and denitrification undergone at each observation point. Copyright © 2014 Elsevier B.V. All rights reserved.
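
    As an illustration of the mixing-ratio step, the sketch below estimates nonnegative end-member contributions that best reproduce a sample's conservative-species concentrations, with a heavily weighted row of ones pushing the ratios to sum to roughly one. The chloride and sulfate concentrations and the three end-member columns are hypothetical, and this constrained least-squares formulation is a generic approach, not the exact evaluation used in the study.

        import numpy as np
        from scipy.optimize import nnls

        def mixing_ratios(end_members, sample, weight=100.0):
            """Nonnegative end-member mixing ratios for one sample.

            end_members: (n_species, n_end_members) conservative-species concentrations;
            sample: (n_species,) observed concentrations. A weighted row of ones pushes
            the ratios to sum to ~1."""
            A = np.vstack([end_members, weight * np.ones(end_members.shape[1])])
            b = np.concatenate([sample, [weight]])
            ratios, _ = nnls(A, b)
            return ratios

        # Hypothetical Cl- and SO4 (mg/L) for wet (W1) and dry (D1, D2) river end-members
        ends = np.array([[120.0, 260.0, 340.0],
                         [60.0, 150.0, 210.0]])
        obs = np.array([250.0, 145.0])
        print(mixing_ratios(ends, obs))   # approximate contributions of W1, D1, D2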

  13. Quantifying spatial distribution of spurious mixing in ocean models.

    PubMed

    Ilıcak, Mehmet

    2016-12-01

    Numerical mixing is inevitable in ocean models due to tracer advection schemes. Until now, there has been no robust way to identify the regions of spurious mixing in ocean models. We propose a new method to compute the spatial distribution of spurious diapycnal mixing in an ocean model. This new method is an extension of the available potential energy density method proposed by Winters and Barkan (2013). We test the new method in lock-exchange and baroclinic eddies test cases and can quantify both the amount and the location of numerical mixing. We find that high-shear areas are the main regions susceptible to numerical truncation errors. We also use the new method to quantify the numerical mixing in different horizontal momentum closures. We conclude that the Smagorinsky viscosity has less numerical mixing than the Leith viscosity using the same non-dimensional constant.

  14. Matrix Dissolution Techniques Applied to Extract and Quantify Precipitates from a Microalloyed Steel

    NASA Astrophysics Data System (ADS)

    Lu, Junfang; Wiskel, J. Barry; Omotoso, Oladipo; Henein, Hani; Ivey, Douglas G.

    2011-07-01

    Microalloyed steels possess good strength and toughness, as well as excellent weldability; these attributes are necessary for oil and gas pipelines in northern climates. These properties are attributed in part to the presence of nanosized carbide and carbonitride precipitates. To understand the strengthening mechanisms and to optimize the strengthening effects, it is necessary to quantify the size distribution, volume fraction, and chemical speciation of these precipitates. However, characterization techniques suitable for quantifying fine precipitates are limited because of their fine sizes, wide particle size distributions, and low volume fractions. In this article, two matrix dissolution techniques have been developed to extract precipitates from a Grade 100 (yield strength of 690 MPa) microalloyed steel. Relatively large volumes of material can be analyzed, and statistically significant quantities of precipitates of different sizes are collected. Transmission electron microscopy (TEM) and X-ray diffraction (XRD) are combined to analyze the chemical speciation of these precipitates. Rietveld refinement of XRD patterns is used to quantify fully the relative amounts of the precipitates. The size distribution of the nanosized precipitates is quantified using dark-field imaging in the TEM.

  15. A new paradigm of quantifying ecosystem stress through chemical signatures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kravitz, Ben; Guenther, Alex B.; Gu, Lianhong

    Stress-induced emissions of biogenic volatile organic compounds (VOCs) from terrestrial ecosystems may be one of the dominant sources of VOC emissions world-wide. Understanding the ecosystem stress response could reveal how ecosystems will respond and adapt to climate change and, in turn, quantify changes in the atmospheric burden of VOC oxidants and secondary organic aerosols. Here we argue, based on preliminary evidence from several opportunistic measurement sources, that chemical signatures of stress can be identified and quantified at the ecosystem scale. We also outline future endeavors that we see as next steps toward uncovering quantitative signatures of stress, including new advances in both VOC data collection and analysis of "big data."

  16. Quantifying Anthropogenic Dust Emissions

    NASA Astrophysics Data System (ADS)

    Webb, Nicholas P.; Pierre, Caroline

    2018-02-01

    Anthropogenic land use and land cover change, including local environmental disturbances, moderate rates of wind-driven soil erosion and dust emission. These human-dust cycle interactions impact ecosystems and agricultural production, air quality, human health, biogeochemical cycles, and climate. While the impacts of land use activities and land management on aeolian processes can be profound, the interactions are often complex and assessments of anthropogenic dust loads at all scales remain highly uncertain. Here, we critically review the drivers of anthropogenic dust emission and current evaluation approaches. We then identify and describe opportunities to: (1) develop new conceptual frameworks and interdisciplinary approaches that draw on ecological state-and-transition models to improve the accuracy and relevance of assessments of anthropogenic dust emissions; (2) improve model fidelity and capacity for change detection to quantify anthropogenic impacts on aeolian processes; and (3) enhance field research and monitoring networks to support dust model applications to evaluate the impacts of disturbance processes on local to global-scale wind erosion and dust emissions.

  17. Quantifying three dimensional reconnection in fragmented current layers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wyper, P. F., E-mail: peter.f.wyper@nasa.gov; Hesse, M., E-mail: michael.hesse-1@nasa.gov

    There is growing evidence that when magnetic reconnection occurs in high Lundquist number plasmas such as in the Solar Corona or the Earth's Magnetosphere, it does so within a fragmented, rather than a smooth, current layer. Within the extent of these fragmented current regions, the associated magnetic flux transfer and energy release occur simultaneously in many different places. This investigation focusses on how best to quantify the rate at which reconnection occurs in such layers. An analytical theory is developed which describes the manner in which new connections form within fragmented current layers in the absence of magnetic nulls. It is shown that the collective rate at which new connections form can be characterized by two measures: a total rate, which measures the true rate at which new connections are formed, and a net rate, which measures the net change of connection associated with the largest value of the integral of E∥ through all of the non-ideal regions. Two simple analytical models are presented which demonstrate how each should be applied and what they quantify.

  18. Quantifying prosthetic gait deviation using simple outcome measures

    PubMed Central

    Kark, Lauren; Odell, Ross; McIntosh, Andrew S; Simmons, Anne

    2016-01-01

    AIM: To develop a subset of simple outcome measures to quantify prosthetic gait deviation without needing three-dimensional gait analysis (3DGA). METHODS: Eight unilateral, transfemoral amputees and 12 unilateral, transtibial amputees were recruited. Twenty-eight able-bodied controls were recruited. All participants underwent 3DGA, the timed-up-and-go test and the six-minute walk test (6MWT). The lower-limb amputees also completed the Prosthesis Evaluation Questionnaire. Results from 3DGA were summarised using the gait deviation index (GDI), which was subsequently regressed, using stepwise regression, against the other measures. RESULTS: Step-length (SL), self-selected walking speed (SSWS) and the distance walked during the 6MWT (6MWD) were significantly correlated with GDI. The 6MWD was the strongest single predictor of the GDI, followed by SL and SSWS. The predictive ability of the regression equations was improved following inclusion of self-report data related to mobility and prosthetic utility. CONCLUSION: This study offers a practicable alternative to quantifying kinematic deviation without the need to conduct complete 3DGA. PMID:27335814
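
    A minimal version of the regression step reads as follows: regress GDI on the simple measures and use the fitted coefficients to predict gait deviation without full 3DGA. The numbers below are invented for illustration, ordinary least squares stands in for the stepwise procedure, and the variable names are hypothetical.

        import numpy as np

        # Hypothetical per-participant values: 6MWD (m), step length (m), SSWS (m/s), GDI
        six_mwd = np.array([320.0, 410.0, 455.0, 505.0, 560.0, 620.0])
        step_len = np.array([0.48, 0.55, 0.58, 0.62, 0.66, 0.70])
        ssws = np.array([0.85, 1.00, 1.10, 1.18, 1.28, 1.40])
        gdi = np.array([68.0, 75.0, 79.0, 84.0, 88.0, 93.0])

        # Ordinary least squares with an intercept (the study used stepwise selection)
        X = np.column_stack([np.ones_like(gdi), six_mwd, step_len, ssws])
        coef, *_ = np.linalg.lstsq(X, gdi, rcond=None)
        print(coef)       # intercept and weights for 6MWD, SL, SSWS
        print(X @ coef)   # predicted GDI for each participant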

  19. Clinical relevance of quantified fundus autofluorescence in diabetic macular oedema.

    PubMed

    Yoshitake, S; Murakami, T; Uji, A; Unoki, N; Dodo, Y; Horii, T; Yoshimura, N

    2015-05-01

    To quantify the signal intensity of fundus autofluorescence (FAF) and evaluate its association with visual function and optical coherence tomography (OCT) findings in diabetic macular oedema (DMO). We reviewed 103 eyes of 78 patients with DMO and 30 eyes of 22 patients without DMO. FAF images were acquired using Heidelberg Retina Angiograph 2, and the signal levels of FAF in the individual subfields of the Early Treatment Diabetic Retinopathy Study grid were measured. We evaluated the association between quantified FAF and the logMAR VA and OCT findings. One hundred and three eyes with DMO had lower FAF signal intensity levels in the parafoveal subfields compared with 30 eyes without DMO. The autofluorescence intensity in the parafoveal subfields was associated negatively with logMAR VA and the retinal thickness in the corresponding subfields. The autofluorescence levels in the parafoveal subfield, except the nasal subfield, were lower in eyes with autofluorescent cystoid spaces in the corresponding subfield than in those without autofluorescent cystoid spaces. The autofluorescence level in the central subfield was related to foveal cystoid spaces but not logMAR VA or retinal thickness in the corresponding area. Quantified FAF in the parafovea has diagnostic significance and is clinically relevant in DMO.

  20. Clinical relevance of quantified fundus autofluorescence in diabetic macular oedema

    PubMed Central

    Yoshitake, S; Murakami, T; Uji, A; Unoki, N; Dodo, Y; Horii, T; Yoshimura, N

    2015-01-01

    Purpose To quantify the signal intensity of fundus autofluorescence (FAF) and evaluate its association with visual function and optical coherence tomography (OCT) findings in diabetic macular oedema (DMO). Methods We reviewed 103 eyes of 78 patients with DMO and 30 eyes of 22 patients without DMO. FAF images were acquired using Heidelberg Retina Angiograph 2, and the signal levels of FAF in the individual subfields of the Early Treatment Diabetic Retinopathy Study grid were measured. We evaluated the association between quantified FAF and the logMAR VA and OCT findings. Results One hundred and three eyes with DMO had lower FAF signal intensity levels in the parafoveal subfields compared with 30 eyes without DMO. The autofluorescence intensity in the parafoveal subfields was associated negatively with logMAR VA and the retinal thickness in the corresponding subfields. The autofluorescence levels in the parafoveal subfield, except the nasal subfield, were lower in eyes with autofluorescent cystoid spaces in the corresponding subfield than in those without autofluorescent cystoid spaces. The autofluorescence level in the central subfield was related to foveal cystoid spaces but not logMAR VA or retinal thickness in the corresponding area. Conclusions Quantified FAF in the parafovea has diagnostic significance and is clinically relevant in DMO. PMID:25771817

  1. Quantifying uncertainty in climate change science through empirical information theory.

    PubMed

    Majda, Andrew J; Gershgorin, Boris

    2010-08-24

    Quantifying the uncertainty for the present climate and the predictions of climate change in the suite of imperfect Atmosphere Ocean Science (AOS) computer models is a central issue in climate change science. Here, a systematic approach to these issues with firm mathematical underpinning is developed through empirical information theory. An information metric to quantify AOS model errors in the climate is proposed here which incorporates both coarse-grained mean model errors as well as covariance ratios in a transformation invariant fashion. The subtle behavior of model errors with this information metric is quantified in an instructive statistically exactly solvable test model with direct relevance to climate change science including the prototype behavior of tracer gases such as CO2. Formulas for identifying the most sensitive climate change directions using statistics of the present climate or an AOS model approximation are developed here; these formulas just involve finding the eigenvector associated with the largest eigenvalue of a quadratic form computed through suitable unperturbed climate statistics. These climate change concepts are illustrated on a statistically exactly solvable one-dimensional stochastic model with relevance for low frequency variability of the atmosphere. Viable algorithms for implementation of these concepts are discussed throughout the paper.
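
    Two of the quantities described above can be sketched directly: under a Gaussian approximation, the relative-entropy information metric splits into a signal term from mean errors and a dispersion term from covariance errors, and the most sensitive climate change direction is the leading eigenvector of a symmetric quadratic form. The matrices below are arbitrary examples, and the Gaussian form of the metric is an assumption made for illustration, not a reproduction of the paper's formulas.

        import numpy as np

        def gaussian_relative_entropy(mu, cov, mu_m, cov_m):
            """Relative entropy of N(mu, cov) (truth) from N(mu_m, cov_m) (model):
            a signal term from mean errors plus a dispersion term from covariances."""
            d = len(mu)
            inv_m = np.linalg.inv(cov_m)
            signal = 0.5 * (mu - mu_m) @ inv_m @ (mu - mu_m)
            ratio = cov @ inv_m
            dispersion = 0.5 * (np.trace(ratio) - d - np.log(np.linalg.det(ratio)))
            return signal + dispersion

        def most_sensitive_direction(Q):
            """Eigenvector of the largest eigenvalue of a symmetric quadratic form."""
            eigvals, eigvecs = np.linalg.eigh(Q)
            return eigvecs[:, np.argmax(eigvals)]

        mu_t, cov_t = np.array([0.0, 0.0]), np.array([[1.0, 0.2], [0.2, 1.0]])
        mu_m, cov_m = np.array([0.3, -0.1]), np.array([[1.2, 0.1], [0.1, 0.8]])
        print(gaussian_relative_entropy(mu_t, cov_t, mu_m, cov_m))
        print(most_sensitive_direction(np.array([[2.0, 0.3], [0.3, 1.0]])))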

  2. Systematic Validation of Protein Force Fields against Experimental Data

    PubMed Central

    Eastwood, Michael P.; Dror, Ron O.; Shaw, David E.

    2012-01-01

    Molecular dynamics simulations provide a vehicle for capturing the structures, motions, and interactions of biological macromolecules in full atomic detail. The accuracy of such simulations, however, is critically dependent on the force field—the mathematical model used to approximate the atomic-level forces acting on the simulated molecular system. Here we present a systematic and extensive evaluation of eight different protein force fields based on comparisons of experimental data with molecular dynamics simulations that reach a previously inaccessible timescale. First, through extensive comparisons with experimental NMR data, we examined the force fields' abilities to describe the structure and fluctuations of folded proteins. Second, we quantified potential biases towards different secondary structure types by comparing experimental and simulation data for small peptides that preferentially populate either helical or sheet-like structures. Third, we tested the force fields' abilities to fold two small proteins—one α-helical, the other with β-sheet structure. The results suggest that force fields have improved over time, and that the most recent versions, while not perfect, provide an accurate description of many structural and dynamical properties of proteins. PMID:22384157

  3. Flight Analysis of an Autonomously Navigated Experimental Lander for High Altitude Recovery

    NASA Technical Reports Server (NTRS)

    Chin, Jeffrey; Niehaus, Justin; Goodenow, Debra; Dunker, Storm; Montague, David

    2016-01-01

    First steps have been taken to qualify a family of parafoil systems capable of increasing the survivability and reusability of high-altitude balloon payloads. The research is motivated by the common risk facing balloon payloads where expensive flight hardware can often land in inaccessible areas that make them difficult or impossible to recover. The Autonomously Navigated Experimental Lander (ANGEL) flight test introduced a commercial Guided Parachute Aerial Delivery System (GPADS) to a previously untested environment at 108,000ft MSL to determine its high-altitude survivability and capabilities. Following release, ANGEL descended under a drogue until approximately 25,000ft, at which point the drogue was jettisoned and the main parachute was deployed, commencing navigation. Multiple data acquisition platforms were used to characterize the return-to-point technology performance and help determine its suitability for returning future scientific payloads ranging from 180 to 10,000lbs to safer and more convenient landing locations. This report describes the test vehicle design, and summarizes the captured sensor data. Various post-flight analyses are used to quantify the system's performance, gondola load data, and serve as a reference point for subsequent missions.

  4. Flight Analysis of an Autonomously Navigated Experimental Lander

    NASA Technical Reports Server (NTRS)

    Chin, Jeffrey; Niehaus, Justin; Goodenow, Debra; Dunker, Storm; Montague, David

    2016-01-01

    First steps have been taken to qualify a family of parafoil systems capable of increasing the survivability and reusability of high-altitude balloon payloads. The research is motivated by the common risk facing balloon payloads where expensive flight hardware can often land in inaccessible areas that make them difficult or impossible to recover. The Autonomously Navigated Experimental Lander (ANGEL) flight test introduced a commercial Guided Parachute Aerial Delivery System (GPADS) to a previously untested environment at 108,000 feet Mean Sea Level (MSL) to determine its high-altitude survivability and capabilities. Following release, ANGEL descended under a drogue until approximately 25,000 feet, at which point the drogue was jettisoned and the main parachute was deployed, commencing navigation. Multiple data acquisition platforms were used to characterize the return-to-point technology performance and help determine its suitability for returning future scientific payloads ranging from 180 to 10,000 pounds to safer and more convenient landing locations. This report describes the test vehicle design, and summarizes the captured sensor data. Various post-flight analyses are used to quantify the system's performance, gondola load data, and serve as a reference point for subsequent missions.

  5. Quantifying structural states of soft mudrocks

    NASA Astrophysics Data System (ADS)

    Li, B.; Wong, R. C. K.

    2016-05-01

    In this paper, a cm model is proposed to quantify structural states of soft mudrocks, which are dependent on clay fractions and porosities. Physical properties of natural and reconstituted soft mudrock samples are used to derive two parameters in the cm model. With the cm model, a simplified homogenization approach is proposed to estimate geomechanical properties and fabric orientation distributions of soft mudrocks based on the mixture theory. Soft mudrocks are treated as a mixture of nonclay minerals and clay-water composites. Nonclay minerals have a high stiffness and serve as a structural framework of mudrocks when they have a high volume fraction. Clay-water composites occupy the void space among nonclay minerals and serve as an in-fill matrix. With the increase of volume fraction of clay-water composites, there is a transition in the structural state from the state of framework supported to the state of matrix supported. The decreases in shear strength and pore size as well as increases in compressibility and anisotropy in fabric are quantitatively related to such transition. The new homogenization approach based on the proposed cm model yields better performance evaluation than common effective medium modeling approaches because the interactions among nonclay minerals and clay-water composites are considered. With wireline logging data, the cm model is applied to quantify the structural states of Colorado shale formations at different depths in the Cold Lake area, Alberta, Canada. Key geomechanical parameters are estimated based on the proposed homogenization approach and the critical intervals with low strength shale formations are identified.

  6. Distractor Repetitions Retrieve Previous Responses and Previous Targets: Experimental Dissociations of Distractor-Response and Distractor-Target Bindings

    ERIC Educational Resources Information Center

    Giesen, Carina; Rothermund, Klaus

    2014-01-01

    Even an irrelevant distractor stimulus is integrated into event files. Subsequently repeating the distractor triggers retrieval of the event file; however, an unresolved issue concerns the question of "what" is retrieved by the distractor. While recent studies predominantly assume that the distractor retrieves the previous response, it…

  7. Quantifying density-independent mortality of temperate tree species

    Treesearch

    Heather E Lintz; Andrew N. Gray; Andrew Yost; Richard Sniezko; Chris Woodall; Matt Reilly; Karen Hutten; Mark Elliott

    2016-01-01

    Forest resilience to climate change is a topic of national concern as our standing assets and future forests are important to our livelihood. Many tree species are predicted to decline or disappear while others may be able to adapt or migrate. Efforts to quantify and disseminate the current condition of forests are urgently needed to guide management and policy. Here, we...

  8. [Prevalence of previously diagnosed diabetes mellitus in Mexico].

    PubMed

    Rojas-Martínez, Rosalba; Basto-Abreu, Ana; Aguilar-Salinas, Carlos A; Zárate-Rojas, Emiliano; Villalpando, Salvador; Barrientos-Gutiérrez, Tonatiuh

    2018-01-01

    To compare the prevalence of previously diagnosed diabetes in 2016 with previous national surveys and to describe treatment and its complications. Mexico's national surveys Ensa 2000, Ensanut 2006, 2012 and 2016 were used. For 2016, logistic regression models and measures of central tendency and dispersion were obtained. The prevalence of previously diagnosed diabetes in 2016 was 9.4%. The increase of 2.2% relative to 2012 was not significant and only observed in patients older than 60 years. While preventive measures have increased, the access to medical treatment and lifestyle has not changed. The treatment has been modified, with an increase in insulin and decrease in hypoglycaemic agents. Population aging, lack of screening actions and the increase in diabetes complications will lead to an increase on the burden of disease. Policy measures targeting primary and secondary prevention of diabetes are crucial.

  9. Quantifying Pilot Visual Attention in Low Visibility Terminal Operations

    NASA Technical Reports Server (NTRS)

    Ellis, Kyle K.; Arthur, J. J.; Latorella, Kara A.; Kramer, Lynda J.; Shelton, Kevin J.; Norman, Robert M.; Prinzel, Lawrence J.

    2012-01-01

    Quantifying pilot visual behavior allows researchers to determine not only where a pilot is looking and when, but also enables specific behavioral tracking when these data are coupled with flight technical performance. Remote eye tracking systems have been integrated into simulators at NASA Langley with effectively no impact on the pilot environment. This paper discusses the installation and use of a remote eye tracking system. The data collection techniques from a complex human-in-the-loop (HITL) research experiment are discussed, in particular the data reduction algorithms and logic used to transform raw eye tracking data into quantified visual behavior metrics, and the analysis methods used to interpret visual behavior. The findings suggest superior performance for Head-Up Display (HUD) and improved attentional behavior for Head-Down Display (HDD) implementations of Synthetic Vision System (SVS) technologies for low visibility terminal area operations. Keywords: eye tracking, flight deck, NextGen, human machine interface, aviation
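
    One elementary data-reduction step of the kind referred to above is converting a stream of gaze samples labeled by area of interest (AOI) into dwell-time fractions. The sketch below assumes fixed-rate, time-stamped samples and invented AOI names (HUD, PFD, OTW); it is a hedged illustration, not the NASA data-reduction pipeline.

        import numpy as np

        def dwell_fractions(timestamps, aoi_labels):
            """Fraction of total gaze time spent in each area of interest (AOI)."""
            t = np.asarray(timestamps, dtype=float)
            step = np.median(np.diff(t))
            dt = np.diff(t, append=t[-1] + step)   # duration attributed to each sample
            totals = {}
            for label, d in zip(aoi_labels, dt):
                totals[label] = totals.get(label, 0.0) + d
            grand = sum(totals.values())
            return {k: v / grand for k, v in totals.items()}

        # Hypothetical 1-second excerpt sampled at 10 Hz
        t = np.arange(0.0, 1.0, 0.1)
        labels = ["HUD"] * 6 + ["PFD"] * 3 + ["OTW"]
        print(dwell_fractions(t, labels))   # e.g. {'HUD': 0.6, 'PFD': 0.3, 'OTW': 0.1}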

  10. Quantifying phalangeal curvature: an empirical comparison of alternative methods.

    PubMed

    Stern, J T; Jungers, W L; Susman, R L

    1995-05-01

    It has been generally assumed and theoretically argued that the curvature of finger and toe bones seen in some nonhuman primates is associated with cheiridial use in an arboreal setting. Assessment of such curvature in fossil primates has been used to infer the positional behavior of these animals. Several methods of quantifying curvature of bones have been proposed. The measure most commonly applied to phalanges is that of included angle, but this has come under some criticism. We consider various other approaches for quantifying phalangeal curvature, demonstrating that some are equivalent to use of included angle, but that one--normalized curvature moment arm (NCMA)--represents a true alternative. A comparison of NCMA to included angle, both calculated on manual and pedal proximal phalanges of humans, apes, some monkeys, and the Hadar fossils, revealed that these two different measures of curvature are highly correlated and result in very similar distributional patterns.
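
    The two measures compared above can be sketched geometrically from three shaft landmarks: the included angle follows from the radius of the arc through the landmarks, and a simple normalized curvature moment arm can be taken as the perpendicular deviation of the midshaft point from the end-to-end chord divided by chord length. Treat this NCMA operationalization and the landmark coordinates below as assumptions for illustration, not the paper's exact protocol.

        import numpy as np

        def circle_through(p1, p2, p3):
            """Centre and radius of the circle passing through three 2-D points."""
            (ax, ay), (bx, by), (cx, cy) = p1, p2, p3
            d = 2 * (ax * (by - cy) + bx * (cy - ay) + cx * (ay - by))
            ux = ((ax**2 + ay**2) * (by - cy) + (bx**2 + by**2) * (cy - ay)
                  + (cx**2 + cy**2) * (ay - by)) / d
            uy = ((ax**2 + ay**2) * (cx - bx) + (bx**2 + by**2) * (ax - cx)
                  + (cx**2 + cy**2) * (bx - ax)) / d
            centre = np.array([ux, uy])
            return centre, float(np.linalg.norm(centre - np.array(p1)))

        def included_angle(p_prox, p_mid, p_dist):
            """Included angle (degrees) of the arc through three shaft landmarks."""
            _, r = circle_through(p_prox, p_mid, p_dist)
            chord = np.linalg.norm(np.array(p_dist) - np.array(p_prox))
            return float(np.degrees(2 * np.arcsin(chord / (2 * r))))

        def normalized_curvature_moment_arm(p_prox, p_mid, p_dist):
            """Perpendicular distance of the midshaft landmark from the end-to-end
            chord, divided by chord length (one simple NCMA operationalization)."""
            a, b, m = (np.array(p) for p in (p_prox, p_dist, p_mid))
            chord, v = b - a, m - a
            dist = abs(chord[0] * v[1] - chord[1] * v[0]) / np.linalg.norm(chord)
            return float(dist / np.linalg.norm(chord))

        landmarks = ((0.0, 0.0), (5.0, 1.0), (10.0, 0.0))   # proximal, midshaft, distal
        print(included_angle(*landmarks))                    # about 45 degrees
        print(normalized_curvature_moment_arm(*landmarks))   # 0.1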

  11. Kinematics of the Asal Rift (Djibouti) determined from the deformation of Fieale Volcano.

    PubMed

    De Chabalier, J B; Avouac, J P

    1994-09-16

    Because of its subaerial exposure the Asal rift segment provides an exceptional opportunity to quantify the deformation field of an active rift and assess the contribution of tectonics and volcanism to rifting processes. The present topography of the Asal rift results from the tectonic dismemberment during the last 100,000 years of a large central volcanic edifice that formed astride the rift zone 300,000 to 100,000 years ago. Three-dimensional deformation of this volcano has been quantified from the combined analysis of the topography and geology. The analysis indicates that spreading at 17 to 29 millimeters per year in a N40 degrees +/- 5 degrees E direction accounts for most of the separation between Arabia and Somalia. The small topographic subsidence relative to extension suggests that tectonic thinning of the crust has been balanced by injection and underplating of magmatic material of near crustal density. The methodology developed in this study could also be applied to quantify deformation in relatively inaccessible areas where the main available information is topography or bathymetry.

  12. Quantifying Repetitive Speech in Autism Spectrum Disorders and Language Impairment

    PubMed Central

    van Santen, Jan P. H.; Sproat, Richard W.; Hill, Alison Presmanes

    2013-01-01

    We report on an automatic technique for quantifying two types of repetitive speech: repetitions of what the child says him/herself (self-repeats) and of what is uttered by an interlocutor (echolalia). We apply this technique to a sample of 111 children between the ages of four and eight: 42 typically developing children (TD), 19 children with specific language impairment (SLI), 25 children with autism spectrum disorders (ASD) plus language impairment (ALI), and 25 children with ASD with normal, non-impaired language (ALN). The results indicate robust differences in echolalia between the TD and ASD groups as a whole (ALN + ALI), and between TD and ALN children. There were no significant differences between ALI and SLI children for echolalia or self-repetitions. The results confirm previous findings that children with ASD repeat the language of others more than other populations of children. On the other hand, self-repetition does not appear to be significantly more frequent in ASD, nor does it matter whether the child’s echolalia occurred within one (immediate) or two turns (near-immediate) of the adult’s original utterance. Furthermore, non-significant differences between ALN and SLI, between TD and SLI, and between ALI and TD are suggestive that echolalia may not be specific to ALN or to ASD in general. One important innovation of this work is an objective fully automatic technique for assessing the amount of repetition in a transcript of a child’s utterances. PMID:23661504
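
    A toy version of repetition detection, offered only as a sketch and not the authors' automatic technique, is to flag a child utterance as echolalia when its token-level similarity to an adult utterance in the preceding one or two turns exceeds a threshold. The transcript, speaker labels, and 0.8 threshold below are invented.

        from difflib import SequenceMatcher

        def is_repetition(child_utt, earlier_utt, threshold=0.8):
            """True when token-level similarity between two utterances exceeds threshold."""
            a, b = child_utt.lower().split(), earlier_utt.lower().split()
            return SequenceMatcher(None, a, b).ratio() >= threshold

        def count_echolalia(turns, window=2):
            """Count child utterances repeating an adult utterance made within the
            preceding `window` turns; `turns` is a list of (speaker, text) pairs."""
            count = 0
            for i, (speaker, text) in enumerate(turns):
                if speaker != "CHILD":
                    continue
                recent = turns[max(0, i - window):i]
                if any(sp == "ADULT" and is_repetition(text, utt) for sp, utt in recent):
                    count += 1
            return count

        transcript = [("ADULT", "do you want the red car"),
                      ("CHILD", "want the red car"),
                      ("ADULT", "which one do you like"),
                      ("CHILD", "I like this one")]
        print(count_echolalia(transcript))   # 1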

  13. The Quantified Self (QS) Movement and Some Emerging Opportunities for the Educational Technology Field

    ERIC Educational Resources Information Center

    Lee, Victor R.

    2013-01-01

    The Quantified Self (QS) movement is a growing global effort to use new mobile and wearable technologies to automatically obtain personal data about everyday activities. The social and material infrastructure associated with the Quantified Self (QS) movement provides a number of ideas that educational technologists should consider incorporating…

  14. Mountain torrents: Quantifying vulnerability and assessing uncertainties

    PubMed Central

    Totschnig, Reinhold; Fuchs, Sven

    2013-01-01

    Vulnerability assessment for elements at risk is an important component in the framework of risk assessment. The vulnerability of buildings affected by torrent processes can be quantified by vulnerability functions that express a mathematical relationship between the degree of loss of individual elements at risk and the intensity of the impacting process. Based on data from the Austrian Alps, we extended a vulnerability curve for residential buildings affected by fluvial sediment transport processes to other torrent processes and other building types. With respect to this goal to merge different data based on different processes and building types, several statistical tests were conducted. The calculation of vulnerability functions was based on a nonlinear regression approach applying cumulative distribution functions. The results suggest that there is no need to distinguish between different sediment-laden torrent processes when assessing vulnerability of residential buildings towards torrent processes. The final vulnerability functions were further validated with data from the Italian Alps and different vulnerability functions presented in the literature. This comparison showed the wider applicability of the derived vulnerability functions. The uncertainty inherent to regression functions was quantified by the calculation of confidence bands. The derived vulnerability functions may be applied within the framework of risk management for mountain hazards within the European Alps. The method is transferable to other mountain regions if the input data needed are available. PMID:27087696
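
    As a sketch of the regression step described above, the code below fits a cumulative distribution function to hypothetical pairs of process intensity and observed degree of loss. A Weibull CDF is used here purely as an example of a candidate vulnerability function; the deposit heights, loss values, and starting parameters are invented, and a bootstrap or delta-method step would be needed to obtain the confidence bands mentioned in the abstract.

        import numpy as np
        from scipy.optimize import curve_fit
        from scipy.stats import weibull_min

        def vulnerability(intensity, shape, scale):
            """Degree of loss (0-1) modelled as a Weibull CDF of process intensity."""
            return weibull_min.cdf(intensity, shape, scale=scale)

        # Hypothetical building data: deposit height (m) vs. observed degree of loss
        intensity = np.array([0.3, 0.5, 0.8, 1.0, 1.5, 2.0, 2.5, 3.0])
        loss = np.array([0.02, 0.05, 0.12, 0.20, 0.42, 0.60, 0.75, 0.85])

        (shape, scale), cov = curve_fit(vulnerability, intensity, loss, p0=(2.0, 2.0))
        print(shape, scale)                       # fitted Weibull parameters
        print(vulnerability(1.2, shape, scale))   # predicted loss at 1.2 m intensity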

  15. Information criteria for quantifying loss of reversibility in parallelized KMC

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gourgoulias, Konstantinos, E-mail: gourgoul@math.umass.edu; Katsoulakis, Markos A., E-mail: markos@math.umass.edu; Rey-Bellet, Luc, E-mail: luc@math.umass.edu

    Parallel Kinetic Monte Carlo (KMC) is a potent tool to simulate stochastic particle systems efficiently. However, despite literature on quantifying domain decomposition errors of the particle system for this class of algorithms in the short and in the long time regime, no study yet explores and quantifies the loss of time-reversibility in Parallel KMC. Inspired by concepts from non-equilibrium statistical mechanics, we propose the entropy production per unit time, or entropy production rate, given in terms of an observable and a corresponding estimator, as a metric that quantifies the loss of reversibility. Typically, this is a quantity that cannot be computed explicitly for Parallel KMC, which is why we develop a posteriori estimators that have good scaling properties with respect to the size of the system. Through these estimators, we can connect the different parameters of the scheme, such as the communication time step of the parallelization, the choice of the domain decomposition, and the computational schedule, with its performance in controlling the loss of reversibility. From this point of view, the entropy production rate can be seen both as an information criterion to compare the reversibility of different parallel schemes and as a tool to diagnose reversibility issues with a particular scheme. As a demonstration, we use Sandia Lab's SPPARKS software to compare different parallelization schemes and different domain (lattice) decompositions.
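
    The estimators developed in the paper are specific to Parallel KMC, but the quantity being targeted can be illustrated with a generic plug-in estimate of the entropy production rate for an observed discrete-time Markov chain trajectory (a simplifying assumption, not the authors' a posteriori estimator):

      import numpy as np
      from collections import Counter

      def entropy_production_rate(states):
          """Plug-in estimate of entropy production per step from an observed state
          sequence of a discrete-time Markov chain: zero for a reversible (detailed-
          balance) chain, positive when time-reversibility is lost. Transition pairs
          observed in only one direction are skipped, which biases the estimate low."""
          counts = Counter(zip(states[:-1], states[1:]))   # transition counts C(x, y)
          n_steps = len(states) - 1
          sigma = 0.0
          for (x, y), c_xy in counts.items():
              c_yx = counts.get((y, x), 0)
              if c_yx > 0:
                  sigma += (c_xy / n_steps) * np.log(c_xy / c_yx)
          return sigma

      # tiny example: a biased walk on a 3-state cycle is irreversible
      rng = np.random.default_rng(0)
      biased = list(np.cumsum(rng.choice([1, 1, 1, -1], size=20000)) % 3)
      print(entropy_production_rate(biased))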

  16. Information criteria for quantifying loss of reversibility in parallelized KMC

    NASA Astrophysics Data System (ADS)

    Gourgoulias, Konstantinos; Katsoulakis, Markos A.; Rey-Bellet, Luc

    2017-01-01

    Parallel Kinetic Monte Carlo (KMC) is a potent tool to simulate stochastic particle systems efficiently. However, despite literature on quantifying domain decomposition errors of the particle system for this class of algorithms in the short and in the long time regime, no study yet explores and quantifies the loss of time-reversibility in Parallel KMC. Inspired by concepts from non-equilibrium statistical mechanics, we propose the entropy production per unit time, or entropy production rate, given in terms of an observable and a corresponding estimator, as a metric that quantifies the loss of reversibility. Typically, this is a quantity that cannot be computed explicitly for Parallel KMC, which is why we develop a posteriori estimators that have good scaling properties with respect to the size of the system. Through these estimators, we can connect the different parameters of the scheme, such as the communication time step of the parallelization, the choice of the domain decomposition, and the computational schedule, with its performance in controlling the loss of reversibility. From this point of view, the entropy production rate can be seen both as an information criterion to compare the reversibility of different parallel schemes and as a tool to diagnose reversibility issues with a particular scheme. As a demonstration, we use Sandia Lab's SPPARKS software to compare different parallelization schemes and different domain (lattice) decompositions.

  17. Quantifying light-dependent circadian disruption in humans and animal models.

    PubMed

    Rea, Mark S; Figueiro, Mariana G

    2014-12-01

    Although circadian disruption is an accepted term, little has been done to develop methods to quantify the degree of disruption or entrainment individual organisms actually exhibit in the field. A variety of behavioral, physiological and hormonal responses vary in amplitude over a 24-h period and the degree to which these circadian rhythms are synchronized to the daily light-dark cycle can be quantified with a technique known as phasor analysis. Several studies have been carried out using phasor analysis in an attempt to measure circadian disruption exhibited by animals and by humans. To perform these studies, species-specific light measurement and light delivery technologies had to be developed based upon a fundamental understanding of circadian phototransduction mechanisms in the different species. When both nocturnal rodents and diurnal humans experienced different species-specific light-dark shift schedules, they showed, based upon phasor analysis of the light-dark and activity-rest patterns, similar levels of light-dependent circadian disruption. Indeed, both rodents and humans show monotonically increasing and quantitatively similar levels of light-dependent circadian disruption with increasing shift-nights per week. Thus, phasor analysis provides a method for quantifying circadian disruption in the field and in the laboratory as well as a bridge between ecological measurements of circadian entrainment in humans and parametric studies of circadian disruption in animal models, including nocturnal rodents.

  18. Determining and Forecasting Savings from Competing Previously Sole Source/Noncompetitive Contracts

    DTIC Science & Technology

    1978-10-01

    SUMMARY. A. BACKGROUND. Within the defense market, it is difficult to isolate, identify and quantify the impact of competition on acquisition costs... C. FORECASTING METHODOLOGY... D. COMPETITION INDEX... E. USE AS A FORECASTING TOOL... program is still active. e. From this projection, calculate the actual total contract price commencing with the buy-out competition by multiplying the

  19. Ordered weighted averaging with fuzzy quantifiers: GIS-based multicriteria evaluation for land-use suitability analysis

    NASA Astrophysics Data System (ADS)

    Malczewski, Jacek

    2006-12-01

    The objective of this paper is to incorporate the concept of fuzzy (linguistic) quantifiers into the GIS-based land suitability analysis via ordered weighted averaging (OWA). OWA is a multicriteria evaluation procedure (or combination operator). The nature of the OWA procedure depends on some parameters, which can be specified by means of fuzzy (linguistic) quantifiers. By changing the parameters, OWA can generate a wide range of decision strategies or scenarios. The quantifier-guided OWA procedure is illustrated using land-use suitability analysis in a region of Mexico.
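
    A minimal sketch of quantifier-guided OWA with a regular increasing monotone (RIM) quantifier Q(r) = r^alpha is given below; the criterion values and the choice of alpha are illustrative.

      import numpy as np

      def owa_quantifier(values, alpha=1.0):
          """Quantifier-guided OWA: weights w_i = Q(i/n) - Q((i-1)/n) with the RIM
          quantifier Q(r) = r**alpha, applied to criteria sorted in descending order.
          alpha < 1 leans toward 'at least one' (OR-like), alpha > 1 toward 'all' (AND-like)."""
          v = np.sort(np.asarray(values, dtype=float))[::-1]
          n = len(v)
          i = np.arange(1, n + 1)
          w = (i / n) ** alpha - ((i - 1) / n) ** alpha
          return float(np.dot(w, v))

      # one land unit scored on three standardized suitability criteria
      print(owa_quantifier([0.9, 0.6, 0.3], alpha=0.5))   # optimistic scenario
      print(owa_quantifier([0.9, 0.6, 0.3], alpha=2.0))   # pessimistic scenario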

  20. A Computational Approach to Quantifiers as an Explanation for Some Language Impairments in Schizophrenia

    ERIC Educational Resources Information Center

    Zajenkowski, Marcin; Styla, Rafal; Szymanik, Jakub

    2011-01-01

    We compared the processing of natural language quantifiers in a group of patients with schizophrenia and a healthy control group. In both groups, the difficulty of the quantifiers was consistent with computational predictions, and patients with schizophrenia took more time to solve the problems. However, they were significantly less accurate only…

  1. Serum Hydroxyl Radical Scavenging Capacity as Quantified with Iron-Free Hydroxyl Radical Source

    PubMed Central

    Endo, Nobuyuki; Oowada, Shigeru; Sueishi, Yoshimi; Shimmei, Masashi; Makino, Keisuke; Fujii, Hirotada; Kotake, Yashige

    2009-01-01

    We have developed a simple ESR spin-trapping-based method for hydroxyl (OH) radical scavenging-capacity determination, using an iron-free OH radical source. Instead of the widely used Fenton reaction, a short (typically 5 seconds) in situ UV-photolysis of a dilute hydrogen peroxide aqueous solution was employed to generate reproducible amounts of OH radicals. ESR spin trapping was applied to quantify OH radicals; the decrease in the OH radical level due to the specimen's scavenging activity was converted into the OH radical scavenging capacity (rate). The validity of the method was confirmed in pure antioxidants, and the agreement with the previous data was satisfactory. In the second half of this work, the new method was applied to the sera of chronic renal failure (CRF) patients. We show for the first time that after hemodialysis, the OH radical scavenging capacity of the CRF serum was restored to the level of healthy controls. This method is simple and rapid, and low-concentration hydrogen peroxide is the only chemical added to the system, which eliminates the complexity of iron-involved Fenton reactions or the use of the pulse-radiolysis system. PMID:19794928
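
    The conversion from signal decrease to a scavenging capacity can be illustrated with standard competition kinetics, in which the specimen competes with the spin trap for OH radicals. This is a generic sketch, not necessarily the exact conversion used in the study; the spin-trap rate constant and concentrations below are placeholders.

      def scavenging_capacity(i_control, i_sample, k_trap, trap_conc):
          """Competition-kinetics conversion: pseudo-first-order OH scavenging rate
          (s^-1) from the spin-adduct ESR signal without (i_control) and with
          (i_sample) the specimen. k_trap (M^-1 s^-1) and trap_conc (M) describe the
          spin trap; the numbers below are placeholders, not values from the study."""
          return k_trap * trap_conc * (i_control / i_sample - 1.0)

      # e.g. signal drops to 60% of control with a 10 mM spin trap
      print(scavenging_capacity(i_control=1.0, i_sample=0.6, k_trap=3.4e9, trap_conc=0.01))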

  2. Construct Validity of Accelerometry-Derived Force to Quantify Basketball Movement Patterns.

    PubMed

    Staunton, Craig; Wundersitz, Daniel; Gordon, Brett; Kingsley, Michael

    2017-12-01

    This study assessed the construct validity of accelerometry-derived net force to quantify the external demands of basketball movements. Twenty-eight basketballers completed the Yo-Yo intermittent recovery test (Yo-Yo-IR1) and basketball exercise simulation test (BEST). Intensity was quantified using accelerometry-derived average net force (AvFNet) and PlayerLoad™ per minute (PL/min). Within-player correlations were determined between intensity and running speed during Yo-Yo-IR1. Measured AvFNet was determined for movements during the BEST and predicted AvFNet was calculated using movement speed and correlations from Yo-Yo-IR1. Relationships between AvFNet and running speed during Yo-Yo-IR1 were nearly perfect (r² = 0.95, 95% CI: 0.94-0.96; p < 0.001) and stronger than correlations between running speed and PL/min (r² = 0.80, 95% CI: 0.73-0.87; p < 0.001). Differences between measured and predicted AvFNet were small during jogging and running (<1%), but large for basketball movements including jumping, change-of-direction and shuffling (15%-41%). As hypothesised, AvFNet differed by playing position (11%-16%; p < 0.001) and reflected the additional demand upon players with larger body mass and lower movement efficiency. Both sprint speed and AvFNet reduced during the course of the BEST (p ≤ 0.013). These findings confirm the construct validity of AvFNet to quantify the external demand of basketball movements. Accelerometry-derived net force has the potential to quantify the external demands of basketballers during training and competition. © Georg Thieme Verlag KG Stuttgart · New York.

  3. Analytical Algorithms to Quantify the Uncertainty in Remaining Useful Life Prediction

    NASA Technical Reports Server (NTRS)

    Sankararaman, Shankar; Saxena, Abhinav; Daigle, Matthew; Goebel, Kai

    2013-01-01

    This paper investigates the use of analytical algorithms to quantify the uncertainty in the remaining useful life (RUL) estimate of components used in aerospace applications. The prediction of RUL is affected by several sources of uncertainty and it is important to systematically quantify their combined effect by computing the uncertainty in the RUL prediction in order to aid risk assessment, risk mitigation, and decision-making. While sampling-based algorithms have been conventionally used for quantifying the uncertainty in RUL, analytical algorithms are computationally cheaper and are sometimes better suited for online decision-making. While exact analytical algorithms are available only for certain special cases (e.g., linear models with Gaussian variables), effective approximations can be made using the first-order second moment method (FOSM), the first-order reliability method (FORM), and the inverse first-order reliability method (Inverse FORM). These methods can be used not only to calculate the entire probability distribution of RUL but also to obtain probability bounds on RUL. This paper explains these three methods in detail and illustrates them using the state-space model of a lithium-ion battery.
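
    A minimal FOSM sketch is shown below: the mean and standard deviation of RUL = g(X) are approximated from a first-order Taylor expansion about the input means, assuming independent inputs. The illustrative g is not the battery model used in the paper.

      import numpy as np

      def fosm(g, mu, sigma, eps=1e-6):
          """First-order second-moment approximation of the mean and standard deviation
          of g(X) for independent inputs with means mu and standard deviations sigma."""
          mu = np.asarray(mu, dtype=float)
          sigma = np.asarray(sigma, dtype=float)
          grad = np.empty_like(mu)
          for j in range(len(mu)):                      # central finite-difference gradient
              step = np.zeros_like(mu)
              step[j] = eps * max(abs(mu[j]), 1.0)
              grad[j] = (g(mu + step) - g(mu - step)) / (2.0 * step[j])
          return g(mu), float(np.sqrt(np.sum((grad * sigma) ** 2)))

      # illustrative RUL model (not the paper's battery model): remaining charge / discharge rate
      rul = lambda x: x[0] / x[1]
      print(fosm(rul, mu=[10.0, 0.5], sigma=[0.5, 0.05]))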

  4. Quantifying fluctuations in market liquidity: analysis of the bid-ask spread.

    PubMed

    Plerou, Vasiliki; Gopikrishnan, Parameswaran; Stanley, H Eugene

    2005-04-01

    Quantifying the statistical features of the bid-ask spread offers the possibility of understanding some aspects of market liquidity. Using quote data for the 116 most frequently traded stocks on the New York Stock Exchange over the two-year period 1994-1995, we analyze the fluctuations of the average bid-ask spread S over a time interval Δt. We find that S is characterized by a distribution that decays as a power law P{S > x} ~ x^(−ζ_S), with an exponent ζ_S ≈ 3 for all 116 stocks analyzed. Our analysis of the autocorrelation function of S shows long-range power-law correlations, ⟨S(t)S(t + τ)⟩ ~ τ^(−μ_S), similar to those previously found for the volatility. We next examine the relationship between the bid-ask spread and the volume Q, and find that S ~ ln Q; we find that a similar logarithmic relationship holds between the transaction-level bid-ask spread and the trade size. We then study the relationship between S and other indicators of market liquidity such as the frequency of trades N and the frequency of quote updates U, and find S ~ ln N and S ~ ln U. Lastly, we show that the bid-ask spread and the volatility are also related logarithmically.
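
    Two of the reported relationships can be illustrated with short calculations on synthetic data: a Hill estimate of the tail exponent ζ_S of P{S > x} ~ x^(−ζ_S), and an ordinary least-squares fit of S against ln Q. The data generation below is purely illustrative.

      import numpy as np

      def hill_tail_exponent(samples, k=200):
          """Hill estimate of the power-law tail exponent from the k largest observations."""
          x = np.sort(np.asarray(samples))
          return k / np.sum(np.log(x[-k:] / x[-k - 1]))

      rng = np.random.default_rng(0)
      spread = rng.pareto(3.0, 50_000) + 1.0          # synthetic spreads with a zeta = 3 tail
      print(hill_tail_exponent(spread))               # should come out near 3

      # S ~ ln Q: least-squares fit of spread against log volume (synthetic data)
      volume = rng.lognormal(10.0, 1.0, 50_000)
      s = 0.4 * np.log(volume) + rng.normal(0.0, 0.5, volume.size)
      slope, intercept = np.polyfit(np.log(volume), s, 1)
      print(slope, intercept)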

  5. Quantifying climatic controls on river network topology across scales

    NASA Astrophysics Data System (ADS)

    Ranjbar Moshfeghi, S.; Hooshyar, M.; Wang, D.; Singh, A.

    2017-12-01

    The branching structure of river networks is an important topologic and geomorphologic feature that depends on several factors (e.g., climate, tectonics). However, the mechanisms that cause these drainage patterns in river networks are poorly understood. In this study, we investigate the effects of varying climatic forcing on river network topology and geomorphology. For this, we select 20 catchments across the United States with different long-term climatic conditions quantified by the climate aridity index (AI), defined here as the ratio of mean annual potential evaporation (Ep) to precipitation (P), capturing variation in runoff and vegetation cover. The river networks of these catchments are extracted, using a curvature-based method, from high-resolution (1 m) digital elevation models, and several metrics such as drainage density, branching angle, and width functions are computed. We also use a multiscale-entropy-based approach to quantify the topologic irregularity and structural richness of these river networks. Our results reveal systematic impacts of climate forcing on the structure of river networks.

  6. 75 FR 76056 - FEDERAL REGISTER CITATION OF PREVIOUS ANNOUNCEMENT:

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-12-07

    ... SECURITIES AND EXCHANGE COMMISSION Sunshine Act Meeting FEDERAL REGISTER CITATION OF PREVIOUS ANNOUNCEMENT: STATUS: Closed meeting. PLACE: 100 F Street, NE., Washington, DC. DATE AND TIME OF PREVIOUSLY ANNOUNCED MEETING: Thursday, December 9, 2010 at 2 p.m. CHANGE IN THE MEETING: Time change. The closed...

  7. The effect of previous traumatic injury on homicide risk.

    PubMed

    Griffin, Russell L; Davis, Gregory G; Levitan, Emily B; MacLennan, Paul A; Redden, David T; McGwin, Gerald

    2014-07-01

    Research has reported that a strong risk factor for traumatic injury is having a previous injury (i.e., recidivism). To date, the only study examining the relationship between recidivism and homicide reported strong associations, but was limited by possible selection bias. The current matched case-control study utilized coroner's data from 2004 to 2008. Subjects were linked to trauma registry data to determine whether the person had a previous traumatic injury. Conditional logistic regression was used to estimate odds ratios (ORs) and 95% confidence intervals (95% CIs) for the association between homicide and recidivism. Homicide risk was increased for those having a previous traumatic injury (OR 1.81, 95% CI 1.09-2.99) or a previous intentional injury (OR 2.53, 95% CI 1.24-5.17). These results suggest an association between homicide and injury recidivism, and that trauma centers may be an effective setting for screening individuals for secondary prevention efforts of homicide through violence prevention programs. © 2014 American Academy of Forensic Sciences.

  8. Quantifying cross-border movements and migrations for guiding the strategic planning of malaria control and elimination

    PubMed Central

    2014-01-01

    Background Identifying human and malaria parasite movements is important for control planning across all transmission intensities. Imported infections can reintroduce infections into areas previously free of infection, maintain ‘hotspots’ of transmission and import drug resistant strains, challenging national control programmes at a variety of temporal and spatial scales. Recent analyses based on mobile phone usage data have provided valuable insights into population and likely parasite movements within countries, but these data are restricted to sub-national analyses, leaving important cross-border movements neglected. Methods National census data were used to analyse and model cross-border migration and movement, using East Africa as an example. ‘Hotspots’ of origin-specific immigrants from neighbouring countries were identified for Kenya, Tanzania and Uganda. Populations of origin-specific migrants were compared to distance from origin country borders and population size at destination, and regression models were developed to quantify and compare differences in migration patterns. Migration data were then combined with existing spatially-referenced malaria data to compare the relative propensity for cross-border malaria movement in the region. Results The spatial patterns and processes for immigration were different between each origin and destination country pair. Hotspots of immigration, for example, were concentrated close to origin country borders for most immigrants to Tanzania, but for Kenya, a similar pattern was only seen for Tanzanian and Ugandan immigrants. Regression model fits also differed between specific migrant groups, with some migration patterns more dependent on population size at destination and distance travelled than others. With these differences between immigration patterns and processes, and heterogeneous transmission risk in East Africa and the surrounding region, propensities to import malaria infections also likely show

  9. Method and apparatus for detecting and quantifying bacterial spores on a surface

    NASA Technical Reports Server (NTRS)

    Ponce, Adrian (Inventor)

    2009-01-01

    A method and an apparatus for detecting and quantifying bacterial spores on a surface. In accordance with the method, bacterial spores are transferred from a place of origin to a test surface that comprises lanthanide ions. Aromatic molecules are released from the bacterial spores; a complex of the lanthanide ions and aromatic molecules is formed on the test surface; the complex is excited to generate a characteristic luminescence on the test surface; and the luminescence is detected and quantified.

  10. Method and Apparatus for Detecting and Quantifying Bacterial Spores on a Surface

    NASA Technical Reports Server (NTRS)

    Ponce, Adrian (Inventor)

    2016-01-01

    A method and an apparatus for detecting and quantifying bacterial spores on a surface. In accordance with the method, bacterial spores are transferred from a place of origin to a test surface that comprises lanthanide ions. Aromatic molecules are released from the bacterial spores; a complex of the lanthanide ions and aromatic molecules is formed on the test surface; the complex is excited to generate a characteristic luminescence on the test surface; and the luminescence is detected and quantified.

  11. Quantifying and Reducing Curve-Fitting Uncertainty in Isc

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Campanelli, Mark; Duck, Benjamin; Emery, Keith

    2015-06-14

    Current-voltage (I-V) curve measurements of photovoltaic (PV) devices are used to determine performance parameters and to establish traceable calibration chains. Measurement standards specify localized curve fitting methods, e.g., straight-line interpolation/extrapolation of the I-V curve points near short-circuit current, Isc. By considering such fits as statistical linear regressions, uncertainties in the performance parameters are readily quantified. However, the legitimacy of such a computed uncertainty requires that the model be a valid (local) representation of the I-V curve and that the noise be sufficiently well characterized. Using more data points often has the advantage of lowering the uncertainty. However, more data points can make the uncertainty in the fit arbitrarily small, and this fit uncertainty misses the dominant residual uncertainty due to so-called model discrepancy. Using objective Bayesian linear regression for straight-line fits for Isc, we investigate an evidence-based method to automatically choose data windows of I-V points with reduced model discrepancy. We also investigate noise effects. Uncertainties, aligned with the Guide to the Expression of Uncertainty in Measurement (GUM), are quantified throughout.
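
    A minimal sketch of the underlying straight-line fit is shown below: Isc is the intercept of an ordinary least-squares line through I-V points near V = 0, with a standard uncertainty from the fit. The objective Bayesian regression and evidence-based window selection described in the paper are not reproduced here, and the data points are hypothetical.

      import numpy as np

      def isc_from_window(v, i):
          """Straight-line fit I = a*V + Isc over a window of points near V = 0;
          returns Isc (the intercept) and its standard uncertainty from the fit."""
          A = np.column_stack([v, np.ones_like(v)])
          coef, residuals, *_ = np.linalg.lstsq(A, i, rcond=None)
          s2 = residuals[0] / (len(v) - 2)             # residual variance
          cov = s2 * np.linalg.inv(A.T @ A)            # parameter covariance
          return coef[1], np.sqrt(cov[1, 1])

      # hypothetical I-V points measured near short circuit
      v = np.array([-0.02, -0.01, 0.00, 0.01, 0.02, 0.03])
      i = np.array([5.012, 5.008, 5.003, 4.999, 4.995, 4.990])
      print(isc_from_window(v, i))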

  12. Quantifying serum antibody in bird fanciers' hypersensitivity pneumonitis.

    PubMed

    McSharry, Charles; Dye, George M; Ismail, Tengku; Anderson, Kenneth; Spiers, Elizabeth M; Boyd, Gavin

    2006-06-26

    Detecting serum antibody against inhaled antigens is an important diagnostic adjunct for hypersensitivity pneumonitis (HP). We sought to validate a quantitative fluorimetric assay testing serum from bird fanciers. Antibody activity was assessed in bird fanciers and control subjects using various avian antigens and serological methods, and the titer was compared with symptoms of HP. IgG antibody against pigeon serum antigens, quantified by fluorimetry, provided a good discriminator of disease. Levels below 10 mg/L were insignificant, and increasing titers were associated with disease. The assay was unaffected by total IgG, autoantibodies and antibody to dietary hen's egg antigens. Antigens from pigeon serum seem sufficient to recognize immune sensitivity to most common pet avian species. Decreasing antibody titers confirmed antigen avoidance. Increasing antibody titer reflected the likelihood of HP, and decreasing titers confirmed antigen avoidance. Quantifying antibody was rapid and the increased sensitivity will improve the rate of false-negative reporting and obviate the need for invasive diagnostic procedures. Automated fluorimetry provides a method for the international standardization of HP serology thereby improving quality control and improving its suitability as a diagnostic adjunct.

  13. Quantifying potential recharge in mantled sinkholes using ERT.

    PubMed

    Schwartz, Benjamin F; Schreiber, Madeline E

    2009-01-01

    Potential recharge through thick soils in mantled sinkholes was quantified using differential electrical resistivity tomography (ERT). Conversion of time series two-dimensional (2D) ERT profiles into 2D volumetric water content profiles using a numerically optimized form of Archie's law allowed us to monitor temporal changes in water content in soil profiles up to 9 m in depth. Combining Penman-Monteith daily potential evapotranspiration (PET) and daily precipitation data with potential recharge calculations for three sinkhole transects indicates that potential recharge occurred only during brief intervals over the study period and ranged from 19% to 31% of cumulative precipitation. Spatial analysis of ERT-derived water content showed that infiltration occurred both on sinkhole flanks and in sinkhole bottoms. Results also demonstrate that mantled sinkholes can act as regions of both rapid and slow recharge. Rapid recharge is likely the result of flow through macropores (such as root casts and thin gravel layers), while slow recharge is the result of unsaturated flow through fine-grained sediments. In addition to developing a new method for quantifying potential recharge at the field scale in unsaturated conditions, we show that mantled sinkholes are an important component of storage in a karst system.
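
    The resistivity-to-water-content conversion can be sketched with Archie's law, ρ = a·ρ_w·φ^(−m)·S_w^(−n), solved for saturation and multiplied by porosity; the petrophysical parameters below are placeholders, whereas the study optimized them numerically.

      def water_content_from_resistivity(rho, rho_w=20.0, phi=0.45, a=1.0, m=2.0, n=2.0):
          """Volumetric water content from bulk resistivity rho (ohm-m) via Archie's law,
          rho = a * rho_w * phi**(-m) * Sw**(-n). All petrophysical values are placeholders."""
          sw = (a * rho_w / (phi ** m * rho)) ** (1.0 / n)
          return phi * min(sw, 1.0)

      print(water_content_from_resistivity(rho=300.0))   # ~0.26 m3/m3 with these placeholders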

  14. Quantifying Complexity in Quantum Phase Transitions via Mutual Information Complex Networks

    NASA Astrophysics Data System (ADS)

    Valdez, Marc Andrew; Jaschke, Daniel; Vargas, David L.; Carr, Lincoln D.

    2017-12-01

    We quantify the emergent complexity of quantum states near quantum critical points on regular 1D lattices, via complex network measures based on quantum mutual information as the adjacency matrix, in direct analogy to quantifying the complexity of electroencephalogram or functional magnetic resonance imaging measurements of the brain. Using matrix product state methods, we show that network density, clustering, disparity, and Pearson's correlation obtain the critical point for both quantum Ising and Bose-Hubbard models to a high degree of accuracy in finite-size scaling for three classes of quantum phase transitions, Z2, mean field superfluid to Mott insulator, and a Berezinskii-Kosterlitz-Thouless crossover.
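
    A minimal sketch of the network step is shown below: treat a mutual-information matrix as an adjacency matrix, threshold it into a graph, and compute density and clustering with networkx. In the paper the matrix is the quantum mutual information obtained from matrix product states and the measures are applied to the weighted network; the random matrix and the cutoff here are stand-in assumptions.

      import numpy as np
      import networkx as nx

      rng = np.random.default_rng(1)
      n_sites = 16
      mi = rng.random((n_sites, n_sites))
      mi = (mi + mi.T) / 2.0                 # symmetric stand-in for quantum mutual information
      np.fill_diagonal(mi, 0.0)

      adjacency = (mi > 0.5).astype(int)     # illustrative cutoff turning weights into edges
      g = nx.from_numpy_array(adjacency)

      print(nx.density(g), nx.average_clustering(g))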

  15. Utilizing novel diversity estimators to quantify multiple dimensions of microbial biodiversity across domains

    PubMed Central

    2013-01-01

    Background Microbial ecologists often employ methods from classical community ecology to analyze microbial community diversity. However, these methods have limitations because microbial communities differ from macro-organismal communities in key ways. This study sought to quantify microbial diversity using methods that are better suited for data spanning multiple domains of life and dimensions of diversity. Diversity profiles are one novel, promising way to analyze microbial datasets. Diversity profiles encompass many other indices, provide effective numbers of diversity (mathematical generalizations of previous indices that better convey the magnitude of differences in diversity), and can incorporate taxa similarity information. To explore whether these profiles change interpretations of microbial datasets, diversity profiles were calculated for four microbial datasets from different environments spanning all domains of life as well as viruses. Both similarity-based profiles that incorporated phylogenetic relatedness and naïve (not similarity-based) profiles were calculated. Simulated datasets were used to examine the robustness of diversity profiles to varying phylogenetic topology and community composition. Results Diversity profiles provided insights into microbial datasets that were not detectable with classical univariate diversity metrics. For all datasets analyzed, there were key distinctions between calculations that incorporated phylogenetic diversity as a measure of taxa similarity and naïve calculations. The profiles also provided information about the effects of rare species on diversity calculations. Additionally, diversity profiles were used to examine thousands of simulated microbial communities, showing that similarity-based and naïve diversity profiles only agreed approximately 50% of the time in their classification of which sample was most diverse. This is a strong argument for incorporating similarity information and calculating diversity
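
    A naïve (similarity-free) diversity profile reduces to Hill numbers, the effective numbers of diversity: qD = (Σ p_i^q)^(1/(1−q)), with the q → 1 limit equal to the exponential of Shannon entropy. The sketch below computes a profile for hypothetical abundance counts; similarity-based profiles additionally weight taxa by relatedness and are not shown.

      import numpy as np

      def hill_number(abundances, q):
          """Effective number of taxa of order q (the naive diversity profile at q)."""
          p = np.asarray(abundances, dtype=float)
          p = p[p > 0] / p.sum()
          if np.isclose(q, 1.0):
              return float(np.exp(-np.sum(p * np.log(p))))     # q -> 1: exp(Shannon entropy)
          return float(np.sum(p ** q) ** (1.0 / (1.0 - q)))

      counts = [50, 30, 10, 5, 3, 1, 1]                        # hypothetical OTU counts
      print({q: round(hill_number(counts, q), 2) for q in (0, 0.5, 1, 2, 4)})
      # q = 0 is richness; increasing q progressively down-weights rare taxa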

  16. 5 CFR 532.405 - Use of highest previous rate.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 5 Administrative Personnel 1 2010-01-01 2010-01-01 false Use of highest previous rate. 532.405 Section 532.405 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT CIVIL SERVICE REGULATIONS PREVAILING RATE SYSTEMS Pay Administration § 532.405 Use of highest previous rate. (a)(1) Subject to the...

  17. 28 CFR 10.5 - Incorporation of papers previously filed.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 28 Judicial Administration 1 2010-07-01 2010-07-01 false Incorporation of papers previously filed... CARRYING ON ACTIVITIES WITHIN THE UNITED STATES Registration Statement § 10.5 Incorporation of papers previously filed. Papers and documents already filed with the Attorney General pursuant to the said act and...

  18. 28 CFR 10.5 - Incorporation of papers previously filed.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 28 Judicial Administration 1 2011-07-01 2011-07-01 false Incorporation of papers previously filed... CARRYING ON ACTIVITIES WITHIN THE UNITED STATES Registration Statement § 10.5 Incorporation of papers previously filed. Papers and documents already filed with the Attorney General pursuant to the said act and...

  19. 27 CFR 26.55 - Previously approved formulas.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... BUREAU, DEPARTMENT OF THE TREASURY ALCOHOL LIQUORS AND ARTICLES FROM PUERTO RICO AND THE VIRGIN ISLANDS Formulas for Products From Puerto Rico § 26.55 Previously approved formulas. Any formula approved on Form...

  20. 27 CFR 26.55 - Previously approved formulas.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... BUREAU, DEPARTMENT OF THE TREASURY LIQUORS LIQUORS AND ARTICLES FROM PUERTO RICO AND THE VIRGIN ISLANDS Formulas for Products From Puerto Rico § 26.55 Previously approved formulas. Any formula approved on Form...

  1. 27 CFR 26.55 - Previously approved formulas.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... BUREAU, DEPARTMENT OF THE TREASURY ALCOHOL LIQUORS AND ARTICLES FROM PUERTO RICO AND THE VIRGIN ISLANDS Formulas for Products From Puerto Rico § 26.55 Previously approved formulas. Any formula approved on Form...

  2. 27 CFR 26.55 - Previously approved formulas.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... BUREAU, DEPARTMENT OF THE TREASURY LIQUORS LIQUORS AND ARTICLES FROM PUERTO RICO AND THE VIRGIN ISLANDS Formulas for Products From Puerto Rico § 26.55 Previously approved formulas. Any formula approved on Form...

  3. 27 CFR 26.55 - Previously approved formulas.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... BUREAU, DEPARTMENT OF THE TREASURY LIQUORS LIQUORS AND ARTICLES FROM PUERTO RICO AND THE VIRGIN ISLANDS Formulas for Products From Puerto Rico § 26.55 Previously approved formulas. Any formula approved on Form...

  4. Dealing with Quantifier Scope Ambiguity in Natural Language Understanding

    ERIC Educational Resources Information Center

    Hafezi Manshadi, Mohammad

    2014-01-01

    Quantifier scope disambiguation (QSD) is one of the most challenging problems in deep natural language understanding (NLU) systems. The most popular approach for dealing with QSD is to simply leave the semantic representation (scope-) underspecified and to incrementally add constraints to filter out unwanted readings. Scope underspecification has…

  5. Towards simulating and quantifying the light-cone EoR 21-cm signal

    NASA Astrophysics Data System (ADS)

    Mondal, Rajesh; Bharadwaj, Somnath; Datta, Kanan K.

    2018-02-01

    The light-cone (LC) effect causes the Epoch of Reionization (EoR) 21-cm signal T_b(n̂, ν) to evolve significantly along the line-of-sight (LoS) direction ν. In the first part of this paper, we present a method to properly incorporate the LC effect in simulations of the EoR 21-cm signal that includes peculiar velocities. Subsequently, we discuss how to quantify the second-order statistics of the EoR 21-cm signal in the presence of the LC effect. We demonstrate that the 3D power spectrum P(k) fails to quantify the entire information because it assumes the signal to be ergodic and periodic, whereas the LC effect breaks these conditions along the LoS. Considering a LC simulation centred at redshift 8 where the mean neutral fraction drops from 0.65 to 0.35 across the box, we find that P(k) misses out ~40 per cent of the information at the two ends of the 17.41 MHz simulation bandwidth. The multifrequency angular power spectrum (MAPS) C_ℓ(ν₁, ν₂) quantifies the statistical properties of T_b(n̂, ν) without assuming the signal to be ergodic and periodic along the LoS. We expect this to quantify the entire statistical information of the EoR 21-cm signal. We apply MAPS to our LC simulation and present preliminary results for the EoR 21-cm signal.

  6. No discrimination against previous mates in a sexually cannibalistic spider

    NASA Astrophysics Data System (ADS)

    Fromhage, Lutz; Schneider, Jutta M.

    2005-09-01

    In several animal species, females discriminate against previous mates in subsequent mating decisions, increasing the potential for multiple paternity. In spiders, female choice may take the form of selective sexual cannibalism, which has been shown to bias paternity in favor of particular males. If cannibalistic attacks function to restrict a male's paternity, females may have little interest to remate with males having survived such an attack. We therefore studied the possibility of female discrimination against previous mates in sexually cannibalistic Argiope bruennichi, where females almost always attack their mate at the onset of copulation. We compared mating latency and copulation duration of males having experienced a previous copulation either with the same or with a different female, but found no evidence for discrimination against previous mates. However, males copulated significantly shorter when inserting into a used, compared to a previously unused, genital pore of the female.

  7. Quantifying the role of forest soil and bedrock in the acid neutralization of surface water in steep hillslopes.

    PubMed

    Asano, Yuko; Uchida, Taro

    2005-02-01

    The role of soil and bedrock in acid neutralizing processes has been difficult to quantify because of hydrological and biogeochemical uncertainties. To quantify those roles, hydrochemical observations were conducted at two hydrologically well-defined, steep granitic hillslopes in the Tanakami Mountains of Japan. These paired hillslopes are similar except for their soils; Fudoji is leached of base cations (base saturation <6%), while Rachidani is covered with fresh soil (base saturation >30%), because the erosion rate is 100-1000 times greater. The results showed that (1) soil solution pH at the soil-bedrock interface at Fudoji (4.3) was significantly lower than that of Rachidani (5.5), (2) the hillslope discharge pH in both hillslopes was similar (6.7-6.8), and (3) at Fudoji, 60% of the base cations leaching from the hillslope were derived from bedrock, whereas only 20% were derived from bedrock in Rachidani. Further, previously published results showed that the stream pH could not be predicted from the acid deposition rate and soil base saturation status. These results demonstrate that bedrock plays an especially important role when the overlying soil has been leached of base cations. These results indicate that while the status of soil acidification is a first-order control on vulnerability to surface water acidification, in some cases such as at Fudoji, subsurface interaction with the bedrock determines the sensitivity of surface water to acidic deposition.

  8. Assessment and Optimization of the Accuracy of an Aircraft-Based Technique Used to Quantify Greenhouse Gas Emission Rates from Point Sources

    NASA Astrophysics Data System (ADS)

    Shepson, P. B.; Lavoie, T. N.; Kerlo, A. E.; Stirm, B. H.

    2016-12-01

    Understanding the contribution of anthropogenic activities to atmospheric greenhouse gas concentrations requires an accurate characterization of emission sources. Previously, we have reported the use of a novel aircraft-based mass balance measurement technique to quantify greenhouse gas emission rates from point and area sources; however, the accuracy of this approach has not been evaluated to date. Here, an assessment of method accuracy and precision was performed by conducting a series of six aircraft-based mass balance experiments at a power plant in southern Indiana and comparing the calculated CO2 emission rates to the reported hourly emission measurements made by continuous emissions monitoring systems (CEMS) installed directly in the exhaust stacks at the facility. For all flights, CO2 emissions were quantified before CEMS data were released online to ensure unbiased analysis. Additionally, we assess the uncertainties introduced into the final emission rate by our analysis method, which employs a statistical kriging model to interpolate and extrapolate the CO2 fluxes across the flight transects from the ground to the top of the boundary layer. Subsequently, using the results from these flights combined with the known emissions reported by the CEMS, we perform an inter-model comparison of alternative kriging methods to evaluate the performance of the kriging approach.
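
    The core mass-balance calculation can be sketched as the integral of the background-subtracted concentration enhancement times the transect-perpendicular wind over the crosswind distance and boundary-layer depth. The grid below is synthetic and the kriging interpolation/extrapolation step is not reproduced.

      import numpy as np

      def mass_balance_emission(conc_enh, u_perp, dx, dz):
          """Emission rate (kg/s): background-subtracted enhancement (kg/m^3) times the
          transect-perpendicular wind (m/s), summed over the downwind plane with grid
          cell area dx*dz (m^2)."""
          return float(np.sum(conc_enh * u_perp) * dx * dz)

      # synthetic enhancement on a 21 x 51 (height x crosswind) grid with 75 m x 100 m cells
      z = np.linspace(0.0, 1500.0, 21)[:, None]
      x = np.linspace(-2500.0, 2500.0, 51)[None, :]
      enh = 1e-6 * np.exp(-(x / 800.0) ** 2) * np.exp(-z / 600.0)   # kg/m^3
      print(mass_balance_emission(enh, u_perp=5.0, dx=100.0, dz=75.0))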

  9. Quantifying trail erosion and stream sedimentation with sediment tracers

    Treesearch

    Mark S. Riedel

    2006-01-01

    Abstract--The impacts of forest disturbance and roads on stream sedimentation have been rigorously investigated and documented. While historical research on turbidity and suspended sediments has been thorough, studies of stream bed sedimentation have typically relied on semi-quantitative measures such as embeddedness or marginal pool depth. To directly quantify the...

  10. Quantifying in-stream nitrate reaction rates using continuously-collected water quality data

    Treesearch

    Matthew Miller; Anthony Tesoriero; Paul Capel

    2016-01-01

    High frequency in situ nitrate data from three streams of varying hydrologic condition, land use, and watershed size were used to quantify the mass loading of nitrate to streams from two sources – groundwater discharge and event flow – at a daily time step for one year. These estimated loadings were used to quantify temporally-variable in-stream nitrate processing ...

  11. Quantifying causal mechanisms to determine how protected areas affect poverty through changes in ecosystem services and infrastructure.

    PubMed

    Ferraro, Paul J; Hanauer, Merlin M

    2014-03-18

    To develop effective environmental policies, we must understand the mechanisms through which the policies affect social and environmental outcomes. Unfortunately, empirical evidence about these mechanisms is limited, and little guidance for quantifying them exists. We develop an approach to quantifying the mechanisms through which protected areas affect poverty. We focus on three mechanisms: changes in tourism and recreational services; changes in infrastructure in the form of road networks, health clinics, and schools; and changes in regulating and provisioning ecosystem services and foregone production activities that arise from land-use restrictions. The contributions of ecotourism and other ecosystem services to poverty alleviation in the context of a real environmental program have not yet been empirically estimated. Nearly two-thirds of the poverty reduction associated with the establishment of Costa Rican protected areas is causally attributable to opportunities afforded by tourism. Although protected areas reduced deforestation and increased regrowth, these land cover changes neither reduced nor exacerbated poverty, on average. Protected areas did not, on average, affect our measures of infrastructure and thus did not contribute to poverty reduction through this mechanism. We attribute the remaining poverty reduction to unobserved dimensions of our mechanisms or to other mechanisms. Our study empirically estimates previously unidentified contributions of ecotourism and other ecosystem services to poverty alleviation in the context of a real environmental program. We demonstrate that, with existing data and appropriate empirical methods, conservation scientists and policymakers can begin to elucidate the mechanisms through which ecosystem conservation programs affect human welfare.

  12. Quantifying causal mechanisms to determine how protected areas affect poverty through changes in ecosystem services and infrastructure

    PubMed Central

    Ferraro, Paul J.; Hanauer, Merlin M.

    2014-01-01

    To develop effective environmental policies, we must understand the mechanisms through which the policies affect social and environmental outcomes. Unfortunately, empirical evidence about these mechanisms is limited, and little guidance for quantifying them exists. We develop an approach to quantifying the mechanisms through which protected areas affect poverty. We focus on three mechanisms: changes in tourism and recreational services; changes in infrastructure in the form of road networks, health clinics, and schools; and changes in regulating and provisioning ecosystem services and foregone production activities that arise from land-use restrictions. The contributions of ecotourism and other ecosystem services to poverty alleviation in the context of a real environmental program have not yet been empirically estimated. Nearly two-thirds of the poverty reduction associated with the establishment of Costa Rican protected areas is causally attributable to opportunities afforded by tourism. Although protected areas reduced deforestation and increased regrowth, these land cover changes neither reduced nor exacerbated poverty, on average. Protected areas did not, on average, affect our measures of infrastructure and thus did not contribute to poverty reduction through this mechanism. We attribute the remaining poverty reduction to unobserved dimensions of our mechanisms or to other mechanisms. Our study empirically estimates previously unidentified contributions of ecotourism and other ecosystem services to poverty alleviation in the context of a real environmental program. We demonstrate that, with existing data and appropriate empirical methods, conservation scientists and policymakers can begin to elucidate the mechanisms through which ecosystem conservation programs affect human welfare. PMID:24567397

  13. Using Hyperpolarized 129Xe MRI to Quantify the Pulmonary Ventilation Distribution

    PubMed Central

    He, Mu; Driehuys, Bastiaan; Que, Loretta G.; Huang, Yuh-Chin T.

    2017-01-01

    Background Ventilation heterogeneity is impossible to detect with spirometry. Alternatively, pulmonary ventilation can be imaged 3-dimensionally using inhaled 129Xe MRI. To date such images have been quantified primarily based on ventilation defects. Here, we introduce a robust means to transform 129Xe MRI scans such that the underlying ventilation distribution and its heterogeneity can be quantified. Methods Quantitative 129Xe ventilation MRI was conducted in 12 younger (24.7±5.2 yrs), and 10 older (62.2±7.2 yrs) healthy individuals, as well as 9 younger (25.9±6.4 yrs) and 10 older (63.2±6.1 yrs) asthmatics. The younger healthy population was used to establish a reference ventilation distribution and thresholds for 6 intensity bins. These were used to display and quantify regions of ventilation defect (VDR), low ventilation (LVR) and high ventilation (HVR). Results The ventilation distribution in young subjects was roughly Gaussian with a mean and SD of 0.52±0.18, resulting in VDR=2.1±1.3%, LVR=15.6±5.4% and HVR=17.4±3.1%. Older healthy volunteers exhibited a significantly right-skewed distribution (0.46±0.20, p=0.034), resulting in significantly increased VDR (7.0±4.8%, p=0.008) and LVR (24.5±11.5%, p=0.025). In the asthmatics, VDR and LVR increased in the older population, and HVR was significantly reduced (13.5±4.6% vs 18.9±4.5%, p=0.009). Quantitative 129Xe MRI also revealed different ventilation distribution patterns in response to albuterol in two asthmatics with normal FEV1. Conclusions Quantitative 129Xe MRI provides a robust and objective means to display and quantify the pulmonary ventilation distribution, even in subjects who have airway function impairment not appreciated by spirometry. PMID:27617823
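
    A minimal sketch of the binning step is shown below: normalized ventilation intensities are classified into six bins using thresholds derived from a reference (healthy) distribution, and the defect, low-ventilation and high-ventilation percentages are reported. The bin edges at ±1 and ±2 reference SDs are an illustrative assumption, not the thresholds used in the paper.

      import numpy as np

      def ventilation_bins(ventilation, ref_mean=0.52, ref_sd=0.18):
          """Classify normalized ventilation values into 6 bins around a healthy
          reference distribution; edges at the mean -2, -1, 0, +1, +2 SD are an
          illustrative choice, not the thresholds used in the study."""
          edges = ref_mean + np.array([-2, -1, 0, 1, 2]) * ref_sd
          bins = np.digitize(ventilation, edges)             # bin index 0..5 per voxel
          frac = np.array([(bins == b).mean() for b in range(6)]) * 100.0
          vdr = frac[0]                      # ventilation defect percentage
          lvr = frac[1]                      # low-ventilation percentage
          hvr = frac[4] + frac[5]            # high-ventilation percentage
          return vdr, lvr, hvr

      rng = np.random.default_rng(2)
      voxels = np.clip(rng.normal(0.46, 0.20, 10_000), 0.0, 1.0)   # synthetic skewed lung
      print(ventilation_bins(voxels))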

  14. A previous hamstring injury affects kicking mechanics in soccer players.

    PubMed

    Navandar, Archit; Veiga, Santiago; Torres, Gonzalo; Chorro, David; Navarro, Enrique

    2018-01-10

    Although the kicking skill is influenced by limb dominance and sex, how a previous hamstring injury affects kicking has not been studied in detail. Thus, the objective of this study was to evaluate the effect of sex and limb dominance on kicking in limbs with and without a previous hamstring injury. 45 professional players (males: n=19, previously injured players=4, age=21.16 ± 2.00 years; females: n=19, previously injured players=10, age=22.15 ± 4.50 years) performed 5 kicks each with their preferred and non-preferred limb at a target 7 m away, which were recorded with a three-dimensional motion capture system. Kinematic and kinetic variables were extracted for the backswing, leg cocking, leg acceleration and follow through phases. A shorter backswing (20.20 ± 3.49% vs 25.64 ± 4.57%), and differences in knee flexion angle (58 ± 10° vs 72 ± 14°) and hip flexion velocity (8 ± 0 rad/s vs 10 ± 2 rad/s) were observed in previously injured, non-preferred limb kicks for females. A lower peak hip linear velocity (3.50 ± 0.84 m/s vs 4.10 ± 0.45 m/s) was observed in previously injured, preferred limb kicks of females. These differences occurred in the backswing and leg-cocking phases where the hamstring muscles were the most active. A variation in the functioning of the hamstring muscles and that of the gluteus maximus and iliopsoas in the case of a previous injury could account for the differences observed in the kicking pattern. Therefore, the effects of a previous hamstring injury must be considered while designing rehabilitation programs to re-educate kicking movement.

  15. Using the raindrop size distribution to quantify the soil detachment rate at the laboratory scale

    NASA Astrophysics Data System (ADS)

    Jomaa, S.; Jaffrain, J.; Barry, D. A.; Berne, A.; Sander, G. C.

    2010-05-01

    Rainfall simulators are beneficial tools for studying soil erosion processes and sediment transport for different circumstances and scales. They are useful to better understand soil erosion mechanisms and, therefore, to develop and validate process-based erosion models. Simulators permit experimental replicates for both simple and complex configurations. The 2 m × 6 m EPFL erosion flume is equipped with a hydraulic slope control and a sprinkling system located on oscillating bars 3 m above the surface. It provides a near-uniform spatial rainfall distribution. The intensity of the precipitation can be adjusted by changing the oscillation interval. The flume is filled to a depth of 0.32 m with an agricultural loamy soil. Raindrop detachment is an important process in interrill erosion, the latter varying with the soil properties as well as the raindrop size distribution and drop velocity. Since the soil detachment varies with the kinetic energy of raindrops, an accurate characterization of drop size distribution (DSD, measured, e.g., using a laser disdrometer) can potentially support erosion calculations. Here, a laser disdrometer was used at different rainfall intensities in the EPFL flume to quantify the rainfall event in terms of number of drops, diameter and velocity. At the same time, soil particle motion was measured locally using splash cups. These cups measured the detached material rates into upslope and downslope compartments. In contrast to previously reported splash cup experiments, the cups used in this study were equipped at the top with upside-down funnels, the upper part having the same diameter as the soil sampled at the bottom. This ensured that the soil detached and captured by the device was not re-exposed to rainfall. The experimental data were used to quantify the relationship between the raindrop distribution and the splash-driven sediment transport.
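
    The link between the drop size distribution and detachment energy rests on the kinetic-energy flux, which can be computed from disdrometer size-velocity classes: each drop of diameter D and fall speed v carries (π/12)·ρ_w·D³·v². The class values and sampling area below are illustrative, not measurements from the EPFL flume.

      import numpy as np

      RHO_W = 1000.0                                    # water density, kg/m^3

      def kinetic_energy_flux(diam_mm, velocity, counts, area_m2, dt_s):
          """Rainfall kinetic-energy flux (J m^-2 s^-1) from disdrometer classes:
          drop counts per (diameter, velocity) class over sampling area and time."""
          d = np.asarray(diam_mm, dtype=float) * 1e-3   # drop diameter, m
          v = np.asarray(velocity, dtype=float)         # fall speed, m/s
          n = np.asarray(counts, dtype=float)
          ke_per_drop = (np.pi / 12.0) * RHO_W * d ** 3 * v ** 2
          return float(np.sum(n * ke_per_drop) / (area_m2 * dt_s))

      # illustrative classes: counts over 60 s on a ~54 cm^2 sampling area
      print(kinetic_energy_flux(diam_mm=[0.5, 1.5, 3.0], velocity=[2.0, 5.5, 8.0],
                                counts=[400, 120, 15], area_m2=0.0054, dt_s=60.0))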

  16. Subsequent childbirth after a previous traumatic birth.

    PubMed

    Beck, Cheryl Tatano; Watson, Sue

    2010-01-01

    Nine percent of new mothers in the United States who participated in the Listening to Mothers II Postpartum Survey screened positive for meeting the Diagnostic and Statistical Manual of Mental Disorders, Fourth Edition criteria for posttraumatic stress disorder after childbirth. Women who have had a traumatic birth experience report fewer subsequent children and a longer length of time before their second baby. Childbirth-related posttraumatic stress disorder impacts couples' physical relationship, communication, conflict, emotions, and bonding with their children. The purpose of this study was to describe the meaning of women's experiences of a subsequent childbirth after a previous traumatic birth. Phenomenology was the research design used. An international sample of 35 women participated in this Internet study. Women were asked, "Please describe in as much detail as you can remember your subsequent pregnancy, labor, and delivery following your previous traumatic birth." Colaizzi's phenomenological data analysis approach was used to analyze the stories of the 35 women. Data analysis yielded four themes: (a) riding the turbulent wave of panic during pregnancy; (b) strategizing: attempts to reclaim their body and complete the journey to motherhood; (c) bringing reverence to the birthing process and empowering women; and (d) still elusive: the longed-for healing birth experience. Subsequent childbirth after a previous birth trauma has the potential to either heal or retraumatize women. During pregnancy, women need permission and encouragement to grieve their prior traumatic births to help remove the burden of their invisible pain.

  17. Quantifying Short-Chain Chlorinated Paraffin Congener Groups.

    PubMed

    Yuan, Bo; Bogdal, Christian; Berger, Urs; MacLeod, Matthew; Gebbink, Wouter A; Alsberg, Tomas; de Wit, Cynthia A

    2017-09-19

    Accurate quantification of short-chain chlorinated paraffins (SCCPs) poses an exceptional challenge to analytical chemists. SCCPs are complex mixtures of chlorinated alkanes with variable chain length and chlorination level; congeners with a fixed chain length (n) and number of chlorines (m) are referred to as a "congener group" CnClm. Recently, we resolved individual CnClm by mathematically deconvolving soft ionization high-resolution mass spectra of SCCP mixtures. Here we extend the method to quantifying CnClm by introducing CnClm specific response factors (RFs) that are calculated from 17 SCCP chain-length standards with a single carbon chain length and variable chlorination level. The signal pattern of each standard is measured on APCI-QTOF-MS. RFs of each CnClm are obtained by pairwise optimization of the normal distribution's fit to the signal patterns of the 17 chain-length standards. The method was verified by quantifying SCCP technical mixtures and spiked environmental samples with accuracies of 82-123% and 76-109%, respectively. The absolute differences between calculated and manufacturer-reported chlorination degrees were -0.9 to 1.0%Cl for SCCP mixtures of 49-71%Cl. The quantification method has been replicated with ECNI magnetic sector MS and ECNI-Q-Orbitrap-MS. CnClm concentrations determined with the three instruments were highly correlated (R² > 0.90) with each other.

  18. Quantifying the ventilatory control contribution to sleep apnoea using polysomnography.

    PubMed

    Terrill, Philip I; Edwards, Bradley A; Nemati, Shamim; Butler, James P; Owens, Robert L; Eckert, Danny J; White, David P; Malhotra, Atul; Wellman, Andrew; Sands, Scott A

    2015-02-01

    Elevated loop gain, consequent to hypersensitive ventilatory control, is a primary nonanatomical cause of obstructive sleep apnoea (OSA) but it is not possible to quantify this in the clinic. Here we provide a novel method to estimate loop gain in OSA patients using routine clinical polysomnography alone. We use the concept that spontaneous ventilatory fluctuations due to apnoeas/hypopnoeas (disturbance) result in opposing changes in ventilatory drive (response) as determined by loop gain (response/disturbance). Fitting a simple ventilatory control model (including chemical and arousal contributions to ventilatory drive) to the ventilatory pattern of OSA reveals the underlying loop gain. Following mathematical-model validation, we critically tested our method in patients with OSA by comparison with a standard (continuous positive airway pressure (CPAP) drop method), and by assessing its ability to detect the known reduction in loop gain with oxygen and acetazolamide. Our method quantified loop gain from baseline polysomnography (correlation versus CPAP-estimated loop gain: n=28; r=0.63, p<0.001), detected the known reduction in loop gain with oxygen (n=11; mean±sem change in loop gain (ΔLG) -0.23±0.08, p=0.02) and acetazolamide (n=11; ΔLG -0.20±0.06, p=0.005), and predicted the OSA response to loop gain-lowering therapy. We validated a means to quantify the ventilatory control contribution to OSA pathogenesis using clinical polysomnography, enabling identification of likely responders to therapies targeting ventilatory control. Copyright ©ERS 2015.

  19. Quantifying the ventilatory control contribution to sleep apnoea using polysomnography

    PubMed Central

    Terrill, Philip I.; Edwards, Bradley A.; Nemati, Shamim; Butler, James P.; Owens, Robert L.; Eckert, Danny J.; White, David P.; Malhotra, Atul; Wellman, Andrew; Sands, Scott A.

    2015-01-01

    Elevated loop gain, consequent to hypersensitive ventilatory control, is a primary nonanatomical cause of obstructive sleep apnoea (OSA) but it is not possible to quantify this in the clinic. Here we provide a novel method to estimate loop gain in OSA patients using routine clinical polysomnography alone. We use the concept that spontaneous ventilatory fluctuations due to apnoeas/hypopnoeas (disturbance) result in opposing changes in ventilatory drive (response) as determined by loop gain (response/disturbance). Fitting a simple ventilatory control model (including chemical and arousal contributions to ventilatory drive) to the ventilatory pattern of OSA reveals the underlying loop gain. Following mathematical-model validation, we critically tested our method in patients with OSA by comparison with a standard (continuous positive airway pressure (CPAP) drop method), and by assessing its ability to detect the known reduction in loop gain with oxygen and acetazolamide. Our method quantified loop gain from baseline polysomnography (correlation versus CPAP-estimated loop gain: n=28; r=0.63, p<0.001), detected the known reduction in loop gain with oxygen (n=11; mean±SEM change in loop gain (ΔLG) −0.23±0.08, p=0.02) and acetazolamide (n=11; ΔLG −0.20±0.06, p=0.005), and predicted the OSA response to loop gain-lowering therapy. We validated a means to quantify the ventilatory control contribution to OSA pathogenesis using clinical polysomnography, enabling identification of likely responders to therapies targeting ventilatory control. PMID:25323235

  20. Engineering Students Designing a Statistical Procedure for Quantifying Variability

    ERIC Educational Resources Information Center

    Hjalmarson, Margret A.

    2007-01-01

    The study examined first-year engineering students' responses to a statistics task that asked them to generate a procedure for quantifying variability in a data set from an engineering context. Teams used technological tools to perform computations, and their final product was a ranking procedure. The students could use any statistical measures,…

  1. Quantifying fossil fuel CO2 from continuous measurements of APO: a novel approach

    NASA Astrophysics Data System (ADS)

    Pickers, Penelope; Manning, Andrew C.; Forster, Grant L.; van der Laan, Sander; Wilson, Phil A.; Wenger, Angelina; Meijer, Harro A. J.; Oram, David E.; Sturges, William T.

    2016-04-01

    Using atmospheric measurements to accurately quantify CO2 emissions from fossil fuel sources requires the separation of biospheric and anthropogenic CO2 fluxes. The ability to quantify the fossil fuel component of CO2 (ffCO2) from atmospheric measurements enables more accurate 'top-down' verification of CO2 emissions inventories, which frequently have large uncertainty. Typically, ffCO2 is quantified (in ppm units) from discrete atmospheric measurements of Δ14CO2, combined with higher resolution atmospheric CO measurements, and with knowledge of CO:ffCO2 ratios. In the United Kingdom (UK), however, measurements of Δ14CO2 are often significantly biased by nuclear power plant influences, which limit the use of this approach. We present a novel approach for quantifying ffCO2 using measurements of APO (Atmospheric Potential Oxygen; a tracer derived from concurrent measurements of CO2 and O2) from two measurement sites in Norfolk, UK. Our approach is similar to that used for quantifying ffCO2 from CO measurements (ffCO2(CO)), whereby ffCO2(APO) = (APOmeas - APObg)/RAPO, where (APOmeas - APObg) is the APO deviation from the background, and RAPO is the APO:CO2 combustion ratio for fossil fuel. Time varying values of RAPO are calculated from the global gridded COFFEE (CO2 release and Oxygen uptake from Fossil Fuel Emission Estimate) dataset, combined with NAME (Numerical Atmospheric-dispersion Modelling Environment) transport model footprints. We compare our ffCO2(APO) results to results obtained using the ffCO2(CO) method, using CO:CO2 fossil fuel emission ratios (RCO) from the EDGAR (Emission Database for Global Atmospheric Research) database. We find that the APO ffCO2 quantification method is more precise than the CO method, owing primarily to a smaller range of possible APO:CO2 fossil fuel emission ratios, compared to the CO:CO2 emission ratio range. Using a long-term dataset of atmospheric O2, CO2, CO and Δ14CO2 from Lutjewad, The Netherlands, we examine the
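
    The quantification formula quoted above can be written as a one-line calculation; the background APO value and the R_APO combustion ratio below are rough placeholders, whereas in the study the background is derived from the measurements and R_APO comes from the COFFEE dataset combined with NAME footprints.

      def ffco2_from_apo(apo_meas, apo_bg, r_apo):
          """ffCO2 (ppm) = (APOmeas - APObg) / RAPO, with APO in per meg and RAPO the
          APO:CO2 fossil-fuel combustion ratio (per meg per ppm; negative, since
          combustion consumes O2). The numbers below are placeholders only."""
          return (apo_meas - apo_bg) / r_apo

      print(ffco2_from_apo(apo_meas=-286.0, apo_bg=-280.0, r_apo=-1.4))   # ~4.3 ppm ffCO2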

  2. ASPRS research on quantifying the geometric quality of lidar data

    USGS Publications Warehouse

    Sampath, Aparajithan; Heidemann, Hans K.; Stensaas, Gregory L.; Christopherson, Jon B.

    2014-01-01

    The ASPRS Lidar Cal/Val (calibration/validation) Working Group, led by the US Geological Survey (USGS) to establish “Guidelines on Geometric Accuracy and Quality of Lidar Data”, has made excellent progress via regular teleconferences and meetings. The group is focused on identifying data quality metrics and establishing a set of guidelines for quantifying the quality of lidar data. The working group has defined and agreed on lidar Data Quality Measures (DQMs) to be used for this purpose. The DQMs are envisaged as the first-ever consistent way of checking lidar data. It is expected that these metrics will be used as standard methods for quantifying the geometric quality of lidar data. The goal of this article is to communicate these developments to readers and the larger geospatial community and to invite them to participate in the process.

  3. Quantifying CO2 Emissions From Individual Power Plants From Space

    NASA Astrophysics Data System (ADS)

    Nassar, Ray; Hill, Timothy G.; McLinden, Chris A.; Wunch, Debra; Jones, Dylan B. A.; Crisp, David

    2017-10-01

    In order to better manage anthropogenic CO2 emissions, improved methods of quantifying emissions are needed at all spatial scales from the national level down to the facility level. Although the Orbiting Carbon Observatory 2 (OCO-2) satellite was not designed for monitoring power plant emissions, we show that in some cases, CO2 observations from OCO-2 can be used to quantify daily CO2 emissions from individual middle- to large-sized coal power plants by fitting the data to plume model simulations. Emission estimates for U.S. power plants are within 1-17% of reported daily emission values, enabling application of the approach to international sites that lack detailed emission information. This affirms that a constellation of future CO2 imaging satellites, optimized for point sources, could monitor emissions from individual power plants to support the implementation of climate policies.

  4. Quantifying non-recurring delay on New York City's arterial highways.

    DOT National Transportation Integrated Search

    2008-12-01

    This research project was undertaken for NYSDOT to provide a better understanding of the impacts of traffic incidents/accidents on traffic delays on New York City's arterial highways, and to better quantify and predict non-recurring traffic delay ...

  5. Web client and ODBC access to legacy database information: a low cost approach.

    PubMed Central

    Sanders, N. W.; Mann, N. H.; Spengler, D. M.

    1997-01-01

    A new method has been developed for the Department of Orthopaedics of Vanderbilt University Medical Center to access departmental clinical data. Previously this data was stored only in the medical center's mainframe DB2 database; it is now additionally stored in a departmental SQL database. Access to this data is available via any ODBC-compliant front end or a web client. With a small budget and no full-time staff, we were able to give our department on-line access to many years' worth of patient data that was previously inaccessible. PMID:9357735

  6. Quantifying the uncertainty in site amplification modeling and its effects on site-specific seismic-hazard estimation in the upper Mississippi embayment and adjacent areas

    USGS Publications Warehouse

    Cramer, C.H.

    2006-01-01

    The Mississippi embayment, located in the central United States, and its thick deposits of sediments (over 1 km in places) have a large effect on earthquake ground motions. Several previous studies have addressed how these thick sediments might modify probabilistic seismic-hazard maps. The high seismic hazard associated with the New Madrid seismic zone makes it particularly important to quantify the uncertainty in modeling site amplification to better represent earthquake hazard in seismic-hazard maps. The methodology of the Memphis urban seismic-hazard-mapping project (Cramer et al., 2004) is combined with the reference profile approach of Toro and Silva (2001) to better estimate seismic hazard in the Mississippi embayment. Improvements over previous approaches include using the 2002 national seismic-hazard model, fully probabilistic hazard calculations, calibration of site amplification with improved nonlinear soil-response estimates, and estimates of uncertainty. Comparisons are made with the results of several previous studies, and estimates of uncertainty inherent in site-amplification modeling for the upper Mississippi embayment are developed. I present new seismic-hazard maps for the upper Mississippi embayment with the effects of site geology incorporating these uncertainties.

  7. Quantifying climate feedbacks in polar regions.

    PubMed

    Goosse, Hugues; Kay, Jennifer E; Armour, Kyle C; Bodas-Salcedo, Alejandro; Chepfer, Helene; Docquier, David; Jonko, Alexandra; Kushner, Paul J; Lecomte, Olivier; Massonnet, François; Park, Hyo-Seok; Pithan, Felix; Svensson, Gunilla; Vancoppenolle, Martin

    2018-05-15

    The concept of feedback is key in assessing whether a perturbation to a system is amplified or damped by mechanisms internal to the system. In polar regions, climate dynamics are controlled by both radiative and non-radiative interactions between the atmosphere, ocean, sea ice, ice sheets and land surfaces. Precisely quantifying polar feedbacks is required for a process-oriented evaluation of climate models, a clear understanding of the processes responsible for polar climate changes, and a reduction in uncertainty associated with model projections. This quantification can be performed using a simple and consistent approach that is valid for a wide range of feedbacks, offering the opportunity for more systematic feedback analyses and a better understanding of polar climate changes.

  8. Fatigue in healthy and diseased individuals.

    PubMed

    Finsterer, Josef; Mahjoub, Sinda Zarrouk

    2014-08-01

    Although fatigue is experienced by everyone, its definition and classification remain under debate. This article reviews previously published data on fatigue. Fatigue is influenced by age, gender, physical condition, type of food, latency to last meal, mental status, psychological conditions, personality type, life experience, and the health status of an individual. Fatigue may not only be a symptom but also a measurable and quantifiable dimension, also known as fatigability. Additionally, it may be classified as a condition occurring at rest or under exercise or stress, as physiologic reaction or pathologic condition, as spontaneous phenomenon or triggerable state, as resistant or non-resistant to preconditioning, training, or attitude, as prominent or collateral experience, and as accessible or inaccessible to any type of treatment or intervention. Fatigue may be the sole symptom of a disease or one among others. It may also be classified as acute or chronic. Quantification of fatigability is achievable by fatigue scores, force measurement, electromyography, or other means. Fatigue and fatigability need to be delineated from conditions such as sleepiness, apathy, exhaustion, exercise intolerance, lack of vigor, weakness, inertia, or tiredness. Among neurological disorders, the prevalence of fatigue is particularly increased in multiple sclerosis, amyotrophic lateral sclerosis, Parkinson disease, traumatic brain injury, stroke, and bleeding and also in neuromuscular disorders. Fatigue may be influenced by training, mental preconditioning, or drugs. Fatigue needs to be recognized as an important condition that is not only a symptom but may also be quantified and can be modified by various measures depending on the underlying cause. © The Author(s) 2013.

  9. Elliptic Cylinder Airborne Sampling and Geostatistical Mass Balance Approach for Quantifying Local Greenhouse Gas Emissions.

    PubMed

    Tadić, Jovan M; Michalak, Anna M; Iraci, Laura; Ilić, Velibor; Biraud, Sébastien C; Feldman, Daniel R; Bui, Thaopaul; Johnson, Matthew S; Loewenstein, Max; Jeong, Seongeun; Fischer, Marc L; Yates, Emma L; Ryoo, Ju-Mee

    2017-09-05

    In this study, we explore observational, experimental, methodological, and practical aspects of the flux quantification of greenhouse gases from local point sources by using in situ airborne observations, and suggest a series of conceptual changes to improve flux estimates. We address the major sources of uncertainty reported in previous studies by modifying (1) the shape of the typical flight path, (2) the modeling of covariance and anisotropy, and (3) the type of interpolation tools used. We show that a cylindrical flight profile offers considerable advantages compared to traditional profiles collected as curtains, although this new approach brings with it the need for a more comprehensive subsequent analysis. The proposed flight pattern design does not require prior knowledge of wind direction and allows for the derivation of an ad hoc empirical correction factor to partially alleviate errors resulting from interpolation and measurement inaccuracies. The modified approach is applied to a use-case for quantifying CH4 emission from an oil field south of San Ardo, CA, and compared to a bottom-up CH4 emission estimate.

  10. Quantifying competitive ability of perennial grasses to inhibit Scotch broom

    Treesearch

    Timothy Harrington

    2011-01-01

    Greenhouse pot studies were conducted to quantify the competitive abilities of three native perennial grass species to inhibit development of Scotch broom (Cytisus scoparius (L.) Link) seedlings: spike bentgrass (Agrostis exarata Trin.), blue wildrye (Elymus glaucus Buckley), and western fescue (

  11. A NEW METHOD TO QUANTIFY CORE TEMPERATURE INSTABILITY IN RODENTS.

    EPA Science Inventory

    Methods to quantify instability of autonomic systems such as temperature regulation should be important in toxicant and drug safety studies. Stability of core temperature (Tc) in laboratory rodents is susceptible to a variety of stimuli. Calculating the temperature differential o...

  12. Rapid culture-independent microbial analysis aboard the international space station (ISS) stage two: quantifying three microbial biomarkers.

    PubMed

    Morris, Heather C; Damon, Michael; Maule, Jake; Monaco, Lisa A; Wainwright, Norm

    2012-09-01

    A portable, rapid, microbial detection unit, the Lab-On-a-Chip Application Development Portable Test System (LOCAD-PTS), was launched to the International Space Station (ISS) as a technology demonstration unit in December 2006. Results from the first series of experiments designed to detect Gram-negative bacteria on ISS surfaces by quantifying a single microbial biomarker, lipopolysaccharide (LPS), were reported in a previous article. Herein, we report additional technology demonstration experiments expanding the on-orbit capabilities of the LOCAD-PTS to detecting three different microbial biomarkers on ISS surfaces. Six different astronauts on more than 20 occasions participated in these experiments, which were designed to test the new beta-glucan (fungal cell wall molecule) and lipoteichoic acid (LTA; Gram-positive bacterial cell wall component) cartridges individually and in tandem with the existing Limulus Amebocyte Lysate (LAL; Gram-negative bacterial LPS detection) cartridges. Additionally, we conducted the sampling side by side with the standard culture-based detection method currently used on the ISS. Therefore, we present data on the distribution of three microbial biomarkers collected from various surfaces in every module present on the ISS at the time of sampling. In accordance with our previous experiments, we determined that spacecraft surfaces known to be frequently in contact with crew members demonstrated higher values of all three microbial molecules. Key Words: Planetary protection-Spaceflight-Microbiology-Biosensor. Astrobiology 12, 830-840.

  13. Using expert opinion to quantify unmeasured confounding bias parameters.

    PubMed

    Navadeh, Soodabeh; Mirzazadeh, Ali; McFarland, Willi; Woolf-King, Sarah; Mansournia, Mohammad Ali

    2016-06-27

    To develop and apply a method to quantify bias parameters in the case example of the association between alcohol use and HIV-serodiscordant condomless anal sex with potential confounding by sensation seeking among men who have sex with men (MSM), using expert opinion as an external data source. Through an online survey, we sought the input of 41 epidemiologists and behavioural scientists to quantify six parameters in the population of MSM: the proportion of high sensation seeking among heavy-drinking MSM, the proportion of high sensation seeking among low-level drinking MSM, and the risk ratio (RR) of the association between sensation seeking and condomless anal sex, for HIV-positive and HIV-negative MSM. Eleven experts responded. For HIV-positive heavy drinkers, the proportion of high sensation seeking was 53.6% (beta distribution [α=5.50, β=4.78]), and 41.1% (beta distribution [α=3.10, β=4.46]) in HIV-negative heavy drinkers. In HIV-positive low-level alcohol drinkers, high sensation seeking was 26.9% (beta distribution [α=1.81, β=4.92]), similar to high sensation seeking among HIV-negative low-level alcohol drinkers (25.3%) (beta distribution [α=2.00, β=5.89]). The lnRR for the association between sensation seeking and condomless anal sex was ln(2.4) (normal distribution [μ=0.889, σ=0.438]) in HIV-positive and ln(1.5) (normal distribution [μ=0.625, σ=0.391]) in HIV-negative MSM. Expert opinion can be a simple and efficient method for deriving bias parameters to quantify and adjust for hypothesized confounding. In this test case, expert opinion confirmed sensation seeking as a confounder for the effect of alcohol on condomless anal sex and provided the parameters necessary for probabilistic bias analysis.
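
    The elicited distributions above lend themselves to a simple probabilistic bias analysis. The hedged sketch below draws from the HIV-positive stratum's beta and normal distributions as reported and applies the standard external-adjustment bias factor for an unmeasured binary confounder; the observed risk ratio is a hypothetical placeholder, and the choice of bias formula is our assumption rather than something stated in the record.

```python
import numpy as np

rng = np.random.default_rng(0)
n_draws = 10_000

# Elicited parameters for the HIV-positive stratum (from the record above):
p_conf_heavy = rng.beta(5.50, 4.78, n_draws)  # P(high sensation seeking | heavy drinking)
p_conf_low = rng.beta(1.81, 4.92, n_draws)    # P(high sensation seeking | low-level drinking)
rr_conf_outcome = np.exp(rng.normal(0.889, 0.438, n_draws))  # RR of confounder -> condomless sex

# Standard bias factor for an unmeasured binary confounder (our assumption):
bias = (p_conf_heavy * rr_conf_outcome + (1.0 - p_conf_heavy)) / (
    p_conf_low * rr_conf_outcome + (1.0 - p_conf_low)
)

rr_observed = 1.8  # hypothetical crude RR for heavy drinking -> condomless anal sex
rr_adjusted = rr_observed / bias
print(np.round(np.percentile(rr_adjusted, [2.5, 50, 97.5]), 2))
```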

  14. FRAGSTATS: spatial pattern analysis program for quantifying landscape structure.

    Treesearch

    Kevin McGarigal; Barbara J. Marks

    1995-01-01

    This report describes a program, FRAGSTATS, developed to quantify landscape structure. FRAGSTATS offers a comprehensive choice of landscape metrics and was designed to be as versatile as possible. The program is almost completely automated and thus requires little technical training. Two separate versions of FRAGSTATS exist: one for vector images and one for raster...

  15. Quantifying complexity in translational research: an integrated approach.

    PubMed

    Munoz, David A; Nembhard, Harriet Black; Kraschnewski, Jennifer L

    2014-01-01

    The purpose of this paper is to quantify complexity in translational research. The impact of major operational steps and technical requirements is calculated with respect to their ability to accelerate moving new discoveries into clinical practice. A three-phase integrated quality function deployment (QFD) and analytic hierarchy process (AHP) method was used to quantify complexity in translational research. A case study in obesity was used to assess usability. Generally, the evidence generated was valuable for understanding various components in translational research. Particularly, the authors found that collaboration networks, multidisciplinary team capacity and community engagement are crucial for translating new discoveries into practice. As the method is mainly based on subjective opinion, some argue that the results may be biased. However, a consistency ratio is calculated and used as a guide to subjectivity. Alternatively, a larger sample may be incorporated to reduce bias. The integrated QFD-AHP framework provides evidence that could be helpful to generate agreement, develop guidelines, allocate resources wisely, identify benchmarks and enhance collaboration among similar projects. Current conceptual models in translational research provide little or no clue to assess complexity. The proposed method aimed to fill this gap. Additionally, the literature review includes various features that have not been explored in translational research.

  16. Quantifying historical carbon and climate debts among nations

    NASA Astrophysics Data System (ADS)

    Matthews, H. Damon

    2016-01-01

    Contributions to historical climate change have varied substantially among nations. These differences reflect underlying inequalities in wealth and development, and pose a fundamental challenge to the implementation of a globally equitable climate mitigation strategy. This Letter presents a new way to quantify historical inequalities among nations using carbon and climate debts, defined as the amount by which national climate contributions have exceeded a hypothetical equal per-capita share over time. Considering only national CO2 emissions from fossil fuel combustion, accumulated carbon debts across all nations from 1990 to 2013 total 250 billion tonnes of CO2, representing 40% of cumulative world emissions since 1990. Expanding this to reflect the temperature response to a range of emissions, historical climate debts accrued between 1990 and 2010 total 0.11 °C, close to a third of observed warming over that period. Large fractions of this debt are carried by industrialized countries, but also by countries with high levels of deforestation and agriculture. These calculations could contribute to discussions of climate responsibility by providing a tangible way to quantify historical inequalities, which could then inform the funding of mitigation, adaptation and the costs of loss and damages in those countries that have contributed less to historical warming.
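
    As a worked illustration of the debt definition above (national contribution minus a hypothetical equal per-capita share), the sketch below uses placeholder numbers rather than the Letter's data; the full calculation in the Letter accumulates shares year by year rather than from a single cumulative total.

```python
def carbon_debt(national_cumulative_emissions_gt, population_share,
                world_cumulative_emissions_gt):
    """Carbon debt: amount by which a nation's cumulative CO2 emissions exceed
    an equal per-capita share of world cumulative emissions (negative values
    are credits). A single-period simplification of the definition above."""
    equal_share = population_share * world_cumulative_emissions_gt
    return national_cumulative_emissions_gt - equal_share

# Placeholder figures (Gt CO2, 1990-2013), not values from the Letter:
print(carbon_debt(100.0, population_share=0.05, world_cumulative_emissions_gt=1000.0))
# -> 50.0 Gt CO2 of carbon debt
```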

  17. Quantifying complexity in translational research: an integrated approach

    PubMed Central

    Munoz, David A.; Nembhard, Harriet Black; Kraschnewski, Jennifer L.

    2014-01-01

    Purpose This article quantifies complexity in translational research. The impact of major operational steps and technical requirements (TR) is calculated with respect to their ability to accelerate moving new discoveries into clinical practice. Design/Methodology/Approach A three-phase integrated Quality Function Deployment (QFD) and Analytic Hierarchy Process (AHP) method was used to quantify complexity in translational research. A case study in obesity was used to assess usability. Findings Generally, the evidence generated was valuable for understanding various components in translational research. Particularly, we found that collaboration networks, multidisciplinary team capacity and community engagement are crucial for translating new discoveries into practice. Research limitations/implications As the method is mainly based on subjective opinion, some argue that the results may be biased. However, a consistency ratio is calculated and used as a guide to subjectivity. Alternatively, a larger sample may be incorporated to reduce bias. Practical implications The integrated QFD-AHP framework provides evidence that could be helpful to generate agreement, develop guidelines, allocate resources wisely, identify benchmarks and enhance collaboration among similar projects. Originality/value Current conceptual models in translational research provide little or no clue to assess complexity. The proposed method aimed to fill this gap. Additionally, the literature review includes various features that have not been explored in translational research. PMID:25417380

  18. Method and Apparatus for Detecting and Quantifying Bacterial Spores on a Surface

    NASA Technical Reports Server (NTRS)

    Ponce, Adrian (Inventor)

    2017-01-01

    A method and an apparatus for detecting and quantifying bacterial spores on a surface. In accordance with the method: a matrix including lanthanide ions is provided on the surface containing the bacterial spores; functionalized aromatic molecules are released from the bacterial spores on the surface; a complex of the lanthanide ion and the aromatic molecule is formed on the surface; the complex of the lanthanide ion and the aromatic molecule is excited to generate a characteristic luminescence of the complex on the surface; and the bacterial spores exhibiting the luminescence of the complex on the surface are detected and quantified.

  19. Method and apparatus for detecting and quantifying bacterial spores on a surface

    NASA Technical Reports Server (NTRS)

    Ponce, Adrian (Inventor)

    2009-01-01

    A method and an apparatus for detecting and quantifying bacterial spores on a surface. In accordance with the method: a matrix including lanthanide ions is provided on the surface containing the bacterial spores; functionalized aromatic molecules are released from the bacterial spores on the surface; a complex of the lanthanide ion and the aromatic molecule is formed on the surface; the complex of the lanthanide ion and the aromatic molecule is excited to generate a characteristic luminescence of the complex on the surface; and the bacterial spores exhibiting the luminescence of the complex on the surface are detected and quantified.

  20. Quantifying repetitive speech in autism spectrum disorders and language impairment.

    PubMed

    van Santen, Jan P H; Sproat, Richard W; Hill, Alison Presmanes

    2013-10-01

    We report on an automatic technique for quantifying two types of repetitive speech: repetitions of what the child says him/herself (self-repeats) and of what is uttered by an interlocutor (echolalia). We apply this technique to a sample of 111 children between the ages of four and eight: 42 typically developing children (TD), 19 children with specific language impairment (SLI), 25 children with autism spectrum disorders (ASD) plus language impairment (ALI), and 25 children with ASD with normal, non-impaired language (ALN). The results indicate robust differences in echolalia between the TD and ASD groups as a whole (ALN + ALI), and between TD and ALN children. There were no significant differences between ALI and SLI children for echolalia or self-repetitions. The results confirm previous findings that children with ASD repeat the language of others more than other populations of children. On the other hand, self-repetition does not appear to be significantly more frequent in ASD, nor does it matter whether the child's echolalia occurred within one (immediate) or two turns (near-immediate) of the adult's original utterance. Furthermore, non-significant differences between ALN and SLI, between TD and SLI, and between ALI and TD are suggestive that echolalia may not be specific to ALN or to ASD in general. One important innovation of this work is an objective fully automatic technique for assessing the amount of repetition in a transcript of a child's utterances. © 2013 International Society for Autism Research, Wiley Periodicals, Inc.

  1. [Adolescents previously involved in Satanism: mental health problems experience].

    PubMed

    Heathcote, H; Gmeiner, A; Poggenpoel, M

    1998-03-01

    No research has previously been done on the phenomenon of adolescents formerly involved in Satanism who experience obstacles in their strive for mental health. Adolescents previously involved in Satanism present behaviour problems such as aggressive outbursts, depression, "psychosis", or suicide attempts that can even lead to suicide. In the phenomenon analysis, semi-structured phenomenological interviews were performed with the respondents and their parents. The respondents were requested to write a naive sketch about their life. After data control was completed, guidelines were set for nursing staff for the management of adolescents who have previously been involved in Satanism and experience obstacles in their strive for mental health. Interviews with experts in Satanism were conducted, and literature in the form of books, magazines and newspaper clippings was used to verify the research findings. The most important guidelines are that the caregivers have to be reborn Christians; they are not allowed to show any fear or sympathy; they have to have sufficient knowledge about Satanism; the adolescent has to be unconditionally accepted; the caregivers have to work in a team; and the adolescents have to be taught to deal with their emotions.

  2. Adolescents previously involved in Satanism experiencing mental health problems.

    PubMed

    Heathcote, H; Gmeiner, A; Poggenpoel, M

    1999-06-01

    No research has previously been done regarding the phenomenon of adolescents who have previously been involved in Satanism and who experience obstacles in their strive for mental health. Adolescents previously involved in Satanism present behavioral problems like aggressive outbursts, depression, "psychosis" or suicide attempts, that could lead to suicide. In the phenomenon-analysis semi-structured, phenomenological interviews were performed with the respondents and their parents. The respondents were requested to write a naïve sketch about their life. After completion of the data-control, guidelines for nursing staff were set. The guidelines are set for the management of adolescents who have previously been involved in Satanism and who experience obstacles in their strive for mental health. Interviews with experts in Satanism were conducted, literature in the form of books, magazines and newspaper-clippings were used to verify the research findings. The most important guidelines are that the caregivers have to be reborn Christians; they are not allowed to show any fear or sympathy; they must have sufficient knowledge about Satanism; the adolescents have to be unconditionally accepted; the caregivers have to work in a team and the adolescents have to be taught to deal with their emotions.

  3. Previously unknown species of Aspergillus.

    PubMed

    Gautier, M; Normand, A-C; Ranque, S

    2016-08-01

    The use of multi-locus DNA sequence analysis has led to the description of previously unknown 'cryptic' Aspergillus species, whereas classical morphology-based identification of Aspergillus remains limited to the section or species-complex level. The current literature highlights two main features concerning these 'cryptic' Aspergillus species. First, the prevalence of such species in clinical samples is relatively high compared with emergent filamentous fungal taxa such as Mucorales, Scedosporium or Fusarium. Second, it is clearly important to identify these species in the clinical laboratory because of the high frequency of antifungal drug-resistant isolates of such Aspergillus species. Matrix-assisted laser desorption/ionization-time of flight mass spectrometry (MALDI-TOF MS) has recently been shown to enable the identification of filamentous fungi with an accuracy similar to that of DNA sequence-based methods. As MALDI-TOF MS is well suited to the routine clinical laboratory workflow, it facilitates the identification of these 'cryptic' Aspergillus species at the routine mycology bench. The rapid establishment of enhanced filamentous fungi identification facilities will lead to a better understanding of the epidemiology and clinical importance of these emerging Aspergillus species. Based on routine MALDI-TOF MS-based identification results, we provide original insights into the key interpretation issues of a positive Aspergillus culture from a clinical sample. Which ubiquitous species that are frequently isolated from air samples are rarely involved in human invasive disease? Can both the species and the type of biological sample indicate Aspergillus carriage, colonization or infection in a patient? Highly accurate routine filamentous fungi identification is central to enhance the understanding of these previously unknown Aspergillus species, with a vital impact on further improved patient care. Copyright © 2016 European Society of Clinical Microbiology and

  4. Quantifying intracellular rates of glycolytic and oxidative ATP production and consumption using extracellular flux measurements

    PubMed Central

    Mookerjee, Shona A.; Gerencser, Akos A.; Nicholls, David G.; Brand, Martin D.

    2017-01-01

    Partitioning of ATP generation between glycolysis and oxidative phosphorylation is central to cellular bioenergetics but cumbersome to measure. We describe here how rates of ATP generation by each pathway can be calculated from simultaneous measurements of extracellular acidification and oxygen consumption. We update theoretical maximum ATP yields by mitochondria and cells catabolizing different substrates. Mitochondrial P/O ratios (mol of ATP generated per mol of [O] consumed) are 2.73 for oxidation of pyruvate plus malate and 1.64 for oxidation of succinate. Complete oxidation of glucose by cells yields up to 33.45 ATP/glucose with a maximum P/O of 2.79. We introduce novel indices to quantify bioenergetic phenotypes. The glycolytic index reports the proportion of ATP production from glycolysis and identifies cells as primarily glycolytic (glycolytic index > 50%) or primarily oxidative. The Warburg effect is a chronic increase in glycolytic index, quantified by the Warburg index. Additional indices quantify the acute flexibility of ATP supply. The Crabtree index and Pasteur index quantify the responses of oxidative and glycolytic ATP production to alterations in glycolysis and oxidative reactions, respectively; the supply flexibility index quantifies overall flexibility of ATP supply; and the bioenergetic capacity quantifies the maximum rate of total ATP production. We illustrate the determination of these indices using C2C12 myoblasts. Measurement of ATP use revealed no significant preference for glycolytic or oxidative ATP by specific ATP consumers. Overall, we demonstrate how extracellular fluxes quantitatively reflect intracellular ATP turnover and cellular bioenergetics. We provide a simple spreadsheet to calculate glycolytic and oxidative ATP production rates from raw extracellular acidification and respiration data. PMID:28270511
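
    A minimal sketch of one of the indices defined above, the glycolytic index (the percentage of total ATP production supplied by glycolysis, with >50% classed as primarily glycolytic); the ATP production rates in the example are hypothetical, not values from the paper or its spreadsheet.

```python
def glycolytic_index(j_atp_glyc, j_atp_ox):
    """Glycolytic index (%): share of total ATP production from glycolysis.

    j_atp_glyc : glycolytic ATP production rate (e.g. pmol ATP/min)
    j_atp_ox   : oxidative ATP production rate (same units)
    """
    return 100.0 * j_atp_glyc / (j_atp_glyc + j_atp_ox)

# Hypothetical rates for a cell population:
gi = glycolytic_index(j_atp_glyc=120.0, j_atp_ox=240.0)
print(f"glycolytic index = {gi:.0f}%  ->  primarily {'glycolytic' if gi > 50 else 'oxidative'}")
```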

  5. 31 CFR 202.5 - Previously designated depositaries.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 31 Money and Finance: Treasury 2 2010-07-01 2010-07-01 false Previously designated depositaries. 202.5 Section 202.5 Money and Finance: Treasury Regulations Relating to Money and Finance (Continued) FISCAL SERVICE, DEPARTMENT OF THE TREASURY FINANCIAL MANAGEMENT SERVICE DEPOSITARIES AND FINANCIAL AGENTS...

  6. 31 CFR 202.5 - Previously designated depositaries.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 31 Money and Finance:Treasury 2 2012-07-01 2012-07-01 false Previously designated depositaries. 202.5 Section 202.5 Money and Finance: Treasury Regulations Relating to Money and Finance (Continued) FISCAL SERVICE, DEPARTMENT OF THE TREASURY FINANCIAL MANAGEMENT SERVICE DEPOSITARIES AND FINANCIAL AGENTS...

  7. Predicting and quantifying soil processes using “geomorphon” landform Classification

    USDA-ARS?s Scientific Manuscript database

    Soil development and behavior vary spatially at multiple observation scales. Predicting and quantifying soil properties and processes via a catena integrates predictable landscape scale variation relevant to both management decisions and soil survey. Soil maps generally convey variation as a set of ...

  8. Proceedings of Quantifying Sustainability in Puerto Rico: A Scientific Discussion

    EPA Science Inventory

    The purpose of the U.S. Environmental Protection Agency’s (EPA) Office of Research and Development’s (ORD) symposium/workshop entitled, “Quantifying Sustainability in Puerto Rico: A Scientific Discussion,” was to establish a dialogue between researchers and decision makers and fa...

  9. Designing a systematic landscape monitoring approach for quantifying ecosystem services

    EPA Science Inventory

    A key problem encountered early on by governments striving to incorporate the ecosystem services concept into decision making is quantifying ecosystem services across large landscapes. Basically, they are faced with determining what to measure, how to measure it and how to aggre...

  10. Quantifying and mapping spatial variability in simulated forest plots

    Treesearch

    Gavin R. Corral; Harold E. Burkhart

    2016-01-01

    We used computer simulations to test the efficacy of multivariate statistical methods to detect, quantify, and map spatial variability of forest stands. Simulated stands were developed of regularly-spaced plantations of loblolly pine (Pinus taeda L.). We assumed no effects of competition or mortality, but random variability was added to individual tree characteristics...

  11. Quantifying Stakeholder Values of VET Provision in the Netherlands

    ERIC Educational Resources Information Center

    van der Sluis, Margriet E.; Reezigt, Gerry J.; Borghans, Lex

    2014-01-01

    It is well-known that the quality of vocational education and training (VET) depends on how well a given programme aligns with the values and interests of its stakeholders, but it is less well-known what these values and interests are and to what extent they are shared across different groups of stakeholders. We use vignettes to quantify the…

  12. Children's Developing Knowledge of "Wh"-/Quantifier Question-Answer Relations

    ERIC Educational Resources Information Center

    Achimova, Asya; Syrett, Kristen; Musolino, Julien; Déprez, Viviane

    2017-01-01

    In response to questions in which a "wh"-term interacts with a universal quantifier in object position, such as "Who picked every toy?," children as old as 5 years of age often provide a list, pairing toys with the people who picked each of them. This response pattern is unexpected, it has been claimed, because children appear…

  13. Quantifying Ladder Fuels: A New Approach Using LiDAR

    Treesearch

    Heather Kramer; Brandon Collins; Maggi Kelly; Scott Stephens

    2014-01-01

    We investigated the relationship between LiDAR and ladder fuels in the northern Sierra Nevada, California USA. Ladder fuels are often targeted in hazardous fuel reduction treatments due to their role in propagating fire from the forest floor to tree crowns. Despite their importance, ladder fuels are difficult to quantify. One common approach is to calculate canopy base...

  14. Radiofrequency thermo-ablation of PVNS in the knee: initial results.

    PubMed

    Lalam, Radhesh K; Cribb, Gillian L; Cassar-Pullicino, Victor N; Cool, Wim P; Singh, Jaspreet; Tyrrell, Prudencia N M; Tins, Bernhard J; Winn, Naomi

    2015-12-01

    Pigmented villonodular synovitis (PVNS) is normally treated by arthroscopic or open surgical excision. We present our initial experience with radiofrequency thermo-ablation (RF ablation) of PVNS located in an inaccessible location in the knee. Review of all patients with histologically proven PVNS treated with RF ablation and with at least 2-year follow-up. Three patients met inclusion criteria and were treated with RF ablation. Two of the patients were treated successfully by one ablation procedure. One of the three patients had a recurrence which was also treated successfully by repeat RF ablation. There were no complications and all patients returned to their previous occupations following RF ablation. In this study we demonstrated the feasibility of performing RF ablation to treat PVNS in relatively inaccessible locations with curative intent. We have also discussed various post-ablation imaging appearances which can confound the assessment for residual/recurrent disease.

  15. 22 CFR 40.93 - Aliens unlawfully present after previous immigration violation.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 22 Foreign Relations 1 2014-04-01 2014-04-01 false Aliens unlawfully present after previous... TO BOTH NONIMMIGRANTS AND IMMIGRANTS UNDER THE IMMIGRATION AND NATIONALITY ACT, AS AMENDED Aliens Previously Removed § 40.93 Aliens unlawfully present after previous immigration violation. An alien described...

  16. 22 CFR 40.93 - Aliens unlawfully present after previous immigration violation.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 22 Foreign Relations 1 2013-04-01 2013-04-01 false Aliens unlawfully present after previous... TO BOTH NONIMMIGRANTS AND IMMIGRANTS UNDER THE IMMIGRATION AND NATIONALITY ACT, AS AMENDED Aliens Previously Removed § 40.93 Aliens unlawfully present after previous immigration violation. An alien described...

  17. 22 CFR 40.93 - Aliens unlawfully present after previous immigration violation.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 22 Foreign Relations 1 2012-04-01 2012-04-01 false Aliens unlawfully present after previous... TO BOTH NONIMMIGRANTS AND IMMIGRANTS UNDER THE IMMIGRATION AND NATIONALITY ACT, AS AMENDED Aliens Previously Removed § 40.93 Aliens unlawfully present after previous immigration violation. An alien described...

  18. 22 CFR 40.93 - Aliens unlawfully present after previous immigration violation.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 22 Foreign Relations 1 2011-04-01 2011-04-01 false Aliens unlawfully present after previous... TO BOTH NONIMMIGRANTS AND IMMIGRANTS UNDER THE IMMIGRATION AND NATIONALITY ACT, AS AMENDED Aliens Previously Removed § 40.93 Aliens unlawfully present after previous immigration violation. An alien described...

  19. 22 CFR 40.93 - Aliens unlawfully present after previous immigration violation.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 22 Foreign Relations 1 2010-04-01 2010-04-01 false Aliens unlawfully present after previous... TO BOTH NONIMMIGRANTS AND IMMIGRANTS UNDER THE IMMIGRATION AND NATIONALITY ACT, AS AMENDED Aliens Previously Removed § 40.93 Aliens unlawfully present after previous immigration violation. An alien described...

  20. Quantifying nonhomogeneous colors in agricultural materials part I: method development.

    PubMed

    Balaban, M O

    2008-11-01

    Measuring the color of food and agricultural materials using machine vision (MV) has advantages not available by other measurement methods such as subjective tests or use of color meters. The perception of consumers may be affected by the nonuniformity of colors. For relatively uniform colors, average color values similar to those given by color meters can be obtained by MV. For nonuniform colors, various image analysis methods (color blocks, contours, and "color change index"[CCI]) can be applied to images obtained by MV. The degree of nonuniformity can be quantified, depending on the level of detail desired. In this article, the development of the CCI concept is presented. For images with a wide range of hue values, the color blocks method quantifies well the nonhomogeneity of colors. For images with a narrow hue range, the CCI method is a better indicator of color nonhomogeneity.

  1. Psychometric Properties of a Standardized Observation Protocol to Quantify Pediatric Physical Therapy Actions.

    PubMed

    Sonderer, Patrizia; Akhbari Ziegler, Schirin; Gressbach Oertle, Barbara; Meichtry, André; Hadders-Algra, Mijna

    2017-07-01

    Pediatric physical therapy (PPT) is characterized by heterogeneity. This blurs the evaluation of effective components of PPT. The Groningen Observation Protocol (GOP) was developed to quantify the contents of PPT. This study assesses the reliability and completeness of the GOP. Sixty infant PPT sessions were video-taped. Two random samples of 10 videos were used to determine interrater and intrarater reliability using intraclass correlation coefficients (ICCs) with 95% confidence intervals. Completeness of GOP 2.0 was based on 60 videos. Interrater reliability of quantifying PPT actions was excellent (ICC, 0.75-1.0) in 71% and sufficient to good (ICC, 0.4-0.74) in 24% of PPT actions. Intrarater reliability was excellent in 94% and sufficient to good in 6% of PPT actions. Completeness was good for greater than 90% of PPT actions. GOP 2.0 has good reliability and completeness. After appropriate training, it is a useful tool to quantify PPT for children with developmental disorders.

  2. The likelihood of achieving quantified road safety targets: a binary logistic regression model for possible factors.

    PubMed

    Sze, N N; Wong, S C; Lee, C Y

    2014-12-01

    Over the past several decades, many countries have set quantified road safety targets to motivate transport authorities to develop systematic road safety strategies and measures and to facilitate continuous road safety improvement. Studies have been conducted to evaluate the association between the setting of quantified road safety targets and road fatality reduction, in both the short and long run, by comparing road fatalities before and after the implementation of a quantified road safety target. However, not much work has been done to evaluate whether quantified road safety targets are actually achieved. In this study, we used a binary logistic regression model to examine the factors - including vehicle ownership, fatality rate, and national income, in addition to level of ambition and duration of target - that contribute to a target's success. We analyzed 55 quantified road safety targets set by 29 countries from 1981 to 2009, and the results indicate that targets that were still in progress and those with lower levels of ambition had a higher likelihood of eventually being achieved. Moreover, possible interaction effects on the association between level of ambition and the likelihood of success are also revealed. Copyright © 2014 Elsevier Ltd. All rights reserved.
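
    A hedged sketch of the kind of binary logistic regression described above, fitted here to invented records rather than the study's 55 targets; the feature names and values are illustrative stand-ins for level of ambition, target duration, vehicle ownership and national income.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Invented target records: [ambition (% fatality reduction), duration (years),
#                           vehicles per 1000 population, GNI per capita (k$)]
X = np.array([
    [20, 10, 550, 42], [50, 5, 300, 9], [15, 10, 600, 45], [40, 8, 580, 40],
    [60, 5, 250, 7],   [50, 6, 400, 15], [20, 12, 620, 48], [45, 5, 350, 11],
    [10, 10, 500, 38], [35, 7, 450, 20],
])
y = np.array([1, 0, 1, 1, 0, 1, 0, 0, 1, 0])  # 1 = quantified target achieved

model = LogisticRegression(max_iter=5000).fit(X, y)
print(dict(zip(["ambition", "duration", "vehicles", "gni"], model.coef_[0].round(3))))
# A negative weight on ambition would mirror the finding that less ambitious
# targets are more likely to be achieved.
```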

  3. A simple method for quantifying jump loads in volleyball athletes.

    PubMed

    Charlton, Paula C; Kenneally-Dabrowski, Claire; Sheppard, Jeremy; Spratford, Wayne

    2017-03-01

    Evaluate the validity of a commercially available wearable device, the Vert, for measuring vertical displacement and jump count in volleyball athletes, and propose a potential method of quantifying external load during training and match play within this population. Validation study. The ability of the Vert device to measure vertical displacement in male, junior elite volleyball athletes was assessed against reference standard laboratory motion analysis. The ability of the Vert device to count jumps during training and match-play was assessed via comparison with retrospective video analysis to determine precision and recall. A method of quantifying external load, known as the load index (LdIx) algorithm, was proposed using the product of the jump count and average kinetic energy. Correlations between two separate Vert devices and three-dimensional trajectory data were good to excellent for all jump types performed (r=0.83-0.97), with a mean bias of 3.57-4.28 cm. When matched against jumps identified through video analysis, the Vert demonstrated excellent precision (0.995-1.000), evidenced by a low number of false positives. The number of false negatives identified with the Vert was higher, resulting in lower recall values (0.814-0.930). The Vert is a commercially available tool that has potential for measuring vertical displacement and jump count in elite junior volleyball athletes without the need for time-consuming analysis and bespoke software, allowing the collected data to better quantify load using the proposed algorithm (LdIx). Copyright © 2016 Sports Medicine Australia. Published by Elsevier Ltd. All rights reserved.
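
    The load index idea above (jump count multiplied by average kinetic energy) can be sketched as follows; recovering take-off kinetic energy from each jump's vertical displacement via v^2 = 2gh is our assumption about how the kinetic-energy term might be obtained, and the session data are invented.

```python
import random

def load_index(jump_heights_m, body_mass_kg, g=9.81):
    """Session load index (LdIx) = jump count x average kinetic energy.

    Take-off kinetic energy per jump is approximated as 0.5*m*v^2 with
    v^2 = 2*g*h, i.e. m*g*h (an assumption for this sketch)."""
    jump_count = len(jump_heights_m)
    kinetic_energies = [body_mass_kg * g * h for h in jump_heights_m]
    avg_ke = sum(kinetic_energies) / jump_count
    return jump_count * avg_ke

# Hypothetical training session: 40 jumps by an 85 kg athlete
random.seed(1)
heights = [random.uniform(0.35, 0.70) for _ in range(40)]
print(f"LdIx = {load_index(heights, 85.0):.0f} (jumps x J)")
```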

  4. Quantifying progression and regression of thrombotic risk in experimental atherosclerosis

    PubMed Central

    Palekar, Rohun U.; Jallouk, Andrew P.; Goette, Matthew J.; Chen, Junjie; Myerson, Jacob W.; Allen, John S.; Akk, Antonina; Yang, Lihua; Tu, Yizheng; Miller, Mark J.; Pham, Christine T. N.; Wickline, Samuel A.; Pan, Hua

    2015-01-01

    Currently, there are no generally applicable noninvasive methods for defining the relationship between atherosclerotic vascular damage and risk of focal thrombosis. Herein, we demonstrate methods to delineate the progression and regression of vascular damage in response to an atherogenic diet by quantifying the in vivo accumulation of semipermeable 200–300 nm perfluorocarbon core nanoparticles (PFC-NP) in ApoE null mouse plaques with [19F] magnetic resonance spectroscopy (MRS). Permeability to PFC-NP remained minimal until 12 weeks on diet, then increased rapidly following 12 weeks, but regressed to baseline within 8 weeks after diet normalization. Markedly accelerated clotting (53.3% decrease in clotting time) was observed in carotid artery preparations of fat-fed mice subjected to photochemical injury as defined by the time to flow cessation. For all mice on and off diet, an inverse linear relationship was observed between the permeability to PFC-NP and accelerated thrombosis (P = 0.02). Translational feasibility for quantifying plaque permeability and vascular damage in vivo was demonstrated with clinical 3 T MRI of PFC-NP accumulating in plaques of atherosclerotic rabbits. These observations suggest that excessive permeability to PFC-NP may indicate prothrombotic risk in damaged atherosclerotic vasculature, which resolves within weeks after dietary therapy.—Palekar, R. U., Jallouk, A. P., Goette, M. J., Chen, J., Myerson, J. W., Allen, J. S., Akk, A., Yang, L., Tu, Y., Miller, M. J., Pham, C. T. N., Wickline, S. A., Pan, H. Quantifying progression and regression of thrombotic risk in experimental atherosclerosis. PMID:25857553

  5. Using nitrate to quantify quick flow in a karst aquifer

    USGS Publications Warehouse

    Mahler, B.J.; Garner, B.D.

    2009-01-01

    In karst aquifers, contaminated recharge can degrade spring water quality, but quantifying the rapid recharge (quick flow) component of spring flow is challenging because of its temporal variability. Here, we investigate the use of nitrate in a two-endmember mixing model to quantify quick flow in Barton Springs, Austin, Texas. Historical nitrate data from recharging creeks and Barton Springs were evaluated to determine a representative nitrate concentration for the aquifer water endmember (1.5 mg/L) and the quick flow endmember (0.17 mg/L for nonstormflow conditions and 0.25 mg/L for stormflow conditions). Under nonstormflow conditions for 1990 to 2005, model results indicated that quick flow contributed from 0% to 55% of spring flow. The nitrate-based two-endmember model was applied to the response of Barton Springs to a storm and results compared to those produced using the same model with δ18O and specific conductance (SC) as tracers. Additionally, the mixing model was modified to allow endmember quick flow values to vary over time. Of the three tracers, nitrate appears to be the most advantageous because it is conservative and because the difference between the concentrations in the two endmembers is large relative to their variance. The δ18O-based model was very sensitive to variability within the quick flow endmember, and SC was not conservative over the timescale of the storm response. We conclude that a nitrate-based two-endmember mixing model might provide a useful approach for quantifying the temporally variable quick flow component of spring flow in some karst systems. © 2008 National Ground Water Association.
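
    The two-endmember mixing model above reduces to a one-line mass balance: with the spring sample as the mixture, f = (C_aquifer - C_spring) / (C_aquifer - C_quick). The sketch below uses the nonstormflow nitrate endmembers reported in the record; the spring concentration in the example is hypothetical.

```python
def quick_flow_fraction(c_spring, c_aquifer=1.5, c_quick=0.17):
    """Fraction of spring flow contributed by quick flow from a two-endmember
    nitrate mixing model:
        c_spring = f*c_quick + (1 - f)*c_aquifer
        =>  f = (c_aquifer - c_spring) / (c_aquifer - c_quick)
    Defaults are the nonstormflow endmembers (mg/L) quoted above; use
    c_quick=0.25 for stormflow conditions."""
    return (c_aquifer - c_spring) / (c_aquifer - c_quick)

# Hypothetical spring sample at 1.1 mg/L nitrate under nonstormflow conditions:
print(f"quick flow fraction = {quick_flow_fraction(1.1):.2f}")  # ~0.30
```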

  6. Quantifying reproducibility in computational biology: the case of the tuberculosis drugome.

    PubMed

    Garijo, Daniel; Kinnings, Sarah; Xie, Li; Xie, Lei; Zhang, Yinliang; Bourne, Philip E; Gil, Yolanda

    2013-01-01

    How easy is it to reproduce the results found in a typical computational biology paper? Either through experience or intuition the reader will already know that the answer is with difficulty or not at all. In this paper we attempt to quantify this difficulty by reproducing a previously published paper for different classes of users (ranging from users with little expertise to domain experts) and suggest ways in which the situation might be improved. Quantification is achieved by estimating the time required to reproduce each of the steps in the method described in the original paper and make them part of an explicit workflow that reproduces the original results. Reproducing the method took several months of effort, and required using new versions and new software that posed challenges to reconstructing and validating the results. The quantification leads to "reproducibility maps" that reveal that novice researchers would only be able to reproduce a few of the steps in the method, and that only expert researchers with advance knowledge of the domain would be able to reproduce the method in its entirety. The workflow itself is published as an online resource together with supporting software and data. The paper concludes with a brief discussion of the complexities of requiring reproducibility in terms of cost versus benefit, and a desiderata with our observations and guidelines for improving reproducibility. This has implications not only in reproducing the work of others from published papers, but reproducing work from one's own laboratory.

  7. Quantifying variable rainfall intensity events on runoff and sediment losses

    USDA-ARS?s Scientific Manuscript database

    Coastal Plain soils in Georgia are susceptible to runoff, sediment, and chemical losses from short duration-high intensity, runoff producing storms at critical times during the growing season. We quantified runoff and sediment losses from a Tifton loamy sand managed under conventional- (CT) and stri...

  8. Quantifying fear effects on prey demography in nature.

    PubMed

    Peers, Michael J L; Majchrzak, Yasmine N; Neilson, Eric; Lamb, Clayton T; Hämäläinen, Anni; Haines, Jessica A; Garland, Laura; Doran-Myers, Darcy; Broadley, Kate; Boonstra, Rudy; Boutin, Stan

    2018-06-13

    In recent years, it has been argued that the effect of predator fear exacts a greater demographic toll on prey populations than the direct killing of prey. However, efforts to quantify the effects of fear have primarily relied on experiments that replace predators with predator cues. Interpretation of these experiments must consider two important caveats: (1) the magnitude of experimenter-induced predator cues may not be realistically comparable to those of the prey's natural sensory environment, and (2) given functional predators are removed from the treatments, the fear effect is measured in the absence of any consumptive effects, a situation which never occurs in nature. We contend that demographic consequences of fear in natural populations may have been overestimated because the intensity of predator cues applied by experimenters in the majority of studies has been unnaturally high, in some instances rarely occurring in nature without consumption. Furthermore, the removal of consumption from the treatments creates the potential situation that individual prey in poor condition (those most likely to contribute strongly to the observed fear effects via starvation or reduced reproductive output) may have been consumed by predators in nature prior to the expression of fear effects, thus confounding consumptive and fear effects. Here, we describe an alternative treatment design that does not utilize predator cues, and in so doing, better quantifies the demographic effect of fear on wild populations. This treatment substitutes the traditional cue experiment where consumptive effects are eliminated and fear is simulated with a design where fear is removed and consumptive effects are simulated through the experimental removal of prey. Comparison to a natural population would give a more robust estimate of the effect of fear in the presence of consumption on the demographic variable of interest. This approach represents a critical advance in quantifying the

  9. Quantifying climate feedbacks in polar regions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goosse, Hugues; Kay, Jennifer E.; Armour, Kyle C.

    The concept of feedback is key in assessing whether a perturbation to a system is amplified or damped by mechanisms internal to the system. In polar regions, climate dynamics are controlled by both radiative and non-radiative interactions between the atmosphere, ocean, sea ice, ice sheets and land surfaces. Precisely quantifying polar feedbacks is required for a process-oriented evaluation of climate models, a clear understanding of the processes responsible for polar climate changes, and a reduction in uncertainty associated with model projections. This quantification can be performed using a simple and consistent approach that is valid for a wide range of feedbacks, thus offering the opportunity for more systematic feedback analyses and a better understanding of polar climate changes.

  10. Quantifying climate feedbacks in polar regions

    DOE PAGES

    Goosse, Hugues; Kay, Jennifer E.; Armour, Kyle C.; ...

    2018-05-15

    The concept of feedback is key in assessing whether a perturbation to a system is amplified or damped by mechanisms internal to the system. In polar regions, climate dynamics are controlled by both radiative and non-radiative interactions between the atmosphere, ocean, sea ice, ice sheets and land surfaces. Precisely quantifying polar feedbacks is required for a process-oriented evaluation of climate models, a clear understanding of the processes responsible for polar climate changes, and a reduction in uncertainty associated with model projections. This quantification can be performed using a simple and consistent approach that is valid for a wide range of feedbacks, thus offering the opportunity for more systematic feedback analyses and a better understanding of polar climate changes.

  11. Quantifying gait patterns in Parkinson's disease

    NASA Astrophysics Data System (ADS)

    Romero, Mónica; Atehortúa, Angélica; Romero, Eduardo

    2017-11-01

    Parkinson's disease (PD) is characterized by a set of motor symptoms, namely tremor, rigidity, and bradykinesia, which are usually described but not quantified. This work proposes an objective characterization of PD gait patterns by approximating the single stance phase as a single grounded pendulum. This model estimates the force generated by the gait during single support from gait data. This force describes the motion pattern for different stages of the disease. The model was validated using recorded videos of 8 young control subjects, 10 old control subjects and 10 subjects with Parkinson's disease in different stages. The estimated force showed differences among stages of Parkinson's disease, with a decrease in the estimated force observed for the advanced stages of the illness.

  12. [The influence of previous pregnancy terminations, miscarriages and still-births on the incidence of babies with low birth weight and premature births as well as a somatic classification of newborns].

    PubMed

    Voigt, M; Olbertz, D; Fusch, C; Krafczyk, D; Briese, V; Schneider, K T M

    2008-02-01

    The influence of previous terminations, miscarriages and IUFD on the IUGR and preterm rates, as well as on the somatic classification (gestational age and birth weight) of the newborn, is a subject of controversial discussion in the literature. The present paper attempts to quantify these risks in the medical history. 2 282 412 singleton pregnancies from the period 1995 to 2000 were evaluated from the German Perinatal Database. For the analysis, 1 065 202 pregnancies (46.7 %) of mothers without any live birth in the medical history were assessed. To exclude any influence from previous abortions, patients with previous miscarriages and IUFDs were excluded. The control collective comprised newborns whose mothers had suffered neither miscarriages nor abortions or IUFD. Previous terminations, miscarriages and IUFD influence the rate of newborns with low birth weight and increase the rate of prematurity. With an increasing number of isolated or combined risks in the medical history, the rate of newborns with low birth weight or prematurity increases. The lowest risk was found after one termination, the highest with two or more IUFDs. Terminations, miscarriages or IUFD are not risk factors for IUGR or SGA. Previous terminations, miscarriages and IUFD are relevant risk factors for prematurity and are associated with low birth weight of the newborn. Pregnant women with such risk factors have to be considered as high-risk pregnancies and need intensive surveillance.

  13. 18 CFR 154.302 - Previously submitted material.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 18 Conservation of Power and Water Resources 1 2010-04-01 2010-04-01 false Previously submitted material. 154.302 Section 154.302 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY... concurrently with the rate change filing. There must be furnished to the Director, Office of Energy Market...

  14. 18 CFR 366.6 - Previously authorized activities.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 18 Conservation of Power and Water Resources 1 2011-04-01 2011-04-01 false Previously authorized activities. 366.6 Section 366.6 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY... POWER ACT AND NATURAL GAS ACT BOOKS AND RECORDS Definitions and Provisions Under PUHCA 2005, the Federal...

  15. 18 CFR 366.6 - Previously authorized activities.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 18 Conservation of Power and Water Resources 1 2010-04-01 2010-04-01 false Previously authorized activities. 366.6 Section 366.6 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY... POWER ACT AND NATURAL GAS ACT BOOKS AND RECORDS Definitions and Provisions Under PUHCA 2005, the Federal...

  16. Quantifying low-frequency revertants in oral poliovirus vaccine using next generation sequencing.

    PubMed

    Sarcey, Eric; Serres, Aurélie; Tindy, Fabrice; Chareyre, Audrey; Ng, Siemon; Nicolas, Marine; Vetter, Emmanuelle; Bonnevay, Thierry; Abachin, Eric; Mallet, Laurent

    2017-08-01

    Spontaneous reversion to neurovirulence of live attenuated oral poliovirus vaccine (OPV) serotype 3 (chiefly involving the n.472U>C mutation) must be monitored during production to ensure vaccine safety and consistency. Mutant analysis by polymerase chain reaction and restriction enzyme cleavage (MAPREC) has long been endorsed by the World Health Organization as the preferred in vitro test for this purpose; however, it requires radiolabeling, which is no longer supported by many laboratories. We evaluated the performance and suitability of next generation sequencing (NGS) as an alternative to MAPREC. The linearity of NGS was demonstrated at revertant concentrations equivalent to the study range of 0.25%-1.5%. NGS repeatability and intermediate precision were comparable across all tested samples, and NGS was highly reproducible, irrespective of the sequencing platform or analysis software used. NGS was performed on OPV serotype 3 working seed lots and monovalent bulks (n=21) that were previously tested using MAPREC, and which covered the representative range of vaccine production. Percentages of 472-C revertants identified by NGS and MAPREC were comparable and highly correlated (r≥0.80), with a Pearson correlation coefficient of 0.95585 (p<0.0001). NGS demonstrated statistically equivalent performance to that of MAPREC for quantifying low-frequency OPV serotype 3 revertants, and offers a valid alternative to MAPREC. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.

  17. Quantifying and Improving International Space Station Survivability Following Orbital Debris Penetration

    NASA Technical Reports Server (NTRS)

    Williamsen, Joel; Evans, Hilary; Bohl, Bill; Evans, Steven; Parker, Nelson (Technical Monitor)

    2001-01-01

    The increase of the orbital debris environment in low-earth orbit has prompted NASA to develop analytical tools for quantifying and lowering the likelihood of crew loss following orbital debris penetration of the International Space Station (ISS). NASA uses the Manned Spacecraft and Crew Survivability (MSCSurv) computer program to simulate the events that may cause crew loss following orbital debris penetration of ISS manned modules, including: (1) critical cracking (explosive decompression) of the module; (2) critical external equipment penetration (such as hydrazine and high pressure tanks); (3) critical internal system penetration (guidance, control, and other vital components); (4) hazardous payload penetration (furnaces, pressure bottles, and toxic substances); (5) crew injury (from fragments, overpressure, light flash, and temperature rise); (6) hypoxia from loss of cabin pressure; and (7) thrust from module hole causing high angular velocity (occurring only when key Guidance, Navigation, and Control (GN&C) equipment is damaged) and, thus, preventing safe escape vehicle (EV) departure. MSCSurv is also capable of quantifying the 'end effects' of orbital debris penetration, such as the likelihood of crew escape, the probability of each module depressurizing, and late loss of station control. By quantifying these effects (and their associated uncertainties), NASA is able to improve the likelihood of crew survivability following orbital debris penetration due to improved crew operations and internal designs.

  18. Multiscale contact mechanics model for RF-MEMS switches with quantified uncertainties

    NASA Astrophysics Data System (ADS)

    Kim, Hojin; Huda Shaik, Nurul; Xu, Xin; Raman, Arvind; Strachan, Alejandro

    2013-12-01

    We introduce a multiscale model for contact mechanics between rough surfaces and apply it to characterize the force-displacement relationship for a metal-dielectric contact relevant for radio frequency micro-electromechanical system (MEMS) switches. We propose a mesoscale model to describe the history-dependent force-displacement relationships in terms of the surface roughness, the long-range attractive interaction between the two surfaces, and the repulsive interaction between contacting asperities (including elastic and plastic deformation). The inputs to this model are the experimentally determined surface topography and the Hamaker constant as well as the mechanical response of individual asperities obtained from density functional theory calculations and large-scale molecular dynamics simulations. The model captures non-trivial processes including the hysteresis during loading and unloading due to plastic deformation, yet it is computationally efficient enough to enable extensive uncertainty quantification and sensitivity analysis. We quantify how uncertainties and variability in the input parameters, both experimental and theoretical, affect the force-displacement curves during approach and retraction. In addition, a sensitivity analysis quantifies the relative importance of the various input quantities for the prediction of force-displacement during contact closing and opening. The resulting force-displacement curves with quantified uncertainties can be directly used in device-level simulations of micro-switches and enable the incorporation of atomic and mesoscale phenomena in predictive device-scale simulations.

  19. Quantifying Cancer Risk from Radiation.

    PubMed

    Keil, Alexander P; Richardson, David B

    2017-12-06

    Complex statistical models fitted to data from studies of atomic bomb survivors are used to estimate the human health effects of ionizing radiation exposures. We describe and illustrate an approach to estimate population risks from ionizing radiation exposure that relaxes many assumptions about radiation-related mortality. The approach draws on developments in methods for causal inference. The results offer a different way to quantify radiation's effects and show that conventional estimates of the population burden of excess cancer at high radiation doses are driven strongly by projecting outside the range of current data. Summary results obtained using the proposed approach are similar in magnitude to those obtained using conventional methods, although estimates of radiation-related excess cancers differ for many age, sex, and dose groups. At low doses relevant to typical exposures, the strength of evidence in data is surprisingly weak. Statements regarding human health effects at low doses rely strongly on the use of modeling assumptions. © 2017 Society for Risk Analysis.

  20. Quantifying entanglement with witness operators

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brandao, Fernando G.S.L.

    2005-08-15

    We present a unifying approach to the quantification of entanglement based on entanglement witnesses, which includes several already established entanglement measures such as the negativity, the concurrence, and the robustness of entanglement. We then introduce an infinite family of new entanglement quantifiers, having as its limits the best separable approximation measure and the generalized robustness. Gaussian states, states with symmetry, states constrained to super-selection rules, and states composed of indistinguishable particles are studied under the view of the witnessed entanglement. We derive new bounds to the fidelity of teleportation d_min, for the distillable entanglement E_D and for the entanglement of formation. A particular measure, the PPT-generalized robustness, stands out due to its easy calculability and provides sharper bounds to d_min and E_D than the negativity in most of the states. We illustrate our approach by studying thermodynamical properties of entanglement in the Heisenberg XXX and dimerized models.
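
    A minimal LaTeX sketch of the witness-based quantifier family discussed above, written from memory of the standard formulation: the restricted witness set (denoted \mathcal{M} here, an assumed notation) is the choice that selects a particular measure such as the negativity or the robustness.

        % Witnessed entanglement of a state \rho over a restricted set \mathcal{M} of witnesses
        \[
          E_{\mathcal{M}}(\rho) \;=\; \max\Bigl\{\, 0,\; -\min_{W \in \mathcal{M}} \operatorname{Tr}(W\rho) \Bigr\}
        \]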

  1. EpiProfile Quantifies Histone Peptides With Modifications by Extracting Retention Time and Intensity in High-resolution Mass Spectra*

    PubMed Central

    Yuan, Zuo-Fei; Lin, Shu; Molden, Rosalynn C.; Cao, Xing-Jun; Bhanu, Natarajan V.; Wang, Xiaoshi; Sidoli, Simone; Liu, Shichong; Garcia, Benjamin A.

    2015-01-01

    Histone post-translational modifications contribute to chromatin function through their chemical properties, which influence chromatin structure and their ability to recruit chromatin-interacting proteins. Nanoflow liquid chromatography coupled with high resolution tandem mass spectrometry (nanoLC-MS/MS) has emerged as the most suitable technology for global histone modification analysis because of its high sensitivity and high mass accuracy, which provide confident identification. However, analysis of histones with this method is especially challenging because of the large number and variety of isobaric histone peptides and the high dynamic range of histone peptide abundances. Here, we introduce EpiProfile, a software tool that discriminates isobaric histone peptides using the distinguishing fragment ions in their tandem mass spectra and extracts the chromatographic area under the curve using previous knowledge about peptide retention time. The accuracy of EpiProfile was evaluated by analysis of mixtures containing different ratios of synthetic histone peptides. In addition to label-free quantification of histone peptides, EpiProfile is flexible and can quantify different types of isotopically labeled histone peptides. EpiProfile is unique in generating layouts (i.e. relative retention time) of histone peptides when compared with manual quantification of the data and other programs (such as Skyline), filling the need for an automatic and freely available tool to quantify labeled and non-labeled modified histone peptides. In summary, EpiProfile is a valuable nanoLC-MS/MS-based quantification tool for histone peptides, which can also be adapted to analyze nonhistone protein samples. PMID:25805797

  2. Exoskeleton Heterogeneity in Crustaceans: Quantifying Compositional and Structural Variations Across Body Parts

    NASA Astrophysics Data System (ADS)

    Ulrich, R. N.; Mergelsberg, S. T.; Dove, P. M.

    2016-12-01

    Crustacean exoskeletons are a complex biocomposite of organic macromolecules and calcium carbonate minerals. The highly divergent functions and diverse morphologies of these biominerals across taxa raise the question of whether these differences are systematically reflected in exoskeleton composition and structure. Previous studies that investigated element concentrations in exoskeletons used spectroscopic methods. However, the findings were largely inconclusive because of analytical limitations and most studies concluded that magnesium, phosphorus, and other trace elements are mostly contained in the mineral fraction because concentrations in the organic framework could not be resolved. This experimental study was designed to quantify the distributions of Ca, P, Mg, and Sr in the mineral versus organic fractions of exoskeletons from the American Lobster (H. americanus), Dungeness Crab (M. magister), and Red Rock Crab (M. productus). Samples of exoskeleton from 10 body parts were collected in triplicate and dissolved using three procedures specific to extracting the 1) mineral, 2) protein, and 3) chitin phases separately. Chemical analyses of the resulting effluents using ICP-OES show the mineral fraction of the skeleton can contain significant amounts of mineralized Mg and P particularly for body parts associated with a significant difference in mineral structural ordering. The protein fraction contains more Mg and P than expected based on estimates from previous studies (Hild et al., 2008). While the element distributions vary greatly depending on the location, in body parts with thicker cuticle (e.g. claw) the mineral component appears to control overall composition. The findings have implications for paleoenvironmental reconstructions based upon exoskeleton composition. First, the chemical composition of an exoskeleton cannot be assumed constant across the different body parts of an entire organism. This is particularly true when the exoskeleton of the claw is

  3. Information on Quantifiers and Argument Structure in English Learner's Dictionaries.

    ERIC Educational Resources Information Center

    Lee, Thomas Hun-tak

    1993-01-01

    Lexicographers have been arguing for the inclusion of abstract and complex grammatical information in dictionaries. This paper examines the extent to which information about quantifiers and the argument structure of verbs is encoded in English learner's dictionaries. The Oxford Advanced Learner's Dictionary (1989), the Longman Dictionary of…

  4. Previous prelabor or intrapartum cesarean delivery and risk of placenta previa.

    PubMed

    Downes, Katheryne L; Hinkle, Stefanie N; Sjaarda, Lindsey A; Albert, Paul S; Grantz, Katherine L

    2015-05-01

    The purpose of this study was to examine the association between previous cesarean delivery and subsequent placenta previa while distinguishing cesarean delivery before the onset of labor from intrapartum cesarean delivery. We conducted a retrospective cohort study of electronic medical records from 20 Utah hospitals (2002-2010) with restriction to the first 2 singleton deliveries of nulliparous women at study entry (n=26,987). First pregnancy delivery mode was classified as (1) vaginal (reference), (2) cesarean delivery before labor onset (prelabor), or (3) cesarean delivery after labor onset (intrapartum). Risk of second delivery previa was estimated by previous delivery mode with the use of logistic regression and was adjusted for maternal age, insurance, smoking, comorbidities, previous pregnancy loss, and history of previa. Most first deliveries were vaginal (82%; n=22,142), followed by intrapartum cesarean delivery (14.6%; n=3931), or prelabor cesarean delivery (3.4%; n=914). Incidence of second delivery previa was 0.29% (n=78) and differed by previous delivery mode: vaginal, 0.24%; prelabor cesarean delivery, 0.98%; intrapartum cesarean delivery, 0.38% (P<.001). Relative to vaginal delivery, previous prelabor cesarean delivery was associated with an increased risk of second delivery previa (adjusted odds ratio, 2.62; 95% confidence interval, 1.24-5.56). There was no significant association between previous intrapartum cesarean delivery and previa (adjusted odds ratio, 1.22; 95% confidence interval, 0.68-2.19). Previous prelabor cesarean delivery was associated with a >2-fold significantly increased risk of previa in the second delivery, although the approximately 20% increased risk of previa that was associated with previous intrapartum cesarean delivery was not significant. Although rare, the increased risk of placenta previa after previous prelabor cesarean delivery may be important when considering nonmedically indicated prelabor cesarean delivery

  5. PATIENT-CENTERED DECISION MAKING: LESSONS FROM MULTI-CRITERIA DECISION ANALYSIS FOR QUANTIFYING PATIENT PREFERENCES.

    PubMed

    Marsh, Kevin; Caro, J Jaime; Zaiser, Erica; Heywood, James; Hamed, Alaa

    2018-01-01

    Patient preferences should be a central consideration in healthcare decision making. However, stories of patients challenging regulatory and reimbursement decisions have led to questions about whether patient voices are being considered sufficiently during those decision making processes. This has led some to argue that it is necessary to quantify patient preferences before they can be adequately considered. This study considers the lessons from the use of multi-criteria decision analysis (MCDA) for efforts to quantify patient preferences. It defines MCDA and summarizes the benefits it can provide to decision makers, identifies examples of MCDAs that have involved patients, and summarizes good practice guidelines as they relate to quantifying patient preferences. The guidance developed to support the use of MCDA in healthcare provides some useful considerations for the quantification of patient preferences, namely that researchers should give appropriate consideration to: the heterogeneity of patient preferences and its relevance to decision makers; the cognitive challenges posed by different elicitation methods; and the validity of the results they produce. Furthermore, it is important to consider how the relevance of these considerations varies with the decision being supported. The MCDA literature holds important lessons for how patient preferences should be quantified to support healthcare decision making.

  6. Quantifying uncertainty in health impact assessment: a case-study example on indoor housing ventilation.

    PubMed

    Mesa-Frias, Marco; Chalabi, Zaid; Foss, Anna M

    2014-01-01

    Quantitative health impact assessment (HIA) is increasingly being used to assess the health impacts attributable to an environmental policy or intervention. As a consequence, there is a need to assess uncertainties in the assessments because of the uncertainty in the HIA models. In this paper, a framework is developed to quantify the uncertainty in the health impacts of environmental interventions and is applied to evaluate the impacts of poor housing ventilation. The paper describes the development of the framework through three steps: (i) selecting the relevant exposure metric and quantifying the evidence of potential health effects of the exposure; (ii) estimating the size of the population affected by the exposure and selecting the associated outcome measure; (iii) quantifying the health impact and its uncertainty. The framework introduces a novel application for the propagation of uncertainty in HIA, based on fuzzy set theory. Fuzzy sets are used to propagate parametric uncertainty in a non-probabilistic space and are applied to calculate the uncertainty in the morbidity burdens associated with three indoor ventilation exposure scenarios: poor, fair and adequate. The case-study example demonstrates how the framework can be used in practice, to quantify the uncertainty in health impact assessment where there is insufficient information to carry out a probabilistic uncertainty analysis. © 2013.
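
    A minimal sketch of the alpha-cut style of fuzzy propagation described above, assuming triangular fuzzy numbers for exposure prevalence and relative risk and Levin's attributable-fraction formula; all input values and names are illustrative, not the study's.

        import itertools

        def alpha_cut(tri, alpha):
            # Interval [lo, hi] of a triangular fuzzy number tri = (a, b, c) at level alpha.
            a, b, c = tri
            return a + alpha * (b - a), c - alpha * (c - b)

        def burden_interval(p_cut, rr_cut, baseline_cases):
            # Min/max attributable burden over the corners of the input intervals.
            vals = []
            for p, rr in itertools.product(p_cut, rr_cut):
                af = p * (rr - 1.0) / (p * (rr - 1.0) + 1.0)   # attributable fraction
                vals.append(af * baseline_cases)
            return min(vals), max(vals)

        p_exposed = (0.10, 0.20, 0.35)      # fuzzy prevalence of poor ventilation
        rel_risk  = (1.1, 1.5, 2.2)         # fuzzy relative risk of the outcome
        for alpha in (0.0, 0.5, 1.0):       # membership levels from support to core
            lo, hi = burden_interval(alpha_cut(p_exposed, alpha),
                                     alpha_cut(rel_risk, alpha), baseline_cases=1000)
            print(f"alpha={alpha:.1f}: burden in [{lo:.0f}, {hi:.0f}] cases")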

  7. An unsupervised method for quantifying the behavior of paired animals

    NASA Astrophysics Data System (ADS)

    Klibaite, Ugne; Berman, Gordon J.; Cande, Jessica; Stern, David L.; Shaevitz, Joshua W.

    2017-02-01

    Behaviors involving the interaction of multiple individuals are complex and frequently crucial for an animal’s survival. These interactions, ranging across sensory modalities, length scales, and time scales, are often subtle and difficult to characterize. Contextual effects on the frequency of behaviors become even more difficult to quantify when physical interaction between animals interferes with conventional data analysis, e.g. due to visual occlusion. We introduce a method for quantifying behavior in fruit fly interaction that combines high-throughput video acquisition and tracking of individuals with recent unsupervised methods for capturing an animal’s entire behavioral repertoire. We find behavioral differences between solitary flies and those paired with an individual of the opposite sex, identifying specific behaviors that are affected by social and spatial context. Our pipeline allows for a comprehensive description of the interaction between two individuals using unsupervised machine learning methods, and will be used to answer questions about the depth of complexity and variance in fruit fly courtship.

  8. Developing Molecular Methods to Identify and Quantify Ballast Water Organisms: A Test Case with Cnidarians

    DTIC Science & Technology

    2004-04-15

    Developing Molecular Methods to Identify and Quantify Ballast Water Organisms: A Test Case with Cnidarians (SERDP Project CP-1251, 2004). Only report front matter and table-of-contents fragments are preserved in this record (indicators of ballast water exchange; materials and methods; Phase I specimens; DNA); no abstract text is available.

  9. A 5-mC Dot Blot Assay Quantifying the DNA Methylation Level of Chondrocyte Dedifferentiation In Vitro.

    PubMed

    Jia, Zhaofeng; Liang, Yujie; Ma, Bin; Xu, Xiao; Xiong, Jianyi; Duan, Li; Wang, Daping

    2017-05-17

    The dedifferentiation of hyaline chondrocytes into fibroblastic chondrocytes often accompanies monolayer expansion of chondrocytes in vitro. The global DNA methylation level of chondrocytes is considered to be a suitable biomarker for the loss of the chondrocyte phenotype. However, results based on different experimental methods can be inconsistent. Therefore, it is important to establish a precise, simple, and rapid method to quantify global DNA methylation levels during chondrocyte dedifferentiation. Current genome-wide methylation analysis techniques largely rely on bisulfite genomic sequencing. Due to DNA degradation during bisulfite conversion, these methods typically require a large sample volume. Other methods used to quantify global DNA methylation levels include high-performance liquid chromatography (HPLC). However, HPLC requires complete digestion of genomic DNA. Additionally, the prohibitively high cost of HPLC instruments limits HPLC's wider application. In this study, genomic DNA (gDNA) was extracted from human chondrocytes cultured for varying numbers of passages. The gDNA methylation level was detected using a methylation-specific dot blot assay. In this dot blot approach, a gDNA mixture containing the methylated DNA to be detected was spotted directly onto an N+ membrane as a dot inside a previously drawn circular template pattern. Compared with other gel electrophoresis-based blotting approaches and other complex blotting procedures, the dot blot method saves significant time. In addition, dot blots can detect the overall DNA methylation level using a commercially available 5-mC antibody. We found that the DNA methylation level differed between the monolayer subcultures, and therefore could play a key role in chondrocyte dedifferentiation. The 5-mC dot blot is a reliable, simple, and rapid method to detect the general DNA methylation level to evaluate chondrocyte phenotype.

  10. Robot-assisted laparoscopic prostatectomy and previous surgical history: a multidisciplinary approach.

    PubMed

    Bernstein, Adrien N; Lavery, Hugh J; Hobbs, Adele R; Chin, Edward; Samadi, David B

    2013-06-01

    Previous abdominal or prostate surgery can be a significant barrier to subsequent minimally invasive procedures, including radical prostatectomy (RP). This is relevant to a quarter of prostatectomy patients who have had previous surgery. The technological advances of robot-assisted laparoscopic RP (RALP) can mitigate some of these challenges. To that end, our objective was to elucidate the effect of previous surgery on RALP, and to describe a multidisciplinary approach to the previously entered abdomen. One-thousand four-hundred and fourteen RALP patients were identified from a single-surgeon database. Potentially difficult cases were discussed preoperatively and treated in a multidisciplinary fashion with a general surgeon. Operative, pathological, and functional outcomes were analyzed after stratification by previous surgical history. Four-hundred and twenty (30 %) patients underwent previous surgery at least once. Perioperative outcomes were similar among most groups. Previous major abdominal surgery was associated with increased operative time (147 vs. 119 min, p < 0.001), as was the presence of adhesions (120 vs. 154 min, p < 0.001). Incidence of complications was comparable, irrespective of surgical history. Major complications included two enterotomies diagnosed intraoperatively and one patient requiring reoperation. All cases were performed robotically, without conversion to open-RP. There was no difference in biochemical disease-free survival among surgical groups and continence and potency were equivalent between groups. In conclusion, previous abdominal surgery did not affect the safety or feasibility of RALP, with all patients experiencing comparable perioperative, functional, and oncologic outcomes.

  11. Quantifying facial paralysis using the Kinect v2.

    PubMed

    Gaber, Amira; Taher, Mona F; Wahed, Manal Abdel

    2015-01-01

    Assessment of facial paralysis (FP) and quantitative grading of facial asymmetry are essential in order to quantify the extent of the condition as well as to follow its improvement or progression. As such, there is a need for an accurate quantitative grading system that is easy to use, inexpensive and has minimal inter-observer variability. A comprehensive automated system to quantify and grade FP is the main objective of this work. An initial prototype has been presented by the authors. The present research aims to enhance the accuracy and robustness of one of this system's modules: the resting symmetry module. This is achieved by including several modifications to the computation method of the symmetry index (SI) for the eyebrows, eyes and mouth. These modifications are the gamma correction technique, the area of the eyes, and the slope of the mouth. The system was tested on normal subjects and showed promising results. With the modified method, the mean SI of the eyebrows decreased slightly from 98.42% to 98.04%, while the mean SI for the eyes and mouth increased from 96.93% to 99.63% and from 95.6% to 98.11%, respectively. The system is easy to use, inexpensive, automated and fast, has no inter-observer variability and is thus well suited for clinical use.
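
    A minimal sketch of a percentage symmetry index computed from paired left/right facial measurements (for example eye areas extracted from Kinect landmarks); the exact formula and the modifications used by the authors are not given in the abstract, so this form is an assumption.

        def symmetry_index(left, right):
            # Return a symmetry index in percent: 100% means perfectly symmetric.
            if left == right == 0:
                return 100.0
            return (1.0 - abs(left - right) / (abs(left) + abs(right))) * 100.0

        print(symmetry_index(410.0, 395.0))   # e.g., left/right eye areas in pixels^2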

  12. Evaluation of statistical methods for quantifying fractal scaling in water-quality time series with irregular sampling

    NASA Astrophysics Data System (ADS)

    Zhang, Qian; Harman, Ciaran J.; Kirchner, James W.

    2018-02-01

    River water-quality time series often exhibit fractal scaling, which here refers to autocorrelation that decays as a power law over some range of scales. Fractal scaling presents challenges to the identification of deterministic trends because (1) fractal scaling has the potential to lead to false inference about the statistical significance of trends and (2) the abundance of irregularly spaced data in water-quality monitoring networks complicates efforts to quantify fractal scaling. Traditional methods for estimating fractal scaling - in the form of spectral slope (β) or other equivalent scaling parameters (e.g., Hurst exponent) - are generally inapplicable to irregularly sampled data. Here we consider two types of estimation approaches for irregularly sampled data and evaluate their performance using synthetic time series. These time series were generated such that (1) they exhibit a wide range of prescribed fractal scaling behaviors, ranging from white noise (β = 0) to Brown noise (β = 2) and (2) their sampling gap intervals mimic the sampling irregularity (as quantified by both the skewness and mean of gap-interval lengths) in real water-quality data. The results suggest that none of the existing methods fully account for the effects of sampling irregularity on β estimation. First, the results illustrate the danger of using interpolation for gap filling when examining autocorrelation, as the interpolation methods consistently underestimate or overestimate β under a wide range of prescribed β values and gap distributions. Second, the widely used Lomb-Scargle spectral method also consistently underestimates β. A previously published modified form, using only the lowest 5 % of the frequencies for spectral slope estimation, has very poor precision, although the overall bias is small. Third, a recent wavelet-based method, coupled with an aliasing filter, generally has the smallest bias and root-mean-squared error among all methods for a wide range of
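
    A minimal sketch of one of the evaluated approaches, assuming SciPy's Lomb-Scargle periodogram on an irregularly sampled Brownian-like series and a log-log straight-line fit for the spectral slope β; the frequency range and synthetic series are illustrative choices (the study reports that this estimator tends to be biased).

        import numpy as np
        from scipy.signal import lombscargle

        rng = np.random.default_rng(0)
        t = np.sort(rng.uniform(0.0, 1000.0, 400))             # irregular sampling times
        dt = np.diff(t, prepend=0.0)
        y = np.cumsum(rng.normal(size=t.size) * np.sqrt(dt))   # Brownian-like series, beta ~ 2
        y -= y.mean()

        freqs = np.logspace(-2.5, -0.5, 200)                   # cycles per time unit
        pgram = lombscargle(t, y, 2 * np.pi * freqs)           # expects angular frequencies

        # power ~ f^(-beta), so beta is minus the slope of the log-log periodogram
        slope, _ = np.polyfit(np.log10(freqs), np.log10(pgram), 1)
        print(f"estimated beta = {-slope:.2f}")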

  13. Quantifying phase synchronization using instances of Hilbert phase slips

    NASA Astrophysics Data System (ADS)

    Govindan, R. B.

    2018-07-01

    We propose to quantify phase synchronization between two signals, x(t) and y(t), by calculating the variance in the Hilbert phase of y(t) at instances of phase slips exhibited by x(t). The proposed approach is tested on numerically simulated coupled chaotic Roessler systems and second order autoregressive processes. Furthermore, we compare the performance of the proposed and original approaches using uterine electromyogram signals and show that both approaches yield consistent results. A standard phase synchronization approach, which involves unwrapping the Hilbert phases (ϕ1(t) and ϕ2(t)) of the two signals and analyzing the variance of |n·ϕ1(t) - m·ϕ2(t)| mod 2π (n and m are integers), was used for comparison. The synchronization indexes obtained from the proposed approach and the standard approach agree reasonably well in all of the systems studied in this work. Our results indicate that the proposed approach, unlike the traditional approach, does not require the non-invertible transformations (unwrapping of the phases and reduction mod 2π) and can be used to reliably quantify phase synchrony between two signals.
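
    A minimal sketch of the standard Hilbert-phase index referred to above (mean phase coherence of the wrapped phase difference with n = m = 1); the slip-detection step that defines the proposed variant is omitted, and the test signals are illustrative.

        import numpy as np
        from scipy.signal import hilbert

        def phase_sync_index(x, y):
            # Mean phase coherence: 1 for perfect locking, ~0 for independent phases.
            phi_x = np.angle(hilbert(x))
            phi_y = np.angle(hilbert(y))
            dphi = np.mod(phi_x - phi_y, 2 * np.pi)
            return np.abs(np.mean(np.exp(1j * dphi)))

        fs, f0 = 500.0, 5.0
        t = np.arange(0, 20, 1 / fs)
        rng = np.random.default_rng(1)
        x = np.sin(2 * np.pi * f0 * t) + 0.3 * rng.normal(size=t.size)
        y = np.sin(2 * np.pi * f0 * t + 0.8) + 0.3 * rng.normal(size=t.size)
        print(phase_sync_index(x, y))    # close to 1 for these phase-locked signals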

  14. Quantifying and Reducing Curve-Fitting Uncertainty in Isc: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Campanelli, Mark; Duck, Benjamin; Emery, Keith

    Current-voltage (I-V) curve measurements of photovoltaic (PV) devices are used to determine performance parameters and to establish traceable calibration chains. Measurement standards specify localized curve fitting methods, e.g., straight-line interpolation/extrapolation of the I-V curve points near short-circuit current, Isc. By considering such fits as statistical linear regressions, uncertainties in the performance parameters are readily quantified. However, the legitimacy of such a computed uncertainty requires that the model be a valid (local) representation of the I-V curve and that the noise be sufficiently well characterized. Using more data points often has the advantage of lowering the uncertainty. However, more data points can make the uncertainty in the fit arbitrarily small, and this fit uncertainty misses the dominant residual uncertainty due to so-called model discrepancy. Using objective Bayesian linear regression for straight-line fits for Isc, we investigate an evidence-based method to automatically choose data windows of I-V points with reduced model discrepancy. We also investigate noise effects. Uncertainties, aligned with the Guide to the Expression of Uncertainty in Measurement (GUM), are quantified throughout.
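
    A minimal sketch of the localized straight-line fit near short-circuit with an ordinary-least-squares covariance for the Isc estimate; the I-V points and the fixed data window are illustrative assumptions (choosing that window is what the evidence-based procedure above automates).

        import numpy as np

        v = np.array([-0.02, 0.00, 0.02, 0.04, 0.06, 0.08])   # volts near V = 0
        i = np.array([5.21, 5.20, 5.19, 5.18, 5.16, 5.15])    # amps (illustrative)

        coeffs, cov = np.polyfit(v, i, 1, cov=True)            # [slope, intercept] and covariance
        isc, isc_var = coeffs[1], cov[1, 1]                    # intercept is the current at V = 0
        print(f"Isc = {isc:.3f} A +/- {np.sqrt(isc_var):.3f} A (1-sigma fit uncertainty)")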

  15. Quantifying the effect of diagenetic recrystallization on the Mg isotopic composition of marine carbonates

    NASA Astrophysics Data System (ADS)

    Chanda, Piyali; Fantle, Matthew S.

    2017-05-01

    The Mg and Sr isotopic compositions (δ26Mg and 87Sr/86Sr) of pore fluids and bulk carbonates from Ocean Drilling Project Site 1171 (South Tasman Rise; 2148.2 m water depth) are reported, in order to evaluate the potential of diagenesis to alter carbonate-based geochemical proxies in an open marine system. Given the trace amounts of Mg in marine carbonates relative to coexisting pore fluids, diagenesis can alter carbonate δ26Mg, a promising proxy for seawater δ26Mg that may help elucidate long-term changes in the global Mg cycle. Constraints on the effect of diagenetic recrystallization on carbonate δ26Mg are therefore critical for accurate proxy interpretations. This study provides context for assessing the fidelity of geochemical proxy-reconstructions using the primary components (i.e., foraminiferal tests and nannofossils) of bulk carbonate sediments. We find that pore fluid δ26Mg values (on the DSM3 scale) at Site 1171 increase systematically with depth (from -0.72‰ to -0.39‰ in the upper ∼260 m), while the δ26Mg of bulk carbonates decrease systematically with depth (from -2.23‰ to -5.00‰ in the upper ∼260 m). This variability is ascribed primarily to carbonate recrystallization, with a small proportion of the variability due to down-hole changes in nannofossil and foraminiferal species composition. The inferred effect of diagenesis on bulk carbonate δ26Mg correlates with down-core changes in Mg/Ca, Sr/Ca, Na/Ca, and 87Sr/86Sr. A depositional reactive-transport model is employed to validate the hypothesis that calcite recrystallization in this system can generate sizeable shifts in carbonate δ26Mg. Model fits to the data suggest a fractionation factor and a partition coefficient that are consistent with previous work, assuming calcite recrystallization rates of ⩽7%/Ma constrained by Sr geochemistry. In addition, either partial dissolution or a distinctly different previous diagenetic regime must be invoked in order to explain aspects of the

  16. Structure for identifying, locating and quantifying physical phenomena

    DOEpatents

    Richardson, John G.

    2006-10-24

    A method and system for detecting, locating and quantifying a physical phenomenon such as strain or deformation in a structure. A minimum resolvable distance along the structure is selected and a quantity of laterally adjacent conductors is determined. Each conductor includes a plurality of segments coupled in series which define the minimum resolvable distance along the structure. When a deformation occurs, changes in the defined energy transmission characteristics along each conductor are compared to determine which segment contains the deformation.

  17. Accelerating Calculations of Reaction Dissipative Particle Dynamics in LAMMPS

    DTIC Science & Technology

    2017-05-17

    order reaction mechanism, the best acceleration was 6.1 times. For a larger, more chemically detailed mechanism, the best acceleration exceeded 60 times...simulations at previously inaccessible scales. A principal feature of DPD-RX is its ability to model chemical reactions within each CG particle. The...change in composition due to chemical reactions is described by a system of ordinary differential equations (ODEs) that are evaluated at each DPD time

  18. Generalization of Figure-Ground Segmentation from Binocular to Monocular Vision in an Embodied Biological Brain Model

    DTIC Science & Technology

    2011-08-01

    development of this skill holds great promise for efforts, like Emer, that aim to create an Artificial General Intelligence (AGI). For example, it promises to unlock vast sets of training data, such as Google Images, which have previously been inaccessible to...

  19. Quantifying loopy network architectures.

    PubMed

    Katifori, Eleni; Magnasco, Marcelo O

    2012-01-01

    Biology presents many examples of planar distribution and structural networks having dense sets of closed loops. An archetype of this form of network organization is the vasculature of dicotyledonous leaves, which showcases a hierarchically-nested architecture containing closed loops at many different levels. Although a number of approaches have been proposed to measure aspects of the structure of such networks, a robust metric to quantify their hierarchical organization is still lacking. We present an algorithmic framework, the hierarchical loop decomposition, that allows mapping loopy networks to binary trees, preserving in the connectivity of the trees the architecture of the original graph. We apply this framework to investigate computer generated graphs, such as artificial models and optimal distribution networks, as well as natural graphs extracted from digitized images of dicotyledonous leaves and vasculature of rat cerebral neocortex. We calculate various metrics based on the asymmetry, the cumulative size distribution and the Strahler bifurcation ratios of the corresponding trees and discuss the relationship of these quantities to the architectural organization of the original graphs. This algorithmic framework decouples the geometric information (exact location of edges and nodes) from the metric topology (connectivity and edge weight) and it ultimately allows us to perform a quantitative statistical comparison between predictions of theoretical models and naturally occurring loopy graphs.

  20. Inertial sensors to quantify the pivot shift test in the treatment of anterior cruciate ligament injury

    PubMed Central

    ZAFFAGNINI, STEFANO; LOPOMO, NICOLA; SIGNORELLI, CECILIA; MUCCIOLI, GIULIO MARIA MARCHEGGIANI; BONANZINGA, TOMMASO; GRASSI, ALBERTO; RAGGI, FEDERICO; VISANI, ANDREA; MARCACCI, MAURILIO

    2014-01-01

    The main purpose of this article was to describe in detail, from the perspective of the clinical end user, a previously presented non-invasive methodology, applied in the treatment of anterior cruciate ligament injury, in which inertial sensors are used to quantify the pivot shift test. The outcomes obtained and relative considerations were compared with findings emerging from a review of the relevant updated literature. The detailed description here provided covers the system, the parameters identified and the testing procedure; it also includes the technical specifications of the hardware, the features introduced in the updated version of the software and the application of the system in clinical practice. The comparison of the technical considerations and clinical results with the updated literature confirmed the system’s optimal ergonomics, good reproducibility and clinical reliability. The novel approach here analyzed has been shown to overcome the weaknesses of other available devices and systems. Therefore, since it can be considered a new paradigm in the quantification of pivot shift test, we can recommend its routine use in clinical practice. PMID:25606555

  1. A method to quantify infection and colonization of holm oak (Quercus ilex) roots by Phytophthora cinnamomi

    PubMed Central

    2012-01-01

    Phytophthora cinnamomi Rands. is an important root rot pathogen widely distributed in the northern hemisphere, with a large host range. Among other diseases, it is known to be a principal factor in the decline of holm oak and cork oak, the most important tree species in the “dehesa” ecosystem of south-western Spain. Previously, the focus of studies on P. cinnamomi and holm oak has been on molecular tools for identification and functional responses of the host, together with other physiological and morphological host variables. However, a microscopic index to describe the degree of infection and colonization in the plant tissues has not yet been developed. A colonization or infection index would be a useful tool for studies that examine differences between individuals subjected to different treatments or between individuals belonging to different breeding accessions, together with their specific responses to the pathogen. This work presents a methodology based on the capture and digital treatment of microscopic images, using simple and accessible software, together with a range of variables that quantify the infection and colonization process. PMID:22974221

  2. 49 CFR 175.75 - Quantity limitations and cargo location.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... hazardous material may be loaded in an inaccessible manner. Loaded in an inaccessible manner means cargo that is loaded in such a manner that a crew member or other authorized person cannot handle, and when... loaded in an inaccessible manner. These requirements do not apply to Class 9 and ORM-D materials. (d...

  3. Quantifying the lag time to detect barriers in landscape genetics

    Treesearch

    E. L. Landguth; S. A Cushman; M. K. Schwartz; K. S. McKelvey; M. Murphy; G. Luikart

    2010-01-01

    Understanding how spatial genetic patterns respond to landscape change is crucial for advancing the emerging field of landscape genetics. We quantified the number of generations for new landscape barrier signatures to become detectable and for old signatures to disappear after barrier removal. We used spatially explicit, individual-based simulations to examine the...

  4. Tackling Information Asymmetry in Networks: A New Entropy-Based Ranking Index

    NASA Astrophysics Data System (ADS)

    Barucca, Paolo; Caldarelli, Guido; Squartini, Tiziano

    2018-06-01

    Information is a valuable asset in socio-economic systems, a significant part of which is entailed in the network of connections between agents. The different interlinkage patterns that agents establish may, in fact, lead to asymmetries in the knowledge of the network structure; since this entails a different ability to quantify relevant, systemic properties (e.g. the risk of contagion in a network of liabilities), agents capable of providing a better estimation of (otherwise) inaccessible network properties ultimately have a competitive advantage. In this paper, we address the issue of quantifying the information asymmetry of nodes: to this aim, we define a novel index, InfoRank, intended to rank nodes according to their information content. In order to do so, each node's ego-network is enforced as a constraint of an entropy-maximization problem and the subsequent uncertainty reduction is used to quantify the node-specific accessible information. We then test the performance of our ranking procedure in terms of reconstruction accuracy and show that it outperforms other centrality measures in identifying the "most informative" nodes. Finally, we discuss the socio-economic implications of network information asymmetry.

  5. Quantifying Channel Maintenance Instream Flows: An Approach for Gravel-Bed Streams in the Western United States

    Treesearch

    Larry J. Schmidt; John P. Potyondy

    2004-01-01

    This paper discusses one approach for quantifying channel maintenance instream flow necessary to achieve the Forest Service Organic Act purpose of securing favorable conditions of water flows. The approach is appropriate for quantifying channel maintenance flows on perennial, unregulated, snowmelt-dominated, gravel-bed streams with alluvial reaches. The approach...

  6. Producing and quantifying enriched para-H2.

    PubMed

    Tom, Brian A; Bhasker, Siddhartha; Miyamoto, Yuki; Momose, Takamasa; McCall, Benjamin J

    2009-01-01

    The production of enriched para-H(2) is useful for many scientific applications, but the technology for producing and measuring para-H(2) is not yet widespread. In this note and in the accompanying auxiliary material, we describe the design, construction, and use of a versatile standalone converter that is capable of producing para-H(2) enrichments of up to ≥99.99% at continuous flow rates of up to 0.4 SLM. We also discuss para-H(2) storage and back conversion rates, and improvements to three techniques (thermal conductance, NMR, and solid hydrogen impurity spectroscopy) used to quantify the para-H(2) enrichment.

  7. Quantifying fast optical signal and event-related potential relationships during a visual oddball task.

    PubMed

    Proulx, Nicole; Samadani, Ali-Akbar; Chau, Tom

    2018-05-16

    Event-related potentials (ERPs) have previously been used to confirm the existence of the fast optical signal (FOS) but validation methods have mainly been limited to exploring the temporal correspondence of FOS peaks to those of ERPs. The purpose of this study was to systematically quantify the relationship between FOS and ERP responses to a visual oddball task in both time and frequency domains. Near-infrared spectroscopy (NIRS) and electroencephalography (EEG) sensors were co-located over the prefrontal cortex while participants performed a visual oddball task. Fifteen participants completed 2 data collection sessions each, where they were instructed to keep a mental count of oddball images. The oddball condition produced a positive ERP at 200 ms followed by a negativity 300-500 ms after image onset in the frontal electrodes. In contrast to previous FOS studies, a FOS response was identified only in DC intensity signals and not in phase delay signals. A decrease in DC intensity was found 150-250 ms after oddball image onset with a 400-trial average in 10 of 15 participants. The latency of the positive 200 ms ERP and the FOS DC intensity decrease were significantly correlated for only 6 (out of 15) participants due to the low signal-to-noise ratio of the FOS response. Coherence values between the FOS and ERP oddball responses were found to be significant in the 3-5 Hz frequency band for 10 participants. A significant Granger causal influence of the ERP on the FOS oddball response was uncovered in the 2-6 Hz frequency band for 7 participants. Collectively, our findings suggest that, for a majority of participants, the ERP and the DC intensity signal of the FOS are spectrally coherent, specifically in narrow frequency bands previously associated with event-related oscillations in the prefrontal cortex. However, these electro-optical relationships were only found in a subset of participants. Further research on enhancing the quality of the event-related FOS
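
    A minimal sketch of a magnitude-squared coherence estimate between co-located EEG and NIRS channels in a low-frequency band, using SciPy's Welch-based coherence; the sampling rate, band edges and synthetic signals are illustrative assumptions.

        import numpy as np
        from scipy.signal import coherence

        fs = 100.0                                   # common sampling rate (Hz)
        t = np.arange(0, 60, 1 / fs)
        rng = np.random.default_rng(2)
        shared = np.sin(2 * np.pi * 4.0 * t)         # shared 4 Hz event-related rhythm
        eeg = shared + 0.5 * rng.normal(size=t.size)
        fos = 0.2 * shared + 0.5 * rng.normal(size=t.size)

        f, cxy = coherence(eeg, fos, fs=fs, nperseg=512)
        band = (f >= 3.0) & (f <= 5.0)               # band where the study reports coupling
        print(f"mean coherence 3-5 Hz: {cxy[band].mean():.2f}")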

  8. Timing of transients: quantifying reaching times and transient behavior in complex systems

    NASA Astrophysics Data System (ADS)

    Kittel, Tim; Heitzig, Jobst; Webster, Kevin; Kurths, Jürgen

    2017-08-01

    In dynamical systems, one may ask how long it takes for a trajectory to reach the attractor, i.e. how long it spends in the transient phase. Although for a single trajectory the mathematically precise answer may be infinity, it still makes sense to compare different trajectories and quantify which of them approaches the attractor earlier. In this article, we categorize several problems of quantifying such transient times. To treat them, we propose two metrics, area under distance curve and regularized reaching time, that capture two complementary aspects of transient dynamics. The first, area under distance curve, is the distance of the trajectory to the attractor integrated over time. It measures which trajectories are ‘reluctant’, i.e. stay distant from the attractor for long, or ‘eager’ to approach it right away. Regularized reaching time, on the other hand, quantifies the additional time (positive or negative) that a trajectory starting at a chosen initial condition needs to approach the attractor as compared to some reference trajectory. A positive or negative value means that it approaches the attractor by this much ‘earlier’ or ‘later’ than the reference, respectively. We demonstrated their substantial potential for application with multiple paradigmatic examples uncovering new features.
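
    A minimal sketch of the area-under-distance-curve idea for a trajectory of a damped oscillator relaxing to a fixed point; the example system, the trapezoidal integration and the epsilon threshold used for a reaching time are illustrative choices (the regularized reaching time in the article is defined relative to a reference trajectory, which is not reproduced here).

        import numpy as np
        from scipy.integrate import solve_ivp

        def rhs(t, x):                                   # damped oscillator relaxing to the origin
            return [x[1], -x[0] - 0.4 * x[1]]

        sol = solve_ivp(rhs, (0.0, 50.0), [2.0, 0.0], max_step=0.01)
        dist = np.linalg.norm(sol.y.T, axis=1)           # distance to the attractor (origin)

        # area under distance curve (trapezoidal rule)
        area = np.sum(0.5 * (dist[1:] + dist[:-1]) * np.diff(sol.t))

        eps = 1e-2                                       # "reached" threshold
        hits = sol.t[dist < eps]
        t_reach = hits[0] if hits.size else np.inf
        print(f"area = {area:.2f}, reaching time (eps={eps}) = {t_reach:.2f}")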

  9. PCB Food Web Dynamics Quantify Nutrient and Energy Flow in Aquatic Ecosystems.

    PubMed

    McLeod, Anne M; Paterson, Gordon; Drouillard, Ken G; Haffner, G Douglas

    2015-11-03

    Measuring in situ nutrient and energy flows in spatially and temporally complex aquatic ecosystems represents a major ecological challenge. Food web structure, energy and nutrient budgets are difficult to measure, and it is becoming more important to quantify both energy and nutrient flow to determine how food web processes and structure are being modified by multiple stressors. We propose that polychlorinated biphenyl (PCB) congeners represent an ideal tracer to quantify in situ energy and nutrient flow between trophic levels. Here, we demonstrate how an understanding of PCB congener bioaccumulation dynamics provides multiple direct measurements of energy and nutrient flow in aquatic food webs. To demonstrate this novel approach, we quantified nitrogen (N), phosphorus (P) and caloric turnover rates for Lake Huron lake trout, and reveal how these processes are regulated by both growth rate and fish life history. Although minimal nutrient recycling was observed in young growing fish, slow growing, older lake trout (>5 yr) recycled an average of 482 Tonnes·yr(-1) of N, 45 Tonnes·yr(-1) of P and assimilated 22 TJ yr(-1) of energy. Compared to total P loading rates of 590 Tonnes·yr(-1), the recycling of primarily bioavailable nutrients by fish plays an important role regulating the nutrient states of oligotrophic lakes.

  10. Development of method for quantifying essential tremor using a small optical device.

    PubMed

    Chen, Kai-Hsiang; Lin, Po-Chieh; Chen, Yu-Jung; Yang, Bing-Shiang; Lin, Chin-Hsien

    2016-06-15

    Clinical assessment scales are the most common means used by physicians to assess tremor severity. Some scientific tools that may be able to replace these scales to objectively assess the severity, such as accelerometers, digital tablets, electromyography (EMG) measurement devices, and motion capture cameras, are currently available. However, most of the operational modes of these tools are relatively complex or are only able to capture part of the clinical information; furthermore, using these tools is sometimes time consuming. Currently, there is no tool available for automatically quantifying tremor severity in clinical environments. We aimed to develop a rapid, objective, and quantitative system for measuring the severity of finger tremor using a small portable optical device (Leap Motion). A single test took 15s to conduct, and three algorithms were proposed to quantify the severity of finger tremor. The system was tested with four patients diagnosed with essential tremor. The proposed algorithms were able to quantify different characteristics of tremor in clinical environments, and could be used as references for future clinical assessments. A portable, easy-to-use, small-sized, and noncontact device (Leap Motion) was used to clinically detect and record finger movement, and three algorithms were proposed to describe tremor amplitudes. Copyright © 2016 Elsevier B.V. All rights reserved.

  11. Quantifying Glomerular Permeability of Fluorescent Macromolecules Using 2-Photon Microscopy in Munich Wistar Rats

    PubMed Central

    Sandoval, Ruben M.; Molitoris, Bruce A.

    2013-01-01

    Kidney diseases involving urinary loss of large essential macromolecules, such as serum albumin, have long been thought to be caused by alterations in the permeability barrier comprised of podocytes, vascular endothelial cells, and a basement membrane working in unison. Data from our laboratory using intravital 2-photon microscopy revealed a more permeable glomerular filtration barrier (GFB) than previously thought under physiologic conditions, with retrieval of filtered albumin occurring in an early subset of cells called proximal tubule cells (PTC)1,2,3. Previous techniques used to study renal filtration and establishing the characteristic of the filtration barrier involved micropuncture of the lumen of these early tubular segments with sampling of the fluid content and analysis4. These studies determined albumin concentration in the luminal fluid to be virtually non-existent; corresponding closely to what is normally detected in the urine. However, characterization of dextran polymers with defined sizes by this technique revealed those of a size similar to serum albumin had higher levels in the tubular lumen and urine; suggesting increased permeability5. Herein is a detailed outline of the technique used to directly visualize and quantify glomerular fluorescent albumin permeability in vivo. This method allows for detection of filtered albumin across the filtration barrier into Bowman's space (the initial chamber of urinary filtration); and also allows quantification of albumin reabsorption by proximal tubules and visualization of subsequent albumin transcytosis6. The absence of fluorescent albumin along later tubular segments en route to the bladder highlights the efficiency of the retrieval pathway in the earlier proximal tubule segments. Moreover, when this technique was applied to determine permeability of dextrans having a similar size to albumin virtually identical permeability values were reported2. These observations directly support the need to expand

  12. Development of real-time PCR methods to quantify patulin-producing molds in food products.

    PubMed

    Rodríguez, Alicia; Luque, M Isabel; Andrade, María J; Rodríguez, Mar; Asensio, Miguel A; Córdoba, Juan J

    2011-09-01

    Patulin is a mycotoxin produced by different Penicillium and Aspergillus strains isolated from food products. To improve food safety, the presence of patulin-producing molds in foods should be quantified. In the present work, two real-time (RTi) PCR protocols based on SYBR Green and TaqMan were developed. Thirty-four patulin-producing and 28 non-producing strains belonging to different species usually reported in food products were used. Patulin production was tested by micellar electrokinetic capillary electrophoresis (MECE) and high-pressure liquid chromatography-mass spectrometry (HPLC-MS). A primer pair F-idhtrb/R-idhtrb and the probe IDHprobe were designed from the isoepoxydon dehydrogenase (idh) gene, involved in patulin biosynthesis. The functionality of the developed method was demonstrated by the high linear relationship of the standard curves constructed with the idh gene copy number and Ct values for the different patulin producers tested. The ability of the developed SYBR Green and TaqMan assays to quantify patulin producers in artificially inoculated food samples was demonstrated, with a minimum threshold of 10 conidia g(-1) per reaction. The developed methods quantified fungal load in foods with high efficiency. These RTi-PCR protocols are proposed for use in quantifying patulin-producing molds in food products and preventing patulin from entering the food chain. Copyright © 2011 Elsevier Ltd. All rights reserved.
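
    A minimal sketch of the standard-curve logic behind such RTi-PCR quantification: fit Ct against log10 copy number, derive the amplification efficiency from the slope, and invert the fit for an unknown sample. The calibration values are illustrative, not the paper's data.

        import numpy as np

        copies = np.array([1e2, 1e3, 1e4, 1e5, 1e6])          # standards (idh copies/reaction)
        ct     = np.array([33.1, 29.7, 26.3, 22.9, 19.5])     # measured threshold cycles

        slope, intercept = np.polyfit(np.log10(copies), ct, 1)
        efficiency = 10 ** (-1.0 / slope) - 1.0               # ~1.0 means 100% efficient

        ct_unknown = 27.4
        log_copies = (ct_unknown - intercept) / slope          # invert the standard curve
        print(f"efficiency = {efficiency:.2f}, estimated copies = {10**log_copies:.0f}")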

  13. New approaches to high-resolution mapping of marine vertical structures.

    PubMed

    Robert, Katleen; Huvenne, Veerle A I; Georgiopoulou, Aggeliki; Jones, Daniel O B; Marsh, Leigh; D O Carter, Gareth; Chaumillon, Leo

    2017-08-21

    Vertical walls in marine environments can harbour high biodiversity and provide natural protection from bottom-trawling activities. However, traditional mapping techniques are usually restricted to down-looking approaches which cannot adequately replicate their 3D structure. We combined sideways-looking multibeam echosounder (MBES) data from an AUV, forward-looking MBES data from ROVs and ROV-acquired videos to examine walls from Rockall Bank and Whittard Canyon, Northeast Atlantic. High-resolution 3D point clouds were extracted from each sonar dataset and structure from motion photogrammetry (SfM) was applied to recreate 3D representations of video transects along the walls. With these reconstructions, it was possible to interact with extensive sections of video footage and precisely position individuals. Terrain variables were derived on scales comparable to those experienced by megabenthic individuals. These were used to show differences in environmental conditions between observed and background locations as well as explain spatial patterns in ecological characteristics. In addition, since the SfM 3D reconstructions retained colours, they were employed to separate and quantify live coral colonies versus dead framework. The combination of these new technologies allows us, for the first time, to map the physical 3D structure of previously inaccessible habitats and demonstrates the complexity and importance of vertical structures.

  14. Reconstructing The Star Formation Histories Of Galaxies Through Sed Fitting Using The Dense Basis Method

    NASA Astrophysics Data System (ADS)

    Iyer, Kartheik; Gawiser, Eric

    2017-06-01

    The Dense Basis SED fitting method reveals previously inaccessible information about the number and duration of star formation episodes and the timing of stellar mass assembly, as well as the uncertainties in these quantities, in addition to accurately recovering traditional SED parameters including M*, SFR, and dust attenuation. This is done using basis star formation histories (SFHs) chosen by comparing the goodness-of-fit of mock galaxy SEDs to the goodness-of-reconstruction of their SFHs, trained and validated on three independent datasets of mock galaxies at z = 1 from semi-analytic models (SAMs), hydrodynamic simulations, and stochastic realizations. Of the six SFH parametrizations considered, we reject the traditional constant and exponential forms and suggest four novel improvements, quantifying the bias and scatter of each parametrization. We then apply the method to a sample of 1100 CANDELS GOODS-S galaxies at 1 < z < 1.5 with M* > 10^9 M_sun and recover SFHs that contrast with those found in current simulations. About 40% of the CANDELS galaxies have SFHs whose maximum occurs at or near the epoch of observation. These results are presented in Iyer and Gawiser (2017, ApJ 838, 127), available at https://arxiv.org/abs/1702.04371
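
    The core idea of comparing SFH parametrizations by goodness-of-fit can be illustrated with a toy chi-square experiment; the sketch below is not the Dense Basis code, and the template matrix, photometric bands, and SFH shapes are all synthetic stand-ins.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Toy setup: the "SED" in 8 bands is a linear map of an SFH sampled at 10 ages.
    ages = np.linspace(0.1, 10.0, 10)           # Gyr, hypothetical age grid
    templates = rng.lognormal(size=(8, 10))     # stand-in for SSP template fluxes

    def sed_from_sfh(sfh):
        return templates @ sfh

    # "True" SFH with two bursts; observed SED with 5% noise.
    true_sfh = (np.exp(-0.5 * ((ages - 2.0) / 0.5) ** 2)
                + 0.6 * np.exp(-0.5 * ((ages - 7.0) / 0.8) ** 2))
    sigma = 0.05 * sed_from_sfh(true_sfh)
    observed = sed_from_sfh(true_sfh) + rng.normal(scale=sigma)

    def best_chi2(shape_family, grid):
        """Minimum chi-square over a parameter grid, fitting the amplitude analytically."""
        chi2_vals = []
        for p in grid:
            m = sed_from_sfh(shape_family(p))
            a = np.sum(observed * m / sigma**2) / np.sum(m**2 / sigma**2)
            chi2_vals.append(np.sum(((observed - a * m) / sigma) ** 2))
        return min(chi2_vals)

    constant = lambda _p: np.ones_like(ages)       # constant SFH (amplitude fitted)
    exponential = lambda tau: np.exp(-ages / tau)  # declining exponential SFH

    print("constant    chi2:", round(best_chi2(constant, [0.0]), 1))
    print("exponential chi2:", round(best_chi2(exponential, np.linspace(0.5, 10, 50)), 1))
    ```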

  15. Amino acid production exceeds plant nitrogen demand in Siberian tundra

    NASA Astrophysics Data System (ADS)

    Wild, Birgit; Eloy Alves, Ricardo J.; Bárta, Jiři; Čapek, Petr; Gentsch, Norman; Guggenberger, Georg; Hugelius, Gustaf; Knoltsch, Anna; Kuhry, Peter; Lashchinskiy, Nikolay; Mikutta, Robert; Palmtag, Juri; Prommer, Judith; Schnecker, Jörg; Shibistova, Olga; Takriti, Mounir; Urich, Tim; Richter, Andreas

    2018-03-01

    Arctic plant productivity is often limited by low soil N availability. This has been attributed to slow breakdown of N-containing polymers in litter and soil organic matter (SOM) into smaller, plant-available units, and to shallow plant rooting constrained by permafrost and high soil moisture. Using 15N pool dilution assays, we quantified gross amino acid and ammonium production rates in 97 active-layer samples from four sites across the Siberian Arctic. We found that amino acid production in organic layers alone exceeded literature-based estimates of maximum plant N uptake 17-fold, and we therefore reject the hypothesis that arctic plant N limitation results from slow SOM breakdown. High microbial N use efficiency in organic layers instead suggests strong competition between microorganisms and plants in the dominant rooting zone. Deeper horizons showed lower amino acid production rates per volume, but also lower microbial N use efficiency. Permafrost thaw, together with soil drainage, might facilitate deeper plant rooting and uptake of previously inaccessible subsoil N, and thereby promote plant productivity in arctic ecosystems. We conclude that changes in microbial decomposer activity, microbial N utilization, and plant root density with soil depth interactively control N availability for plants in the Arctic.
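
    For readers unfamiliar with pool-dilution arithmetic, the sketch below implements the commonly used Kirkham-Bartholomew equations for gross production and consumption from total pool sizes and 15N excess at two time points; it is a generic illustration with invented numbers, not the exact calculation used in this study.

    ```python
    import numpy as np

    def gross_rates(M0, Mt, H0, Ht, t):
        """
        Kirkham & Bartholomew (1954) pool-dilution equations.

        M0, Mt : total pool size (e.g., ug N g^-1 soil) at time 0 and t
        H0, Ht : 15N excess in the pool at time 0 and t
        t      : incubation time
        Returns (gross production, gross consumption) in pool units per time unit.
        """
        if np.isclose(M0, Mt):
            # Special case: pool size unchanged, only the label is diluted.
            production = (M0 / t) * np.log(H0 / Ht)
            consumption = production
        else:
            production = ((M0 - Mt) / t) * np.log((H0 * Mt) / (Ht * M0)) / np.log(M0 / Mt)
            consumption = production - (Mt - M0) / t
        return production, consumption

    # Invented example: amino acid pool shrinks from 12 to 9 ug N g^-1 over 24 h
    # while its 15N excess declines from 1.5 to 0.6 ug N g^-1.
    p, c = gross_rates(M0=12.0, Mt=9.0, H0=1.5, Ht=0.6, t=24.0)
    print(f"gross production: {p:.3f} ug N g^-1 h^-1, gross consumption: {c:.3f}")
    ```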

  16. Using channelized Hotelling observers to quantify temporal effects of medical liquid crystal displays on detection performance

    NASA Astrophysics Data System (ADS)

    Platiša, Ljiljana; Goossens, Bart; Vansteenkiste, Ewout; Badano, Aldo; Philips, Wilfried

    2010-02-01

    Clinical practice is rapidly moving in the direction of volumetric imaging. Radiologists often interpret these images on liquid crystal displays at browsing rates of 30 frames per second or higher. However, recent studies suggest that the slow temporal response of the display can compromise image quality. In order to quantify the temporal effect of medical displays on detection performance, we investigate two designs of a multi-slice channelized Hotelling observer (msCHO) model in the task of detecting a single-slice signal in multi-slice simulated images. The design of the msCHO models is inspired by simplifying assumptions about how humans view images in stack-browsing mode. For comparison, we consider a standard CHO applied only to the slice in which the signal is located, as recently used in a similar study; we refer to it as a single-slice CHO (ssCHO). Overall, our results confirm previous findings that the slow response of displays degrades detection performance. More specifically, the msCHO designs achieve higher performance than the ssCHO, suggesting that the extent and rate of degradation, though significant, may be less drastic than previously estimated with the ssCHO. In particular, the difference between the msCHO and ssCHO is more pronounced at higher browsing speeds than for slow image sequences or static images. This, together with design criteria driven by assumptions about human viewing, makes the msCHO models promising candidates for further studies aimed at building anthropomorphic observer models for stack-mode image presentation.
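
    A minimal single-slice CHO on synthetic images may help make the observer model concrete; the channels, image statistics, and dimensions below are hypothetical and far simpler than the msCHO designs evaluated in the paper.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    n, n_train, n_test = 32, 200, 200   # image size and sample counts (hypothetical)

    # Synthetic images: white noise with/without a faint Gaussian signal at the centre.
    y, x = np.mgrid[:n, :n]
    signal = 0.4 * np.exp(-((x - n / 2) ** 2 + (y - n / 2) ** 2) / (2 * 3.0 ** 2))

    def sample(with_signal, count):
        imgs = rng.normal(size=(count, n, n)) + (signal if with_signal else 0.0)
        return imgs.reshape(count, -1)

    # Simple radial (Gaussian ring) channels standing in for e.g. Laguerre-Gauss channels.
    r = np.hypot(x - n / 2, y - n / 2)
    channels = np.stack([np.exp(-0.5 * ((r - rad) / 2.0) ** 2) for rad in (0, 3, 6, 9, 12)])
    U = channels.reshape(len(channels), -1).T          # (pixels, channels)

    def channelize(imgs):
        return imgs @ U

    # Train the Hotelling template on channelized training data.
    vs, vn = channelize(sample(True, n_train)), channelize(sample(False, n_train))
    S = 0.5 * (np.cov(vs.T) + np.cov(vn.T))            # pooled channel covariance
    w = np.linalg.solve(S, vs.mean(0) - vn.mean(0))    # Hotelling template

    # Detectability index d' on an independent test set.
    ts, tn = channelize(sample(True, n_test)) @ w, channelize(sample(False, n_test)) @ w
    dprime = (ts.mean() - tn.mean()) / np.sqrt(0.5 * (ts.var() + tn.var()))
    print(f"d' = {dprime:.2f}")
    ```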

  17. A Sustainability Initiative to Quantify Carbon Sequestration by Campus Trees

    ERIC Educational Resources Information Center

    Cox, Helen M.

    2012-01-01

    Over 3,900 trees on a university campus were inventoried by an instructor-led team of geography undergraduates in order to quantify the carbon sequestration associated with biomass growth. The setting of the project is described, together with its logistics, methodology, outcomes, and benefits. This hands-on project provided a team of students…
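
    As a rough illustration of the arithmetic behind such an inventory (not the authors' method), the sketch below converts tree diameter to stored CO2 using a generic allometric biomass equation; the coefficients are placeholders, and real studies use species-specific equations and measured growth.

    ```python
    # Generic allometric estimate: aboveground biomass (kg) ~ A * DBH^B, with DBH in cm.
    # The coefficients, root fraction, and carbon fraction below are illustrative placeholders.
    A, B = 0.10, 2.4           # hypothetical allometric coefficients
    ROOT_FACTOR = 1.2          # add ~20% for belowground biomass
    CARBON_FRACTION = 0.5      # fraction of dry biomass that is carbon (rule of thumb)
    CO2_PER_C = 44.0 / 12.0    # kg CO2 per kg C

    def tree_co2_kg(dbh_cm: float) -> float:
        """Rough CO2 equivalent stored in one tree, from its diameter at breast height."""
        biomass = A * dbh_cm ** B * ROOT_FACTOR
        return biomass * CARBON_FRACTION * CO2_PER_C

    inventory_dbh_cm = [12.0, 25.0, 40.0, 55.0]   # invented campus trees
    total = sum(tree_co2_kg(d) for d in inventory_dbh_cm)
    print(f"total stored CO2 (rough): {total:.0f} kg")
    ```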

  18. Quantifying suitable habitat of the threatened western prairie fringed orchid

    Treesearch

    Paige M. Wolken; Carolyn Hull Sieg; Stephen E. Williams

    2001-01-01

    Land managers need quick, accurate techniques to identify suitable habitat for species of interest. For species protected by federal or state laws, identifying suitable habitat is critical for developing a conservation strategy that includes reestablishing populations and altering management accordingly. In this research, we quantified vegetative and...

  19. 14 CFR 121.406 - Credit for previous CRM/DRM training.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 14 Aeronautics and Space 3 2011-01-01 2011-01-01 false Credit for previous CRM/DRM training. 121... previous CRM/DRM training. (a) For flightcrew members, the Administrator may credit CRM training received before March 19, 1998 toward all or part of the initial ground CRM training required by § 121.419. (b...

  20. 14 CFR 121.406 - Credit for previous CRM/DRM training.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 14 Aeronautics and Space 3 2010-01-01 2010-01-01 false Credit for previous CRM/DRM training. 121... previous CRM/DRM training. (a) For flightcrew members, the Administrator may credit CRM training received before March 19, 1998 toward all or part of the initial ground CRM training required by § 121.419. (b...