Sample records for the search topic "quantified results show"

  1. Quantifying the uncertainty in heritability.

    PubMed

    Furlotte, Nicholas A; Heckerman, David; Lippert, Christoph

    2014-05-01

    The use of mixed models to determine narrow-sense heritability and related quantities such as SNP heritability has received much recent attention. Less attention has been paid to the inherent variability in these estimates. One approach for quantifying variability in estimates of heritability is a frequentist approach, in which heritability is estimated using maximum likelihood and its variance is quantified through an asymptotic normal approximation. An alternative approach is to quantify the uncertainty in heritability through its Bayesian posterior distribution. In this paper, we develop the latter approach, make it computationally efficient and compare it to the frequentist approach. We show theoretically that, for a sufficiently large sample size and intermediate values of heritability, the two approaches provide similar results. Using the Atherosclerosis Risk in Communities cohort, we show empirically that the two approaches can give different results and that the variance/uncertainty can remain large.
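A minimal sketch of the contrast the abstract draws, using a toy variance-components model with a simulated kinship matrix; this is not the authors' efficient implementation, and the grid posterior below profiles out the scale parameter rather than marginalizing it (a simplification):

```python
import numpy as np
from scipy.linalg import eigh

rng = np.random.default_rng(0)
n, m = 200, 500
Z = rng.normal(size=(n, m))
K = Z @ Z.T / m                          # toy kinship matrix (PSD by construction)
w, U = eigh(K)                           # eigen-decomposition of K

h2_true = 0.5
y = U @ (np.sqrt(h2_true * w + (1 - h2_true)) * rng.normal(size=n))

def profile_loglik(h2):
    lam = h2 * w + (1 - h2)              # eigenvalues of h2*K + (1-h2)*I
    yt = U.T @ y
    s2 = np.mean(yt ** 2 / lam)          # profile out the overall scale sigma^2
    return -0.5 * (np.sum(np.log(lam)) + n * np.log(s2))

grid = np.linspace(0.01, 0.99, 197)
ll = np.array([profile_loglik(h) for h in grid])
post = np.exp(ll - ll.max())
post /= post.sum()                       # posterior under a flat prior on h2

print("MLE h2:", grid[np.argmax(ll)])
print("posterior mean h2:", float(np.sum(grid * post)))
print("95% credible interval:",
      grid[np.searchsorted(np.cumsum(post), [0.025, 0.975])])
```

With large n and intermediate h2 the posterior is near-Gaussian around the MLE, matching the paper's theoretical point; the interesting cases are small samples or h2 near the boundary, where the two summaries diverge.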

  2. Quantifying the uncertainty in heritability

    PubMed Central

    Furlotte, Nicholas A; Heckerman, David; Lippert, Christoph

    2014-01-01

    The use of mixed models to determine narrow-sense heritability and related quantities such as SNP heritability has received much recent attention. Less attention has been paid to the inherent variability in these estimates. One approach for quantifying variability in estimates of heritability is a frequentist approach, in which heritability is estimated using maximum likelihood and its variance is quantified through an asymptotic normal approximation. An alternative approach is to quantify the uncertainty in heritability through its Bayesian posterior distribution. In this paper, we develop the latter approach, make it computationally efficient and compare it to the frequentist approach. We show theoretically that, for a sufficiently large sample size and intermediate values of heritability, the two approaches provide similar results. Using the Atherosclerosis Risk in Communities cohort, we show empirically that the two approaches can give different results and that the variance/uncertainty can remain large. PMID:24670270

  3. Quantifying errors without random sampling.

    PubMed

    Phillips, Carl V; LaPole, Luwanna M

    2003-06-12

    All quantifications of mortality, morbidity, and other health measures involve numerous sources of error. The routine quantification of random sampling error makes it easy to forget that other sources of error can and should be quantified. When a quantification does not involve sampling, error is almost never quantified and results are often reported in ways that dramatically overstate their precision. We argue that the precision implicit in typical reporting is problematic and sketch methods for quantifying the various sources of error, building up from simple examples that can be solved analytically to more complex cases. There are straightforward ways to partially quantify the uncertainty surrounding a parameter that is not characterized by random sampling, such as limiting reported significant figures. We present simple methods for doing such quantifications, and for incorporating them into calculations. More complicated methods become necessary when multiple sources of uncertainty must be combined. We demonstrate that Monte Carlo simulation, using available software, can estimate the uncertainty resulting from complicated calculations with many sources of uncertainty. We apply the method to the current estimate of the annual incidence of foodborne illness in the United States. Quantifying uncertainty from systematic errors is practical. Reporting this uncertainty would more honestly represent study results, help show the probability that estimated values fall within some critical range, and facilitate better targeting of further research.
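A hedged illustration of the Monte Carlo approach described above; the input distributions are invented placeholders, not the paper's foodborne-illness inputs:

```python
import numpy as np

rng = np.random.default_rng(42)
N = 100_000

# Hypothetical model: true cases = reported * underreporting * attributable fraction
reported = rng.normal(50_000, 5_000, N)             # surveillance count (uncertain)
underreporting = rng.lognormal(np.log(20), 0.3, N)  # skewed multiplier
attribution = rng.uniform(0.3, 0.7, N)              # fraction attributable

total = reported * underreporting * attribution

lo, mid, hi = np.percentile(total, [2.5, 50, 97.5])
print(f"median {mid:,.0f}, 95% uncertainty interval ({lo:,.0f}, {hi:,.0f})")
```

The width of the resulting interval, driven mostly by the skewed underreporting factor, is exactly the systematic uncertainty that a naive point estimate would hide.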

  4. Quantifying the measurement uncertainty of results from environmental analytical methods.

    PubMed

    Moser, J; Wegscheider, W; Sperka-Gottlieb, C

    2001-07-01

    The Eurachem-CITAC Guide Quantifying Uncertainty in Analytical Measurement was put into practice in a public laboratory devoted to environmental analytical measurements. In doing so due regard was given to the provisions of ISO 17025 and an attempt was made to base the entire estimation of measurement uncertainty on available data from the literature or from previously performed validation studies. Most environmental analytical procedures laid down in national or international standards are the result of cooperative efforts and put into effect as part of a compromise between all parties involved, public and private, that also encompasses environmental standards and statutory limits. Central to many procedures is the focus on the measurement of environmental effects rather than on individual chemical species. In this situation it is particularly important to understand the measurement process well enough to produce a realistic uncertainty statement. Environmental analytical methods will be examined as far as necessary, but reference will also be made to analytical methods in general and to physical measurement methods where appropriate. This paper describes ways and means of quantifying uncertainty for frequently practised methods of environmental analysis. It will be shown that operationally defined measurands are no obstacle to the estimation process as described in the Eurachem/CITAC Guide if it is accepted that the dominating component of uncertainty comes from the actual practice of the method as a reproducibility standard deviation.
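The bottom-up arithmetic in the Eurachem/CITAC Guide reduces to combining standard-uncertainty components in quadrature and applying a coverage factor; a small sketch with invented component values, with reproducibility dominant as the abstract suggests:

```python
import math

# relative standard uncertainties from hypothetical validation data
components = {
    "reproducibility": 0.06,   # method reproducibility s_R (dominant component)
    "calibration":     0.02,
    "recovery":        0.03,
    "volume":          0.01,
}

u_c = math.sqrt(sum(u ** 2 for u in components.values()))  # combined uncertainty
U = 2 * u_c                                                # expanded, k = 2 (~95 %)
print(f"u_c = {u_c:.3f} (relative), U = {U:.3f}")
```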

  5. Btu accounting: Showing results

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nelson, K.E.

    1994-10-01

In the preceding article in this series last month, the author showed how to calculate the energy consumed to make a pound of product. To realize a payoff, however, the results must be presented in graphs or tables that clearly display what has happened. They must call attention to plant performance and ultimately lead to more efficient use of energy. Energy-consumption reporting is particularly valuable when viewed over a period of time. The author recommends compiling data annually and maintaining a ten-year performance history. Four cases are considered: individual plant performance; site performance, for sites having more than one plant; company performance, for companies having more than one site; and performance based on product, for identical or similar products made at different plants or sites. Of these, individual plant performance is inherently the most useful. It also serves as the best basis for site, company and product performance reports. A key element in energy accounting is the relating of all energy consumption to a common basis. As developed last month in Part 1 in this series, the author chose Btu_meth (i.e., Btu of methane equivalent, expressed as its higher heating value) for this purpose. It represents the amount of methane that would be needed to replace (in the case of fuels) or generate (in the case of steam and power) the energy being used.

  6. Neural basis for generalized quantifier comprehension.

    PubMed

    McMillan, Corey T; Clark, Robin; Moore, Peachie; Devita, Christian; Grossman, Murray

    2005-01-01

    Generalized quantifiers like "all cars" are semantically well understood, yet we know little about their neural representation. Our model of quantifier processing includes a numerosity device, operations that combine number elements and working memory. Semantic theory posits two types of quantifiers: first-order quantifiers identify a number state (e.g. "at least 3") and higher-order quantifiers additionally require maintaining a number state actively in working memory for comparison with another state (e.g. "less than half"). We used BOLD fMRI to test the hypothesis that all quantifiers recruit inferior parietal cortex associated with numerosity, while only higher-order quantifiers recruit prefrontal cortex associated with executive resources like working memory. Our findings showed that first-order and higher-order quantifiers both recruit right inferior parietal cortex, suggesting that a numerosity component contributes to quantifier comprehension. Moreover, only probes of higher-order quantifiers recruited right dorsolateral prefrontal cortex, suggesting involvement of executive resources like working memory. We also observed activation of thalamus and anterior cingulate that may be associated with selective attention. Our findings are consistent with a large-scale neural network centered in frontal and parietal cortex that supports comprehension of generalized quantifiers.

  7. Quantifying edge significance on maintaining global connectivity

    PubMed Central

    Qian, Yuhua; Li, Yebin; Zhang, Min; Ma, Guoshuai; Lu, Furong

    2017-01-01

Global connectivity is a quite important issue for networks. The failures of some key edges may lead to a breakdown of the whole system. Finding them provides a better understanding of system robustness. Based on topological information, we propose an approach named LE (link entropy) to quantify the edge significance on maintaining global connectivity. We then compare the LE with six other acknowledged indices of edge significance: edge betweenness centrality, degree product, bridgeness, diffusion importance, topological overlap and k-path edge centrality. Experimental results show that the LE approach outperforms these indices in quantifying edge significance for maintaining global connectivity. PMID:28349923
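The LE index itself is the authors' contribution and is not reproduced here; as a point of reference, this sketch ranks edges by one of the benchmark indices named above (edge betweenness centrality, via networkx) and checks the effect of removing top-ranked edges on global connectivity:

```python
import networkx as nx

G = nx.karate_club_graph()   # stand-in network; any connected graph works

# benchmark index from the comparison list: edge betweenness centrality
eb = nx.edge_betweenness_centrality(G)
ranked = sorted(eb, key=eb.get, reverse=True)

# crude significance check: remove top-ranked edges, watch global connectivity
H = G.copy()
for edge in ranked[:10]:
    H.remove_edge(*edge)
print("components after removing top-10 edges:", nx.number_connected_components(H))
```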

  8. Quantifying potential recharge in mantled sinkholes using ERT.

    PubMed

    Schwartz, Benjamin F; Schreiber, Madeline E

    2009-01-01

    Potential recharge through thick soils in mantled sinkholes was quantified using differential electrical resistivity tomography (ERT). Conversion of time series two-dimensional (2D) ERT profiles into 2D volumetric water content profiles using a numerically optimized form of Archie's law allowed us to monitor temporal changes in water content in soil profiles up to 9 m in depth. Combining Penman-Monteith daily potential evapotranspiration (PET) and daily precipitation data with potential recharge calculations for three sinkhole transects indicates that potential recharge occurred only during brief intervals over the study period and ranged from 19% to 31% of cumulative precipitation. Spatial analysis of ERT-derived water content showed that infiltration occurred both on sinkhole flanks and in sinkhole bottoms. Results also demonstrate that mantled sinkholes can act as regions of both rapid and slow recharge. Rapid recharge is likely the result of flow through macropores (such as root casts and thin gravel layers), while slow recharge is the result of unsaturated flow through fine-grained sediments. In addition to developing a new method for quantifying potential recharge at the field scale in unsaturated conditions, we show that mantled sinkholes are an important component of storage in a karst system.
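A sketch of the resistivity-to-water-content conversion step via Archie's law, using generic textbook parameters rather than the paper's numerically optimized values:

```python
import numpy as np

def water_content(rho, rho_w=20.0, phi=0.45, a=1.0, m=2.0, n=2.0):
    """Volumetric water content from bulk resistivity via Archie's law.

    rho   : bulk resistivity from the ERT inversion (ohm-m)
    rho_w : pore-water resistivity (ohm-m)
    All parameter values here are generic assumptions, not the paper's
    optimized ones. Saturation S = (a*rho_w / (phi^m * rho))^(1/n),
    and theta = phi * S.
    """
    saturation = (a * rho_w / (phi ** m * rho)) ** (1.0 / n)
    return phi * np.clip(saturation, 0.0, 1.0)

print(water_content(np.array([200.0, 500.0, 2000.0])))  # drier as rho rises
```

Differencing such profiles between ERT acquisition times is what turns the time-lapse resistivity sections into the water-content changes the study tracks.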

  9. Quantifiable Lateral Flow Assay Test Strips

    NASA Technical Reports Server (NTRS)

    2003-01-01

As easy to read as a home pregnancy test, three Quantifiable Lateral Flow Assay (QLFA) strips used to test water for E. coli show different results. The brightly glowing control line on the far right of each strip indicates that all three tests ran successfully. But the glowing test lines on the middle-left and bottom strips reveal that their samples were contaminated with E. coli bacteria at two different concentrations. The color intensity correlates with the concentration of contamination.

  10. Children's interpretations of general quantifiers, specific quantifiers, and generics

    PubMed Central

    Gelman, Susan A.; Leslie, Sarah-Jane; Was, Alexandra M.; Koch, Christina M.

    2014-01-01

    Recently, several scholars have hypothesized that generics are a default mode of generalization, and thus that young children may at first treat quantifiers as if they were generic in meaning. To address this issue, the present experiment provides the first in-depth, controlled examination of the interpretation of generics compared to both general quantifiers ("all Xs", "some Xs") and specific quantifiers ("all of these Xs", "some of these Xs"). We provided children (3 and 5 years) and adults with explicit frequency information regarding properties of novel categories, to chart when "some", "all", and generics are deemed appropriate. The data reveal three main findings. First, even 3-year-olds distinguish generics from quantifiers. Second, when children make errors, they tend to be in the direction of treating quantifiers like generics. Third, children were more accurate when interpreting specific versus general quantifiers. We interpret these data as providing evidence for the position that generics are a default mode of generalization, especially when reasoning about kinds. PMID:25893205

  11. A new index quantifying the precipitation extremes

    NASA Astrophysics Data System (ADS)

    Busuioc, Aristita; Baciu, Madalina; Stoica, Cerasela

    2015-04-01

…Meteorological Administration in Romania. These types of records contain the rainfall intensity (mm/minute) over various intervals for which it remains constant. The maximum intensity for each continuous rain over the May-August interval has been calculated for each year. The corresponding time series over the 1951-2008 period have been analysed in terms of their long-term trends and shifts in the mean; the results have been compared to those resulting from other rainfall indices based on daily and hourly data computed over the same interval, such as: total rainfall amount, maximum daily amount, contribution of total hourly amounts exceeding 10 mm/day, contribution of daily amounts exceeding the 90th percentile, and the 90th, 99th and 99.9th percentiles of 1-hour data. The results show that the proposed index exhibits a coherent and stronger climate signal (significant increase) for all analysed stations compared to the other indices associated with precipitation extremes, which show either no significant change or a weaker signal. This finding shows that the proposed index is the most appropriate for quantifying the climate change signal of the precipitation extremes. We consider that this index is more naturally connected to the maximum intensity of a real rainfall event. The research presented in this study was funded by the Executive Agency for Higher Education, Research, Development and Innovation Funding (UEFISCDI) through the research project CLIMHYDEX, "Changes in climate extremes and associated impact in hydrological events in Romania", code PNII-ID-2011-2-0073 (http://climhydex.meteoromania.ro)

  12. Comb-Push Ultrasound Shear Elastography of Breast Masses: Initial Results Show Promise

    PubMed Central

    Song, Pengfei; Fazzio, Robert T.; Pruthi, Sandhya; Whaley, Dana H.; Chen, Shigao; Fatemi, Mostafa

    2015-01-01

Purpose or Objective: To evaluate the performance of Comb-push Ultrasound Shear Elastography (CUSE) for classification of breast masses. Materials and Methods: CUSE is an ultrasound-based quantitative two-dimensional shear wave elasticity imaging technique, which utilizes multiple laterally distributed acoustic radiation force (ARF) beams to simultaneously excite the tissue and induce shear waves. Female patients who were categorized as having suspicious breast masses underwent CUSE evaluations prior to biopsy. An elasticity estimate within the breast mass was obtained from the CUSE shear wave speed map. Elasticity estimates of various types of benign and malignant masses were compared with biopsy results. Results: Fifty-four female patients with suspicious breast masses from our ongoing study are presented. Our cohort included 31 malignant and 23 benign breast masses. Our results indicate that the mean shear wave speed was significantly higher in malignant masses (6 ± 1.58 m/s) in comparison to benign masses (3.65 ± 1.36 m/s). Therefore, the stiffness of the mass quantified by the Young's modulus is significantly higher in malignant masses. According to the receiver operating characteristic curve (ROC), the optimal cut-off value of 83 kPa yields 87.10% sensitivity, 82.61% specificity, and 0.88 for the area under the curve (AUC). Conclusion: CUSE has the potential for clinical utility as a quantitative diagnostic imaging tool adjunct to B-mode ultrasound for differentiation of malignant and benign breast masses. PMID:25774978
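A sketch of the cutoff-selection step: the shear-wave speeds below are simulated around the reported group statistics (not patient data), converted to Young's modulus with the standard elastography relation E = 3ρc² (ρ ≈ 1000 kg/m³, so E in kPa ≈ 3c²), and a Youden-optimal cutoff is read off; only the 83 kPa figure comes from the paper:

```python
import numpy as np

rng = np.random.default_rng(7)

# simulate shear wave speeds (m/s) around the reported group statistics
benign = rng.normal(3.65, 1.36, 23)
malignant = rng.normal(6.0, 1.58, 31)

E = lambda c: 3.0 * c ** 2            # Young's modulus in kPa, rho ~ 1000 kg/m^3

scores = np.concatenate([E(benign), E(malignant)])
labels = np.concatenate([np.zeros(23), np.ones(31)])

best_j, best_cut = -1.0, None
for cut in np.sort(scores):           # scan candidate cutoffs, maximize Youden's J
    sens = np.mean(scores[labels == 1] >= cut)
    spec = np.mean(scores[labels == 0] < cut)
    if sens + spec - 1 > best_j:
        best_j, best_cut = sens + spec - 1, cut
print(f"Youden-optimal cutoff ~ {best_cut:.0f} kPa (paper reports 83 kPa)")
```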

  13. 13. DETAIL VIEW OF BUTTRESS 4 SHOWING THE RESULTS OF ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

13. DETAIL VIEW OF BUTTRESS 4 SHOWING THE RESULTS OF POOR CONSTRUCTION WORK. THOUGH NOT A SERIOUS STRUCTURAL DEFICIENCY, THE 'HONEYCOMB' TEXTURE OF THE CONCRETE SURFACE WAS THE RESULT OF INADEQUATE TAMPING AT THE TIME OF THE INITIAL 'POUR'. - Hume Lake Dam, Sequoia National Forest, Hume, Fresno County, CA

  14. Comb-push ultrasound shear elastography of breast masses: initial results show promise.

    PubMed

    Denis, Max; Mehrmohammadi, Mohammad; Song, Pengfei; Meixner, Duane D; Fazzio, Robert T; Pruthi, Sandhya; Whaley, Dana H; Chen, Shigao; Fatemi, Mostafa; Alizad, Azra

    2015-01-01

    To evaluate the performance of Comb-push Ultrasound Shear Elastography (CUSE) for classification of breast masses. CUSE is an ultrasound-based quantitative two-dimensional shear wave elasticity imaging technique, which utilizes multiple laterally distributed acoustic radiation force (ARF) beams to simultaneously excite the tissue and induce shear waves. Female patients who were categorized as having suspicious breast masses underwent CUSE evaluations prior to biopsy. An elasticity estimate within the breast mass was obtained from the CUSE shear wave speed map. Elasticity estimates of various types of benign and malignant masses were compared with biopsy results. Fifty-four female patients with suspicious breast masses from our ongoing study are presented. Our cohort included 31 malignant and 23 benign breast masses. Our results indicate that the mean shear wave speed was significantly higher in malignant masses (6 ± 1.58 m/s) in comparison to benign masses (3.65 ± 1.36 m/s). Therefore, the stiffness of the mass quantified by the Young's modulus is significantly higher in malignant masses. According to the receiver operating characteristic curve (ROC), the optimal cut-off value of 83 kPa yields 87.10% sensitivity, 82.61% specificity, and 0.88 for the area under the curve (AUC). CUSE has the potential for clinical utility as a quantitative diagnostic imaging tool adjunct to B-mode ultrasound for differentiation of malignant and benign breast masses.

  15. Interpreting Quantifier Scope Ambiguity: Evidence of Heuristic First, Algorithmic Second Processing

    PubMed Central

    Dwivedi, Veena D.

    2013-01-01

    The present work suggests that sentence processing requires both heuristic and algorithmic processing streams, where the heuristic processing strategy precedes the algorithmic phase. This conclusion is based on three self-paced reading experiments in which the processing of two-sentence discourses was investigated, where context sentences exhibited quantifier scope ambiguity. Experiment 1 demonstrates that such sentences are processed in a shallow manner. Experiment 2 uses the same stimuli as Experiment 1 but adds questions to ensure deeper processing. Results indicate that reading times are consistent with a lexical-pragmatic interpretation of number associated with context sentences, but responses to questions are consistent with the algorithmic computation of quantifier scope. Experiment 3 shows the same pattern of results as Experiment 2, despite using stimuli with different lexical-pragmatic biases. These effects suggest that language processing can be superficial, and that deeper processing, which is sensitive to structure, only occurs if required. Implications for recent studies of quantifier scope ambiguity are discussed. PMID:24278439

  16. Quantifying causal emergence shows that macro can beat micro.

    PubMed

    Hoel, Erik P; Albantakis, Larissa; Tononi, Giulio

    2013-12-03

    Causal interactions within complex systems can be analyzed at multiple spatial and temporal scales. For example, the brain can be analyzed at the level of neurons, neuronal groups, and areas, over tens, hundreds, or thousands of milliseconds. It is widely assumed that, once a micro level is fixed, macro levels are fixed too, a relation called supervenience. It is also assumed that, although macro descriptions may be convenient, only the micro level is causally complete, because it includes every detail, thus leaving no room for causation at the macro level. However, this assumption can only be evaluated under a proper measure of causation. Here, we use a measure [effective information (EI)] that depends on both the effectiveness of a system's mechanisms and the size of its state space: EI is higher the more the mechanisms constrain the system's possible past and future states. By measuring EI at micro and macro levels in simple systems whose micro mechanisms are fixed, we show that for certain causal architectures EI can peak at a macro level in space and/or time. This happens when coarse-grained macro mechanisms are more effective (more deterministic and/or less degenerate) than the underlying micro mechanisms, to an extent that overcomes the smaller state space. Thus, although the macro level supervenes upon the micro, it can supersede it causally, leading to genuine causal emergence--the gain in EI when moving from a micro to a macro level of analysis.
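A minimal sketch of the EI calculation the abstract describes (mutual information under a maximum-entropy intervention distribution), applied to an invented noisy-and-degenerate micro system whose coarse-graining is deterministic; the example transition matrices are ours, in the spirit of the paper's simple systems:

```python
import numpy as np

def effective_information(tpm):
    # EI = I(X;Y) when causes X are set to the uniform intervention distribution
    n = tpm.shape[0]
    p_y = tpm.mean(axis=0)                         # effect distribution
    with np.errstate(divide="ignore", invalid="ignore"):
        ratio = np.where(tpm > 0, tpm / p_y, 1.0)  # log2(1) = 0 for zero entries
    return float(np.sum(tpm * np.log2(ratio)) / n)

# micro: states 0-2 scatter uniformly among themselves (noisy, degenerate);
# state 3 maps to itself deterministically
micro = np.array([[1/3, 1/3, 1/3, 0.0],
                  [1/3, 1/3, 1/3, 0.0],
                  [1/3, 1/3, 1/3, 0.0],
                  [0.0, 0.0, 0.0, 1.0]])

# macro: group {0,1,2} -> A and {3} -> B; the coarse chain is deterministic
macro = np.array([[1.0, 0.0],
                  [0.0, 1.0]])

print("EI micro:", effective_information(micro))   # ~0.81 bits
print("EI macro:", effective_information(macro))   # 1.0 bits: macro beats micro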

  17. 14. DETAIL VIEW OF BUTTRESS 4 SHOWING THE RESULTS OF ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

14. DETAIL VIEW OF BUTTRESS 4 SHOWING THE RESULTS OF INADEQUATE TAMPING. THE SIZE OF THE GRANITE AGGREGATE USED IN THE DAM'S CONCRETE IS CLEARLY SHOWN. - Hume Lake Dam, Sequoia National Forest, Hume, Fresno County, CA

  18. Quantifying China's regional economic complexity

    NASA Astrophysics Data System (ADS)

    Gao, Jian; Zhou, Tao

    2018-02-01

China has experienced an outstanding economic expansion during the past decades; however, literature on non-monetary metrics that reveal the status of China's regional economic development is still lacking. In this paper, we fill this gap by quantifying the economic complexity of China's provinces through analyzing 25 years' firm data. First, we estimate the regional economic complexity index (ECI), and show that the overall time evolution of provinces' ECI is relatively stable and slow. Then, after linking ECI to the economic development and the income inequality, we find that the explanatory power of ECI is positive for the former but negative for the latter. Next, we compare different measures of economic diversity and explore their relationships with monetary macroeconomic indicators. Results show that the ECI index and the non-linear-iteration-based Fitness index are comparable, and they both have stronger explanatory power than other benchmark measures. Further multivariate regressions suggest the robustness of our results after controlling for other socioeconomic factors. Our work moves a step forward towards a better understanding of China's regional economic development and non-monetary macroeconomic indicators.
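The paper's firm-data pipeline is not reproduced here, but the standard ECI construction it builds on (the Hidalgo-Hausmann eigenvector formulation) can be sketched; the toy region-by-product matrix below is an assumption:

```python
import numpy as np

def eci(M):
    """Economic Complexity Index from a binary region-by-product RCA matrix.
    Standard eigenvector construction; the paper's data preparation for
    building M from firm records is not reproduced here."""
    kr = M.sum(axis=1).astype(float)    # diversity of each region
    kp = M.sum(axis=0).astype(float)    # ubiquity of each product
    W = (M / kr[:, None]) @ (M / kp).T  # row-stochastic region-region operator
    vals, vecs = np.linalg.eig(W)
    v = vecs[:, np.argsort(-vals.real)[1]].real  # 2nd-largest eigenvalue's vector
    if np.corrcoef(v, kr)[0, 1] < 0:    # fix sign: ECI rises with diversity
        v = -v
    return (v - v.mean()) / v.std()     # standardized index

M = np.array([[1, 1, 1, 1],            # toy matrix: region 0 is diversified
              [1, 1, 0, 0],
              [0, 0, 1, 0],
              [1, 0, 0, 0]])
print(eci(M))
```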

  19. Quantifiers more or less quantify online: ERP evidence for partial incremental interpretation

    PubMed Central

    Urbach, Thomas P.; Kutas, Marta

    2010-01-01

    Event-related brain potentials were recorded during RSVP reading to test the hypothesis that quantifier expressions are incrementally interpreted fully and immediately. In sentences tapping general knowledge (Farmers grow crops/worms as their primary source of income), Experiment 1 found larger N400s for atypical (worms) than typical objects (crops). Experiment 2 crossed object typicality with non-logical subject-noun phrase quantifiers (most, few). Off-line plausibility ratings exhibited the crossover interaction predicted by full quantifier interpretation: Most farmers grow crops and Few farmers grow worms were rated more plausible than Most farmers grow worms and Few farmers grow crops. Object N400s, although modulated in the expected direction, did not reverse. Experiment 3 replicated these findings with adverbial quantifiers (Farmers often/rarely grow crops/worms). Interpretation of quantifier expressions thus is neither fully immediate nor fully delayed. Furthermore, object atypicality was associated with a frontal slow positivity in few-type/rarely quantifier contexts, suggesting systematic processing differences among quantifier types. PMID:20640044

  20. Quantifying the Effects of Biofilm on the Hydraulic Properties of Unsaturated Soils

    NASA Astrophysics Data System (ADS)

    Volk, E.; Iden, S.; Furman, A.; Durner, W.; Rosenzweig, R.

    2017-12-01

Quantifying the effects of biofilms on the hydraulic properties of unsaturated soils is necessary for predicting water and solute flow in soils with extensive microbial presence. This can be relevant to bioremediation processes, soil aquifer treatment and effluent irrigation. Previous works showed a reduction in the hydraulic conductivity and an increase in water content due to the addition of biofilm analogue materials. The objective of this research is to quantify the hydraulic properties (water retention and hydraulic conductivity) of unsaturated soil containing real soil biofilm. In this work, Hamra soil was incubated with Luria Broth (LB) and biofilm-producing bacteria (Pseudomonas putida F1). Hydraulic conductivity and water retention were measured by the evaporation method, the dew point method and a constant-head permeameter. Biofilm was quantified using viable counts and the deficit of TOC. The results show that the presence of biofilms increases soil water retention in the 'dry' range of the curve and reduces the hydraulic conductivity. This research shows that biofilms may have a non-negligible effect on flow and transport in unsaturated soils. These findings contribute to modeling water flow in biofilm-amended soil.

  1. Talker-specificity and adaptation in quantifier interpretation

    PubMed Central

    Yildirim, Ilker; Degen, Judith; Tanenhaus, Michael K.; Jaeger, T. Florian

    2015-01-01

    Linguistic meaning has long been recognized to be highly context-dependent. Quantifiers like many and some provide a particularly clear example of context-dependence. For example, the interpretation of quantifiers requires listeners to determine the relevant domain and scale. We focus on another type of context-dependence that quantifiers share with other lexical items: talker variability. Different talkers might use quantifiers with different interpretations in mind. We used a web-based crowdsourcing paradigm to study participants’ expectations about the use of many and some based on recent exposure. We first established that the mapping of some and many onto quantities (candies in a bowl) is variable both within and between participants. We then examined whether and how listeners’ expectations about quantifier use adapts with exposure to talkers who use quantifiers in different ways. The results demonstrate that listeners can adapt to talker-specific biases in both how often and with what intended meaning many and some are used. PMID:26858511

  2. Non-Markovianity quantifier of an arbitrary quantum process

    NASA Astrophysics Data System (ADS)

    Debarba, Tiago; Fanchini, Felipe F.

    2017-12-01

Calculating the degree of non-Markovianity of a quantum process for a high-dimensional system is a difficult task, given the complex maximization problems involved. Focusing on the entanglement-based measure of non-Markovianity, we propose a numerically feasible quantifier for finite-dimensional systems. We define the non-Markovianity measure in terms of a class of entanglement quantifiers named witnessed entanglement, which allows us to write several entanglement-based measures of non-Markovianity in a unique formalism. In this formalism, we show that the non-Markovianity in a given time interval can be witnessed by calculating the expectation value of an observable, making it attractive for experimental investigations. Following this property, we introduce a quantifier based on the entanglement witness over an interval of time, and we show that it is a bona fide measure of non-Markovianity. In our example, we use the generalized robustness of entanglement, an entanglement measure that can be readily calculated by a semidefinite programming method, to study impurity atoms coupled to a Bose-Einstein condensate.

  3. Quantifying Uncertainty in Model Predictions for the Pliocene (Plio-QUMP): Initial results

    USGS Publications Warehouse

    Pope, J.O.; Collins, M.; Haywood, A.M.; Dowsett, H.J.; Hunter, S.J.; Lunt, D.J.; Pickering, S.J.; Pound, M.J.

    2011-01-01

Examination of the mid-Pliocene Warm Period (mPWP; ~3.3 to 3.0 Ma BP) provides an excellent opportunity to test the ability of climate models to reproduce warm climate states, thereby assessing our confidence in model predictions. To do this it is necessary to relate the uncertainty in model simulations of mPWP climate to uncertainties in projections of future climate change. The uncertainties introduced by the model can be estimated through the use of a Perturbed Physics Ensemble (PPE). Developing on the UK Met Office Quantifying Uncertainty in Model Predictions (QUMP) Project, this paper presents the results from an initial investigation using the end members of a PPE in a fully coupled atmosphere-ocean model (HadCM3) running with appropriate mPWP boundary conditions. Prior work has shown that the unperturbed version of HadCM3 may underestimate mPWP sea surface temperatures at higher latitudes. Initial results indicate that neither the low sensitivity nor the high sensitivity simulations produce unequivocally improved mPWP climatology relative to the standard. Whilst the high sensitivity simulation was able to reconcile up to 6 °C of the data/model mismatch in sea surface temperatures in the high latitudes of the Northern Hemisphere (relative to the standard simulation), it did not produce a better prediction of global vegetation than the standard simulation. Overall the low sensitivity simulation was degraded compared to the standard and high sensitivity simulations in all aspects of the data/model comparison. The results have shown that a PPE has the potential to explore weaknesses in mPWP modelling simulations which have been identified by geological proxies, but that a 'best fit' simulation will more likely come from a full ensemble in which simulations that contain the strengths of the two end member simulations shown here are combined. © 2011 Elsevier B.V.

  4. Quantifying a cellular automata simulation of electric vehicles

    NASA Astrophysics Data System (ADS)

    Hill, Graeme; Bell, Margaret; Blythe, Phil

    2014-12-01

Within this work the Nagel-Schreckenberg (NS) cellular automaton is used to simulate a basic cyclic road network. Results from SwitchEV, a real-world electric vehicle trial which has collected more than two years of detailed electric vehicle data, are used to quantify the results of the NS automaton, demonstrating similar power consumption behavior to that observed in the experimental results. In particular, the efficiency of the electric vehicles reduces as the vehicle density increases, due in part to the reduced efficiency of EVs at low speeds, but also due to the energy consumption inherent in changing speeds. Further work shows the results of introducing a spatially restricted speed limit. In general, it can be seen that induced congestion from spatially transient events propagates back through the road network and alters the energy and efficiency profile of the simulated vehicles, both before and after the speed restriction. Vehicles upstream from the restriction show a reduced energy usage and an increased efficiency, and vehicles downstream show an initial large increase in energy usage as they accelerate away from the speed restriction.
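A compact sketch of the four NS update rules on a ring road, plus a crude energy proxy; the braking probability, regeneration factor and energy bookkeeping are our assumptions, not the SwitchEV model:

```python
import numpy as np

rng = np.random.default_rng(1)
L, N, VMAX, P_BRAKE, STEPS = 100, 30, 5, 0.3, 200

pos = np.sort(rng.choice(L, N, replace=False))   # cars on a cyclic road
vel = np.zeros(N, dtype=int)
energy = 0.0

for _ in range(STEPS):
    gaps = (np.roll(pos, -1) - pos - 1) % L      # empty cells ahead of each car
    old = vel.copy()
    vel = np.minimum(vel + 1, VMAX)              # 1. accelerate
    vel = np.minimum(vel, gaps)                  # 2. slow to the leading car
    brake = rng.random(N) < P_BRAKE              # 3. random braking
    vel[brake] = np.maximum(vel[brake] - 1, 0)
    pos = (pos + vel) % L                        # 4. move
    # toy EV energy proxy (our assumption): pay for kinetic-energy gains,
    # recover 60% of braking losses via regeneration
    dk = 0.5 * (vel.astype(float) ** 2 - old ** 2)
    energy += dk[dk > 0].sum() - 0.6 * (-dk[dk < 0]).sum()

print(f"density {N/L:.2f}: mean speed {vel.mean():.2f}, "
      f"energy/step {energy/STEPS:.1f}")
```

Sweeping N reproduces the qualitative point of the abstract: past the jamming density, mean speed collapses and the stop-and-go speed changes dominate the energy budget.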

  5. Quantifying and tuning entanglement for quantum systems

    NASA Astrophysics Data System (ADS)

    Xu, Qing

A 2D Ising model with a transverse field on a triangular lattice is studied using exact diagonalization. The quantum entanglement of the system is quantified by the entanglement of formation. The ground state property of the system is studied, and the quantified entanglement is shown to be closely related to the ground state wavefunction, while the singularity in the entanglement as a function of the transverse field is a reasonable indicator of the quantum phase transition. To tune the entanglement, one can either include an impurity of tunable strength in the otherwise homogeneous system or vary the external transverse field. The latter kind of tuning involves complicated dynamical properties of the system. From the study of the dynamics on a comparatively smaller system, we provide ways to tune the entanglement without triggering any decoherence. The finite temperature effect is also discussed. In addition to the physical results above, we describe the realization of the trace-minimization method in our system and argue for the scalability of this method to larger systems.

  6. Gun Shows and Gun Violence: Fatally Flawed Study Yields Misleading Results

    PubMed Central

    Hemenway, David; Webster, Daniel; Pierce, Glenn; Braga, Anthony A.

    2010-01-01

    A widely publicized but unpublished study of the relationship between gun shows and gun violence is being cited in debates about the regulation of gun shows and gun commerce. We believe the study is fatally flawed. A working paper entitled “The Effect of Gun Shows on Gun-Related Deaths: Evidence from California and Texas” outlined this study, which found no association between gun shows and gun-related deaths. We believe the study reflects a limited understanding of gun shows and gun markets and is not statistically powered to detect even an implausibly large effect of gun shows on gun violence. In addition, the research contains serious ascertainment and classification errors, produces results that are sensitive to minor specification changes in key variables and in some cases have no face validity, and is contradicted by 1 of its own authors’ prior research. The study should not be used as evidence in formulating gun policy. PMID:20724672

  7. Gun shows and gun violence: fatally flawed study yields misleading results.

    PubMed

    Wintemute, Garen J; Hemenway, David; Webster, Daniel; Pierce, Glenn; Braga, Anthony A

    2010-10-01

    A widely publicized but unpublished study of the relationship between gun shows and gun violence is being cited in debates about the regulation of gun shows and gun commerce. We believe the study is fatally flawed. A working paper entitled "The Effect of Gun Shows on Gun-Related Deaths: Evidence from California and Texas" outlined this study, which found no association between gun shows and gun-related deaths. We believe the study reflects a limited understanding of gun shows and gun markets and is not statistically powered to detect even an implausibly large effect of gun shows on gun violence. In addition, the research contains serious ascertainment and classification errors, produces results that are sensitive to minor specification changes in key variables and in some cases have no face validity, and is contradicted by 1 of its own authors' prior research. The study should not be used as evidence in formulating gun policy.

  8. Quantifying resilience

    USGS Publications Warehouse

    Allen, Craig R.; Angeler, David G.

    2016-01-01

    Several frameworks to operationalize resilience have been proposed. A decade ago, a special feature focused on quantifying resilience was published in the journal Ecosystems (Carpenter, Westley & Turner 2005). The approach there was towards identifying surrogates of resilience, but few of the papers proposed quantifiable metrics. Consequently, many ecological resilience frameworks remain vague and difficult to quantify, a problem that this special feature aims to address. However, considerable progress has been made during the last decade (e.g. Pope, Allen & Angeler 2014). Although some argue that resilience is best kept as an unquantifiable, vague concept (Quinlan et al. 2016), to be useful for managers, there must be concrete guidance regarding how and what to manage and how to measure success (Garmestani, Allen & Benson 2013; Spears et al. 2015). Ideas such as ‘resilience thinking’ have utility in helping stakeholders conceptualize their systems, but provide little guidance on how to make resilience useful for ecosystem management, other than suggesting an ambiguous, Goldilocks approach of being just right (e.g. diverse, but not too diverse; connected, but not too connected). Here, we clarify some prominent resilience terms and concepts, introduce and synthesize the papers in this special feature on quantifying resilience and identify core unanswered questions related to resilience.

  9. A stochastic approach for quantifying immigrant integration: the Spanish test case

    NASA Astrophysics Data System (ADS)

    Agliari, Elena; Barra, Adriano; Contucci, Pierluigi; Sandell, Richard; Vernia, Cecilia

    2014-10-01

    We apply stochastic process theory to the analysis of immigrant integration. Using a unique and detailed data set from Spain, we study the relationship between local immigrant density and two social and two economic immigration quantifiers for the period 1999-2010. As opposed to the classic time-series approach, by letting immigrant density play the role of ‘time’ and the quantifier the role of ‘space,’ it becomes possible to analyse the behavior of the quantifiers by means of continuous time random walks. Two classes of results are then obtained. First, we show that social integration quantifiers evolve following diffusion law, while the evolution of economic quantifiers exhibits ballistic dynamics. Second, we make predictions of best- and worst-case scenarios taking into account large local fluctuations. Our stochastic process approach to integration lends itself to interesting forecasting scenarios which, in the hands of policy makers, have the potential to improve political responses to integration problems. For instance, estimating the standard first-passage time and maximum-span walk reveals local differences in integration performance for different immigration scenarios. Thus, by recognizing the importance of local fluctuations around national means, this research constitutes an important tool to assess the impact of immigration phenomena on municipal budgets and to set up solid multi-ethnic plans at the municipal level as immigration pressures build.
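The diffusive-versus-ballistic distinction the abstract relies on can be illustrated with toy processes whose variance grows as t (diffusion) versus t² (ballistic); the walkers below are stand-ins, not the Spanish integration quantifiers:

```python
import numpy as np

rng = np.random.default_rng(3)
T, n = 1000, 2000
t = np.arange(1, T + 1)

# diffusive: ordinary random walk, variance ~ t
diffusive = np.cumsum(rng.choice([-1.0, 1.0], size=(n, T)), axis=1)
# ballistic: fixed random velocity per walker, variance ~ t^2
ballistic = rng.normal(size=(n, 1)) * t

for name, x in [("diffusive", diffusive), ("ballistic", ballistic)]:
    var = x.var(axis=0)
    slope = np.polyfit(np.log(t[10:]), np.log(var[10:]), 1)[0]
    print(f"{name}: variance ~ t^{slope:.2f}")   # ~1 vs ~2
```

In the paper's framing, immigrant density plays the role of t, so fitting this scaling exponent to a quantifier's spread is what distinguishes the social (diffusive) from the economic (ballistic) indicators.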

  10. Quantifying Cancer Risk from Radiation.

    PubMed

    Keil, Alexander P; Richardson, David B

    2017-12-06

    Complex statistical models fitted to data from studies of atomic bomb survivors are used to estimate the human health effects of ionizing radiation exposures. We describe and illustrate an approach to estimate population risks from ionizing radiation exposure that relaxes many assumptions about radiation-related mortality. The approach draws on developments in methods for causal inference. The results offer a different way to quantify radiation's effects and show that conventional estimates of the population burden of excess cancer at high radiation doses are driven strongly by projecting outside the range of current data. Summary results obtained using the proposed approach are similar in magnitude to those obtained using conventional methods, although estimates of radiation-related excess cancers differ for many age, sex, and dose groups. At low doses relevant to typical exposures, the strength of evidence in data is surprisingly weak. Statements regarding human health effects at low doses rely strongly on the use of modeling assumptions. © 2017 Society for Risk Analysis.

  11. USGS Regional Groundwater Availability Studies: Quantifying Aquifer Response

    NASA Astrophysics Data System (ADS)

    Reeves, H. W.

    2017-12-01

    The U.S. Geological Survey (USGS) identified six challenges in determining groundwater availability: 1) limited direct measurement, 2) varying response times for different systems, 3) varying spatial scales for different availability questions and aquifer systems, 4) varying tolerance to changes in water levels or outflows, 5) redistribution of stresses and potential return-flow of water pumped from the system, and 6) varying chemical quality of groundwater and the role of quality in determining suitability for different uses. USGS Regional groundwater availability studies are designed to address these challenges. USGS regional groundwater availability studies focus on quantifying the groundwater budget for principal aquifers and determining how this budget has changed in response to pumping or variations in climate. This focus requires relating limited measurements to a quantitative understanding of the temporal and spatial response of regional aquifers. For most principal aquifer studies, aquifer response is quantified using regional groundwater flow models, and USGS regional groundwater availability studies have provided test cases for the development and application of advanced modeling techniques and methods. Results from regional studies from the Lake Michigan Basin and Northern Atlantic Coastal Plain illustrate how different parts of these systems respond differently to pumping with some areas showing large drawdowns and others having much less drawdown but greater capture of discharge. The Central Valley and Mississippi Embayment studies show how extensive pumping and transfer of water have resulted in much more groundwater moving through the aquifer system under current conditions compared to pre-development. These and other results from regional studies will be explored to illustrate how regional groundwater availability and related studies address the six challenges to determining groundwater availability.

  12. Quantifying capital goods for biological treatment of organic waste.

    PubMed

    Brogaard, Line K; Petersen, Per H; Nielsen, Peter D; Christensen, Thomas H

    2015-02-01

    Materials and energy used for construction of anaerobic digestion (AD) and windrow composting plants were quantified in detail. The two technologies were quantified in collaboration with consultants and producers of the parts used to construct the plants. The composting plants were quantified based on the different sizes for the three different types of waste (garden and park waste, food waste and sludge from wastewater treatment) in amounts of 10,000 or 50,000 tonnes per year. The AD plant was quantified for a capacity of 80,000 tonnes per year. Concrete and steel for the tanks were the main materials for the AD plant. For the composting plants, gravel and concrete slabs for the pavement were used in large amounts. To frame the quantification, environmental impact assessments (EIAs) showed that the steel used for tanks at the AD plant and the concrete slabs at the composting plants made the highest contribution to Global Warming. The total impact on Global Warming from the capital goods compared to the operation reported in the literature on the AD plant showed an insignificant contribution of 1-2%. For the composting plants, the capital goods accounted for 10-22% of the total impact on Global Warming from composting. © The Author(s) 2015.

  13. Quantifying renewable groundwater stress with GRACE

    NASA Astrophysics Data System (ADS)

    Richey, Alexandra S.; Thomas, Brian F.; Lo, Min-Hui; Reager, John T.; Famiglietti, James S.; Voss, Katalyn; Swenson, Sean; Rodell, Matthew

    2015-07-01

    Groundwater is an increasingly important water supply source globally. Understanding the amount of groundwater used versus the volume available is crucial to evaluate future water availability. We present a groundwater stress assessment to quantify the relationship between groundwater use and availability in the world's 37 largest aquifer systems. We quantify stress according to a ratio of groundwater use to availability, which we call the Renewable Groundwater Stress ratio. The impact of quantifying groundwater use based on nationally reported groundwater withdrawal statistics is compared to a novel approach to quantify use based on remote sensing observations from the Gravity Recovery and Climate Experiment (GRACE) satellite mission. Four characteristic stress regimes are defined: Overstressed, Variable Stress, Human-dominated Stress, and Unstressed. The regimes are a function of the sign of use (positive or negative) and the sign of groundwater availability, defined as mean annual recharge. The ability to mitigate and adapt to stressed conditions, where use exceeds sustainable water availability, is a function of economic capacity and land use patterns. Therefore, we qualitatively explore the relationship between stress and anthropogenic biomes. We find that estimates of groundwater stress based on withdrawal statistics are unable to capture the range of characteristic stress regimes, especially in regions dominated by sparsely populated biome types with limited cropland. GRACE-based estimates of use and stress can holistically quantify the impact of groundwater use on stress, resulting in both greater magnitudes of stress and more variability of stress between regions.
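A minimal sketch of the headline ratio; the units, example numbers and error handling are our assumptions (the paper additionally classifies four regimes from the signs of use and recharge):

```python
def renewable_groundwater_stress(use, recharge):
    """Renewable Groundwater Stress ratio = groundwater use / availability,
    with availability defined as mean annual recharge.
    Both arguments in the same volumetric units (e.g. km^3/yr); the example
    values below are illustrative, not GRACE results."""
    if recharge == 0:
        raise ValueError("recharge must be nonzero")
    return use / recharge

# e.g., GRACE-style depletion of 2.0 km^3/yr against 1.5 km^3/yr recharge
print(renewable_groundwater_stress(2.0, 1.5))   # > 1: use exceeds recharge
```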

  14. Quantifying renewable groundwater stress with GRACE

    PubMed Central

    Richey, Alexandra S.; Thomas, Brian F.; Lo, Min‐Hui; Reager, John T.; Voss, Katalyn; Swenson, Sean; Rodell, Matthew

    2015-01-01

Groundwater is an increasingly important water supply source globally. Understanding the amount of groundwater used versus the volume available is crucial to evaluate future water availability. We present a groundwater stress assessment to quantify the relationship between groundwater use and availability in the world's 37 largest aquifer systems. We quantify stress according to a ratio of groundwater use to availability, which we call the Renewable Groundwater Stress ratio. The impact of quantifying groundwater use based on nationally reported groundwater withdrawal statistics is compared to a novel approach to quantify use based on remote sensing observations from the Gravity Recovery and Climate Experiment (GRACE) satellite mission. Four characteristic stress regimes are defined: Overstressed, Variable Stress, Human-dominated Stress, and Unstressed. The regimes are a function of the sign of use (positive or negative) and the sign of groundwater availability, defined as mean annual recharge. The ability to mitigate and adapt to stressed conditions, where use exceeds sustainable water availability, is a function of economic capacity and land use patterns. Therefore, we qualitatively explore the relationship between stress and anthropogenic biomes. We find that estimates of groundwater stress based on withdrawal statistics are unable to capture the range of characteristic stress regimes, especially in regions dominated by sparsely populated biome types with limited cropland. GRACE-based estimates of use and stress can holistically quantify the impact of groundwater use on stress, resulting in both greater magnitudes of stress and more variability of stress between regions. PMID:26900185

  15. Quantifying phase synchronization using instances of Hilbert phase slips

    NASA Astrophysics Data System (ADS)

    Govindan, R. B.

    2018-07-01

We propose to quantify phase synchronization between two signals, x(t) and y(t), by calculating the variance in the Hilbert phase of y(t) at instances of phase slips exhibited by x(t). The proposed approach is tested on numerically simulated coupled chaotic Roessler systems and second-order autoregressive processes. Furthermore, we compare the performance of the proposed and original approaches using uterine electromyogram signals and show that both approaches yield consistent results. A standard phase synchronization approach, which involves unwrapping the Hilbert phases (ϕ1(t) and ϕ2(t)) of the two signals and analyzing the variance in |n·ϕ1(t) − m·ϕ2(t)| mod 2π (n and m are integers), was used for comparison. The synchronization indexes obtained from the proposed approach and the standard approach agree reasonably well in all of the systems studied in this work. Our results indicate that the proposed approach, unlike the traditional approach, does not require the non-invertible transformations (unwrapping of the phases and calculation of mod 2π) and can be used reliably to quantify phase synchrony between two signals.
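A sketch of the idea, assuming a "phase slip" means the instant the wrapped Hilbert phase of x(t) jumps from +π back to −π; synchrony is summarized here by the mean resultant length (1 minus the circular variance), which is our simplification rather than the authors' exact statistic:

```python
import numpy as np
from scipy.signal import hilbert

def sync_index(x, y):
    """Phase synchrony of y sampled at the phase-slip instants of x."""
    phx = np.angle(hilbert(x))
    phy = np.angle(hilbert(y))
    slips = np.where(np.diff(phx) < -np.pi)[0]   # wrapped-phase 2*pi jumps
    # mean resultant length: 1 = perfectly locked, ~0 = unsynchronized
    return np.abs(np.mean(np.exp(1j * phy[slips])))

t = np.linspace(0, 100, 20000)
x = np.sin(2 * np.pi * 1.0 * t)
y_locked = np.sin(2 * np.pi * 1.0 * t + 0.8)     # same frequency, fixed offset
y_free = np.sin(2 * np.pi * 1.37 * t)            # incommensurate frequency
print(sync_index(x, y_locked), sync_index(x, y_free))  # ~1 vs ~0
```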

  16. Quantifying MLI Thermal Conduction in Cryogenic Applications from Experimental Data

    NASA Astrophysics Data System (ADS)

    Ross, R. G., Jr.

    2015-12-01

    Multilayer Insulation (MLI) uses stacks of low-emittance metalized sheets combined with low-conduction spacer features to greatly reduce the heat transfer to cryogenic applications from higher temperature surrounds. However, as the hot-side temperature decreases from room temperature to cryogenic temperatures, the level of radiant heat transfer drops as the fourth power of the temperature, while the heat transfer by conduction only falls off linearly. This results in cryogenic MLI being dominated by conduction, a quantity that is extremely sensitive to MLI blanket construction and very poorly quantified in the literature. To develop useful quantitative data on cryogenic blanket conduction, multilayer nonlinear heat transfer models are used to analyze extensive heat transfer data measured by Lockheed Palo Alto on their cryogenic dewar MLI and measured by JPL on their spacecraft MLI. The data-fitting aspect of the modeling allows the radiative and conductive thermal properties of the tested blankets to be explicitly quantified. Results are presented showing that MLI conductance varies by a factor of 600 between spacecraft MLI and Lockheed's best cryogenic MLI.

  17. Showing Value in Newborn Screening: Challenges in Quantifying the Effectiveness and Cost-Effectiveness of Early Detection of Phenylketonuria and Cystic Fibrosis

    PubMed Central

    Grosse, Scott D.

    2015-01-01

    Decision makers sometimes request information on the cost savings, cost-effectiveness, or cost-benefit of public health programs. In practice, quantifying the health and economic benefits of population-level screening programs such as newborn screening (NBS) is challenging. It requires that one specify the frequencies of health outcomes and events, such as hospitalizations, for a cohort of children with a given condition under two different scenarios—with or without NBS. Such analyses also assume that everything else, including treatments, is the same between groups. Lack of comparable data for representative screened and unscreened cohorts that are exposed to the same treatments following diagnosis can result in either under- or over-statement of differences. Accordingly, the benefits of early detection may be understated or overstated. This paper illustrates these common problems through a review of past economic evaluations of screening for two historically significant conditions, phenylketonuria and cystic fibrosis. In both examples qualitative judgments about the value of prompt identification and early treatment to an affected child were more influential than specific numerical estimates of lives or costs saved. PMID:26702401

  18. Quantifier Comprehension in Corticobasal Degeneration

    ERIC Educational Resources Information Center

    McMillan, Corey T.; Clark, Robin; Moore, Peachie; Grossman, Murray

    2006-01-01

    In this study, we investigated patients with focal neurodegenerative diseases to examine a formal linguistic distinction between classes of generalized quantifiers, like "some X" and "less than half of X." Our model of quantifier comprehension proposes that number knowledge is required to understand both first-order and higher-order quantifiers.…

  19. Quantifying uncertainties in precipitation measurement

    NASA Astrophysics Data System (ADS)

    Chen, H. Z. D.

    2017-12-01

The scientific community has a long history of utilizing precipitation data for climate model design. However, the precipitation record and its models contain more uncertainty than their temperature counterparts. Literature research has shown precipitation measurements to be highly influenced by their surrounding environment, and weather stations are traditionally situated in open areas and subject to various limitations. As a result, this restriction limits the ability of the scientific community to fully close the loop on the water cycle. Horizontal redistribution has been shown to be a major factor influencing precipitation measurements. Efforts have been placed on reducing its effect on the monitoring apparatus. However, the factors contributing to this uncertainty are numerous and difficult to fully capture, so the noise remains high in precipitation data. This study aims to quantify the uncertainties in precipitation data by factoring out horizontal redistribution and measuring it directly. The horizontal contribution to precipitation will be quantified by measuring precipitation at different heights, with one gauge directly shadowing the other. The upper collection represents traditional precipitation data, whereas the bottom measurements sum up the overall error term at a given location. Measurements will be recorded and correlated with the nearest available wind measurements to quantify their impact on the traditional precipitation record. Collections at different locations will also be compared to see whether this phenomenon is location specific or whether a general trend can be derived. We aim to demonstrate a new way to isolate the noise component in traditional precipitation data via empirical measurements and, by doing so, improve the overall quality of the historic precipitation record, providing more accurate information for the design and calibration of large-scale climate models.

  20. Properties and relative measure for quantifying quantum synchronization

    NASA Astrophysics Data System (ADS)

    Li, Wenlin; Zhang, Wenzhao; Li, Chong; Song, Heshan

    2017-07-01

    Although quantum synchronization phenomena and corresponding measures have been widely discussed recently, it is still an open question how to characterize directly the influence of nonlocal correlation, which is the key distinction for identifying classical and quantum synchronizations. In this paper, we present basic postulates for quantifying quantum synchronization based on the related theory in Mari's work [Phys. Rev. Lett. 111, 103605 (2013), 10.1103/PhysRevLett.111.103605], and we give a general formula of a quantum synchronization measure with clear physical interpretations. By introducing Pearson's parameter, we show that the obvious characteristics of our measure are the relativity and monotonicity. As an example, the measure is applied to describe synchronization among quantum optomechanical systems under a Markovian bath. We also show the potential by quantifying generalized synchronization and discrete variable synchronization with this measure.

  1. Quantifying distinct associations on different temporal scales: comparison of DCCA and Pearson methods

    NASA Astrophysics Data System (ADS)

    Piao, Lin; Fu, Zuntao

    2016-11-01

Cross-correlation between pairs of variables has a multi-time-scale character, and it can be totally different on different time scales (changing from positive correlation to negative), e.g., the associations between mean air temperature and relative humidity over regions to the east of the Taihang mountains in China. Therefore, how to correctly unveil these correlations on different time scales is of great importance, since we do not know in advance whether the correlation varies with scale. Here, we compare two methods, Detrended Cross-Correlation Analysis (DCCA for short) and Pearson correlation, in quantifying scale-dependent correlations directly from raw observed records and artificially generated sequences with known cross-correlation features. Studies show that 1) DCCA-related methods can indeed quantify scale-dependent correlations, but the Pearson method cannot; 2) the correlation features from DCCA-related methods are robust to contaminating noise, whereas the results from the Pearson method are sensitive to noise; 3) the scale-dependent correlation results from DCCA-related methods are robust to the amplitude ratio between slow and fast components, while the Pearson method may be sensitive to the amplitude ratio. All these features indicate that DCCA-related methods have advantages in correctly quantifying scale-dependent correlations, which result from different physical processes.
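A minimal implementation of the scale-dependent DCCA cross-correlation coefficient (Zebende's ρ_DCCA, the quantity such comparisons rest on), applied to a toy pair of series that share a slow component but carry anticorrelated fast noise, so the sign of the correlation flips with scale:

```python
import numpy as np

def rho_dcca(x, y, s):
    """DCCA cross-correlation coefficient at window scale s
    (non-overlapping windows, linear detrending; a plain sketch)."""
    X, Y = np.cumsum(x - x.mean()), np.cumsum(y - y.mean())   # profiles
    t = np.arange(s)
    f2x = f2y = f2xy = 0.0
    for k in range(len(X) // s):
        xs, ys = X[k*s:(k+1)*s], Y[k*s:(k+1)*s]
        rx = xs - np.polyval(np.polyfit(t, xs, 1), t)          # detrend window
        ry = ys - np.polyval(np.polyfit(t, ys, 1), t)
        f2x += (rx**2).mean(); f2y += (ry**2).mean(); f2xy += (rx*ry).mean()
    return f2xy / np.sqrt(f2x * f2y)

rng = np.random.default_rng(5)
slow = np.cumsum(rng.normal(size=4000)) * 0.05   # shared slow component
fast = rng.normal(size=4000)                     # anticorrelated fast noise
x, y = slow + fast, slow - fast
print(rho_dcca(x, y, 8), rho_dcca(x, y, 512))    # negative, then positive
```

A single Pearson coefficient on x and y would average these two regimes away, which is the paper's point.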

  2. Quantifying distinct associations on different temporal scales: comparison of DCCA and Pearson methods.

    PubMed

    Piao, Lin; Fu, Zuntao

    2016-11-09

    Cross-correlation between pairs of variables has a multi-time-scale character, and it can be entirely different on different time scales (e.g., changing from positive correlation to negative), as with the associations between mean air temperature and relative humidity over regions to the east of the Taihang Mountains in China. Correctly unveiling these correlations on different time scales is therefore of great importance, since we generally do not know in advance whether the correlation varies with scale. Here, we compare two methods, Detrended Cross-Correlation Analysis (DCCA) and Pearson correlation, in quantifying scale-dependent correlations, applying both directly to raw observed records and to artificially generated sequences with known cross-correlation features. The studies show that 1) DCCA-related methods can indeed quantify scale-dependent correlations, but the Pearson method cannot; 2) the correlation features from DCCA-related methods are robust to contaminating noise, whereas the results from the Pearson method are sensitive to noise; 3) the scale-dependent correlation results from DCCA-related methods are robust to the amplitude ratio between slow and fast components, while the Pearson method may be sensitive to it. All these features indicate that DCCA-related methods have advantages in correctly quantifying scale-dependent correlations arising from different physical processes.
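
    A compact sketch of the two estimators compared in the two entries above, applied to a synthetic pair of series: the DCCA cross-correlation coefficient rho_DCCA(s) follows the standard detrended-covariance construction (integrate, window at scale s, detrend linearly, average residual covariances), while the Pearson correlation is blind to scale. Details such as overlapping windows differ across DCCA variants.

      # Sketch: scale-dependent correlation via DCCA vs. a single Pearson r.
      import numpy as np

      def detrended_cov(x, y, s):
          """Mean covariance of linearly detrended profiles in windows of size s."""
          X, Y = np.cumsum(x - x.mean()), np.cumsum(y - y.mean())
          t = np.arange(s)
          covs = []
          for i in range(0, len(X) - s + 1, s):      # non-overlapping windows
              rx = X[i:i+s] - np.polyval(np.polyfit(t, X[i:i+s], 1), t)
              ry = Y[i:i+s] - np.polyval(np.polyfit(t, Y[i:i+s], 1), t)
              covs.append(np.mean(rx * ry))
          return np.mean(covs)

      def rho_dcca(x, y, s):
          """DCCA cross-correlation coefficient at scale s."""
          return detrended_cov(x, y, s) / np.sqrt(
              detrended_cov(x, x, s) * detrended_cov(y, y, s))

      rng = np.random.default_rng(1)
      n = 4096
      slow = np.cumsum(rng.normal(size=n))       # slow component (random walk)
      slow /= slow.std()
      fast = rng.normal(size=n)                  # fast component (white noise)
      x = slow + fast                            # coupled positively via `slow`
      y = slow - fast                            # and negatively via `fast`

      print("Pearson (one number for all scales):", round(np.corrcoef(x, y)[0, 1], 2))
      for s in (8, 64, 512):
          print(f"rho_DCCA at scale {s}: {rho_dcca(x, y, s):.2f}")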

  3. Diesel Emissions Quantifier (DEQ)

    EPA Pesticide Factsheets

    The Diesel Emissions Quantifier (Quantifier) is an interactive tool to estimate emission reductions and cost effectiveness. Publications: EPA-420-F-13-008a (420f13008a), EPA-420-B-10-035 (420b10023), EPA-420-B-10-034 (420b10034)

  4. Working Memory Mechanism in Proportional Quantifier Verification

    ERIC Educational Resources Information Center

    Zajenkowski, Marcin; Szymanik, Jakub; Garraffa, Maria

    2014-01-01

    The paper explores the cognitive mechanisms involved in the verification of sentences with proportional quantifiers (e.g. "More than half of the dots are blue"). The first study shows that the verification of proportional sentences is more demanding than the verification of sentences such as: "There are seven blue and eight yellow…

  5. Quantifying Transmission.

    PubMed

    Woolhouse, Mark

    2017-07-01

    Transmissibility is the defining characteristic of infectious diseases. Quantifying transmission matters for understanding infectious disease epidemiology and designing evidence-based disease control programs. Tracing individual transmission events can be achieved by epidemiological investigation coupled with pathogen typing or genome sequencing. Individual infectiousness can be estimated by measuring pathogen loads, but few studies have directly estimated the ability of infected hosts to transmit to uninfected hosts. Individuals' opportunities to transmit infection are dependent on behavioral and other risk factors relevant given the transmission route of the pathogen concerned. Transmission at the population level can be quantified through knowledge of risk factors in the population or phylogeographic analysis of pathogen sequence data. Mathematical model-based approaches require estimation of the per capita transmission rate and basic reproduction number, obtained by fitting models to case data and/or analysis of pathogen sequence data. Heterogeneities in infectiousness, contact behavior, and susceptibility can have substantial effects on the epidemiology of an infectious disease, so estimates of only mean values may be insufficient. For some pathogens, super-shedders (infected individuals who are highly infectious) and super-spreaders (individuals with more opportunities to transmit infection) may be important. Future work on quantifying transmission should involve integrated analyses of multiple data sources.
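
    One concrete instance of the model-based quantification mentioned above: under a simple SIR-type assumption, the basic reproduction number can be read off the early exponential growth rate of case counts as R0 ~ 1 + r*Tg, with Tg the mean generation time. A sketch with hypothetical counts:

      # Sketch: estimate R0 from early epidemic growth (hypothetical counts).
      import numpy as np

      cases = np.array([3, 4, 7, 10, 14, 22, 31, 45, 66, 95])  # daily cases
      days = np.arange(len(cases))

      # Log-linear fit: log(cases) = log(c0) + r * t
      r, log_c0 = np.polyfit(days, np.log(cases), 1)

      generation_time = 5.0            # mean generation time in days (assumed)
      R0 = 1.0 + r * generation_time   # SIR-type approximation
      print(f"growth rate r = {r:.3f}/day, R0 ~= {R0:.2f}")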

  6. Evaluation of two methods for quantifying passeriform lice

    PubMed Central

    Koop, Jennifer A. H.; Clayton, Dale H.

    2013-01-01

    Two methods commonly used to quantify ectoparasites on live birds are visual examination and dust-ruffling. Visual examination provides an estimate of ectoparasite abundance based on an observer’s timed inspection of various body regions on a bird. Dust-ruffling involves application of insecticidal powder to feathers that are then ruffled to dislodge ectoparasites onto a collection surface where they can then be counted. Despite the common use of these methods in the field, the proportion of actual ectoparasites they account for has only been tested with Rock Pigeons (Columba livia), a relatively large-bodied species (238–302 g) with dense plumage. We tested the accuracy of the two methods using European Starlings (Sturnus vulgaris; ~75 g). We first quantified the number of lice (Brueelia nebulosa) on starlings using visual examination, followed immediately by dust-ruffling. Birds were then euthanized and the proportion of lice accounted for by each method was compared to the total number of lice on each bird as determined with a body-washing method. Visual examination and dust-ruffling each accounted for a relatively small proportion of total lice (14% and 16%, respectively), but both were still significant predictors of abundance. The number of lice observed by visual examination accounted for 68% of the variation in total abundance. Similarly, the number of lice recovered by dust-ruffling accounted for 72% of the variation in total abundance. Our results show that both methods can be used to reliably quantify the abundance of lice on European Starlings and other similar-sized passerines. PMID:24039328
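
    The variance-explained figures above come from ordinary linear regression; a sketch of that calculation on synthetic counts (proportions chosen to mimic the reported 14% visual recovery rate):

      # Sketch: variance in total lice explained by a field-method count.
      import numpy as np

      rng = np.random.default_rng(2)
      total = rng.poisson(100, 30)            # body-wash totals (hypothetical)
      visual = rng.binomial(total, 0.14)      # ~14% of lice seen visually

      slope, intercept = np.polyfit(visual, total, 1)
      predicted = slope * visual + intercept
      r2 = 1 - np.sum((total - predicted) ** 2) / np.sum((total - total.mean()) ** 2)
      print(f"visual counts explain R^2 = {r2:.2f} of total abundance")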

  7. Quantifying Paris CO2 urban dome: a first synthesis of results from the CO2-Megaparis project (2009-2013)

    NASA Astrophysics Data System (ADS)

    Xueref-Remy, Irène; Dieudonné, Elsa; Ammoura, Lamia; Cellier, Pierre; Gibert, Fabien; Lac, Christine; Lauvaux, Thomas; Lopez, Morgan; Pal, Sandip; Perrussel, Olivier; Puygrenier, Vincent; Ramonet, Michel; Schmidt, Martina; Thiruchittampalam, Balendra; Vuillemin, Cyrille

    2013-04-01

    About 80% of global CO2 emissions come from point sources such as megacities. Among these, Paris is the third-largest megacity in Europe. However, estimates of urban CO2 emissions are based on activity proxies and benchmarked emission factors, leading to uncertainties as high as several tens of percent in some sectors of bottom-up CO2 emission inventories. Since 2009, the CO2-Megaparis project has aimed to quantify CO2 emissions from Paris using a top-down approach based on a synergy between atmospheric observations and modeling. A mini-network of 3 stations was developed by LSCE in the Paris agglomeration within the infrastructure of AIRPARIF, the air quality monitoring agency of the Paris region, complementing 2 other stations from the ICOS network led by LSCE. The mean mid-afternoon CO2 concentration dome over Paris, computed over 1 year of data, is about 2.2 ppm and is strongly dependent on wind speed and direction. An analysis of correlations between CO2, CO and 14CO2 was carried out, and a comparison to available inventories will be presented. Direct modeling of CO2 at very fine resolution (2x2 km2, 1 h) was performed by CNRM and matched the observations well. Results from inverse modeling will be presented. Furthermore, we conducted a campaign using lidar facilities showing that, due to the urban heat island effect, the boundary layer height (a key parameter in assessing CO2 fluxes from the atmospheric approach) is 10 to 40% higher in Paris than in surrounding rural areas. A sonic anemometer and a 10 Hz CO2 analyzer were also deployed to assess CO2 fluxes from observations, as were CO2 flux analyzers on crops. Using the data from this instrumentation, a mass balance calculation was carried out that allowed the identification and quantification of the Paris CO2 traffic plume out to a rural region about 100 km south of Paris, which matched well with inventories. Finally, an attempt to define the strengths and weaknesses of the atmospheric approach to quantify urban CO2

  8. Quantifying errors in trace species transport modeling.

    PubMed

    Prather, Michael J; Zhu, Xin; Strahan, Susan E; Steenrod, Stephen D; Rodriguez, Jose M

    2008-12-16

    One expectation when computationally solving an Earth system model is that a correct answer exists, that with adequate physical approximations and numerical methods our solutions will converge to that single answer. With such hubris, we performed a controlled numerical test of the atmospheric transport of CO2 using 2 models known for accurate transport of trace species. Resulting differences were unexpectedly large, indicating that in some cases, scientific conclusions may err because of lack of knowledge of the numerical errors in tracer transport models. By doubling the resolution, thereby reducing numerical error, both models show some convergence to the same answer. Now, under realistic conditions, we identify a practical approach for finding the correct answer and thus quantifying the advection error.

  9. Energetic arousal and language: predictions from the computational theory of quantifiers processing.

    PubMed

    Zajenkowski, Marcin

    2013-10-01

    The author examines the relationship between energetic arousal (EA) and the processing of sentences containing natural-language quantifiers. Previous studies and theories have shown that energy may differentially affect various cognitive functions. Recent investigations devoted to quantifiers strongly support the theory that various types of quantifiers involve different cognitive functions in the sentence-picture verification task. In the present study, 201 students were presented with a sentence-picture verification task consisting of simple propositions containing a quantifier that referred to the color of a car on display. Color pictures of cars accompanied the propositions. In addition, the level of participants' EA was measured before and after the verification task. It was found that EA and performance on proportional quantifiers (e.g., "More than half of the cars are red") are in an inverted U-shaped relationship. This result may be explained by the fact that proportional sentences engage working memory to a high degree, and previous models of EA-cognition associations have been based on the assumption that tasks that require parallel attentional and memory processes are best performed when energy is moderate. The research described in the present article has several applications, as it shows the optimal human conditions for verbal comprehension. For instance, it may be important in workplace design to control the level of arousal experienced by office staff when work is mostly related to the processing of complex texts. Energy level may be influenced by many factors, such as noise, time of day, or thermal conditions.

  10. The Fallacy of Quantifying Risk

    DTIC Science & Technology

    2012-09-01

    Defense AT&L, September–October 2012, p. 18. The Fallacy of Quantifying Risk. David E. Frick, Ph.D. Frick is a 35-year veteran of the Department of... a key to risk analysis was "choosing the right technique" of quantifying risk. The weakness in this argument stems not from the assertion that one... of information about the enemy), yet achieving great outcomes. Attempts at quantifying risk are not, in and of themselves, objectionable. Prudence

  11. Mountain torrents: Quantifying vulnerability and assessing uncertainties

    PubMed Central

    Totschnig, Reinhold; Fuchs, Sven

    2013-01-01

    Vulnerability assessment for elements at risk is an important component in the framework of risk assessment. The vulnerability of buildings affected by torrent processes can be quantified by vulnerability functions that express a mathematical relationship between the degree of loss of individual elements at risk and the intensity of the impacting process. Based on data from the Austrian Alps, we extended a vulnerability curve for residential buildings affected by fluvial sediment transport processes to other torrent processes and other building types. To merge data based on different processes and building types, several statistical tests were conducted. The calculation of vulnerability functions was based on a nonlinear regression approach applying cumulative distribution functions. The results suggest that there is no need to distinguish between different sediment-laden torrent processes when assessing the vulnerability of residential buildings to torrent processes. The final vulnerability functions were further validated with data from the Italian Alps and different vulnerability functions presented in the literature; this comparison showed the wider applicability of the derived functions. The uncertainty inherent in the regression functions was quantified by calculating confidence bands. The derived vulnerability functions may be applied within the framework of risk management for mountain hazards in the European Alps, and the method is transferable to other mountain regions if the required input data are available. PMID:27087696
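
    A sketch of the regression approach described above, on hypothetical intensity-loss pairs: a Weibull-type cumulative distribution function is fitted by nonlinear least squares and a rough confidence band is propagated from the parameter covariance (the paper's exact functional form and band construction may differ).

      # Sketch: fit a CDF-shaped vulnerability function plus a rough 95% band.
      import numpy as np
      from scipy.optimize import curve_fit

      def vulnerability(intensity, scale, shape):
          """Weibull CDF: degree of loss (0..1) vs. process intensity."""
          return 1.0 - np.exp(-(intensity / scale) ** shape)

      rng = np.random.default_rng(3)
      intensity = rng.uniform(0.1, 3.0, 40)        # e.g., deposit depth in m
      loss = np.clip(vulnerability(intensity, 1.5, 2.0)
                     + rng.normal(0, 0.05, intensity.size), 0, 1)

      params, cov = curve_fit(vulnerability, intensity, loss, p0=[1.0, 1.0])
      grid = np.linspace(0.1, 3.0, 100)

      # Crude confidence band: sample parameters from their covariance.
      draws = np.clip(rng.multivariate_normal(params, cov, 500), 1e-6, None)
      band = np.percentile([vulnerability(grid, *d) for d in draws],
                           [2.5, 97.5], axis=0)
      print("fitted scale/shape:", params.round(2))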

  12. COMPLEXITY & APPROXIMABILITY OF QUANTIFIED & STOCHASTIC CONSTRAINT SATISFACTION PROBLEMS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hunt, H. B.; Marathe, M. V.; Stearns, R. E.

    2001-01-01

    Let D be an arbitrary (not necessarily finite) nonempty set, let C be a finite set of constant symbols denoting arbitrary elements of D, and let S and T be arbitrary finite sets of finite-arity relations on D. We denote the problem of determining the satisfiability of finite conjunctions of relations in S applied to variables (to variables and symbols in C) by SAT(S) (by SATc(S)). Here, we study simultaneously the complexity of decision, counting, maximization and approximate maximization problems, for unquantified, quantified and stochastically quantified formulas. We present simple yet general techniques to characterize simultaneously the complexity or efficient approximability of a number of versions/variants of the problems SAT(S), Q-SAT(S), S-SAT(S), MAX-Q-SAT(S) etc., for many different such D, C, S, T. These versions/variants include decision, counting, maximization and approximate maximization problems, for unquantified, quantified and stochastically quantified formulas. Our unified approach is based on the following two basic concepts: (i) strongly-local replacements/reductions and (ii) relational/algebraic representability. Some of the results extend earlier results in [Pa85, LMP99, CF+93, CF+94]. Our techniques and results reported here also provide significant steps towards obtaining dichotomy theorems for a number of the problems above, including the problems MAX-Q-SAT(S) and MAX-S-SAT(S). The discovery of such dichotomy theorems, for unquantified formulas, has received significant recent attention in the literature [CF+93, CF+94, Cr95, KSW97].

  13. Incremental comprehension of spoken quantifier sentences: Evidence from brain potentials.

    PubMed

    Freunberger, Dominik; Nieuwland, Mante S

    2016-09-01

    Do people incrementally incorporate the meaning of quantifier expressions to understand an unfolding sentence? Most previous studies concluded that quantifiers do not immediately influence how a sentence is understood based on the observation that online N400-effects differed from offline plausibility judgments. Those studies, however, used serial visual presentation (SVP), which involves unnatural reading. In the current ERP-experiment, we presented spoken positive and negative quantifier sentences ("Practically all/practically no postmen prefer delivering mail, when the weather is good/bad during the day"). Different from results obtained in a previously reported SVP-study (Nieuwland, 2016) sentence truth-value N400 effects occurred in positive and negative quantifier sentences alike, reflecting fully incremental quantifier comprehension. This suggests that the prosodic information available during spoken language comprehension supports the generation of online predictions for upcoming words and that, at least for quantifier sentences, comprehension of spoken language may proceed more incrementally than comprehension during SVP reading. Copyright © 2016 The Authors. Published by Elsevier B.V. All rights reserved.

  14. Quantifying cell mono-layer cultures by video imaging.

    PubMed

    Miller, K S; Hook, L A

    1996-04-01

    A method is described in which the relative number of adherent cells in multi-well tissue-culture plates is assayed by staining the cells with Giemsa and capturing the image of the stained cells with a video camera and charge-coupled device. The resultant image is quantified using the associated video imaging software. The method is shown to be sensitive and reproducible and should be useful for studies where quantifying relative cell numbers and/or proliferation in vitro is required.
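
    A present-day minimal sketch of the same idea, assuming a grayscale image in which Giemsa-stained cells are darker than the background; scikit-image's Otsu threshold stands in for the original video-imaging software, and the file name is hypothetical.

      # Sketch: quantify relative adherent-cell coverage from a stained image.
      import numpy as np
      from skimage import io
      from skimage.filters import threshold_otsu

      image = io.imread("well_A1.png", as_gray=True)   # hypothetical file name
      thresh = threshold_otsu(image)
      stained = image < thresh                 # stained cells are darker
      coverage = stained.mean()                # fraction of pixels covered
      print(f"stained area fraction: {coverage:.1%}")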

  15. Quantifying Anthropogenic Stress on Groundwater Resources.

    PubMed

    Ashraf, Batool; AghaKouchak, Amir; Alizadeh, Amin; Mousavi Baygi, Mohammad; R Moftakhari, Hamed; Mirchi, Ali; Anjileli, Hassan; Madani, Kaveh

    2017-10-10

    This study explores a general framework for quantifying anthropogenic influences on groundwater budget based on normalized human outflow (h_out) and inflow (h_in). The framework is useful for sustainability assessment of groundwater systems and allows investigating the effects of different human water abstraction scenarios on the overall aquifer regime (e.g., depleted, natural flow-dominated, and human flow-dominated). We apply this approach to selected regions in the USA, Germany and Iran to evaluate the current aquifer regime. We subsequently present two scenarios of changes in human water withdrawals and return flow to the system (individually and combined). Results show that approximately one-third of the selected aquifers in the USA, and half of the selected aquifers in Iran are dominated by human activities, while the selected aquifers in Germany are natural flow-dominated. The scenario analysis results also show that reduced human withdrawals could help with regime change in some aquifers. For instance, in two of the selected USA aquifers, a decrease in anthropogenic influences by ~20% may change the condition of depleted regime to natural flow-dominated regime. We specifically highlight a trending threat to the sustainability of groundwater in northwest Iran and California, and the need for more careful assessment and monitoring practices as well as strict regulations to mitigate the negative impacts of groundwater overexploitation.

  16. Quantifying Bell nonlocality with the trace distance

    NASA Astrophysics Data System (ADS)

    Brito, S. G. A.; Amaral, B.; Chaves, R.

    2018-02-01

    Measurements performed on distant parts of an entangled quantum state can generate correlations incompatible with classical theories respecting the assumption of local causality. This is the phenomenon known as quantum nonlocality that, apart from its fundamental role, can also be put to practical use in applications such as cryptography and distributed computing. Clearly, developing ways of quantifying nonlocality is an important primitive in this scenario. Here, we propose to quantify the nonlocality of a given probability distribution via its trace distance to the set of classical correlations. We show that this measure is a monotone under the free operations of a resource theory and, furthermore, that it can be computed efficiently with a linear program. We put our framework to use in a variety of relevant Bell scenarios also comparing the trace distance to other standard measures in the literature.
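
    A small sketch of the linear program behind the measure, for the CHSH scenario: minimize the (normalized) L1 distance between a behavior p(a,b|x,y) and mixtures of the 16 local deterministic strategies. The normalization constant used here is one plausible convention; the paper's definition may differ by a factor.

      # Sketch: trace-distance nonlocality of a CHSH behavior p(a,b|x,y) by LP.
      import numpy as np
      from itertools import product
      from scipy.optimize import linprog

      # Behaviors are flat length-16 vectors indexed by (x, y, a, b).
      def flat(p):
          return np.array([p[x][y][a][b]
                           for x, y, a, b in product(range(2), repeat=4)])

      # Columns of D: the 16 local deterministic strategies (a0, a1, b0, b1).
      D = np.array([[1.0 if (a == (a0, a1)[x] and b == (b0, b1)[y]) else 0.0
                     for x, y, a, b in product(range(2), repeat=4)]
                    for a0, a1, b0, b1 in product(range(2), repeat=4)]).T

      def nonlocality(p_vec):
          # Variables z = [q (16 mixture weights), t (16 slacks)]; minimize
          # sum(t)/8, i.e. the trace distance averaged over the 4 settings,
          # subject to |p - D q| <= t and q in the probability simplex.
          c = np.concatenate([np.zeros(16), np.full(16, 1.0 / 8.0)])
          A_ub = np.block([[ D, -np.eye(16)],     #  D q - t <= p
                           [-D, -np.eye(16)]])    # -D q - t <= -p
          b_ub = np.concatenate([p_vec, -p_vec])
          A_eq = np.concatenate([np.ones(16), np.zeros(16)])[None, :]
          res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
                        bounds=[(0, None)] * 32)
          return res.fun

      # PR box: maximally nonlocal no-signaling behavior, p = 1/2 iff a^b = x*y.
      pr = [[[[0.5 if (a ^ b) == (x & y) else 0.0 for b in range(2)]
              for a in range(2)] for y in range(2)] for x in range(2)]
      print("PR-box nonlocality:", round(nonlocality(flat(pr)), 3))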

  17. Chimpanzees (Pan troglodytes) and bonobos (Pan paniscus) quantify split solid objects.

    PubMed

    Cacchione, Trix; Hrubesch, Christine; Call, Josep

    2013-01-01

    Recent research suggests that gorillas' and orangutans' object representations survive cohesion violations (e.g., a split of a solid object into two halves), but that their processing of quantities may be affected by them. We assessed chimpanzees' (Pan troglodytes) and bonobos' (Pan paniscus) reactions to various fission events in the same series of action tasks modelled after infant studies previously run on gorillas and orangutans (Cacchione and Call in Cognition 116:193-203, 2010b). Results showed that all four non-human great ape species managed to quantify split objects but that their performance varied as a function of the non-cohesiveness produced in the splitting event. Spatial ambiguity and shape invariance had the greatest impact on apes' ability to represent and quantify objects. Further, we observed species differences with gorillas performing lower than other species. Finally, we detected a substantial age effect, with ape infants below 6 years of age being outperformed by both juvenile/adolescent and adult apes.

  18. Fat stigmatization in television shows and movies: a content analysis.

    PubMed

    Himes, Susan M; Thompson, J Kevin

    2007-03-01

    To examine the phenomenon of fat stigmatization messages presented in television shows and movies, a content analysis was used to quantify and categorize fat-specific commentary and humor. Fat stigmatization vignettes were identified using a targeted sampling procedure, and 135 scenes were excised from movies and television shows. The material was coded by trained raters. Reliability indices were uniformly high for the seven categories (percentage agreement ranged from 0.90 to 0.98; kappas ranged from 0.66 to 0.94). Results indicated that fat stigmatization commentary and fat humor were often verbal, directed toward another person, and often presented directly in the presence of the overweight target. Results also indicated that male characters were three times more likely to engage in fat stigmatization commentary or fat humor than female characters. To our knowledge, these findings provide the first information regarding the specific gender, age, and types of fat stigmatization that occur frequently in movies and television shows. The stimuli should prove useful in future research examining the role of individual difference factors (e.g., BMI) in the reaction to viewing such vignettes.

  19. Quantifying evolutionary dynamics from variant-frequency time series

    NASA Astrophysics Data System (ADS)

    Khatri, Bhavin S.

    2016-09-01

    From Kimura’s neutral theory of protein evolution to Hubbell’s neutral theory of biodiversity, quantifying the relative importance of neutrality versus selection has long been a basic question in evolutionary biology and ecology. With deep sequencing technologies, this question is taking on a new form: given a time-series of the frequency of different variants in a population, what is the likelihood that the observation has arisen due to selection or neutrality? To tackle the 2-variant case, we exploit Fisher’s angular transformation, which despite being discovered by Ronald Fisher a century ago, has remained an intellectual curiosity. We show that, together with a heuristic approach, it provides a simple solution for the transition probability density at short times, including drift, selection and mutation. Our results show that under strong selection and sufficiently frequent sampling these evolutionary parameters can be accurately determined from simulation data, and so they provide a theoretical basis for techniques to detect selection from variant or polymorphism frequency time-series.

  20. Quantifying evolutionary dynamics from variant-frequency time series.

    PubMed

    Khatri, Bhavin S

    2016-09-12

    From Kimura's neutral theory of protein evolution to Hubbell's neutral theory of biodiversity, quantifying the relative importance of neutrality versus selection has long been a basic question in evolutionary biology and ecology. With deep sequencing technologies, this question is taking on a new form: given a time-series of the frequency of different variants in a population, what is the likelihood that the observation has arisen due to selection or neutrality? To tackle the 2-variant case, we exploit Fisher's angular transformation, which despite being discovered by Ronald Fisher a century ago, has remained an intellectual curiosity. We show that, together with a heuristic approach, it provides a simple solution for the transition probability density at short times, including drift, selection and mutation. Our results show that under strong selection and sufficiently frequent sampling these evolutionary parameters can be accurately determined from simulation data, and so they provide a theoretical basis for techniques to detect selection from variant or polymorphism frequency time-series.
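
    A quick numerical illustration of why the angular transformation is central here: under pure Wright-Fisher drift, the per-generation variance of the transformed frequency arcsin(sqrt(f)) is approximately 1/(4N) regardless of the frequency itself (simulation sketch, parameters hypothetical).

      # Sketch: Fisher's angular transformation stabilizes drift variance.
      import numpy as np

      rng = np.random.default_rng(4)
      N = 1000                         # population size (hypothetical)
      for f0 in (0.1, 0.5, 0.9):       # starting variant frequencies
          f1 = rng.binomial(N, f0, 100_000) / N        # one generation of drift
          var_f = f1.var()                             # depends strongly on f0
          var_theta = np.arcsin(np.sqrt(f1)).var()     # ~1/(4N), frequency-free
          print(f"f0={f0}: var(f)={var_f:.2e}, "
                f"var(theta)={var_theta:.2e}, 1/(4N)={1 / (4 * N):.2e}")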

  1. Quantify fluid saturation in fractures by light transmission technique and its application

    NASA Astrophysics Data System (ADS)

    Ye, S.; Zhang, Y.; Wu, J.

    2016-12-01

    The migration of dense non-aqueous phase liquids (DNAPLs) in transparent, rough fractures with variable aperture was studied experimentally using a light transmission technique. The migration of trichloroethylene (TCE) in variable-aperture fractures (20 cm wide x 32.5 cm high) showed that a TCE blob moved downward with snap-off events in four packs with apertures from 100 μm to 1000 μm, and that the pattern presented a single, tortuous cluster with many fingers in a pack with two apertures of 100 μm and 500 μm. The variable apertures in the fractures were measured by light transmission. A light intensity-saturation (LIS) model based on light transmission was used to quantify DNAPL saturation in the fracture system. Known volumes of TCE were added to the chamber and these amounts were compared to the results obtained by the LIS model; a strong correlation existed between the two. Sensitivity analysis showed that the aperture was more sensitive than parameter C2 of the LIS model. The LIS model was also used to measure dyed TCE saturation in an air sparging experiment; the results showed that the distribution and amount of TCE significantly influenced the efficiency of air sparging. The method developed here gives a way to quantify fluid saturation in a two-phase system in fractured media, and provides a non-destructive, non-intrusive tool to investigate changes in DNAPL architecture and flow characteristics in laboratory experiments. Keywords: light transmission, fluid saturation, fracture, variable aperture. Acknowledgements: Funding for this research from NSFC Project No. 41472212.

  2. IMPAIRED VERBAL COMPREHENSION OF QUANTIFIERS IN CORTICOBASAL SYNDROME

    PubMed Central

    Troiani, Vanessa; Clark, Robin; Grossman, Murray

    2011-01-01

    Objective: Patients with corticobasal syndrome (CBS) have atrophy in posterior parietal cortex. This region of atrophy has been previously linked with their quantifier comprehension difficulty, but previous studies used visual stimuli, making it difficult to account for potential visuospatial deficits in CBS patients. The current study evaluated comprehension of generalized quantifiers using strictly verbal materials. Method: CBS patients, a brain-damaged control group (consisting of Alzheimer's disease and frontotemporal dementia), and age-matched controls participated in this study. We assessed familiar temporal, spatial, and monetary domains of verbal knowledge comparatively. Judgment accuracy was only evaluated in statements for which patients demonstrated accurate factual knowledge about the target domain. Results: We found that patients with CBS are significantly impaired in their ability to evaluate quantifiers compared to healthy seniors and a brain-damaged control group, even in this strictly verbal task. This impairment was seen in the vast majority of individual CBS patients. Conclusions: These findings offer additional evidence of quantifier impairment in CBS patients and emphasize that this impairment cannot be attributed to potential spatial processing impairments in patients with parietal disease. PMID:21381823

  3. Quantifying food intake in socially housed monkeys: social status effects on caloric consumption

    PubMed Central

    Wilson, Mark E.; Fisher, Jeff; Fischer, Andrew; Lee, Vanessa; Harris, Ruth B.; Bartness, Timothy J.

    2008-01-01

    Obesity results from a number of factors, including socio-environmental influences, and rodent models show that several different stressors increase the preference for calorically dense foods, leading to an obese phenotype. We present here a non-human primate model using socially housed adult female macaques living in long-term stable groups given access to diets of different caloric density. Consumption of a low fat diet (LFD; 15% of calories from fat) and a high fat diet (HFD; 45% of calories from fat) was quantified by means of a custom-built, automated feeder that dispensed a pellet of food when activated by a radiofrequency chip implanted subcutaneously in the animal’s wrist. Socially subordinate females showed indices of chronic psychological stress, having reduced glucocorticoid negative feedback and higher frequencies of anxiety-like behavior. Twenty-four hour intakes of both the LFD and HFD were significantly greater in subordinates than in dominants, an effect that persisted whether standard monkey chow (13% of calories from fat) was present or absent. Furthermore, although dominants restricted their food intake to daylight, subordinates continued to feed at night. Total caloric intake was significantly correlated with body weight change. Collectively, these results show that food intake can be reliably quantified in non-human primates living in complex social environments and suggest that socially subordinate females consume more calories; this ethologically relevant model may therefore help us understand how psychosocial stress changes food preferences and consumption, leading to obesity. PMID:18486158

  4. A graph-theoretic method to quantify the airline route authority

    NASA Technical Reports Server (NTRS)

    Chan, Y.

    1979-01-01

    The paper introduces a graph-theoretic method to quantify the legal statements in a route certificate, which specifies an airline's routing restrictions. All authorized nonstop and multistop routes, including the shortest-time routes, can be obtained, and the method suggests profitable route-structure alternatives to airline analysts. This method of quantifying the C.A.B. route authority was programmed in a software package, Route Improvement Synthesis and Evaluation, and demonstrated in a case study with a commercial airline. The study showed the utility of this technique in suggesting route alternatives and the possibility of improvements in the U.S. route system.

  5. Quantifying viruses and bacteria in wastewater—Results, interpretation methods, and quality control

    USGS Publications Warehouse

    Francy, Donna S.; Stelzer, Erin A.; Bushon, Rebecca N.; Brady, Amie M.G.; Mailot, Brian E.; Spencer, Susan K.; Borchardt, Mark A.; Elber, Ashley G.; Riddell, Kimberly R.; Gellner, Terry M.

    2011-01-01

    Membrane bioreactors (MBR), used for wastewater treatment in Ohio and elsewhere in the United States, have pore sizes small enough to theoretically reduce concentrations of protozoa and bacteria, but not viruses. Sampling for viruses in wastewater is seldom done and not required. Instead, the bacterial indicators Escherichia coli (E. coli) and fecal coliforms are the required microbial measures of effluents for wastewater-discharge permits. Information is needed on the effectiveness of MBRs in removing human enteric viruses from wastewaters, particularly as compared to conventional wastewater treatment before and after disinfection. A total of 73 regular and 28 quality-control (QC) samples were collected at three MBR and two conventional wastewater plants in Ohio during 23 regular and 3 QC sampling trips in 2008-10. Samples were collected at various stages in the treatment processes and analyzed for bacterial indicators E. coli, fecal coliforms, and enterococci by membrane filtration; somatic and F-specific coliphage by the single agar layer (SAL) method; adenovirus, enterovirus, norovirus GI and GII, rotavirus, and hepatitis A virus by molecular methods; and viruses by cell culture. While addressing the main objective of the study (comparing removal of viruses and bacterial indicators in MBR and conventional plants), it was realized that work was needed to identify data analysis and quantification methods for interpreting enteric virus and QC data. Therefore, methods for quantifying viruses, qualifying results, and applying QC data to interpretations are described in this report. During each regular sampling trip, samples were collected (1) before conventional or MBR treatment (post-preliminary), (2) after secondary or MBR treatment (post-secondary or post-MBR), (3) after tertiary treatment (one conventional plant only), and (4) after disinfection (post-disinfection). Glass-wool fiber filtration was used to concentrate enteric viruses from large volumes, and small

  6. Quantifying the Beauty of Words: A Neurocognitive Poetics Perspective

    PubMed Central

    Jacobs, Arthur M.

    2017-01-01

    In this paper I would like to pave the ground for future studies in Computational Stylistics and (Neuro-)Cognitive Poetics by describing procedures for predicting the subjective beauty of words. A set of eight tentative word features is computed via Quantitative Narrative Analysis (QNA), and a novel metric for quantifying word beauty, the aesthetic potential, is proposed. Application of machine learning algorithms fed with this QNA data shows that a classifier of the decision tree family excellently learns to split words into beautiful vs. ugly ones. The results shed light on surface and semantic features theoretically relevant for affective-aesthetic processes in literary reading and generate quantitative predictions for neuroaesthetic studies of verbal materials. PMID:29311877

  7. Quantifying the Beauty of Words: A Neurocognitive Poetics Perspective.

    PubMed

    Jacobs, Arthur M

    2017-01-01

    In this paper I would like to pave the ground for future studies in Computational Stylistics and (Neuro-)Cognitive Poetics by describing procedures for predicting the subjective beauty of words. A set of eight tentative word features is computed via Quantitative Narrative Analysis (QNA), and a novel metric for quantifying word beauty, the aesthetic potential, is proposed. Application of machine learning algorithms fed with this QNA data shows that a classifier of the decision tree family excellently learns to split words into beautiful vs. ugly ones. The results shed light on surface and semantic features theoretically relevant for affective-aesthetic processes in literary reading and generate quantitative predictions for neuroaesthetic studies of verbal materials.
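
    A schematic of the classification step described in the two entries above, with synthetic stand-ins for the QNA word features and beauty labels (the real features and ratings come from the paper's data):

      # Sketch: decision tree splitting words into beautiful vs. ugly,
      # with synthetic stand-ins for QNA word features.
      import numpy as np
      from sklearn.model_selection import train_test_split
      from sklearn.tree import DecisionTreeClassifier

      rng = np.random.default_rng(5)
      n = 400
      # Hypothetical features, e.g. sonority, valence, length, frequency.
      X = rng.normal(size=(n, 4))
      # Synthetic "aesthetic potential": beauty loads on the first two features.
      y = (X[:, 0] + X[:, 1] + rng.normal(0, 0.5, n) > 0).astype(int)

      X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
      clf = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X_tr, y_tr)
      print(f"held-out accuracy: {clf.score(X_te, y_te):.2f}")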

  8. Quantifying facial paralysis using the Kinect v2.

    PubMed

    Gaber, Amira; Taher, Mona F; Wahed, Manal Abdel

    2015-01-01

    Assessment of facial paralysis (FP) and quantitative grading of facial asymmetry are essential in order to quantify the extent of the condition and to follow its improvement or progression. As such, there is a need for an accurate quantitative grading system that is easy to use, inexpensive and has minimal inter-observer variability. A comprehensive automated system to quantify and grade FP is the main objective of this work. An initial prototype has been presented by the authors. The present research aims to enhance the accuracy and robustness of one of this system's modules: the resting symmetry module. This is achieved by introducing several modifications to the computation method of the symmetry index (SI) for the eyebrows, eyes and mouth. These modifications are the gamma correction technique, the area of the eyes, and the slope of the mouth. The system was tested on normal subjects and showed promising results. With the modified method, the mean SI of the eyebrows decreased slightly from 98.42% to 98.04%, while the mean SI for the eyes and mouth increased from 96.93% to 99.63% and from 95.6% to 98.11%, respectively. The system is easy to use, inexpensive, automated and fast, has no inter-observer variability, and is thus well suited for clinical use.

  9. Quantifying CO2 Emissions from Individual Power Plants using OCO-2 Observations

    NASA Astrophysics Data System (ADS)

    Nassar, R.; Hill, T. G.; McLinden, C. A.; Wunch, D.; Jones, D. B. A.; Crisp, D.

    2017-12-01

    In order to better manage anthropogenic CO2 emissions, improved methods of quantifying emissions are needed at all spatial scales from the national level down to the facility level. Although the Orbiting Carbon Observatory 2 (OCO-2) satellite was not designed for monitoring power plant emissions, we show that in select cases, CO2 observations from OCO-2 can be used to quantify daily CO2 emissions from individual mid- to large-sized coal power plants by fitting the data to plume model simulations. Emission estimates for US power plants are within 1-13% of reported daily emission values enabling application of the approach to international sites that lack detailed emission information. These results affirm that a constellation of future CO2 imaging satellites, optimized for point sources, could be used for the Monitoring, Reporting and Verification (MRV) of CO2 emissions from individual power plants to support the implementation of climate policies.
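
    A stripped-down illustration of the estimation idea: integrate the column enhancement across the plume and multiply by wind speed to obtain an emission rate. This is a textbook mass-balance sketch with invented numbers, not the paper's full plume-model fit.

      # Sketch: mass-balance emission estimate from a cross-plume XCO2 transect.
      import numpy as np

      # Hypothetical samples crossing a power-plant plume.
      y = np.linspace(-20e3, 20e3, 81)                      # cross-plume axis, m
      enhancement_ppm = 1.2 * np.exp(-y**2 / (2 * 4e3**2))  # XCO2 above background

      # ppm enhancement -> CO2 column line density (kg per m^2): a dry-air
      # column of ~1.0e4 kg/m^2 over M_air = 0.029 kg/mol, M_CO2 = 0.044 kg/mol.
      mol_air_per_m2 = 1.0e4 / 0.029
      kg_co2_per_m2 = enhancement_ppm * 1e-6 * mol_air_per_m2 * 0.044

      wind_speed = 4.0                                      # m/s (assumed)
      dy = y[1] - y[0]
      emission_kg_s = wind_speed * kg_co2_per_m2.sum() * dy
      print(f"estimated source: {emission_kg_s * 86400 / 1e6:.1f} kt CO2/day")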

  10. Measuring political polarization: Twitter shows the two sides of Venezuela

    NASA Astrophysics Data System (ADS)

    Morales, A. J.; Borondo, J.; Losada, J. C.; Benito, R. M.

    2015-03-01

    We say that a population is perfectly polarized when divided in two groups of the same size and opposite opinions. In this paper, we propose a methodology to study and measure the emergence of polarization from social interactions. We begin by proposing a model to estimate opinions in which a minority of influential individuals propagate their opinions through a social network. The result of the model is an opinion probability density function. Next, we propose an index to quantify the extent to which the resulting distribution is polarized. Finally, we apply the proposed methodology to a Twitter conversation about the late Venezuelan president, Hugo Chávez, finding a good agreement between our results and offline data. Hence, we show that our methodology can detect different degrees of polarization, depending on the structure of the network.

  11. Testing Delays Resulting in Increased Identification Accuracy in Line-Ups and Show-Ups.

    ERIC Educational Resources Information Center

    Dekle, Dawn J.

    1997-01-01

    Investigated time delays (immediate, two-three days, one week) between viewing a staged theft and attempting an eyewitness identification. Compared lineups to one-person showups in a laboratory analogue involving 412 subjects. Results show that across all time delays, participants maintained a higher identification accuracy with the showup…

  12. Quantifying and qualifying the use of topical anesthetics in retinopathy of prematurity examinations.

    PubMed

    Ahmed, Masih; Forcina, Blake; Bonsall, Dean

    2016-04-01

    The American Academy of Pediatrics advocates efforts to minimize discomfort and systemic effect of retinopathy of prematurity (ROP) examinations. Although many ophthalmologists use topical anesthetics, many do not believe them necessary. We present the results of the first survey to quantify the use of topical anesthetics in ROP examinations by clinicians who screen for ROP. The results show that although use of topical anesthetic is common, it is not universal. Copyright © 2016 American Association for Pediatric Ophthalmology and Strabismus. Published by Elsevier Inc. All rights reserved.

  13. Results From Mars Show Electrostatic Charging of the Mars Pathfinder Sojourner Rover

    NASA Technical Reports Server (NTRS)

    Kolecki, Joseph C.; Siebert, Mark W.

    1998-01-01

    flight data. Electrical charging of vehicles and, one day, astronauts moving across the Martian surface may have moderate to severe consequences if large potential differences develop. The observations from Sojourner point to just such a possibility. It is desirable to quantify these results. The various lander/rover missions being planned for the upcoming decade provide the means for doing so. They should, therefore, carry instruments that will not only measure vehicle charging but characterize all the natural and induced electrical phenomena occurring in the environment and assess their impact on future missions.

  14. Quantifying collagen orientation in breast tissue biopsies using SLIM (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Majeed, Hassaan; Okoro, Chukwuemeka; Balla, Andre; Toussaint, Kimani C.; Popescu, Gabriel

    2017-02-01

    Breast cancer is a major public health problem worldwide, being the most common type of cancer among women according to the World Health Organization (WHO). The WHO has further stressed the importance of an early determination of the disease course through prognostic markers. Recent studies have shown that the alignment of collagen fibers in tumor-adjacent stroma correlates with poorer health outcomes in patients. Such studies have typically been carried out using Second-Harmonic Generation (SHG) microscopy. SHG images are very useful for quantifying collagen fiber orientation due to their specificity to non-centrosymmetric structures in tissue, leading to high contrast in collagen-rich areas. However, the imaging throughput of SHG microscopy is limited by its point-scanning geometry. In this work, we show that SLIM, a wide-field, high-throughput quantitative phase imaging (QPI) technique, can be used to obtain the same information on collagen fiber orientation as is obtainable through SHG microscopy. We imaged a tissue microarray containing both benign and malignant cores using both SHG microscopy and SLIM. The cellular (non-collagenous) structures in the SLIM images were next segmented out using an algorithm developed in-house. Using the previously published Fourier Transform Second Harmonic Generation (FT-SHG) tool, the fiber orientations in the SHG and segmented SLIM images were then quantified. The resulting histograms of fiber orientation angles showed that both SHG and SLIM generate similar measurements of collagen fiber orientation. The SLIM modality, however, can generate these results at much higher throughput due to its wide-field, whole-slide scanning capabilities.
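
    A generic orientation-quantification sketch in the spirit of the FT-SHG analysis mentioned above, using a local structure tensor instead of the authors' Fourier-transform tool; the image and smoothing scale are placeholders.

      # Sketch: local fiber orientation histogram via the structure tensor;
      # a stand-in for the FT-SHG orientation analysis.
      import numpy as np
      from scipy.ndimage import gaussian_filter, sobel

      img = np.random.rand(256, 256)      # placeholder; use a real SHG/SLIM image

      gx, gy = sobel(img, axis=1), sobel(img, axis=0)
      Jxx = gaussian_filter(gx * gx, sigma=4)      # smoothed tensor components
      Jyy = gaussian_filter(gy * gy, sigma=4)
      Jxy = gaussian_filter(gx * gy, sigma=4)

      # Dominant gradient orientation; the fiber axis is perpendicular to it.
      theta = 0.5 * np.arctan2(2 * Jxy, Jxx - Jyy)
      fiber = np.where(theta > 0, theta - np.pi / 2, theta + np.pi / 2)

      energy = Jxx + Jyy                           # weight by gradient energy
      hist, edges = np.histogram(np.degrees(fiber), bins=18,
                                 range=(-90, 90), weights=energy)
      print("orientation histogram (10-degree bins):", hist.round(1))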

  15. Quantifying the ice-albedo feedback through decoupling

    NASA Astrophysics Data System (ADS)

    Kravitz, B.; Rasch, P. J.

    2017-12-01

    The ice-albedo feedback involves numerous individual components, whereby warming induces sea ice melt, inducing reduced surface albedo, inducing increased surface shortwave absorption, causing further warming. Here we attempt to quantify the sea ice albedo feedback using an analogue of the "partial radiative perturbation" method, but where the governing mechanisms are directly decoupled in a climate model. As an example, we can isolate the insulating effects of sea ice on surface energy and moisture fluxes by allowing sea ice thickness to change but fixing Arctic surface albedo, or vice versa. Here we present results from such idealized simulations using the Community Earth System Model in which individual components are successively fixed, effectively decoupling the ice-albedo feedback loop. We isolate the different components of this feedback, including temperature change, sea ice extent/thickness, and air-sea exchange of heat and moisture. We explore the interactions between these different components, as well as the strengths of the total feedback in the decoupled feedback loop, to quantify contributions from individual pieces. We also quantify the non-additivity of the effects of the components as a means of investigating the dominant sources of nonlinearity in the ice-albedo feedback.

  16. Project ES3: attempting to quantify and measure the level of stress.

    PubMed

    Aguiló, Jordi; Ferrer-Salvans, Pau; García-Rozo, Antonio; Armario, Antonio; Corbí, Ángel; Cambra, Francisco J; Bailón, Raquel; González-Marcos, Ana; Caja, Gerardo; Aguiló, Sira; López-Antón, Raúl; Arza-Valdés, Adriana; Garzón-Rey, Jorge M

    2015-11-01

    The WHO has described stress as a 'world epidemic' due to its increasing impact on health. The work described in this paper represents an attempt to objectively quantify the level of stress. The aim of the method developed here is to measure how close or how far a subject is from a situation that can be considered 'normal' in medical and social terms. The literature on the pathophysiology of stress and its methods of study in experiments on both animals and humans was reviewed. Nine prospective observational studies were undertaken with different types of subjects and stressors, covering the different types of stress. The results of the literature review made it possible to identify the different types of stress, the indicators that yield significant results, the psychometric tests and the well-documented 'stressors'. This material was then used to design the general method and the details of the nine clinical trials. The preliminary results obtained in some of the studies were used to validate the indicators as well as the efficacy of the techniques used experimentally to diminish stress or to produce it. The early results obtained in the experimental trials show that we are on the right path towards defining and validating multivariable markers for quantifying levels of stress, and also suggest that the method can be applied in a similar way to the study of mental disorders.

  17. Quantifying arm nonuse in individuals poststroke.

    PubMed

    Han, Cheol E; Kim, Sujin; Chen, Shuya; Lai, Yi-Hsuan; Lee, Jeong-Yoon; Osu, Rieko; Winstein, Carolee J; Schweighofer, Nicolas

    2013-06-01

    Arm nonuse, defined as the difference between what the individual can do when constrained to use the paretic arm and what the individual does when given a free choice to use either arm, has not yet been quantified in individuals poststroke. Our objectives were (1) to quantify nonuse poststroke and (2) to develop and test a novel, simple, objective, reliable, and valid instrument, the Bilateral Arm Reaching Test (BART), to quantify arm use and nonuse poststroke. First, we quantified nonuse with the Quality of Movement (QOM) subscale of the Actual Amount of Use Test (AAUT) by subtracting the AAUT QOM score in the spontaneous-use condition from the AAUT QOM score in a subsequent constrained-use condition. Second, we quantified arm use and nonuse with BART by comparing reaching performance to visual targets projected over a 2D horizontal hemi-workspace in a spontaneous-use condition (in which participants are free to use either arm on each trial) with reaching performance in a constrained-use condition. All participants (N = 24) with chronic stroke and mild to moderate impairment exhibited nonuse on the AAUT QOM. Nonuse measured with BART had excellent test-retest reliability and good external validity. BART is the first instrument that can be used repeatedly and practically in the clinic to quantify the effects of neurorehabilitation on arm use and nonuse, and in the laboratory for advancing theoretical knowledge about the recovery of arm use and the development of nonuse and "learned nonuse" after stroke.

  18. Quantifying light-dependent circadian disruption in humans and animal models.

    PubMed

    Rea, Mark S; Figueiro, Mariana G

    2014-12-01

    Although circadian disruption is an accepted term, little has been done to develop methods to quantify the degree of disruption or entrainment individual organisms actually exhibit in the field. A variety of behavioral, physiological and hormonal responses vary in amplitude over a 24-h period and the degree to which these circadian rhythms are synchronized to the daily light-dark cycle can be quantified with a technique known as phasor analysis. Several studies have been carried out using phasor analysis in an attempt to measure circadian disruption exhibited by animals and by humans. To perform these studies, species-specific light measurement and light delivery technologies had to be developed based upon a fundamental understanding of circadian phototransduction mechanisms in the different species. When both nocturnal rodents and diurnal humans, experienced different species-specific light-dark shift schedules, they showed, based upon phasor analysis of the light-dark and activity-rest patterns, similar levels of light-dependent circadian disruption. Indeed, both rodents and humans show monotonically increasing and quantitatively similar levels of light-dependent circadian disruption with increasing shift-nights per week. Thus, phasor analysis provides a method for quantifying circadian disruption in the field and in the laboratory as well as a bridge between ecological measurements of circadian entrainment in humans and parametric studies of circadian disruption in animal models, including nocturnal rodents.
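
    A minimal sketch of the phasor idea on synthetic hourly data: take the 24-h Fourier component of the circular cross-correlation between the light-dark and activity-rest series; the phasor magnitude reflects coupling strength and its angle the phase relation (conventions simplified relative to the published method).

      # Sketch: phasor (24-h harmonic) of the light/activity cross-correlation.
      import numpy as np

      rng = np.random.default_rng(6)
      hours = np.arange(24 * 14)                    # two weeks of hourly samples
      light = (hours % 24 < 16).astype(float)       # 16 h light : 8 h dark
      activity = np.roll(light, 2) + rng.normal(0, 0.3, hours.size)  # 2-h lag

      # Circular cross-correlation via FFT.
      xcorr = np.fft.ifft(np.fft.fft(light - light.mean())
                          * np.conj(np.fft.fft(activity - activity.mean()))).real
      xcorr /= light.std() * activity.std() * hours.size

      k = hours.size // 24                          # index of the 1/(24 h) frequency
      phasor = 2 * np.fft.fft(xcorr)[k] / hours.size
      print(f"phasor magnitude {abs(phasor):.2f}, "
            f"phase {np.degrees(np.angle(phasor)):.1f} deg")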

  19. Quantifying, Visualizing, and Monitoring Lead Optimization.

    PubMed

    Maynard, Andrew T; Roberts, Christopher D

    2016-05-12

    Although lead optimization (LO) is by definition a process, process-centric analysis and visualization of this important phase of pharmaceutical R&D has been lacking. Here we describe a simple statistical framework to quantify and visualize the progression of LO projects so that the vital signs of LO convergence can be monitored. We refer to the resulting visualizations generated by our methodology as the "LO telemetry" of a project. These visualizations can be automated to provide objective, holistic, and instantaneous analysis and communication of LO progression. This enhances the ability of project teams to more effectively drive LO process, while enabling management to better coordinate and prioritize LO projects. We present the telemetry of five LO projects comprising different biological targets and different project outcomes, including clinical compound selection, termination due to preclinical safety/tox, and termination due to lack of tractability. We demonstrate that LO progression is accurately captured by the telemetry. We also present metrics to quantify LO efficiency and tractability.

  20. Shakespeare and other English Renaissance authors as characterized by Information Theory complexity quantifiers

    NASA Astrophysics Data System (ADS)

    Rosso, Osvaldo A.; Craig, Hugh; Moscato, Pablo

    2009-03-01

    We introduce novel Information Theory quantifiers in a computational linguistic study that involves a large corpus of English Renaissance literature. The 185 texts studied (136 plays and 49 poems in total), with first editions that range from 1580 to 1640, form a representative set of its period. Our data set includes 30 texts unquestionably attributed to Shakespeare; in addition we also included A Lover’s Complaint, a poem which generally appears in Shakespeare collected editions but whose authorship is currently in dispute. Our statistical complexity quantifiers combine the power of Jensen-Shannon’s divergence with the entropy variations as computed from a probability distribution function of the observed word use frequencies. Our results show, among other things, that for a given entropy poems display higher complexity than plays, that Shakespeare’s work falls into two distinct clusters in entropy, and that his work is remarkable for its homogeneity and for its closeness to overall means.
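
    The core quantity behind these complexity quantifiers is straightforward to compute; the sketch below evaluates the Jensen-Shannon divergence between the word-frequency distributions of two toy texts over a shared vocabulary.

      # Sketch: Jensen-Shannon divergence between two word-frequency profiles.
      import numpy as np
      from collections import Counter

      def jsd(p, q):
          """Jensen-Shannon divergence (bits) between two count vectors."""
          p, q = np.asarray(p, float), np.asarray(q, float)
          p, q = p / p.sum(), q / q.sum()
          m = (p + q) / 2

          def kl(a, b):
              mask = a > 0
              return np.sum(a[mask] * np.log2(a[mask] / b[mask]))

          return 0.5 * kl(p, m) + 0.5 * kl(q, m)

      text_a = "to be or not to be that is the question".split()
      text_b = "the expense of spirit in a waste of shame".split()
      vocab = sorted(set(text_a) | set(text_b))
      ca, cb = Counter(text_a), Counter(text_b)
      print(f"JSD = {jsd([ca[w] for w in vocab], [cb[w] for w in vocab]):.3f} bits")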

  1. Quantifying changes in lens biomechanical properties due to cold cataract with optical coherence elastography

    NASA Astrophysics Data System (ADS)

    Zhang, Hongqiu; Wu, Chen; Singh, Manmohan; Larin, Kirill V.

    2018-02-01

    Cataract is the most prevalent cause of visual impairment worldwide. Cataracts can form due to trauma, radiation, drug abuse, or low temperatures. Thus, early detection of cataract can be immensely helpful for preserving visual acuity by ensuring that the appropriate therapeutic procedures are performed at earlier stages of disease onset and progression. In this work, we utilized a phase-sensitive optical coherence elastography (OCE) system to quantify changes in the biomechanical properties of porcine lenses in vitro with induced cold cataracts. The results show a significant increase in the lens Young's modulus due to formation of the cold cataract (from 35 kPa to 60 kPa). These results show that OCE can assess lenticular biomechanical properties and may be useful for detecting and, potentially, characterizing cataracts.

  2. Correlation between quantified breast densities from digital mammography and 18F-FDG PET uptake.

    PubMed

    Lakhani, Paras; Maidment, Andrew D A; Weinstein, Susan P; Kung, Justin W; Alavi, Abass

    2009-01-01

    To correlate breast density quantified from digital mammograms with mean and maximum standardized uptake values (SUVs) from positron emission tomography (PET). This was a prospective study that included 56 women with a history of suspicion of breast cancer (mean age 49.2 +/- 9.3 years), who underwent 18F-fluoro-2-deoxyglucose (FDG)-PET imaging of their breasts as well as digital mammography. A computer thresholding algorithm was applied to the contralateral nonmalignant breasts to quantitatively estimate the breast density on digital mammograms. The breasts were also classified into one of four Breast Imaging Reporting and Data System categories for density. Comparisons between SUV and breast density were made using linear regression and Student's t-test. Linear regression of mean SUV versus average breast density showed a positive relationship with R^2 = 0.83. The quantified breast densities and mean SUVs were significantly greater for mammographically dense than nondense breasts (p < 0.0001 for both). The average quantified densities and mean SUVs of the breasts were significantly greater for premenopausal than postmenopausal patients (p < 0.05). 8/51 (16%) of the patients had maximum SUVs of 1.6 or greater. There is a positive linear correlation between quantified breast density on digital mammography and FDG uptake on PET. Menopausal status affects the metabolic activity of normal breast tissue, resulting in higher SUVs in pre- versus postmenopausal patients.

  3. Terrestrial laser scanning to quantify above-ground biomass of structurally complex coastal wetland vegetation

    NASA Astrophysics Data System (ADS)

    Owers, Christopher J.; Rogers, Kerrylee; Woodroffe, Colin D.

    2018-05-01

    Above-ground biomass represents a small yet significant contributor to carbon storage in coastal wetlands. Despite this, above-ground biomass is often poorly quantified, particularly in areas where vegetation structure is complex. Traditional methods for providing accurate estimates involve harvesting vegetation to develop mangrove allometric equations and quantify saltmarsh biomass in quadrats. However broad scale application of these methods may not capture structural variability in vegetation resulting in a loss of detail and estimates with considerable uncertainty. Terrestrial laser scanning (TLS) collects high resolution three-dimensional point clouds capable of providing detailed structural morphology of vegetation. This study demonstrates that TLS is a suitable non-destructive method for estimating biomass of structurally complex coastal wetland vegetation. We compare volumetric models, 3-D surface reconstruction and rasterised volume, and point cloud elevation histogram modelling techniques to estimate biomass. Our results show that current volumetric modelling approaches for estimating TLS-derived biomass are comparable to traditional mangrove allometrics and saltmarsh harvesting. However, volumetric modelling approaches oversimplify vegetation structure by under-utilising the large amount of structural information provided by the point cloud. The point cloud elevation histogram model presented in this study, as an alternative to volumetric modelling, utilises all of the information within the point cloud, as opposed to sub-sampling based on specific criteria. This method is simple but highly effective for both mangrove (r2 = 0.95) and saltmarsh (r2 > 0.92) vegetation. Our results provide evidence that application of TLS in coastal wetlands is an effective non-destructive method to accurately quantify biomass for structurally complex vegetation.

  4. Quantifying Scheduling Challenges for Exascale System Software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mondragon, Oscar; Bridges, Patrick G.; Jones, Terry R

    2015-01-01

    The move towards high-performance computing (HPC) applications comprised of coupled codes and the need to dramatically reduce data movement is leading to a reexamination of time-sharing vs. space-sharing in HPC systems. In this paper, we discuss and begin to quantify the performance impact of a move away from strict space-sharing of nodes for HPC applications. Specifically, we examine the potential performance cost of time-sharing nodes between application components, we determine whether a simple coordinated scheduling mechanism can address these problems, and we research how suitable simple constraint-based optimization techniques are for solving scheduling challenges in this regime. Our results demonstrate that current general-purpose HPC system software scheduling and resource allocation systems are subject to significant performance deficiencies, which we quantify for six representative applications. Based on these results, we discuss areas in which additional research is needed to meet the scheduling challenges of next-generation HPC systems.

  5. Processing of Numerical and Proportional Quantifiers

    ERIC Educational Resources Information Center

    Shikhare, Sailee; Heim, Stefan; Klein, Elise; Huber, Stefan; Willmes, Klaus

    2015-01-01

    Quantifier expressions like "many" and "at least" are part of a rich repository of words in language representing magnitude information. The role of numerical processing in comprehending quantifiers was studied in a semantic truth value judgment task, asking adults to quickly verify sentences about visual displays using…

  6. Approach to quantify human dermal skin aging using multiphoton laser scanning microscopy

    NASA Astrophysics Data System (ADS)

    Puschmann, Stefan; Rahn, Christian-Dennis; Wenck, Horst; Gallinat, Stefan; Fischer, Frank

    2012-03-01

    Extracellular skin structures in human skin are impaired during intrinsic and extrinsic aging. Assessment of these dermal changes is currently conducted by subjective clinical evaluation and by histological and molecular analysis. We aimed to develop a new parameter for the noninvasive quantitative determination of dermal skin alterations utilizing high-resolution three-dimensional multiphoton laser scanning microscopy (MPLSM). To quantify structural differences between chronically sun-exposed and sun-protected human skin, the collagen-specific second harmonic generation and elastin-specific autofluorescence signals were recorded in young and elderly volunteers using MPLSM. After image processing, the elastin-to-collagen ratio (ELCOR) was calculated. Results show that the ELCOR parameter of volar forearm skin increases significantly with age. For elderly volunteers, the ELCOR value calculated for the chronically sun-exposed temple area is significantly augmented compared to the sun-protected upper arm area. Based on MPLSM technology, we introduce the ELCOR parameter as a new means to accurately quantify age-associated alterations in the extracellular matrix.
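
    At its core, the ELCOR parameter is a ratio of summed elastin autofluorescence signal to summed collagen second-harmonic signal after image processing. A minimal sketch on two co-registered intensity stacks; the simple threshold below stands in for the paper's unspecified processing pipeline:

      import numpy as np

      def elcor(elastin_af, collagen_shg, threshold=0.1):
          """Elastin-to-collagen ratio from co-registered MPLSM image stacks.

          elastin_af, collagen_shg: 3-D arrays of normalised signal intensity.
          threshold: assumed noise cutoff; the published pipeline is more involved.
          """
          elastin = np.sum(elastin_af[elastin_af > threshold])
          collagen = np.sum(collagen_shg[collagen_shg > threshold])
          return elastin / collagen

      rng = np.random.default_rng(1)
      print(elcor(rng.random((16, 64, 64)), rng.random((16, 64, 64))))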

  7. Quantifying utricular stimulation during natural behavior

    PubMed Central

    Rivera, Angela R. V.; Davis, Julian; Grant, Wally; Blob, Richard W.; Peterson, Ellengene; Neiman, Alexander B.; Rowe, Michael

    2012-01-01

    The use of natural stimuli in neurophysiological studies has led to significant insights into the encoding strategies used by sensory neurons. To investigate these encoding strategies in vestibular receptors and neurons, we have developed a method for calculating the stimuli delivered to a vestibular organ, the utricle, during natural (unrestrained) behaviors, using the turtle as our experimental preparation. High-speed digital video sequences are used to calculate the dynamic gravito-inertial (GI) vector acting on the head during behavior. X-ray computed tomography (CT) scans are used to determine the orientation of the otoconial layer (OL) of the utricle within the head, and the calculated GI vectors are then rotated into the plane of the OL. Thus, the method allows us to quantify the spatio-temporal structure of stimuli to the OL during natural behaviors. In the future, these waveforms can be used as stimuli in neurophysiological experiments to understand how natural signals are encoded by vestibular receptors and neurons. We provide one example of the method which shows that turtle feeding behaviors can stimulate the utricle at frequencies higher than those typically used in vestibular studies. This method can be adapted to other species, to other vestibular end organs, and to other methods of quantifying head movements. PMID:22753360
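
    The geometric step of the method, re-expressing a head-frame gravito-inertial vector in the plane of the otoconial layer, is a single rotation-matrix product. A sketch assuming the OL orientation is available from the CT scan as a unit normal:

      import numpy as np

      def rotation_aligning(n_ol, target=np.array([0.0, 0.0, 1.0])):
          """Rotation matrix taking the OL unit normal n_ol onto the target axis
          (Rodrigues' formula); with it, head-frame GI vectors can be re-expressed
          in OL-plane coordinates."""
          v = np.cross(n_ol, target)
          c, s = np.dot(n_ol, target), np.linalg.norm(v)
          if s < 1e-12:   # already aligned (antiparallel case omitted in this sketch)
              return np.eye(3)
          vx = np.array([[0, -v[2], v[1]], [v[2], 0, -v[0]], [-v[1], v[0], 0]])
          return np.eye(3) + vx + vx @ vx * ((1 - c) / s**2)

      # Hypothetical OL normal from CT and a GI vector from high-speed video.
      n = np.array([0.2, -0.1, 0.97]); n /= np.linalg.norm(n)
      gi = np.array([0.1, 0.3, 9.8])
      print(rotation_aligning(n) @ gi)   # GI vector in OL-aligned coordinates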

  8. Scalar Quantifiers: Logic, Acquisition, and Processing

    ERIC Educational Resources Information Center

    Geurts, Bart; Katsos, Napoleon; Cummins, Chris; Moons, Jonas; Noordman, Leo

    2010-01-01

    Superlative quantifiers ("at least 3", "at most 3") and comparative quantifiers ("more than 2", "fewer than 4") are traditionally taken to be interdefinable: the received view is that "at least n" and "at most n" are equivalent to "more than n-1" and "fewer than n+1",…

  9. Quantifying Drosophila food intake: comparative analysis of current methodology

    PubMed Central

    Deshpande, Sonali A.; Carvalho, Gil B.; Amador, Ariadna; Phillips, Angela M.; Hoxha, Sany; Lizotte, Keith J.; Ja, William W.

    2014-01-01

    Food intake is a fundamental parameter in animal studies. Despite the prevalent use of Drosophila in laboratory research, precise measurements of food intake remain challenging in this model organism. Here, we compare several common Drosophila feeding assays: the Capillary Feeder (CAFE), food-labeling with a radioactive tracer or a colorimetric dye, and observations of proboscis extension (PE). We show that the CAFE and radioisotope-labeling provide the most consistent results, have the highest sensitivity, and can resolve differences in feeding that dye-labeling and PE fail to distinguish. We conclude that performing the radiolabeling and CAFE assays in parallel is currently the best approach for quantifying Drosophila food intake. Understanding the strengths and limitations of food intake methodology will greatly advance Drosophila studies of nutrition, behavior, and disease. PMID:24681694

  10. Quantifying voids effecting delamination in carbon/epoxy composites: static and fatigue fracture behavior

    NASA Astrophysics Data System (ADS)

    Hakim, I.; May, D.; Abo Ras, M.; Meyendorf, N.; Donaldson, S.

    2016-04-01

    In the present work, samples of carbon fiber/epoxy composites with different void levels were fabricated using a hand layup vacuum bagging process by varying the pressure. Thermal nondestructive methods (thermal conductivity measurement, pulse thermography, pulse phase thermography, and lock-in thermography) and mechanical testing (mode I and mode II interlaminar fracture toughness) were conducted. Comparing the parameters resulting from the thermal nondestructive testing revealed that voids lead to reductions in thermal properties in all directions of the composites. The results of mode I and mode II interlaminar fracture toughness testing showed that voids also lead to reductions in interlaminar fracture toughness. The parameters resulting from thermal nondestructive testing were correlated to the results of mode I and mode II interlaminar fracture toughness, and the voids were quantified.

  11. Quantifying Complexity in Quantum Phase Transitions via Mutual Information Complex Networks

    NASA Astrophysics Data System (ADS)

    Valdez, Marc Andrew; Jaschke, Daniel; Vargas, David L.; Carr, Lincoln D.

    2017-12-01

    We quantify the emergent complexity of quantum states near quantum critical points on regular 1D lattices via complex network measures based on quantum mutual information as the adjacency matrix, in direct analogy to quantifying the complexity of electroencephalogram or functional magnetic resonance imaging measurements of the brain. Using matrix product state methods, we show that network density, clustering, disparity, and Pearson's correlation obtain the critical point for both quantum Ising and Bose-Hubbard models to a high degree of accuracy in finite-size scaling for three classes of quantum phase transitions: Z2, mean-field superfluid to Mott insulator, and a Berezinskii-Kosterlitz-Thouless crossover.

  12. Quantifying ubiquitin signaling.

    PubMed

    Ordureau, Alban; Münch, Christian; Harper, J Wade

    2015-05-21

    Ubiquitin (UB)-driven signaling systems permeate biology, and are often integrated with other types of post-translational modifications (PTMs), including phosphorylation. Flux through such pathways is dictated by the fractional stoichiometry of distinct modifications and protein assemblies as well as the spatial organization of pathway components. Yet, we rarely understand the dynamics and stoichiometry of rate-limiting intermediates along a reaction trajectory. Here, we review how quantitative proteomic tools and enrichment strategies are being used to quantify UB-dependent signaling systems, and to integrate UB signaling with regulatory phosphorylation events, illustrated with the PINK1/PARKIN pathway. A key feature of ubiquitylation is that the identity of UB chain linkage types can control downstream processes. We also describe how proteomic and enzymological tools can be used to identify and quantify UB chain synthesis and linkage preferences. The emergence of sophisticated quantitative proteomic approaches will set a new standard for elucidating biochemical mechanisms of UB-driven signaling systems.

  13. Quantifying Ubiquitin Signaling

    PubMed Central

    Ordureau, Alban; Münch, Christian; Harper, J. Wade

    2015-01-01

    Ubiquitin (UB)-driven signaling systems permeate biology, and are often integrated with other types of post-translational modifications (PTMs), most notably phosphorylation. Flux through such pathways is typically dictated by the fractional stoichiometry of distinct regulatory modifications and protein assemblies as well as the spatial organization of pathway components. Yet, we rarely understand the dynamics and stoichiometry of rate-limiting intermediates along a reaction trajectory. Here, we review how quantitative proteomic tools and enrichment strategies are being used to quantify UB-dependent signaling systems, and to integrate UB signaling with regulatory phosphorylation events. A key regulatory feature of ubiquitylation is that the identity of UB chain linkage types can control downstream processes. We also describe how proteomic and enzymological tools can be used to identify and quantify UB chain synthesis and linkage preferences. The emergence of sophisticated quantitative proteomic approaches will set a new standard for elucidating biochemical mechanisms of UB-driven signaling systems. PMID:26000850

  14. Quantifying the Impact of Scenic Environments on Health

    NASA Astrophysics Data System (ADS)

    Seresinhe, Chanuki Illushka; Preis, Tobias; Moat, Helen Susannah

    2015-11-01

    Few people would deny an intuitive sense of increased wellbeing when spending time in beautiful locations. Here, we ask: can we quantify the relationship between environmental aesthetics and human health? We draw on data from Scenic-Or-Not, a website that crowdsources ratings of “scenicness” for geotagged photographs across Great Britain, in combination with data on citizen-reported health from the Census for England and Wales. We find that inhabitants of more scenic environments report better health, across urban, suburban and rural areas, even when taking core socioeconomic indicators of deprivation into account, such as income, employment and access to services. Our results provide evidence in line with the striking hypothesis that the aesthetics of the environment may have quantifiable consequences for our wellbeing.

  15. Digital PCR for Quantifying Norovirus in Oysters Implicated in Outbreaks, France.

    PubMed

    Polo, David; Schaeffer, Julien; Fournet, Nelly; Le Saux, Jean-Claude; Parnaudeau, Sylvain; McLeod, Catherine; Le Guyader, Françoise S

    2016-12-01

    Using samples from oysters clearly implicated in human disease, we quantified norovirus levels by using digital PCR. Concentrations varied from 43 to 1,170 RNA copies/oyster. The analysis of frozen samples from the production area showed the presence of norovirus 2 weeks before consumption.
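
    Digital PCR quantification rests on Poisson statistics over partitions: the mean number of copies per partition is recovered from the fraction of negative partitions as lambda = -ln(negatives/total). A minimal sketch with hypothetical partition counts and volumes:

      import math

      def dpcr_copies(total_partitions, negative_partitions,
                      partition_volume_ul, sample_volume_ul):
          """Estimate RNA copies in the analysed sample from digital PCR counts.

          Assumes the standard Poisson partitioning model; all values below are
          illustrative, not the study's data.
          """
          lam = -math.log(negative_partitions / total_partitions)  # copies/partition
          copies_per_ul = lam / partition_volume_ul
          return copies_per_ul * sample_volume_ul

      # Hypothetical run: 20,000 partitions of 0.85 nL, 5,520 negatives, 20 uL sample.
      print(f"{dpcr_copies(20_000, 5_520, 0.85e-3, 20.0):.0f} copies")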

  16. Quantifying the three-dimensional facial morphology of the laboratory rat with a focus on the vibrissae

    PubMed Central

    2018-01-01

    The morphology of an animal’s face will have large effects on the sensory information it can acquire. Here we quantify the arrangement of cranial sensory structures of the rat, with special emphasis on the mystacial vibrissae (whiskers). Nearly all mammals have vibrissae, which are generally arranged in rows and columns across the face. The vibrissae serve a wide variety of important behavioral functions, including navigation, climbing, wake following, anemotaxis, and social interactions. To date, however, there are few studies that compare the morphology of vibrissal arrays across species, or that describe the arrangement of the vibrissae relative to other facial sensory structures. The few studies that do exist have exploited the whiskers’ grid-like arrangement to quantify array morphology in terms of row and column identity. However, relying on whisker identity poses a challenge for comparative research because different species have different numbers and arrangements of whiskers. The present work introduces an approach to quantify vibrissal array morphology regardless of the number of rows and columns, and to quantify the array’s location relative to other sensory structures. We use the three-dimensional locations of the whisker basepoints as fundamental parameters to generate equations describing the length, curvature, and orientation of each whisker. Results show that in the rat, whisker length varies exponentially across the array, and that a hard limit on intrinsic curvature constrains the whisker height-to-length ratio. Whiskers are oriented to “fan out” approximately equally in dorsal-ventral and rostral-caudal directions. Quantifying positions of the other sensory structures relative to the whisker basepoints shows remarkable alignment to the somatosensory cortical homunculus, an alignment that would not occur for other choices of coordinate systems (e.g., centered on the midpoint of the eyes). We anticipate that the quantification of facial…

  17. The comprehension and production of quantifiers in isiXhosa-speaking Grade 1 learners

    PubMed Central

    Southwood, Frenette

    2016-01-01

    Background Quantifiers form part of the discourse-internal linguistic devices that children need to access and produce narratives and other classroom discourse. Little is known about the development - especially the production - of quantifiers in child language, specifically in speakers of an African language. Objectives The study aimed to ascertain how well Grade 1 isiXhosa first language (L1) learners perform at the beginning and at the end of Grade 1 on quantifier comprehension and production tasks. Method Two low-socioeconomic groups of L1 isiXhosa learners with either isiXhosa or English as language of learning and teaching (LOLT) were tested in February and November of their Grade 1 year with tasks targeting several quantifiers. Results The isiXhosa LOLT group comprehended no/none, any and all fully by either February or November of Grade 1, and they produced all assessed quantifiers in February of Grade 1. For the English LOLT group, neither the comprehension nor the production of quantifiers was mastered by the end of Grade 1, although there was a significant increase in both their comprehension and production scores. Conclusion The English LOLT group made significant progress in the comprehension and production of quantifiers, but still performed worse than peers who had their L1 as LOLT. Generally, children with no or very little prior knowledge of the LOLT need either (1) more deliberate exposure to quantifier-rich language or (2) longer exposure to general classroom language before quantifiers can be expected to be mastered sufficiently to allow access to quantifier-related curriculum content. PMID:27245132

  18. Primary Overcorrection of the Unilateral Cleft Nasal Deformity: Quantifying the Results.

    PubMed

    Lonic, Daniel; Morris, David E; Lo, Lun-Jou

    2016-02-01

    Because primary nasal correction at the time of lip repair has been incorporated into the treatment approach, many patients have benefitted from this combined procedure. However, primary nasal correction cannot guarantee an excellent result. Although overcorrection has been mentioned as a treatment rationale for the unilateral cleft lip nasal deformity, a detailed approach and quantitative evidence for the rationale are rare. This study evaluates whether overcorrection in the primary repair results in a quantitative improvement in nasal appearance. In this retrospective study, the inclusion criteria were patients with complete unilateral cleft lip and palate who underwent primary lip and nose repair by the age of 3 to 4 months. Primary nasal overcorrection was achieved by application of a muscle to septal base suture, an alar cinching suture, and the Tajima reversed-U incision method. Patients were further divided into an overcorrected (n = 19) and a nonovercorrected group (n = 19). The following parameters were identified on basilar photos of all patients taken at least 12 months after repair; ratios of cleft to noncleft side were taken for each patient and the mean for each parameter calculated: Ac angle (ACA/ACA'), alar height (AH/AH'), alar width (AW/AW'), nostril height (NH/NH'), nostril width (NW/NW'), and columellar deviation from the midline (CD/NW). The means of the overcorrected and nonovercorrected groups were then compared using the t test. Of all investigated measurements, alar height (AH/AH': overcorrected, 0.983 vs nonovercorrected, 0.941; P = 0.03) and nostril height ratio (NH/NH': overcorrected, 0.897 vs nonovercorrected, 0.680; P = 0.003) showed statistically significant differences favoring the overcorrected group at least 12 months after surgery. Primary nasal overcorrection including a muscle to columella base suture, alar cinch suture, and the Tajima method resulted in quantitatively more long-term symmetric alae and nostril height compared to…

  19. Detecting and Quantifying Topography in Neural Maps

    PubMed Central

    Yarrow, Stuart; Razak, Khaleel A.; Seitz, Aaron R.; Seriès, Peggy

    2014-01-01

    Topographic maps are an often-encountered feature in the brains of many species, yet there are no standard, objective procedures for quantifying topography. Topographic maps are typically identified and described subjectively, but in cases where the scale of the map is close to the resolution limit of the measurement technique, identifying the presence of a topographic map can be a challenging subjective task. In such cases, an objective topography detection test would be advantageous. To address these issues, we assessed seven measures (Pearson distance correlation, Spearman distance correlation, Zrehen's measure, topographic product, topological correlation, path length and wiring length) by quantifying topography in three classes of cortical map model: linear, orientation-like, and clusters. We found that all but one of these measures were effective at detecting statistically significant topography even in weakly-ordered maps, based on simulated noisy measurements of neuronal selectivity and sparse sampling of the maps. We demonstrate the practical applicability of these measures by using them to examine the arrangement of spatial cue selectivity in pallid bat A1. This analysis shows that significantly topographic arrangements of interaural intensity difference and azimuth selectivity exist at the scale of individual binaural clusters. PMID:24505279
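
    The simplest of the seven measures, Pearson distance correlation, correlates pairwise distances in cortical space with pairwise distances in stimulus-selectivity space; a permutation test then provides the objective topography detection the abstract calls for. A minimal sketch on synthetic data:

      import numpy as np
      from scipy.spatial.distance import pdist
      from scipy.stats import pearsonr

      def distance_correlation(cortical_xy, selectivity):
          """Pearson correlation between pairwise cortical and stimulus distances."""
          return pearsonr(pdist(cortical_xy), pdist(selectivity))[0]

      rng = np.random.default_rng(2)
      xy = rng.random((60, 2))  # recording positions on the cortical surface
      sel = xy @ np.array([[1.0], [0.3]]) + 0.05 * rng.standard_normal((60, 1))

      r_obs = distance_correlation(xy, sel)
      # Permutation test: shuffle the map assignment to build a null distribution.
      null = [distance_correlation(xy, rng.permutation(sel)) for _ in range(999)]
      p = (1 + sum(n >= r_obs for n in null)) / 1000
      print(f"r={r_obs:.2f}, p={p:.3f}")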

  20. Quantifying Qualitative Learning.

    ERIC Educational Resources Information Center

    Bogus, Barbara

    1995-01-01

    A teacher at an alternative school for at-risk students discusses the development of student assessment that increases students' self-esteem, convinces students that learning is fun, and prepares students to return to traditional school settings. She found that allowing students to participate in the assessment process successfully quantified the…

  1. Process-morphology scaling relations quantify self-organization in capillary densified nanofiber arrays.

    PubMed

    Kaiser, Ashley L; Stein, Itai Y; Cui, Kehang; Wardle, Brian L

    2018-02-07

    Capillary-mediated densification is an inexpensive and versatile approach to tune the application-specific properties and packing morphology of bulk nanofiber (NF) arrays, such as aligned carbon nanotubes. While NF length governs elasto-capillary self-assembly, the geometry of cellular patterns formed by capillary densified NFs cannot be precisely predicted by existing theories. This originates from the recently quantified orders of magnitude lower than expected NF array effective axial elastic modulus (E), and here we show via parametric experimentation and modeling that E determines the width, area, and wall thickness of the resulting cellular pattern. Both experiments and models show that further tuning of the cellular pattern is possible by altering the NF-substrate adhesion strength, which could enable the broad use of this facile approach to predictably pattern NF arrays for high value applications.

  2. Native trees show conservative water use relative to invasive: results from a removal experiment in a Hawaiian wet forest

    Treesearch

    M.A. Cavaleri; R. Ostertag; S. Cordell; L. Sack

    2014-01-01

    While the supply of freshwater is expected to decline in many regions in the coming decades, invasive plant species, often 'high water spenders', are greatly expanding their ranges worldwide. In this study, we quantified the ecohydrological differences between native and invasive trees and also the effects of woody invasive removal on plot-level water use in...

  3. Quantifiers are incrementally interpreted in context, more than less

    PubMed Central

    Urbach, Thomas P.; DeLong, Katherine A.; Kutas, Marta

    2015-01-01

    Language interpretation is often assumed to be incremental. However, our studies of quantifier expressions in isolated sentences found N400 event-related brain potential (ERP) evidence for partial but not full immediate quantifier interpretation (Urbach & Kutas, 2010). Here we tested similar quantifier expressions in pragmatically supporting discourse contexts (Alex was an unusual toddler. Most/Few kids prefer sweets/vegetables…) while participants made plausibility judgments (Experiment 1) or read for comprehension (Experiment 2). Control Experiments 3A (plausibility) and 3B (comprehension) removed the discourse contexts. Quantifiers always modulated typical and/or atypical word N400 amplitudes. However, only the real-time N400 effects in Experiment 2 mirrored the offline quantifier and typicality crossover interaction effects for plausibility ratings and cloze probabilities. We conclude that quantifier expressions can be interpreted fully and immediately, though pragmatic and task variables appear to impact the speed and/or depth of quantifier interpretation. PMID:26005285

  4. Quantifying Proportional Variability

    PubMed Central

    Heath, Joel P.; Borowski, Peter

    2013-01-01

    Real quantities can undergo such a wide variety of dynamics that the mean is often a meaningless reference point for measuring variability. Despite their widespread application, techniques like the Coefficient of Variation are not truly proportional and exhibit pathological properties. The non-parametric measure Proportional Variability (PV) [1] resolves these issues and provides a robust way to summarize and compare variation in quantities exhibiting diverse dynamical behaviour. Instead of being based on deviation from an average value, variation is simply quantified by comparing the numbers to each other, requiring no assumptions about central tendency or underlying statistical distributions. While PV has been introduced before and has already been applied in various contexts to population dynamics, here we present a deeper analysis of this new measure, derive analytical expressions for the PV of several general distributions and present new comparisons with the Coefficient of Variation, demonstrating cases in which PV is the more favorable measure. We show that PV provides an easily interpretable approach for measuring and comparing variation that can be generally applied throughout the sciences, from contexts ranging from stock market stability to climate variation. PMID:24386334
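
    PV compares the values of a series directly to one another: for each pair, take 1 - min/max, then average over all pairs. A minimal implementation, assuming strictly positive values as the measure requires:

      from itertools import combinations

      def proportional_variability(z):
          """PV of Heath & Borowski: mean over all pairs of 1 - min/max.

          Assumes all values are positive; PV ranges from 0 (no variation) to 1.
          """
          pairs = list(combinations(z, 2))
          return sum(1 - min(a, b) / max(a, b) for a, b in pairs) / len(pairs)

      print(proportional_variability([10, 12, 11, 13, 10]))   # low variability
      print(proportional_variability([1, 50, 3, 120, 7]))     # high variability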

  5. Crisis of Japanese Vascular Flora Shown By Quantifying Extinction Risks for 1618 Taxa

    PubMed Central

    Kadoya, Taku; Takenaka, Akio; Ishihama, Fumiko; Fujita, Taku; Ogawa, Makoto; Katsuyama, Teruo; Kadono, Yasuro; Kawakubo, Nobumitsu; Serizawa, Shunsuke; Takahashi, Hideki; Takamiya, Masayuki; Fujii, Shinji; Matsuda, Hiroyuki; Muneda, Kazuo; Yokota, Masatsugu; Yonekura, Koji; Yahara, Tetsukazu

    2014-01-01

    Although many people have expressed alarm that we are witnessing a mass extinction, few projections have been quantified, owing to limited availability of time-series data on threatened organisms, especially plants. To quantify the risk of extinction, we need to monitor changes in population size over time for as many species as possible. Here, we present the world's first quantitative projection of plant species loss at a national level, with stochastic simulations based on the results of population censuses of 1618 threatened plant taxa in 3574 map cells of ca. 100 km2. More than 500 lay botanists helped monitor those taxa in 1994–1995 and in 2003–2004. We projected that between 370 and 561 vascular plant taxa will go extinct in Japan during the next century if past trends of population decline continue. This extinction rate is approximately two to three times the global rate. Using time-series data, we show that existing national protected areas (PAs) covering ca. 7% of Japan will not adequately prevent population declines: even core PAs can protect at best <60% of local populations from decline. Thus, the Aichi Biodiversity Target to expand PAs to 17% of land (and inland water) areas, as committed to by many national governments, is not enough: only 29.2% of currently threatened species will become non-threatened under the assumption that probability of protection success by PAs is 0.5, which our assessment shows is realistic. In countries where volunteers can be organized to monitor threatened taxa, censuses using our method should be able to quantify how fast we are losing species and to assess how effective current conservation measures such as PAs are in preventing species extinction. PMID:24922311

  6. Quantifying Evaporation in a Permeable Pavement System

    EPA Science Inventory

    Studies quantifying evaporation from permeable pavement systems are limited to a few laboratory studies and one field application. This research quantifies evaporation for a larger-scale field application by measuring the water balance from lined permeable pavement sections. Th...

  7. Quantifying the sensitivity of post-glacial sea level change to laterally varying viscosity

    NASA Astrophysics Data System (ADS)

    Crawford, Ophelia; Al-Attar, David; Tromp, Jeroen; Mitrovica, Jerry X.; Austermann, Jacqueline; Lau, Harriet C. P.

    2018-05-01

    We present a method for calculating the derivatives of measurements of glacial isostatic adjustment (GIA) with respect to the viscosity structure of the Earth and the ice sheet history. These derivatives, or kernels, quantify the linearised sensitivity of measurements to the underlying model parameters. The adjoint method is used to enable efficient calculation of theoretically exact sensitivity kernels within laterally heterogeneous earth models that can have a range of linear or non-linear viscoelastic rheologies. We first present a new approach to calculate GIA in the time domain, which, in contrast to the more usual formulation in the Laplace domain, is well suited to continuously varying earth models and to the use of the adjoint method. Benchmarking results show excellent agreement between our formulation and previous methods. We illustrate the potential applications of the kernels calculated in this way through a range of numerical calculations relative to a spherically symmetric background model. The complex spatial patterns of the sensitivities are not intuitive, and this is the first time that such effects are quantified in an efficient and accurate manner.

  8. A comparative analysis of alternative approaches for quantifying nonlinear dynamics in cardiovascular system.

    PubMed

    Chen, Yun; Yang, Hui

    2013-01-01

    Heart rate variability (HRV) analysis has emerged as an important research topic for evaluating autonomic cardiac function. However, traditional time- and frequency-domain analyses characterize and quantify only linear and stationary phenomena. In the present investigation, we made a comparative analysis of three alternative approaches (i.e., wavelet multifractal analysis, Lyapunov exponents and multiscale entropy analysis) for quantifying nonlinear dynamics in heart rate time series. Note that these extracted nonlinear features provide information about nonlinear scaling behaviors and the complexity of cardiac systems. To evaluate the performance, we used 24-hour HRV recordings from 54 healthy subjects and 29 heart failure patients, available in PhysioNet. The three nonlinear methods are evaluated not only individually but also in combination using three classification algorithms, i.e., linear discriminant analysis, quadratic discriminant analysis and k-nearest neighbors. Experimental results show that the three nonlinear methods capture nonlinear dynamics from different perspectives and that the combined feature set achieves the best performance, i.e., sensitivity 97.7% and specificity 91.5%. Collectively, nonlinear HRV features are shown to have promise for identifying disorders in autonomic cardiovascular function.
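
    Of the three nonlinear features, multiscale entropy is the most compact to sketch: coarse-grain the RR-interval series at successive scales and compute sample entropy at each. The parameter choices below (m = 2, r = 0.2 x SD) are the conventional defaults, which the abstract does not specify:

      import numpy as np

      def sample_entropy(x, m=2, r_factor=0.2):
          """SampEn(m, r): -ln(A/B), with A and B the (m+1)- and m-length match counts."""
          x = np.asarray(x, dtype=float)
          r = r_factor * x.std()
          n_templates = len(x) - m
          def matches(length):
              t = np.array([x[i:i + length] for i in range(n_templates)])
              d = np.max(np.abs(t[:, None] - t[None, :]), axis=2)  # Chebyshev distance
              return (np.sum(d <= r) - n_templates) / 2            # exclude self-matches
          b, a = matches(m), matches(m + 1)
          return -np.log(a / b) if a > 0 and b > 0 else float("inf")

      def multiscale_entropy(x, max_scale=5):
          """Coarse-grain by non-overlapping means, then SampEn at each scale."""
          x = np.asarray(x, dtype=float)
          return [sample_entropy(x[:len(x) // s * s].reshape(-1, s).mean(axis=1))
                  for s in range(1, max_scale + 1)]

      rr = np.random.default_rng(3).normal(0.8, 0.05, 1000)  # synthetic RR intervals (s)
      print(np.round(multiscale_entropy(rr), 2))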

  9. Quantifying the Urban and Rural Nutrient Fluxes to Lake Erie Using a Paired Watershed Approach

    NASA Astrophysics Data System (ADS)

    Hopkins, M.; Beck, M.; Rossi, E.; Luh, N.; Allen-King, R. M.; Lowry, C.

    2016-12-01

    Excess nutrients have a detrimental impact on the water quality of Lake Erie, specifically nitrate and phosphate, which can lead to toxic algae blooms. Algae blooms have negatively impacted Lake Erie, which is the main source of drinking water for many coastal Great Lake communities. In 2014 the city of Toledo, Ohio was forced to shut down its water treatment plant due to these toxic algae blooms. The objective of this research is to quantify surface water nutrient fluxes to the eastern basin of Lake Erie using a paired watershed approach. Three different western New York watersheds that feed Lake Erie were chosen based on land use and areal extent: one small urban, one small rural, and one large rural. These paired watersheds were chosen to represent a range of sources of potential nutrient loading to the lake. Biweekly water samples were taken from the streams during the 2015-2016 winter-to-summer seasonal transition to quantify the effects of springtime snowmelt on nutrient fluxes. These results were compared to the previous year's samples, collected over the summer of 2015, which represented wetter conditions. Phosphorus levels were assessed using the ascorbic acid colorimetric assay, while nitrate was analyzed by anion-exchange chromatography. Stream gaging was used to obtain flow measurements and establish a rating curve, which was then used to quantify seasonal nutrient fluxes entering the lake. Patterns in the nutrient levels show higher levels of nutrients in the rural watersheds, with a decrease in concentration over the winter-to-spring transition. However, the urban stream shows a relatively constant nutrient flux, independent of seasonal transition or stream discharge. A comparison of wet and dry seasons shows higher nutrient concentrations during summers with greater rainfall. By identifying the largest contributors of each nutrient, we can better allocate limited attenuation resources.
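
    The flux computation itself is straightforward once the rating curve is established: convert stage to discharge, multiply concentration by discharge, and sum over the sampling intervals. A sketch assuming a hypothetical power-law rating curve:

      import numpy as np

      def discharge_from_stage(stage_m, a=2.1, b=1.6):
          """Hypothetical power-law rating curve Q = a * h^b (m^3/s)."""
          return a * stage_m ** b

      def seasonal_flux_kg(stage_m, conc_mg_per_l, dt_s):
          """Nutrient flux as the sum of C * Q over sampling intervals.

          conc in mg/L (== g/m^3) and Q in m^3/s give C*Q in g/s; dt_s and the
          final division convert the total to kg.
          """
          q = discharge_from_stage(np.asarray(stage_m))
          return np.sum(np.asarray(conc_mg_per_l) * q * dt_s) / 1000.0

      # Biweekly samples: stage (m), nitrate (mg/L), 14-day interval in seconds.
      print(f"{seasonal_flux_kg([0.4, 0.7, 1.1], [1.8, 2.4, 1.2], 14*86400):.0f} kg")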

  10. Incremental generation of answers during the comprehension of questions with quantifiers.

    PubMed

    Bott, Oliver; Augurzky, Petra; Sternefeld, Wolfgang; Ulrich, Rolf

    2017-09-01

    The paper presents a study on the online interpretation of quantified questions involving complex domain restriction, for instance, are all triangles blue that are in the circle. Two probe reaction time (RT) task experiments were conducted to study the incremental nature of answer generation while manipulating visual contexts and response hand overlap between tasks. We manipulated the contexts in such a way that the incremental answer to the question changed from 'yes' to 'no' or remained the same before and after encountering the extraposed relative clause. The findings of both experiments provide evidence for incremental answer preparation but only if the context did not involve the risk of answer revision. Our results show that preliminary output from incremental semantic interpretation results in response priming that facilitates congruent responses in the probe RT task.

  11. Quantifying the underlying landscape and paths of cancer

    PubMed Central

    Li, Chunhe; Wang, Jin

    2014-01-01

    Cancer is a disease regulated by the underlying gene networks. The emergence of normal and cancer states as well as the transformation between them can be thought of as a result of the gene network interactions and associated changes. We developed a global potential landscape and path framework to quantify cancer and associated processes. We constructed a cancer gene regulatory network based on the experimental evidences and uncovered the underlying landscape. The resulting tristable landscape characterizes important biological states: normal, cancer and apoptosis. The landscape topography in terms of barrier heights between stable state attractors quantifies the global stability of the cancer network system. We propose two mechanisms of cancerization: one is by the changes of landscape topography through the changes in regulation strengths of the gene networks. The other is by the fluctuations that help the system to go over the critical barrier at fixed landscape topography. The kinetic paths from least action principle quantify the transition processes among normal state, cancer state and apoptosis state. The kinetic rates provide the quantification of transition speeds among normal, cancer and apoptosis attractors. By the global sensitivity analysis of the gene network parameters on the landscape topography, we uncovered some key gene regulations determining the transitions between cancer and normal states. This can be used to guide the design of new anti-cancer tactics, through cocktail strategy of targeting multiple key regulation links simultaneously, for preventing cancer occurrence or transforming the early cancer state back to normal state. PMID:25232051

  12. Quantifying multiple telecouplings using an integrated suite of spatially-explicit tools

    NASA Astrophysics Data System (ADS)

    Tonini, F.; Liu, J.

    2016-12-01

    Telecoupling is an interdisciplinary research umbrella concept that enables natural and social scientists to understand and generate information for managing how humans and nature can sustainably coexist worldwide. To systematically study telecoupling, it is essential to build a comprehensive set of spatially-explicit tools for describing and quantifying multiple reciprocal socioeconomic and environmental interactions between a focal area and other areas. Here we introduce the Telecoupling Toolbox, a new free and open-source set of tools developed to map and identify the five major interrelated components of the telecoupling framework: systems, flows, agents, causes, and effects. The modular design of the toolbox allows the integration of existing tools and software (e.g. InVEST) to assess synergies and tradeoffs associated with policies and other local to global interventions. We show applications of the toolbox using a number of representative studies that address a variety of scientific and management issues related to telecouplings throughout the world. The results suggest that the toolbox can thoroughly map and quantify multiple telecouplings under various contexts while providing users with an easy-to-use interface. It provides a powerful platform to address globally important issues, such as land use and land cover change, species invasion, migration, flows of ecosystem services, and international trade of goods and products.

  13. Quantifying transfer after perceptual-motor sequence learning: how inflexible is implicit learning?

    PubMed

    Sanchez, Daniel J; Yarnik, Eric N; Reber, Paul J

    2015-03-01

    Studies of implicit perceptual-motor sequence learning have often shown learning to be inflexibly tied to the training conditions during learning. Since sequence learning is seen as a model task of skill acquisition, limits on the ability to transfer knowledge from the training context to a performance context indicate important constraints on skill learning approaches. Lack of transfer across contexts has been demonstrated by showing that changing task elements after training disrupts performance. These results have typically been taken as suggesting that the sequence knowledge relies on integrated representations across task elements (Abrahamse, Jiménez, Verwey, & Clegg, Psychon Bull Rev 17:603-623, 2010a). Using a relatively new sequence learning task, serial interception sequence learning, three experiments are reported that quantify the magnitude of performance disruption after selectively manipulating individual aspects of motor performance or perceptual information. In Experiment 1, selective disruption of the timing or order of sequential actions was examined using a novel response manipulandum that allowed for separate analysis of these two motor response components. In Experiments 2 and 3, transfer was examined after selective disruption of perceptual information that left the motor response sequence intact. All three experiments provided quantifiable estimates of partial transfer to novel contexts that suggest some level of information integration across task elements. However, the ability to identify quantifiable levels of successful transfer indicates that integration is not all-or-none and that measurement sensitivity is key to understanding sequence knowledge representations.

  14. Quantifying transfer after perceptual-motor sequence learning: how inflexible is implicit learning?

    PubMed Central

    Sanchez, Daniel J.; Yarnik, Eric N.

    2015-01-01

    Studies of implicit perceptual-motor sequence learning have often shown learning to be inflexibly tied to the training conditions during learning. Since sequence learning is seen as a model task of skill acquisition, limits on the ability to transfer knowledge from the training context to a performance context indicate important constraints on skill learning approaches. Lack of transfer across contexts has been demonstrated by showing that changing task elements after training disrupts performance. These results have typically been taken as suggesting that the sequence knowledge relies on integrated representations across task elements (Abrahamse, Jiménez, Verwey, & Clegg, Psychon Bull Rev 17:603–623, 2010a). Using a relatively new sequence learning task, serial interception sequence learning, three experiments are reported that quantify the magnitude of performance disruption after selectively manipulating individual aspects of motor performance or perceptual information. In Experiment 1, selective disruption of the timing or order of sequential actions was examined using a novel response manipulandum that allowed for separate analysis of these two motor response components. In Experiments 2 and 3, transfer was examined after selective disruption of perceptual information that left the motor response sequence intact. All three experiments provided quantifiable estimates of partial transfer to novel contexts that suggest some level of information integration across task elements. However, the ability to identify quantifiable levels of successful transfer indicates that integration is not all-or-none and that measurement sensitivity is key to understanding sequence knowledge representations. PMID:24668505

  15. Using multilevel models to quantify heterogeneity in resource selection

    USGS Publications Warehouse

    Wagner, Tyler; Diefenbach, Duane R.; Christensen, Sonja; Norton, Andrew S.

    2011-01-01

    Models of resource selection are being used increasingly to predict or model the effects of management actions rather than simply quantifying habitat selection. Multilevel, or hierarchical, models are an increasingly popular method to analyze animal resource selection because they impose a relatively weak stochastic constraint to model heterogeneity in habitat use and also account for unequal sample sizes among individuals. However, few studies have used multilevel models to model coefficients as a function of predictors that may influence habitat use at different scales or quantify differences in resource selection among groups. We used an example with white-tailed deer (Odocoileus virginianus) to illustrate how to model resource use as a function of distance to road that varies among deer by road density at the home range scale. We found that deer avoidance of roads decreased as road density increased. Also, we used multilevel models with sika deer (Cervus nippon) and white-tailed deer to examine whether resource selection differed between species. We failed to detect differences in resource use between these two species and showed how information-theoretic and graphical measures can be used to assess how resource use may have differed. Multilevel models can improve our understanding of how resource selection varies among individuals and provides an objective, quantifiable approach to assess differences or changes in resource selection.
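
    A multilevel resource-selection model of the kind described, with a per-animal random slope for distance to road and a home-range-scale covariate moderating that slope, can be sketched with statsmodels. All variable names and coefficients are hypothetical, and a logistic GLMM would be the more usual choice for binary use data; mixedlm gives a linear-probability sketch of the multilevel structure:

      import numpy as np
      import pandas as pd
      import statsmodels.formula.api as smf

      # Hypothetical used/available points for 20 deer.
      rng = np.random.default_rng(4)
      n = 2000
      df = pd.DataFrame({
          "deer_id": rng.integers(0, 20, n),
          "dist_road": rng.exponential(500.0, n),      # metres to nearest road
      })
      df["road_density"] = 0.5 + 0.1 * df["deer_id"]   # home-range-scale covariate
      logit = -1.0 + 0.002 * df["dist_road"] * (1.0 - 0.5 * df["road_density"])
      df["used"] = (rng.random(n) < 1.0 / (1.0 + np.exp(-logit))).astype(float)

      # Cross-level interaction: road density moderates the distance-to-road slope;
      # re_formula adds a per-deer random slope for distance to road.
      model = smf.mixedlm("used ~ dist_road * road_density", df,
                          groups="deer_id", re_formula="~dist_road")
      print(model.fit().summary())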

  16. Quantifying prosthetic gait deviation using simple outcome measures

    PubMed Central

    Kark, Lauren; Odell, Ross; McIntosh, Andrew S; Simmons, Anne

    2016-01-01

    AIM: To develop a subset of simple outcome measures to quantify prosthetic gait deviation without needing three-dimensional gait analysis (3DGA). METHODS: Eight unilateral, transfemoral amputees and 12 unilateral, transtibial amputees were recruited. Twenty-eight able-bodied controls were recruited. All participants underwent 3DGA, the timed-up-and-go test and the six-minute walk test (6MWT). The lower-limb amputees also completed the Prosthesis Evaluation Questionnaire. Results from 3DGA were summarised using the gait deviation index (GDI), which was subsequently regressed, using stepwise regression, against the other measures. RESULTS: Step-length (SL), self-selected walking speed (SSWS) and the distance walked during the 6MWT (6MWD) were significantly correlated with GDI. The 6MWD was the strongest, single predictor of the GDI, followed by SL and SSWS. The predictive ability of the regression equations were improved following inclusion of self-report data related to mobility and prosthetic utility. CONCLUSION: This study offers a practicable alternative to quantifying kinematic deviation without the need to conduct complete 3DGA. PMID:27335814
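
    The end product of such an analysis is an ordinary regression of GDI on the retained simple measures. A sketch of that final model with hypothetical per-participant data (the study's data are not reproduced here):

      import pandas as pd
      import statsmodels.formula.api as smf

      # Hypothetical per-participant outcome measures.
      df = pd.DataFrame({
          "gdi":  [72, 81, 88, 95, 78, 85, 90, 99, 76, 83],
          "d6mw": [310, 420, 480, 560, 350, 450, 500, 610, 330, 430],  # 6MWD (m)
          "sl":   [0.52, 0.60, 0.66, 0.72, 0.55, 0.62, 0.68, 0.75, 0.53, 0.61],
          "ssws": [0.9, 1.1, 1.2, 1.4, 1.0, 1.15, 1.25, 1.5, 0.95, 1.1],
      })

      # Stepwise selection would retain a subset of predictors; here all three
      # candidate measures are entered for illustration.
      fit = smf.ols("gdi ~ d6mw + sl + ssws", df).fit()
      print(fit.summary().tables[1])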

  17. Quantifying Neonatal Sucking Performance: Promise of New Methods

    PubMed Central

    Capilouto, Gilson J.; Cunningham, Tommy J.; Mullineaux, David R.; Tamilia, Eleonora; Papadelis, Christos; Giannone, Peter J.

    2017-01-01

    Neonatal feeding has been traditionally understudied so guidelines and evidence-based support for common feeding practices are limited. A major contributing factor to the paucity of evidence-based practice in this area has been the lack of simple-to-use, low-cost tools for monitoring sucking performance. We describe new methods for quantifying neonatal sucking performance that hold significant clinical and research promise. We present early results from an ongoing study investigating neonatal sucking as a marker of risk for adverse neurodevelopmental outcomes. We include quantitative measures of sucking performance to better understand how movement variability evolves during skill acquisition. Results showed the coefficient of variation of suck duration was significantly different between preterm neonates at high risk for developmental concerns (HRPT) and preterm neonates at low risk for developmental concerns (LRPT). For HRPT, results indicated the coefficient of variation of suck smoothness increased from initial feeding to discharge and remained significantly greater than healthy full-term newborns (FT) at discharge. There was no significant difference in our measures between FT and LRPT at discharge. Our findings highlight the need to include neonatal sucking assessment as part of routine clinical care in order to capture the relative risk of adverse neurodevelopmental outcomes at discharge. PMID:28324904

  18. Quantifying Neonatal Sucking Performance: Promise of New Methods.

    PubMed

    Capilouto, Gilson J; Cunningham, Tommy J; Mullineaux, David R; Tamilia, Eleonora; Papadelis, Christos; Giannone, Peter J

    2017-04-01

    Neonatal feeding has been traditionally understudied so guidelines and evidence-based support for common feeding practices are limited. A major contributing factor to the paucity of evidence-based practice in this area has been the lack of simple-to-use, low-cost tools for monitoring sucking performance. We describe new methods for quantifying neonatal sucking performance that hold significant clinical and research promise. We present early results from an ongoing study investigating neonatal sucking as a marker of risk for adverse neurodevelopmental outcomes. We include quantitative measures of sucking performance to better understand how movement variability evolves during skill acquisition. Results showed the coefficient of variation of suck duration was significantly different between preterm neonates at high risk for developmental concerns (HRPT) and preterm neonates at low risk for developmental concerns (LRPT). For HRPT, results indicated the coefficient of variation of suck smoothness increased from initial feeding to discharge and remained significantly greater than healthy full-term newborns (FT) at discharge. There was no significant difference in our measures between FT and LRPT at discharge. Our findings highlight the need to include neonatal sucking assessment as part of routine clinical care in order to capture the relative risk of adverse neurodevelopmental outcomes at discharge.

  19. Entropy Transfer between Residue Pairs and Allostery in Proteins: Quantifying Allosteric Communication in Ubiquitin.

    PubMed

    Hacisuleyman, Aysima; Erman, Burak

    2017-01-01

    It has recently been proposed by Gunasekaran et al. that allostery may be an intrinsic property of all proteins. Here, we develop a computational method that can determine and quantify allosteric activity in any given protein. Based on Schreiber's transfer entropy formulation, our approach leads to an information transfer landscape for the protein that shows the presence of entropy sinks and sources and explains how pairs of residues communicate with each other using entropy transfer. The model can identify the residues that drive the fluctuations of others. We apply the model to Ubiquitin, whose allosteric activity has not been emphasized until recently, and show that there are indeed systematic pathways of entropy and information transfer between residues that correlate well with the activities of the protein. We use 600-nanosecond molecular dynamics trajectories for Ubiquitin and its complex with human polymerase iota and evaluate entropy transfer between all pairs of residues of Ubiquitin and quantify the binding susceptibility changes upon complex formation. We explain the complex formation propensities of Ubiquitin in terms of entropy transfer. Important residues taking part in allosteric communication in Ubiquitin predicted by our approach are in agreement with results of NMR relaxation dispersion experiments. Finally, we show that the time-delayed correlation of fluctuations of two interacting residues possesses an intrinsic causality that tells which residue controls the interaction and which one is controlled. Our work shows that time-delayed correlations, entropy transfer and causality are the required new concepts for explaining allosteric communication in proteins.
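
    Schreiber's transfer entropy from residue j to residue i measures how much the past of j reduces uncertainty about the next state of i beyond what i's own past provides. A minimal discrete estimator over binned scalar fluctuation series; the equal-frequency binning is an assumption, as the paper works directly from MD trajectories:

      import numpy as np
      from collections import Counter

      def transfer_entropy(x, y, bins=4):
          """TE(y -> x) = I(x_{t+1}; y_t | x_t) from two scalar time series."""
          def symbolise(s):
              ranks = np.argsort(np.argsort(s))        # equal-frequency binning
              return ranks * bins // len(s)
          xs, ys = symbolise(np.asarray(x)), symbolise(np.asarray(y))
          triples = Counter(zip(xs[1:], xs[:-1], ys[:-1]))
          pairs_xx = Counter(zip(xs[1:], xs[:-1]))
          pairs_xy = Counter(zip(xs[:-1], ys[:-1]))
          singles = Counter(xs[:-1])
          n = sum(triples.values())
          te = 0.0
          for (x1, x0, y0), c in triples.items():
              p_xxy = c / n
              # log of p(x1,x0,y0) p(x0) / (p(x1,x0) p(x0,y0))
              te += p_xxy * np.log2(p_xxy * singles[x0] / n
                                    / (pairs_xx[(x1, x0)] / n)
                                    / (pairs_xy[(x0, y0)] / n))
          return te

      rng = np.random.default_rng(5)
      y = rng.standard_normal(5000)
      x = np.roll(y, 1) + 0.5 * rng.standard_normal(5000)  # x driven by y's past
      print(f"TE(y->x) = {transfer_entropy(x, y):.3f} bits")
      print(f"TE(x->y) = {transfer_entropy(y, x):.3f} bits")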

  20. Entropy Transfer between Residue Pairs and Allostery in Proteins: Quantifying Allosteric Communication in Ubiquitin

    PubMed Central

    2017-01-01

    It has recently been proposed by Gunasekaran et al. that allostery may be an intrinsic property of all proteins. Here, we develop a computational method that can determine and quantify allosteric activity in any given protein. Based on Schreiber's transfer entropy formulation, our approach leads to an information transfer landscape for the protein that shows the presence of entropy sinks and sources and explains how pairs of residues communicate with each other using entropy transfer. The model can identify the residues that drive the fluctuations of others. We apply the model to Ubiquitin, whose allosteric activity has not been emphasized until recently, and show that there are indeed systematic pathways of entropy and information transfer between residues that correlate well with the activities of the protein. We use 600-nanosecond molecular dynamics trajectories for Ubiquitin and its complex with human polymerase iota and evaluate entropy transfer between all pairs of residues of Ubiquitin and quantify the binding susceptibility changes upon complex formation. We explain the complex formation propensities of Ubiquitin in terms of entropy transfer. Important residues taking part in allosteric communication in Ubiquitin predicted by our approach are in agreement with results of NMR relaxation dispersion experiments. Finally, we show that the time-delayed correlation of fluctuations of two interacting residues possesses an intrinsic causality that tells which residue controls the interaction and which one is controlled. Our work shows that time-delayed correlations, entropy transfer and causality are the required new concepts for explaining allosteric communication in proteins. PMID:28095404

  1. Quantifying site-specific physical heterogeneity within an estuarine seascape

    USGS Publications Warehouse

    Kennedy, Cristina G.; Mather, Martha E.; Smith, Joseph M.

    2017-01-01

    Quantifying physical heterogeneity is essential for meaningful ecological research and effective resource management. Spatial patterns of multiple, co-occurring physical features are rarely quantified across a seascape because of methodological challenges. Here, we identified approaches that measured total site-specific heterogeneity, an often overlooked aspect of estuarine ecosystems. Specifically, we examined 23 metrics that quantified four types of common physical features: (1) river and creek confluences, (2) bathymetric variation including underwater drop-offs, (3) land features such as islands/sandbars, and (4) major underwater channel networks. Our research at 40 sites throughout Plum Island Estuary (PIE) provided solutions to two problems. The first problem was that individual metrics that measured heterogeneity of a single physical feature showed different regional patterns. We solved this first problem by combining multiple metrics for a single feature using a within-physical feature cluster analysis. With this approach, we identified sites with four different types of confluences and three different types of underwater drop-offs. The second problem was that when multiple physical features co-occurred, new patterns of total site-specific heterogeneity were created across the seascape. This pattern of total heterogeneity has potential ecological relevance to structure-oriented predators. To address this second problem, we identified sites with similar types of total physical heterogeneity using an across-physical feature cluster analysis. Then, we calculated an additive heterogeneity index, which integrated all physical features at a site. Finally, we tested if site-specific additive heterogeneity index values differed for across-physical feature clusters. In PIE, the sites with the highest additive heterogeneity index values were clustered together and corresponded to sites where a fish predator, adult striped bass (Morone saxatilis), aggregated in a…

  2. Quantifying Groundwater Model Uncertainty

    NASA Astrophysics Data System (ADS)

    Hill, M. C.; Poeter, E.; Foglia, L.

    2007-12-01

    …approach is attainable through universal model analysis software such as UCODE-2005, PEST, and joint use of these programs, which allow many aspects of a model to be defined as parameters. (2) Use highly parameterized models to quantify aspects of (e). While promising, this approach implicitly includes parameterizations that may be considered unreasonable if investigated explicitly, so that resulting measures of uncertainty may be too large. (3) Use a combination of inferential and global methods that can be facilitated using the new software MMA (Multi-Model Analysis), which is constructed using the JUPITER API. Here we consider issues related to the model discrimination criteria calculated by MMA.

  3. Quantifying quantum coherence with quantum Fisher information.

    PubMed

    Feng, X N; Wei, L F

    2017-11-14

    Quantum coherence is one of the oldest and most important concepts in quantum mechanics, and it is now regarded as a necessary resource for quantum information processing and quantum metrology. However, the question of how to quantify quantum coherence has received attention only recently (see, e.g., Baumgratz et al., PRL 113, 140401 (2014)). In this paper we verify that the well-known quantum Fisher information (QFI) can be utilized to quantify quantum coherence, as it satisfies monotonicity under the typical incoherent operations and convexity under mixing of quantum states. Unlike most purely axiomatic methods, quantifying quantum coherence by QFI is experimentally testable, as the bound of the QFI is practically measurable. The validity of our proposal is specifically demonstrated with the typical phase-damping and depolarizing evolution processes of a generic single-qubit state, and also by comparison with other previously proposed quantifying methods.
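
    For a state rho with eigendecomposition {p_i, |i>} and generator A, the QFI has the closed form F_Q = 2 sum over (i, j) with p_i + p_j > 0 of (p_i - p_j)^2 |<i|A|j>|^2 / (p_i + p_j). A minimal numerical check that this quantity decreases under phase damping of a single qubit; the generator sigma_z/2 and the damping parameterisation are assumed choices, not the paper's:

      import numpy as np

      def qfi(rho, a):
          """Quantum Fisher information of state rho for generator a."""
          p, vecs = np.linalg.eigh(rho)
          f = 0.0
          for i in range(len(p)):
              for j in range(len(p)):
                  if p[i] + p[j] > 1e-12:
                      amp = vecs[:, i].conj() @ a @ vecs[:, j]
                      f += 2 * (p[i] - p[j]) ** 2 / (p[i] + p[j]) * abs(amp) ** 2
          return f

      sz = np.array([[1, 0], [0, -1]], dtype=complex) / 2
      plus = np.array([[0.5, 0.5], [0.5, 0.5]], dtype=complex)   # |+><+|

      # Phase damping shrinks the off-diagonal coherences; the QFI-based
      # coherence quantifier should decrease accordingly.
      for gamma in (0.0, 0.3, 0.6, 0.9):
          rho = plus.copy()
          rho[0, 1] *= (1 - gamma); rho[1, 0] *= (1 - gamma)
          print(f"gamma={gamma:.1f}  QFI={qfi(rho, sz):.3f}")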

  4. Resolving and quantifying overlapped chromatographic bands by transmutation

    PubMed

    Malinowski

    2000-09-15

    A new chemometric technique called "transmutation" is developed for the purpose of sharpening overlapped chromatographic bands in order to quantify the components. The "transmutation function" is created from the chromatogram of the pure component of interest, obtained from the same instrument, operating under the same experimental conditions used to record the unresolved chromatogram of the sample mixture. The method is used to quantify mixtures containing toluene, ethylbenzene, m-xylene, naphthalene, and biphenyl from unresolved chromatograms previously reported. The results are compared to those obtained using window factor analysis, rank annihilation factor analysis, and matrix regression analysis. Unlike the latter methods, the transmutation method is not restricted to two-dimensional arrays of data, such as those obtained from HPLC/DAD, but is also applicable to chromatograms obtained from single detector experiments. Limitations of the method are discussed.

  5. Analyzing complex networks evolution through Information Theory quantifiers

    NASA Astrophysics Data System (ADS)

    Carpi, Laura C.; Rosso, Osvaldo A.; Saco, Patricia M.; Ravetti, Martín Gómez

    2011-01-01

    A methodology to analyze dynamical changes in complex networks based on Information Theory quantifiers is proposed. The square root of the Jensen-Shannon divergence, a measure of dissimilarity between two probability distributions, and the MPR statistical complexity are used to quantify states in the network evolution process. Three cases are analyzed: the Watts-Strogatz model, a gene network during the progression of Alzheimer's disease, and a climate network for the Tropical Pacific region to study El Niño/Southern Oscillation (ENSO) dynamics. We find that the proposed quantifiers are able not only to capture changes in the dynamics of the processes but also to quantify and compare states in their evolution.
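
    For readers wanting to try the dissimilarity measure, the square root of the Jensen-Shannon divergence is straightforward to compute. A minimal sketch follows; the toy degree distributions are invented for illustration and are not from the paper.

```python
import numpy as np

def js_distance(p, q):
    """Square root of the Jensen-Shannon divergence (base-2 logs): a bounded
    metric between two probability distributions."""
    p = np.asarray(p, float); q = np.asarray(q, float)
    p, q = p / p.sum(), q / q.sum()
    m = 0.5 * (p + q)
    def kl(a, b):
        nz = a > 0                      # 0 * log 0 = 0 by convention
        return np.sum(a[nz] * np.log2(a[nz] / b[nz]))
    return np.sqrt(0.5 * kl(p, m) + 0.5 * kl(q, m))

# Invented example: degree distributions of a ring lattice before and after
# Watts-Strogatz-style rewiring (degrees 2..6).
p_ring = [0.0, 0.0, 1.0, 0.0, 0.0]      # every node has degree 4
p_rewired = [0.05, 0.20, 0.50, 0.20, 0.05]
print(js_distance(p_ring, p_rewired))   # 0 = identical, 1 = maximally different
```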

  6. Quantifying CO2 Emissions From Individual Power Plants From Space

    NASA Astrophysics Data System (ADS)

    Nassar, Ray; Hill, Timothy G.; McLinden, Chris A.; Wunch, Debra; Jones, Dylan B. A.; Crisp, David

    2017-10-01

    In order to better manage anthropogenic CO2 emissions, improved methods of quantifying emissions are needed at all spatial scales from the national level down to the facility level. Although the Orbiting Carbon Observatory 2 (OCO-2) satellite was not designed for monitoring power plant emissions, we show that in some cases, CO2 observations from OCO-2 can be used to quantify daily CO2 emissions from individual middle- to large-sized coal power plants by fitting the data to plume model simulations. Emission estimates for U.S. power plants are within 1-17% of reported daily emission values, enabling application of the approach to international sites that lack detailed emission information. This affirms that a constellation of future CO2 imaging satellites, optimized for point sources, could monitor emissions from individual power plants to support the implementation of climate policies.

  7. Effect of soil structure on the growth of bacteria in soil quantified using CARD-FISH

    NASA Astrophysics Data System (ADS)

    Juyal, Archana; Eickhorst, Thilo; Falconer, Ruth; Otten, Wilfred

    2014-05-01

    It has been reported that compaction of soil due to the use of heavy machinery results in reduced crop yield. Compaction affects the physical properties of soil such as bulk density, soil strength and porosity. This causes an alteration in soil structure which limits the mobility of nutrients, water and air infiltration and root penetration in soil. Several studies have been conducted to explore the effect of soil compaction on plant growth and development. However, there is scant information on the effect of soil compaction on the microbial community and its activities in soil. Understanding the effect of soil compaction on the microbial community is essential, as microbial activities are very sensitive to abrupt environmental changes in soil. Therefore, the aim of this work was to investigate the effect of soil structure on the growth of bacteria in soil. The bulk density of soil was used as a soil physical parameter to quantify the effect of soil compaction. To detect and quantify bacteria in soil, the method of catalyzed reporter deposition-fluorescence in situ hybridization (CARD-FISH) was used. This technique yields high-intensity fluorescent signals which make it easy to quantify bacteria against the high levels of autofluorescence emitted by soil particles and organic matter. In this study, the bacterial strains Pseudomonas fluorescens SBW25 and Bacillus subtilis DSM10 were used. Soils of aggregate size 1-2 mm were packed at five different bulk densities in polyethylene rings (4.25 cm³). The soil rings were sampled on four different days. Results showed that total bacterial counts were reduced significantly (P

  8. Quantifying Uncertainties in Land Surface Microwave Emissivity Retrievals

    NASA Technical Reports Server (NTRS)

    Tian, Yudong; Peters-Lidard, Christa D.; Harrison, Kenneth W.; Prigent, Catherine; Norouzi, Hamidreza; Aires, Filipe; Boukabara, Sid-Ahmed; Furuzawa, Fumie A.; Masunaga, Hirohiko

    2012-01-01

    Uncertainties in the retrievals of microwave land surface emissivities were quantified over two types of land surfaces: desert and tropical rainforest. Retrievals from satellite-based microwave imagers, including SSM/I, TMI and AMSR-E, were studied. Our results show that there are considerable differences between the retrievals from different sensors and from different groups over these two land surface types. In addition, the mean emissivity values show different spectral behavior across the frequencies. With the true emissivity assumed largely constant over both of the two sites throughout the study period, the differences are largely attributed to the systematic and random errors in the retrievals. Generally these retrievals tend to agree better at lower frequencies than at higher ones, with systematic differences ranging 1-4% (3-12 K) over desert and 1-7% (3-20 K) over rainforest. The random errors within each retrieval dataset are in the range of 0.5-2% (2-6 K). In particular, at 85.5/89.0 GHz, there are very large differences between the different retrieval datasets, and within each retrieval dataset itself. Further investigation reveals that these differences are most likely caused by rain/cloud contamination, which can lead to random errors up to 10-17 K under the most severe conditions.

  9. Gains and Pitfalls of Quantifier Elimination as a Teaching Tool

    ERIC Educational Resources Information Center

    Oldenburg, Reinhard

    2015-01-01

    Quantifier Elimination is a procedure that allows simplification of logical formulas that contain quantifiers. Many mathematical concepts are defined in terms of quantifiers and especially in calculus their use has been identified as an obstacle in the learning process. The automatic deduction provided by quantifier elimination thus allows…

  10. Deaf Learners' Knowledge of English Universal Quantifiers

    ERIC Educational Resources Information Center

    Berent, Gerald P.; Kelly, Ronald R.; Porter, Jeffrey E.; Fonzi, Judith

    2008-01-01

    Deaf and hearing students' knowledge of English sentences containing universal quantifiers was compared through their performance on a 50-item, multiple-picture task that required students to decide whether each of five pictures represented a possible meaning of a target sentence. The task assessed fundamental knowledge of quantifier sentences,…

  11. Quantifying gait patterns in Parkinson's disease

    NASA Astrophysics Data System (ADS)

    Romero, Mónica; Atehortúa, Angélica; Romero, Eduardo

    2017-11-01

    Parkinson's disease (PD) is characterized by a set of motor symptoms, namely tremor, rigidity, and bradykinesia, which are usually described but not quantified. This work proposes an objective characterization of PD gait patterns by approximating the single stance phase as a single grounded pendulum. This model estimates, from gait data, the force generated during single support. This force describes the motion pattern at different stages of the disease. The model was validated using recorded videos of 8 young control subjects, 10 old control subjects and 10 subjects with Parkinson's disease at different stages. The estimated force showed differences among stages of Parkinson's disease, decreasing in the advanced stages of the illness.

  12. Seasonal Variations of Quantified Organic Compounds in PM10 over Seoul

    NASA Astrophysics Data System (ADS)

    Choi, N.; Lee, J.; Kim, Y. P.

    2014-12-01

    The concentrations of 87 individual organic compounds in PM10 samples, systematically collected on the roof of the School of Public Health building at Seoul National University (a mixed commercial and residential area), Seoul, South Korea on a daily basis from April 2010 to April 2011, were quantified by means of gas chromatography/mass spectrometry (GC/MS). The daily average concentrations of five organic groups, alkanes, PAHs, fatty acids, DCAs, and sugars, ranged from 498.40 ng m⁻³ to 10.20 μg m⁻³. The seasonal concentrations of the total quantified organic species were 1.73 μg m⁻³ (spring), 2.04 μg m⁻³ (summer), 3.11 μg m⁻³ (fall), and 3.60 μg m⁻³ (winter). All the organic groups showed higher average concentrations in winter than in summer; however, some individual compounds among the fatty acids, DCAs, and sugars showed the reverse pattern. The seasonal concentration patterns and episodic variation of individual organic compounds were studied to clarify the emission characteristics of organic matter in PM10.

  13. Cross-linguistic patterns in the acquisition of quantifiers.

    PubMed

    Katsos, Napoleon; Cummins, Chris; Ezeizabarrena, Maria-José; Gavarró, Anna; Kuvač Kraljević, Jelena; Hrzica, Gordana; Grohmann, Kleanthes K; Skordi, Athina; Jensen de López, Kristine; Sundahl, Lone; van Hout, Angeliek; Hollebrandse, Bart; Overweg, Jessica; Faber, Myrthe; van Koert, Margreet; Smith, Nafsika; Vija, Maigi; Zupping, Sirli; Kunnari, Sari; Morisseau, Tiffany; Rusieshvili, Manana; Yatsushiro, Kazuko; Fengler, Anja; Varlokosta, Spyridoula; Konstantzou, Katerina; Farby, Shira; Guasti, Maria Teresa; Vernice, Mirta; Okabe, Reiko; Isobe, Miwa; Crosthwaite, Peter; Hong, Yoonjee; Balčiūnienė, Ingrida; Ahmad Nizar, Yanti Marina; Grech, Helen; Gatt, Daniela; Cheong, Win Nee; Asbjørnsen, Arve; Torkildsen, Janne von Koss; Haman, Ewa; Miękisz, Aneta; Gagarina, Natalia; Puzanova, Julia; Anđelković, Darinka; Savić, Maja; Jošić, Smiljana; Slančová, Daniela; Kapalková, Svetlana; Barberán, Tania; Özge, Duygu; Hassan, Saima; Chan, Cecilia Yuet Hung; Okubo, Tomoya; van der Lely, Heather; Sauerland, Uli; Noveck, Ira

    2016-08-16

    Learners of most languages are faced with the task of acquiring words to talk about number and quantity. Much is known about the order of acquisition of number words as well as the cognitive and perceptual systems and cultural practices that shape it. Substantially less is known about the acquisition of quantifiers. Here, we consider the extent to which systems and practices that support number word acquisition can be applied to quantifier acquisition and conclude that the two domains are largely distinct in this respect. Consequently, we hypothesize that the acquisition of quantifiers is constrained by a set of factors related to each quantifier's specific meaning. We investigate competence with the expressions for "all," "none," "some," "some…not," and "most" in 31 languages, representing 11 language types, by testing 768 5-y-old children and 536 adults. We found a cross-linguistically similar order of acquisition of quantifiers, explicable in terms of four factors relating to their meaning and use. In addition, exploratory analyses reveal that language- and learner-specific factors, such as negative concord and gender, are significant predictors of variation.

  14. Quantifying protein-protein interactions in high throughput using protein domain microarrays.

    PubMed

    Kaushansky, Alexis; Allen, John E; Gordus, Andrew; Stiffler, Michael A; Karp, Ethan S; Chang, Bryan H; MacBeath, Gavin

    2010-04-01

    Protein microarrays provide an efficient way to identify and quantify protein-protein interactions in high throughput. One drawback of this technique is that proteins show a broad range of physicochemical properties and are often difficult to produce recombinantly. To circumvent these problems, we have focused on families of protein interaction domains. Here we provide protocols for constructing microarrays of protein interaction domains in individual wells of 96-well microtiter plates, and for quantifying domain-peptide interactions in high throughput using fluorescently labeled synthetic peptides. As specific examples, we will describe the construction of microarrays of virtually every human Src homology 2 (SH2) and phosphotyrosine binding (PTB) domain, as well as microarrays of mouse PDZ domains, all produced recombinantly in Escherichia coli. For domains that mediate high-affinity interactions, such as SH2 and PTB domains, equilibrium dissociation constants (KDs) for their peptide ligands can be measured directly on arrays by obtaining saturation binding curves. For weaker binding domains, such as PDZ domains, arrays are best used to identify candidate interactions, which are then retested and quantified by fluorescence polarization. Overall, protein domain microarrays provide the ability to rapidly identify and quantify protein-ligand interactions with minimal sample consumption. Because entire domain families can be interrogated simultaneously, they provide a powerful way to assess binding selectivity on a proteome-wide scale and provide an unbiased perspective on the connectivity of protein-protein interaction networks.

  15. Quantifying microwear on experimental Mistassini quartzite scrapers: preliminary results of exploratory research using LSCM and scale-sensitive fractal analysis.

    PubMed

    Stemp, W James; Lerner, Harry J; Kristant, Elaine H

    2013-01-01

    Although previous use-wear studies involving quartz and quartzite have been undertaken by archaeologists, these are comparatively few in number. Moreover, there has been relatively little effort to quantify use-wear on stone tools made from quartzite. The purpose of this article is to determine the effectiveness of a measurement system, laser scanning confocal microscopy (LSCM), to document the surface roughness or texture of experimental Mistassini quartzite scrapers used on two different contact materials (fresh and dry deer hide). As in previous studies using LSCM on chert, flint, and obsidian, this exploratory study incorporates a mathematical algorithm that permits the discrimination of surface roughness based on comparisons at multiple scales. Specifically, we employ measures of relative area (RelA) coupled with the F-test to discriminate used from unused stone tool surfaces, as well as surfaces of quartzite scrapers used on dry and fresh deer hide. Our results further demonstrate the effect of raw material variation on use-wear formation and its documentation using LSCM and RelA. © Wiley Periodicals, Inc.

  16. Incorporating both physical and kinetic limitations in quantifying dissolved oxygen flux to aquatic sediments

    USGS Publications Warehouse

    O'Connor, B.L.; Hondzo, Miki; Harvey, J.W.

    2009-01-01

    Traditionally, dissolved oxygen (DO) fluxes have been calculated using the thin-film theory with DO microstructure data in systems characterized by fine sediments and low velocities. However, recent experimental evidence of fluctuating DO concentrations near the sediment-water interface suggests that turbulence and coherent motions control the mass transfer, and the surface renewal theory gives a more mechanistic model for quantifying fluxes. Both models involve quantifying the mass transfer coefficient (k) and the relevant concentration difference (ΔC). This study compared several empirical models for quantifying k based on both thin-film and surface renewal theories, and presents a new method for quantifying ΔC (dynamic approach) that is consistent with the observed DO concentration fluctuations near the interface. Data were used from a series of flume experiments that include both physical and kinetic uptake limitations of the flux. Results indicated that methods for quantifying k and ΔC using the surface renewal theory better estimated the DO flux across a range of fluid-flow conditions. © 2009 ASCE.
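
    The two theories named above give different expressions for k, and the flux follows as J = k·ΔC in both cases. A minimal sketch, with made-up but physically plausible parameter values rather than the flume data:

```python
import numpy as np

# Made-up but plausible parameter values; not the flume data.
D = 2.1e-9        # molecular diffusivity of O2 in water, m^2/s
delta = 1.0e-3    # diffusive boundary-layer thickness, m (thin-film theory)
s = 0.05          # surface-renewal rate, 1/s (surface renewal theory)
dC = 0.25         # concentration difference Delta-C, g/m^3

k_film = D / delta             # thin-film:        k = D / delta
k_renewal = np.sqrt(D * s)     # surface renewal:  k = sqrt(D * s)

for name, k in (("thin-film", k_film), ("surface renewal", k_renewal)):
    print(f"{name:16s} k = {k:.2e} m/s   J = k*dC = {k * dC:.2e} g m^-2 s^-1")
```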

  17. Quantifying effects of cyclic stretch on cell-collagen substrate adhesiveness of vascular endothelial cells.

    PubMed

    Omidvar, Ramin; Tafazzoli-Shadpour, Mohammad; Mahmoodi-Nobar, Farbod; Azadi, Shohreh; Khani, Mohammad-Mehdi

    2018-05-01

    Vascular endothelium is continuously subjected to mechanical stimulation in the form of shear forces due to blood flow as well as tensile forces as a consequence of blood pressure. Such stimuli influence endothelial behavior and regulate cell-tissue interaction for optimized functionality. This study aimed to quantify the influence of cyclic stretch on the adhesive properties and stiffness of endothelial cells. A 10% cyclic stretch at a frequency of 1 Hz was applied to a layer of endothelial cells cultured on a polydimethylsiloxane substrate. Cell-substrate adhesion of endothelial cells was examined by the novel approach of atomic force microscope-based single-cell force spectroscopy, and cell stiffness was measured by atomic force microscopy. Furthermore, the adhesive molecular bonds were evaluated using modified Hertz contact theory. Our results show that the overall adhesion of endothelial cells to the substrate decreased after cyclic stretch while the cells became stiffer. Based on the experimental results and theoretical modeling, the decrease in the number of molecular bonds after cyclic stretch was quantified. In conclusion, in vitro cyclic stretch altered both the adhesive capacity and the elastic modulus of endothelial cells through mechanotransductive pathways, two major determinants of the function of these cells within the cardiovascular system.

  18. Using Hyperpolarized 129Xe MRI to Quantify the Pulmonary Ventilation Distribution

    PubMed Central

    He, Mu; Driehuys, Bastiaan; Que, Loretta G.; Huang, Yuh-Chin T.

    2017-01-01

    Background: Ventilation heterogeneity is impossible to detect with spirometry. Alternatively, pulmonary ventilation can be imaged 3-dimensionally using inhaled 129Xe MRI. To date such images have been quantified primarily based on ventilation defects. Here, we introduce a robust means to transform 129Xe MRI scans such that the underlying ventilation distribution and its heterogeneity can be quantified. Methods: Quantitative 129Xe ventilation MRI was conducted in 12 younger (24.7±5.2 yrs) and 10 older (62.2±7.2 yrs) healthy individuals, as well as 9 younger (25.9±6.4 yrs) and 10 older (63.2±6.1 yrs) asthmatics. The younger healthy population was used to establish a reference ventilation distribution and thresholds for 6 intensity bins. These were used to display and quantify regions of ventilation defect (VDR), low ventilation (LVR) and high ventilation (HVR). Results: The ventilation distribution in young subjects was roughly Gaussian with a mean and SD of 0.52±0.18, resulting in VDR=2.1±1.3%, LVR=15.6±5.4% and HVR=17.4±3.1%. Older healthy volunteers exhibited a significantly right-skewed distribution (0.46±0.20, p=0.034), resulting in significantly increased VDR (7.0±4.8%, p=0.008) and LVR (24.5±11.5%, p=0.025). In the asthmatics, VDR and LVR increased in the older population, and HVR was significantly reduced (13.5±4.6% vs 18.9±4.5%, p=0.009). Quantitative 129Xe MRI also revealed different ventilation distribution patterns in response to albuterol in two asthmatics with normal FEV1. Conclusions: Quantitative 129Xe MRI provides a robust and objective means to display and quantify the pulmonary ventilation distribution, even in subjects who have airway function impairment not appreciated by spirometry. PMID:27617823

  19. Quantifying individual performance in Cricket — A network analysis of batsmen and bowlers

    NASA Astrophysics Data System (ADS)

    Mukherjee, Satyam

    2014-01-01

    Quantifying individual performance in the game of Cricket is critical for team selection in international matches. The number of runs scored by batsmen and wickets taken by bowlers serves as a natural way of quantifying the performance of a cricketer. Traditionally, batsmen and bowlers are rated on their batting or bowling average respectively. However, in a game like Cricket the manner in which one scores runs or claims a wicket also matters: scoring runs against a strong bowling line-up or delivering a brilliant performance against a team with a strong batting line-up deserves more credit. A player's average is not able to capture this aspect of the game. In this paper we present a refined method to quantify the 'quality' of runs scored by a batsman or wickets taken by a bowler. We explore the application of Social Network Analysis (SNA) to rate players within a team performance. We generate a directed and weighted network of batsmen-bowlers using the player-vs-player information available for Test cricket and ODI cricket. Additionally, we generate a network of batsmen and bowlers based on the dismissal records of batsmen in the history of cricket: Test (1877-2011) and ODI (1971-2011). Our results show that M. Muralitharan is the most successful bowler in the history of Cricket. Our approach could potentially be applied in domestic matches to judge a player's performance, which in turn paves the way for balanced team selection for international matches.
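
    The abstract does not spell out the centrality measure applied to the batsman-bowler network, but a quality-weighted rating of this kind can be sketched with PageRank on a directed, weighted graph; the toy players and dismissal counts below are invented:

```python
import networkx as nx

# Toy directed, weighted network: an edge batsman -> bowler weighted by the
# number of dismissals. All names and counts are invented.
G = nx.DiGraph()
G.add_weighted_edges_from([
    ("Batsman A", "Bowler X", 5),
    ("Batsman A", "Bowler Y", 2),
    ("Batsman B", "Bowler X", 3),
    ("Batsman C", "Bowler Y", 4),
    ("Batsman C", "Bowler X", 1),
])

# PageRank-style rating: dismissing a highly rated batsman is worth more
# than dismissing a weak one, which is the 'quality' idea in the abstract.
rating = nx.pagerank(G, alpha=0.85, weight="weight")
for player, score in sorted(rating.items(), key=lambda kv: -kv[1]):
    print(f"{player}: {score:.3f}")
```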

  20. Quantifying consumption rates of dissolved oxygen along bed forms

    NASA Astrophysics Data System (ADS)

    Boano, Fulvio; De Falco, Natalie; Arnon, Shai

    2016-04-01

    Streambed interfaces represent hotspots for nutrient transformations because they host different microbial species, and the evaluation of these reaction rates is important to assess the fate of nutrients in riverine environments. In this work we analyze a series of flume experiments on oxygen demand in dune-shaped hyporheic sediments under losing and gaining flow conditions. We employ a new modeling code to quantify oxygen consumption rates from observed vertical profiles of oxygen concentration. The code accounts for transport by molecular diffusion and water advection, and automatically determines the reaction rates that provide the best fit between observed and modeled concentration values. The results show that reaction rates are not uniformly distributed across the streambed, in agreement with the expected behavior predicted by hyporheic exchange theory. Oxygen consumption was found to be highly influenced by the presence of gaining or losing flow conditions, which controlled the delivery of labile DOC to streambed microorganisms.

  1. Fuzzy Entropy Method for Quantifying Supply Chain Networks Complexity

    NASA Astrophysics Data System (ADS)

    Zhang, Jihui; Xu, Junqin

    Supply chains are a special kind of complex network. Their complexity and uncertainty make them very difficult to control and manage, and supply chains face a rising complexity of products, structures, and processes. Because of the strong link between a supply chain's complexity and its efficiency, supply chain complexity management has become a major challenge of today's business management. The aim of this paper is to quantify the complexity and organization level of an industrial network, working towards the development of a 'Supply Chain Network Analysis' (SCNA). By measuring flows of goods and interaction costs between different sectors of activity within the supply chain borders, a network of flows is built and subsequently investigated by network analysis. The results of this study show that our approach can provide an interesting conceptual perspective in which the modern supply network can be framed, and that network analysis can handle these issues in practice.

  2. Quantifying sleep architecture dynamics and individual differences using big data and Bayesian networks

    PubMed Central

    Shelton, Christian; Mednick, Sara C.

    2018-01-01

    The pattern of sleep stages across a night (sleep architecture) is influenced by biological, behavioral, and clinical variables. However, traditional measures of sleep architecture, such as stage proportions, fail to capture sleep dynamics. Here we quantify the impact of individual differences on the dynamics of sleep architecture and determine which factors, or sets of factors, best predict the next sleep stage from current stage information. We investigated the influence of age, sex, body mass index, time of day, and sleep time on static (e.g. minutes in stage, sleep efficiency) and dynamic measures of sleep architecture (e.g. transition probabilities and stage duration distributions) using a large dataset of 3202 nights from a non-clinical population. Multi-level regressions show that sex affects the duration of all Non-Rapid Eye Movement (NREM) stages, and age has a curvilinear relationship with Wake After Sleep Onset (WASO) and slow wave sleep (SWS) minutes. Bayesian network modeling reveals that sleep architecture depends on time of day, total sleep time, age and sex, but not BMI. Older adults, and particularly males, have shorter bouts (more fragmentation) of Stage 2 and SWS, and they transition less frequently to these stages. Additionally, we showed that the next sleep stage and its duration can be optimally predicted by the prior 2 stages and age. Our results demonstrate the potential benefit of big data and Bayesian network approaches in quantifying the static and dynamic architecture of normal sleep. PMID:29641599

  3. Quantifying sleep architecture dynamics and individual differences using big data and Bayesian networks.

    PubMed

    Yetton, Benjamin D; McDevitt, Elizabeth A; Cellini, Nicola; Shelton, Christian; Mednick, Sara C

    2018-01-01

    The pattern of sleep stages across a night (sleep architecture) is influenced by biological, behavioral, and clinical variables. However, traditional measures of sleep architecture, such as stage proportions, fail to capture sleep dynamics. Here we quantify the impact of individual differences on the dynamics of sleep architecture and determine which factors, or sets of factors, best predict the next sleep stage from current stage information. We investigated the influence of age, sex, body mass index, time of day, and sleep time on static (e.g. minutes in stage, sleep efficiency) and dynamic measures of sleep architecture (e.g. transition probabilities and stage duration distributions) using a large dataset of 3202 nights from a non-clinical population. Multi-level regressions show that sex affects the duration of all Non-Rapid Eye Movement (NREM) stages, and age has a curvilinear relationship with Wake After Sleep Onset (WASO) and slow wave sleep (SWS) minutes. Bayesian network modeling reveals that sleep architecture depends on time of day, total sleep time, age and sex, but not BMI. Older adults, and particularly males, have shorter bouts (more fragmentation) of Stage 2 and SWS, and they transition less frequently to these stages. Additionally, we showed that the next sleep stage and its duration can be optimally predicted by the prior 2 stages and age. Our results demonstrate the potential benefit of big data and Bayesian network approaches in quantifying the static and dynamic architecture of normal sleep.
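
    The headline finding, that the next stage is best predicted by the prior two stages, amounts to a second-order Markov model, which can be estimated by simple counting. A minimal sketch with an invented toy hypnogram (the paper's model also conditions on age):

```python
from collections import Counter, defaultdict

# Toy hypnogram (per-epoch stage labels); a real input would be one night
# of scored 30-s epochs.
hypnogram = ["W", "N1", "N2", "N3", "N2", "REM", "N2", "N3", "N2", "REM",
             "W", "N1", "N2", "N2", "N3", "N2", "REM", "REM", "W"]

# Count transitions conditioned on the two preceding stages.
counts = defaultdict(Counter)
for a, b, c in zip(hypnogram, hypnogram[1:], hypnogram[2:]):
    counts[(a, b)][c] += 1

def predict_next(prev2, prev1):
    """Most likely next stage given the two preceding stages."""
    dist = counts.get((prev2, prev1))
    return dist.most_common(1)[0][0] if dist else None

print(predict_next("N3", "N2"))   # -> 'REM' for this toy record
```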

  4. Skin collagen can be accurately quantified through noninvasive optical method: Validation on a swine study.

    PubMed

    Tzeng, S-Y; Kuo, T-Y; Hu, S-B; Chen, Y-W; Lin, Y-L; Chu, K-Y; Tseng, S-H

    2018-02-01

    Diffuse reflectance spectroscopy (DRS) is a noninvasive optical technology characterized by relatively low system cost and high efficiency. In our previous study, we quantified the relative collagen concentration for individual keloid patients; however, no absolute collagen concentration values were available to verify the reliability of collagen detection by our DRS system. Skin-mimicking phantoms were prepared using different collagen and coffee concentrations, and their chromophore concentrations were quantified using the DRS system to analyze the influence of collagen and other chromophores. Moreover, in an animal study we compared the DRS system with collagen evaluation of biopsy sections by second-harmonic generation (SHG) microscopy at four different skin sites. In the phantom study, the results showed that the coffee chromophore did not severely interfere with collagen concentration recovery. In the animal study, a positive correlation (r = .902) between the DRS system and collagen evaluation with SHG microscopy was found. We have demonstrated that the DRS system can quantify actual collagen concentrations and exclude the interference of other chromophores in skin-mimicking phantoms. Furthermore, a high positive correlation was found in the animal study with SHG microscopy. We consider DRS a promising technique for evaluating skin condition objectively. © 2017 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  5. Quantifying the evolution of flow boiling bubbles by statistical testing and image analysis: toward a general model.

    PubMed

    Xiao, Qingtai; Xu, Jianxin; Wang, Hua

    2016-08-16

    A new index, the estimate of the error variance, is proposed to quantify the evolution of flow patterns when multiphase components or tracers are difficult to distinguish. The homogeneity degree of the luminance space distribution behind the viewing windows in the direct contact boiling heat transfer process was explored. With image analysis and a linear statistical model, the F-test was used to test whether the light was uniform, and a non-linear method was used to determine the direction and position of a fixed light source. The experimental results showed that the inflection point of the new index was approximately equal to the mixing time. The new index was also extended and applied to a multiphase macro-mixing process by top blowing in a stirred tank. Moreover, a general quantifying model was introduced to describe the relationship between the flow patterns of the bubble swarms and heat transfer. The results can be applied to investigate other mixing processes in which the target is difficult to recognize.

  6. Quantifying the evolution of flow boiling bubbles by statistical testing and image analysis: toward a general model

    PubMed Central

    Xiao, Qingtai; Xu, Jianxin; Wang, Hua

    2016-01-01

    A new index, the estimate of the error variance, is proposed to quantify the evolution of flow patterns when multiphase components or tracers are difficult to distinguish. The homogeneity degree of the luminance space distribution behind the viewing windows in the direct contact boiling heat transfer process was explored. With image analysis and a linear statistical model, the F-test was used to test whether the light was uniform, and a non-linear method was used to determine the direction and position of a fixed light source. The experimental results showed that the inflection point of the new index was approximately equal to the mixing time. The new index was also extended and applied to a multiphase macro-mixing process by top blowing in a stirred tank. Moreover, a general quantifying model was introduced to describe the relationship between the flow patterns of the bubble swarms and heat transfer. The results can be applied to investigate other mixing processes in which the target is difficult to recognize. PMID:27527065

  7. A novel quantified bitterness evaluation model for traditional Chinese herbs based on an animal ethology principle.

    PubMed

    Han, Xue; Jiang, Hong; Han, Li; Xiong, Xi; He, Yanan; Fu, Chaomei; Xu, Runchun; Zhang, Dingkun; Lin, Junzhi; Yang, Ming

    2018-03-01

    Traditional Chinese herbs (TCH) are currently gaining attention in disease prevention and health care plans. However, their generally bitter taste hinders their use. Despite the development of a variety of taste evaluation methods, it remains a major challenge to establish a quantitative detection technique that is objective, authentic and sensitive. Based on the two-bottle preference test (TBP), we propose a novel quantitative strategy using a standardized animal test and a unified quantitative benchmark. To reduce variability in the results, the TBP methodology was optimized, and the relationship between quinine concentration and the animal preference index (PI) was obtained. The PI of each TCH was then measured through TBP, and bitterness results were converted into a unified numerical system using the concentration-PI relationship. To verify the authenticity and sensitivity of the quantified results, human sensory testing and electronic tongue testing were applied. The quantified results showed good discriminating ability; for example, the bitterness of Coptidis Rhizoma was equal to 0.0579 mg/mL quinine, and that of Nelumbinis Folium to 0.0001 mg/mL. The validation results proved that the new assessment method for TCH is objective and reliable. In conclusion, this study provides an option for the quantification of bitterness and the evaluation of taste-masking effects.

  8. Quantifying the role of forest soil and bedrock in the acid neutralization of surface water in steep hillslopes.

    PubMed

    Asano, Yuko; Uchida, Taro

    2005-02-01

    The role of soil and bedrock in acid neutralizing processes has been difficult to quantify because of hydrological and biogeochemical uncertainties. To quantify those roles, hydrochemical observations were conducted at two hydrologically well-defined, steep granitic hillslopes in the Tanakami Mountains of Japan. These paired hillslopes are similar except for their soils; Fudoji is leached of base cations (base saturation <6%), while Rachidani is covered with fresh soil (base saturation >30%), because the erosion rate is 100-1000 times greater. The results showed that (1) soil solution pH at the soil-bedrock interface at Fudoji (4.3) was significantly lower than that of Rachidani (5.5), (2) the hillslope discharge pH in both hillslopes was similar (6.7-6.8), and (3) at Fudoji, 60% of the base cations leaching from the hillslope were derived from bedrock, whereas only 20% were derived from bedrock in Rachidani. Further, previously published results showed that the stream pH could not be predicted from the acid deposition rate and soil base saturation status. These results demonstrate that bedrock plays an especially important role when the overlying soil has been leached of base cations. These results indicate that while the status of soil acidification is a first-order control on vulnerability to surface water acidification, in some cases such as at Fudoji, subsurface interaction with the bedrock determines the sensitivity of surface water to acidic deposition.

  9. Quantifying the buildup in extent and complexity of free exploration in mice

    PubMed Central

    Benjamini, Yoav; Fonio, Ehud; Galili, Tal; Havkin, Gregor Z.; Golani, Ilan

    2011-01-01

    To obtain a perspective on an animal's own functional world, we study its behavior in situations that allow the animal to regulate the growth rate of its behavior and provide us with the opportunity to quantify its moment-by-moment developmental dynamics. Thus, we are able to show that mouse exploratory behavior consists of sequences of repeated motion: iterative processes that increase in extent and complexity, whose presumed function is a systematic active management of input acquired during the exploration of a novel environment. We use this study to demonstrate our approach to quantifying behavior: targeting aspects of behavior that are shown to be actively managed by the animal, and using measures that are discriminative across strains and treatments and replicable across laboratories. PMID:21383149

  10. Quantifying Uncertainties in Land-Surface Microwave Emissivity Retrievals

    NASA Technical Reports Server (NTRS)

    Tian, Yudong; Peters-Lidard, Christa D.; Harrison, Kenneth W.; Prigent, Catherine; Norouzi, Hamidreza; Aires, Filipe; Boukabara, Sid-Ahmed; Furuzawa, Fumie A.; Masunaga, Hirohiko

    2013-01-01

    Uncertainties in the retrievals of microwave land-surface emissivities are quantified over two types of land surfaces: desert and tropical rainforest. Retrievals from satellite-based microwave imagers, including the Special Sensor Microwave Imager, the Tropical Rainfall Measuring Mission Microwave Imager, and the Advanced Microwave Scanning Radiometer for Earth Observing System, are studied. Our results show that there are considerable differences between the retrievals from different sensors and from different groups over these two land-surface types. In addition, the mean emissivity values show different spectral behavior across the frequencies. With the true emissivity assumed largely constant over both of the two sites throughout the study period, the differences are largely attributed to the systematic and random errors in the retrievals. Generally, these retrievals tend to agree better at lower frequencies than at higher ones, with systematic differences ranging 1%-4% (3-12 K) over desert and 1%-7% (3-20 K) over rainforest. The random errors within each retrieval dataset are in the range of 0.5%-2% (2-6 K). In particular, at 85.5/89.0 GHz, there are very large differences between the different retrieval datasets, and within each retrieval dataset itself. Further investigation reveals that these differences are most likely caused by rain/cloud contamination, which can lead to random errors up to 10-17 K under the most severe conditions.

  11. Quantifying noise in optical tweezers by Allan variance.

    PubMed

    Czerwinski, Fabian; Richardson, Andrew C; Oddershede, Lene B

    2009-07-20

    Much effort is put into minimizing noise in optical tweezers experiments because noise and drift can mask fundamental behaviours of, e.g., single-molecule assays. Various initiatives have been taken to reduce or eliminate noise, but it has been difficult to quantify their effect. We propose to use Allan variance as a simple and efficient method to quantify noise in optical tweezers setups. We apply the method to determine the optimal measurement time, frequency, and detection scheme, and quantify the effect of acoustic noise in the lab. The method can also be used on-the-fly to determine optimal parameters for running experiments.
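
    Allan variance itself is simple to compute: average the signal in consecutive bins of length τ and take half the mean squared difference of adjacent bin averages. A minimal sketch on synthetic data (white noise plus drift standing in for a bead-position signal; all parameters are invented):

```python
import numpy as np

def allan_variance(x, m):
    """Non-overlapping Allan variance at averaging window m samples:
    AVAR = 0.5 * <(mean_{i+1} - mean_i)^2> over consecutive bins."""
    n = len(x) // m
    bins = x[:n * m].reshape(n, m).mean(axis=1)
    return 0.5 * np.mean(np.diff(bins) ** 2)

# Synthetic stand-in for a bead-position signal: white noise plus slow drift
# (made-up parameters, not real tweezers data).
fs = 10_000.0                                   # sampling rate, Hz
t = np.arange(100_000) / fs
x = np.random.normal(0.0, 1.0, t.size) + 0.05 * t

# AVAR falls as ~1/tau while white noise dominates, then rises once drift
# takes over; the minimum marks the optimal measurement time.
for m in (10, 100, 1_000, 10_000):
    print(f"tau = {m / fs:7.3f} s   AVAR = {allan_variance(x, m):.4g}")
```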

  12. Quantifiers More or Less Quantify On-Line: ERP Evidence for Partial Incremental Interpretation

    ERIC Educational Resources Information Center

    Urbach, Thomas P.; Kutas, Marta

    2010-01-01

    Event-related brain potentials were recorded during RSVP reading to test the hypothesis that quantifier expressions are incrementally interpreted fully and immediately. In sentences tapping general knowledge ("Farmers grow crops/worms as their primary source of income"), Experiment 1 found larger N400s for atypical ("worms") than typical objects…

  13. Heart rate measurement as a tool to quantify sedentary behavior.

    PubMed

    Åkerberg, Anna; Koshmak, Gregory; Johansson, Anders; Lindén, Maria

    2015-01-01

    Sedentary work is very common today. The aim of this pilot study was to attempt to differentiate between typical work situations and to investigate the possibility of breaking sedentary behavior, based on physiological measurement among office workers. Ten test persons used one heart-rate-based activity monitor (Linkura), one pulse oximeter device (Wrist), and one movement-based activity wristband (Fitbit Flex) in different working situations. The results showed that both heart-rate devices, Linkura and Wrist, were able to detect differences in heart rate between the different working situations (resting, sitting, standing, slow walk, and medium-fast walk). The movement-based device, Fitbit Flex, was only able to separate differences in steps between slow walk and medium-fast walk. It can be concluded that heart rate measurement is a promising tool for quantifying and separating different working situations, such as sitting, standing, and walking.

  14. Analysis to Quantify Significant Contribution

    EPA Pesticide Factsheets

    This Technical Support Document provides information that supports EPA’s analysis to quantify upwind state emissions that significantly contribute to nonattainment or interfere with maintenance of National Ambient Air Quality Standards in downwind states.

  15. A Generalizable Methodology for Quantifying User Satisfaction

    NASA Astrophysics Data System (ADS)

    Huang, Te-Yuan; Chen, Kuan-Ta; Huang, Polly; Lei, Chin-Laung

    Quantifying user satisfaction is essential, because the results can help service providers deliver better services. In this work, we propose a generalizable methodology, based on survival analysis, to quantify user satisfaction in terms of session times, i.e., the length of time users stay with an application. Unlike subjective human surveys, our methodology is based solely on passive measurement, which is more cost-efficient and better able to capture subconscious reactions. Furthermore, by using session times, rather than a specific performance indicator, such as the level of distortion of voice signals, the effects of other factors, like loudness and sidetone, can also be captured by the developed models. Like survival analysis, our methodology is characterized by low complexity and a simple model-developing process. The feasibility of our methodology is demonstrated through case studies of ShenZhou Online, a commercial MMORPG in Taiwan, and the most prevalent VoIP application in the world, namely Skype. Through the model development process, we can also identify the most significant performance factors and their impacts on user satisfaction and discuss how they can be exploited to improve user experience and optimize resource allocation.
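
    As a sketch of the session-time idea, a Kaplan-Meier fit over (possibly censored) session durations is the natural first step; the paper develops full survival-analysis models, so treat this only as a starting point. The data below are synthetic:

```python
import numpy as np
from lifelines import KaplanMeierFitter

# Synthetic session durations in minutes; an 'event' is the user ending the
# session, and sessions still open at measurement end are right-censored.
rng = np.random.default_rng(0)
durations = rng.exponential(scale=30.0, size=200)
observed = rng.random(200) < 0.9            # ~10% censored

kmf = KaplanMeierFitter()
kmf.fit(durations, event_observed=observed, label="all sessions")
print(kmf.median_survival_time_)            # time by which half the users left
print(kmf.survival_function_.head())        # S(t): fraction still in session
```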

  16. Effect of stocking rate on growing juvenile sunshine bass in an outdoor biofloc production system: study shows potential to intensify production of sunshine bass fingerlings

    USDA-ARS?s Scientific Manuscript database

    A dose–response study was conducted in an outdoor biofloc system to begin quantifying the stocking rate production function for sunshine bass advanced fingerlings. Results showed the potential of the outdoor biofloc system to intensify production of advanced sunshine bass fingerlings, but feed cons...

  17. Quantifying the dilution effect for models in ecological epidemiology.

    PubMed

    Roberts, M G; Heesterbeek, J A P

    2018-03-01

    The dilution effect, where an increase in biodiversity results in a reduction in the prevalence of an infectious disease, has been the subject of speculation and controversy. Conversely, an amplification effect occurs when increased biodiversity is related to an increase in prevalence. We explore the conditions under which these effects arise, using multi-species compartmental models that integrate ecological and epidemiological interactions. We introduce three potential metrics for quantifying dilution and amplification: one based on infection prevalence in a focal host species, one based on the size of the infected subpopulation of that species, and one based on the basic reproduction number. We introduce our approach in the simplest epidemiological setting with two species, and show that the existence and strength of a dilution effect is influenced strongly by the choices made to describe the system and the metric used to gauge the effect. We show that our method can be generalized to any number of species and to more complicated ecological and epidemiological dynamics. Our method allows a rigorous analysis of ecological systems where dilution effects have been postulated, and contributes to future progress in understanding the phenomenon of dilution in the context of infectious disease dynamics and infection risk. © 2018 The Author(s).
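
    A minimal version of the two-species setting can be simulated directly. The frequency-dependent SI model and all parameter values below are illustrative assumptions rather than the authors' model, but they show the kind of question the prevalence-based metric addresses: how focal-host prevalence responds as a second, less competent host is added.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Transmission matrix beta[i, j]: rate at which species j infects species i.
# Species 2 is a poorly competent host. All values are invented.
beta = np.array([[0.8, 0.2],
                 [0.2, 0.1]])
gamma = 0.1          # recovery rate
N1 = 100.0           # fixed focal-host density

def si_rhs(t, I, N):
    S = N - I
    force = beta @ (I / N.sum())    # frequency-dependent force of infection
    return S * force - gamma * I

for N2 in (0.0, 50.0, 200.0):
    N = np.array([N1, max(N2, 1e-9)])
    sol = solve_ivp(si_rhs, (0.0, 2000.0), [1.0, 0.0], args=(N,), rtol=1e-8)
    print(f"N2 = {N2:5.0f}  focal-host prevalence = {sol.y[0, -1] / N1:.3f}")
# For these parameters, prevalence in the focal host falls as N2 grows,
# i.e., the prevalence-based metric would register a dilution effect.
```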

  18. Quantifying forest LAI succession in sub-tropical forests using time-series of Landsat data, 1987 -2015

    NASA Astrophysics Data System (ADS)

    Wu, Q.; Song, J.; Wang, J.; Chen, S.; Yu, B.; Liao, L.

    2016-12-01

    Monitoring the dynamics of leaf area index (LAI) throughout the life-cycle of forests (from seeding to maturity) is vital for simulating forest growth and quantifying carbon sequestration. However, all current global LAI products show low accuracy over forests, and their coarse spatial resolution (nearly 1 km) mismatches the spatial scale of forest inventory plots (nearly 26 m × 26 m). To date, several studies have explored the possibility of using satellite data to classify forest succession or predict stand age, and a few have explored the potential of long-term Landsat data to monitor forest growth trends, but no studies have quantified the inter-annual and intra-annual LAI dynamics along with forest succession. Vegetation indices are not perfect variables for quantifying forest foliage dynamics; Hallet (1995) suggested that remote sensing of biophysical characteristics should shift away from direct inference from vegetation indices toward more physically based algorithms. This work is intended as a pioneering example of improving the accuracy of forest LAI and providing temporally and spatially matched LAI datasets for monitoring forest processes. We integrate the Geometric-Optical and Radiative Transfer (GORT) model with the Physiological Principles Predicting Growth (3-PG) model to improve the estimation of forest canopy LAI dynamics. Reflectance time-series data from 1987 to 2015 were collected and preprocessed for forests in southern China, using all available Landsat data (with <80% cloud). Effective LAI and true LAI were field-measured to validate our results using various instruments, including digital hemispheric photographs (DHP), the LAI-2000 Plant Canopy Analyzer (LI-COR), and Tracing Radiation and Architecture of Canopies (TRAC). Results show that the relationship between spectral metrics of satellite images and forest LAI is clear in early stages before maturity. 3-PG provides an accurate inter-annual trend of forest LAI, while satellite images

  19. Quantifying and improving the efficiency of Gamma Knife treatment plans for brain metastases: results of a 1-year audit.

    PubMed

    Wright, Gavin; Hatfield, Paul; Loughrey, Carmel; Reiner, Beatrice; Bownes, Peter

    2014-12-01

    A method for quantifying the efficiency of Gamma Knife treatment plans for metastases was previously implemented by the authors to retrospectively identify the least efficient plans and has provided insights into improved planning strategies. The aim of the current work was to ascertain whether those insights led to improved treatment plans. Following completion of the initial study, a 1-year audit of metastasis plans created at St. James's Institute of Oncology was carried out. Audited recent plans were compared with the earlier plans of the initial study, in terms of their efficiency and dosimetric quality. The statistical significance of any differences between relevant plan parameters was quantified by Mann-Whitney U-tests. Comparisons were made between all plans and repeated for a reduced set of plans from which the smallest lesions treated with a single 4-mm shot were excluded. The plan parameters compared were a plan efficiency index (PEI), the number of shots, Paddick conformity index (PCI), gradient index (GI), and percent coverage (of the lesion by the prescription isodose). A total of 157 metastatic lesions were included in the audit and were compared with 241 in the initial study. In a comparison of all cases, the audited plans achieved a higher median PEI score than did the earlier plans from the initial study (1.08 vs 1.02), indicating improved efficiency of the audited plans. When the smallest lesions (for which there was little scope for varying plan strategy) were discounted, the improvement in median PEI score was greater (1.23 vs 1.03, p < 0.001). This improvement in efficiency corresponds to an estimated mean (maximum) time saving of 15% (66%) per lesion (11 minutes [64 minutes] on the day of treatment). The modified planning strategy yielding these efficiency improvements did not rely on the use of significantly fewer shots (median 11 vs 11 shots, p = 0.924), nor did it result in significant detriment to dosimetric quality (median coverage 99

  20. Quantifying seasonal variation of leaf area index using near-infrared digital camera in a rice paddy

    NASA Astrophysics Data System (ADS)

    Hwang, Y.; Ryu, Y.; Kim, J.

    2017-12-01

    Digital cameras have been widely used to quantify leaf area index (LAI), and numerous simple, automatic methods have been proposed to improve digital camera based LAI estimates. However, most studies in rice paddies have relied on arbitrary thresholds or complex radiative transfer models to make binary images, and only a few have reported continuous, automatic observation of LAI over the season in a rice paddy. The objective of this study is to quantify seasonal variations of LAI using raw near-infrared (NIR) images coupled with a histogram shape-based algorithm in a rice paddy. As vegetation strongly reflects NIR light, we installed a NIR digital camera 1.8 m above the ground surface and acquired unsaturated raw-format images at one-hour intervals at solar zenith angles between 15 and 80° over the entire growing season in 2016 (from May to September). We applied a sub-pixel classification combined with a light-scattering correction method. Finally, to confirm the accuracy of the quantified LAI, we also conducted direct (destructive sampling) and indirect (LAI-2200) manual observations of LAI once per ten days on average. Preliminary results show that NIR-derived LAI agreed well with in-situ observations, but divergence tended to appear once the rice canopy was fully developed. Continuous monitoring of LAI in rice paddies will help us to better understand carbon and water fluxes and to evaluate satellite-based LAI products.
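
    The final step of pipelines like this one is typically a gap-fraction inversion of the Beer-Lambert form LAI = −ln(P)/k. The abstract does not state the exact inversion used, so the sketch below, including the extinction coefficient, is an assumption:

```python
import numpy as np

def lai_from_gap_fraction(P, k=0.5):
    """Effective LAI from canopy gap fraction P via LAI = -ln(P) / k."""
    return -np.log(P) / k

# Illustrative gap fractions as the rice canopy closes over the season;
# the extinction coefficient k = 0.5 is an assumed, not calibrated, value.
for P in (0.8, 0.4, 0.1):
    print(f"gap fraction {P:.1f} -> LAI ~ {lai_from_gap_fraction(P):.2f}")
```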

  1. Quantifying the statistical complexity of low-frequency fluctuations in semiconductor lasers with optical feedback

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tiana-Alsina, J.; Torrent, M. C.; Masoller, C.

    Low-frequency fluctuations (LFFs) represent a dynamical instability that occurs in semiconductor lasers when they are operated near the lasing threshold and subject to moderate optical feedback. LFFs consist of sudden power dropouts followed by gradual, stepwise recoveries. We analyze experimental time series of intensity dropouts and quantify the complexity of the underlying dynamics employing two tools from information theory, namely, Shannon's entropy and the Martin, Plastino, and Rosso statistical complexity measure. These measures are computed using a method based on ordinal patterns, by which the relative length and ordering of consecutive interdropout intervals (i.e., the time intervals between consecutive intensity dropouts) are analyzed, disregarding the precise timing of the dropouts and the absolute durations of the interdropout intervals. We show that this methodology is suitable for quantifying subtle characteristics of the LFFs, and in particular the transition to fully developed chaos that takes place when the laser's pump current is increased. Our method shows that the statistical complexity of the laser does not increase continuously with the pump current, but levels off before reaching the coherence collapse regime. This behavior coincides with that of the first- and second-order correlations of the interdropout intervals, suggesting that these correlations, and not the chaotic behavior, are what determine the level of complexity of the laser's dynamics. These results hold for two different dynamical regimes, namely, sustained LFFs and coexistence between LFFs and steady-state emission.
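
    The ordinal-pattern machinery (Bandt-Pompe) reduces each short window of the interval series to the permutation that sorts it, then takes the Shannon entropy of the pattern distribution. A minimal sketch, with synthetic stand-ins for the interdropout intervals:

```python
import numpy as np
from itertools import permutations
from math import factorial, log

def permutation_entropy(x, d=3):
    """Normalized Shannon entropy of Bandt-Pompe ordinal patterns of
    embedding dimension d; 0 = fully ordered, 1 = fully random."""
    counts = {p: 0 for p in permutations(range(d))}
    for i in range(len(x) - d + 1):
        counts[tuple(int(k) for k in np.argsort(x[i:i + d]))] += 1
    probs = np.array([c for c in counts.values() if c > 0], float)
    probs /= probs.sum()
    return float(-np.sum(probs * np.log(probs)) / log(factorial(d)))

# Synthetic stand-ins for interdropout-interval series (not laser data).
x = np.empty(5000); x[0] = 0.4
for i in range(1, x.size):
    x[i] = 4.0 * x[i - 1] * (1.0 - x[i - 1])    # chaotic logistic map
noisy_periodic = np.sin(0.3 * np.arange(5000)) + 0.01 * np.random.normal(size=5000)

print(permutation_entropy(x))                # high, but below 1 (forbidden patterns)
print(permutation_entropy(noisy_periodic))   # lower: few ordinal patterns occur
```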

  2. Quantifying long-term human impact in contrasting environments: Statistical analysis of modern and fossil pollen records

    NASA Astrophysics Data System (ADS)

    Broothaerts, Nils; López-Sáez, José Antonio; Verstraeten, Gert

    2017-04-01

    Reconstructing and quantifying human impact is an important step toward understanding human-environment interactions in the past. Quantitative measures of human impact on the landscape are needed to fully understand the long-term influence of anthropogenic land cover changes on the global climate, ecosystems and geomorphic processes. Nevertheless, quantifying past human impact is not straightforward. Recently, multivariate statistical analyses of fossil pollen records have been proposed to characterize vegetation changes and to gain insight into past human impact. Although statistical analysis of fossil pollen data can provide useful insights into anthropogenically driven vegetation changes, it cannot by itself serve as an absolute quantification of past human impact. To overcome this shortcoming, in this study fossil pollen records were included in a multivariate statistical analysis (cluster analysis and non-metric multidimensional scaling (NMDS)) together with modern pollen data and modern vegetation data. The information in the modern pollen and vegetation dataset can be used to better interpret the representativeness of the fossil pollen records, and can yield a full quantification of human impact in the past. This methodology was applied in two contrasting environments: SW Turkey and Central Spain. For each region, fossil pollen data from different study sites were integrated together with modern pollen data and information on modern vegetation. In this way, arboreal cover, grazing pressure and agricultural activities in the past were reconstructed and quantified. The data from SW Turkey provide new integrated information on changing human impact through time in the Sagalassos territory, and show that human impact was most intense during the Hellenistic and Roman Period (ca. 2200-1750 cal a BP) and decreased and changed in nature afterwards. The data from central Spain show for several sites that arboreal cover decreases below 5% from the Feudal period

  3. Quantifying phalangeal curvature: an empirical comparison of alternative methods.

    PubMed

    Stern, J T; Jungers, W L; Susman, R L

    1995-05-01

    It has been generally assumed and theoretically argued that the curvature of finger and toe bones seen in some nonhuman primates is associated with cheiridial use in an arboreal setting. Assessment of such curvature in fossil primates has been used to infer the positional behavior of these animals. Several methods of quantifying curvature of bones have been proposed. The measure most commonly applied to phalanges is that of included angle, but this has come under some criticism. We consider various other approaches for quantifying phalangeal curvature, demonstrating that some are equivalent to use of included angle, but that one--normalized curvature moment arm (NCMA)--represents a true alternative. A comparison of NCMA to included angle, both calculated on manual and pedal proximal phalanges of humans, apes, some monkeys, and the Hadar fossils, revealed that these two different measures of curvature are highly correlated and result in very similar distributional patterns.

  4. Quantifying Heuristic Bias: Anchoring, Availability, and Representativeness.

    PubMed

    Richie, Megan; Josephson, S Andrew

    2018-01-01

    Construct: Authors examined whether a new vignette-based instrument could isolate and quantify heuristic bias. Heuristics are cognitive shortcuts that may introduce bias and contribute to error. There is no standardized instrument available to quantify heuristic bias in clinical decision making, limiting future study of educational interventions designed to improve calibration of medical decisions. This study presents validity data to support a vignette-based instrument quantifying bias due to the anchoring, availability, and representativeness heuristics. Participants completed questionnaires requiring assignment of probabilities to potential outcomes of medical and nonmedical scenarios. The instrument randomly presented scenarios in one of two versions: Version A, encouraging heuristic bias, and Version B, worded neutrally. The primary outcome was the difference in probability judgments for Version A versus Version B scenario options. Of 167 participants recruited, 139 enrolled. Participants assigned significantly higher mean probability values to Version A scenario options (M = 9.56, SD = 3.75) than Version B (M = 8.98, SD = 3.76), t(1801) = 3.27, p = .001. This result remained significant analyzing medical scenarios alone (Version A, M = 9.41, SD = 3.92; Version B, M = 8.86, SD = 4.09), t(1204) = 2.36, p = .02. Analyzing medical scenarios by heuristic revealed a significant difference between Version A and B for availability (Version A, M = 6.52, SD = 3.32; Version B, M = 5.52, SD = 3.05), t(404) = 3.04, p = .003, and representativeness (Version A, M = 11.45, SD = 3.12; Version B, M = 10.67, SD = 3.71), t(396) = 2.28, p = .02, but not anchoring. Stratifying by training level, students maintained a significant difference between Version A and B medical scenarios (Version A, M = 9.83, SD = 3.75; Version B, M = 9.00, SD = 3.98), t(465) = 2.29, p = .02, but not residents or attendings. Stratifying by heuristic and training level, availability maintained

  5. Quantifying Variability of Avian Colours: Are Signalling Traits More Variable?

    PubMed Central

    Delhey, Kaspar; Peters, Anne

    2008-01-01

    Background Increased variability in sexually selected ornaments, a key assumption of evolutionary theory, is thought to be maintained through condition-dependence. Condition-dependent handicap models of sexual selection predict (a) that sexually selected traits show amplified variability compared to equivalent non-sexually selected traits; since males are usually the sexually selected sex, (b) that males are more variable than females; and (c) that sexually dimorphic traits are more variable than monomorphic ones. So far these predictions have only been tested for metric traits. Surprisingly, they have not been examined for bright coloration, one of the most prominent sexual traits. This omission stems from computational difficulties: different types of colours are quantified on different scales, precluding the use of coefficients of variation. Methodology/Principal Findings Based on physiological models of avian colour vision we develop an index to quantify the degree of discriminable colour variation as it can be perceived by conspecifics. A comparison of variability in ornamental and non-ornamental colours in six bird species confirmed (a) that those coloured patches that are sexually selected or act as indicators of quality show increased chromatic variability. However, we found no support for (b) that males generally show higher levels of variability than females, or for (c) that sexual dichromatism per se is associated with increased variability. Conclusions/Significance We show that it is currently possible to realistically estimate variability of animal colours as perceived by them, something difficult to achieve with other traits. The increased variability of known sexually-selected/quality-indicating colours in the studied species provides support for the predictions borne out of sexual selection theory, but the lack of increased overall variability in males or dimorphic colours in general indicates that sexual differences might not always be shaped by similar selective

  6. Quantifying forest mortality with the remote sensing of snow

    NASA Astrophysics Data System (ADS)

    Baker, Emily Hewitt

    Greenhouse gas emissions have altered global climate significantly, increasing the frequency of drought, fire, and pest-related mortality in forests across the western United States, with increasing area affected each year. Associated changes in forests are of great concern for the public, land managers, and the broader scientific community. These increased stresses have resulted in a widespread, spatially heterogeneous decline of forest canopies, which in turn exerts strong controls on the accumulation and melt of the snowpack, and changes forest-atmosphere exchanges of carbon, water, and energy. Most satellite-based retrievals of summer-season forest data are insufficient to quantify canopy, as opposed to the combination of canopy and undergrowth, since the greenness signals of the two types of vegetation have proven persistently difficult to distinguish. To overcome this issue, this research develops a method to quantify forest canopy cover using winter-season fractional snow covered area (FSCA) data from NASA's Moderate Resolution Imaging Spectroradiometer (MODIS) snow covered area and grain size (MODSCAG) algorithm. In areas where the ground surface and undergrowth are completely snow-covered, a pixel comprises only forest canopy and snow. Following a snowfall event, FSCA initially rises, as snow is intercepted in the canopy, and then falls, as snow unloads. A select set of local minima in a winter FSCA time series forms a threshold at which the canopy is snow-free but the forest understory is snow-covered. This serves as a spatially explicit measurement of forest canopy and viewable gap fraction (VGF) on a yearly basis. Using this method, we determine that MODIS-observed VGF is significantly correlated with an independent product of yearly crown mortality derived from spectral analysis of Landsat imagery at 25 high-mortality sites in northern Colorado (r = 0.96 ± 0.03, p = 0.03). Additionally, we determine the lag timing between green-stage tree mortality and
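
    The core retrieval step lends itself to a compact illustration: pick qualifying local minima of a winter FSCA series, where the canopy has unloaded its snow but the understory is still snow-covered, and read those minima as the viewable gap fraction. The following sketch uses a synthetic series and an arbitrary snow-cover floor; it is an illustration of the idea, not the MODSCAG processing chain.

      import numpy as np

      def viewable_gap_fraction(fsca, floor=0.2):
          """Median of interior local minima of a winter FSCA series, ignoring
          values below `floor` (assumed to be snow-free scenes)."""
          f = np.asarray(fsca, dtype=float)
          interior = f[1:-1]
          is_min = (interior < f[:-2]) & (interior < f[2:]) & (interior > floor)
          if not is_min.any():
              raise ValueError("no qualifying local minima in the series")
          return float(np.median(interior[is_min]))

      # Synthetic winter: FSCA jumps after each storm (canopy interception) and
      # decays toward the canopy-snow-free baseline as the canopy unloads.
      rng = np.random.default_rng(0)
      vgf_true = 0.55
      days = np.arange(90)
      fsca = np.clip(vgf_true + 0.3 * np.exp(-(days % 15) / 4.0)
                     + rng.normal(0, 0.02, days.size), 0.0, 1.0)
      vgf = viewable_gap_fraction(fsca)
      print(f"VGF ~ {vgf:.2f}, canopy cover ~ {1.0 - vgf:.2f}")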

  7. Quantifying uncertainty in stable isotope mixing models

    DOE PAGES

    Davis, Paul; Syme, James; Heikoop, Jeffrey; ...

    2015-05-19

    Mixing models are powerful tools for identifying biogeochemical sources and determining mixing fractions in a sample. However, identification of actual source contributors is often not simple, and source compositions typically vary or even overlap, significantly increasing model uncertainty in calculated mixing fractions. This study compares three probabilistic methods: SIAR [Parnell et al., 2010], a pure Monte Carlo technique (PMC), and the Stable Isotope Reference Source (SIRS) mixing model, a new technique that estimates mixing in systems with more than three sources and/or uncertain source compositions. In this paper, we use nitrate stable isotope examples (δ15N and δ18O), but all methods tested are applicable to other tracers. In Phase I of a three-phase blind test, we compared methods for a set of six-source nitrate problems. PMC was unable to find solutions for two of the target water samples. The Bayesian method, SIAR, experienced anchoring problems, and SIRS calculated mixing fractions that most closely approximated the known mixing fractions. For that reason, SIRS was the only approach used in the next phase of testing. In Phase II, the problem was broadened so that any subset of the six sources could be a possible solution to the mixing problem. Results showed a high rate of Type I errors, where solutions included sources that were not contributing to the sample. In Phase III, eliminating some sources based on assumed site knowledge and assumed nitrate concentrations substantially reduced mixing fraction uncertainties and lowered the Type I error rate. These results demonstrate that valuable insights into stable isotope mixing problems result from probabilistic mixing model approaches like SIRS. The results also emphasize the importance of identifying a minimal set of potential sources and quantifying uncertainties in source isotopic composition as well as demonstrating the value of additional information in reducing the uncertainty in calculated
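
    For intuition, the simplest of these strategies, a pure Monte Carlo mixing model, can be sketched in a few lines: propose random source fractions, perturb the uncertain source signatures, and keep proposals whose predicted mixture matches the measured sample. All numbers below are hypothetical and the acceptance rule is deliberately crude; SIAR and SIRS embed this logic in proper statistical machinery.

      import numpy as np

      rng = np.random.default_rng(42)

      # Hypothetical source means for (d15N, d18O) and a common 1-sigma spread.
      src_mu = np.array([[ 5.0,  2.0],    # e.g. soil nitrogen
                         [12.0,  0.0],    # e.g. manure/septic
                         [-2.0, 20.0]])   # e.g. atmospheric deposition
      src_sd = 1.0
      sample = np.array([6.5, 5.0])       # measured mixture (d15N, d18O)
      tol = 0.5                           # per-tracer acceptance tolerance

      accepted = []
      for _ in range(50_000):
          f = rng.dirichlet(np.ones(3))           # fractions summing to 1
          src = rng.normal(src_mu, src_sd)        # uncertain source signatures
          if np.all(np.abs(f @ src - sample) < tol):
              accepted.append(f)

      accepted = np.array(accepted)
      print("acceptance rate:", len(accepted) / 50_000)
      print("mean accepted fractions:", accepted.mean(axis=0).round(3))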

  8. Defining and quantifying users' mental Imagery-based BCI skills: a first step.

    PubMed

    Lotte, Fabien; Jeunet, Camille

    2018-05-17

    While promising for many applications, Electroencephalography (EEG)-based Brain-Computer Interfaces (BCIs) are still scarcely used outside laboratories, due to poor reliability. It is thus necessary to study and fix this reliability issue. Doing so requires the use of appropriate reliability metrics to quantify both the classification algorithm's and the BCI user's performance. So far, Classification Accuracy (CA) is the typical metric used for both aspects. However, we argue in this paper that CA is a poor metric to study BCI users' skills. Here, we propose a definition and new metrics to quantify such BCI skills for Mental Imagery (MI) BCIs, independently of any classification algorithm. Approach: We first show in this paper that CA is notably unspecific, discrete, training-data and classifier dependent, and as such may not always reflect successful self-modulation of EEG patterns by the user. We then propose a definition of MI-BCI skills that reflects how well the user can self-modulate EEG patterns, and thus how well they could control an MI-BCI. Finally, we propose new performance metrics, classDis, restDist and classStab, that specifically measure how distinct and stable the EEG patterns produced by the user are, independently of any classifier. Main results: By re-analyzing EEG data sets with such new metrics, we indeed confirmed that CA may hide some increase in MI-BCI skills or hide the user's inability to self-modulate a given EEG pattern. On the other hand, our new metrics could reveal such skill improvements as well as identify when a mental task performed by a user was no different than rest EEG. Significance: Our results showed that when studying MI-BCI users' skills, CA should be used with care, and complemented with metrics such as the new ones proposed. Our results also stressed the need to redefine BCI user training by considering the different BCI subskills and their measures. To promote the complementary use

  9. Using nitrate to quantify quick flow in a karst aquifer

    USGS Publications Warehouse

    Mahler, B.J.; Garner, B.D.

    2009-01-01

    In karst aquifers, contaminated recharge can degrade spring water quality, but quantifying the rapid recharge (quick flow) component of spring flow is challenging because of its temporal variability. Here, we investigate the use of nitrate in a two-endmember mixing model to quantify quick flow in Barton Springs, Austin, Texas. Historical nitrate data from recharging creeks and Barton Springs were evaluated to determine a representative nitrate concentration for the aquifer water endmember (1.5 mg/L) and the quick flow endmember (0.17 mg/L for nonstormflow conditions and 0.25 mg/L for stormflow conditions). Under nonstormflow conditions for 1990 to 2005, model results indicated that quick flow contributed from 0% to 55% of spring flow. The nitrate-based two-endmember model was applied to the response of Barton Springs to a storm and results compared to those produced using the same model with δ18O and specific conductance (SC) as tracers. Additionally, the mixing model was modified to allow endmember quick flow values to vary over time. Of the three tracers, nitrate appears to be the most advantageous because it is conservative and because the difference between the concentrations in the two endmembers is large relative to their variance. The δ18O-based model was very sensitive to variability within the quick flow endmember, and SC was not conservative over the timescale of the storm response. We conclude that a nitrate-based two-endmember mixing model might provide a useful approach for quantifying the temporally variable quick flow component of spring flow in some karst systems. © 2008 National Ground Water Association.
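
    The two-endmember mass balance used here reduces to a one-line calculation: spring nitrate is a mixture of aquifer water and quick flow, so the quick-flow fraction follows by rearranging the mixing equation. The sketch below uses the endmember concentrations quoted above with an illustrative spring value.

      def quick_flow_fraction(c_spring, c_aquifer=1.5, c_quick=0.17):
          """Fraction of spring flow contributed by quick flow (0-1), from
          c_spring = f*c_quick + (1 - f)*c_aquifer solved for f."""
          f = (c_aquifer - c_spring) / (c_aquifer - c_quick)
          return min(max(f, 0.0), 1.0)   # clip round-off outside the physical range

      # Example: spring nitrate of 1.0 mg/L under nonstormflow conditions.
      print(quick_flow_fraction(1.0))    # ~0.38, i.e. about 38% quick flow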

  10. Quantifying expert diagnosis variability when grading tumor-infiltrating lymphocytes

    NASA Astrophysics Data System (ADS)

    Toro, Paula; Corredor, Germán; Wang, Xiangxue; Arias, Viviana; Velcheti, Vamsidhar; Madabhushi, Anant; Romero, Eduardo

    2017-11-01

    Tumor-infiltrating lymphocytes (TILs) have proved to play an important role in predicting prognosis, survival, and response to treatment in patients with a variety of solid tumors. Unfortunately, there is currently no standardized methodology to quantify the infiltration grade. The aim of this work is to evaluate variability among the reports of TILs given by a group of pathologists who examined a set of digitized Non-Small Cell Lung Cancer samples (n=60). Twenty-eight pathologists each graded a different number of the histopathological images. The agreement among pathologists was evaluated by computing the Kappa index coefficient and the standard deviation of their estimations. Furthermore, TILs reports were correlated with patients' prognosis and survival using the Pearson's correlation coefficient. General results show that the agreement among experts grading TILs in the dataset is low, since Kappa values remain below 0.4 and the standard deviation values demonstrate that in none of the images was there full consensus. Finally, the correlation coefficient for each pathologist also reveals a low association between the pathologists' predictions and the prognosis/survival data. Results suggest the need to define standardized, objective, and effective strategies to evaluate TILs, so they could be used as a biomarker in the daily routine.
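
    Agreement statistics of this kind are straightforward to compute. The sketch below estimates pairwise Cohen's kappa (quadratically weighted, as is common for ordinal grades) over a hypothetical panel of raters; it illustrates the metric, not the study's exact analysis.

      from itertools import combinations

      import numpy as np
      from sklearn.metrics import cohen_kappa_score

      # Rows = raters, columns = cases; ordinal grades 0-3 (none/low/moderate/high).
      rng = np.random.default_rng(1)
      truth = rng.integers(0, 4, size=30)
      ratings = np.array([np.clip(truth + rng.integers(-1, 2, truth.size), 0, 3)
                          for _ in range(5)])

      kappas = [cohen_kappa_score(ratings[i], ratings[j], weights="quadratic")
                for i, j in combinations(range(len(ratings)), 2)]
      print(f"mean pairwise kappa: {np.mean(kappas):.2f} (SD {np.std(kappas):.2f})")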

  11. Quantifying climatic controls on river network topology across scales

    NASA Astrophysics Data System (ADS)

    Ranjbar Moshfeghi, S.; Hooshyar, M.; Wang, D.; Singh, A.

    2017-12-01

    The branching structure of river networks is an important topologic and geomorphologic feature that depends on several factors (e.g., climate and tectonics). However, the mechanisms that produce these drainage patterns in river networks are poorly understood. In this study, we investigate the effects of varying climatic forcing on river network topology and geomorphology. For this, we select 20 catchments across the United States with different long-term climatic conditions quantified by the climate aridity index (AI), defined here as the ratio of mean annual potential evaporation (Ep) to precipitation (P), capturing variation in runoff and vegetation cover. The river networks of these catchments are extracted, using a curvature-based method, from high-resolution (1 m) digital elevation models, and several metrics such as drainage density, branching angle, and width functions are computed. We also use a multiscale-entropy-based approach to quantify the topologic irregularity and structural richness of these river networks. Our results reveal systematic impacts of climate forcing on the structure of river networks.

  12. Quantifying the relative contribution of climate and human impacts on streamflow at seasonal scale

    NASA Astrophysics Data System (ADS)

    Xin, Z.; Zhang, L.; Li, Y.; Zhang, C.

    2017-12-01

    Both climate change and human activities have induced changes to hydrology. The quantification of their impacts on streamflow is a challenge, especially at the seasonal scale, due to the seasonality of climate and human impacts, i.e., water use for irrigation and water storage and release due to reservoir operation. In this study, the decomposition method based on the Budyko hypothesis is extended to the seasonal scale and is used to quantify the climate and human impacts on annual and seasonal streamflow changes. The results are further compared and verified with those simulated by the hydrological abcd model. Data are split into two periods (1953-1974 and 1975-2005) to quantify the change. Three seasons (wet, dry and irrigation) are defined by introducing the monthly aridity index. In general, results showed a satisfactory agreement between the Budyko decomposition method and the abcd model. Both climate change and human activities were found to induce a decrease in streamflow at the annual scale, with 67% of the change contributed by human activities. At the seasonal scale, the human-induced contribution to the reduced streamflow was 64% and 73% for dry and wet seasons, respectively; whereas in the irrigation season, the impact of human activities on reducing the streamflow was more pronounced (180%), since climate change contributed to increased streamflow. In addition, the quantification results were analyzed for each month in the wet season to reveal the effects of intense precipitation and reservoir operation rules during the flood season.
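
    The annual-scale decomposition can be sketched compactly under one common choice of Budyko curve, Fu's equation. The catchment parameter is calibrated on the pre-change period; the curve then predicts what streamflow would have been in the post-change period under climate change alone, and the human contribution is the residual. All inputs below are hypothetical, and the seasonal extension in the paper adds further steps.

      from scipy.optimize import brentq

      def fu_runoff(P, Ep, w):
          """Runoff Q = P - E from Fu's curve E/P = 1 + phi - (1 + phi**w)**(1/w), phi = Ep/P."""
          phi = Ep / P
          return P * ((1.0 + phi**w) ** (1.0 / w) - phi)

      def calibrate_w(P, Ep, Q_obs):
          return brentq(lambda w: fu_runoff(P, Ep, w) - Q_obs, 1.01, 10.0)

      # Hypothetical annual means (mm/yr): period 1 (pre-change), period 2 (post-change).
      P1, Ep1, Q1 = 650.0, 900.0, 180.0
      P2, Ep2, Q2 = 620.0, 950.0, 120.0

      w = calibrate_w(P1, Ep1, Q1)
      dQ_total = Q2 - Q1
      dQ_climate = fu_runoff(P2, Ep2, w) - Q1   # move along the period-1 curve
      dQ_human = dQ_total - dQ_climate          # residual attributed to human activity
      print(f"w = {w:.2f}; human share of change = {dQ_human / dQ_total:.0%}")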

  13. Clinical relevance of quantified fundus autofluorescence in diabetic macular oedema

    PubMed Central

    Yoshitake, S; Murakami, T; Uji, A; Unoki, N; Dodo, Y; Horii, T; Yoshimura, N

    2015-01-01

    Purpose To quantify the signal intensity of fundus autofluorescence (FAF) and evaluate its association with visual function and optical coherence tomography (OCT) findings in diabetic macular oedema (DMO). Methods We reviewed 103 eyes of 78 patients with DMO and 30 eyes of 22 patients without DMO. FAF images were acquired using Heidelberg Retina Angiograph 2, and the signal levels of FAF in the individual subfields of the Early Treatment Diabetic Retinopathy Study grid were measured. We evaluated the association between quantified FAF and the logMAR VA and OCT findings. Results One hundred and three eyes with DMO had lower FAF signal intensity levels in the parafoveal subfields compared with 30 eyes without DMO. The autofluorescence intensity in the parafoveal subfields was associated negatively with logMAR VA and the retinal thickness in the corresponding subfields. The autofluorescence levels in the parafoveal subfield, except the nasal subfield, were lower in eyes with autofluorescent cystoid spaces in the corresponding subfield than in those without autofluorescent cystoid spaces. The autofluorescence level in the central subfield was related to foveal cystoid spaces but not logMAR VA or retinal thickness in the corresponding area. Conclusions Quantified FAF in the parafovea has diagnostic significance and is clinically relevant in DMO. PMID:25771817

  14. [Computer-assisted image processing for quantifying histopathologic variables in the healing of colonic anastomosis in dogs].

    PubMed

    Novelli, M D; Barreto, E; Matos, D; Saad, S S; Borra, R C

    1997-01-01

    The authors present the experimental results of computerized quantification of the tissue structures involved in the reparative process of colonic anastomoses performed by manual suture and by biofragmentable ring. The quantified variables in this study were: oedema fluid, myofiber tissue, blood vessels and cellular nuclei. Image processing software developed at the Laboratório de Informática Dedicado à Odontologia (LIDO) was used to quantify the pathognomonic alterations of the inflammatory process in colonic anastomoses performed in 14 dogs. The results were compared with diagnoses made in the traditional way by two pathologists, as a counterproof. The criteria for these diagnoses were defined in levels (absent, light, moderate and intense), which were compared with the analysis performed by the computer. There was a statistically significant difference between the two techniques: the biofragmentable ring technique exhibited little oedema fluid, organized myofiber tissue and a higher number of elongated cellular nuclei relative to the manual suture technique. The analysis of histometric variables through computational image processing was considered efficient and powerful for quantifying the main inflammatory and reparative tissue changes.

  15. Cross-linguistic patterns in the acquisition of quantifiers

    PubMed Central

    Cummins, Chris; Gavarró, Anna; Kuvač Kraljević, Jelena; Hrzica, Gordana; Grohmann, Kleanthes K.; Skordi, Athina; Jensen de López, Kristine; Sundahl, Lone; van Hout, Angeliek; Hollebrandse, Bart; Overweg, Jessica; Faber, Myrthe; van Koert, Margreet; Smith, Nafsika; Vija, Maigi; Zupping, Sirli; Kunnari, Sari; Morisseau, Tiffany; Rusieshvili, Manana; Yatsushiro, Kazuko; Fengler, Anja; Varlokosta, Spyridoula; Konstantzou, Katerina; Farby, Shira; Guasti, Maria Teresa; Vernice, Mirta; Okabe, Reiko; Isobe, Miwa; Crosthwaite, Peter; Hong, Yoonjee; Balčiūnienė, Ingrida; Ahmad Nizar, Yanti Marina; Grech, Helen; Gatt, Daniela; Cheong, Win Nee; Asbjørnsen, Arve; Torkildsen, Janne von Koss; Haman, Ewa; Miękisz, Aneta; Gagarina, Natalia; Puzanova, Julia; Anđelković, Darinka; Savić, Maja; Jošić, Smiljana; Slančová, Daniela; Kapalková, Svetlana; Barberán, Tania; Özge, Duygu; Hassan, Saima; Chan, Cecilia Yuet Hung; Okubo, Tomoya; van der Lely, Heather; Sauerland, Uli; Noveck, Ira

    2016-01-01

    Learners of most languages are faced with the task of acquiring words to talk about number and quantity. Much is known about the order of acquisition of number words as well as the cognitive and perceptual systems and cultural practices that shape it. Substantially less is known about the acquisition of quantifiers. Here, we consider the extent to which systems and practices that support number word acquisition can be applied to quantifier acquisition and conclude that the two domains are largely distinct in this respect. Consequently, we hypothesize that the acquisition of quantifiers is constrained by a set of factors related to each quantifier’s specific meaning. We investigate competence with the expressions for “all,” “none,” “some,” “some…not,” and “most” in 31 languages, representing 11 language types, by testing 768 5-y-old children and 536 adults. We found a cross-linguistically similar order of acquisition of quantifiers, explicable in terms of four factors relating to their meaning and use. In addition, exploratory analyses reveal that language- and learner-specific factors, such as negative concord and gender, are significant predictors of variation. PMID:27482119

  16. Application of Tsallis Entropy to EEG: Quantifying the Presence of Burst Suppression After Asphyxial Cardiac Arrest in Rats

    PubMed Central

    Zhang, Dandan; Jia, Xiaofeng; Ding, Haiyan; Ye, Datian; Thakor, Nitish V.

    2011-01-01

    Burst suppression (BS) activity in EEG is clinically accepted as a marker of brain dysfunction or injury. Experimental studies in a rodent model of brain injury following asphyxial cardiac arrest (CA) show evidence of BS soon after resuscitation, appearing as a transitional recovery pattern between isoelectricity and continuous EEG. The EEG trends in such experiments suggest varying levels of uncertainty or randomness in the signals. To quantify the EEG data, Shannon entropy and Tsallis entropy (TsEn) are examined. More specifically, an entropy-based measure named TsEn area (TsEnA) is proposed to reveal the presence and the extent of development of BS following brain injury. The methodology of TsEnA and the selection of its parameter are elucidated in detail. To test the validity of this measure, 15 rats were subjected to 7 or 9 min of asphyxial CA. EEG recordings immediately after resuscitation from CA were investigated and characterized by TsEnA. The results show that TsEnA correlates well with the outcome assessed by evaluating the rodents after the experiments using a well-established neurological deficit score (Pearson correlation = 0.86, p ≪ 0.01). This research shows that TsEnA reliably quantifies the complex dynamics in BS EEG, and may be useful as an experimental or clinical tool for objective estimation of the gravity of brain damage after CA. PMID:19695982
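
    The entropy at the heart of TsEnA is simple to compute: for a window's amplitude histogram with probabilities p_i, TsEn = (1 - Σ p_i^q)/(q - 1), recovering Shannon entropy as q → 1. The sketch below contrasts two synthetic segments; the window length, bin count, and q are illustrative choices rather than the paper's settings.

      import numpy as np

      def tsallis_entropy(signal, q=3.0, bins=32):
          counts, _ = np.histogram(signal, bins=bins)
          p = counts[counts > 0] / counts.sum()          # discrete probabilities
          return (1.0 - np.sum(p**q)) / (q - 1.0)

      rng = np.random.default_rng(0)
      iso = rng.normal(0.0, 0.1, 2000)                   # near-isoelectric segment
      burst = np.where(rng.random(2000) < 0.1,           # rare large deflections
                       rng.normal(0.0, 2.0, 2000),
                       rng.normal(0.0, 0.1, 2000))
      print(tsallis_entropy(iso), tsallis_entropy(burst))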

  17. Quantifying the Risk of Blood Exposure in Optometric Clinical Education.

    ERIC Educational Resources Information Center

    Hoppe, Elizabeth

    1997-01-01

    A study attempted to quantify risk of blood exposure in optometric clinical education by surveying optometric interns in their fourth year at the Southern California College of Optometry concerning their history of exposure or use of a needle. Results indicate blood exposure or needle use ranged from 0.95 to 18.71 per 10,000 patient encounters.…

  18. Quantifying the impacts of vegetation changes on catchment storage-discharge dynamics using paired-catchment data

    NASA Astrophysics Data System (ADS)

    Cheng, Lei; Zhang, Lu; Chiew, Francis H. S.; Canadell, Josep G.; Zhao, Fangfang; Wang, Ying-Ping; Hu, Xianqun; Lin, Kairong

    2017-07-01

    It is widely recognized that vegetation changes can significantly affect local water availability. Methods have been developed to predict the effects of vegetation change on water yield or total streamflow. However, it is still a challenge to predict changes in base flow following vegetation change, due to limited understanding of catchment storage-discharge dynamics. In this study, the power law relationship describing catchment storage-discharge dynamics is reformulated to quantify the changes in the storage-discharge relationship resulting from vegetation changes, using streamflow data from six paired-catchment experiments, of which two are deforestation catchments and four are afforestation catchments. Streamflow observations from the paired-catchment experiments clearly demonstrate that vegetation changes have led to significant changes in catchment storage-discharge relationships, accounting for about 83-128% of the changes in groundwater discharge in the treated catchments. Deforestation has led to increases in groundwater discharge (or base flow), but afforestation has resulted in decreases in groundwater discharge. Further analysis shows that the contribution of changes in groundwater discharge to the total changes in streamflow varies greatly among experimental catchments, ranging from 12% to 80% with a mean of 38 ± 22% (μ ± σ). This study proposes a new method to quantify the effects of vegetation changes on groundwater discharge from catchment storage and will improve our ability to predict the impacts of vegetation changes on catchment water yields.
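
    One standard way to estimate such a storage-discharge relation is recession analysis: during rainless recessions, -dQ/dt plotted against Q in log-log space is fit with a power law -dQ/dt = a*Q^b, and shifts in the fitted (a, b) between treatment periods are what would signal a changed storage-discharge relationship. The sketch below fits that relation to a synthetic flow series; it is an illustration of the general technique, not the paper's reformulation.

      import numpy as np

      def recession_power_law(q):
          """Fit -dQ/dt = a * Q**b on receding limbs; returns (a, b)."""
          dq = np.diff(q)
          mid_q = 0.5 * (q[1:] + q[:-1])
          falling = dq < 0                      # keep only recession steps
          b, log_a = np.polyfit(np.log(mid_q[falling]), np.log(-dq[falling]), 1)
          return np.exp(log_a), b

      # Synthetic daily flow: near-exponential recessions between random storm pulses.
      rng = np.random.default_rng(3)
      q = [5.0]
      for _ in range(364):
          q.append(q[-1] * 0.97 + (rng.random() < 0.07) * rng.uniform(2.0, 8.0))
      a, b = recession_power_law(np.array(q))
      print(f"-dQ/dt = {a:.3f} * Q^{b:.2f}")   # ~0.03 * Q^1 for this linear reservoir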

  19. Quantifying muscle patterns and spine load during various forms of the push-up.

    PubMed

    Freeman, Stephanie; Karpowicz, Amy; Gray, John; McGill, Stuart

    2006-03-01

    This study was conducted to quantify the normalized amplitudes of the abdominal wall and back extensor musculature during a variety of push-up styles. We also sought to quantify their impact on spinal loading by calculating spinal compression and torque generation in the L4-5 area. Ten university-age participants, nine males and one female, in good to excellent condition, volunteered to participate in this study. All participants were requested to perform a maximum of 12 different push-up exercises, three trials per exercise. Surface electromyographic data (EMG) were collected bilaterally on rectus abdominis, external oblique, internal oblique, latissimus dorsi, and erector spinae muscles, and unilaterally (right side) on pectoralis major, triceps brachii, biceps brachii, and anterior deltoid muscles. Spine kinetics were obtained using an anatomically detailed model of the torso/spine. This study revealed that more dynamic push-ups (i.e., ballistic, with hand movement) required more muscle activation and higher spine load, whereas placing labile balls under the hands only resulted in modest increases in spine load. Right rectus abdominis (RA) activation was significantly higher than left RA activation during the left hand forward push-up and vice versa for the right hand forward push-up (P < 0.001). External oblique (EO) demonstrated the same switch in dominance during staggered hand push-ups (P < 0.01). The one-arm push-up resulted in the highest spine compression. Skilled participants showed greater synchronicity with peak muscle activation (plyometric type of contractions) during ballistic push-ups. These data will help guide exercise selection for individuals with differing training objectives and injury history.

  20. Stimfit: quantifying electrophysiological data with Python

    PubMed Central

    Guzman, Segundo J.; Schlögl, Alois; Schmidt-Hieber, Christoph

    2013-01-01

    Intracellular electrophysiological recordings provide crucial insights into elementary neuronal signals such as action potentials and synaptic currents. Analyzing and interpreting these signals is essential for a quantitative understanding of neuronal information processing, and requires both fast data visualization and ready access to complex analysis routines. To achieve this goal, we have developed Stimfit, a free software package for cellular neurophysiology with a Python scripting interface and a built-in Python shell. The program supports most standard file formats for cellular neurophysiology and other biomedical signals through the Biosig library. To quantify and interpret the activity of single neurons and communication between neurons, the program includes algorithms to characterize the kinetics of presynaptic action potentials and postsynaptic currents, estimate latencies between pre- and postsynaptic events, and detect spontaneously occurring events. We validate and benchmark these algorithms, give estimation errors, and provide sample use cases, showing that Stimfit represents an efficient, accessible and extensible way to accurately analyze and interpret neuronal signals. PMID:24600389
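
    As a flavor of the kinetic measurements such a package automates, the sketch below computes one of them, the 20-80% rise time of an event, in plain NumPy on a synthetic waveform. This is an illustration only and does not use Stimfit's actual scripting interface.

      import numpy as np

      def rise_time_20_80(trace, dt):
          """20-80% rise time (s) from baseline to peak of a positive-going event."""
          base = trace[:10].mean()              # assume the first samples are baseline
          span = trace.max() - base
          i20 = np.argmax(trace >= base + 0.2 * span)
          i80 = np.argmax(trace >= base + 0.8 * span)
          return (i80 - i20) * dt

      # Synthetic EPSC-like waveform (difference of exponentials), 20 kHz sampling.
      dt = 5e-5
      t = np.arange(0.0, 0.05, dt)
      s = np.clip(t - 0.005, 0.0, None)         # event onset at 5 ms
      trace = np.exp(-s / 0.010) - np.exp(-s / 0.001)
      print(f"20-80% rise time: {rise_time_20_80(trace, dt) * 1e3:.2f} ms")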

  1. Quantifying climate changes of the Common Era for Finland

    NASA Astrophysics Data System (ADS)

    Luoto, Tomi P.; Nevalainen, Liisa

    2017-10-01

    In this study, we aim to quantify summer air temperatures from sediment records from Southern, Central and Northern Finland over the past 2000 years. We use lake sediment archives to estimate paleotemperatures applying fossil Chironomidae assemblages and the transfer function approach. The used enhanced Chironomidae-based temperature calibration set was validated in a 70-year high-resolution sediment record against instrumentally measured temperatures. Since the inferred and observed temperatures showed close correlation, we deduced that the new calibration model is reliable for reconstructions beyond the monitoring records. The 700-year long temperature reconstructions from three sites at multi-decadal temporal resolution showed similar trends, although they had differences in timing of the cold Little Ice Age (LIA) and the initiation of recent warming. The 2000-year multi-centennial reconstructions from three different sites showed resemblance with each other having clear signals of the Medieval Climate Anomaly (MCA) and LIA, but with differences in their timing. The influence of external forcing on climate of the southern and central sites appeared to be complex at the decadal scale, but the North Atlantic Oscillation (NAO) was closely linked to the temperature development of the northern site. Solar activity appears to be synchronous with the temperature fluctuations at the multi-centennial scale in all the sites. The present study provides new insights into centennial and decadal variability in air temperature dynamics in Northern Europe and on the external forcing behind these trends. These results are particularly useful in comparing regional responses and lags of temperature trends between different parts of Scandinavia.

  2. A mass-balance model to separate and quantify colloidal and solute redistributions in soil

    USGS Publications Warehouse

    Bern, C.R.; Chadwick, O.A.; Hartshorn, A.S.; Khomo, L.M.; Chorover, J.

    2011-01-01

    Studies of weathering and pedogenesis have long used calculations based upon low solubility index elements to determine mass gains and losses in open systems. One of the questions currently unanswered in these settings is the degree to which mass is transferred in solution (solutes) versus suspension (colloids). Here we show that differential mobility of the low solubility, high field strength (HFS) elements Ti and Zr can trace colloidal redistribution, and we present a model for distinguishing between mass transfer in suspension and solution. The model is tested on a well-differentiated granitic catena located in Kruger National Park, South Africa. Ti and Zr ratios from parent material, soil and colloidal material are substituted into a mixing equation to quantify colloidal movement. The results show zones of both colloid removal and augmentation along the catena. Colloidal losses of 110 kg m-2 (-5% relative to parent material) are calculated for one eluviated soil profile. A downslope illuviated profile has gained 169 kg m-2 (10%) colloidal material. Elemental losses by mobilization in true solution are ubiquitous across the catena, even in zones of colloidal accumulation, and range from 1418 kg m-2 (-46%) for an eluviated profile to 195 kg m-2 (-23%) at the bottom of the catena. Quantification of simultaneous mass transfers in solution and suspension provides greater specificity on processes within soils and across hillslopes. Additionally, because colloids include both HFS and other elements, the ability to quantify their redistribution has implications for standard calculations of soil mass balances using such index elements. © 2011.
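
    Mass-balance studies of this kind rest on the standard open-system mass-transfer coefficient tau: for a mobile element j and an assumed-immobile index element i, tau_j = (C_j,soil * C_i,parent)/(C_j,parent * C_i,soil) - 1, so tau_j = -0.5 means a 50% loss relative to parent material. The concentrations below are hypothetical; comparing tau values computed with Ti versus Zr as the index element is what flags colloidal redistribution in this study's approach.

      def tau(c_j_soil, c_j_parent, c_i_soil, c_i_parent):
          """Open-system mass-transfer coefficient for element j, index element i."""
          return (c_j_soil * c_i_parent) / (c_j_parent * c_i_soil) - 1.0

      # Hypothetical values: mobile Ca (wt%) against immobile Zr (ppm).
      print(f"tau_Ca = {tau(0.8, 2.4, c_i_soil=320.0, c_i_parent=210.0):+.2f}")  # -0.78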

  3. Quantifying hydrologic connectivity with measures from the brain neurosciences - a feasibility study

    NASA Astrophysics Data System (ADS)

    Rinderer, Michael; Ali, Genevieve; Larsen, Laurel

    2017-04-01

    While the concept of connectivity is increasingly applied in hydrology and ecology, little agreement exists on its definition and quantification approaches. In contrast, the neurosciences have developed a systematic conceptualization of connectivity and methods to quantify it. In particular, neuroscientists make a clear distinction between: 1) structural connectivity, which is determined by the anatomy of the brain's neural network, 2) functional connectivity, which is based on statistical dependencies between neural signals, and 3) effective connectivity, which allows causal relations to be inferred based on the assumption that "true" interactions occur with a certain time delay. In a similar vein, in hydrology, structural connectivity can be defined as the physical adjacency of landscape elements that is seen as a prerequisite of material transfer, while functional or process connectivity would rather describe interactions or causal relations between spatial adjacency characteristics and temporally varying factors. While hydrologists have suggested methods to derive structural connectivity (SC), the quantification of functional (FC) or effective connectivity (EC) has remained elusive. The goal of the current study was therefore to apply time-series analysis methods from brain neuroscience to quantify EC and FC among groundwater (n = 34) and stream discharge (n = 1) monitoring sites in a 20-ha Swiss catchment where topography is assumed to be a major driver of connectivity. SC was assessed through influence maps that quantify the percentage of flow from an upslope site to a downslope site by applying a multiple flow direction algorithm. FC was assessed by cross-correlation and by total and partial mutual information, while EC was quantified via total and partial entropy, Granger causality and a phase slope index. Our results showed that many structural connections were also expressed as functional or effective connections, which is reasonable in a catchment with shallow perched
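
    A minimal flavor of such functional-connectivity calculations: estimate the statistical dependence between an upslope and a downslope water-level series at a range of time lags, with the lag of peak dependence hinting at directed ("effective") coupling. The sketch below uses lagged Pearson correlation on synthetic series; the study itself also employs mutual information, entropy measures, Granger causality, and a phase slope index.

      import numpy as np

      def lagged_corr(x, y, max_lag):
          """Pearson correlation of x(t) with y(t + lag) for each lag."""
          out = {}
          for lag in range(-max_lag, max_lag + 1):
              if lag > 0:
                  a, b = x[:-lag], y[lag:]
              elif lag < 0:
                  a, b = x[-lag:], y[:lag]
              else:
                  a, b = x, y
              out[lag] = np.corrcoef(a, b)[0, 1]
          return out

      rng = np.random.default_rng(7)
      upslope = np.cumsum(rng.normal(0.0, 1.0, 500))               # random-walk level
      downslope = np.roll(upslope, 3) + rng.normal(0.0, 0.5, 500)  # lagged noisy copy

      corrs = lagged_corr(upslope, downslope, max_lag=10)
      best = max(corrs, key=corrs.get)
      print(f"peak correlation {corrs[best]:.2f} at lag {best} steps")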

  4. Visual Attention and Quantifier-Spreading in Heritage Russian Bilinguals

    ERIC Educational Resources Information Center

    Sekerina, Irina A.; Sauermann, Antje

    2015-01-01

    It is well established in language acquisition research that monolingual children and adult second language learners misinterpret sentences with the universal quantifier "every" and make quantifier-spreading errors that are attributed to a preference for a match in number between two sets of objects. The present Visual World eye-tracking…

  5. Quantifying new water fractions and water age distributions using ensemble hydrograph separation

    NASA Astrophysics Data System (ADS)

    Kirchner, James

    2017-04-01

    Catchment transit times are important controls on contaminant transport, weathering rates, and runoff chemistry. Recent theoretical studies have shown that catchment transit time distributions are nonstationary, reflecting the temporal variability in precipitation forcing, the structural heterogeneity of catchments themselves, and the nonlinearity of the mechanisms controlling storage and transport in the subsurface. The challenge of empirically estimating these nonstationary transit time distributions in real-world catchments, however, has only begun to be explored. Long, high-frequency tracer time series are now becoming available, creating new opportunities to study how rainfall becomes streamflow on timescales of minutes to days following the onset of precipitation. Here I show that the conventional formula used for hydrograph separation can be converted into an equivalent linear regression equation that quantifies the fraction of current rainfall in streamflow across ensembles of precipitation events. These ensembles can be selected to represent different discharge ranges, different precipitation intensities, or different levels of antecedent moisture, thus quantifying how the fraction of "new water" in streamflow varies with forcings such as these. I further show how this approach can be generalized to empirically determine the contributions of precipitation inputs to streamflow across a range of time lags. In this way the short-term tail of the transit time distribution can be directly quantified for an ensemble of precipitation events. Benchmark testing with a simple, nonlinear, nonstationary catchment model demonstrates that this approach quantitatively measures the short tail of the transit time distribution for a wide range of catchment response characteristics. In combination with reactive tracer time series, this approach can potentially be extended to measure short-term chemical reaction rates at the catchment scale. High-frequency tracer time series
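
    The central trick can be written down in a few lines: if streamflow tracer concentration is modeled as a mixture of this event's precipitation and prior streamflow, C_S = F*C_P + (1 - F)*C_S,prev, then regressing (C_S - C_S,prev) on (C_P - C_S,prev) across an ensemble of events estimates the ensemble-average new-water fraction F. The sketch below recovers a known F from synthetic data; it is a schematic of the idea, not the full method with its error weighting and lag generalization.

      import numpy as np

      rng = np.random.default_rng(11)
      n_events = 200
      f_new_true = 0.15                                # true new-water fraction
      c_prev = rng.normal(-9.0, 0.3, n_events)         # prior stream d18O (permil)
      c_precip = rng.normal(-6.0, 2.0, n_events)       # event precipitation d18O
      c_stream = (f_new_true * c_precip + (1.0 - f_new_true) * c_prev
                  + rng.normal(0.0, 0.05, n_events))   # mixing + measurement noise

      x = c_precip - c_prev
      y = c_stream - c_prev
      f_new_hat = np.sum(x * y) / np.sum(x * x)        # regression through the origin
      print(f"estimated new water fraction: {f_new_hat:.3f}")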

  6. Quantifying fossil fuel CO2 from continuous measurements of APO: a novel approach

    NASA Astrophysics Data System (ADS)

    Pickers, Penelope; Manning, Andrew C.; Forster, Grant L.; van der Laan, Sander; Wilson, Phil A.; Wenger, Angelina; Meijer, Harro A. J.; Oram, David E.; Sturges, William T.

    2016-04-01

    Using atmospheric measurements to accurately quantify CO2 emissions from fossil fuel sources requires the separation of biospheric and anthropogenic CO2 fluxes. The ability to quantify the fossil fuel component of CO2 (ffCO2) from atmospheric measurements enables more accurate 'top-down' verification of CO2 emissions inventories, which frequently have large uncertainty. Typically, ffCO2 is quantified (in ppm units) from discrete atmospheric measurements of Δ14CO2, combined with higher resolution atmospheric CO measurements, and with knowledge of CO:ffCO2 ratios. In the United Kingdom (UK), however, measurements of Δ14CO2 are often significantly biased by nuclear power plant influences, which limit the use of this approach. We present a novel approach for quantifying ffCO2 using measurements of APO (Atmospheric Potential Oxygen; a tracer derived from concurrent measurements of CO2 and O2) from two measurement sites in Norfolk, UK. Our approach is similar to that used for quantifying ffCO2 from CO measurements (ffCO2(CO)), whereby ffCO2(APO) = (APOmeas - APObg)/RAPO, where (APOmeas - APObg) is the APO deviation from the background, and RAPO is the APO:CO2 combustion ratio for fossil fuel. Time varying values of RAPO are calculated from the global gridded COFFEE (CO2 release and Oxygen uptake from Fossil Fuel Emission Estimate) dataset, combined with NAME (Numerical Atmospheric-dispersion Modelling Environment) transport model footprints. We compare our ffCO2(APO) results to results obtained using the ffCO2(CO) method, using CO:CO2 fossil fuel emission ratios (RCO) from the EDGAR (Emission Database for Global Atmospheric Research) database. We find that the APO ffCO2 quantification method is more precise than the CO method, owing primarily to a smaller range of possible APO:CO2 fossil fuel emission ratios, compared to the CO:CO2 emission ratio range. Using a long-term dataset of atmospheric O2, CO2, CO and Δ14CO2 from Lutjewad, The Netherlands, we examine the

  7. Quantifying residues from postharvest fumigation of almonds and walnuts with propylene oxide

    USDA-ARS?s Scientific Manuscript database

    A novel analytical approach, involving solvent extraction with methyl tert-butyl ether (MTBE) followed by gas chromatography (GC), was developed to quantify residues that result from the postharvest fumigation of almonds and walnuts with propylene oxide (PPO). Verification and quantification of PPO,...

  8. Quantifying the Incoming Jet Past Heart Valve Prostheses Using Vortex Formation Dynamics

    NASA Astrophysics Data System (ADS)

    Pierrakos, Olga

    2005-11-01

    Heart valve (HV) replacement prostheses are associated with hemodynamic compromises compared to their native counterparts. Traditionally, HV performance and hemodynamics have been quantified using effective orifice size and pressure gradients. However, the quality and direction of flow are also important aspects of HV function and relate to HV design, implantation technique, and orientation. The flow past any HV is governed by the generation of shear layers followed by the formation and shedding of organized flow structures in the form of vortex rings (VR). For the first time, vortex formation (VF) in the left ventricle (LV) is quantified. Vortex energy measurements allow for calculation of the critical formation number (FN), which is the time at which the VR reaches its maximum strength. Inefficiencies in HV function result in a decrease of the critical FN. This study uses the concept of FN to compare mitral HV prostheses in an in-vitro model (a silicone LV model housed in a piston-driven heart simulator) using time-resolved digital particle image velocimetry. Two HVs were studied: a porcine HV and a bileaflet MHV, which was tested in an anatomic and a non-anatomic orientation. The results suggest that HV orientation and design affect the critical FN. We propose that the critical FN, which is contingent on HV design, orientation, and physical flow characteristics, serve as a parameter to quantify the incoming jet and the efficiency of the HV.
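
    The formation number itself is a running stroke-to-diameter ratio: FN(t) = (1/D) * integral of U dt, the time-integrated jet velocity through the orifice divided by the orifice diameter. The sketch below evaluates it for an idealized transmitral velocity waveform; the waveform and diameter are hypothetical, not measurements from this study.

      import numpy as np

      def formation_number(u, t, d):
          """Running FN(t) = (1/D) * integral of U dt (trapezoid rule)."""
          stroke = np.concatenate([[0.0], np.cumsum(0.5 * (u[1:] + u[:-1]) * np.diff(t))])
          return stroke / d

      t = np.linspace(0.0, 0.3, 151)            # early filling wave, s
      u = 0.6 * np.sin(np.pi * t / 0.3)         # idealized transmitral jet velocity, m/s
      d = 0.025                                 # effective mitral orifice diameter, m
      fn = formation_number(u, t, d)
      print(f"FN at end of filling wave: {fn[-1]:.1f}")   # ~4.6 here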

  9. Quantifying the entropic cost of cellular growth control

    NASA Astrophysics Data System (ADS)

    De Martino, Daniele; Capuani, Fabrizio; De Martino, Andrea

    2017-07-01

    Viewing the ways a living cell can organize its metabolism as the phase space of a physical system, regulation can be seen as the ability to reduce the entropy of that space by selecting specific cellular configurations that are, in some sense, optimal. Here we quantify the amount of regulation required to control a cell's growth rate by a maximum-entropy approach to the space of underlying metabolic phenotypes, where a configuration corresponds to a metabolic flux pattern as described by genome-scale models. We link the mean growth rate achieved by a population of cells to the minimal amount of metabolic regulation needed to achieve it through a phase diagram that highlights how growth suppression can be as costly (in regulatory terms) as growth enhancement. Moreover, we provide an interpretation of the inverse temperature β controlling maximum-entropy distributions based on the underlying growth dynamics. Specifically, we show that the asymptotic value of β for a cell population can be expected to depend on (i) the carrying capacity of the environment, (ii) the initial size of the colony, and (iii) the probability distribution from which the inoculum was sampled. Results obtained for E. coli and human cells are found to be remarkably consistent with empirical evidence.
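
    The maximum-entropy construction can be caricatured in a few lines: take a reference density q(lambda) of feasible growth rates, tilt it by exp(beta * lambda), and track how the population mean growth rate rises with beta. The toy q(lambda) below is a simple stand-in for the genome-scale flux space used in the paper.

      import numpy as np

      lam = np.linspace(0.0, 1.0, 1000)         # growth rate, arbitrary units
      q = lam**2 * (1.0 - lam)**5               # toy density of feasible phenotypes
      q /= np.trapz(q, lam)

      for beta in (0.0, 5.0, 20.0, 80.0):
          p = q * np.exp(beta * lam)            # maximum-entropy tilt
          p /= np.trapz(p, lam)
          print(f"beta = {beta:5.1f} -> mean growth = {np.trapz(lam * p, lam):.3f}")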

  10. Quantifying chaotic dynamics from integrate-and-fire processes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pavlov, A. N.; Saratov State Technical University, Politehnicheskaya Str. 77, 410054 Saratov; Pavlova, O. N.

    2015-01-15

    Characterizing chaotic dynamics from integrate-and-fire (IF) interspike intervals (ISIs) is relatively easily performed at high firing rates. When the firing rate is low, a correct estimation of Lyapunov exponents (LEs) describing dynamical features of complex oscillations reflected in the IF ISI sequences becomes more complicated. In this work we discuss peculiarities and limitations of quantifying chaotic dynamics from IF point processes. We consider the main factors leading to underestimated LEs and demonstrate a way of improving the numerical determination of LEs from IF ISI sequences. We show that estimations of the two largest LEs can be performed using around 400 mean periods of chaotic oscillations in the regime of phase-coherent chaos. Application to real data is discussed.

  11. A new methodology for quantifying the impact of water repellency on the filtering function of soils

    NASA Astrophysics Data System (ADS)

    Müller, Karin; Deurer, Markus; Kawamoto, Ken; Hiradate, Syuntaro; Komatsu, Toshiko; Clothier, Brent

    2014-05-01

    Soils deliver a range of ecosystem services, and some of the most valuable relate to the regulating services resulting from the buffering and filtering of solutes by soil. However, it is commonly accepted that soil water repellency (SWR) can lead to finger flow and preferential flow. Yet, there have been few attempts to quantify the impact of such flow phenomena on the buffering and filtering of solutes. No method is available to quantify directly how SWR affects the transport of reactive solutes. We have closed this gap and developed a new method for quantifying solute transport through novel experiments with water-repellent soils. It involves sequentially applying two liquids, water and a fully wetting reference liquid (aqueous ethanol), to the same intact soil core, with air-drying between the two applications. Our results highlight that sorption experiments are necessary to complement our new method to ascertain directly the impact of SWR on the filtering of a solute. We conducted transport and sorption experiments, applying our new method, with the herbicide 2,4-Dichlorophenoxyacetic acid and two Andosol topsoils: one from Japan and one from New Zealand. Breakthrough curves from the water experiments were characterized by preferential flow, with high initial concentrations, tailing, and a long prevalence of solutes remaining in the soil. Our results clearly demonstrate and quantify the impact of SWR on the leaching of this herbicide. This technique for quantifying the reduction of the soil's filtering efficiency by SWR enables assessment of the increased risk of groundwater contamination by solutes exogenously applied to water-repellent soils.

  12. Quantifying torso deformity in scoliosis

    NASA Astrophysics Data System (ADS)

    Ajemba, Peter O.; Kumar, Anish; Durdle, Nelson G.; Raso, V. James

    2006-03-01

    Scoliosis affects the alignment of the spine and the shape of the torso. Most scoliosis patients and their families are more concerned about the effect of scoliosis on the torso than its effect on the spine. There is a need to develop robust techniques for quantifying torso deformity based on full torso scans. In this paper, deformation indices obtained from orthogonal maps of full torso scans are used to quantify torso deformity in scoliosis. 'Orthogonal maps' are obtained by applying orthogonal transforms to 3D surface maps. (An 'orthogonal transform' maps a cylindrical coordinate system to a Cartesian coordinate system.) The technique was tested on 361 deformed computer models of the human torso and on 22 scans of volunteers (8 normal and 14 scoliosis). Deformation indices from the orthogonal maps correctly classified up to 95% of the volunteers with a specificity of 1.00 and a sensitivity of 0.91. In addition to classifying scoliosis, the system gives a visual representation of the entire torso in one view and is viable for use in a clinical environment for managing scoliosis.
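
    The unwrapping underlying these orthogonal maps is easy to sketch: express torso surface points in cylindrical coordinates about the torso's vertical axis and grid the local radius on an (angle, height) plane, yielding a single flat image of the whole torso from which deformation indices can be computed. The surface points below are synthetic, and this is a schematic of the transform rather than the authors' implementation.

      import numpy as np

      def orthogonal_map(points, n_theta=90, n_z=120):
          """points: (n, 3) surface points -> (n_z, n_theta) image of local radius."""
          x, y, z = points.T
          cx, cy = x.mean(), y.mean()                   # approximate torso axis
          theta = np.arctan2(y - cy, x - cx)
          r = np.hypot(x - cx, y - cy)
          ti = ((theta + np.pi) / (2.0 * np.pi) * (n_theta - 1)).astype(int)
          zi = ((z - z.min()) / (np.ptp(z) + 1e-12) * (n_z - 1)).astype(int)
          img = np.zeros((n_z, n_theta))
          cnt = np.zeros_like(img)
          np.add.at(img, (zi, ti), r)
          np.add.at(cnt, (zi, ti), 1.0)
          return np.where(cnt > 0, img / np.maximum(cnt, 1.0), np.nan)

      # Synthetic elliptical torso with a localized bump (asymmetry) on one side.
      rng = np.random.default_rng(2)
      t = rng.uniform(0.0, 2.0 * np.pi, 20000)
      z = rng.uniform(0.0, 1.0, 20000)
      r = 1.0 + 0.2 * np.cos(2.0 * t) + 0.15 * np.exp(-((t - 1.0)**2 + (z - 0.5)**2) / 0.05)
      pts = np.column_stack([r * np.cos(t), r * np.sin(t), z])
      print(orthogonal_map(pts).shape)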

  13. Quantifying Climatological Ranges and Anomalies for Pacific Coral Reef Ecosystems

    PubMed Central

    Gove, Jamison M.; Williams, Gareth J.; McManus, Margaret A.; Heron, Scott F.; Sandin, Stuart A.; Vetter, Oliver J.; Foley, David G.

    2013-01-01

    Coral reef ecosystems are exposed to a range of environmental forcings that vary on daily to decadal time scales and across spatial scales spanning from reefs to archipelagos. Environmental variability is a major determinant of reef ecosystem structure and function, including coral reef extent and growth rates, and the abundance, diversity, and morphology of reef organisms. Proper characterization of environmental forcings on coral reef ecosystems is critical if we are to understand the dynamics and implications of abiotic–biotic interactions on reef ecosystems. This study combines high-resolution bathymetric information with remotely sensed sea surface temperature, chlorophyll-a and irradiance data, and modeled wave data to quantify environmental forcings on coral reefs. We present a methodological approach to develop spatially constrained, island- and atoll-scale metrics that quantify climatological range limits and anomalous environmental forcings across U.S. Pacific coral reef ecosystems. Our results indicate considerable spatial heterogeneity in climatological ranges and anomalies across 41 islands and atolls, with emergent spatial patterns specific to each environmental forcing. For example, wave energy was greatest at northern latitudes and generally decreased with latitude. In contrast, chlorophyll-a was greatest at reef ecosystems proximate to the equator and northern-most locations, showing little synchrony with latitude. In addition, we find that the reef ecosystems with the highest chlorophyll-a concentrations (Jarvis, Howland, Baker, Palmyra and Kingman) are each uninhabited and are characterized by high hard coral cover and large numbers of predatory fishes. Finally, we find that scaling environmental data to the spatial footprint of individual islands and atolls is more likely to capture local environmental forcings, as chlorophyll-a concentrations decreased at relatively short distances (>7 km) from 85% of our study locations. These metrics will

  14. Quantifying climatological ranges and anomalies for Pacific coral reef ecosystems.

    PubMed

    Gove, Jamison M; Williams, Gareth J; McManus, Margaret A; Heron, Scott F; Sandin, Stuart A; Vetter, Oliver J; Foley, David G

    2013-01-01

    Coral reef ecosystems are exposed to a range of environmental forcings that vary on daily to decadal time scales and across spatial scales spanning from reefs to archipelagos. Environmental variability is a major determinant of reef ecosystem structure and function, including coral reef extent and growth rates, and the abundance, diversity, and morphology of reef organisms. Proper characterization of environmental forcings on coral reef ecosystems is critical if we are to understand the dynamics and implications of abiotic-biotic interactions on reef ecosystems. This study combines high-resolution bathymetric information with remotely sensed sea surface temperature, chlorophyll-a and irradiance data, and modeled wave data to quantify environmental forcings on coral reefs. We present a methodological approach to develop spatially constrained, island- and atoll-scale metrics that quantify climatological range limits and anomalous environmental forcings across U.S. Pacific coral reef ecosystems. Our results indicate considerable spatial heterogeneity in climatological ranges and anomalies across 41 islands and atolls, with emergent spatial patterns specific to each environmental forcing. For example, wave energy was greatest at northern latitudes and generally decreased with latitude. In contrast, chlorophyll-a was greatest at reef ecosystems proximate to the equator and northern-most locations, showing little synchrony with latitude. In addition, we find that the reef ecosystems with the highest chlorophyll-a concentrations (Jarvis, Howland, Baker, Palmyra and Kingman) are each uninhabited and are characterized by high hard coral cover and large numbers of predatory fishes. Finally, we find that scaling environmental data to the spatial footprint of individual islands and atolls is more likely to capture local environmental forcings, as chlorophyll-a concentrations decreased at relatively short distances (>7 km) from 85% of our study locations. These metrics will help

  15. Metrics to quantify the importance of mixing state for CCN activity

    DOE PAGES

    Ching, Joseph; Fast, Jerome; West, Matthew; ...

    2017-06-21

    It is commonly assumed that models are more prone to errors in predicted cloud condensation nuclei (CCN) concentrations when the aerosol populations are externally mixed. In this work we investigate this assumption by using the mixing state index (χ) proposed by Riemer and West (2013) to quantify the degree of external and internal mixing of aerosol populations. We combine this metric with particle-resolved model simulations to quantify the error in CCN predictions when mixing state information is neglected, exploring a range of scenarios that cover different conditions of aerosol aging. We show that mixing state information does indeed become unimportant for more internally mixed populations, more precisely for populations with χ larger than 75%. For more externally mixed populations (χ below 20%) the relationship of χ and the error in CCN predictions is not unique and ranges from lower than -40% to about 150%, depending on the underlying aerosol population and the environmental supersaturation. We explain the reasons for this behavior with detailed process analyses.
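
    The mixing state index used here has a compact definition: per-particle and bulk species diversities are exponentials of Shannon entropies of mass fractions, and χ = (D_alpha - 1)/(D_gamma - 1) runs from 0 for a fully external mixture to 1 for a fully internal one. The sketch below checks both limits on a hypothetical two-species population.

      import numpy as np

      def mixing_state_index(masses):
          """masses: (n_particles, n_species) per-particle species masses -> chi."""
          m = np.asarray(masses, dtype=float)
          particle_mass = m.sum(axis=1)
          w = particle_mass / particle_mass.sum()       # particle mass fractions
          p = m / particle_mass[:, None]                # per-particle species fractions
          H_i = -np.sum(p * np.log(np.where(p > 0, p, 1.0)), axis=1)
          D_alpha = np.exp(np.sum(w * H_i))             # average particle diversity
          p_bulk = m.sum(axis=0) / m.sum()
          D_gamma = np.exp(-np.sum(p_bulk * np.log(np.where(p_bulk > 0, p_bulk, 1.0))))
          return (D_alpha - 1.0) / (D_gamma - 1.0)

      external = np.array([[1.0, 0.0], [0.0, 1.0]] * 50)   # pure particles
      internal = np.array([[0.5, 0.5]] * 100)              # identical mixtures
      print(mixing_state_index(external))                  # -> 0.0 (fully external)
      print(mixing_state_index(internal))                  # -> 1.0 (fully internal)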

  16. Metrics to quantify the importance of mixing state for CCN activity

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ching, Joseph; Fast, Jerome; West, Matthew

    It is commonly assumed that models are more prone to errors in predicted cloud condensation nuclei (CCN) concentrations when the aerosol populations are externally mixed. In this work we investigate this assumption by using the mixing state index (χ) proposed by Riemer and West (2013) to quantify the degree of external and internal mixing of aerosol populations. We combine this metric with particle-resolved model simulations to quantify the error in CCN predictions when mixing state information is neglected, exploring a range of scenarios that cover different conditions of aerosol aging. We show that mixing state information does indeed become unimportant for more internally mixed populations, more precisely for populations with χ larger than 75%. For more externally mixed populations (χ below 20%) the relationship of χ and the error in CCN predictions is not unique and ranges from lower than -40% to about 150%, depending on the underlying aerosol population and the environmental supersaturation. We explain the reasons for this behavior with detailed process analyses.

  17. Toward quantifying the effectiveness of water trading under uncertainty.

    PubMed

    Luo, B; Huang, G H; Zou, Y; Yin, Y Y

    2007-04-01

    This paper presents a methodology for quantifying the effectiveness of water trading under uncertainty, by developing an optimization model based on the interval-parameter two-stage stochastic program (TSP) technique. In the study, the effectiveness of a water-trading program is measured by the water volume that can be released through trading, from a statistical point of view. The methodology can also deal with recourse water allocation problems generated by randomness in water availability and, at the same time, tackle uncertainties expressed as intervals in the trading system. The developed methodology was tested on a hypothetical water-trading program in an agricultural system in the Swift Current Creek watershed, Canada. Study results indicate that the methodology can effectively measure the effectiveness of a trading program by estimating, over the long term, the water volume released through trading. A sensitivity analysis was also conducted to analyze the effects of different trading costs on the trading program. It shows that trading efforts would become ineffective when the trading costs are too high. The case study also demonstrates that the trading program is more effective in a dry season, when total water availability is in shortage.

  18. Design and Analysis of a Micromechanical Three-Component Force Sensor for Characterizing and Quantifying Surface Roughness

    NASA Astrophysics Data System (ADS)

    Liang, Q.; Wu, W.; Zhang, D.; Wei, B.; Sun, W.; Wang, Y.; Ge, Y.

    2015-10-01

    Roughness, which can represent the trade-off between manufacturing cost and performance of mechanical components, is a critical predictor of cracks, corrosion and fatigue damage. In order to measure polished or super-finished surfaces, a novel touch probe based on a three-component force sensor for characterizing and quantifying surface roughness is proposed, using silicon micromachining technology. The sensor design is based on a cross-beam structure, which gives the system high sensitivity and low coupling. The results show that the proposed sensor possesses high sensitivity, low coupling error, and a temperature-compensation capability. The proposed system can be used to investigate micromechanical structures with nanometer accuracy.

  19. Quantifying incident-induced travel delays on freeways using traffic sensor data : phase II

    DOT National Transportation Integrated Search

    2010-12-01

    Traffic incidents cause approximately 50 percent of freeway congestion in metropolitan areas, resulting in extra travel time and fuel cost. Quantifying incident-induced delay (IID) will help people better understand the real costs of incidents, maxim...

  20. Towards simulating and quantifying the light-cone EoR 21-cm signal

    NASA Astrophysics Data System (ADS)

    Mondal, Rajesh; Bharadwaj, Somnath; Datta, Kanan K.

    2018-02-01

    The light-cone (LC) effect causes the Epoch of Reionization (EoR) 21-cm signal T_b(\hat{n}, ν) to evolve significantly along the line-of-sight (LoS) direction ν. In the first part of this paper, we present a method to properly incorporate the LC effect in simulations of the EoR 21-cm signal that include peculiar velocities. Subsequently, we discuss how to quantify the second-order statistics of the EoR 21-cm signal in the presence of the LC effect. We demonstrate that the 3D power spectrum P(k) fails to quantify the entire information because it assumes the signal to be ergodic and periodic, whereas the LC effect breaks these conditions along the LoS. Considering a LC simulation centred at redshift 8 where the mean neutral fraction drops from 0.65 to 0.35 across the box, we find that P(k) misses out ~40 per cent of the information at the two ends of the 17.41 MHz simulation bandwidth. The multifrequency angular power spectrum (MAPS) C_ℓ(ν_1, ν_2) quantifies the statistical properties of T_b(\hat{n}, ν) without assuming the signal to be ergodic and periodic along the LoS. We expect this to quantify the entire statistical information of the EoR 21-cm signal. We apply MAPS to our LC simulation and present preliminary results for the EoR 21-cm signal.
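
    A minimal flat-sky stand-in for the MAPS estimator is sketched below: it cross-correlates the 2D Fourier modes of each pair of frequency channels and bins them in |ℓ|, so unequal-frequency pairs retain the line-of-sight evolution that P(k) averages away. The paper's full-sky normalization and binning conventions are not reproduced here:

    ```python
    import numpy as np

    def maps_flat_sky(cube, n_bins=10):
        """Flat-sky multifrequency angular power spectrum estimator (sketch).

        cube: (n_nu, n_x, n_y) brightness-temperature light-cone, one 2D sky
        slice per frequency channel. Returns C_ell[b, i, j] binned in |ell|.
        Pixel-area and sky-fraction normalizations are omitted.
        """
        n_nu, nx, ny = cube.shape
        ft = np.fft.fft2(cube - cube.mean(axis=(1, 2), keepdims=True))
        kx = np.fft.fftfreq(nx)[:, None]
        ky = np.fft.fftfreq(ny)[None, :]
        which = np.digitize(np.hypot(kx, ky).ravel(),
                            np.linspace(0, np.hypot(kx, ky).max(), n_bins + 1)) - 1

        # Cross-spectra for every channel pair (memory-heavy for large cubes)
        cross = np.einsum("ixy,jxy->ijxy", ft, ft.conj()).real
        cross = cross.reshape(n_nu, n_nu, -1)
        c_ell = np.zeros((n_bins, n_nu, n_nu))
        for b in range(n_bins):
            mask = which == b
            if mask.any():
                c_ell[b] = cross[:, :, mask].mean(axis=-1)
        return c_ell

    cube = np.random.default_rng(0).normal(size=(8, 32, 32))
    print(maps_flat_sky(cube).shape)   # (10, 8, 8): C_ell for every (nu1, nu2) pair
    ```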

  1. Quantifying and Generalizing Hydrologic Responses to Dam Regulation using a Statistical Modeling Approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McManamay, Ryan A

    2014-01-01

    Despite the ubiquitous existence of dams within riverscapes, much of our knowledge about dams and their environmental effects remains context-specific. Hydrology, more than any other environmental variable, has been studied in great detail with regard to dam regulation. While much progress has been made in generalizing the hydrologic effects of regulation by large dams, many aspects of hydrology show site-specific fidelity to dam operations, small dams (including diversions), and regional hydrologic regimes. A statistical modeling framework is presented to quantify and generalize hydrologic responses to varying degrees of dam regulation. Specifically, the objectives were to 1) compare the effects of local versus cumulative dam regulation, 2) determine the importance of different regional hydrologic regimes in influencing hydrologic responses to dams, and 3) evaluate how different regulation contexts lead to error in predicting hydrologic responses to dams. Overall, model performance was poor in quantifying the magnitude of hydrologic responses, but performance was sufficient in classifying hydrologic responses as negative or positive. Responses of some hydrologic indices to dam regulation were highly dependent upon hydrologic class membership and the purpose of the dam. The opposing coefficients between local and cumulative-dam predictors suggested that hydrologic responses to cumulative dam regulation are complex, and predicting the hydrology downstream of individual dams, as opposed to multiple dams, may be more easily accomplished using statistical approaches. Results also suggested that particular contexts, including multipurpose dams, high cumulative regulation by multiple dams, diversions, close proximity to dams, and certain hydrologic classes, are all sources of increased error when predicting hydrologic responses to dams. Statistical models, such as the ones presented herein, show promise in their ability to model the effects of dam regulation.

  2. Quantifying functional mobility progress for chronic disease management.

    PubMed

    Boyle, Justin; Karunanithi, Mohan; Wark, Tim; Chan, Wilbur; Colavitti, Christine

    2006-01-01

    A method for quantifying improvements in functional mobility is presented, based on patient-worn accelerometer devices. For patients with cardiovascular, respiratory, or other chronic disease, increasing the amount of functional mobility is a large component of rehabilitation programs. We have conducted an observational trial on the use of accelerometers for quantifying mobility improvements in a small group of chronic disease patients (n=15, 48-86 yrs). Cognitive impairments precluded complex instrumentation of patients, and movement data were obtained from a single 2-axis accelerometer device worn at the hip. In our trial, movement data collected from accelerometer devices were classified into Lying vs Sitting/Standing vs Walking/Activity movements. This classification enabled the amount of walking to be quantified and graphically presented to clinicians and carers for feedback on exercise efficacy. Presenting long-term trends in these data to patients also provides valuable feedback for self-managed care and assists with compliance.
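
    A rule-based sketch of the kind of three-way classification described is shown below. This is not the authors' classifier; the thresholds, epoch length, and axis conventions are illustrative assumptions:

    ```python
    import numpy as np

    def classify_epochs(ax, ay, fs=20, epoch_s=10,
                        tilt_thresh=0.5, activity_thresh=0.1):
        """Rule-based classing of 2-axis hip accelerometer epochs (sketch).

        ax, ay: acceleration in g; ax is assumed roughly vertical when upright.
        Returns one of {"lying", "sitting/standing", "walking/activity"} per epoch.
        """
        n = fs * epoch_s
        labels = []
        for i in range(0, len(ax) - n + 1, n):
            seg_x, seg_y = ax[i:i + n], ay[i:i + n]
            intensity = np.std(np.hypot(seg_x, seg_y))  # movement energy
            vertical = np.mean(seg_x)                   # gravity component
            if intensity > activity_thresh:
                labels.append("walking/activity")
            elif abs(vertical) < tilt_thresh:           # device near horizontal
                labels.append("lying")
            else:
                labels.append("sitting/standing")
        return labels
    ```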

  3. Quantifying and minimizing entropy generation in AMTEC cells

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hendricks, T.J.; Huang, C.

    1997-12-31

    Entropy generation in an AMTEC cell represents inherent power loss to the AMTEC cell. Minimizing cell entropy generation directly maximizes cell power generation and efficiency. An internal project is on-going at AMPS to identify, quantify and minimize entropy generation mechanisms within an AMTEC cell, with the goal of determining cost-effective design approaches for maximizing AMTEC cell power generation. Various entropy generation mechanisms have been identified and quantified. The project has investigated several cell design techniques in a solar-driven AMTEC system to minimize cell entropy generation and produce maximum power cell designs. In many cases, various sources of entropy generation are interrelated such that minimizing entropy generation requires cell and system design optimization. Some of the tradeoffs between various entropy generation mechanisms are quantified and explained and their implications on cell design are discussed. The relationship between AMTEC cell power and efficiency and entropy generation is presented and discussed.

  4. Quantified Energy Dissipation Rates in the Terrestrial Bow Shock: 1. Analysis Techniques and Methodology

    NASA Technical Reports Server (NTRS)

    Wilson, L. B., III; Sibeck, D. G.; Breneman, A.W.; Le Contel, O.; Cully, C.; Turner, D. L.; Angelopoulos, V.; Malaspina, D. M.

    2014-01-01

    We present a detailed outline and discussion of the analysis techniques used to compare the relevance of different energy dissipation mechanisms at collisionless shock waves. We show that the low-frequency, quasi-static fields contribute less to ohmic energy dissipation, (-j · E ) (minus current density times measured electric field), than their high-frequency counterparts. In fact, we found that high-frequency, large-amplitude (greater than 100 millivolts per meter and/or greater than 1 nanotesla) waves are ubiquitous in the transition region of collisionless shocks. We quantitatively show that their fields, through wave-particle interactions, cause enough energy dissipation to regulate the global structure of collisionless shocks. The purpose of this paper, part one of two, is to outline and describe in detail the background, analysis techniques, and theoretical motivation for our new results presented in the companion paper. The companion paper presents the results of our quantitative energy dissipation rate estimates and discusses the implications. Together, the two manuscripts present the first study quantifying the contribution that high-frequency waves provide, through wave-particle interactions, to the total energy dissipation budget of collisionless shock waves.

  5. Quantifying complexity in translational research: an integrated approach.

    PubMed

    Munoz, David A; Nembhard, Harriet Black; Kraschnewski, Jennifer L

    2014-01-01

    The purpose of this paper is to quantify complexity in translational research. The impact of major operational steps and technical requirements is calculated with respect to their ability to accelerate moving new discoveries into clinical practice. A three-phase integrated quality function deployment (QFD) and analytic hierarchy process (AHP) method was used to quantify complexity in translational research. A case study in obesity was used to assess usability. Generally, the evidence generated was valuable for understanding various components in translational research. Particularly, the authors found that collaboration networks, multidisciplinary team capacity and community engagement are crucial for translating new discoveries into practice. As the method is mainly based on subjective opinion, some argue that the results may be biased. However, a consistency ratio is calculated and used as a guide to subjectivity. Alternatively, a larger sample may be incorporated to reduce bias. The integrated QFD-AHP framework provides evidence that could be helpful to generate agreement, develop guidelines, allocate resources wisely, identify benchmarks and enhance collaboration among similar projects. Current conceptual models in translational research provide little or no clue to assess complexity. The proposed method aimed to fill this gap. Additionally, the literature review includes various features that have not been explored in translational research.
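
    As an illustration of the AHP component, the consistency ratio mentioned above can be computed from the principal eigenvalue of a pairwise comparison matrix. The sketch below uses a hypothetical three-criterion comparison, not the study's matrices:

    ```python
    import numpy as np

    # Saaty's random consistency index for matrix sizes n = 1..10
    RI = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12,
          6: 1.24, 7: 1.32, 8: 1.41, 9: 1.45, 10: 1.49}

    def ahp_weights_and_cr(A):
        """Priority weights and consistency ratio for an AHP pairwise matrix."""
        A = np.asarray(A, dtype=float)
        n = A.shape[0]
        eigvals, eigvecs = np.linalg.eig(A)
        k = np.argmax(eigvals.real)
        w = np.abs(eigvecs[:, k].real)
        w /= w.sum()                              # priority vector
        ci = (eigvals[k].real - n) / (n - 1)      # consistency index
        return w, ci / RI[n]                      # (weights, CR)

    # Hypothetical comparison: collaboration vs team capacity vs engagement
    A = [[1, 3, 5],
         [1/3, 1, 2],
         [1/5, 1/2, 1]]
    w, cr = ahp_weights_and_cr(A)
    print(w, f"CR = {cr:.3f}  (CR < 0.10 is conventionally acceptable)")
    ```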

  6. Quantifying complexity in translational research: an integrated approach

    PubMed Central

    Munoz, David A.; Nembhard, Harriet Black; Kraschnewski, Jennifer L.

    2014-01-01

    Purpose This article quantifies complexity in translational research. The impact of major operational steps and technical requirements (TR) is calculated with respect to their ability to accelerate moving new discoveries into clinical practice. Design/Methodology/Approach A three-phase integrated Quality Function Deployment (QFD) and Analytic Hierarchy Process (AHP) method was used to quantify complexity in translational research. A case study in obesity was used to assess usability. Findings Generally, the evidence generated was valuable for understanding various components in translational research. Particularly, we found that collaboration networks, multidisciplinary team capacity and community engagement are crucial for translating new discoveries into practice. Research limitations/implications As the method is mainly based on subjective opinion, some argue that the results may be biased. However, a consistency ratio is calculated and used as a guide to subjectivity. Alternatively, a larger sample may be incorporated to reduce bias. Practical implications The integrated QFD-AHP framework provides evidence that could be helpful to generate agreement, develop guidelines, allocate resources wisely, identify benchmarks and enhance collaboration among similar projects. Originality/value Current conceptual models in translational research provide little or no clue to assess complexity. The proposed method aimed to fill this gap. Additionally, the literature review includes various features that have not been explored in translational research. PMID:25417380

  7. Comprehensive analysis of individual pulp fiber bonds quantifies the mechanisms of fiber bonding in paper

    PubMed Central

    Hirn, Ulrich; Schennach, Robert

    2015-01-01

    The process of papermaking requires substantial amounts of energy and wood, which contributes to large environmental costs. In order to optimize papermaking for its many applications in material science and engineering, a quantitative understanding of the bonding forces between individual pulp fibers is important. Here we show the first approach to quantifying the bonding energies contributed by the individual bonding mechanisms. We calculated the impact on the bonding energy of the following mechanisms necessary for paper formation: mechanical interlocking, interdiffusion, capillary bridges, hydrogen bonding, Van der Waals forces, and Coulomb forces. Experimental results quantify the area in molecular contact necessary for bonding. Atomic force microscopy experiments derive the impact of mechanical interlocking. Capillary bridges also contribute to the bond. A model based on the crystal structure of cellulose leads to values for the chemical bonds. In contrast to general belief, which favors hydrogen bonding, Van der Waals bonds play the most important role according to our model. Comparison with experimentally derived bond energies supports the presented model. This study characterizes bond formation between pulp fibers, leading to insight that could potentially be used to optimize the papermaking process while reducing energy and wood consumption. PMID:26000898

  8. The likelihood of achieving quantified road safety targets: a binary logistic regression model for possible factors.

    PubMed

    Sze, N N; Wong, S C; Lee, C Y

    2014-12-01

    In the past several decades, many countries have set quantified road safety targets to motivate transport authorities to develop systematic road safety strategies and measures and to facilitate the achievement of continuous road safety improvement. Studies have been conducted to evaluate the association between the setting of quantified road safety targets and road fatality reduction, in both the short and long run, by comparing road fatalities before and after the implementation of a quantified road safety target. However, not much work has been done to evaluate whether quantified road safety targets are actually achieved. In this study, we used a binary logistic regression model to examine the factors - including vehicle ownership, fatality rate, and national income, in addition to level of ambition and duration of target - that contribute to a target's success. We analyzed 55 quantified road safety targets set by 29 countries from 1981 to 2009, and the results indicate that targets that are in progress and have lower levels of ambition had a higher likelihood of eventually being achieved. Moreover, possible interaction effects on the association between level of ambition and the likelihood of success are also revealed. Copyright © 2014 Elsevier Ltd. All rights reserved.
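
    A minimal sketch of the modelling approach, using synthetic stand-in data; the predictors, coefficients, and odds ratios below are illustrative, not the paper's estimates:

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    # Hypothetical standardized features for 55 targets (as in the study):
    # level of ambition, duration, vehicle ownership, fatality rate, income.
    rng = np.random.default_rng(1)
    X = rng.normal(size=(55, 5))
    y = (X @ np.array([-1.2, 0.3, 0.1, -0.4, 0.5])     # more ambition -> less success
         + rng.normal(scale=0.5, size=55)) > 0         # binary outcome: achieved?

    model = LogisticRegression().fit(X, y)
    odds_ratios = np.exp(model.coef_.ravel())          # effect per 1-SD increase
    print(dict(zip(["ambition", "duration", "ownership", "fatality", "income"],
                   odds_ratios.round(2))))
    ```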

  9. Development of a simple computerized torsion test to quantify subjective ocular torsion.

    PubMed

    Kim, Y D; Yang, H K; Hwang, J-M

    2017-11-01

    Purpose: The double Maddox-rod test (DMRT) and Lancaster red-green test (LRGT) are the most widely used tests worldwide to assess subjective ocular torsion. However, these tests require equipment and the quantified results of ocular torsion are only provided in rough values. Here we developed a novel computerized torsion test (CTT) for individual assessment of subjective ocular torsion and validated the reliability and accuracy of the test compared with those of the DMRT and LRGT. Methods: A total of 30 patients with cyclovertical strabismus and 30 controls were recruited. The CTT was designed using Microsoft Office PowerPoint. Subjects wore red-green filter spectacles and viewed gradually tilted red and cyan lines on an LCD monitor and pressed the keyboard to go through the slides, until both lines seemed parallel. All subjects underwent the CTT, DMRT, and LRGT. Intraclass correlation coefficients and Bland-Altman plots were analyzed to assess the acceptability of the CTT compared with that of the DMRT. Results: Both the DMRT and CTT showed no significant test-retest differences in the strabismus and control groups. The DMRT and CTT results demonstrated an acceptable agreement. The reliability of the CTT was better than that of the DMRT. The LRGT showed low sensitivity for the detection of ocular torsion compared with the DMRT (40.0%) and CTT (39.1%). Conclusion: Our results suggest that the assessment of subjective ocular torsion using the CTT based on PowerPoint software is simple, reproducible, and accurate and can be applied in clinical practice.
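
    For reference, the Bland-Altman agreement statistics used to compare the CTT with the DMRT reduce to the mean difference and its 95% limits of agreement. A sketch with hypothetical torsion measurements:

    ```python
    import numpy as np

    def bland_altman(a, b):
        """Bias and 95% limits of agreement between two measurement methods."""
        a, b = np.asarray(a, float), np.asarray(b, float)
        diff = a - b
        bias = diff.mean()
        half_width = 1.96 * diff.std(ddof=1)
        return bias, (bias - half_width, bias + half_width)

    # Hypothetical excyclotorsion (degrees) measured by CTT and DMRT
    ctt  = [4.5, 6.0, 2.5, 8.0, 5.0, 3.5]
    dmrt = [5.0, 5.0, 3.0, 7.5, 5.5, 4.0]
    bias, limits = bland_altman(ctt, dmrt)
    print(f"bias = {bias:+.2f} deg, 95% LoA = ({limits[0]:.2f}, {limits[1]:.2f})")
    ```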

  10. Quantifying falsifiability of scientific theories

    NASA Astrophysics Data System (ADS)

    Nemenman, Ilya

    I argue that the notion of falsifiability, a key concept in defining a valid scientific theory, can be quantified using Bayesian Model Selection, which is a standard tool in modern statistics. This relates falsifiability to the quantitative version of the statistical Occam's razor, and allows transforming some long-running arguments about validity of scientific theories from philosophical discussions to rigorous mathematical calculations.
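
    As a concrete instance of the statistical Occam's razor invoked here, a Bayes factor can be approximated from BIC differences. The sketch below compares a simple and a flexible model on synthetic data; the setup is illustrative, not from the abstract:

    ```python
    import numpy as np

    # Data generated by a simple (more falsifiable) linear model
    rng = np.random.default_rng(6)
    x = np.linspace(0, 1, 50)
    y = 2.0 * x + rng.normal(scale=0.1, size=x.size)

    def bic(residuals, k, n):
        """Bayesian information criterion for Gaussian residuals, k parameters."""
        return n * np.log(np.mean(residuals ** 2)) + k * np.log(n)

    r_line    = y - np.polyval(np.polyfit(x, y, 1), x)   # 2 parameters
    r_quintic = y - np.polyval(np.polyfit(x, y, 5), x)   # 6 parameters

    # ln(Bayes factor) ~ (BIC_complex - BIC_simple)/2 > 0 favors the simpler model
    print(0.5 * (bic(r_quintic, 6, x.size) - bic(r_line, 2, x.size)))
    ```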

  11. Quantifying ecological impacts of mass extinctions with network analysis of fossil communities

    PubMed Central

    Muscente, A. D.; Prabhu, Anirudh; Zhong, Hao; Eleish, Ahmed; Meyer, Michael B.; Fox, Peter; Hazen, Robert M.; Knoll, Andrew H.

    2018-01-01

    Mass extinctions documented by the fossil record provide critical benchmarks for assessing changes through time in biodiversity and ecology. Efforts to compare biotic crises of the past and present, however, encounter difficulty because taxonomic and ecological changes are decoupled, and although various metrics exist for describing taxonomic turnover, no methods have yet been proposed to quantify the ecological impacts of extinction events. To address this issue, we apply a network-based approach to exploring the evolution of marine animal communities over the Phanerozoic Eon. Network analysis of fossil co-occurrence data enables us to identify nonrandom associations of interrelated paleocommunities. These associations, or evolutionary paleocommunities, dominated total diversity during successive intervals of relative community stasis. Community turnover occurred largely during mass extinctions and radiations, when ecological reorganization resulted in the decline of one association and the rise of another. Altogether, we identify five evolutionary paleocommunities at the generic and familial levels in addition to three ordinal associations that correspond to Sepkoski’s Cambrian, Paleozoic, and Modern evolutionary faunas. In this context, we quantify magnitudes of ecological change by measuring shifts in the representation of evolutionary paleocommunities over geologic time. Our work shows that the Great Ordovician Biodiversification Event had the largest effect on ecology, followed in descending order by the Permian–Triassic, Cretaceous–Paleogene, Devonian, and Triassic–Jurassic mass extinctions. Despite its taxonomic severity, the Ordovician extinction did not strongly affect co-occurrences of taxa, affirming its limited ecological impact. Network paleoecology offers promising approaches to exploring ecological consequences of extinctions and radiations. PMID:29686079
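
    A toy version of the co-occurrence network construction follows, with invented taxon lists; modularity-based community detection stands in for the association-finding step, and is an assumption rather than the authors' exact method:

    ```python
    import itertools
    import networkx as nx
    from networkx.algorithms.community import greedy_modularity_communities

    # Toy data: each fossil collection lists the taxa found together.
    collections = [
        {"Trilobita", "Lingulata", "Eocrinoidea"},              # "Cambrian-like"
        {"Trilobita", "Lingulata"},
        {"Orthida", "Crinoidea", "Tabulata", "Strophomenida"},  # "Paleozoic-like"
        {"Orthida", "Crinoidea", "Tabulata"},
        {"Bivalvia", "Gastropoda", "Echinoidea"},               # "Modern-like"
        {"Bivalvia", "Gastropoda"},
    ]

    # Nodes are taxa; weighted edges count how often two taxa co-occur.
    G = nx.Graph()
    for taxa in collections:
        for a, b in itertools.combinations(sorted(taxa), 2):
            w = G.edges[a, b]["weight"] + 1 if G.has_edge(a, b) else 1
            G.add_edge(a, b, weight=w)

    # Nonrandom associations ~ communities in the co-occurrence graph
    for community in greedy_modularity_communities(G, weight="weight"):
        print(sorted(community))
    ```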

  12. Quantifying ecological impacts of mass extinctions with network analysis of fossil communities.

    PubMed

    Muscente, A D; Prabhu, Anirudh; Zhong, Hao; Eleish, Ahmed; Meyer, Michael B; Fox, Peter; Hazen, Robert M; Knoll, Andrew H

    2018-05-15

    Mass extinctions documented by the fossil record provide critical benchmarks for assessing changes through time in biodiversity and ecology. Efforts to compare biotic crises of the past and present, however, encounter difficulty because taxonomic and ecological changes are decoupled, and although various metrics exist for describing taxonomic turnover, no methods have yet been proposed to quantify the ecological impacts of extinction events. To address this issue, we apply a network-based approach to exploring the evolution of marine animal communities over the Phanerozoic Eon. Network analysis of fossil co-occurrence data enables us to identify nonrandom associations of interrelated paleocommunities. These associations, or evolutionary paleocommunities, dominated total diversity during successive intervals of relative community stasis. Community turnover occurred largely during mass extinctions and radiations, when ecological reorganization resulted in the decline of one association and the rise of another. Altogether, we identify five evolutionary paleocommunities at the generic and familial levels in addition to three ordinal associations that correspond to Sepkoski's Cambrian, Paleozoic, and Modern evolutionary faunas. In this context, we quantify magnitudes of ecological change by measuring shifts in the representation of evolutionary paleocommunities over geologic time. Our work shows that the Great Ordovician Biodiversification Event had the largest effect on ecology, followed in descending order by the Permian-Triassic, Cretaceous-Paleogene, Devonian, and Triassic-Jurassic mass extinctions. Despite its taxonomic severity, the Ordovician extinction did not strongly affect co-occurrences of taxa, affirming its limited ecological impact. Network paleoecology offers promising approaches to exploring ecological consequences of extinctions and radiations. Copyright © 2018 the Author(s). Published by PNAS.

  13. Quantifying the Availability of Vertebrate Hosts to Ticks: A Camera-Trapping Approach

    PubMed Central

    Hofmeester, Tim R.; Rowcliffe, J. Marcus; Jansen, Patrick A.

    2017-01-01

    The availability of vertebrate hosts is a major determinant of the occurrence of ticks and tick-borne zoonoses in natural and anthropogenic ecosystems and thus drives disease risk for wildlife, livestock, and humans. However, it remains challenging to quantify the availability of vertebrate hosts in field settings, particularly for medium-sized to large-bodied mammals. Here, we present a method that uses camera traps to quantify the availability of warm-bodied vertebrates to ticks. The approach is to deploy camera traps at questing height at a representative sample of random points across the study area, measure the average photographic capture rate for vertebrate species, and then correct these rates for the effective detection distance. The resulting “passage rate” is a standardized measure of the frequency at which vertebrates approach questing ticks, which we show is proportional to contact rate. A field test across twenty 1-ha forest plots in the Netherlands indicated that this method effectively captures differences in wildlife assemblage composition between sites. Also, the relative abundances of three life stages of the sheep tick Ixodes ricinus from drag sampling were correlated with passage rates of deer, which agrees with the known association with this group of host species, suggesting that passage rate effectively reflects the availability of medium- to large-sized hosts to ticks. This method will facilitate quantitative studies of the relationship between densities of questing ticks and the availability of different vertebrate species—wild as well as domesticated species—in natural and anthropogenic settings. PMID:28770219
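
    A sketch of the capture-rate correction described above; the exact normalization used in the study may differ, so the constants and the division by twice the effective detection distance are assumptions:

    ```python
    import numpy as np

    def passage_rate(n_passages, camera_days, detection_distances_m):
        """Camera-trap passage rate (sketch).

        Capture rate (passages per camera-day) corrected by the effective
        detection distance (EDD), giving passages per metre of detection
        front per day.
        """
        edd = np.mean(detection_distances_m)      # effective detection distance
        capture_rate = n_passages / camera_days
        return capture_rate / (2.0 * edd)         # detection front spans 2 x EDD

    # Hypothetical deer detections: 18 passages over 120 camera-days, EDD ~ 6 m
    print(f"{passage_rate(18, 120, [5.5, 6.0, 6.5]):.4f} passages / m / day")
    ```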

  14. Quantifying the ventilatory control contribution to sleep apnoea using polysomnography.

    PubMed

    Terrill, Philip I; Edwards, Bradley A; Nemati, Shamim; Butler, James P; Owens, Robert L; Eckert, Danny J; White, David P; Malhotra, Atul; Wellman, Andrew; Sands, Scott A

    2015-02-01

    Elevated loop gain, consequent to hypersensitive ventilatory control, is a primary nonanatomical cause of obstructive sleep apnoea (OSA) but it is not possible to quantify this in the clinic. Here we provide a novel method to estimate loop gain in OSA patients using routine clinical polysomnography alone. We use the concept that spontaneous ventilatory fluctuations due to apnoeas/hypopnoeas (disturbance) result in opposing changes in ventilatory drive (response) as determined by loop gain (response/disturbance). Fitting a simple ventilatory control model (including chemical and arousal contributions to ventilatory drive) to the ventilatory pattern of OSA reveals the underlying loop gain. Following mathematical-model validation, we critically tested our method in patients with OSA by comparison with a standard (continuous positive airway pressure (CPAP) drop method), and by assessing its ability to detect the known reduction in loop gain with oxygen and acetazolamide. Our method quantified loop gain from baseline polysomnography (correlation versus CPAP-estimated loop gain: n=28; r=0.63, p<0.001), detected the known reduction in loop gain with oxygen (n=11; mean±sem change in loop gain (ΔLG) -0.23±0.08, p=0.02) and acetazolamide (n=11; ΔLG -0.20±0.06, p=0.005), and predicted the OSA response to loop gain-lowering therapy. We validated a means to quantify the ventilatory control contribution to OSA pathogenesis using clinical polysomnography, enabling identification of likely responders to therapies targeting ventilatory control. Copyright ©ERS 2015.

  15. Quantifying the ventilatory control contribution to sleep apnoea using polysomnography

    PubMed Central

    Terrill, Philip I.; Edwards, Bradley A.; Nemati, Shamim; Butler, James P.; Owens, Robert L.; Eckert, Danny J.; White, David P.; Malhotra, Atul; Wellman, Andrew; Sands, Scott A.

    2015-01-01

    Elevated loop gain, consequent to hypersensitive ventilatory control, is a primary nonanatomical cause of obstructive sleep apnoea (OSA) but it is not possible to quantify this in the clinic. Here we provide a novel method to estimate loop gain in OSA patients using routine clinical polysomnography alone. We use the concept that spontaneous ventilatory fluctuations due to apnoeas/hypopnoeas (disturbance) result in opposing changes in ventilatory drive (response) as determined by loop gain (response/disturbance). Fitting a simple ventilatory control model (including chemical and arousal contributions to ventilatory drive) to the ventilatory pattern of OSA reveals the underlying loop gain. Following mathematical-model validation, we critically tested our method in patients with OSA by comparison with a standard (continuous positive airway pressure (CPAP) drop method), and by assessing its ability to detect the known reduction in loop gain with oxygen and acetazolamide. Our method quantified loop gain from baseline polysomnography (correlation versus CPAP-estimated loop gain: n=28; r=0.63, p<0.001), detected the known reduction in loop gain with oxygen (n=11; mean±SEM change in loop gain (ΔLG) −0.23±0.08, p=0.02) and acetazolamide (n=11; ΔLG −0.20±0.06, p=0.005), and predicted the OSA response to loop gain-lowering therapy. We validated a means to quantify the ventilatory control contribution to OSA pathogenesis using clinical polysomnography, enabling identification of likely responders to therapies targeting ventilatory control. PMID:25323235

  16. Quantifying South East Asia's forest degradation using latest generation optical and radar satellite remote sensing

    NASA Astrophysics Data System (ADS)

    Broich, M.; Tulbure, M. G.; Wijaya, A.; Weisse, M.; Stolle, F.

    2017-12-01

    Deforestation and forest degradation form the second-largest source of anthropogenic CO2 emissions. While deforestation is being globally mapped with satellite image time series, degradation remains insufficiently quantified. Previous studies quantified degradation for small-scale, local sites. A method suitable for accurate mapping across large areas has not yet been developed, due to the variability of the low-magnitude, short-lived degradation signal and the absence of data with suitable resolution properties. Here we use a combination of newly available streams of free optical and radar image time series acquired by NASA and ESA, and HPC-based data science algorithms, to quantify degradation consistently across Southeast Asia (SEA). We used Sentinel-1 C-band radar data and NASA's new Harmonized Landsat 8 (L8) Sentinel-2 (S2) product (HLS) for cloud-free optical images. Our results show that dense time series of cloud-penetrating Sentinel-1 C-band radar can provide degradation alarm flags, while the HLS product of cloud-free optical images can unambiguously confirm degradation alarms. The detectability of degradation differed across SEA. In the seasonal forest of continental SEA, the reliability of our radar-based alarm flags increased as the variability in landscape moisture decreased in the dry season. We reliably confirmed alarms with optical image time series during the late dry season, when degradation in open-canopy forests becomes detectable once the undergrowth vegetation has died down. Conversely, in insular SEA, where landscape moisture variability is low, the radar time series generated degradation alarm flags with moderate to high reliability throughout the year, further confirmed with the HLS product. Based on the HLS product we can now confirm degradation within <6 months on average, as opposed to 1 year when using either L8 or S2 alone. In contrast to continental SEA, across insular SEA our degradation maps are not suitable to provide annual maps of total

  17. Quantifying Semantic Linguistic Maturity in Children.

    PubMed

    Hansson, Kristina; Bååth, Rasmus; Löhndorf, Simone; Sahlén, Birgitta; Sikström, Sverker

    2016-10-01

    We propose a method to quantify semantic linguistic maturity (SELMA) based on a high dimensional semantic representation of words created from the co-occurrence of words in a large text corpus. The method was applied to oral narratives from 108 children aged 4;0-12;10. By comparing the SELMA measure with maturity ratings made by human raters we found that SELMA predicted the rating of semantic maturity made by human raters over and above the prediction made using a child's age and number of words produced. We conclude that the semantic content of narratives changes in a predictable pattern with children's age and argue that SELMA is a measure quantifying semantic linguistic maturity. The study opens up the possibility of using quantitative measures for studying the development of semantic representation in children's narratives, and emphasizes the importance of word co-occurrences for understanding the development of meaning.
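
    An LSA-style stand-in for the high-dimensional semantic representation is sketched below. This is not the authors' SELMA pipeline; the tiny corpus, the SVD dimensionality, and the similarity-to-centroid score are all toy assumptions:

    ```python
    import numpy as np
    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.decomposition import TruncatedSVD
    from sklearn.metrics.pairwise import cosine_similarity

    # Toy corpus standing in for the large background corpus of the study.
    corpus = ["the dog ran to the park", "the cat sat on the mat",
              "children play games in the park", "we read books about animals",
              "the story describes how animals live in the forest"]

    # Word-count space compressed with SVD (an LSA-style semantic space).
    counts = CountVectorizer().fit(corpus)
    X = counts.transform(corpus)
    svd = TruncatedSVD(n_components=3, random_state=0).fit(X)

    def narrative_vector(text):
        return svd.transform(counts.transform([text]))

    # A SELMA-like score: similarity of a narrative to the corpus centroid.
    centroid = svd.transform(X).mean(axis=0, keepdims=True)
    child_story = "the dog and the cat play in the park"
    print(cosine_similarity(narrative_vector(child_story), centroid))
    ```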

  18. New Approaches to Quantifying Transport Model Error in Atmospheric CO2 Simulations

    NASA Technical Reports Server (NTRS)

    Ott, L.; Pawson, S.; Zhu, Z.; Nielsen, J. E.; Collatz, G. J.; Gregg, W. W.

    2012-01-01

    In recent years, much progress has been made in observing CO2 distributions from space. However, the use of these observations to infer source/sink distributions in inversion studies continues to be complicated by difficulty in quantifying atmospheric transport model errors. We will present results from several different experiments designed to quantify different aspects of transport error using the Goddard Earth Observing System, Version 5 (GEOS-5) Atmospheric General Circulation Model (AGCM). In the first set of experiments, an ensemble of simulations is constructed using perturbations to parameters in the model's moist physics and turbulence parameterizations that control sub-grid scale transport of trace gases. Analysis of the ensemble spread and scales of temporal and spatial variability among the simulations allows insight into how parameterized, small-scale transport processes influence simulated CO2 distributions. In the second set of experiments, atmospheric tracers representing model error are constructed using observation minus analysis statistics from NASA's Modern-Era Retrospective Analysis for Research and Applications (MERRA). The goal of these simulations is to understand how errors in large scale dynamics are distributed, and how they propagate in space and time, affecting trace gas distributions. These simulations will also be compared to results from NASA's Carbon Monitoring System Flux Pilot Project that quantified the impact of uncertainty in satellite constrained CO2 flux estimates on atmospheric mixing ratios to assess the major factors governing uncertainty in global and regional trace gas distributions.

  19. Quantifying VOC emissions from polymers: A case study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schulze, J.K.; Qasem, J.S.; Snoddy, R.

    1996-12-31

    Evaluating residual volatile organic compound emissions emanating from low-density polyethylene can pose significant challenges. These challenges include quantifying emissions from: (a) multiple process lines with different operating conditions; (b) several different comonomers; (c) variations of comonomer content in each grade; and (d) over 120 grades of LDPE. This presentation is a Case Study outlining a project to develop grade-specific emission data for low-density polyethylene pellets. This study included extensive laboratory analyses and required the development of a relational database to compile analytical results, calculate the mean concentration and standard deviation, and generate emissions reports.

  20. COMPARISON OF MEASUREMENT TECHNIQUES FOR QUANTIFYING SELECTED ORGANIC EMISSIONS FROM KEROSENE SPACE HEATERS

    EPA Science Inventory

    The report gives results of (1) a comparison of the hood and chamber techniques for quantifying pollutant emission rates from unvented combustion appliances, and (2) an assessment of the semivolatile and nonvolatile organic-compound emissions from unvented kerosene space heaters. In ...

  1. Meta-Analysis of Quantification Methods Shows that Archaea and Bacteria Have Similar Abundances in the Subseafloor

    PubMed Central

    May, Megan K.; Kevorkian, Richard T.; Steen, Andrew D.

    2013-01-01

    There is no universally accepted method to quantify bacteria and archaea in seawater and marine sediments, and different methods have produced conflicting results with the same samples. To identify best practices, we compiled data from 65 studies, plus our own measurements, in which bacteria and archaea were quantified with fluorescent in situ hybridization (FISH), catalyzed reporter deposition FISH (CARD-FISH), polyribonucleotide FISH, or quantitative PCR (qPCR). To estimate efficiency, we defined “yield” to be the sum of bacteria and archaea counted by these techniques divided by the total number of cells. In seawater, the yield was high (median, 71%) and was similar for FISH, CARD-FISH, and polyribonucleotide FISH. In sediments, only measurements by CARD-FISH in which archaeal cells were permeabilized with proteinase K showed high yields (median, 84%). Therefore, the majority of cells in both environments appear to be alive, since they contain intact ribosomes. In sediments, the sum of bacterial and archaeal 16S rRNA gene qPCR counts was not closely related to cell counts, even after accounting for variations in copy numbers per genome. However, qPCR measurements were precise relative to other qPCR measurements made on the same samples. qPCR is therefore a reliable relative quantification method. Inconsistent results for the relative abundance of bacteria versus archaea in deep subsurface sediments were resolved by the removal of CARD-FISH measurements in which lysozyme was used to permeabilize archaeal cells and qPCR measurements which used ARCH516 as an archaeal primer or TaqMan probe. Data from best-practice methods showed that archaea and bacteria decreased as the depth in seawater and marine sediments increased, although archaea decreased more slowly. PMID:24096423

  2. Quantifying Standing Dead Tree Volume and Structural Loss with Voxelized Terrestrial Lidar Data

    NASA Astrophysics Data System (ADS)

    Popescu, S. C.; Putman, E.

    2017-12-01

    were characterized by estimating the amount of volumetric loss occurring in 20 equal-interval height bins of each SDT. Results showed that large pine snags exhibited more rapid structural loss in comparison to medium-sized oak snags in this study.

  3. Quantifying thermal modifications on laser welded skin tissue

    NASA Astrophysics Data System (ADS)

    Tabakoglu, Hasim Ö.; Gülsoy, Murat

    2011-02-01

    Laser tissue welding is a potential medical treatment method, especially for closing cuts made during any kind of surgery. The photothermal effects of laser on tissue should be quantified in order to determine optimal dosimetry parameters. Polarized light and phase contrast techniques reveal information about the extent of thermal change that occurred over the tissue during the laser welding application. Changes in collagen structure can be detected in skin tissue samples stained with hematoxylin and eosin. In this study, three different near-infrared laser wavelengths (809 nm, 980 nm and 1070 nm) were compared for skin welding efficiency. 1 cm long cuts were treated with spot-by-spot laser application on Wistar rats' dorsal skin, in vivo. In all laser applications, 0.5 W of optical power was delivered to the tissue for 5 s continuously, resulting in 79.61 J/cm2 energy density (15.92 W/cm2 power density) for each spot. The 1st, 4th, 7th, 14th, and 21st days of the recovery period were determined as control days, and skin samples needed for histology were removed on these particular days. The stained samples were examined under a light microscope. Images were taken with a CCD camera and examined with imaging software. The 809 nm laser was found to be capable of creating strong full-thickness closure, but thermal damage was evident. The thermal damage from 980 nm laser welding was found to be more tolerable. The results showed that 1070 nm laser welding produced noticeably stronger bonds with minimal scar formation.

  4. Quantifying the seismicity on Taiwan

    NASA Astrophysics Data System (ADS)

    Wu, Yi-Hsuan; Chen, Chien-Chih; Turcotte, Donald L.; Rundle, John B.

    2013-07-01

    We quantify the seismicity on the island of Taiwan using the frequency-magnitude statistics of earthquakes since 1900. A break in Gutenberg-Richter scaling for large earthquakes has been observed in global seismicity; this break is also observed in our Taiwan study. The seismic data from the Central Weather Bureau Seismic Network are in good agreement with the Gutenberg-Richter relation taking b ≈ 1 when M < 7. For large earthquakes, M ≥ 7, the seismic data fit Gutenberg-Richter scaling with b ≈ 1.5. If the Gutenberg-Richter scaling for M < 7 earthquakes is extrapolated to larger earthquakes, we would expect a M > 8 earthquake in the study region about every 25 yr. However, our analysis shows a lower frequency of occurrence of large earthquakes, so that the expected recurrence time of M > 8 earthquakes is about 200 yr. The level of seismicity for smaller earthquakes on Taiwan is about 12 times greater than in Southern California, and the possibility of a M ≈ 9 earthquake north or south of Taiwan cannot be ruled out. In light of the Fukushima, Japan nuclear disaster, we also discuss the implications of our study for the three operating nuclear power plants on the coast of Taiwan.
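
    The frequency-magnitude calculation behind recurrence estimates of this kind can be reproduced in a few lines. The sketch below uses a synthetic catalogue and the Aki (1965) maximum-likelihood b-value, not the Central Weather Bureau data:

    ```python
    import numpy as np

    rng = np.random.default_rng(2)

    # Synthetic catalogue: magnitudes above completeness M_c follow the
    # Gutenberg-Richter law, i.e. an exponential distribution with rate b*ln(10).
    b_true, m_c = 1.0, 4.0
    mags = m_c + rng.exponential(scale=1.0 / (b_true * np.log(10)), size=5000)

    # Aki (1965) maximum-likelihood b-value (add dm/2 to m_c for binned magnitudes)
    b_hat = np.log10(np.e) / (mags.mean() - m_c)

    # Recurrence of M >= 8 events: catalogue spans 100 yr, N(>=m) = 10**(a - b*m)
    years = 100.0
    a_hat = np.log10(len(mags)) + b_hat * m_c      # since N(>= m_c) = len(mags)
    n_m8_per_year = 10 ** (a_hat - b_hat * 8.0) / years
    print(f"b = {b_hat:.2f}, expected M>=8 recurrence ~ {1.0 / n_m8_per_year:.0f} yr")
    ```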

  5. Quantifying the tibiofemoral joint space using x-ray tomosynthesis.

    PubMed

    Kalinosky, Benjamin; Sabol, John M; Piacsek, Kelly; Heckel, Beth; Gilat Schmidt, Taly

    2011-12-01

    trials. A linear fit estimated a slope of 0.887 (R² = 0.962) and a mean error across all trials of 0.34 mm for the PA phantom data. The estimated minimum JSW values for the lateral adjustable phantom acquisitions were found to have low correlation to the measured values (R² = 0.377), with a mean error of 2.13 mm. The error in the lateral adjustable-phantom datasets appeared to be caused by artifacts due to unrealistic features in the phantom bones. JSW maps generated by DTS and CT varied by a mean of 0.6 mm and 0.8 mm across the knee joint, for PA and lateral scans. The tibial and femoral edges were successfully segmented and JSW maps determined for PA and lateral clinical DTS datasets. A semiautomated method is presented for quantifying the 3D joint space in a 2D JSW map using tomosynthesis images. The proposed algorithm quantified the JSW across the knee joint to sub-millimeter accuracy for PA tomosynthesis acquisitions. Overall, the results suggest that x-ray tomosynthesis may be beneficial for diagnosing and monitoring disease progression or treatment of osteoarthritis by providing quantitative images of JSW in the load-bearing knee.

  6. Two complementary approaches to quantify variability in heat resistance of spores of Bacillus subtilis.

    PubMed

    den Besten, Heidy M W; Berendsen, Erwin M; Wells-Bennik, Marjon H J; Straatsma, Han; Zwietering, Marcel H

    2017-07-17

    Realistic prediction of microbial inactivation in food requires quantitative information on variability introduced by the microorganisms. Bacillus subtilis forms heat-resistant spores, and in this study the impact of strain variability on spore heat resistance was quantified using 20 strains. In addition, experimental variability was quantified by using technical replicates per heat treatment experiment, and reproduction variability was quantified by using two biologically independent spore crops for each strain that were heat treated on different days. The fourth decimal reduction times and z-values were estimated using one-step and two-step model fitting procedures. Grouping of the 20 B. subtilis strains into two statistically distinguishable groups could be confirmed based on their spore heat resistance. The reproduction variability was higher than experimental variability, but both variabilities were much lower than strain variability. The model fitting approach did not significantly affect the quantification of variability. Remarkably, when strain variability in spore heat resistance was quantified using only the strains producing low-level heat resistant spores, this strain variability was comparable with the previously reported strain variability in heat resistance of vegetative cells of Listeria monocytogenes, although in a totally different temperature range. Strains that produced spores with high-level heat resistance showed a similar temperature range for growth to strains that produced spores with low-level heat resistance. Strain variability affected the heat resistance of spores most, and therefore integration of this variability factor into modelling of spore heat resistance will make predictions more realistic. Copyright © 2017. Published by Elsevier B.V.
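
    A sketch of the two-step fitting procedure on invented survival curves: log-linear inactivation gives a D-value at each temperature, and the slope of log10 D versus temperature gives the z-value (the temperature rise producing a tenfold drop in D):

    ```python
    import numpy as np

    # Step 1: at each temperature, log10 survivors declines linearly with time;
    # D is the time for one log10 reduction (4D = time for four reductions).
    def d_value(times, log10_counts):
        slope, _ = np.polyfit(times, log10_counts, 1)
        return -1.0 / slope

    # Hypothetical survival curves at three temperatures (min, log10 CFU/mL)
    curves = {
        95:  ([0, 5, 10, 15], [7.0, 6.0, 5.1, 3.9]),
        100: ([0, 2, 4, 6],   [7.0, 5.9, 4.8, 3.8]),
        105: ([0, 1, 2, 3],   [7.0, 5.5, 4.1, 2.4]),
    }
    temps = np.array(sorted(curves))
    log10_D = np.array([np.log10(d_value(*curves[T])) for T in temps])

    # Step 2: z-value from the slope of log10 D versus temperature
    slope, _ = np.polyfit(temps, log10_D, 1)
    z_value = -1.0 / slope
    print(f"D-values (min): {10 ** log10_D.round(2)}, z = {z_value:.1f} deg C")
    ```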

  7. Survey results show that adults are willing to pay higher insurance premiums for generous coverage of specialty drugs.

    PubMed

    Romley, John A; Sanchez, Yuri; Penrod, John R; Goldman, Dana P

    2012-04-01

    Generous coverage of specialty drugs for cancer and other diseases may be valuable not only for sick patients currently using these drugs, but also for healthy people who recognize the potential need for them in the future. This study estimated how healthy people value insurance coverage of specialty drugs, defined as high-cost drugs that treat cancer and other serious health conditions like multiple sclerosis, by quantifying willingness to pay via a survey. US adults were estimated to be willing to pay an extra $12.94 on average in insurance premiums per month for generous specialty-drug coverage--in effect, $2.58 for every dollar in out-of-pocket costs that they would expect to pay with a less generous insurance plan. Given the value that people assign to generous coverage of specialty drugs, having high cost sharing on these drugs seemingly runs contrary to what people value in their health insurance.

  8. Spatially quantifying and attributing 17 years of land cover change to examine post-agricultural forest transition in Hawai`i

    NASA Astrophysics Data System (ADS)

    Lucas, M.; Trauernicht, C.; Carlson, K. M.; Miura, T.; Giambelluca, T. W.; Chen, Q.

    2017-12-01

    The past decades in Hawaii have seen large-scale land use change and land cover shifts. However, many of these dynamics are described only anecdotally or studied at a single locale, with little information on the extent, rate, or direction of change. This lack of data hinders any effort to assess, plan, and prioritize land management. To improve assessments of statewide vegetation and land cover change, this project developed high-resolution, sub-pixel, percent cover maps of forest, grassland and bare earth at annual time steps from 1999 to 2016. Vegetation cover was quantified using archived LANDSAT imagery and a custom remote-sensing algorithm developed in the Google Earth Engine platform. A statistical trend analysis of the annual maps of these three proportional land covers was then used to detect land cover transitions across the archipelago. This work focused on quantifying the total area of change, annual rates of change, and final vegetation cover outcomes statewide. Additionally, these findings were attributed to past and current land uses and management history by compiling spatial datasets of development, agriculture, forest restoration sites and burned areas statewide. Results indicated that nearly 10% of the state's land surfaces are suspected to have transitioned between the three cover classes during the study period. Total statewide net change resulted in a gain in forest cover, with the largest areas of change occurring in unmanaged areas, current and past pastoral land, commercial forestry and abandoned cultivated land. The fastest annual rates of change were forest increases that occurred in restoration areas and commercial forestry. These findings indicate that Hawaii is going through a forest transition, primarily driven by agricultural abandonment with likely feedbacks from invasive species, but also influenced by the establishment of forestry production on former agricultural lands that show potential for native forest restoration. These

  9. PATIENT-CENTERED DECISION MAKING: LESSONS FROM MULTI-CRITERIA DECISION ANALYSIS FOR QUANTIFYING PATIENT PREFERENCES.

    PubMed

    Marsh, Kevin; Caro, J Jaime; Zaiser, Erica; Heywood, James; Hamed, Alaa

    2018-01-01

    Patient preferences should be a central consideration in healthcare decision making. However, stories of patients challenging regulatory and reimbursement decisions have led to questions about whether patient voices are being considered sufficiently during those decision making processes. This has led some to argue that it is necessary to quantify patient preferences before they can be adequately considered. This study considers the lessons from the use of multi-criteria decision analysis (MCDA) for efforts to quantify patient preferences. It defines MCDA and summarizes the benefits it can provide to decision makers, identifies examples of MCDAs that have involved patients, and summarizes good practice guidelines as they relate to quantifying patient preferences. The guidance developed to support the use of MCDA in healthcare provides some useful considerations for the quantification of patient preferences, namely that researchers should give appropriate consideration to: the heterogeneity of patient preferences and its relevance to decision makers; the cognitive challenges posed by different elicitation methods; and the validity of the results they produce. Furthermore, it is important to consider how the relevance of these considerations varies with the decision being supported. The MCDA literature holds important lessons for how patient preferences should be quantified to support healthcare decision making.

  10. Multiscale contact mechanics model for RF-MEMS switches with quantified uncertainties

    NASA Astrophysics Data System (ADS)

    Kim, Hojin; Huda Shaik, Nurul; Xu, Xin; Raman, Arvind; Strachan, Alejandro

    2013-12-01

    We introduce a multiscale model for contact mechanics between rough surfaces and apply it to characterize the force-displacement relationship for a metal-dielectric contact relevant for radio frequency micro-electromechanical system (MEMS) switches. We propose a mesoscale model to describe the history-dependent force-displacement relationships in terms of the surface roughness, the long-range attractive interaction between the two surfaces, and the repulsive interaction between contacting asperities (including elastic and plastic deformation). The inputs to this model are the experimentally determined surface topography and the Hamaker constant as well as the mechanical response of individual asperities obtained from density functional theory calculations and large-scale molecular dynamics simulations. The model captures non-trivial processes including the hysteresis during loading and unloading due to plastic deformation, yet it is computationally efficient enough to enable extensive uncertainty quantification and sensitivity analysis. We quantify how uncertainties and variability in the input parameters, both experimental and theoretical, affect the force-displacement curves during approach and retraction. In addition, a sensitivity analysis quantifies the relative importance of the various input quantities for the prediction of force-displacement during contact closing and opening. The resulting force-displacement curves with quantified uncertainties can be directly used in device-level simulations of micro-switches and enable the incorporation of atomic and mesoscale phenomena in predictive device-scale simulations.

  11. Quantifying relationships between bird and butterfly community shifts and environmental change.

    PubMed

    Debinski, Diane M; Vannimwegen, Ron E; Jakubauskas, Mark E

    2006-02-01

    Quantifying the manner in which ecological communities respond during a time of decreasing precipitation is a first step in understanding how they will respond to longer-term climate change. Here we coupled analysis of interannual variability in remotely sensed data with analyses of bird and butterfly community changes in montane meadow communities of the Greater Yellowstone Ecosystem. Landsat satellite imagery was used to classify these meadows into six types along a hydrological gradient. The northern portion of the ecosystem, or Gallatin region, has smaller mean patch sizes separated by ridges of mountains, whereas the southern portion of the ecosystem, or Teton region, has much larger patches within the Jackson Hole valley. Both support a similar suite of butterfly and bird species. The Gallatin region showed more overall among-year variation in the normalized difference vegetation index (NDVI) when meadow types were pooled within regions, perhaps because the patch sizes are smaller on average. Bird and butterfly communities showed significant relationships relative to meadow type and NDVI. We identified several key species that are tightly associated with specific meadow types along the hydrological gradient. Comparing taxonomic groups, fewer birds showed specific habitat affinities than butterflies, perhaps because birds are responding to differences in habitat structure among meadow types and using the landscape at a coarser scale than the butterflies. Comparing regions, the Teton region showed higher predictability of community assemblages as compared to the Gallatin region. The Gallatin region exhibited more significant temporal trends with respect to butterflies. Butterfly communities in wet meadows showed a distinctive shift along the hydrological gradient during a drought period (1997-2000). These results imply that the larger Teton meadows will show more predictable (i.e., static) species-habitat associations over the long term, but that the smaller

  12. Quantifying the costs and benefits of occupational health and safety interventions at a Bangladesh shipbuilding company

    PubMed Central

    Thiede, Irene; Thiede, Michael

    2015-01-01

    Background: This study is the first cost–benefit analysis (CBA) of occupational health and safety (OHS) in a low-income country. It focuses on one of the largest shipbuilding companies in Bangladesh, where globally recognised Occupational Health and Safety Advisory Services (OHSAS) 18001 certification was achieved in 2012. Objectives: The study examines the relative costs of implementing OHS measures against qualitative and quantifiable benefits of implementation in order to determine whether OHSAS measures are economically advantageous. Methods: Quantifying past costs and benefits and discounting future ones, this study looks at the returns of OHS measures at Western Marine Shipbuilding Company Ltd. Results: Costs included investments in workplace and environmental safety, a new clinic that also serves the community, and personal protective equipment (PPE) and training. The results are impressive: previously high injury statistics dropped to close to zero. Conclusions: OHS measures decrease injuries, increase efficiency, and bring income security to workers’ families. Certification has proven a competitive edge for the shipyard, resulting in access to greater markets. Intangible benefits such as trust, motivation and security are deemed crucial in the CBA, and this study finds the high investments made are difficult to offset with quantifiable benefits alone. PMID:25589369
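
    The core CBA arithmetic, discounting future benefits against up-front costs, looks like the sketch below. All figures are invented; the negative result illustrates the study's finding that high investments are hard to offset with quantifiable benefits alone:

    ```python
    def npv(cashflows, rate):
        """Net present value of yearly cashflows, year 0 first."""
        return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cashflows))

    # Hypothetical OHS programme: large up-front investment (PPE, clinic,
    # workplace safety), then yearly benefits from avoided injuries and
    # efficiency gains. Figures are illustrative, not from the shipyard study.
    costs    = [-120_000, -15_000, -15_000, -15_000, -15_000]
    benefits = [0, 40_000, 45_000, 50_000, 55_000]
    net = [c + b for c, b in zip(costs, benefits)]
    print(f"NPV at 10% discount rate: {npv(net, 0.10):,.0f}")
    ```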

  13. Loschmidt echo as a robust decoherence quantifier for many-body systems

    NASA Astrophysics Data System (ADS)

    Zangara, Pablo R.; Dente, Axel D.; Levstein, Patricia R.; Pastawski, Horacio M.

    2012-07-01

    We employ the Loschmidt echo, i.e., the signal recovered after the reversal of an evolution, to identify and quantify the processes contributing to decoherence. This procedure, which has been extensively used in single-particle physics, is employed here in a spin ladder. The isolated chains have 1/2 spins with XY interaction and their excitations would sustain a one-body-like propagation. One of them constitutes the controlled system S whose reversible dynamics is degraded by the weak coupling with the uncontrolled second chain, i.e., the environment E. The perturbative SE coupling is swept through arbitrary combinations of XY and Ising-like interactions, that contain the standard Heisenberg and dipolar ones. Different time regimes are identified for the Loschmidt echo dynamics in this perturbative configuration. In particular, the exponential decay scales as a Fermi golden rule, where the contributions of the different SE terms are individually evaluated and analyzed. Comparisons with previous analytical and numerical evaluations of decoherence based on the attenuation of specific interferences show that the Loschmidt echo is an advantageous decoherence quantifier at any time, regardless of the S internal dynamics.
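
    A numerical toy version of the echo, using a random-matrix Hamiltonian in place of the spin ladder: evolve forward under H0, reverse under a perturbed Hamiltonian, and measure the squared overlap with the initial state. The model and perturbation strength are illustrative assumptions:

    ```python
    import numpy as np
    from scipy.linalg import expm

    rng = np.random.default_rng(3)
    dim, eps = 64, 0.05

    def random_hermitian(n):
        a = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
        return (a + a.conj().T) / 2.0

    H0, V = random_hermitian(dim), random_hermitian(dim)   # system, SE coupling
    psi0 = rng.normal(size=dim) + 1j * rng.normal(size=dim)
    psi0 /= np.linalg.norm(psi0)

    for t in (0.5, 1.0, 2.0, 4.0):
        fwd = expm(-1j * H0 * t) @ psi0
        back = expm(1j * (H0 + eps * V) * t) @ fwd   # imperfect time reversal
        m_t = abs(np.vdot(psi0, back)) ** 2          # Loschmidt echo M(t)
        print(f"t = {t}: M = {m_t:.3f}")
    ```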

  14. UV-vis spectra as an alternative to the Lowry method for quantifying hair damage induced by surfactants.

    PubMed

    Pires-Oliveira, Rafael; Joekes, Inés

    2014-11-01

    It is well known that long-term use of shampoo causes damage to human hair. Although the Lowry method has been widely used to quantify hair damage, it is unsuitable in the presence of some surfactants, and no other method has been proposed in the literature. In this work, a different method is used to investigate and compare the hair damage induced by four types of surfactants (including three commercial-grade surfactants) and water. Hair samples were immersed in aqueous solutions of surfactants under conditions that resemble a shower (38 °C, constant shaking). These solutions become colored with time of contact with hair, and their UV-vis spectra were recorded. For comparison, the amounts of protein extracted from hair by sodium dodecyl sulfate (SDS) and by water were estimated by the Lowry method. Additionally, non-pigmented vs. pigmented hair and also sepia melanin were used to understand the washing solution color and its spectra. The results presented herein show that hair degradation is mostly caused by the extraction of proteins, cuticle fragments and melanin granules from the hair fiber. It was found that the intensity of the solution color varies with the charge density of the surfactants. Furthermore, the intensity of the solution color can be correlated to the amount of protein quantified by the Lowry method as well as to the degree of hair damage. The UV-vis spectrum of hair washing solutions is a simple and straightforward method to quantify and compare hair damage induced by different commercial surfactants. Copyright © 2014 Elsevier B.V. All rights reserved.

  15. The Emergence of the Quantified Child

    ERIC Educational Resources Information Center

    Smith, Rebecca

    2017-01-01

    Using document analysis, this paper examines the historical emergence of the quantified child, revealing how the collection and use of data has become normalized through legitimizing discourses. First, following in the traditions of Foucault's genealogy and studies examining the sociology of numbers, this paper traces the evolution of data…

  16. Quantifying Unnecessary Normal Tissue Complication Risks due to Suboptimal Planning: A Secondary Study of RTOG 0126

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moore, Kevin L., E-mail: kevinmoore@ucsd.edu; Schmidt, Rachel; Moiseenko, Vitali

    Purpose: The purpose of this study was to quantify the frequency and clinical severity of quality deficiencies in intensity modulated radiation therapy (IMRT) planning in the Radiation Therapy Oncology Group 0126 protocol. Methods and Materials: A total of 219 IMRT patients from the high-dose arm (79.2 Gy) of RTOG 0126 were analyzed. To quantify plan quality, we used established knowledge-based methods for patient-specific dose-volume histogram (DVH) prediction of organs at risk and a Lyman-Kutcher-Burman (LKB) model for grade ≥2 rectal complications to convert DVHs into normal tissue complication probabilities (NTCPs). The LKB model was validated by fitting dose-response parameters relative to observed toxicities. The 90th percentile (22 of 219) of plans with the lowest excess risk (difference between clinical and model-predicted NTCP) were used to create a model for the presumed best practices in the protocol (pDVH_{0126,top10%}). Applying the resultant model to the entire sample enabled comparisons between DVHs that patients could have received to DVHs they actually received. Excess risk quantified the clinical impact of suboptimal planning. Accuracy of pDVH predictions was validated by replanning 30 of 219 patients (13.7%), including equal numbers of presumed “high-quality,” “low-quality,” and randomly sampled plans. NTCP-predicted toxicities were compared to adverse events on protocol. Results: Existing models showed that bladder-sparing variations were less prevalent than rectum quality variations and that increased rectal sparing was not correlated with target metrics (dose received by 98% and 2% of the PTV, respectively). Observed toxicities were consistent with current LKB parameters. Converting DVH and pDVH_{0126,top10%} to rectal NTCPs, we observed 94 of 219 patients (42.9%) with ≥5% excess risk, 20 of 219 patients (9.1%) with ≥10% excess risk, and 2 of 219 patients (0.9%) with ≥15% excess risk. Replanning demonstrated the
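
    For reference, converting a DVH into an NTCP with the LKB model amounts to a generalized-EUD calculation followed by a probit transform. The sketch below uses commonly cited rectal parameter values, not the refitted RTOG 0126 parameters, and an invented DVH:

    ```python
    import numpy as np
    from math import erf, sqrt

    def gEUD(doses, volumes, n):
        """Generalized equivalent uniform dose from a differential DVH."""
        v = np.asarray(volumes) / np.sum(volumes)
        return np.sum(v * np.asarray(doses) ** (1.0 / n)) ** n

    def lkb_ntcp(doses, volumes, n, m, td50):
        """Lyman-Kutcher-Burman NTCP = Phi((gEUD - TD50) / (m * TD50))."""
        t = (gEUD(doses, volumes, n) - td50) / (m * td50)
        return 0.5 * (1.0 + erf(t / sqrt(2.0)))

    # Hypothetical rectal differential DVH (dose-bin centers in Gy, rel. volume)
    doses   = [10, 30, 50, 65, 75]
    volumes = [0.30, 0.25, 0.20, 0.15, 0.10]
    # Parameter values approximate published rectal fits; illustrative only.
    print(f"NTCP = {lkb_ntcp(doses, volumes, n=0.09, m=0.13, td50=76.9):.3f}")
    ```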

  17. Quantifying the risk of extreme aviation accidents

    NASA Astrophysics Data System (ADS)

    Das, Kumer Pial; Dey, Asim Kumer

    2016-12-01

    Air travel is considered a safe means of transportation. But when aviation accidents do occur, they often result in fatalities. Fortunately, the most extreme accidents occur rarely. However, 2014 was the deadliest year in the past decade, with 111 plane crashes; the worst four caused 298, 239, 162 and 116 deaths. In this study, we assess the risk of catastrophic aviation accidents by studying historical aviation accidents. Applying a generalized Pareto model, we predict the maximum fatalities from a future aviation accident. The fitted model is compared with some of its competitor models. The uncertainty in the inferences is quantified using simulated aviation accident series generated by bootstrap resampling and Monte Carlo simulations.
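
    The generalized Pareto step is straightforward to reproduce: fit the distribution to fatality counts above a chosen threshold and read off return levels. A minimal sketch with scipy; the fatality counts and threshold below are illustrative, not the study's dataset:

```python
import numpy as np
from scipy.stats import genpareto

# Hypothetical worst-accident fatality counts (illustrative numbers only).
fatalities = np.array([298, 239, 162, 116, 104, 98, 95, 91, 88, 85])
u = 80  # threshold above which the GPD is assumed to hold

excesses = fatalities[fatalities > u] - u
# Fit shape (xi) and scale (sigma) of the generalized Pareto distribution;
# location is fixed at zero because we model threshold excesses.
xi, loc, sigma = genpareto.fit(excesses, floc=0)

# 100-event return level: the fatality count exceeded, on average, once
# per 100 accidents that pass the threshold.
p = 1.0 / 100
return_level = u + genpareto.ppf(1 - p, xi, loc=0, scale=sigma)
print(f"xi={xi:.2f}, sigma={sigma:.1f}, 100-event return level={return_level:.0f}")
```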

  18. A simple method for quantifying jump loads in volleyball athletes.

    PubMed

    Charlton, Paula C; Kenneally-Dabrowski, Claire; Sheppard, Jeremy; Spratford, Wayne

    2017-03-01

    Evaluate the validity of a commercially available wearable device, the Vert, for measuring vertical displacement and jump count in volleyball athletes, and propose a potential method of quantifying external load during training and match play within this population. Validation study. The ability of the Vert device to measure vertical displacement in male, junior elite volleyball athletes was assessed against reference-standard laboratory motion analysis. The ability of the Vert device to count jumps during training and match play was assessed via comparison with retrospective video analysis to determine precision and recall. A method of quantifying external load, the load index (LdIx) algorithm, was proposed using the product of the jump count and average kinetic energy. Correlations between two separate Vert devices and three-dimensional trajectory data were good to excellent for all jump types performed (r=0.83-0.97), with a mean bias of 3.57-4.28 cm. When matched against jumps identified through video analysis, the Vert demonstrated excellent precision (0.995-1.000), evidenced by a low number of false positives. The number of false negatives identified with the Vert was higher, resulting in lower recall values (0.814-0.930). The Vert is a commercially available tool with potential for measuring vertical displacement and jump count in elite junior volleyball athletes without the need for time-consuming analysis and bespoke software, allowing the collected data to better quantify load using the proposed algorithm (LdIx). Copyright © 2016 Sports Medicine Australia. Published by Elsevier Ltd. All rights reserved.
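
    The abstract defines the load index as the product of jump count and average kinetic energy. A minimal sketch of one plausible reading, assuming takeoff kinetic energy is reconstructed from jump height via v = sqrt(2gh); the exact formulation in the paper may differ, and all numbers are hypothetical:

```python
import numpy as np

G = 9.81  # gravitational acceleration, m/s^2

def load_index(jump_heights_m, athlete_mass_kg):
    """Illustrative load index: jump count x mean kinetic energy.

    Takeoff kinetic energy is reconstructed from jump height via
    v = sqrt(2*g*h), so per-jump energy equals m*g*h. This reading of
    the LdIx definition is an assumption, not the authors' exact code.
    """
    heights = np.asarray(jump_heights_m, dtype=float)
    kinetic_energy = athlete_mass_kg * G * heights  # 0.5*m*v^2 with v^2 = 2*g*h
    return heights.size * kinetic_energy.mean()

# Hypothetical session: 40 jumps by an 85 kg athlete.
rng = np.random.default_rng(0)
session_heights = rng.uniform(0.25, 0.65, size=40)  # metres
print(f"LdIx = {load_index(session_heights, 85.0):.0f} (arbitrary units)")
```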

  19. Usefulness of three-dimensional templating software to quantify the contact state between implant and femur in total hip arthroplasty.

    PubMed

    Inoue, Daisuke; Kabata, Tamon; Maeda, Toru; Kajino, Yoshitomo; Fujita, Kenji; Hasegawa, Kazuhiro; Yamamoto, Takashi; Takagi, Tomoharu; Ohmori, Takaaki; Tsuchiya, Hiroyuki

    2015-12-01

    It would be ideal if surgeons could precisely confirm whether the planned femoral component achieves the best fit and fill of implant and femur. However, the cortico-cancellous interfaces can be difficult to standardize using plain radiography, and therefore, determining the contact state is a subjective decision by the examiner. Few reports have described the use of CT-based three-dimensional templating software to quantify the contact state of stem and femur in detail. The purpose of this study was to use three-dimensional templating software to quantify the implant-femur contact state and develop a technique to analyze the initial fixation pattern of a cementless femoral stem. We conducted a retrospective review of 55 hips in 53 patients using a short proximal fit-and-fill anatomical stem (APS Natural-Hip™ System). All femurs were examined by density mapping, which visualizes and digitizes the implant-femur contact state. The varus group (cases in which alignment had changed into varus by 2° at 3 months after surgery) consisted of 11 hips. The varus group showed no significant difference with regard to cortical contact in the proximal medial portion (Gruen zone 7), but its contact area in the distal portion (Gruen zones 3 and 5) was significantly lower than that of the non-varus group. Density mapping showed that the stem not only has to be press-fit at the medial calcar but must also fill the distal portion in order to achieve the ideal contact state. Our results indicate that quantifying the implant-femur contact state by density mapping is a useful technique to accurately analyze the fixation pattern of a cementless femoral stem.

  20. Quantifying complexity of financial short-term time series by composite multiscale entropy measure

    NASA Astrophysics Data System (ADS)

    Niu, Hongli; Wang, Jun

    2015-05-01

    It is significant to study the complexity of financial time series since the financial market is a complex, evolving dynamic system. Multiscale entropy (MSE) is a prevailing method used to quantify the complexity of a time series, but its entropy estimates are less reliable for short-term series at large time scales. A modification, the composite multiscale entropy (CMSE), is therefore applied to the financial market. To verify its effectiveness, its applications to synthetic white noise and 1/f noise with different data lengths are reproduced first in the present paper. It is then introduced for the first time in a reliability test with two Chinese stock indices. Applied to short-term return series, the CMSE method reduces the deviations of entropy estimation and gives more stable and reliable results than the conventional MSE algorithm. Finally, the composite multiscale entropy of six important stock indices from the world financial markets is investigated, and some useful and interesting empirical results are obtained.
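
    Composite multiscale entropy averages the sample entropy over all shifted coarse-grainings at each scale, rather than using the single coarse-grained series of conventional MSE. A minimal sketch under that definition, with white noise standing in for a short return series:

```python
import numpy as np

def sample_entropy(x, m=2, r=0.15):
    """SampEn(m, r) of a 1-D series; tolerance r is relative to the
    standard deviation of the series passed in."""
    x = np.asarray(x, dtype=float)
    tol = r * np.std(x)

    def match_count(length):
        t = np.array([x[i:i + length] for i in range(len(x) - length)])
        d = np.max(np.abs(t[:, None, :] - t[None, :, :]), axis=2)
        return np.sum(d <= tol) - len(t)  # ordered pairs, self-matches removed

    b, a = match_count(m), match_count(m + 1)
    return -np.log(a / b) if a > 0 and b > 0 else np.inf

def cmse(x, scale, m=2, r=0.15):
    """Composite multiscale entropy: average SampEn over all `scale`
    shifted coarse-grainings, which stabilizes estimates for short series."""
    x = np.asarray(x, dtype=float)
    entropies = []
    for k in range(scale):
        n = (len(x) - k) // scale
        coarse = x[k:k + n * scale].reshape(n, scale).mean(axis=1)
        entropies.append(sample_entropy(coarse, m, r))
    return float(np.mean(entropies))

rng = np.random.default_rng(1)
returns = rng.standard_normal(500)  # white noise stand-in for a short return series
print([round(cmse(returns, s), 3) for s in (1, 2, 3)])
```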

  1. Quantifying spatial and temporal patterns of flow intermittency using spatially contiguous runoff data

    NASA Astrophysics Data System (ADS)

    Yu (于松延), Songyan; Bond, Nick R.; Bunn, Stuart E.; Xu, Zongxue; Kennard, Mark J.

    2018-04-01

    River channel drying caused by intermittent stream flow is a widely recognized factor shaping stream ecosystems. There is a strong need to quantify the distribution of intermittent streams across catchments to inform management. However, observational gauge networks provide only point estimates of streamflow variation. Increasingly, this limitation is being overcome through the use of spatially contiguous estimates of the terrestrial water balance, which can also assist in estimating runoff and streamflow at large spatial scales. Here we propose an approach to quantifying spatial and temporal variation in monthly flow intermittency throughout river networks in eastern Australia. We aggregated gridded (5 × 5 km) monthly water-balance data with a hierarchically nested catchment dataset to simulate catchment runoff accumulation throughout river networks from 1900 to 2016. We also predicted zero-flow duration for the entire river network by developing a robust predictive model relating measured zero-flow duration (% months) to environmental predictor variables (based on 43 stream gauges). We then combined these datasets by using the predicted zero-flow duration from the regression model to determine appropriate 'zero' flow thresholds for the modelled discharge data, which varied spatially across the catchments examined. Finally, based on the modelled discharge data and the identified zero-flow thresholds, we derived summary metrics describing flow intermittency across the catchment (mean flow duration and coefficient of variation in flow permanence from 1900 to 2016). We also classified the relative degree of flow intermittency annually to characterise temporal variation in flow intermittency. Results showed that the degree of flow intermittency varied substantially across streams in eastern Australia, ranging from perennial streams flowing permanently (11-12 months) to strongly intermittent streams flowing 4 months or less per year. Results also showed that the

  2. Hyperspectral remote sensing tools for quantifying plant litter and invasive species in arid ecosystems

    USGS Publications Warehouse

    Nagler, Pamela L.; Sridhar, B.B. Maruthi; Olsson, Aaryn Dyami; Glenn, Edward P.; van Leeuwen, Willem J.D.; Thenkabail, Prasad S.; Huete, Alfredo; Lyon, John G.

    2012-01-01

    Green vegetation can be distinguished using visible and infrared multi-band and hyperspectral remote sensing methods. The problem has been in identifying and distinguishing the non-photosynthetically active radiation (PAR) landscape components, such as litter and soils, from green vegetation. Additionally, distinguishing different species of green vegetation is challenging using the relatively few bands available on most satellite sensors. This chapter focuses on hyperspectral remote sensing characteristics that aim to distinguish between green vegetation, soil, and litter (or senescent vegetation). Quantifying litter by remote sensing methods is important in constructing carbon budgets of natural and agricultural ecosystems. Distinguishing between plant types is important in tracking the spread of invasive species. Green leaves of different species usually have similar spectra, making it difficult to distinguish between species. However, in this chapter we show that phenological differences between species can be used to detect some invasive species by their distinct patterns of greening and dormancy over an annual cycle based on hyperspectral data. Both applications require methods to quantify the non-green cellulosic fractions of plant tissues by remote sensing even in the presence of soil and green plant cover. We explore these methods and offer three case studies. The first concerns distinguishing surface litter from soil using the Cellulose Absorption Index (CAI), as applied to no-till farming practices where plant litter is left on the soil after harvest. The second involves using different band combinations to distinguish invasive saltcedar from agricultural and native riparian plants on the Lower Colorado River. The third illustrates the use of the CAI and NDVI in time-series analyses to distinguish between invasive buffelgrass and native plants in a desert environment in Arizona. Together the results show how hyperspectral imagery can be applied to
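
    Both indices named in the chapter reduce to simple band arithmetic: NDVI contrasts red and near-infrared reflectance, while the CAI measures the cellulose absorption feature near 2.1 µm as CAI = 0.5(R2.0 + R2.2) − R2.1. A minimal sketch with hypothetical per-pixel reflectances:

```python
import numpy as np

def ndvi(red, nir):
    """Normalized Difference Vegetation Index from red/NIR reflectance."""
    return (nir - red) / (nir + red)

def cai(r2000, r2100, r2200):
    """Cellulose Absorption Index from reflectance at 2.0, 2.1 and 2.2 um:
    CAI = 0.5 * (R2.0 + R2.2) - R2.1. Positive values indicate the
    cellulose/lignin absorption feature of dry plant litter; bare soil
    tends toward zero or negative values."""
    return 0.5 * (r2000 + r2200) - r2100

# Hypothetical per-pixel reflectances (fractions) for three surface types.
surfaces = {
    "green vegetation": dict(red=0.05, nir=0.45, r2000=0.18, r2100=0.16, r2200=0.15),
    "plant litter":     dict(red=0.25, nir=0.30, r2000=0.35, r2100=0.28, r2200=0.33),
    "bare soil":        dict(red=0.20, nir=0.25, r2000=0.30, r2100=0.31, r2200=0.29),
}
for name, b in surfaces.items():
    print(f"{name:>16}: NDVI={ndvi(b['red'], b['nir']):+.2f}, "
          f"CAI={cai(b['r2000'], b['r2100'], b['r2200']):+.3f}")
```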

  3. Quantifying ecological thresholds from response surfaces

    Treesearch

    Heather E. Lintz; Bruce McCune; Andrew N. Gray; Katherine A. McCulloh

    2011-01-01

    Ecological thresholds are abrupt changes of ecological state. While an ecological threshold is a widely accepted concept, most empirical methods detect them in time or across geographic space. Although useful, these approaches do not quantify the direct drivers of threshold response. Causal understanding of thresholds detected empirically requires their investigation...

  4. Using NASA's Airborne Topographic Mapper IV to Quantify Geomorphic Change in Arid Southwestern Stream Systems

    NASA Astrophysics Data System (ADS)

    Finnegan, D. C.; Krabill, W.; Lichvar, R. W.; Ericsson, M. P.; Frederick, E.; Manizade, S.; Yungel, J.; Sonntag, J.; Swift, R.

    2005-12-01

    Understanding how arid stream systems respond to individual climatic events is often difficult given the dynamic and 'flashy' nature of most watersheds and the unpredictable nature of individual storm events. Until recently, conventional methods for quantifying change dictated the use of stream gauge measurements coupled with periodic cross-section measurements to quantify changes in large-scale channel geometry. Using this approach to quantify change across large areas often proves impractical given the laborious nature of most surveying techniques, including modern GPS systems. Alternately, airborne laser technologies such as NASA's Airborne Topographic Mapper (ATM) are capable of quantifying small-scale changes (~5-10 cm) across large-scale terrain rapidly and accurately. The ATM was developed at the NASA-GSFC Wallops Flight Facility. Its current version, ATM-4, measures topography 5,000 times per second across a 45-degree swath below the aircraft by transmitting a 532 nm (green) laser pulse and receiving the backscattered signal in a high-speed waveform digitizer. The laser range measurements are combined with aircraft location from GPS and attitude from an inertial navigation system (INS) to provide a precise XYZ coordinate for each (~1-meter diameter) laser footprint on the ground. Our work focuses on the use of airborne laser altimetry to quantify the nature of individual surfaces and the geomorphic change that occurs within small arid stream systems during significant storm events. In September of 2003 and 2005, acquisition surveys using NASA's ATM-4 were flown over Mission Creek, a small arid stream system in Southern California's Mojave Desert with a relatively long gauging history (>40 yrs), allowing us to quantify the geomorphic change occurring within the channel as a result of the record storm events during the winter of 2004-2005. Preliminary results associated with our work are encouraging and lead us to believe that when compared

  5. Quantifying solute transport processes: are chemically "conservative" tracers electrically conservative?

    USGS Publications Warehouse

    Singha, Kamini; Li, Li; Day-Lewis, Frederick D.; Regberg, Aaron B.

    2012-01-01

    The concept of a nonreactive or conservative tracer, commonly invoked in investigations of solute transport, requires additional study in the context of electrical geophysical monitoring. Tracers that are commonly considered conservative may undergo reactive processes, such as ion exchange, thus changing the aqueous composition of the system. As a result, the measured electrical conductivity may reflect not only solute transport but also reactive processes. We have evaluated the impacts of ion exchange reactions, rate-limited mass transfer, and surface conduction on quantifying tracer mass, mean arrival time, and temporal variance in laboratory-scale column experiments. Numerical examples showed that (1) ion exchange can lead to resistivity-estimated tracer mass, velocity, and dispersivity that may be inaccurate; (2) mass transfer leads to an overestimate in the mobile tracer mass and an underestimate in velocity when using electrical methods; and (3) surface conductance does not notably affect estimated moments when high-concentration tracers are used, although this phenomenon may be important at low concentrations or in sediments with high and/or spatially variable cation-exchange capacity. In all cases, colocated groundwater concentration measurements are of high importance for interpreting geophysical data with respect to the controlling transport processes of interest.
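
    The moments evaluated in the column experiments (tracer mass, mean arrival time, temporal variance) are the standard temporal moments of a breakthrough curve. A minimal sketch, using a synthetic tailing pulse in place of measured concentration data:

```python
import numpy as np
from scipy.integrate import trapezoid

def temporal_moments(t, c):
    """Zeroth moment (mass proxy), mean arrival time and temporal
    variance of a breakthrough curve c(t)."""
    t = np.asarray(t, dtype=float)
    c = np.asarray(c, dtype=float)
    m0 = trapezoid(c, t)
    mean = trapezoid(t * c, t) / m0
    var = trapezoid((t - mean) ** 2 * c, t) / m0
    return m0, mean, var

# Synthetic breakthrough curve with late-time tailing, the signature
# that rate-limited mass transfer imprints on a tracer test.
t = np.linspace(0.0, 80.0, 801)   # hours
c = t ** 2 * np.exp(-t / 4.0)     # arbitrary gamma-like pulse, mg/L
m0, mean, var = temporal_moments(t, c)
print(f"mass~{m0:.1f}, mean arrival={mean:.1f} h, variance={var:.1f} h^2")
```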

  6. Statistical physics approach to quantifying differences in myelinated nerve fibers

    NASA Astrophysics Data System (ADS)

    Comin, César H.; Santos, João R.; Corradini, Dario; Morrison, Will; Curme, Chester; Rosene, Douglas L.; Gabrielli, Andrea; da F. Costa, Luciano; Stanley, H. Eugene

    2014-03-01

    We present a new method to quantify differences in myelinated nerve fibers. These differences range from morphologic characteristics of individual fibers to differences in macroscopic properties of collections of fibers. Our method uses statistical physics tools to improve on traditional measures, such as fiber size and packing density. As a case study, we analyze cross-sectional electron micrographs from the fornix of young and old rhesus monkeys using a semi-automatic detection algorithm to identify and characterize myelinated axons. We then apply a feature selection approach to identify the features that best distinguish between the young and old age groups, achieving a maximum accuracy of 94% when assigning samples to their age groups. This analysis shows that the best discrimination is obtained using the combination of two features: the fraction of occupied axon area and the effective local density. The latter is a modified calculation of axon density, which reflects how closely axons are packed. Our feature analysis approach can be applied to characterize differences that result from biological processes such as aging, damage from trauma or disease, or developmental differences, as well as differences between anatomical regions such as the fornix and the cingulum bundle or corpus callosum.

  7. Quantifying on- and off-target genome editing.

    PubMed

    Hendel, Ayal; Fine, Eli J; Bao, Gang; Porteus, Matthew H

    2015-02-01

    Genome editing with engineered nucleases is a rapidly growing field thanks to transformative technologies that allow researchers to precisely alter genomes for numerous applications including basic research, biotechnology, and human gene therapy. While the ability to make precise and controlled changes at specified sites throughout the genome has grown tremendously in recent years, we still lack a comprehensive and standardized battery of assays for measuring the different genome editing outcomes created at endogenous genomic loci. Here we review the existing assays for quantifying on- and off-target genome editing and describe their utility in advancing the technology. We also highlight unmet assay needs for quantifying on- and off-target genome editing outcomes and discuss their importance for the genome editing field. Copyright © 2014 Elsevier Ltd. All rights reserved.

  8. Quantifying the Relationship between Curvature and Electric Potential in Lipid Bilayers.

    PubMed

    Bruhn, Dennis S; Lomholt, Michael A; Khandelia, Himanshu

    2016-06-02

    Cellular membranes mediate vital cellular processes by being subject to curvature and transmembrane electrical potentials. Here we build upon the existing theory for flexoelectricity in liquid crystals to quantify the coupling between lipid bilayer curvature and membrane potentials. Using molecular dynamics simulations, we show that headgroup dipole moments, the lateral pressure profile across the bilayer, and spontaneous curvature all systematically change with increasing membrane potentials. In particular, there is a linear dependence between the bending moment (the product of bending rigidity and spontaneous curvature) and the applied membrane potentials. We show that biologically relevant membrane potentials can induce biologically relevant curvatures corresponding to radii of around 500 nm. The implications of flexoelectricity in lipid bilayers are thus likely to be of considerable consequence both in biology and in model lipid bilayer systems.

  9. Next-Generation Genotyping by Digital PCR to Detect and Quantify the BRAF V600E Mutation in Melanoma Biopsies.

    PubMed

    Lamy, Pierre-Jean; Castan, Florence; Lozano, Nicolas; Montélion, Cécile; Audran, Patricia; Bibeau, Frédéric; Roques, Sylvie; Montels, Frédéric; Laberenne, Anne-Claire

    2015-07-01

    The detection of the BRAF V600E mutation in melanoma samples is used to select patients who should respond to BRAF inhibitors. Different techniques are routinely used to determine BRAF status in clinical samples. However, low tumor cellularity and tumor heterogeneity can affect the sensitivity of somatic mutation detection. Digital PCR (dPCR) is a next-generation genotyping method that clonally amplifies nucleic acids and allows the detection and quantification of rare mutations. Our aim was to evaluate the clinical routine performance of a new dPCR-based test to detect and quantify BRAF mutation load in 47 paraffin-embedded cutaneous melanoma biopsies. We compared the results obtained by dPCR with high-resolution melting curve analysis and pyrosequencing or with one of the allele-specific PCR methods available on the market. dPCR showed the lowest limit of detection. dPCR and allele-specific amplification detected the highest number of mutated samples. For the BRAF mutation load quantification both dPCR and pyrosequencing gave similar results with strong disparities in allele frequencies in the 47 tumor samples under study (from 0.7% to 79% of BRAF V600E mutations/sample). In conclusion, the four methods showed a high degree of concordance. dPCR was the more-sensitive method to reliably and easily detect mutations. Both pyrosequencing and dPCR could quantify the mutation load in heterogeneous tumor samples. Copyright © 2015 American Society for Investigative Pathology and the Association for Molecular Pathology. Published by Elsevier Inc. All rights reserved.
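
    The quantification step in dPCR rests on Poisson statistics: the mean number of target copies per partition is recovered from the fraction of positive partitions, and the mutation load is the mutant fraction of total copies. A minimal sketch with hypothetical droplet counts (the channel assignments are illustrative):

```python
import numpy as np

def copies_per_droplet(n_positive, n_total):
    """Poisson-corrected mean copies per droplet from the fraction of
    positive droplets: lambda = -ln(1 - p)."""
    p = n_positive / n_total
    return -np.log(1.0 - p)

# Hypothetical droplet counts from one melanoma biopsy well.
lam_mut = copies_per_droplet(n_positive=450, n_total=15000)  # mutant channel
lam_wt = copies_per_droplet(n_positive=9800, n_total=15000)  # wild-type channel

# Fractional abundance of BRAF V600E among all BRAF copies.
freq = lam_mut / (lam_mut + lam_wt)
print(f"mutant load = {100 * freq:.1f}% of BRAF alleles")
```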

  10. Quantifying the Benefits of NEC

    DTIC Science & Technology

    2005-04-01

    Quantifying the Benefits of NEC. Georgia Court and Lynda CM Sharp, Dstl/UK MoD, A2 Building, Dstl Farnborough, Ively...transformation of UK forces is dependent on exploiting the benefits of Network Enabled Capability (NEC). The white paper notes that NEC, through... benefit to defence; • what can be traded off to pay for it; • what changes are required to processes, structures, equipment etc., to deliver the

  11. Quantifying the causes of differences in tropospheric OH within global models

    NASA Astrophysics Data System (ADS)

    Nicely, Julie M.; Salawitch, Ross J.; Canty, Timothy; Anderson, Daniel C.; Arnold, Steve R.; Chipperfield, Martyn P.; Emmons, Louisa K.; Flemming, Johannes; Huijnen, Vincent; Kinnison, Douglas E.; Lamarque, Jean-François; Mao, Jingqiu; Monks, Sarah A.; Steenrod, Stephen D.; Tilmes, Simone; Turquety, Solene

    2017-02-01

    The hydroxyl radical (OH) is the primary daytime oxidant in the troposphere and provides the main loss mechanism for many pollutants and greenhouse gases, including methane (CH4). Global mean tropospheric OH differs by as much as 80% among various global models, for reasons that are not well understood. We use neural networks (NNs), trained using archived output from eight chemical transport models (CTMs) that participated in the Polar Study using Aircraft, Remote Sensing, Surface Measurements and Models, of Climate, Chemistry, Aerosols and Transport Model Intercomparison Project (POLMIP), to quantify the factors responsible for differences in tropospheric OH and resulting CH4 lifetime (τCH4) between these models. Annual average τCH4, for loss by OH only, ranges from 8.0 to 11.6 years for the eight POLMIP CTMs. The factors driving these differences were quantified by inputting 3-D chemical fields from one CTM into the trained NN of another CTM. Across all CTMs, the largest mean differences in τCH4 (ΔτCH4) result from variations in chemical mechanisms (ΔτCH4 = 0.46 years), the photolysis frequency (J) of O3 → O(1D) (0.31 years), local O3 (0.30 years), and CO (0.23 years). The ΔτCH4 due to CTM differences in NOx (NO + NO2) is relatively low (0.17 years), although large regional variation in OH between the CTMs is attributed to NOx. Differences in isoprene and J(NO2) have negligible overall effect on globally averaged tropospheric OH, although the extent of OH variations due to each factor depends on the model being examined. This study demonstrates that NNs can serve as a useful tool for quantifying why tropospheric OH varies between global models, provided that essential chemical fields are archived.
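
    The cross-model attribution described here (training an NN on one CTM's output, then feeding it another CTM's chemical fields) can be sketched with an off-the-shelf regressor. Everything below is a toy stand-in: the input variables, the synthetic "OH" relationship, and the network size are all placeholders, not the POLMIP configuration:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Toy stand-in for one CTM's archived output: per-grid-cell predictors
# (think O3, CO, NOx, H2O, J(O1D), temperature) and that model's own OH.
rng = np.random.default_rng(42)
X_ctm_a = rng.lognormal(mean=0.0, sigma=0.5, size=(5000, 6))
oh_ctm_a = (X_ctm_a[:, 0] * X_ctm_a[:, 4] / X_ctm_a[:, 1]
            + 0.05 * rng.standard_normal(5000))  # synthetic "chemistry"

# Train an NN emulator of CTM A's OH response to its chemical fields.
nn_a = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000, random_state=0),
)
nn_a.fit(X_ctm_a, oh_ctm_a)

# Attribution step: feed CTM B's fields through CTM A's emulator. The gap
# between nn_a(X_b) and CTM B's own OH separates input-driven differences
# from mechanism-driven ones.
X_ctm_b = X_ctm_a * rng.uniform(0.8, 1.2, size=X_ctm_a.shape)
print(f"mean emulated OH on A fields: {nn_a.predict(X_ctm_a).mean():.3f}")
print(f"mean emulated OH on B fields: {nn_a.predict(X_ctm_b).mean():.3f}")
```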

  12. Quantifying the Causes of Differences in Tropospheric OH Within Global Models

    NASA Technical Reports Server (NTRS)

    Nicely, Julie M.; Salawitch, Ross J.; Canty, Timothy; Anderson, Daniel C.; Arnold, Steve R.; Chipperfield, Martyn P.; Emmons, Louisa K.; Flemming, Johannes; Huijnen, Vincent; Kinnison, Douglas E.

    2017-01-01

    The hydroxyl radical (OH) is the primary daytime oxidant in the troposphere and provides the main loss mechanism for many pollutants and greenhouse gases, including methane (CH4). Global mean tropospheric OH differs by as much as 80% among various global models, for reasons that are not well understood. We use neural networks (NNs), trained using archived output from eight chemical transport models (CTMs) that participated in the Polar Study using Aircraft, Remote Sensing, Surface Measurements and Models, of Climate, Chemistry, Aerosols and Transport Model Intercomparison Project (POLMIP), to quantify the factors responsible for differences in tropospheric OH and resulting CH4 lifetime (τCH4) between these models. Annual average τCH4, for loss by OH only, ranges from 8.0 to 11.6 years for the eight POLMIP CTMs. The factors driving these differences were quantified by inputting 3-D chemical fields from one CTM into the trained NN of another CTM. Across all CTMs, the largest mean differences in τCH4 (ΔτCH4) result from variations in chemical mechanisms (ΔτCH4 = 0.46 years), the photolysis frequency (J) of O3 → O(1D) (0.31 years), local O3 (0.30 years), and CO (0.23 years). The ΔτCH4 due to CTM differences in NOx (NO + NO2) is relatively low (0.17 years), although large regional variation in OH between the CTMs is attributed to NOx. Differences in isoprene and J(NO2) have negligible overall effect on globally averaged tropospheric OH, although the extent of OH variations due to each factor depends on the model being examined. This study demonstrates that NNs can serve as a useful tool for quantifying why tropospheric OH varies between global models, provided that essential chemical fields are archived.

  13. A fluorescent imaging technique for quantifying spray deposits on plant leaves

    USDA-ARS's Scientific Manuscript database

    Because of the unique characteristics of electrostatically-charged sprays, use of traditional methods to quantify deposition from these sprays has been challenging. A new fluorescent imaging technique was developed to quantify spray deposits from electrostatically-charged sprays on natural plant lea...

  14. Quantifying the Reuse of Learning Objects

    ERIC Educational Resources Information Center

    Elliott, Kristine; Sweeney, Kevin

    2008-01-01

    This paper reports the findings of one case study from a larger project, which aims to quantify the claimed efficiencies of reusing learning objects to develop e-learning resources. The case study describes how an online inquiry project "Diabetes: A waste of energy" was developed by searching for, evaluating, modifying and then…

  15. Quantifying Water Stress Using Total Water Volumes and GRACE

    NASA Astrophysics Data System (ADS)

    Richey, A. S.; Famiglietti, J. S.; Druffel-Rodriguez, R.

    2011-12-01

    Water will follow oil as the next critical resource leading to unrest and uprisings globally. To better manage this threat, an improved understanding of the distribution of water stress is required today. This study builds upon previous efforts to characterize water stress by improving both the quantification of human water use and the definition of water availability. Current statistics on human water use are often outdated or inaccurately reported nationally, especially for groundwater. This study improves these estimates by defining human water use in two ways. First, we use NASA's Gravity Recovery and Climate Experiment (GRACE) to isolate the anthropogenic signal in water storage anomalies, which we equate to water use. Second, we quantify an ideal water demand by using average water requirements for the domestic, industrial, and agricultural water use sectors. Water availability has traditionally been limited to "renewable" water, which ignores large, stored water sources that humans use. We compare water stress estimates derived using either renewable water or the total volume of water globally. We use the best-available data to quantify total aquifer and surface water volumes, as compared to groundwater recharge and surface water runoff from land-surface models. The work presented here should provide a more realistic image of water stress by explicitly quantifying groundwater, defining water availability as total water supply, and using GRACE to more accurately quantify water use.

  16. Quantifying the Labeling and the Levels of Plant Cell Wall Precursors Using Ion Chromatography Tandem Mass Spectrometry

    PubMed Central

    Alonso, Ana P.; Piasecki, Rebecca J.; Wang, Yan; LaClair, Russell W.; Shachar-Hill, Yair

    2010-01-01

    The biosynthesis of cell wall polymers involves enormous fluxes through central metabolism that are not fully delineated and whose regulation is poorly understood. We have established and validated a liquid chromatography tandem mass spectrometry method using multiple reaction monitoring mode to separate and quantify the levels of plant cell wall precursors. Target analytes were identified by their parent/daughter ions and retention times. The method allows the quantification of precursors at low picomole quantities with linear responses up to the nanomole quantity range. When applying the technique to Arabidopsis (Arabidopsis thaliana) T87 cell cultures, 16 hexose-phosphates (hexose-Ps) and nucleotide-sugars (NDP-sugars) involved in cell wall biosynthesis were separately quantified. Using hexose-P and NDP-sugar standards, we have shown that hot water extraction allows good recovery of the target metabolites (over 86%). This method is applicable to quantifying the levels of hexose-Ps and NDP-sugars in different plant tissues, such as Arabidopsis T87 cells in culture and fenugreek (Trigonella foenum-graecum) endosperm tissue, showing higher levels of galacto-mannan precursors in fenugreek endosperm. In Arabidopsis cells incubated with [U-13C-Fru]sucrose, the method was used to track the labeling pattern in cell wall precursors. As the fragmentation of hexose-Ps and NDP-sugars results in high yields of [PO3]− and/or [H2PO4]− ions, mass isotopomers can be quantified directly from the intensity of selected tandem mass spectrometry transitions. The ability to directly measure 13C labeling in cell wall precursors makes possible metabolic flux analysis of cell wall biosynthesis based on dynamic labeling experiments. PMID:20442274

  17. A sprinkling experiment to quantify celerity-velocity differences at the hillslope scale

    NASA Astrophysics Data System (ADS)

    van Verseveld, Willem J.; Barnard, Holly R.; Graham, Chris B.; McDonnell, Jeffrey J.; Renée Brooks, J.; Weiler, Markus

    2017-11-01

    Few studies have quantified the differences between celerity and velocity of hillslope water flow and explained the processes that control these differences. Here, we assess these differences by combining a 24-day hillslope sprinkling experiment with a spatially explicit hydrologic model analysis. We focused our work on Watershed 10 at the H. J. Andrews Experimental Forest in western Oregon. Celerities estimated from wetting front arrival times were generally much faster than average vertical velocities of δ2H. In the model analysis, this was consistent with an identifiable effective porosity (fraction of total porosity available for mass transfer) parameter, indicating that subsurface mixing was controlled by an immobile soil fraction, resulting in the attenuation of the δ2H input signal in lateral subsurface flow. In addition to the immobile soil fraction, exfiltrating deep groundwater that mixed with lateral subsurface flow captured at the experimental hillslope trench caused further reduction in the δ2H input signal. Finally, our results suggest that soil depth variability played a significant role in the celerity-velocity responses. Deeper upslope soils damped the δ2H input signal, while a shallow soil near the trench controlled the δ2H peak in lateral subsurface flow response. Simulated exit time and residence time distributions with our hillslope hydrologic model showed that water captured at the trench did not represent the entire modeled hillslope domain; the exit time distribution for lateral subsurface flow captured at the trench showed more early time weighting.

  18. A sprinkling experiment to quantify celerity-velocity differences at the hillslope scale.

    PubMed

    van Verseveld, Willem J; Barnard, Holly R; Graham, Chris B; McDonnell, Jeffrey J; Brooks, J Renée; Weiler, Markus

    2017-01-01

    Few studies have quantified the differences between celerity and velocity of hillslope water flow and explained the processes that control these differences. Here, we assess these differences by combining a 24-day hillslope sprinkling experiment with a spatially explicit hydrologic model analysis. We focused our work on Watershed 10 at the H. J. Andrews Experimental Forest in western Oregon. Celerities estimated from wetting front arrival times were generally much faster than average vertical velocities of δ2H. In the model analysis, this was consistent with an identifiable effective porosity (fraction of total porosity available for mass transfer) parameter, indicating that subsurface mixing was controlled by an immobile soil fraction, resulting in the attenuation of the δ2H input signal in lateral subsurface flow. In addition to the immobile soil fraction, exfiltrating deep groundwater that mixed with lateral subsurface flow captured at the experimental hillslope trench caused further reduction in the δ2H input signal. Finally, our results suggest that soil depth variability played a significant role in the celerity-velocity responses. Deeper upslope soils damped the δ2H input signal, while a shallow soil near the trench controlled the δ2H peak in lateral subsurface flow response. Simulated exit time and residence time distributions with our hillslope hydrologic model showed that water captured at the trench did not represent the entire modeled hillslope domain; the exit time distribution for lateral subsurface flow captured at the trench showed more early time weighting.

  19. Quantifying Ballistic Armor Performance: A Minimally Invasive Approach

    NASA Astrophysics Data System (ADS)

    Holmes, Gale; Kim, Jaehyun; Blair, William; McDonough, Walter; Snyder, Chad

    2006-03-01

    Theoretical and non-dimensional analyses suggest a critical link between the performance of ballistic resistant armor and the fundamental mechanical properties of the polymeric materials that comprise them. Therefore, a test methodology that quantifies these properties without compromising an armored vest that is exposed to the industry standard V-50 ballistic performance test is needed. Currently, there is considerable speculation about the impact that competing degradation mechanisms (e.g., mechanical, humidity, ultraviolet) may have on ballistic resistant armor. We report on the use of a new test methodology that quantifies the mechanical properties of ballistic fibers and how each proposed degradation mechanism may impact a vest's ballistic performance.

  20. Quantifying dynamic characteristics of human walking for comprehensive gait cycle.

    PubMed

    Mummolo, Carlotta; Mangialardi, Luigi; Kim, Joo H

    2013-09-01

    Normal human walking typically consists of phases during which the body is statically unbalanced while maintaining dynamic stability. Quantifying the dynamic characteristics of human walking can provide better understanding of gait principles. We introduce a novel quantitative index, the dynamic gait measure (DGM), for the comprehensive gait cycle. The DGM quantifies the effects of inertia and the static balance instability in terms of zero-moment point and ground projection of center of mass, and incorporates the time-varying foot support region (FSR) and the threshold between static and dynamic walking. Also, a framework for determining the DGM from experimental data is introduced, in which the gait cycle segmentation is further refined. A multisegmental foot model is integrated into a biped system to reconstruct the walking motion from experiments, which demonstrates the time-varying FSR for different subphases. The proof-of-concept results of the DGM from a gait experiment are demonstrated. The DGM results are analyzed along with other established features and indices of normal human walking. The DGM provides a measure of static balance instability of biped walking during each (sub)phase as well as the entire gait cycle. The DGM of normal human walking has the potential to provide scientific insights into biped walking principles, which can also be useful for engineering and clinical applications.
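
    The static-balance component of the DGM hinges on whether the ground projection of the center of mass lies inside the foot support region. A minimal sketch of that test; the foot outline and CoM position are hypothetical, and the published DGM additionally weights inertial effects via the zero-moment point:

```python
import numpy as np
from matplotlib.path import Path

def statically_balanced(com_xyz, support_polygon_xy):
    """Static balance test at one instant: the ground projection of the
    center of mass must fall inside the foot support region (FSR),
    here modeled as a polygon in the ground plane."""
    ground_projection = com_xyz[:2]  # drop the vertical coordinate
    return Path(support_polygon_xy).contains_point(ground_projection)

# Hypothetical single-support phase: FSR approximated by the stance-foot
# outline (metres), CoM placed ahead of the foot as in late stance.
fsr = np.array([[0.00, -0.05], [0.25, -0.05], [0.25, 0.05], [0.00, 0.05]])
com = np.array([0.32, 0.00, 0.95])
print("statically balanced:", statically_balanced(com, fsr))  # -> False
```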

  1. Quantifying temporal bone morphology of great apes and humans: an approach using geometric morphometrics.

    PubMed

    Lockwood, Charles A; Lynch, John M; Kimbel, William H

    2002-12-01

    The hominid temporal bone offers a complex array of morphology that is linked to several different functional systems. Its frequent preservation in the fossil record gives the temporal bone added significance in the study of human evolution, but its morphology has proven difficult to quantify. In this study we use techniques of 3D geometric morphometrics to quantify differences among humans and great apes and discuss the results in a phylogenetic context. Twenty-three landmarks on the ectocranial surface of the temporal bone provide a high level of anatomical detail. Generalized Procrustes analysis (GPA) is used to register (adjust for position, orientation and scale) landmark data from 405 adults representing Homo, Pan, Gorilla and Pongo. Principal components analysis of residuals from the GPA shows that the major source of variation is between humans and apes. Human characteristics such as a coronally orientated petrous axis, a deep mandibular fossa, a projecting mastoid process, and reduced lateral extension of the tympanic element strongly impact the analysis. In phenetic cluster analyses, gorillas and orangutans group together with respect to chimpanzees, and all apes group together with respect to humans. Thus, the analysis contradicts depictions of African apes as a single morphotype. Gorillas and orangutans lack the extensive preglenoid surface of chimpanzees, and their mastoid processes are less medially inflected. These and other characters shared by gorillas and orangutans are probably primitive for the African hominid clade.
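
    Generalized Procrustes analysis, as used here, iteratively removes position, scale and rotation from each landmark configuration by aligning it to an evolving mean shape; the residuals then feed the principal components analysis. A minimal sketch with synthetic landmark data standing in for the 405-specimen sample:

```python
import numpy as np
from scipy.linalg import orthogonal_procrustes

def gpa(shapes, iters=10):
    """Generalized Procrustes analysis: remove position and scale from
    each landmark configuration, then iteratively rotate every
    configuration onto the evolving mean shape."""
    aligned = np.array([(s - s.mean(axis=0)) / np.linalg.norm(s - s.mean(axis=0))
                        for s in shapes])
    for _ in range(iters):
        mean = aligned.mean(axis=0)
        mean /= np.linalg.norm(mean)
        for i, s in enumerate(aligned):
            rotation, _ = orthogonal_procrustes(s, mean)  # best rotation s -> mean
            aligned[i] = s @ rotation
    return aligned, aligned.mean(axis=0)

# Synthetic stand-in for the study's data: 10 specimens, 23 landmarks in 3-D.
rng = np.random.default_rng(3)
base = rng.standard_normal((23, 3))
shapes = [base + 0.05 * rng.standard_normal((23, 3)) for _ in range(10)]
aligned, mean_shape = gpa(shapes)
residuals = (aligned - mean_shape).reshape(len(aligned), -1)  # rows feed a PCA
print("residual matrix for PCA:", residuals.shape)
```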

  2. Quantifying temporal bone morphology of great apes and humans: an approach using geometric morphometrics

    PubMed Central

    Lockwood, Charles A; Lynch, John M; Kimbel, William H

    2002-01-01

    The hominid temporal bone offers a complex array of morphology that is linked to several different functional systems. Its frequent preservation in the fossil record gives the temporal bone added significance in the study of human evolution, but its morphology has proven difficult to quantify. In this study we use techniques of 3D geometric morphometrics to quantify differences among humans and great apes and discuss the results in a phylogenetic context. Twenty-three landmarks on the ectocranial surface of the temporal bone provide a high level of anatomical detail. Generalized Procrustes analysis (GPA) is used to register (adjust for position, orientation and scale) landmark data from 405 adults representing Homo, Pan, Gorilla and Pongo. Principal components analysis of residuals from the GPA shows that the major source of variation is between humans and apes. Human characteristics such as a coronally orientated petrous axis, a deep mandibular fossa, a projecting mastoid process, and reduced lateral extension of the tympanic element strongly impact the analysis. In phenetic cluster analyses, gorillas and orangutans group together with respect to chimpanzees, and all apes group together with respect to humans. Thus, the analysis contradicts depictions of African apes as a single morphotype. Gorillas and orangutans lack the extensive preglenoid surface of chimpanzees, and their mastoid processes are less medially inflected. These and other characters shared by gorillas and orangutans are probably primitive for the African hominid clade. PMID:12489757

  3. Quantifying uncertainty in discharge measurements: A new approach

    USGS Publications Warehouse

    Kiang, J.E.; Cohn, T.A.; Mason, R.R.

    2009-01-01

    The accuracy of discharge measurements using velocity meters and the velocity-area method is typically assessed based on empirical studies that may not correspond to conditions encountered in practice. In this paper, a statistical approach for assessing uncertainty based on interpolated variance estimation (IVE) is introduced. The IVE method quantifies all sources of random uncertainty in the measured data. This paper presents results employing data from sites where substantial over-sampling allowed for the comparison of IVE-estimated uncertainty and observed variability among repeated measurements. These results suggest that the IVE approach can provide approximate estimates of measurement uncertainty. The use of IVE to estimate the uncertainty of a discharge measurement would provide the hydrographer an immediate determination of uncertainty and help determine whether there is a need for additional sampling in problematic river cross sections. © 2009 ASCE.

  4. Lunar Water Resource Demonstration (LWRD) Test Results

    NASA Technical Reports Server (NTRS)

    Muscatello, Anthony C.; Captain, Janine E.; Quinn, Jacqueline W.; Gibson, Tracy L.; Perusich, Stephen A.; Weis, Kyle H.

    2009-01-01

    NASA has undertaken the In-Situ Resource Utilization (ISRU) project called RESOLVE (Regolith and Environment Science & Oxygen and Lunar Volatile Extraction). This project is an Earth-based lunar precursor demonstration of a system that could be sent to explore permanently shadowed polar lunar craters, where it would drill into regolith, quantify the volatiles that are present, and extract oxygen by hydrogen reduction of iron oxides. The RESOLVE chemical processing system was mounted within the CMU rover "Scarab" and successfully demonstrated on Hawaii's Mauna Kea volcano in November 2008. This technology could be used on Mars as well. As described at the 2008 Mars Society Convention, the Lunar Water Resource Demonstration (LWRD) supports the objectives of the RESOLVE project by capturing and quantifying water and hydrogen released by regolith upon heating. Field test results for the quantification of water using LWRD showed that the volcanic ash (tephra) samples contained 0.15-0.41% water, in agreement with GC water measurements. Reduction of the RH in the surge tank to near zero during recirculation shows that the water is captured by the water beds as desired. The water can be recovered by heating the water beds to 230 °C or higher. Test results for the capture and quantification of pure hydrogen have shown that over 90% of the hydrogen can be captured and 98% of the absorbed hydrogen can be recovered upon heating the hydride to 400 °C and desorbing the hydrogen several times into the evacuated surge tank. Thus, the essential requirement of capturing hydrogen and recovering it has been demonstrated.

  5. Quantifying Biodiversity Losses Due to Human Consumption: A Global-Scale Footprint Analysis.

    PubMed

    Wilting, Harry C; Schipper, Aafke M; Bakkenes, Michel; Meijer, Johan R; Huijbregts, Mark A J

    2017-03-21

    It is increasingly recognized that human consumption leads to considerable losses of biodiversity. This study is the first to systematically quantify these losses in relation to land use and greenhouse gas (GHG) emissions associated with the production and consumption of (inter)nationally traded goods and services by presenting consumption-based biodiversity losses, in short biodiversity footprint, for 45 countries and world regions globally. Our results showed that (i) the biodiversity loss per citizen shows large variations among countries, with higher values when per-capita income increases; (ii) the share of biodiversity losses due to GHG emissions in the biodiversity footprint increases with income; (iii) food consumption is the most important driver of biodiversity loss in most of the countries and regions, with a global average of 40%; (iv) more than 50% of the biodiversity loss associated with consumption in developed economies occurs outside their territorial boundaries; and (v) the biodiversity footprint per dollar consumed is lower for wealthier countries. The insights provided by our analysis might support policymakers in developing adequate responses to avert further losses of biodiversity when population and incomes increase. Both the mitigation of GHG emissions and land use related reduction options in production and consumption should be considered in strategies to protect global biodiversity.

  6. The Use of Wearable Microsensors to Quantify Sport-Specific Movements.

    PubMed

    Chambers, Ryan; Gabbett, Tim J; Cole, Michael H; Beard, Adam

    2015-07-01

    Microtechnology has allowed sport scientists to understand the locomotor demands of various sports. While wearable global positioning technology has been used to quantify the locomotor demands of sporting activities, microsensors (i.e. accelerometers, gyroscopes and magnetometers) embedded within the units also have the capability to detect sport-specific movements. The objective of this study was to determine the extent to which microsensors (also referred to as inertial measurement units and microelectromechanical sensors) have been utilised in quantifying sport-specific movements. A systematic review of the use of microsensors and associated terms to evaluate sport-specific movements was conducted; permutations of the terms used included alternate names of the various technologies used, their applications and different applied environments. Studies for this review were published between 2008 and 2014 and were identified through a systematic search of six electronic databases: Academic Search Complete, CINAHL, PsycINFO, PubMed, SPORTDiscus, and Web of Science. Articles were required to have used athlete-mounted sensors to detect sport-specific movements (e.g. rugby union tackle) rather than sensors mounted to equipment and monitoring generic movement patterns. A total of 2395 studies were initially retrieved from the six databases and 737 results were removed as they were duplicates, review articles or conference abstracts. After screening titles and abstracts of the remaining papers, the full text of 47 papers was reviewed, resulting in the inclusion of 28 articles that met the set criteria around the application of microsensors for detecting sport-specific movements. Eight articles addressed the use of microsensors within individual sports, team sports provided seven results, water sports provided eight articles, and five articles addressed the use of microsensors in snow sports. All articles provided evidence of the ability of microsensors to detect sport

  7. Probabilistic structural analysis to quantify uncertainties associated with turbopump blades

    NASA Technical Reports Server (NTRS)

    Nagpal, Vinod K.; Rubinstein, Robert; Chamis, Christos C.

    1988-01-01

    A probabilistic study of turbopump blades has been in progress at NASA Lewis Research Center for the past two years. The objectives of this study are to evaluate the effects of uncertainties in geometry and material properties on the structural response of the turbopump blades and to evaluate the tolerance limits on the design. A methodology based on a probabilistic approach was developed to quantify the effects of the random uncertainties. The results indicate that only the variations in geometry have significant effects.

  8. Quantifying radionuclide signatures from a γ-γ coincidence system.

    PubMed

    Britton, Richard; Jackson, Mark J; Davies, Ashley V

    2015-11-01

    A method for quantifying gamma coincidence signatures has been developed, and tested in conjunction with a high-efficiency multi-detector system to quickly identify trace amounts of radioactive material. The γ-γ system utilises fully digital electronics and list-mode acquisition to time-stamp each event, allowing coincidence matrices to be easily produced alongside typical 'singles' spectra. To quantify the coincidence signatures a software package has been developed to calculate efficiency and cascade summing corrected branching ratios. This utilises ENSDF records as an input, and can be fully automated, allowing the user to quickly and easily create/update a coincidence library that contains all possible γ and conversion electron cascades, associated cascade emission probabilities, and true-coincidence summing corrected γ cascade detection probabilities. It is also fully searchable by energy, nuclide, coincidence pair, γ multiplicity, cascade probability and half-life of the cascade. The probabilities calculated were tested using measurements performed on the γ-γ system, and found to provide accurate results for the nuclides investigated. Given the flexibility of the method, (it only relies on evaluated nuclear data, and accurate efficiency characterisations), the software can now be utilised for a variety of systems, quickly and easily calculating coincidence signature probabilities. Crown Copyright © 2015. Published by Elsevier Ltd. All rights reserved.
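
    For a simple two-step cascade, the coincidence signature probability such software computes reduces to the cascade emission probability (the product of branching ratios along the decay path) times the full-energy-peak efficiencies of the two detectors; the real calculation also folds in conversion electrons and true-coincidence summing. A minimal sketch with placeholder numbers, not values from any evaluated nuclear data library:

```python
def cascade_coincidence_prob(p_cascade, eff_g1, eff_g2):
    """Probability of registering both gammas of a two-step cascade in
    their full-energy peaks in two separate detectors: the cascade
    emission probability times the two peak efficiencies."""
    return p_cascade * eff_g1 * eff_g2

# Placeholder inputs for illustration only.
p_cascade = 0.85 * 0.99  # product of branching ratios along the decay path
eff_g1 = 0.032           # full-energy-peak efficiency at the first gamma energy
eff_g2 = 0.029           # full-energy-peak efficiency at the second gamma energy
print(f"coincidence signature probability: "
      f"{cascade_coincidence_prob(p_cascade, eff_g1, eff_g2):.2e}")
```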

  9. Quantifying the Electrocatalytic Turnover of Vitamin B12-Mediated Dehalogenation on Single Soft Nanoparticles.

    PubMed

    Cheng, Wei; Compton, Richard G

    2016-02-12

    We report the electrocatalytic dehalogenation of trichloroethylene (TCE) by single soft nanoparticles in the form of Vitamin B12-containing droplets. We quantify the turnover number of the catalytic reaction at the single soft nanoparticle level. The kinetic data shows that the binding of TCE with the electro-reduced vitamin in the Co(I) oxidation state is chemically reversible. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  10. Quantifying Uncertainty in Expert Judgment: Initial Results

    DTIC Science & Technology

    2013-03-01

    lines of source code were added in... C++ = 32%; JavaScript = 29%; XML = 15%; C = 7%; CSS = 7%; Java = 5%; Other = 5%; LOC = 927,266... how much total effort in person years has been spent on this project? ... MySQL, the most popular Open Source SQL... such as MySQL, Oracle, PostgreSQL, MS SQL Server, ODBC, or Interbase. Features include email reminders, iCal/vCal import/export, remote subscriptions

  11. Quantifying and measuring cyber resiliency

    NASA Astrophysics Data System (ADS)

    Cybenko, George

    2016-05-01

    Cyber resiliency has become an increasingly attractive research and operational concept in cyber security. While several metrics have been proposed for quantifying cyber resiliency, a considerable gap remains between those metrics and operationally measurable and meaningful concepts that can be empirically determined in a scientific manner. This paper describes a concrete notion of cyber resiliency that can be tailored to meet specific needs of organizations that seek to introduce resiliency into their assessment of their cyber security posture.

  12. Quantifying the plant actin cytoskeleton response to applied pressure using nanoindentation.

    PubMed

    Branco, Rémi; Pearsall, Eliza-Jane; Rundle, Chelsea A; White, Rosemary G; Bradby, Jodie E; Hardham, Adrienne R

    2017-03-01

    Detection of potentially pathogenic microbes through recognition by plants and animals of both physical and chemical signals associated with the pathogens is vital for host well-being. Signal perception leads to the induction of a variety of responses that augment pre-existing, constitutive defences. The plant cell wall is a highly effective preformed barrier which becomes locally reinforced at the infection site through delivery of new wall material by the actin cytoskeleton. Although mechanical stimulation can produce a reaction, there is little understanding of the nature of physical factors capable of triggering plant defence. Neither the magnitude of forces nor the contact time required has been quantified. In the study reported here, mechanical stimulation with a tungsten microneedle has been used to quantify the response of Arabidopsis plants expressing an actin-binding protein tagged with green fluorescent protein (GFP) to reveal the organisation of the actin cytoskeleton. Using confocal microscopy, the response time for actin reorganisation in epidermal cells of Arabidopsis hypocotyls was shown to be 116 ± 49 s. Using nanoindentation and a diamond spherical tip indenter, the magnitude of the forces capable of triggering an actin response has been quantified. We show that Arabidopsis hypocotyl cells can detect a force as small as 4 μN applied for as short a time as 21.6 s to trigger reorganisation of the actin cytoskeleton. This force is an order of magnitude less than the potential invasive force determined for a range of fungal and oomycete plant pathogens. To our knowledge, this is the first quantification of the magnitude and duration of mechanical forces capable of stimulating a structural defence response in a plant cell.

  13. Classifying and quantifying basins of attraction

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sprott, J. C.; Xiong, Anda

    2015-08-15

    A scheme is proposed to classify the basins for attractors of dynamical systems in arbitrary dimensions. There are four basic classes depending on their size and extent, and each class can be further quantified to facilitate comparisons. The calculation uses a Monte Carlo method and is applied to numerous common dissipative chaotic maps and flows in various dimensions.
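
    The Monte Carlo element of such a classification can be illustrated by sampling initial conditions in disks of growing radius and recording the fraction that stay bounded. A minimal sketch using the Hénon map; the paper's four-class scheme involves more than this single statistic:

```python
import numpy as np

def henon(p, a=1.4, b=0.3):
    """One iterate of the Henon map."""
    x, y = p
    return np.array([1.0 - a * x * x + y, b * x])

def basin_fraction(radius, n_samples=1000, n_iter=200, escape=1.0e6):
    """Monte Carlo estimate of the fraction of initial conditions,
    sampled uniformly in a disk of the given radius about the origin,
    that remain bounded (i.e. belong to the attractor's basin)."""
    rng = np.random.default_rng(0)
    hits = 0
    for _ in range(n_samples):
        r = radius * np.sqrt(rng.uniform())
        theta = rng.uniform(0.0, 2.0 * np.pi)
        p = np.array([r * np.cos(theta), r * np.sin(theta)])
        for _ in range(n_iter):
            p = henon(p)
            if np.abs(p).max() > escape:
                break        # escaped: not in the basin
        else:
            hits += 1        # stayed bounded: count as a basin member
    return hits / n_samples

# Sweeping the sampling radius probes the basin's size and extent.
for radius in (0.5, 1.0, 2.0):
    print(f"r={radius}: basin fraction ~ {basin_fraction(radius):.2f}")
```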

  14. Subtleties of Hidden Quantifiers in Implication

    ERIC Educational Resources Information Center

    Shipman, Barbara A.

    2016-01-01

    Mathematical conjectures and theorems are most often of the form P(x) ⇒ Q(x), meaning ∀x, P(x) ⇒ Q(x). The hidden quantifier ∀x is crucial in understanding the implication as a statement with a truth value. Here P(x) and Q(x) alone are only predicates, without truth values, since they contain unquantified variables. But standard textbook…

  15. Quantifying Semantic Linguistic Maturity in Children

    ERIC Educational Resources Information Center

    Hansson, Kristina; Bååth, Rasmus; Löhndorf, Simone; Sahlén, Birgitta; Sikström, Sverker

    2016-01-01

    We propose a method to quantify "semantic linguistic maturity" (SELMA) based on a high dimensional semantic representation of words created from the co-occurrence of words in a large text corpus. The method was applied to oral narratives from 108 children aged 4;0-12;10. By comparing the SELMA measure with maturity ratings made by human…

  16. Quantifying effects of retinal illuminance on frequency doubling perimetry.

    PubMed

    Swanson, William H; Dul, Mitchell W; Fischer, Susan E

    2005-01-01

    To measure and quantify effects of variation in retinal illuminance on frequency doubling technology (FDT) perimetry. A Zeiss-Humphrey/Welch Allyn FDT perimeter was used with the threshold N-30 strategy. Study 1, quantifying adaptation: 11 eyes of 11 subjects (24-46 years old) were tested with natural pupils, and then retested after stable pupillary dilation with neutral density filters of 0.0, 0.6, 1.2, and 1.6 log unit in front of the subject's eye. Study 2, predicting effect of reduced illuminance: 17 eyes of 17 subjects (26-61 years old) were tested with natural pupils, and then retested after stable pupillary miosis (assessed with an infrared camera). A quantitative adaptation model was fit to results of Study 1; the mean adaptation parameter was used to predict change in Study 2. Study 1: Mean defect (MD) decreased by 10 dB over a 1.6 log unit range of retinal illuminances; model fits for all subjects had r² > 95%. Study 2: Change in MD (ΔMD) ranged from -7.3 dB to +0.8 dB. The mean adaptation parameter from Study 1 accounted for 69% of the variance in ΔMD (P < 0.0005), and accuracy of the model was independent of the magnitude of ΔMD (r² < 1%, P > 0.75). The results confirmed previous findings that FDT perimetry can be dramatically affected by variations in retinal illuminance. Application of a quantitative adaptation model provided guidelines for estimating effects of pupil diameter and lens density on FDT perimetry.

  17. Quantifying the Nonlinear, Anisotropic Material Response of Spinal Ligaments

    NASA Astrophysics Data System (ADS)

    Robertson, Daniel J.

    Spinal ligaments may be a significant source of chronic back pain, yet they are often disregarded by the clinical community due to a lack of information with regards to their material response, and innervation characteristics. The purpose of this dissertation was to characterize the material response of spinal ligaments and to review their innervation characteristics. Review of relevant literature revealed that all of the major spinal ligaments are innervated. They cause painful sensations when irritated and provide reflexive control of the deep spinal musculature. As such, including the neurologic implications of iatrogenic ligament damage in the evaluation of surgical procedures aimed at relieving back pain will likely result in more effective long-term solutions. The material response of spinal ligaments has not previously been fully quantified due to limitations associated with standard soft tissue testing techniques. The present work presents and validates a novel testing methodology capable of overcoming these limitations. In particular, the anisotropic, inhomogeneous material constitutive properties of the human supraspinous ligament are quantified and methods for determining the response of the other spinal ligaments are presented. In addition, a method for determining the anisotropic, inhomogeneous pre-strain distribution of the spinal ligaments is presented. The multi-axial pre-strain distributions of the human anterior longitudinal ligament, ligamentum flavum and supraspinous ligament were determined using this methodology. Results from this work clearly demonstrate that spinal ligaments are not uniaxial structures, and that finite element models which account for pre-strain and incorporate ligament's complex material properties may provide increased fidelity to the in vivo condition.

  18. Quantifying Particle Numbers and Mass Flux in Drifting Snow

    NASA Astrophysics Data System (ADS)

    Crivelli, Philip; Paterna, Enrico; Horender, Stefan; Lehning, Michael

    2016-12-01

    We compare two of the most common methods of quantifying mass flux, particle numbers and particle-size distribution for drifting snow events: the snow-particle counter (SPC), a laser-diode-based particle detector, and particle tracking velocimetry based on digital shadowgraphic imaging. The two methods were correlated for mass flux and particle number flux. For the SPC measurements, the device was calibrated by the manufacturer beforehand. The shadowgraphic imaging method measures particle size and velocity directly from consecutive images, and before each new test the image pixel length is newly calibrated. A calibration study with artificially scattered sand particles and glass beads provided suitable settings for the shadowgraphic imaging as well as a first correlation of the two methods in a controlled environment. In addition, using snow collected in trays during snowfall, several experiments were performed to observe drifting snow events in a cold wind tunnel. The results demonstrate a high correlation between the mass fluxes obtained for the calibration studies (r ≥ 0.93) and good correlation for the drifting snow experiments (r ≥ 0.81). The impact of measurement settings is discussed in order to reliably quantify particle numbers and mass flux in drifting snow. The study was designed and performed to optimize the settings of the digital shadowgraphic imaging system for both the acquisition and the processing of particles in a drifting snow event. Our results suggest that these optimal settings can be transferred to different imaging set-ups to investigate sediment transport processes.
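
    As a toy illustration of the inter-method comparison, a Pearson correlation between hypothetical SPC and shadowgraphic mass-flux measurements taken over the same intervals:

    ```python
    import numpy as np

    # Hypothetical paired mass-flux measurements (g m^-2 s^-1) from the two methods.
    spc_flux = np.array([0.8, 1.5, 2.2, 3.1, 4.0, 5.2])
    ptv_flux = np.array([0.7, 1.6, 2.0, 3.3, 4.4, 5.0])

    r = np.corrcoef(spc_flux, ptv_flux)[0, 1]
    print(f"Pearson r = {r:.2f}")   # the study reports r >= 0.81 for drifting snow
    ```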

  19. Quantifying similarity in reliability surfaces using the probability of agreement

    DOE PAGES

    Stevens, Nathaniel T.; Anderson-Cook, Christine Michaela

    2017-03-30

    When separate populations exhibit similar reliability as a function of multiple explanatory variables, combining them into a single population is tempting. This can simplify future predictions and reduce uncertainty associated with estimation. However, combining these populations may introduce bias if the underlying relationships are in fact different. The probability of agreement formally and intuitively quantifies the similarity of estimated reliability surfaces across a two-factor input space. An example from the reliability literature demonstrates the utility of the approach when deciding whether to combine two populations or to keep them distinct. As a result, new graphical summaries provide strategies for visualizing the results.
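
    A minimal sketch of a probability-of-agreement calculation, assuming logistic reliability surfaces and normally distributed coefficient estimates; all numbers are hypothetical, not the paper's:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    def reliability(beta, x):
        # Logistic reliability surface over a two-factor input x = (x1, x2).
        eta = beta[:, 0] + beta[:, 1] * x[0] + beta[:, 2] * x[1]
        return 1.0 / (1.0 + np.exp(-eta))

    # Hypothetical coefficient estimates (mean, covariance) for two populations.
    b1 = rng.multivariate_normal([1.0, 0.8, -0.5], np.diag([0.05, 0.02, 0.02]), 2000)
    b2 = rng.multivariate_normal([1.2, 0.7, -0.4], np.diag([0.05, 0.02, 0.02]), 2000)
    delta = 0.05  # reliabilities within +/-0.05 are treated as "in agreement"

    grid = [(u, v) for u in np.linspace(0, 1, 11) for v in np.linspace(0, 1, 11)]
    poa = [np.mean(np.abs(reliability(b1, x) - reliability(b2, x)) <= delta)
           for x in grid]
    print(f"probability of agreement: min={min(poa):.2f}, mean={np.mean(poa):.2f}")
    ```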

  20. Quantifying similarity in reliability surfaces using the probability of agreement

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stevens, Nathaniel T.; Anderson-Cook, Christine Michaela

    When separate populations exhibit similar reliability as a function of multiple explanatory variables, combining them into a single population is tempting. This can simplify future predictions and reduce uncertainty associated with estimation. However, combining these populations may introduce bias if the underlying relationships are in fact different. The probability of agreement formally and intuitively quantifies the similarity of estimated reliability surfaces across a two-factor input space. An example from the reliability literature demonstrates the utility of the approach when deciding whether to combine two populations or to keep them distinct. As a result, new graphical summaries provide strategies for visualizing the results.

  1. A method of semi-quantifying β-AP in brain PET-CT 11C-PiB images.

    PubMed

    Jiang, Jiehui; Lin, Xiaoman; Wen, Junlin; Huang, Zhemin; Yan, Zhuangzhi

    2014-01-01

    Alzheimer's disease (AD) is a common health problem in elderly populations. Positron emission tomography-computed tomography (PET-CT) 11C-PiB imaging of amyloid-β peptide (β-AP) is an advanced method for diagnosing AD at an early stage. However, in practice radiologists lack a standardized value to semi-quantify β-AP. This paper proposes such a standardized value: SVβ-AP. This standardized value measures the mean ratio between the dimensions of β-AP areas in PET and CT images. A computer-aided diagnosis (CAD) approach is also proposed to compute SVβ-AP. A simulation experiment was carried out to pre-test the technical feasibility of the CAD approach and SVβ-AP. The experimental results showed that the approach is technically feasible.
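
    A minimal reading of the proposed standardized value as stated above (a mean PET-to-CT area ratio), with hypothetical segmentation areas:

    ```python
    import numpy as np

    # Hypothetical beta-AP region areas (mm^2) segmented on matched PET and CT slices.
    pet_areas = np.array([410.0, 385.0, 402.0])
    ct_areas = np.array([520.0, 500.0, 515.0])

    # Mean ratio between the PET and CT area dimensions, per the abstract.
    sv_beta_ap = float(np.mean(pet_areas / ct_areas))
    print(f"SV_beta-AP = {sv_beta_ap:.3f}")
    ```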

  2. Lamb Wave Dispersion Ultrasound Vibrometry (LDUV) Method for Quantifying Mechanical Properties of Viscoelastic Solids

    PubMed Central

    Nenadic, Ivan Z.; Urban, Matthew W.; Mitchell, Scott A.; Greenleaf, James F.

    2011-01-01

    Diastolic dysfunction is the inability of the left ventricle to supply sufficient stroke volumes under normal physiological conditions and is often accompanied by stiffening of the left-ventricular myocardium. A noninvasive technique capable of quantifying viscoelasticity of the myocardium would be beneficial in clinical settings. Our group has been investigating the use of Shearwave Dispersion Ultrasound Vibrometry (SDUV), a noninvasive ultrasound based method for quantifying viscoelasticity of soft tissues. The primary motive of this study is the design and testing of viscoelastic materials suitable for validation of the Lamb wave Dispersion Ultrasound Vibrometry (LDUV), an SDUV-based technique for measuring viscoelasticity of tissues with plate-like geometry. We report the results of quantifying viscoelasticity of urethane rubber and gelatin samples using LDUV and an embedded sphere method. The LDUV method was used to excite antisymmetric Lamb waves and measure the dispersion in urethane rubber and gelatin plates. An antisymmetric Lamb wave model was fitted to the wave speed dispersion data to estimate elasticity and viscosity of the materials. A finite element model of a viscoelastic plate submerged in water was used to study the appropriateness of the Lamb wave dispersion equations. An embedded sphere method was used as an independent measurement of the viscoelasticity of the urethane rubber and gelatin. The FEM dispersion data were in excellent agreement with the theoretical predictions. Viscoelasticity of the urethane rubber and gelatin obtained using the LDUV and embedded sphere methods agreed within one standard deviation. LDUV studies on excised porcine myocardium sample were performed to investigate the feasibility of the approach in preparation for open-chest in vivo studies. The results suggest that the LDUV technique can be used to quantify mechanical properties of soft tissues with a plate-like geometry. PMID:21403186

  3. Lamb wave dispersion ultrasound vibrometry (LDUV) method for quantifying mechanical properties of viscoelastic solids.

    PubMed

    Nenadic, Ivan Z; Urban, Matthew W; Mitchell, Scott A; Greenleaf, James F

    2011-04-07

    Diastolic dysfunction is the inability of the left ventricle to supply sufficient stroke volumes under normal physiological conditions and is often accompanied by stiffening of the left-ventricular myocardium. A noninvasive technique capable of quantifying viscoelasticity of the myocardium would be beneficial in clinical settings. Our group has been investigating the use of shear wave dispersion ultrasound vibrometry (SDUV), a noninvasive ultrasound-based method for quantifying viscoelasticity of soft tissues. The primary motive of this study is the design and testing of viscoelastic materials suitable for validation of the Lamb wave dispersion ultrasound vibrometry (LDUV), an SDUV-based technique for measuring viscoelasticity of tissues with plate-like geometry. We report the results of quantifying viscoelasticity of urethane rubber and gelatin samples using LDUV and an embedded sphere method. The LDUV method was used to excite antisymmetric Lamb waves and measure the dispersion in urethane rubber and gelatin plates. An antisymmetric Lamb wave model was fitted to the wave speed dispersion data to estimate elasticity and viscosity of the materials. A finite element model of a viscoelastic plate submerged in water was used to study the appropriateness of the Lamb wave dispersion equations. An embedded sphere method was used as an independent measurement of the viscoelasticity of the urethane rubber and gelatin. The FEM dispersion data were in excellent agreement with the theoretical predictions. Viscoelasticity of the urethane rubber and gelatin obtained using the LDUV and embedded sphere methods agreed within one standard deviation. LDUV studies on excised porcine myocardium sample were performed to investigate the feasibility of the approach in preparation for open-chest in vivo studies. The results suggest that the LDUV technique can be used to quantify the mechanical properties of soft tissues with a plate-like geometry.
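
    For context on the dispersion-fitting step: SDUV-family methods commonly model soft tissue as a Kelvin-Voigt solid with shear elasticity μ1 and shear viscosity μ2, and fit a wave-speed dispersion relation to the measured speeds. The generic Voigt shear-wave relation is sketched below; the antisymmetric Lamb-wave relation that LDUV actually fits plays the analogous role for plate geometry, so treat this as orientation rather than the paper's exact equation:

    ```latex
    % Kelvin-Voigt shear-wave phase velocity versus angular frequency \omega,
    % with density \rho, shear elasticity \mu_1, and shear viscosity \mu_2:
    c_s(\omega) = \sqrt{\frac{2\,\bigl(\mu_1^{2} + \omega^{2}\mu_2^{2}\bigr)}
                             {\rho\,\bigl(\mu_1 + \sqrt{\mu_1^{2} + \omega^{2}\mu_2^{2}}\bigr)}}
    ```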

  4. Quantifying Uncertainties from Presence Data Sampling Methods for Species Distribution Modeling: Focused on Vegetation.

    NASA Astrophysics Data System (ADS)

    Sung, S.; Kim, H. G.; Lee, D. K.; Park, J. H.; Mo, Y.; Kil, S.; Park, C.

    2016-12-01

    The impact of climate change has been observed throughout the globe. Ecosystems experience rapid changes such as vegetation shifts and species extinctions. In this context, species distribution modeling (SDM) is a popular method to project the impact of climate change on ecosystems. SDM is based on the niche of a species, so presence point data are essential to characterize that niche. Running SDMs for plants requires particular consideration of the characteristics of vegetation data. Vegetation data over large areas are normally produced with remote sensing techniques; consequently, the exact locations of presence data carry high uncertainty, because presence points are selected from polygon and raster datasets. Thus, sampling methods for vegetation presence data should be carefully selected. In this study, we used three different sampling methods to select vegetation presence data: random sampling, stratified sampling, and site-index-based sampling. We used the R package BIOMOD2 to assess uncertainty from modeling, and included BioCLIM variables and other environmental variables as input data. Despite differences among the 10 SDMs, the sampling methods produced different ROC values: random sampling yielded the lowest ROC value, while site-index-based sampling yielded the highest. As a result of this study, the uncertainties arising from presence data sampling methods and SDM can be quantified.
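
    A minimal sketch of the three presence-data sampling strategies on a toy raster; the stratum and site-index definitions are assumptions for illustration, not the study's:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical presence cells: rows of (x, y, stratum, site_index).
    cells = np.array([(x, y, x // 10, rng.random())
                      for x in range(30) for y in range(30)], dtype=float)
    n = 50

    # 1) Random sampling: every presence cell is equally likely.
    random_pts = cells[rng.choice(len(cells), n, replace=False)]

    # 2) Stratified sampling: equal draws from each stratum (e.g., elevation band).
    strata = np.unique(cells[:, 2])
    per = n // len(strata)
    stratified_pts = np.vstack([
        cells[rng.choice(np.flatnonzero(cells[:, 2] == s), per, replace=False)]
        for s in strata])

    # 3) Site-index-based sampling: keep the cells with the highest site index.
    site_pts = cells[np.argsort(cells[:, 3])[-n:]]
    ```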

  5. Quantifying and Monetizing Renewable Energy Resiliency

    DOE PAGES

    Anderson, Kate H.; Laws, Nicholas D.; Marr, Spencer; ...

    2018-03-23

    Energy resiliency has been thrust to the forefront by recent severe weather events and natural disasters. Billions of dollars are lost each year due to power outages. This article highlights the unique value renewable energy hybrid systems (REHS), composed of solar, energy storage, and generators, provide in increasing resiliency. We present a methodology to quantify the amount and value of resiliency provided by REHS, and ways to monetize this resiliency value through insurance premium discounts. A case study of buildings in New York City demonstrates how implementing REHS in place of traditional backup diesel generators can double the amount of outage survivability, with an added value of $781,200. For a Superstorm Sandy type event, results indicate that insurance premium reductions could support up to 4% of the capital cost of REHS, and the potential exists to prevent up to $2.5 billion in business interruption losses with increased REHS deployment.

  6. Quantifying and Monetizing Renewable Energy Resiliency

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Anderson, Kate H.; Laws, Nicholas D.; Marr, Spencer

    Energy resiliency has been thrust to the forefront by recent severe weather events and natural disasters. Billions of dollars are lost each year due to power outages. This article highlights the unique value renewable energy hybrid systems (REHS), composed of solar, energy storage, and generators, provide in increasing resiliency. We present a methodology to quantify the amount and value of resiliency provided by REHS, and ways to monetize this resiliency value through insurance premium discounts. A case study of buildings in New York City demonstrates how implementing REHS in place of traditional backup diesel generators can double the amount of outage survivability, with an added value of $781,200. For a Superstorm Sandy type event, results indicate that insurance premium reductions could support up to 4% of the capital cost of REHS, and the potential exists to prevent up to $2.5 billion in business interruption losses with increased REHS deployment.

  7. Quantifying Access Disparities in Response Plans

    PubMed Central

    Indrakanti, Saratchandra; Mikler, Armin R.; O’Neill, Martin; Tiwari, Chetan

    2016-01-01

    Effective response planning and preparedness are critical to the health and well-being of communities in the face of biological emergencies. Response plans involving mass prophylaxis may seem feasible when considering the choice of points of dispensing (PODs) within a region, overall population density, and estimated traffic demands. However, the plan may fail to serve particular vulnerable subpopulations, resulting in access disparities during emergency response. For a response plan to be effective, sufficient mitigation resources must be made accessible to target populations within short, federally-mandated time frames. A major challenge in response plan design is to establish a balance between the allocation of available resources and the provision of equal access to PODs for all individuals in a given geographic region. Limitations on the availability, granularity, and currency of data to identify vulnerable populations further complicate the planning process. To address these challenges and limitations, data-driven methods to quantify vulnerabilities in the context of response plans have been developed and are explored in this article. PMID:26771551

  8. Quantifying Ant Activity Using Vibration Measurements

    PubMed Central

    Oberst, Sebastian; Baro, Enrique Nava; Lai, Joseph C. S.; Evans, Theodore A.

    2014-01-01

    Ant behaviour is of great interest due to their sociality. Ant behaviour is typically observed visually, however there are many circumstances where visual observation is not possible. It may be possible to assess ant behaviour using vibration signals produced by their physical movement. We demonstrate through a series of bioassays with different stimuli that the level of activity of meat ants (Iridomyrmex purpureus) can be quantified using vibrations, corresponding to observations with video. We found that ants exposed to physical shaking produced the highest average vibration amplitudes followed by ants with stones to drag, then ants with neighbours, illuminated ants and ants in darkness. In addition, we devised a novel method based on wavelet decomposition to separate the vibration signal owing to the initial ant behaviour from the substrate response, which will allow signals recorded from different substrates to be compared directly. Our results indicate the potential to use vibration signals to classify some ant behaviours in situations where visual observation could be difficult. PMID:24658467
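
    A minimal sketch of wavelet-based signal separation using the PyWavelets package; the wavelet choice, decomposition level, and the assumption that the coarse approximation represents the substrate response are illustrative, not the paper's exact procedure:

    ```python
    import numpy as np
    import pywt

    # Toy vibration record: a slow substrate component plus broadband "ant" noise.
    fs = 1000.0
    t = np.arange(0, 2.0, 1 / fs)
    signal = np.sin(2 * np.pi * 3 * t) + 0.3 * np.random.default_rng(4).normal(size=t.size)

    # Decompose, keep only the coarse approximation as the substrate response,
    # and take the remainder as the ant-generated component.
    coeffs = pywt.wavedec(signal, "db4", level=6)
    coeffs_substrate = [coeffs[0]] + [np.zeros_like(c) for c in coeffs[1:]]
    substrate = pywt.waverec(coeffs_substrate, "db4")[: signal.size]
    ant_component = signal - substrate
    ```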

  9. Long-Term Trial Results Show No Mortality Benefit from Annual Prostate Cancer Screening

    Cancer.gov

    Thirteen year follow-up data from the Prostate, Lung, Colorectal and Ovarian (PLCO) cancer screening trial show higher incidence but similar mortality among men screened annually with the prostate-specific antigen (PSA) test and digital rectal examination

  10. Quantitative real-time PCR method with internal amplification control to quantify cyclopiazonic acid producing molds in foods.

    PubMed

    Rodríguez, Alicia; Werning, María L; Rodríguez, Mar; Bermúdez, Elena; Córdoba, Juan J

    2012-12-01

    A quantitative TaqMan real-time PCR (qPCR) method that includes an internal amplification control (IAC) to quantify cyclopiazonic acid (CPA)-producing molds in foods has been developed. A specific primer pair (dmaTF/dmaTR) and a TaqMan probe (dmaTp) were designed on the basis of dmaT gene which encodes the enzyme dimethylallyl tryptophan synthase involved in the biosynthesis of CPA. The IAC consisted of a 105 bp chimeric DNA fragment containing a region of the hly gene of Listeria monocytogenes. Thirty-two mold reference strains representing CPA producers and non-producers of different mold species were used in this study. All strains were tested for CPA production by high-performance liquid chromatography-mass spectrometry (HPLC-MS). The functionality of the designed qPCR method was demonstrated by the high linear relationship of the standard curves relating to the dmaT gene copy numbers and the Ct values obtained from the different CPA producers tested. The ability of the qPCR protocol to quantify CPA-producing molds was evaluated in different artificially inoculated foods. A good linear correlation was obtained over the range 1-4 log cfu/g in the different food matrices. The detection limit in all inoculated foods ranged from 1 to 2 log cfu/g. This qPCR protocol including an IAC showed good efficiency to quantify CPA-producing molds in naturally contaminated foods avoiding false negative results. This method could be used to monitor the CPA producers in the HACCP programs to prevent the risk of CPA formation throughout the food chain. Copyright © 2012 Elsevier Ltd. All rights reserved.
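
    A minimal standard-curve sketch of the kind of qPCR quantification described above (Ct versus log10 gene copy number); the slopes, intercepts, and the unknown sample's Ct are illustrative:

    ```python
    import numpy as np

    # Illustrative standard curve: Ct versus log10 of dmaT gene copy number.
    log_copies = np.array([3.0, 4.0, 5.0, 6.0, 7.0])
    ct = np.array([33.1, 29.8, 26.4, 23.0, 19.7])

    slope, intercept = np.polyfit(log_copies, ct, 1)
    efficiency = 10 ** (-1 / slope) - 1          # ~1.0 means 100% amplification efficiency
    r2 = np.corrcoef(log_copies, ct)[0, 1] ** 2

    # Quantify an unknown sample from its Ct value:
    ct_unknown = 27.5
    est_log_copies = (ct_unknown - intercept) / slope
    print(f"slope={slope:.2f}, E={efficiency:.2f}, R^2={r2:.3f}, "
          f"unknown ~ 10^{est_log_copies:.2f} copies")
    ```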

  11. Identifying and quantifying interactions in a laboratory swarm

    NASA Astrophysics Data System (ADS)

    Puckett, James; Kelley, Douglas; Ouellette, Nicholas

    2013-03-01

    Emergent collective behavior, such as in flocks of birds or swarms of bees, is exhibited throughout the animal kingdom. Many models have been developed to describe swarming and flocking behavior using systems of self-propelled particles obeying simple rules or interacting via various potentials. However, due to experimental difficulties and constraints, little empirical data exists for characterizing the exact form of the biological interactions. We study laboratory swarms of flying Chironomus riparius midges, using stereoimaging and particle tracking techniques to record three-dimensional trajectories for all the individuals in the swarm. We describe methods to identify and quantify interactions by examining these trajectories, and report results on interaction magnitude, frequency, and mutuality.

  12. Quantifying bushfire penetration into urban areas in Australia

    NASA Astrophysics Data System (ADS)

    Chen, Keping; McAneney, John

    2004-06-01

    The extent and trajectory of bushfire penetration at the bushland-urban interface are quantified using data from major historical fires in Australia. We find that the maximum distance at which homes are destroyed is typically less than 700 m. The probability of home destruction emerges as a simple linear and decreasing function of distance from the bushland-urban boundary but with a variable slope that presumably depends upon fire regime and human intervention. The collective data suggest that the probability of home destruction at the forest edge is around 60%. Spatial patterns of destroyed homes display significant neighbourhood clustering. Our results provide revealing spatial evidence for estimating fire risk to properties and suggest an ember-attack model.
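
    A minimal sketch of the linear distance-decay model described above, fitted to illustrative binned data (the ~60% edge probability and ~700 m maximum distance motivate the numbers, which are not the paper's):

    ```python
    import numpy as np

    # Illustrative destruction probabilities by distance bin from the bushland edge.
    distance_m = np.array([25, 75, 150, 250, 350, 500, 650])
    p_destroyed = np.array([0.58, 0.52, 0.45, 0.36, 0.27, 0.13, 0.03])

    slope, intercept = np.polyfit(distance_m, p_destroyed, 1)
    print(f"P(destruction) ~ {intercept:.2f} {slope:+.4f} * distance(m)")
    print(f"zero crossing at ~{-intercept / slope:.0f} m")
    ```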

  13. Quantifying antiviral activity optimizes drug combinations against hepatitis C virus infection.

    PubMed

    Koizumi, Yoshiki; Ohashi, Hirofumi; Nakajima, Syo; Tanaka, Yasuhito; Wakita, Takaji; Perelson, Alan S; Iwami, Shingo; Watashi, Koichi

    2017-02-21

    With the introduction of direct-acting antivirals (DAAs), treatment against hepatitis C virus (HCV) has significantly improved. To manage and control this worldwide infectious disease better, the "best" multidrug treatment is needed, based on scientific evidence. However, no method is available that systematically quantifies and compares the antiviral efficacy and drug-resistance profiles of drug combinations. Based on experimental anti-HCV profiles in a cell culture system, we quantified the instantaneous inhibitory potential (IIP), which is the logarithm of the reduction in viral replication events, for both single drugs and multiple-drug combinations. From the calculated IIP of 15 anti-HCV drugs from different classes [telaprevir, danoprevir, asunaprevir, simeprevir, sofosbuvir (SOF), VX-222, dasabuvir, nesbuvir, tegobuvir, daclatasvir, ledipasvir, IFN-α, IFN-λ1, cyclosporin A, and SCY-635], we found that the nucleoside polymerase inhibitor SOF had one of the largest potentials to inhibit viral replication events. We also compared intrinsic antiviral activities of a panel of drug combinations. Our quantification analysis clearly indicated an advantage of triple-DAA treatments over double-DAA treatments, with triple-DAA treatments showing enhanced antiviral activity and a significantly lower probability for drug resistance to emerge at clinically relevant drug concentrations. Our framework provides quantitative information to consider in designing multidrug strategies before costly clinical trials.
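
    A sketch of an IIP calculation under the commonly used median-effect formulation, in which Bliss independence makes combination IIPs additive; the doses, IC50s, and slope parameters below are illustrative, not the paper's fits:

    ```python
    import numpy as np

    def iip(dose, ic50, m):
        """Instantaneous inhibitory potential: log10 reduction in single-round
        replication events under the median-effect model (a common formulation)."""
        fu = 1.0 / (1.0 + (dose / ic50) ** m)   # unaffected fraction of events
        return -np.log10(fu)

    # Under Bliss independence the unaffected fractions multiply, so IIPs add:
    iip_combo = iip(100, 25, 1.8) + iip(50, 40, 1.2) + iip(10, 5, 2.5)
    print(f"triple-combination IIP ~ {iip_combo:.1f} log10 units")
    ```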

  14. Quantifying cardiorespiratory responses resulting from speed and slope increments during motorized treadmill propulsion among manual wheelchair users.

    PubMed

    Gauthier, Cindy; Grangeon, Murielle; Ananos, Ludivine; Brosseau, Rachel; Gagnon, Dany H

    2017-09-01

    Cardiorespiratory fitness assessment and training among manual wheelchair (MW) users are predominantly done with an arm-crank ergometer. However, arm-crank ergometer biomechanics differ substantially from MW propulsion biomechanics. This study aimed to quantify cardiorespiratory responses resulting from speed and slope increments during MW propulsion on a motorized treadmill and to calculate a predictive equation based on speed and slope for estimating peak oxygen uptake (VO2peak) in MW users. In total, 17 long-term MW users completed 12 MW propulsion periods (PP), each lasting 2 min, on a motorized treadmill, in a random order. Each PP was separated by a 2-min rest. PPs were characterized by a combination of 3 speeds (0.6, 0.8 and 1.0 m/s) and 4 slopes (0°, 2.7°, 3.6° and 4.8°). Key cardiorespiratory outcome measures (VO2, heart rate, respiratory rate, minute ventilation and tidal volume) were recorded with a gas-exchange analysis system. Rate of perceived exertion (RPE) was measured with the modified 10-point Borg scale after each PP. For the 14 participants who completed the test, cardiorespiratory responses increased in response to speed and/or slope increments, except those recorded between the 3.6° and 4.8° slopes, for which most outcome measures were comparable. The RPE was positively associated with cardiorespiratory response (rs ≥ 0.85). A VO2 predictive equation (R² = 99.7%) based on speed and slope for each PP was computed. This equation informed the development of a future testing protocol to linearly increase VO2 via 1-min stages during treadmill MW propulsion. Increasing speed and slope while propelling a MW on a motorized treadmill increases cardiorespiratory response along with RPE. RPE can be used to easily and accurately monitor cardiorespiratory responses during MW exercise. VO2 can be predicted to some extent from speed and slope during MW propulsion. A testing protocol is proposed to assess cardiorespiratory fitness.
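
    A minimal sketch of a speed-and-slope VO2 predictive equation via ordinary least squares; the coefficients and noise below are fitted to illustrative data, not the study's:

    ```python
    import numpy as np

    # Illustrative design: the 12 propulsion periods (3 speeds x 4 slopes).
    speed = np.array([0.6, 0.8, 1.0] * 4)                 # m/s
    slope = np.repeat([0.0, 2.7, 3.6, 4.8], 3)            # degrees
    vo2 = (3.5 + 8.0 * speed + 1.6 * slope
           + np.random.default_rng(0).normal(0, 0.4, 12))  # toy VO2, mL/kg/min

    X = np.column_stack([np.ones_like(speed), speed, slope])
    beta, *_ = np.linalg.lstsq(X, vo2, rcond=None)
    print("VO2 ~ {:.1f} + {:.1f}*speed + {:.1f}*slope".format(*beta))
    ```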

  15. Quantifying alteration of river flow regime by large reservoirs in France

    NASA Astrophysics Data System (ADS)

    Cipriani, Thomas; Sauquet, Eric

    2017-04-01

    Reservoirs may strongly modify river flow regime. Knowing the alterations is important to better understand the biological and physical patterns along the river network. However, data are not necessarily available to carry out an analysis of modifications at a national scale, e.g. due to industrial interests or to lack of measurements. The objective of this study is to quantify the changes in a set of hydrological indices due to large reservoirs in France by combining different data sources. The analysis is based on a comparison between influenced discharges (observed discharges) and natural discharges available from: (i) gauging stations upstream of the dam, (ii) regionalization procedures (Sauquet et al., 2008; Sauquet and Catalogne, 2011; Cipriani et al., 2012), or (iii) historical data free from human influence close to the dam location. The impact of large reservoirs is assessed considering different facets of the river flow regime, including flood quantiles, low flow characteristics, quantiles from the flow duration curve and the twelve mean monthly discharges. The departures from the index representative of natural conditions quantify the effect of the reservoir management on the river flow regime. The analysis is based on 62 study cases. Results show a large spread in terms of impact depending on the purposes of the reservoirs and the season of interest. Results also point out inconsistencies in data (the water balance between outflow and inflow downstream of the dam is not warranted) due to uncertainties in mean monthly discharges and to the imperfect knowledge of inflows and outflows. Lastly, we suggest a typology of hydrological alterations based on the purposes of the reservoirs. Cipriani T., Toilliez T., Sauquet E. (2012). Estimating 10 year return period peak flows and flood durations at ungauged locations in France. La Houille Blanche, 4-5: 5-13, doi: 10.1051/lhb/2012024. Sauquet E., Catalogne C. (2011). Comparison of catchment grouping methods for

  16. Cosmogenic 36Cl in karst waters: Quantifying contributions from atmospheric and bedrock sources

    NASA Astrophysics Data System (ADS)

    Johnston, V. E.; McDermott, F.

    2009-12-01

    Improved reconstructions of cosmogenic isotope production through time are crucial to understand past solar variability. As a preliminary step to derive atmospheric 36Cl/Cl solar proxy time-series from speleothems, we quantify 36Cl sources in cave dripwaters. Atmospheric 36Cl fallout rates are a potential proxy for solar output; however extraneous 36Cl derived from in-situ production in cave host-rocks could complicate the solar signal. Results from numerical modeling and preliminary geochemical data presented here show that the atmospheric 36Cl source dominates in many, but not all cave dripwaters. At favorable low elevation, mid-latitude sites, 36Cl based speleothem solar irradiance reconstructions could extend back to 500 ka, with a possible centennial scale temporal resolution. This would represent a marginal improvement in resolution compared with existing polar ice core records, with the added advantages of a wider geographic range, independent U-series constrained chronology, and the potential for contemporaneous climate signals within the same speleothem material.

  17. SU-E-T-789: Validation of 3DVH Accuracy On Quantifying Delivery Errors Based On Clinical Relevant DVH Metrics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ma, T; Kumaraswamy, L

    Purpose: Detection of treatment delivery errors is important in radiation therapy. However, accurate quantification of delivery errors is also of great importance. This study aims to evaluate the 3DVH software's ability to accurately quantify delivery errors. Methods: Three VMAT plans (prostate, H&N and brain) were randomly chosen for this study. First, we evaluated whether delivery errors could be detected by gamma evaluation. Conventional per-beam IMRT QA was performed with the ArcCHECK diode detector for the original plans and for the following modified plans: (1) induced dose difference errors up to ±4.0%, (2) control point (CP) deletion (3 to 10 CPs were deleted) and (3) gantry angle shift error (3° uniform shift). 2D and 3D gamma evaluations were performed for all plans through SNC Patient and 3DVH, respectively. Subsequently, we investigated the accuracy of 3DVH analysis for all cases. This part evaluated, using the Eclipse TPS plans as the standard, whether 3DVH can accurately model the changes in clinically relevant metrics caused by the delivery errors. Results: 2D evaluation seemed to be more sensitive to delivery errors. The average differences between Eclipse-predicted and 3DVH results for each pair of specific DVH constraints were within 2% for all three types of error-induced treatment plans, illustrating that 3DVH is fairly accurate in quantifying the delivery errors. Another interesting observation was that even though the gamma pass rates for the error plans were high, the DVHs showed significant differences between the original plan and the error-induced plans in both Eclipse and 3DVH analysis. Conclusion: The 3DVH software is shown to accurately quantify the error in delivered dose based on clinically relevant DVH metrics, which a conventional gamma-based pre-treatment QA might not necessarily detect.

  18. Using High Resolution Regional Climate Models to Quantify the Snow Albedo Feedback in a Region of Complex Terrain

    NASA Astrophysics Data System (ADS)

    Letcher, T.; Minder, J. R.

    2015-12-01

    High resolution regional climate models are used to characterize and quantify the snow albedo feedback (SAF) over the complex terrain of the Colorado Headwaters region. Three pairs of 7-year control and pseudo-global-warming simulations (with horizontal grid spacings of 4, 12, and 36 km) are used to study how the SAF modifies the regional climate response to a large-scale thermodynamic perturbation. The SAF substantially enhances warming within the Headwaters domain, locally by as much as 5 °C in regions of snow loss. The SAF also increases the inter-annual variability of the springtime warming within the Headwaters domain under the perturbed climate. Linear feedback analysis is used to quantify the strength of the SAF. The SAF attains a maximum value of 4 W m⁻² K⁻¹ during April, when snow loss coincides with strong incoming solar radiation. On sub-seasonal timescales, simulations at 4 km and 12 km horizontal grid spacing show good agreement in the strength and timing of the SAF, whereas the 36 km simulation shows greater discrepancies that are tied to differences in snow accumulation and ablation caused by smoother terrain. An analysis of the regional energy budget shows that transport by atmospheric motion acts as a negative feedback to regional warming, damping the effects of the SAF. On the mesoscale, this transport causes non-local warming in locations with no snow. The methods presented here can be used generally to quantify the role of the SAF in other regional climate modeling experiments.
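
    A minimal sketch of a linear feedback estimate, regressing the albedo-induced change in absorbed shortwave on the surface temperature change; the monthly pairs below are hypothetical:

    ```python
    import numpy as np

    # Hypothetical control-to-perturbed differences, one pair per spring month.
    d_T = np.array([1.8, 2.3, 3.1, 4.0, 4.6])            # warming, K
    d_SWnet = np.array([6.5, 9.0, 12.8, 16.1, 18.9])     # absorbed shortwave, W m^-2

    feedback, _ = np.polyfit(d_T, d_SWnet, 1)
    print(f"SAF strength ~ {feedback:.1f} W m^-2 K^-1")  # cf. the ~4 W m^-2 K^-1 April peak
    ```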

  19. Quantifying selective elbow movements during an exergame in children with neurological disorders: a pilot study.

    PubMed

    van Hedel, Hubertus J A; Häfliger, Nadine; Gerber, Corinna N

    2016-10-21

    It is difficult to distinguish between restorative and compensatory mechanisms underlying (pediatric) neurorehabilitation, as objective measures assessing selective voluntary motor control (SVMC) are scarce. We aimed to quantify SVMC of elbow movements in children with brain lesions. Children played an airplane game with the glove-based YouGrabber system. Participants were instructed to steer an airplane on a screen through a cloud-free path by correctly applying bilateral elbow flexion and extension movements. Game performance measures were (i) % time on the correct path and (ii) similarity between the ideal flight path and the actually flown path. SVMC was quantified by calculating a correlation coefficient between the derivative of the ideal path and elbow movements. A therapist scored whether the child had used compensatory movements. Thirty-three children with brain lesions (11 girls; 12.6 ± 3.6 years) participated. Clinical motor and cognitive scores correlated moderately with SVMC (0.50-0.74). Receiver operating characteristic analyses showed that SVMC differentiated well between compensatory and physiological movements, and better than clinical and game performance measures did. We conclude that a simple measure assessed while playing a game appears promising for quantifying SVMC. We propose how to improve the methodology, and how this approach can be easily extended to other joints.

  20. A mathematical method for quantifying in vivo mechanical behaviour of heel pad under dynamic load.

    PubMed

    Naemi, Roozbeh; Chatzistergos, Panagiotis E; Chockalingam, Nachiappan

    2016-03-01

    Mechanical behaviour of the heel pad, as a shock attenuating interface during a foot strike, determines the loading on the musculoskeletal system during walking. Mathematical models that describe the force-deformation relationship of the heel pad structure can determine the mechanical behaviour of the heel pad under load. Hence, the purpose of this study was to propose a method of quantifying the heel pad stress-strain relationship using force-deformation data from an indentation test. The energy input and energy returned densities were calculated by numerically integrating the area below the stress-strain curve during loading and unloading, respectively. Elastic energy and energy absorbed densities were calculated as the sum of and the difference between energy input and energy returned densities, respectively. By fitting the energy function, derived from a nonlinear viscoelastic model, to the energy density-strain data, the elastic and viscous model parameters were quantified. The viscous and elastic exponent model parameters were significantly correlated with maximum strain, indicating the need to perform indentation tests at realistic maximum strains relevant to walking. The proposed method was shown to differentiate between the elastic and viscous components of the heel pad response to loading and to allow quantification of the corresponding stress-strain model parameters.
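
    A minimal sketch of the energy-density bookkeeping described above, with toy loading and unloading curves; the definitions follow the abstract, but the curves do not come from the study:

    ```python
    import numpy as np

    # Toy indentation stress-strain curves (kPa) for loading and unloading.
    strain_load = np.linspace(0.0, 0.4, 50)
    stress_load = 80 * strain_load ** 2
    strain_unload = strain_load[::-1]
    stress_unload = 60 * strain_unload ** 2

    e_input = np.trapz(stress_load, strain_load)
    e_returned = -np.trapz(stress_unload, strain_unload)   # path runs in reverse
    e_elastic = e_input + e_returned                       # sum, per the abstract
    e_absorbed = e_input - e_returned                      # difference: viscous losses
    print(f"input={e_input:.2f}, returned={e_returned:.2f}, "
          f"absorbed={e_absorbed:.2f} kPa")
    ```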

  1. Probabilistic structural analysis to quantify uncertainties associated with turbopump blades

    NASA Technical Reports Server (NTRS)

    Nagpal, Vinod K.; Rubinstein, Robert; Chamis, Christos C.

    1987-01-01

    A probabilistic study of turbopump blades has been in progress at NASA Lewis Research Center over the last two years. The objectives of this study are to evaluate the effects of uncertainties in geometry and material properties on the structural response of the turbopump blades and to evaluate the tolerance limits on the design. A methodology based on a probabilistic approach has been developed to quantify the effects of the random uncertainties. The results of this study indicate that only the variations in geometry have significant effects.

  2. A probabilistic approach to quantifying hydrologic thresholds regulating migration of adult Atlantic salmon into spawning streams

    NASA Astrophysics Data System (ADS)

    Lazzaro, G.; Soulsby, C.; Tetzlaff, D.; Botter, G.

    2017-03-01

    Atlantic salmon is an economically and ecologically important fish species, whose survival is dependent on successful spawning in headwater rivers. Streamflow dynamics often have a strong control on spawning because fish require sufficiently high discharges to move upriver and enter spawning streams. However, these streamflow effects are modulated by biological factors such as the number and the timing of returning fish in relation to the annual spawning window in the fall/winter. In this paper, we develop and apply a novel probabilistic approach to quantify these interactions using a parsimonious outflux-influx model linking the number of female salmon emigrating (i.e., outflux) and returning (i.e., influx) to a spawning stream in Scotland. The model explicitly accounts for the interannual variability of the hydrologic regime and the hydrological connectivity of spawning streams to main rivers. Model results are evaluated against a detailed long-term (40 years) hydroecological data set that includes annual fluxes of salmon, allowing us to explicitly assess the role of discharge variability. The satisfactory model results show quantitatively that hydrologic variability contributes to the observed dynamics of salmon returns, with a good correlation between the positive (negative) peaks in the immigration data set and the exceedance (nonexceedance) probability of a threshold flow (0.3 m³/s). Importantly, model performance deteriorates when the interannual variability of flow regime is disregarded. The analysis suggests that flow thresholds and hydrological connectivity for spawning return represent a quantifiable and predictable feature of salmon rivers, which may be helpful in decision making where flow regimes are altered by water abstractions.
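
    A minimal sketch of the hydrologic side of such an analysis: the interannual spread of the probability that daily discharge exceeds the 0.3 m³/s threshold during a fall/winter window. The discharge series is synthetic and the window months are an assumption:

    ```python
    import numpy as np
    import pandas as pd

    rng = np.random.default_rng(3)
    q = pd.Series(np.exp(rng.normal(-1.5, 0.8, 40 * 365)),   # toy daily discharge, m^3/s
                  index=pd.date_range("1977-01-01", periods=40 * 365, freq="D"))

    window = q[q.index.month.isin([10, 11, 12])]   # assumed spawning window
    exceed = (window >= 0.3).groupby(window.index.year).mean()
    print(exceed.describe())   # interannual spread of the exceedance probability
    ```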

  3. Quantifying Stock Return Distributions in Financial Markets.

    PubMed

    Botta, Federico; Moat, Helen Susannah; Stanley, H Eugene; Preis, Tobias

    2015-01-01

    Being able to quantify the probability of large price changes in stock markets is of crucial importance in understanding financial crises that affect the lives of people worldwide. Large changes in stock market prices can arise abruptly, within a matter of minutes, or develop across much longer time scales. Here, we analyze a dataset comprising the stocks forming the Dow Jones Industrial Average at second-by-second resolution in the period from January 2008 to July 2010 in order to quantify the distribution of changes in market prices at a range of time scales. We find that the tails of the distributions of logarithmic price changes, or returns, exhibit power law decays for time scales ranging from 300 seconds to 3600 seconds. For larger time scales, we find that the distributions' tails exhibit exponential decay. Our findings may inform the development of models of market behavior across varying time scales.
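
    A minimal sketch of tail-exponent estimation on log returns at two aggregation scales, using a Hill estimator on synthetic heavy-tailed prices; this is a standard tail-fitting approach, not necessarily the authors' exact procedure:

    ```python
    import numpy as np

    def hill_tail_exponent(returns, k):
        # Hill estimator of the power-law tail index from the k largest |returns|.
        x = np.sort(np.abs(returns))[-k:]
        return 1.0 / np.mean(np.log(x / x[0]))

    rng = np.random.default_rng(0)
    prices = np.cumprod(1 + 1e-4 * rng.standard_t(df=3, size=100_000))  # toy ticks
    for scale in (300, 3600):                    # aggregation scale in "seconds"
        r = np.diff(np.log(prices[::scale]))
        print(scale, hill_tail_exponent(r, k=max(5, r.size // 5)))
    ```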

  4. Quantifying and Reducing Light Pollution

    NASA Astrophysics Data System (ADS)

    Gokhale, Vayujeet; Caples, David; Goins, Jordan; Herdman, Ashley; Pankey, Steven; Wren, Emily

    2018-06-01

    We describe the current level of light pollution in and around Kirksville, Missouri and around Anderson Mesa near Flagstaff, Arizona. We quantify the amount of light that is projected up towards the sky, instead of the ground, using Unihedron sky quality meters installed at various locations. We also present results from DSLR photometry of several standard stars, and compare the photometric quality of the data collected at locations with varying levels of light pollution. Presently, light fixture shields and ‘warm-colored’ lights are being installed on Truman State University’s campus in order to reduce light pollution. We discuss the experimental procedure we use to test the effectiveness of the different light fixture shields in a controlled setting inside the Del and Norma Robison Planetarium. Apart from negatively affecting the quality of the night sky for astronomers, light pollution adversely affects migratory patterns of some animals and sleep patterns in humans, increases our carbon footprint, and wastes resources and money. This problem threatens to become particularly acute with the increasing use of outdoor LED lamps. We conclude with a call to action to all professional and amateur astronomers to act against the growing nuisance of light pollution.

  5. Quantifying Human Visible Color Variation from High Definition Digital Images of Orb Web Spiders.

    PubMed

    Tapia-McClung, Horacio; Ajuria Ibarra, Helena; Rao, Dinesh

    2016-01-01

    Digital processing and analysis of high resolution images of 30 individuals of the orb web spider Verrucosa arenata were performed to extract and quantify human visible colors present on the dorsal abdomen of this species. Color extraction was performed with minimal user intervention using an unsupervised algorithm to determine groups of colors on each individual spider, which were then analyzed in order to quantify and classify the colors obtained, both spatially and using energy and entropy measures of the digital images. Analysis shows that the colors cover a small region of the visible spectrum, are not spatially homogeneously distributed over the patterns, and that, from an entropic point of view, colors covering a smaller region of the whole pattern carry more information than colors covering a larger region. This study demonstrates the use of processing tools to create automatic systems to extract valuable information from digital images that are precise, efficient and helpful for the understanding of the underlying biology.
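
    A minimal sketch of an entropy measure over color-cluster labels; the exact definition used in the paper is assumed, and the label image is synthetic:

    ```python
    import numpy as np

    def color_entropy(labels):
        """Shannon entropy (bits) of a color-cluster label image: lower entropy
        means a few colors dominate the pattern."""
        _, counts = np.unique(labels, return_counts=True)
        p = counts / counts.sum()
        return -np.sum(p * np.log2(p))

    # Toy "abdomen pattern": cluster labels per pixel after color segmentation.
    rng = np.random.default_rng(2)
    labels = rng.choice(4, size=(100, 100), p=[0.6, 0.25, 0.1, 0.05])
    print(f"pattern entropy = {color_entropy(labels):.2f} bits")
    ```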

  6. Quantifying Human Visible Color Variation from High Definition Digital Images of Orb Web Spiders

    PubMed Central

    Ajuria Ibarra, Helena; Rao, Dinesh

    2016-01-01

    Digital processing and analysis of high resolution images of 30 individuals of the orb web spider Verrucosa arenata were performed to extract and quantify human visible colors present on the dorsal abdomen of this species. Color extraction was performed with minimal user intervention using an unsupervised algorithm to determine groups of colors on each individual spider, which were then analyzed in order to quantify and classify the colors obtained, both spatially and using energy and entropy measures of the digital images. Analysis shows that the colors cover a small region of the visible spectrum, are not spatially homogeneously distributed over the patterns, and that, from an entropic point of view, colors covering a smaller region of the whole pattern carry more information than colors covering a larger region. This study demonstrates the use of processing tools to create automatic systems to extract valuable information from digital images that are precise, efficient and helpful for the understanding of the underlying biology. PMID:27902724

  7. Quantifying induced effects of subsurface renewable energy storage

    NASA Astrophysics Data System (ADS)

    Bauer, Sebastian; Beyer, Christof; Pfeiffer, Tilmann; Boockmeyer, Anke; Popp, Steffi; Delfs, Jens-Olaf; Wang, Bo; Li, Dedong; Dethlefsen, Frank; Dahmke, Andreas

    2015-04-01

    New methods and technologies for energy storage are required for the transition to renewable energy sources. Subsurface energy storage systems such as salt caverns or porous formations offer the possibility of storing large amounts of energy or substances. When employing these systems, an adequate system and process understanding is required in order to assess the feasibility of the individual storage option at the respective site and to predict the complex, interacting effects induced. This understanding is the basis for assessing the potential as well as the risks connected with a sustainable usage of these storage options, especially when considering possible mutual influences. To achieve this aim, in this work synthetic scenarios for the use of the geological underground as an energy storage system are developed and parameterized. The scenarios are designed to represent typical conditions in North Germany. The types of subsurface use investigated here include gas storage and heat storage in porous formations. The scenarios are numerically simulated and interpreted with regard to risk analysis and effect forecasting. For this, the numerical simulators Eclipse and OpenGeoSys are used. The latter is enhanced to include the required coupled hydraulic, thermal, geomechanical and geochemical processes. Using the simulated and interpreted scenarios, the induced effects are quantified individually and monitoring concepts for observing these effects are derived. This presentation will detail the general investigation concept used and analyze the parameter availability for this type of model application. Then the process implementation and numerical methods required and applied for simulating the induced effects of subsurface storage are detailed and explained. Application examples show the developed methods and quantify induced effects and storage sizes for the typical settings parameterized. This work is part of the ANGUS+ project, funded by the German Ministry

  8. Conventional physical therapy and physical therapy based on reflex stimulation showed similar results in children with myelomeningocele.

    PubMed

    Aizawa, Carolina Y P; Morales, Mariana P; Lundberg, Carolina; Moura, Maria Clara D Soares de; Pinto, Fernando C G; Voos, Mariana C; Hasue, Renata H

    2017-03-01

    We aimed to investigate whether infants with myelomeningocele would improve their motor ability and functional independence after ten sessions of physical therapy, and to compare the outcomes of conventional physical therapy (CPT) to a physical therapy program based on reflex stimulation (RPT). Twelve children were allocated to CPT (n = 6, age 18.3 months) or RPT (n = 6, age 18.2 months). The RPT involved proprioceptive neuromuscular facilitation. Children were assessed with the Gross Motor Function Measure and the Pediatric Evaluation of Disability Inventory before and after treatment. Mann-Whitney tests compared the improvement on the two scales between CPT and RPT, and Wilcoxon tests compared scores before versus after treatment within each group. Possible correlations between the two scales were tested with Spearman correlation coefficients. Both groups showed improvement on self-care and mobility domains of both scales. There were no differences between the groups before or after the intervention. CPT and RPT showed similar results after ten weeks of treatment.
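
    A minimal sketch of the reported test battery with SciPy, on illustrative gain scores; the variable names and numbers are hypothetical:

    ```python
    import numpy as np
    from scipy import stats

    # Illustrative improvement scores on one scale for each group, plus pre/post
    # scores for the within-group comparison.
    cpt_gain = np.array([4, 6, 3, 5, 7, 4])
    rpt_gain = np.array([5, 4, 6, 3, 6, 5])
    pre = np.array([40, 35, 50, 42, 38, 45])
    post = pre + cpt_gain

    print(stats.mannwhitneyu(cpt_gain, rpt_gain))   # between-group improvement
    print(stats.wilcoxon(pre, post))                # within-group, pre vs post
    print(stats.spearmanr(cpt_gain, rpt_gain))      # association between scales
    ```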

  9. Quantifying the costs and benefits of occupational health and safety interventions at a Bangladesh shipbuilding company.

    PubMed

    Thiede, Irene; Thiede, Michael

    2015-01-01

    This study is the first cost-benefit analysis (CBA) of occupational health and safety (OHS) in a low-income country. It focuses on one of the largest shipbuilding companies in Bangladesh, where globally recognised Occupational Health and Safety Advisory Services (OHSAS) 18001 certification was achieved in 2012. The study examines the relative costs of implementing OHS measures against qualitative and quantifiable benefits of implementation in order to determine whether OHSAS measures are economically advantageous. Quantifying past costs and benefits and discounting future ones, this study looks at the returns of OHS measures at Western Marine Shipbuilding Company Ltd. Costs included investments in workplace and environmental safety, a new clinic that also serves the community, and personal protective equipment (PPE) and training. The results are impressive: previously high injury statistics dropped to close to zero. OHS measures decrease injuries, increase efficiency, and bring income security to workers' families. Certification has proven a competitive edge for the shipyard, resulting in access to greater markets. Intangible benefits such as trust, motivation and security are deemed crucial in the CBA, and this study finds the high investments made are difficult to offset with quantifiable benefits alone.

  10. An uncertainty-based framework to quantifying climate change impacts on coastal flood vulnerability: case study of New York City.

    PubMed

    Zahmatkesh, Zahra; Karamouz, Mohammad

    2017-10-17

    estimated with and without consideration of climate change impacts and after implementation of LIDs. Results show that climate change has the potential to increase rainfall intensity, flood volume, floodplain extent, and flood depth in the watershed. The results also reveal that improving system resilience by reinforcing the adaptation capacity through implementing LIDs could mitigate flood vulnerability. Moreover, the results indicate the significant effect of uncertainties, arising from the factors' weights as well as the climate change impact modeling approach, on quantifying flood vulnerability. This study underlines the importance of developing applicable schemes to quantify coastal flood vulnerability for evolving future responses to adverse impacts of climate change.

  11. Anaphoric Reference to Quantified Antecedents: An Event-Related Brain Potential Study

    ERIC Educational Resources Information Center

    Filik, Ruth; Leuthold, Hartmut; Moxey, Linda M.; Sanford, Anthony J.

    2011-01-01

    We report an event-related brain potential (ERP) study examining how readers process sentences containing anaphoric reference to quantified antecedents. Previous studies indicate that positive (e.g. "many") and negative (e.g. "not many") quantifiers cause readers to focus on different sets of entities. For example in "Many of the fans attended the…

  12. Disruption of diphthamide synthesis genes and resulting toxin resistance as a robust technology for quantifying and optimizing CRISPR/Cas9-mediated gene editing.

    PubMed

    Killian, Tobias; Dickopf, Steffen; Haas, Alexander K; Kirstenpfad, Claudia; Mayer, Klaus; Brinkmann, Ulrich

    2017-11-13

    We have devised an effective and robust method for the characterization of gene-editing events. The efficacy of editing-mediated mono- and bi-allelic gene inactivation and integration events is quantified based on colony counts. The combination of diphtheria toxin (DT) and puromycin (PM) selection enables analyses of 10,000-100,000 individual cells, assessing hundreds of clones with inactivated genes per experiment. Mono- and bi-allelic gene inactivation are differentiated by DT resistance, which occurs only upon bi-allelic inactivation. PM resistance indicates integration. The robustness and generalizability of the method were demonstrated by quantifying the frequency of gene inactivation and cassette integration under different editing approaches: CRISPR/Cas9-mediated complete inactivation was ~30-50-fold more frequent than cassette integration. Mono-allelic inactivation without integration occurred >100-fold more frequently than integration. Assessment of gRNA length confirmed 20-mers to be the most effective length for inactivation, while 16-18-mers provided the highest overall integration efficacy. The overall efficacy was ~2-fold higher for CRISPR/Cas9 than for zinc-finger nuclease and was significantly increased upon modulation of non-homologous end joining or homology-directed repair. The frequencies and ratios of editing events were similar for two different DPH genes (independent of the target sequence or chromosomal location), which indicates that the optimization parameters identified with this method can be generalized.
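
    A small bookkeeping sketch of how the selection logic converts colony counts into editing frequencies; the counts are illustrative, not the paper's data:

    ```python
    # DT resistance = bi-allelic inactivation; PM resistance = cassette integration.
    cells_plated = 100_000
    dt_resistant = 1_800          # colonies with both alleles inactivated
    pm_resistant = 45             # colonies with the cassette integrated

    f_biallelic = dt_resistant / cells_plated
    f_integration = pm_resistant / cells_plated
    print(f"bi-allelic inactivation: {f_biallelic:.2%}")
    print(f"integration: {f_integration:.3%}  "
          f"(~{f_biallelic / f_integration:.0f}-fold less frequent)")
    ```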

  13. Quantifying parametric uncertainty in the Rothermel model

    Treesearch

    S. Goodrick

    2008-01-01

    The purpose of the present work is to quantify parametric uncertainty in the Rothermel wildland fire spread model, as implemented in the fire spread software most commonly used in the United States. This model consists of a non-linear system of equations that relates environmental variables (input parameter groups...

  14. New measurements quantify atmospheric greenhouse effect

    NASA Astrophysics Data System (ADS)

    Bhattacharya, Atreyee

    2012-10-01

    In spite of a large body of existing measurements of incoming short-wave solar radiation and outgoing long-wave terrestrial radiation at the surface of the Earth and, more recently, in the upper atmosphere, there are few observations documenting how radiation profiles change through the atmosphere—information that is necessary to fully quantify the greenhouse effect of Earth's atmosphere. Through the use of existing technology but employing improvements in observational techniques it may now be possible not only to quantify but also to understand how different components of the atmosphere (e.g., concentration of gases, cloud cover, moisture, and aerosols) contribute to the greenhouse effect. Using weather balloons equipped with radiosondes, Philipona et al. continuously measured radiation fluxes from the surface of Earth up to altitudes of 35 kilometers in the upper stratosphere. Combining data from flights conducted during both day and night with continuous 24-hour measurements made at the surface of the Earth, the researchers created radiation profiles of all four components necessary to fully capture the radiation budget of Earth, namely, the upward and downward short-wave and long-wave radiation as a function of altitude.

  15. Quantifying meta-correlations in financial markets

    NASA Astrophysics Data System (ADS)

    Kenett, Dror Y.; Preis, Tobias; Gur-Gershgoren, Gitit; Ben-Jacob, Eshel

    2012-08-01

    Financial markets are modular multi-level systems, in which the relationships between the individual components are not constant in time. Sudden changes in these relationships significantly affect the stability of the entire system, and vice versa. Our analysis is based on historical daily closing prices of the 30 components of the Dow Jones Industrial Average (DJIA) from March 15th, 1939 until December 31st, 2010. We quantify the correlation among these components by determining Pearson correlation coefficients, to investigate whether mean correlation of the entire portfolio can be used as a precursor for changes in the index return. To this end, we quantify the meta-correlation - the correlation of mean correlation and index return. We find that changes in index returns are significantly correlated with changes in mean correlation. Furthermore, we study the relationship between the index return and correlation volatility - the standard deviation of correlations for a given time interval. This parameter provides further evidence of the effect of the index on market correlations and their fluctuations. Our empirical findings provide new information and quantification of the index leverage effect, and have implications for risk management, portfolio optimization, and the increased stability of financial markets.
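
    A minimal sketch of a meta-correlation computation on synthetic return data: rolling mean pairwise correlation, then its correlation with the index return. The window length and toy data are assumptions:

    ```python
    import numpy as np
    import pandas as pd

    rng = np.random.default_rng(7)
    rets = pd.DataFrame(rng.normal(0, 0.01, (2000, 30)))   # 30 DJIA-like stocks
    index_ret = rets.mean(axis=1)

    window = 22   # roughly one trading month
    mean_corr = []
    for t in range(window, len(rets)):
        c = rets.iloc[t - window:t].corr().to_numpy()
        mean_corr.append(c[np.triu_indices_from(c, k=1)].mean())

    idx = index_ret.iloc[window:].to_numpy()
    meta_corr = np.corrcoef(mean_corr, idx)[0, 1]
    print(f"meta-correlation = {meta_corr:.2f}")
    ```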

  16. Using nonlinear methods to quantify changes in infant limb movements and vocalizations.

    PubMed

    Abney, Drew H; Warlaumont, Anne S; Haussman, Anna; Ross, Jessica M; Wallot, Sebastian

    2014-01-01

    The pairing of dynamical systems theory and complexity science brings novel concepts and methods to the study of infant motor development. Accordingly, this longitudinal case study presents a new approach to characterizing the dynamics of infant limb and vocalization behaviors. A single infant's vocalizations and limb movements were recorded from 51 to 305 days of age. On each recording day, accelerometers were placed on all four of the infant's limbs and an audio recorder was worn on the child's chest. Using nonlinear time series analysis methods, such as recurrence quantification analysis and Allan factor, we quantified changes in the stability and multiscale properties of the infant's behaviors across age as well as how these dynamics relate across modalities and effectors. We observed that particular changes in these dynamics preceded or coincided with the onset of various developmental milestones. For example, the largest changes in vocalization dynamics preceded the onset of canonical babbling. The results show that nonlinear analyses can help to understand the functional co-development of different aspects of infant behavior.
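
    A minimal sketch of an Allan factor computation on synthetic event times; for a Poisson process the Allan factor is ≈ 1, and clustering across time scales pushes it above 1:

    ```python
    import numpy as np

    def allan_factor(event_times, T):
        """Allan factor at counting window T: mean squared difference between
        adjacent window counts, normalized by twice the mean count."""
        edges = np.arange(event_times.min(), event_times.max(), T)
        counts, _ = np.histogram(event_times, bins=edges)
        d = np.diff(counts)
        return np.mean(d ** 2) / (2 * np.mean(counts))

    # Toy "vocalization onset" times from a Poisson-like process.
    rng = np.random.default_rng(5)
    events = np.cumsum(rng.exponential(0.5, 5000))
    for T in (1, 10, 100):
        print(T, round(allan_factor(events, T), 2))
    ```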

  17. Using nonlinear methods to quantify changes in infant limb movements and vocalizations

    PubMed Central

    Abney, Drew H.; Warlaumont, Anne S.; Haussman, Anna; Ross, Jessica M.; Wallot, Sebastian

    2014-01-01

    The pairing of dynamical systems theory and complexity science brings novel concepts and methods to the study of infant motor development. Accordingly, this longitudinal case study presents a new approach to characterizing the dynamics of infant limb and vocalization behaviors. A single infant's vocalizations and limb movements were recorded from 51 to 305 days of age. On each recording day, accelerometers were placed on all four of the infant's limbs and an audio recorder was worn on the child's chest. Using nonlinear time series analysis methods, such as recurrence quantification analysis and Allan factor, we quantified changes in the stability and multiscale properties of the infant's behaviors across age as well as how these dynamics relate across modalities and effectors. We observed that particular changes in these dynamics preceded or coincided with the onset of various developmental milestones. For example, the largest changes in vocalization dynamics preceded the onset of canonical babbling. The results show that nonlinear analyses can help to understand the functional co-development of different aspects of infant behavior. PMID:25161629

  18. Quantifying similarity of pore-geometry in nanoporous materials

    DOE PAGES

    Lee, Yongjin; Barthel, Senja D.; Dłotko, Paweł; ...

    2017-05-23

    In most applications of nanoporous materials the pore structure is as important as the chemical composition as a determinant of performance. For example, one can alter performance in applications like carbon capture or methane storage by orders of magnitude by only modifying the pore structure. For these applications it is therefore important to identify the optimal pore geometry and use this information to find similar materials. But the mathematical language and tools to identify materials with similar pore structures, but different composition, have been lacking. We develop a pore recognition approach to quantify similarity of pore structures and classify them using topological data analysis. This allows us to identify materials with similar pore geometries, and to screen for materials that are similar to given top-performing structures. Using methane storage as a case study, we also show that materials can be divided into topologically distinct classes requiring different optimization strategies.
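
    The flavor of such a comparison can be sketched with the open-source ripser and persim packages (our choice of tooling, not necessarily the authors' pipeline): persistence diagrams are computed for two point clouds standing in for pore-surface samples and compared by bottleneck distance.

    ```python
    # Hedged sketch of topological similarity between two point clouds.
    import numpy as np
    from ripser import ripser
    from persim import bottleneck

    rng = np.random.default_rng(2)
    cloud_a = rng.uniform(size=(100, 3))               # hypothetical pore sample A
    cloud_b = cloud_a + rng.normal(0, 0.02, (100, 3))  # slightly perturbed copy

    dgm_a = ripser(cloud_a)['dgms'][1]   # 1-dimensional (loop) features
    dgm_b = ripser(cloud_b)['dgms'][1]
    print(f"bottleneck distance: {bottleneck(dgm_a, dgm_b):.4f}")
    ```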

  19. Validation Results for LEWICE 2.0

    NASA Technical Reports Server (NTRS)

    Wright, William B.; Rutkowski, Adam

    1999-01-01

    A research project is underway at NASA Lewis to produce a computer code which can accurately predict ice growth under any meteorological conditions for any aircraft surface. This report presents results from version 2.0 of this code, which is called LEWICE. This version differs from previous releases in its robustness and its ability to reproduce results accurately for different spacing and time step criteria across computing platforms. It also differs in the extensive effort undertaken to compare the results in a quantified manner against the database of ice shapes that have been generated in the NASA Lewis Icing Research Tunnel (IRT). The results of the shape comparisons are analyzed to determine the range of meteorological conditions under which LEWICE 2.0 is within the experimental repeatability. This comparison shows that the average variation of LEWICE 2.0 from the experimental data is 7.2%, while the overall variability of the experimental data is 2.5%.

  20. Quantifying postfire aeolian sediment transport using rare earth element tracers

    USGS Publications Warehouse

    Dukes, David; Gonzales, Howell B.; Ravi, Sujith; Grandstaff, David E.; Van Pelt, R. Scott; Li, Junran; Wang, Guan; Sankey, Joel B.

    2018-01-01

    Grasslands, which provide fundamental ecosystem services in many arid and semiarid regions of the world, are undergoing rapid increases in fire activity and are highly susceptible to postfire-accelerated soil erosion by wind. A quantitative assessment of physical processes that integrates fire-wind erosion feedbacks is therefore needed relative to vegetation change, soil biogeochemical cycling, air quality, and landscape evolution. We investigated the applicability of a novel tracer technique—the use of multiple rare earth elements (REE)—to quantify soil transport by wind and to identify sources and sinks of wind-blown sediments at burned and unburned sites in a shrub-grass transition zone in the Chihuahuan Desert, NM, USA. Results indicate that the horizontal mass flux of wind-borne sediment increased approximately threefold following the fire. The REE tracer analysis of wind-borne sediments shows that the source of the horizontal mass flux in the unburned site was derived from bare microsites (88.5%), while in the burned site it was primarily sourced from shrub (42.3%) and bare (39.1%) microsites. Vegetated microsites which were predominantly sinks of aeolian sediments in the unburned areas became sediment sources following the fire. The burned areas showed a spatial homogenization of sediment tracers, highlighting a potential negative feedback on landscape heterogeneity induced by shrub encroachment into grasslands. Though fires are known to increase aeolian sediment transport, accompanying changes in the sources and sinks of wind-borne sediments may influence biogeochemical cycling and land degradation dynamics. Furthermore, our experiment demonstrated that REEs can be used as reliable tracers for field-scale aeolian studies.

  1. Quantifying capital goods for waste incineration.

    PubMed

    Brogaard, L K; Riber, C; Christensen, T H

    2013-06-01

    Materials and energy used for the construction of modern waste incineration plants were quantified. The data was collected from five incineration plants (72,000-240,000 tonnes per year) built in Scandinavia (Norway, Finland and Denmark) between 2006 and 2012. Concrete for the buildings was the main material used amounting to 19,000-26,000 tonnes per plant. The quantification further included six main materials, electronic systems, cables and all transportation. The energy used for the actual on-site construction of the incinerators was in the range 4000-5000 MW h. In terms of the environmental burden of producing the materials used in the construction, steel for the building and the machinery contributed the most. The material and energy used for the construction corresponded to the emission of 7-14 kg CO2 per tonne of waste combusted throughout the lifetime of the incineration plant. The assessment showed that, compared to data reported in the literature on direct emissions from the operation of incinerators, the environmental impacts caused by the construction of buildings and machinery (capital goods) could amount to 2-3% with respect to kg CO2 per tonne of waste combusted. Copyright © 2013 Elsevier Ltd. All rights reserved.
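
    A back-of-envelope check of this lifetime normalization, using hypothetical plant parameters chosen inside the ranges quoted above:

    ```python
    # Construction emissions normalized by lifetime waste throughput.
    # All input values are assumed for illustration, not taken from the study.
    construction_emissions_t_co2 = 25_000   # hypothetical total, tonnes CO2
    throughput_t_per_year = 150_000         # within the 72,000-240,000 range
    lifetime_years = 20                     # assumed service life

    kg_co2_per_tonne = construction_emissions_t_co2 * 1000 / (
        throughput_t_per_year * lifetime_years)
    print(f"{kg_co2_per_tonne:.1f} kg CO2 per tonne of waste combusted")
    # -> 8.3, inside the 7-14 kg CO2/tonne range reported in the abstract
    ```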

  2. Statistical physics approach to quantifying differences in myelinated nerve fibers

    PubMed Central

    Comin, César H.; Santos, João R.; Corradini, Dario; Morrison, Will; Curme, Chester; Rosene, Douglas L.; Gabrielli, Andrea; da F. Costa, Luciano; Stanley, H. Eugene

    2014-01-01

    We present a new method to quantify differences in myelinated nerve fibers. These differences range from morphologic characteristics of individual fibers to differences in macroscopic properties of collections of fibers. Our method uses statistical physics tools to improve on traditional measures, such as fiber size and packing density. As a case study, we analyze cross-sectional electron micrographs from the fornix of young and old rhesus monkeys using a semi-automatic detection algorithm to identify and characterize myelinated axons. We then apply a feature selection approach to identify the features that best distinguish between the young and old age groups, achieving a maximum accuracy of 94% when assigning samples to their age groups. This analysis shows that the best discrimination is obtained using the combination of two features: the fraction of occupied axon area and the effective local density. The latter is a modified calculation of axon density, which reflects how closely axons are packed. Our feature analysis approach can be applied to characterize differences that result from biological processes such as aging, damage from trauma or disease or developmental differences, as well as differences between anatomical regions such as the fornix and the cingulum bundle or corpus callosum. PMID:24676146
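
    As a hedged sketch of the classification step, the following fits a simple two-feature classifier on simulated "young" and "old" samples; the feature distributions and model are illustrative stand-ins, not the study's data or algorithm:

    ```python
    # Two-feature classification sketch (synthetic data, illustrative model).
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(3)
    n = 200
    young = np.column_stack([rng.normal(0.55, 0.05, n), rng.normal(1.2, 0.1, n)])
    old = np.column_stack([rng.normal(0.45, 0.05, n), rng.normal(1.0, 0.1, n)])
    X = np.vstack([young, old])           # [occupied fraction, local density]
    y = np.r_[np.zeros(n), np.ones(n)]    # 0 = young, 1 = old

    clf = LogisticRegression()
    scores = cross_val_score(clf, X, y, cv=5)
    print(f"cross-validated accuracy: {scores.mean():.2f}")
    ```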

  3. Infrared imaging to quantify the effects of nicotine-induced vasoconstriction in humans

    NASA Astrophysics Data System (ADS)

    Brunner, Siegfried; Kargel, Christian

    2009-05-01

    Smoking is the most significant source of preventable morbidity and premature mortality worldwide (WHO-2008). One of the many effects of nicotine is vasoconstriction which is triggered by the autonomic nervous system. The constriction of blood vessels e.g. of the skin's vascular bed is responsible for a decrease of the supply with oxygen and nutrients and a lowering of the skin temperature. We used infrared imaging to quantify temperature decreases caused by cigarette smoking in the extremities of smokers and also monitored heart rate as well as blood pressure. The results - including thermograms showing "temporary amputations" of the fingertips due to a significant temperature drop - can help increase the awareness of the dangers of smoking and the success of withdrawal programs. Surprisingly, in our control persons (3 brave non-smoking volunteers who smoked a cigarette) we also found temperature increases suggesting that vasodilation (widening of blood vessels) was provoked by cigarettes. To verify this unexpected finding and eliminate effects from the 4000 chemical compounds in the smoke, we repeated the experiment following a stringent protocol ruling out physiological and psychological influences with 9 habitual smokers and 17 nonsmokers who all chewed gum containing 2 mg of nicotine. Task-optimized digital image processing techniques (target detection, image-registration and -segmentation) were applied to the acquired infrared image sequences to automatically yield temperature plots of the fingers and palm. In this paper we present the results of our study in detail and show that smokers and non-smokers respond differently to the administration of nicotine.

  4. Quantifying Attachment and Antibiotic Resistance of Escherichia coli from Conventional and Organic Swine Manure.

    PubMed

    Zwonitzer, Martha R; Soupir, Michelle L; Jarboe, Laura R; Smith, Douglas R

    2016-03-01

    Broad-spectrum antibiotics are often administered to swine, contributing to the occurrence of antibiotic-resistant bacteria in their manure. During land application, the bacteria in swine manure preferentially attach to particles in the soil, affecting their transport in overland flow. However, a quantitative understanding of these attachment mechanisms is lacking, and their relationship to antibiotic resistance is unknown. The objective of this study is to examine the relationships between antibiotic resistance and attachment to very fine silica sand in Escherichia coli collected from swine manure. A total of 556 isolates were collected from six farms, two organic and four conventional (antibiotics fed prophylactically). Antibiotic resistance was quantified using 13 antibiotics at three minimum inhibitory concentrations: resistant, intermediate, and susceptible. Of the 556 isolates used in the antibiotic resistance assays, 491 were subjected to an attachment assay. Results show that isolates from conventional systems were significantly more resistant to amoxicillin, ampicillin, chlortetracycline, erythromycin, kanamycin, neomycin, streptomycin, tetracycline, and tylosin (P < 0.001). Results also indicate that E. coli isolated from conventional systems attached to very fine silica sand at significantly higher levels than those from organic systems (P < 0.001). Statistical analysis showed that a significant relationship did not exist between antibiotic resistance levels and attachment in E. coli from conventional systems but did for organic systems (P < 0.001). Better quantification of these relationships is critical to understanding the behavior of E. coli in the environment and preventing exposure of human populations to antibiotic-resistant bacteria. Copyright © by the American Society of Agronomy, Crop Science Society of America, and Soil Science Society of America, Inc.

  5. A framework for quantifying net benefits of alternative prognostic models.

    PubMed

    Rapsomaniki, Eleni; White, Ian R; Wood, Angela M; Thompson, Simon G

    2012-01-30

    New prognostic models are traditionally evaluated using measures of discrimination and risk reclassification, but these do not take full account of the clinical and health economic context. We propose a framework for comparing prognostic models by quantifying the public health impact (net benefit) of the treatment decisions they support, assuming a set of predetermined clinical treatment guidelines. The change in net benefit is more clinically interpretable than changes in traditional measures and can be used in full health economic evaluations of prognostic models used for screening and allocating risk reduction interventions. We extend previous work in this area by quantifying net benefits in life years, thus linking prognostic performance to health economic measures; by taking full account of the occurrence of events over time; and by considering estimation and cross-validation in a multiple-study setting. The method is illustrated in the context of cardiovascular disease risk prediction using an individual participant data meta-analysis. We estimate the number of cardiovascular-disease-free life years gained when statin treatment is allocated based on a risk prediction model with five established risk factors instead of a model with just age, gender and region. We explore methodological issues associated with the multistudy design and show that cost-effectiveness comparisons based on the proposed methodology are robust against a range of modelling assumptions, including adjusting for competing risks. Copyright © 2011 John Wiley & Sons, Ltd.

  6. A framework for quantifying net benefits of alternative prognostic models‡

    PubMed Central

    Rapsomaniki, Eleni; White, Ian R; Wood, Angela M; Thompson, Simon G

    2012-01-01

    New prognostic models are traditionally evaluated using measures of discrimination and risk reclassification, but these do not take full account of the clinical and health economic context. We propose a framework for comparing prognostic models by quantifying the public health impact (net benefit) of the treatment decisions they support, assuming a set of predetermined clinical treatment guidelines. The change in net benefit is more clinically interpretable than changes in traditional measures and can be used in full health economic evaluations of prognostic models used for screening and allocating risk reduction interventions. We extend previous work in this area by quantifying net benefits in life years, thus linking prognostic performance to health economic measures; by taking full account of the occurrence of events over time; and by considering estimation and cross-validation in a multiple-study setting. The method is illustrated in the context of cardiovascular disease risk prediction using an individual participant data meta-analysis. We estimate the number of cardiovascular-disease-free life years gained when statin treatment is allocated based on a risk prediction model with five established risk factors instead of a model with just age, gender and region. We explore methodological issues associated with the multistudy design and show that cost-effectiveness comparisons based on the proposed methodology are robust against a range of modelling assumptions, including adjusting for competing risks. Copyright © 2011 John Wiley & Sons, Ltd. PMID:21905066

  7. Quantifying Stock Return Distributions in Financial Markets

    PubMed Central

    Botta, Federico; Moat, Helen Susannah; Stanley, H. Eugene; Preis, Tobias

    2015-01-01

    Being able to quantify the probability of large price changes in stock markets is of crucial importance in understanding financial crises that affect the lives of people worldwide. Large changes in stock market prices can arise abruptly, within a matter of minutes, or develop across much longer time scales. Here, we analyze a dataset comprising the stocks forming the Dow Jones Industrial Average at a second by second resolution in the period from January 2008 to July 2010 in order to quantify the distribution of changes in market prices at a range of time scales. We find that the tails of the distributions of logarithmic price changes, or returns, exhibit power law decays for time scales ranging from 300 seconds to 3600 seconds. For larger time scales, we find that the distributions tails exhibit exponential decay. Our findings may inform the development of models of market behavior across varying time scales. PMID:26327593
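
    One standard way to estimate such a tail exponent is the Hill estimator; the sketch below applies it to synthetic heavy-tailed returns, since the second-by-second DJIA data are not reproduced here:

    ```python
    # Hill estimate of the tail index from the k largest absolute returns.
    import numpy as np

    rng = np.random.default_rng(4)
    abs_returns = np.abs(rng.standard_t(df=3, size=100_000))  # toy heavy tails

    def hill_estimate(x, k=500):
        """Hill tail-index estimate from the k largest observations."""
        s = np.sort(x)
        tail, threshold = s[-k:], s[-k - 1]
        return k / np.log(tail / threshold).sum()

    # A Student-t with 3 degrees of freedom has true tail index ~3.
    print(f"estimated tail exponent: {hill_estimate(abs_returns):.2f}")
    ```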

  8. The meaning of vaguely quantified frequency response options on a quality of life scale depends on respondents’ medical status and age

    PubMed Central

    Schneider, Stefan; Stone, Arthur A.

    2017-01-01

    Purpose Self-report items in quality of life (QoL) scales commonly use vague quantifiers like “sometimes” or “often” to measure the frequency of health-related experiences. This study examined whether the meaning of such vaguely quantified response options differs depending on people’s medical status and age, which may undermine the validity of QoL group comparisons. Methods Respondents (n = 600) rated the frequency of positive and negative QoL experiences using vague quantifiers (never, rarely, sometimes, often, always) and provided open-ended numeric frequency counts for the same items. Negative binomial regression analyses examined whether the numeric frequencies associated with each vague quantifier differed between medical status (no vs. one or more medical conditions) and age (18–40 years vs. 60+ years) groups. Results Compared to respondents without a chronic condition, those with a medical condition assigned a higher numeric frequency to the same vague quantifiers for negative QoL experiences; this effect was not evident for positive QoL experiences. Older respondents’ numeric frequencies were more extreme (i.e., lower at the low end and somewhat higher at the high end of the response range) than those of younger respondents. After adjusting for these effects, differences in QoL became somewhat more pronounced between medical status groups, but not between age groups. Conclusions The results suggest that people with different medical backgrounds and age do not interpret vague frequency quantifiers on a QoL scale in the same way. Open-ended numeric frequency reports may be useful to detect and potentially correct for differences in the meaning of vague quantifiers. PMID:27071685
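
    A minimal sketch of this modeling style: a negative binomial regression of simulated numeric frequency counts on medical status. All variable names and parameters are invented for illustration, not the study's data:

    ```python
    # Negative binomial regression sketch via a gamma-Poisson mixture.
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm

    rng = np.random.default_rng(5)
    n = 300
    medical = rng.integers(0, 2, n)           # 1 = chronic condition (assumed)
    # Simulate "times per month" counts for one vague quantifier
    mu = np.exp(1.5 + 0.4 * medical)          # higher counts if condition
    counts = rng.poisson(rng.gamma(shape=2.0, scale=mu / 2.0))

    X = sm.add_constant(pd.DataFrame({"medical": medical}))
    model = sm.GLM(counts, X, family=sm.families.NegativeBinomial())
    print(model.fit().summary().tables[1])
    ```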

  9. Live cell interferometry quantifies dynamics of biomass partitioning during cytokinesis.

    PubMed

    Zangle, Thomas A; Teitell, Michael A; Reed, Jason

    2014-01-01

    The equal partitioning of cell mass between daughters is the usual and expected outcome of cytokinesis for self-renewing cells. However, most studies of partitioning during cell division have focused on daughter cell shape symmetry or segregation of chromosomes. Here, we use live cell interferometry (LCI) to quantify the partitioning of daughter cell mass during and following cytokinesis. We use adherent and non-adherent mouse fibroblast and mouse and human lymphocyte cell lines as models and show that, on average, mass asymmetries present at the time of cleavage furrow formation persist through cytokinesis. The addition of multiple cytoskeleton-disrupting agents leads to increased asymmetry in mass partitioning which suggests the absence of active mass partitioning mechanisms after cleavage furrow positioning.

  10. Dome growth at Mount Cleveland, Aleutian Arc, quantified by time-series TerraSAR-X imagery

    USGS Publications Warehouse

    Wang, Teng; Poland, Michael; Lu, Zhong

    2016-01-01

    Synthetic aperture radar imagery is widely used to study surface deformation induced by volcanic activity; however, it is rarely applied to quantify the evolution of lava domes, which is important for understanding hazards and magmatic system characteristics. We studied dome formation associated with eruptive activity at Mount Cleveland, Aleutian Volcanic Arc, in 2011–2012 using TerraSAR-X imagery. Interferometry and offset tracking show no consistent deformation and only motion of the crater rim, suggesting that ascending magma may pass through a preexisting conduit system without causing appreciable surface deformation. Amplitude imagery has proven useful for quantifying rates of vertical and areal growth of the lava dome within the crater from formation to removal by explosive activity to rebirth. We expect that this approach can be applied at other volcanoes that host growing lava domes and where hazards are highly dependent on dome geometry and growth rates.

  11. A Field Study of NMR Logging to Quantify Petroleum Contamination in Subsurface Sediments

    NASA Astrophysics Data System (ADS)

    Fay, E. L.; Knight, R. J.; Grunewald, E. D.

    2016-12-01

    Nuclear magnetic resonance (NMR) measurements are directly sensitive to hydrogen-bearing fluids including water and petroleum products. NMR logging tools can be used to detect and quantify petroleum hydrocarbon contamination in the sediments surrounding a well or borehole. An advantage of the NMR method is that data can be collected in both cased and uncased holes. In order to estimate the volume of in-situ hydrocarbon, there must be sufficient contrast between either the relaxation times (T2) or the diffusion coefficients (D) of water and the contaminant. In a field study conducted in Pine Ridge, South Dakota, NMR logging measurements were used to investigate an area of hydrocarbon contamination from leaking underground storage tanks. A contaminant sample recovered from a monitoring well at the site was found to be consistent with a mixture of gasoline and diesel fuel. NMR measurements were collected in two PVC-cased monitoring wells; D and T2 measurements were used together to detect and quantify contaminant in the sediments above and below the water table at both of the wells. While the contrast in D between the fluids was found to be inadequate for fluid typing, the T2 contrast between the contaminant and water in silt enabled the estimation of the water and contaminant volumes. This study shows that NMR logging can be used to detect and quantify in-situ contamination, but also highlights the importance of sediment and contaminant properties that lead to a sufficiently large contrast in T2 or D.

  12. An immunological method for quantifying antibacterial activity in Salmo salar (Linnaeus, 1758) skin mucus.

    PubMed

    Narvaez, Edgar; Berendsen, Jorge; Guzmán, Fanny; Gallardo, José A; Mercado, Luis

    2010-01-01

    Antimicrobial peptides (AMPs) are a pivotal component of innate immunity in lower vertebrates. The aim of this study was to develop an immunological method for quantifying AMPs in Salmo salar skin mucus. A known antimicrobial peptide derived from histone H1 previously purified and described from S. salar skin mucus (SAMP H1) was chemically synthesized and used to obtain antibodies for the quantification of the molecule via ELISA. Using skin mucus samples, a correlation of bacterial growth inhibition versus SAMP H1 concentration (ELISA) was established. The results provide the first evidence for quantifying the presence of active AMPs in the skin mucus of S. salar through the use of an immunological method. Copyright 2009 Elsevier Ltd. All rights reserved.

  13. Quantifiable outcomes from corporate and higher education learning collaborations

    NASA Astrophysics Data System (ADS)

    Devine, Thomas G.

    The study investigated the existence of measurable learning outcomes that emerged out of the shared strengths of collaborating sponsors. The study identified quantifiable learning outcomes that confirm corporate, academic and learner participation in learning collaborations. Each of the three hypotheses and the synergy indicator quantitatively and qualitatively confirmed learning outcomes benefiting participants. The academic-indicator quantitatively confirmed that learning outcomes attract learners to the institution. The corporate-indicator confirmed that learning outcomes include knowledge exchange and enhanced workforce talents for careers in the energy-utility industry. The learner-indicator confirmed that learning outcomes provide professional development opportunities for employment. The synergy-indicator confirmed that best learning practices in learning collaborations emanate from the sponsors' shared strengths, and that partnerships can be elevated to strategic alliances, going beyond responding to the desires of sponsors to create learner-centered cultures. The synergy-indicator also confirmed the value of organizational processes that elevate sponsors' interactions to shared strengths, creating a learner-centered culture. The study's series of qualitative questions confirmed prior success factors, while verifying the hypothesis results and providing insight not available from quantitative data. The direct beneficiaries of the study are the energy-utility learning-collaboration participants, and the corporations, academic institutions, and learners of the collaboration. The indirect beneficiaries are the stakeholders of future learning collaborations, through improved knowledge of the existence or absence of quantifiable learning outcomes.

  14. Alcohol Content in the ‘Hyper-Reality’ MTV Show ‘Geordie Shore’

    PubMed Central

    Lowe, Eden; Britton, John

    2018-01-01

    Abstract Aim To quantify the occurrence of alcohol content, including alcohol branding, in the popular primetime television UK Reality TV show ‘Geordie Shore’ Series 11. Methods A 1-min interval coding content analysis of alcohol content in the entire DVD Series 11 of ‘Geordie Shore’ (10 episodes). Occurrence of alcohol use, implied use, other alcohol reference/paraphernalia or branding was recorded. Results All categories of alcohol content were present in all episodes. ‘Any alcohol’ content occurred in 78% of all coding intervals (ACIs), ‘actual alcohol use’ in 30%, ‘inferred alcohol use’ in 72% and ‘other’ alcohol references in 59%. Brand appearances occurred in 23% of ACIs. The most frequently observed alcohol brand was Smirnoff, which appeared in 43% of all brand appearances. Episodes categorized as suitable for viewing by adolescents below the legal drinking age of 18 years accounted for 61% of all brand appearances. Conclusions Alcohol content, including branding, is highly prevalent in the UK Reality TV show ‘Geordie Shore’ Series 11. Two-thirds of all alcohol branding occurred in episodes age-rated by the British Board of Film Classification (BBFC) as suitable for viewers aged 15 years. The organizations OfCom, Advertising Standards Authority (ASA) and the Portman Group should implement more effective policies to reduce adolescent exposure to on-screen drinking. The drinks industry should consider demanding the withdrawal of their brands from the show. Short Summary Alcohol content, including branding, is highly prevalent in the MTV reality TV show ‘Geordie Shore’ Series 11. Current alcohol regulation is failing to protect young viewers from exposure to such content. PMID:29365032

  15. Quantifying the morphodynamics of river restoration schemes using Unmanned Aerial Vehicles (UAVs)

    NASA Astrophysics Data System (ADS)

    Williams, Richard; Byrne, Patrick; Gilles, Eric; Hart, John; Hoey, Trevor; Maniatis, George; Moir, Hamish; Reid, Helen; Ves, Nikolas

    2017-04-01

    River restoration schemes are particularly sensitive to morphological adjustment during the first set of high-flow events that they are subjected to. Quantifying elevation change associated with morphological adjustment can contribute to improved adaptive decision making to ensure river restoration scheme objectives are achieved. To date, the relatively high cost, technical demands and challenging logistics associated with acquiring repeat, high-resolution topographic surveys have been a significant barrier to monitoring the three-dimensional morphodynamics of river restoration schemes. The availability of low-cost, consumer-grade Unmanned Aerial Vehicles that are capable of acquiring imagery for processing using Structure-from-Motion Multi-View Stereo (SfM MVS) photogrammetry has the potential to transform surveys of the morphodynamics of river restoration schemes. Application guidance does, however, need to be developed to fully exploit the advances of UAV technology and SfM MVS processing techniques. In particular, there is a need to quantify the effect of the number and spatial distribution of ground targets on vertical error. This is particularly significant because vertical errors propagate when mapping morphological change, and thus determine the evidence that is available for decision making. This presentation presents results from a study that investigated how the number and spatial distribution of targets influenced vertical error, and then used the findings to determine survey protocols for a monitoring campaign that has quantified morphological change across a number of restoration schemes. At the Swindale river restoration scheme, Cumbria, England, 31 targets were distributed across a 700 m long reach and the centre of each target was surveyed using RTK-GPS. Using the targets as Ground Control Points (GCPs) or checkpoints, they were divided into three different spatial patterns (centre, edge and random) and used for processing images acquired

  16. A compact clinical instrument for quantifying suppression.

    PubMed

    Black, Joanne M; Thompson, Benjamin; Maehara, Goro; Hess, Robert F

    2011-02-01

    We describe a compact and convenient clinical apparatus for the measurement of suppression based on a previously reported laboratory-based approach. In addition, we report and validate a novel, rapid psychophysical method for measuring suppression using this apparatus, which makes the technique more applicable to clinical practice. By using a Z800 dual pro head-mounted display driven by a MAC laptop, we provide dichoptic stimulation. Global motion stimuli composed of arrays of moving dots are presented to each eye. One set of dots move in a coherent direction (termed signal) whereas another set of dots move in a random direction (termed noise). To quantify performance, we measure the signal/noise ratio corresponding to a direction-discrimination threshold. Suppression is quantified by assessing the extent to which it matters which eye sees the signal and which eye sees the noise. A space-saving, head-mounted display using current video technology offers an ideal solution for clinical practice. In addition, our optimized psychophysical method provided results that were in agreement with those produced using the original technique. We made measures of suppression on a group of nine adult amblyopic participants using this apparatus with both the original and new psychophysical paradigms. All participants had measurable suppression ranging from mild to severe. The two different psychophysical methods gave a strong correlation for the strength of suppression (rho = -0.83, p = 0.006). Combining the new apparatus and new psychophysical method creates a convenient and rapid technique for parametric measurement of interocular suppression. In addition, this apparatus constitutes the ideal platform for suppressors to combine information between their eyes in a similar way to binocularly normal people. This provides a convenient way for clinicians to implement the newly proposed binocular treatment of amblyopia that is based on antisuppression training.

  17. Quantifying fibrosis in head and neck cancer treatment: An overview.

    PubMed

    Moloney, Emma C; Brunner, Markus; Alexander, Ashlin J; Clark, Jonathan

    2015-08-01

    Fibrosis is a common late complication of radiotherapy and/or surgical treatment for head and neck cancers. Fibrosis is difficult to quantify and formal methods of measure are not well recognized. The purpose of this review was to summarize the methods available to quantify neck fibrosis. A PubMed search of articles was carried out using key words "neck" and "fibrosis." Many methods have been used to assess fibrosis, however, there is no preferred methodology. Specific to neck fibrosis, most studies have relied upon hand palpation rating scales. Indentation and suction techniques have been used to mechanically quantify neck fibrosis. There is scope to develop applications of ultrasound, dielectric, bioimpedance, and MRI techniques for use in the neck region. Quantitative assessment of neck fibrosis is sought after in order to compare treatment regimens and improve quality of life outcomes in patients with head and neck cancer. © 2014 Wiley Periodicals, Inc.

  18. Quantifying uncertainties in the structural response of SSME blades

    NASA Technical Reports Server (NTRS)

    Nagpal, Vinod K.

    1987-01-01

    To quantify the uncertainties associated with the geometry and material properties of a Space Shuttle Main Engine (SSME) turbopump blade, a computer code known as STAEBL was used. A finite element model of the blade used 80 triangular shell elements with 55 nodes and five degrees of freedom per node. The entire study was carried out by computer simulation; no physical experiments were conducted. The structural response was evaluated in terms of three variables: natural frequencies, root (maximum) stress, and blade tip displacements. The results of the study indicate that only the geometric uncertainties have significant effects on the response; uncertainties in material properties have insignificant effects.

  19. Quantifying spatial and temporal trends in beach-dune volumetric changes using spatial statistics

    NASA Astrophysics Data System (ADS)

    Eamer, Jordan B. R.; Walker, Ian J.

    2013-06-01

    Spatial statistics are generally underutilized in coastal geomorphology, despite offering great potential for identifying and quantifying spatial-temporal trends in landscape morphodynamics. In particular, local Moran's Ii provides a statistical framework for detecting clusters of significant change in an attribute (e.g., surface erosion or deposition) and quantifying how this changes over space and time. This study analyzes and interprets spatial-temporal patterns in sediment volume changes in a beach-foredune-transgressive dune complex following removal of invasive marram grass (Ammophila spp.). Results are derived by detecting significant changes in post-removal repeat DEMs derived from topographic surveys and airborne LiDAR. The study site was separated into discrete, linked geomorphic units (beach, foredune, transgressive dune complex) to facilitate sub-landscape scale analysis of volumetric change and sediment budget responses. Difference surfaces derived from a pixel-subtraction algorithm between interval DEMs and the LiDAR baseline DEM were filtered using the local Moran's Ii method and two different spatial weights (1.5 and 5 m) to detect statistically significant change. Moran's Ii results were compared with those derived from a more spatially uniform statistical method that uses a simpler student's t distribution threshold for change detection. Morphodynamic patterns and volumetric estimates were similar between the uniform geostatistical method and Moran's Ii at a spatial weight of 5 m while the smaller spatial weight (1.5 m) consistently indicated volumetric changes of less magnitude. The larger 5 m spatial weight was most representative of broader site morphodynamics and spatial patterns while the smaller spatial weight provided volumetric changes consistent with field observations. All methods showed foredune deflation immediately following removal with increased sediment volumes into the spring via deposition at the crest and on lobes in the lee
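
    For concreteness, here is a plain-NumPy sketch of local Moran's Ii with a binary distance-band weight, applied to a synthetic grid of elevation changes containing one deposition cluster; the weights and data are illustrative choices, not the study's DEMs:

    ```python
    # Local Moran's I_i with row-standardized distance-band weights.
    import numpy as np

    def local_morans_i(values, coords, band):
        """I_i = (z_i / m2) * sum_j w_ij z_j, with z the mean-centered values."""
        z = values - values.mean()
        m2 = (z ** 2).mean()
        d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=2)
        w = ((d > 0) & (d <= band)).astype(float)     # binary distance band
        rowsum = w.sum(axis=1, keepdims=True)
        w = np.divide(w, rowsum, out=np.zeros_like(w), where=rowsum > 0)
        return (z / m2) * (w @ z)

    rng = np.random.default_rng(6)
    xx, yy = np.meshgrid(np.arange(20.0), np.arange(20.0))
    coords = np.column_stack([xx.ravel(), yy.ravel()])
    dz = rng.normal(0.0, 0.05, len(coords))           # elevation-change surface
    cluster = (coords[:, 0] < 5) & (coords[:, 1] < 5)
    dz[cluster] += 0.5                                # one deposition cluster
    ii = local_morans_i(dz, coords, band=5.0)
    print(f"mean local I inside cluster: {ii[cluster].mean():.2f}")
    print(f"mean local I elsewhere: {ii[~cluster].mean():.2f}")
    ```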

  20. Quantifying, Analysing and Modeling Rockfall Activity in two Different Alpine Catchments using Terrestrial Laserscanning

    NASA Astrophysics Data System (ADS)

    Haas, F.; Heckmann, T.; Wichmann, V.; Becht, M.

    2011-12-01

    Rockfall processes are a major natural hazard, especially where rock faces are located close to infrastructure. These processes also drive the retreat of steep rock faces by weathering and the growth of the corresponding talus cones by routing debris downslope, so they play an important role in the geomorphic system and the sediment budget of high mountain catchments. The presented investigation deals with the use of TLS to quantify and analyse rockfall activity in two study areas located in the Alps. The rock faces of both catchments and the corresponding talus cones were scanned twice a year from different distances. Figure 1 shows an example of the spatial distribution of surface changes at a rock face in the Northern Dolomites between 2008 and 2010; the measured surface changes at this location yield a mean rockwall retreat of 0.04 cm/a. High-resolution TLS data are not only applicable to quantifying rockfall activity; they can also be used to characterize the surface properties of the corresponding talus cones and the runout distances of larger boulders, which can lead to a better process understanding. The surface roughness of talus cones in both catchments was therefore characterized from the TLS point clouds using a GIS approach. The resulting detailed maps of surface conditions on the talus cones were used to improve an existing process model that simulates runout distances on the talus cones using distributed friction parameters. In addition, the investigations showed that the shape of the boulders also influences the runout distance, so the interrelationships between rock fragment morphology and runout distance were analysed for over 600 individual boulders at the site of a large rockfall event. The submitted poster will show the results of the quantification of the rockfall activity and additionally it will show the results of the analyses of the talus

  1. Analysis of competition performance in dressage and show jumping of Dutch Warmblood horses.

    PubMed

    Rovere, G; Ducro, B J; van Arendonk, J A M; Norberg, E; Madsen, P

    2016-12-01

    Most Warmblood horse studbooks aim to improve the performance in dressage and show jumping. The Dutch Royal Warmblood Studbook (KWPN) includes the highest score achieved in competition by a horse to evaluate its genetic ability for performance. However, the records collected during competition are associated with some aspects that might affect the quality of the genetic evaluation based on these records. These aspects include the influence of the rider, censoring and preselection of the data. The aim of this study was to quantify the impact of rider effect, censoring and preselection on the genetic analysis of competition data of dressage and show jumping of KWPN. Different models including a rider effect were evaluated. To assess the impact of censoring, genetic parameters were estimated in data sets that differed in the degree of censoring. The effect of preselection on variance components was analysed by defining a binary trait (sport-status) depending on whether the horse has a competition record or not. This trait was included in a bivariate model with the competition trait, using all horses registered by KWPN since 1984. Results showed that performance in competition for dressage and show jumping is a heritable trait (h2 ~ 0.11-0.13) and that it is important to account for the effect of the rider in the genetic analysis. Censoring had a small effect on the genetic parameter for highest performance achieved by the horse. A moderate heritability obtained for sport-status indicates that preselection has a genetic basis, but the effect on genetic parameters was relatively small. © 2016 Blackwell Verlag GmbH.

  2. Quantifying Arabia-Eurasia convergence accommodated in the Greater Caucasus by paleomagnetic reconstruction

    NASA Astrophysics Data System (ADS)

    van der Boon, A.; van Hinsbergen, D. J. J.; Rezaeian, M.; Gürer, D.; Honarmand, M.; Pastor-Galán, D.; Krijgsman, W.; Langereis, C. G.

    2018-01-01

    Since the late Eocene, convergence and subsequent collision between Arabia and Eurasia were accommodated both in the overriding Eurasian plate forming the Greater Caucasus orogen and the Iranian plateau, and by subduction and accretion of the Neotethys and Arabian margin forming the East Anatolian plateau and the Zagros. To quantify how much Arabia-Eurasia convergence was accommodated in the Greater Caucasus region, we here provide new paleomagnetic results from 97 volcanic sites (∼500 samples) in the Talysh Mountains of NW Iran, that show ∼15° net clockwise rotation relative to Eurasia since the Eocene. We apply a first-order kinematic restoration of the northward convex orocline that formed to the south of the Greater Caucasus, integrating our new data with previously published constraints on rotations of the Eastern Pontides and Lesser Caucasus. This suggests that north of the Talysh ∼120 km of convergence must have been accommodated. North of the Eastern Pontides and Lesser Caucasus this is significantly more: 200-280 km. Our reconstruction independently confirms previous Caucasus convergence estimates. Moreover, we show for the first time a sharp contrast of convergence between the Lesser Caucasus and the Talysh. This implies that the ancient Paleozoic-Mesozoic transform plate boundary, preserved between the Iranian and East-Anatolian plateaus, was likely reactivated as a right-lateral transform fault since late Eocene time.

  3. Quantifying changes in water use and groundwater availability in a megacity using novel integrated systems modeling

    NASA Astrophysics Data System (ADS)

    Hyndman, D. W.; Xu, T.; Deines, J. M.; Cao, G.; Nagelkirk, R.; Viña, A.; McConnell, W.; Basso, B.; Kendall, A. D.; Li, S.; Luo, L.; Lupi, F.; Ma, D.; Winkler, J. A.; Yang, W.; Zheng, C.; Liu, J.

    2017-08-01

    Water sustainability in megacities is a growing challenge with far-reaching effects. Addressing sustainability requires an integrated, multidisciplinary approach able to capture interactions among hydrology, population growth, and socioeconomic factors and to reflect changes due to climate variability and land use. We developed a new systems modeling framework to quantify the influence of changes in land use, crop growth, and urbanization on groundwater storage for Beijing, China. This framework was then used to understand and quantify causes of observed decreases in groundwater storage from 1993 to 2006, revealing that the expansion of Beijing's urban areas at the expense of croplands has enhanced recharge while reducing water lost to evapotranspiration, partially ameliorating groundwater declines. The results demonstrate the efficacy of such a systems approach to quantify the impacts of changes in climate and land use on water sustainability for megacities, while providing a quantitative framework to improve mitigation and adaptation strategies that can help address future water challenges.

  4. Study Quantifies Physical Demands of Yoga in Seniors

    MedlinePlus

    A recent NCCAM-funded study measured the ... performance of seven standing poses commonly taught in senior yoga classes: Chair, Wall Plank, Tree, Warrior II, ...

  5. Quantifying hyporheic exchange dynamics in a highly regulated large river reach

    NASA Astrophysics Data System (ADS)

    Zhou, T.; Bao, J.; Huang, M.; Hou, Z.; Arntzen, E.; Mackley, R.; Harding, S.; Crump, A.; Xu, Y.; Song, X.; Chen, X.; Stegen, J.; Hammond, G. E.; Thorne, P. D.; Zachara, J. M.

    2016-12-01

    Hyporheic exchange is an important mechanism taking place in riverbanks and riverbed sediments, where the river water and shallow groundwater mix and interact with each other. The direction and magnitude of hyporheic flux that penetrates the river bed and residence time of river water in the hyporheic zone are critical for biogeochemical processes such as carbon and nitrogen cycling, and biodegradation of organic contaminants. Hyporheic flux can be quantified using many direct and indirect measurements as well as analytical and numerical modeling tools. However, in a relatively large river, these methods can be limited by the accessibility, spatial constraints, complexity of geomorphologic features and subsurface properties, and computational power. In rivers regulated by hydroelectric dams, quantifying hyporheic fluxes becomes more challenging due to frequent hydropeaking events created by dam operations. In this study, we developed and validated methods that combined field measurements and numerical modeling for estimating hyporheic fluxes across the river bed in a 7-km long reach of the highly regulated Columbia River. The reach has a minimum width of about 800 meters and variations in river stage within a day could be up to two meters due to the upstream dam operations. In shallow water along the shoreline, vertical thermal profiles measured by self-recording thermistors were combined with time series of hydraulic gradient derived from river stage and water level at in-land wells to estimate the hyporheic flux rate. For the deep section, a high resolution computational fluid dynamics (CFD) modeling framework was developed to characterize the spatial distribution of flux rates at the river bed and the residence time of hyporheic flow at different river flow conditions. Our modeling results show that the rates of hyporheic exchange and residence time are controlled by (1) hydrostatic pressure induced by river stage fluctuations, and (2) hydrodynamic drivers

  6. Quantifying hypoxia in human cancers using static PET imaging.

    PubMed

    Taylor, Edward; Yeung, Ivan; Keller, Harald; Wouters, Bradley G; Milosevic, Michael; Hedley, David W; Jaffray, David A

    2016-11-21

    Compared to FDG, the signal of 18F-labelled hypoxia-sensitive tracers in tumours is low. This means that in addition to the presence of hypoxic cells, transport properties contribute significantly to the uptake signal in static PET images. This sensitivity to transport must be minimized in order for static PET to provide a reliable standard for hypoxia quantification. A dynamic compartmental model based on a reaction-diffusion formalism was developed to interpret tracer pharmacokinetics and applied to static images of FAZA in twenty patients with pancreatic cancer. We use our model to identify tumour properties-well-perfused without substantial necrosis or partitioning-for which static PET images can reliably quantify hypoxia. Normalizing the measured activity in a tumour voxel by the value in blood leads to a reduction in the sensitivity to variations in 'inter-corporal' transport properties-blood volume and clearance rate-as well as imaging study protocols. Normalization thus enhances the correlation between static PET images and the FAZA binding rate K3, a quantity which quantifies hypoxia in a biologically significant way. The ratio of FAZA uptake in spinal muscle and blood can vary substantially across patients due to long muscle equilibration times. Normalized static PET images of hypoxia-sensitive tracers can reliably quantify hypoxia for homogeneously well-perfused tumours with minimal tissue partitioning. The ideal normalizing reference tissue is blood, either drawn from the patient before PET scanning or imaged using PET. If blood is not available, uniform, homogeneously well-perfused muscle can be used. For tumours that are not homogeneously well-perfused or for which partitioning is significant, only an analysis of dynamic PET scans can reliably quantify hypoxia.

  7. Quantifying hypoxia in human cancers using static PET imaging

    NASA Astrophysics Data System (ADS)

    Taylor, Edward; Yeung, Ivan; Keller, Harald; Wouters, Bradley G.; Milosevic, Michael; Hedley, David W.; Jaffray, David A.

    2016-11-01

    Compared to FDG, the signal of 18F-labelled hypoxia-sensitive tracers in tumours is low. This means that in addition to the presence of hypoxic cells, transport properties contribute significantly to the uptake signal in static PET images. This sensitivity to transport must be minimized in order for static PET to provide a reliable standard for hypoxia quantification. A dynamic compartmental model based on a reaction-diffusion formalism was developed to interpret tracer pharmacokinetics and applied to static images of FAZA in twenty patients with pancreatic cancer. We use our model to identify tumour properties—well-perfused without substantial necrosis or partitioning—for which static PET images can reliably quantify hypoxia. Normalizing the measured activity in a tumour voxel by the value in blood leads to a reduction in the sensitivity to variations in ‘inter-corporal’ transport properties—blood volume and clearance rate—as well as imaging study protocols. Normalization thus enhances the correlation between static PET images and the FAZA binding rate K3, a quantity which quantifies hypoxia in a biologically significant way. The ratio of FAZA uptake in spinal muscle and blood can vary substantially across patients due to long muscle equilibration times. Normalized static PET images of hypoxia-sensitive tracers can reliably quantify hypoxia for homogeneously well-perfused tumours with minimal tissue partitioning. The ideal normalizing reference tissue is blood, either drawn from the patient before PET scanning or imaged using PET. If blood is not available, uniform, homogeneously well-perfused muscle can be used. For tumours that are not homogeneously well-perfused or for which partitioning is significant, only an analysis of dynamic PET scans can reliably quantify hypoxia.

  8. Quantifying Fire's Impacts on Total and Pyrogenic Carbon Stocks in Mixed-Conifer Forests: Results from Pre- and Post-Fire Measurements in Active Wildfire Incidents

    NASA Astrophysics Data System (ADS)

    Miesel, J. R.; Reiner, A. L.; Ewell, C. M.; Sanderman, J.; Maestrini, B.; Adkins, J.

    2016-12-01

    Widespread US fire suppression policy has contributed to an accumulation of vegetation in many western forests relative to historic conditions, and these changes can exacerbate wildfire severity and carbon (C) emissions. Serious concern exists about positive feedbacks between wildfire emissions and global climate; however, fires not only release C from terrestrial to atmospheric pools, they also create "black" or pyrogenic C (PyC) which contributes to longer-term C stability. Our objective was to quantify wildfire impacts on aboveground and belowground total C and PyC stocks in California mixed-conifer forests. We worked with incident management teams to access five active wildfires to establish and measure plots within days before and after fire. We measured pre- and post-fire aboveground forest structure and woody fuels to calculate aboveground biomass, biomass C, and PyC, and we collected pre- and post-fire forest floor and 0-5 cm mineral soil samples to measure belowground C and PyC stocks. Our preliminary results show that fire had minimal impact on the number of trees per hectare, whereas C losses from the tree layer occurred via consumption of foliage, and PyC gain occurred in tree bark. Fire released 54% to 100% of surface fuel C. In the forest floor layer, we observed 33 to 100% C loss, whereas changes in PyC stocks ranged from 100% loss to 186% gain relative to pre-fire samples. In general, fire had minimal to no impact on 0-5 cm mineral soil C. We will present relationships between total C, PyC and post-fire C and N dynamics in one of the five wildfire sites. Our data are unique because they represent nearly immediate pre- and post-fire measurements in major wildfires in a widespread western U.S. forest type. This research advances understanding of the role of fire on forest C fluxes and C sequestration potential as PyC.

  9. Quantifying the impact of between-study heterogeneity in multivariate meta-analyses

    PubMed Central

    Jackson, Dan; White, Ian R; Riley, Richard D

    2012-01-01

    Measures that quantify the impact of heterogeneity in univariate meta-analysis, including the very popular I2 statistic, are now well established. Multivariate meta-analysis, where studies provide multiple outcomes that are pooled in a single analysis, is also becoming more commonly used. The question of how to quantify heterogeneity in the multivariate setting is therefore raised. It is the univariate R2 statistic, the ratio of the variance of the estimated treatment effect under the random and fixed effects models, that generalises most naturally, so this statistic provides our basis. This statistic is then used to derive a multivariate analogue of I2, which we call I2R. We also provide a multivariate H2 statistic, the ratio of a generalisation of Cochran's heterogeneity statistic and its associated degrees of freedom, with an accompanying generalisation of the usual I2 statistic, I2H. Our proposed heterogeneity statistics can be used alongside all the usual estimates and inferential procedures used in multivariate meta-analysis. We apply our methods to some real datasets and show how our statistics are equally appropriate in the context of multivariate meta-regression, where study level covariate effects are included in the model. Our heterogeneity statistics may be used when applying any procedure for fitting the multivariate random effects model. Copyright © 2012 John Wiley & Sons, Ltd. PMID:22763950
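
    As background for these generalizations, the univariate Cochran's Q and I2 that they extend can be computed as follows; the study estimates below are toy numbers, not data from the paper:

    ```python
    # Univariate Cochran's Q and the Higgins-Thompson I^2 statistic.
    import numpy as np

    effects = np.array([0.30, 0.15, 0.45, 0.25, 0.60])    # study estimates
    variances = np.array([0.01, 0.02, 0.015, 0.01, 0.03]) # within-study vars

    w = 1.0 / variances
    pooled = np.sum(w * effects) / np.sum(w)              # fixed-effect mean
    Q = np.sum(w * (effects - pooled) ** 2)               # Cochran's Q
    df = len(effects) - 1
    I2 = max(0.0, (Q - df) / Q)                           # I^2, truncated at 0
    print(f"Q = {Q:.2f}, I2 = {100 * I2:.1f}%")
    ```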

  10. Quantifying tasks, ergonomic exposures and injury rates among school custodial workers.

    PubMed

    Village, J; Koehoorn, M; Hossain, S; Ostry, A

    2009-06-01

    A job exposure matrix of ergonomics risk factors was constructed for school custodial workers in one large school district in the province of British Columbia using 100 h of 1-min fixed-interval observations, participatory worker consensus on task durations and existing employment and school characteristic data. Significant differences in ergonomics risk factors were found by tasks and occupations. Cleaning and moving furniture, handling garbage, cleaning washrooms and cleaning floors were associated with the most physical risks and the exposure was often higher during the summer vs. the school year. Injury rates over a 4-year period showed the custodian injury rate was four times higher than the overall injury rate across all occupations in the school district. Injury rates were significantly higher in the school year compared with summer (12.2 vs. 7.0 per 100 full-time equivalents per year, p < 0.05). Custodial workers represent a considerable proportion of the labour force and have high injury rates, yet ergonomic studies are disproportionately few. Previous studies that quantified risk factors in custodial workers tended to focus on a few tasks or specific risk factors. This study, using participatory ergonomics and observational methods, systematically quantifies the broad range of musculoskeletal risk factors across multiple tasks performed by custodial workers in schools, adding considerably to the methodological literature.

  11. Introducing Co-Activation Pattern Metrics to Quantify Spontaneous Brain Network Dynamics

    PubMed Central

    Chen, Jingyuan E.; Chang, Catie; Greicius, Michael D.; Glover, Gary H.

    2015-01-01

    Recently, fMRI researchers have begun to realize that the brain's intrinsic network patterns may undergo substantial changes during a single resting state (RS) scan. However, despite the growing interest in brain dynamics, metrics that can quantify the variability of network patterns are still quite limited. Here, we first introduce various quantification metrics based on the extension of co-activation pattern (CAP) analysis, a recently proposed point-process analysis that tracks state alternations at each individual time frame and relies on very few assumptions; then apply these proposed metrics to quantify changes of brain dynamics during a sustained 2-back working memory (WM) task compared to rest. We focus on the functional connectivity of two prominent RS networks, the default-mode network (DMN) and executive control network (ECN). We first demonstrate less variability of global Pearson correlations with respect to the two chosen networks using a sliding-window approach during WM task compared to rest; then we show that the macroscopic decrease in variations in correlations during a WM task is also well characterized by the combined effect of a reduced number of dominant CAPs, increased spatial consistency across CAPs, and increased fractional contributions of a few dominant CAPs. These CAP metrics may provide alternative and more straightforward quantitative means of characterizing brain network dynamics than time-windowed correlation analyses. PMID:25662866
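
    One of the simpler quantities mentioned above, the variability of sliding-window correlations between two network time courses, can be sketched as follows; the signals are simulated and the window length is an arbitrary choice, not the study's protocol:

    ```python
    # Variability of sliding-window correlations between two simulated
    # network time courses (e.g., DMN- and ECN-like signals).
    import numpy as np

    rng = np.random.default_rng(7)
    n_tr, win = 400, 30                       # volumes and window length (TRs)
    dmn = rng.normal(size=n_tr)
    ecn = 0.5 * dmn + rng.normal(size=n_tr)   # partially coupled signal

    corrs = np.array([
        np.corrcoef(dmn[t:t + win], ecn[t:t + win])[0, 1]
        for t in range(n_tr - win + 1)
    ])
    print(f"sliding-window correlation SD: {corrs.std():.3f}")
    ```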

  12. Path Similarity Analysis: A Method for Quantifying Macromolecular Pathways

    PubMed Central

    Seyler, Sean L.; Kumar, Avishek; Thorpe, M. F.; Beckstein, Oliver

    2015-01-01

    Diverse classes of proteins function through large-scale conformational changes and various sophisticated computational algorithms have been proposed to enhance sampling of these macromolecular transition paths. Because such paths are curves in a high-dimensional space, it has been difficult to quantitatively compare multiple paths, a necessary prerequisite to, for instance, assess the quality of different algorithms. We introduce a method named Path Similarity Analysis (PSA) that enables us to quantify the similarity between two arbitrary paths and extract the atomic-scale determinants responsible for their differences. PSA utilizes the full information available in 3N-dimensional configuration space trajectories by employing the Hausdorff or Fréchet metrics (adopted from computational geometry) to quantify the degree of similarity between piecewise-linear curves. It thus completely avoids relying on projections into low dimensional spaces, as used in traditional approaches. To elucidate the principles of PSA, we quantified the effect of path roughness induced by thermal fluctuations using a toy model system. Using, as an example, the closed-to-open transitions of the enzyme adenylate kinase (AdK) in its substrate-free form, we compared a range of protein transition path-generating algorithms. Molecular dynamics-based dynamic importance sampling (DIMS MD), targeted MD (TMD), and the purely geometric FRODA (Framework Rigidity Optimized Dynamics Algorithm) were tested along with seven other methods publicly available on servers, including several based on the popular elastic network model (ENM). PSA with clustering revealed that paths produced by a given method are more similar to each other than to those from another method and, for instance, that the ENM-based methods produced relatively similar paths. PSA applied to ensembles of DIMS MD and FRODA trajectories of the conformational transition of diphtheria toxin, a particularly challenging example, showed that
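
    The core comparison in PSA can be illustrated with the symmetric Hausdorff distance between two toy 2-D curves, via SciPy's directed_hausdorff; real applications operate on 3N-dimensional trajectories rather than these stand-in paths:

    ```python
    # Symmetric Hausdorff distance between two paths (toy 2-D curves).
    import numpy as np
    from scipy.spatial.distance import directed_hausdorff

    t = np.linspace(0, 1, 200)
    path_a = np.column_stack([t, t ** 2])          # one "transition path"
    path_b = np.column_stack([t, t ** 2 + 0.05 * np.sin(6 * np.pi * t)])

    # The Hausdorff distance is the max of the two directed distances.
    d = max(directed_hausdorff(path_a, path_b)[0],
            directed_hausdorff(path_b, path_a)[0])
    print(f"Hausdorff distance between paths: {d:.4f}")
    ```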

  13. A field comparison of multiple techniques to quantify groundwater - surface-water interactions

    USGS Publications Warehouse

    González-Pinzón, Ricardo; Ward, Adam S; Hatch, Christine E; Wlostowski, Adam N; Singha, Kamini; Gooseff, Michael N.; Haggerty, Roy; Harvey, Judson; Cirpka, Olaf A; Brock, James T

    2015-01-01

    Groundwater–surface-water (GW-SW) interactions in streams are difficult to quantify because of heterogeneity in hydraulic and reactive processes across a range of spatial and temporal scales. The challenge of quantifying these interactions has led to the development of several techniques, from centimeter-scale probes to whole-system tracers, including chemical, thermal, and electrical methods. We co-applied conservative and smart reactive solute-tracer tests, measurement of hydraulic heads, distributed temperature sensing, vertical profiles of solute tracer and temperature in the stream bed, and electrical resistivity imaging in a 450-m reach of a 3rd-order stream. GW-SW interactions were not spatially expansive, but were high in flux through a shallow hyporheic zone surrounding the reach. NaCl and resazurin tracers suggested different surface–subsurface exchange patterns in the upper ⅔ and lower ⅓ of the reach. Subsurface sampling of tracers and vertical thermal profiles quantified relatively high fluxes through a 10- to 20-cm deep hyporheic zone with chemical reactivity of the resazurin tracer indicated at 3-, 6-, and 9-cm sampling depths. Monitoring of hydraulic gradients along transects with MINIPOINT streambed samplers starting ∼40 m from the stream indicated that groundwater discharge prevented development of a larger hyporheic zone, which progressively decreased from the stream thalweg toward the banks. Distributed temperature sensing did not detect extensive inflow of ground water to the stream, and electrical resistivity imaging showed limited large-scale hyporheic exchange. We recommend choosing technique(s) based on: 1) clear definition of the questions to be addressed (physical, biological, or chemical processes), 2) explicit identification of the spatial and temporal scales to be covered and those required to provide an appropriate context for interpretation, and 3) maximizing generation of mechanistic understanding and reducing costs of

  14. Quantifying hydrogen-deuterium exchange of meteoritic dicarboxylic acids during aqueous extraction

    NASA Astrophysics Data System (ADS)

    Fuller, M.; Huang, Y.

    2003-03-01

    Hydrogen isotope ratios of organic compounds in carbonaceous chondrites provide critical information about their origins and evolutionary history. However, because many of these compounds are obtained by aqueous extraction, the degree of hydrogen-deuterium (H/D) exchange that occurs during the process needs to be quantitatively evaluated. This study uses compound-specific hydrogen isotopic analysis to quantify the H/D exchange during aqueous extraction. Three common meteoritic dicarboxylic acids (succinic, glutaric, and 2-methyl glutaric acids) were refluxed under conditions simulating the extraction process. Changes in δD values of the dicarboxylic acids were measured following the reflux experiments. A pseudo-first-order rate law was used to model the H/D exchange rates, which were then used to calculate the isotope exchange resulting from aqueous extraction. The degree of H/D exchange varies as a result of differences in molecular structure, the alkalinity of the extraction solution, and the presence/absence of meteorite powder. However, our model indicates that succinic, glutaric, and 2-methyl glutaric acids with a δD of 1800‰ would experience isotope changes of 38‰, 10‰, and 6‰, respectively, during the extraction process. Therefore, the overall change in δD values of the dicarboxylic acids during the aqueous extraction process is negligible. We also demonstrate that H/D exchange occurs on the chiral α-carbon in 2-methyl glutaric acid. The results suggest that the racemic mixture of 2-methyl glutaric acid in the Tagish Lake meteorite could result from post-synthesis aqueous alteration. The approach employed in this study can also be used to quantify H/D exchange for other important meteoritic compounds such as amino acids.
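
    The extrapolation rests on pseudo-first-order kinetics, under which δD relaxes exponentially toward the equilibrium value of the solution. A minimal sketch (Python; the rate constant, time, and equilibrium value are illustrative, not the fitted experimental values):

      import numpy as np

      def delta_D(t_h, dD0, dD_eq, k_per_h):
          """Pseudo-first-order exchange: dD(t) = dD_eq + (dD0 - dD_eq)*exp(-k*t)."""
          return dD_eq + (dD0 - dD_eq) * np.exp(-k_per_h * t_h)

      # Hypothetical numbers: acid starts at 1800 permil, water near -50 permil.
      print(delta_D(t_h=2.0, dD0=1800.0, dD_eq=-50.0, k_per_h=1e-3))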

  15. Quantifying white matter structural integrity with high-definition fiber tracking in traumatic brain injury.

    PubMed

    Presson, Nora; Krishnaswamy, Deepa; Wagener, Lauren; Bird, William; Jarbo, Kevin; Pathak, Sudhir; Puccio, Ava M; Borasso, Allison; Benso, Steven; Okonkwo, David O; Schneider, Walter

    2015-03-01

    There is an urgent, unmet demand for definitive biological diagnosis of traumatic brain injury (TBI) to pinpoint the location and extent of damage. We have developed High-Definition Fiber Tracking, a 3 T magnetic resonance imaging-based diffusion spectrum imaging and tractography analysis protocol, to quantify axonal injury in military and civilian TBI patients. A novel analytical methodology quantified white matter integrity in patients with TBI and healthy controls. Forty-one subjects (23 TBI, 18 controls) were scanned with the High-Definition Fiber Tracking diffusion spectrum imaging protocol. After reconstruction, segmentation was used to isolate bilateral hemisphere homologues of eight major tracts. Integrity of segmented tracts was estimated by calculating homologue correlation and tract coverage. Both groups showed high correlations for all tracts. TBI patients showed reduced homologue correlation and tract coverage and an increased outlier count (correlations >2.32 SD below the control mean). On average, 6.5% of tracts in the TBI group were outliers, with substantial variability among patients. The number and summed deviation of outlying tracts correlated with initial Glasgow Coma Scale score and 6-month Glasgow Outcome Scale-Extended score. The correlation metric used here can detect heterogeneous damage affecting a low proportion of tracts, presenting a potential mechanism for advancing TBI diagnosis.

  16. Full Viral Suppression, Low-Level Viremia, and Quantifiable Plasma HIV-RNA at the End of Pregnancy in HIV-Infected Women on Antiretroviral Treatment.

    PubMed

    Baroncelli, Silvia; Pirillo, Maria F; Tamburrini, Enrica; Guaraldi, Giovanni; Pinnetti, Carmela; Degli Antoni, Anna; Galluzzo, Clementina M; Stentarelli, Chiara; Amici, Roberta; Floridia, Marco

    2015-07-01

    There is limited information on full viral suppression and low-level HIV-RNA viremia in HIV-infected women at the end of pregnancy. We investigated HIV-RNA levels close to delivery in women on antiretroviral treatment in order to define rates of complete suppression, low-level viremia, and quantifiable HIV-RNA, exploring as potential determinants some clinical and viroimmunological variables. Plasma samples from a national study in Italy, collected between 2003 and 2012, were used. According to plasma HIV-RNA levels, three groups were defined: full suppression (target not detected), low-level viremia (target detected but <37 copies/ml), and quantifiable HIV-RNA (≥37 copies/ml). Multivariable logistic regression was used to define determinants of full viral suppression and of quantifiable HIV-RNA. Among 107 women evaluated at a median gestational age of 35 weeks, 90 (84.1%) had HIV-RNA <37 copies/ml. Most of them (59/90, 65.6%) had full suppression, with the remaining (31/90, 34.4%) showing low-level viremia (median: 11.9 copies/ml; IQR 7.4-16.3). Among the 17 women with quantifiable viral load, median HIV-RNA was 109 copies/ml (IQR 46-251), with only one case showing resistance (mutation M184V; rate: 9.1%). In multivariable analyses, women with higher baseline HIV-RNA levels and with hepatitis C virus (HCV) coinfection were significantly more likely to have quantifiable HIV-RNA in late pregnancy. Full viral suppression was significantly more likely with nonnucleoside reverse transcriptase inhibitor (NNRTI)-based regimens and significantly less likely with higher HIV-RNA in early pregnancy. No cases of HIV transmission occurred. In conclusion, HIV-infected pregnant women showed a high rate of viral suppression and a low resistance rate before delivery. In most cases no target HIV-RNA was detected in plasma, suggesting a low risk of subsequent virological rebound and development of resistance. Women with high levels of HIV-RNA in early pregnancy and those who have

  17. Basinsoft, a computer program to quantify drainage basin characteristics

    USGS Publications Warehouse

    Harvey, Craig A.; Eash, David A.

    2001-01-01

    In 1988, the USGS began developing a program called Basinsoft. The initial program quantified 16 selected drainage-basin characteristics from three source-data layers that were manually digitized from topographic maps using the versions of ARC/INFO, Fortran programs, and Prime system Command Programming Language (CPL) programs available in 1988 (Majure and Soenksen, 1991). By 1991, Basinsoft was enhanced to quantify 27 selected drainage-basin characteristics from three source-data layers automatically generated from digital elevation model (DEM) data using a set of Fortran programs (Majure and Eash, 1991; Jenson and Dominique, 1988). Due to edge-matching problems encountered in 1991 with the preprocessing

  18. Upscaling nitrogen-mycorrhizal effects to quantify CO2 fertilization.

    NASA Astrophysics Data System (ADS)

    Terrer, C.; Franklin, O.; Kaiser, C.; Vicca, S.; Stocker, B.; Prentice, I. C.; Soudzilovskaia, N.

    2016-12-01

    Terrestrial ecosystems sequester annually about a quarter of anthropogenic carbon dioxide (CO2) emissions. However, it has been proposed that nitrogen (N) availability will limit plants' capacity to absorb increasing quantities of CO2 in the atmosphere. Experiments in which plants are fumigated with elevated CO2 show contrasting results, leaving open the debate of whether the magnitude of the CO2 fertilization effect will be limited by N. By synthesizing data from CO2 experiments through meta-analysis, we found that the magnitude of the CO2 fertilization effect can be explained by the interaction between N availability and the type of mycorrhizal association. Indeed, N availability is the most important driver of the CO2 fertilization effect; however, plants that associate with ectomycorrhizal fungi can overcome N limitations and grow about 30% more under 650 ppm than under 400 ppm of atmospheric CO2. In contrast, plants that associate with arbuscular mycorrhizal fungi show no CO2 fertilization effect under low N availability. Using this framework, we quantified biomass responses to CO2 as a function of the soil parameters that determine N availability for the two mycorrhizal types. Then, by overlaying the distribution of mycorrhizal plants with global projections of the soil parameters that determine N availability, we estimated the amount of extra CO2 that terrestrial plants can sequester in biomass for an increase in CO2, as well as the distribution of the CO2 fertilization effect. This synthesis reconciles contrasting views of the role of N in terrestrial carbon uptake and emphasizes the plant control on N availability through interaction with ectomycorrhizal fungi. Large-scale ecosystem models should account for the influence of nitrogen and mycorrhizae reported here, which will improve representation of the CO2 fertilization effect, critical for projecting ecosystem responses and feedbacks to climate change.
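
    Meta-analyses of CO2-enrichment experiments typically express the growth response as a log response ratio. A minimal sketch (Python; the biomass means are toy numbers chosen to reproduce a roughly 30% effect):

      import numpy as np

      def log_response_ratio(mean_elevated, mean_ambient):
          """lnRR = ln(elevated/ambient); exp(lnRR) - 1 is the fractional response."""
          return np.log(mean_elevated / mean_ambient)

      lnrr = log_response_ratio(13.0, 10.0)   # hypothetical biomass means
      print(f"CO2 fertilization effect: {100 * (np.exp(lnrr) - 1):.0f}%")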

  19. Complexity and approximability of quantified and stochastic constraint satisfaction problems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hunt, H. B.; Stearns, R. L.; Marathe, M. V.

    2001-01-01

    Let D be an arbitrary (not necessarily finite) nonempty set, let C be a finite set of constant symbols denoting arbitrary elements of D, and let S be an arbitrary finite set of finite-arity relations on D. We denote the problem of determining the satisfiability of finite conjunctions of relations in S applied to variables (to variables and symbols in C) by SAT(S) (by SAT_c(S)). Here, we study simultaneously the complexity of, and the existence of efficient approximation algorithms for, a number of variants of the problems SAT(S) and SAT_c(S), for many different D, C, and S. These problem variants include decision and optimization problems for formulas, quantified formulas, and stochastically quantified formulas. We denote these problems by Q-SAT(S), MAX-Q-SAT(S), S-SAT(S), MAX-S-SAT(S), MAX-NSF-Q-SAT(S), and MAX-NSF-S-SAT(S). The main contribution is the development of a unified predictive theory for characterizing the complexity of these problems. Our unified approach is based on two basic concepts: (i) strongly local replacements/reductions and (ii) relational/algebraic representability. Let k ≥ 2, and let S be a finite set of finite-arity relations on Σ_k with the following condition on S: all finite-arity relations on Σ_k can be represented as finite existentially quantified conjunctions of relations in S applied to variables (to variables and constant symbols in C). Then we prove the following new results: (1) The problems SAT(S) and SAT_c(S) are both NQL-complete and ≤_logn^bw-complete for NP. (2) The problems Q-SAT(S) and Q-SAT_c(S) are PSPACE-complete. Letting k = 2, the problems S-SAT(S) and S-SAT_c(S) are PSPACE-complete. (3) There exists ε > 0 for which approximating the problem MAX-Q-SAT(S) to within ε times optimum is PSPACE-hard. Letting k = 2, there exists ε > 0 for which approximating the problem MAX-S-SAT(S) to within ε times optimum is PSPACE-hard.
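
    To make the quantified-satisfiability notion concrete, here is a brute-force evaluator for quantified Boolean formulas, a tiny instance of the Q-SAT decision problem (Python; illustrative only, exponential time by design):

      def eval_qbf(prefix, clauses, assign=()):
          """prefix: list of ('A'|'E', var); clauses: CNF with literals +/-(i+1)
          referring to the (i+1)-th quantified variable."""
          if not prefix:
              return all(any((lit > 0) == assign[abs(lit) - 1] for lit in cl)
                         for cl in clauses)
          q = prefix[0][0]
          branches = (eval_qbf(prefix[1:], clauses, assign + (v,))
                      for v in (False, True))
          return all(branches) if q == 'A' else any(branches)

      # forall x exists y: (x or y) and (not x or not y) -- true, take y = not x
      print(eval_qbf([('A', 0), ('E', 1)], [[1, 2], [-1, -2]]))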

  20. In vivo proton MRS to quantify anesthetic effects of pentobarbital on cerebral metabolism and brain activity in rat.

    PubMed

    Du, Fei; Zhang, Yi; Iltis, Isabelle; Marjanska, Malgorzata; Zhu, Xiao-Hong; Henry, Pierre-Gilles; Chen, Wei

    2009-12-01

    To quantitatively investigate the effects of pentobarbital anesthesia on brain activity, brain metabolite concentrations and cerebral metabolic rate of glucose, in vivo proton MR spectra, and electroencephalography were measured in the rat brain with various doses of pentobarbital. The results show that (1) the resonances attributed to propylene glycol, a solvent in pentobarbital injection solution, can be robustly detected and quantified in the brain; (2) the concentration of most brain metabolites remained constant under the isoelectric state (silent electroencephalography) with a high dose of pentobarbital compared to mild isoflurane anesthesia condition, except for a reduction of 61% in the brain glucose level, which was associated with a 37% decrease in cerebral metabolic rate of glucose, suggesting a significant amount of "housekeeping" energy for maintaining brain cellular integrity under the isoelectric state; and (3) electroencephalography and cerebral metabolic activities were tightly coupled to the pentobarbital anesthesia depth and they can be indirectly quantified by the propylene glycol resonance signal at 1.13 ppm. This study indicates that in vivo proton MR spectroscopy can be used to measure changes in cerebral metabolite concentrations and cerebral metabolic rate of glucose under varied pentobarbital anesthesia states; moreover, the propylene glycol signal provides a sensitive biomarker for quantitatively monitoring these changes and anesthesia depth noninvasively.

  1. Quantifying cadherin mechanotransduction machinery assembly/disassembly dynamics using fluorescence covariance analysis.

    PubMed

    Vedula, Pavan; Cruz, Lissette A; Gutierrez, Natasha; Davis, Justin; Ayee, Brian; Abramczyk, Rachel; Rodriguez, Alexis J

    2016-06-30

    Quantifying multi-molecular complex assembly in specific cytoplasmic compartments is crucial to understand how cells use assembly/disassembly of these complexes to control function. Currently, biophysical methods like Fluorescence Resonance Energy Transfer and Fluorescence Correlation Spectroscopy provide quantitative measurements of direct protein-protein interactions, while traditional biochemical approaches such as sub-cellular fractionation and immunoprecipitation remain the main approaches used to study multi-protein complex assembly/disassembly dynamics. In this article, we validate and quantify multi-protein adherens junction complex assembly in situ using light microscopy and Fluorescence Covariance Analysis. Utilizing specific fluorescently-labeled protein pairs, we quantified various stages of assembly of the adherens junction complex, the multiprotein complex that regulates epithelial tissue structure and function, following de novo cell-cell contact. We demonstrate: minimal cadherin-catenin complex assembly in the perinuclear cytoplasm and subsequent localization to the cell-cell contact zone, assembly of adherens junction complexes, acto-myosin tension-mediated anchoring, and adherens junction maturation following de novo cell-cell contact. Finally, applying Fluorescence Covariance Analysis in live cells expressing fluorescently tagged adherens junction complex proteins, we also quantified adherens junction complex assembly dynamics during epithelial monolayer formation.
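
    At its core, this kind of covariance measurement compares pixel intensities of two labeled channels within a chosen compartment. A minimal sketch (Python/NumPy; random arrays stand in for the two fluorescence channels and the compartment mask):

      import numpy as np

      rng = np.random.default_rng(1)
      ch1 = rng.random((256, 256))                      # e.g., cadherin channel (toy)
      ch2 = 0.6 * ch1 + 0.4 * rng.random((256, 256))    # e.g., catenin channel (toy)
      mask = np.zeros_like(ch1, dtype=bool)
      mask[64:192, 64:192] = True                       # hypothetical compartment

      cov = np.cov(ch1[mask], ch2[mask])[0, 1]
      r = np.corrcoef(ch1[mask], ch2[mask])[0, 1]
      print(f"covariance = {cov:.4f}, Pearson r = {r:.2f}")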

  2. Quantifying the effects of land use and climate on Holocene vegetation in Europe

    NASA Astrophysics Data System (ADS)

    Marquer, Laurent; Gaillard, Marie-José; Sugita, Shinya; Poska, Anneli; Trondman, Anna-Kari; Mazier, Florence; Nielsen, Anne Birgitte; Fyfe, Ralph M.; Jönsson, Anna Maria; Smith, Benjamin; Kaplan, Jed O.; Alenius, Teija; Birks, H. John B.; Bjune, Anne E.; Christiansen, Jörg; Dodson, John; Edwards, Kevin J.; Giesecke, Thomas; Herzschuh, Ulrike; Kangur, Mihkel; Koff, Tiiu; Latałowa, Małgorzata; Lechterbeck, Jutta; Olofsson, Jörgen; Seppä, Heikki

    2017-09-01

    Early agriculture can be detected in palaeovegetation records, but quantification of the relative importance of climate and land use in influencing regional vegetation composition since the onset of agriculture is a topic that is rarely addressed. We present a novel approach that combines pollen-based REVEALS estimates of plant cover with climate, anthropogenic land-cover and dynamic vegetation modelling results. This is used to quantify the relative impacts of land use and climate on Holocene vegetation at a sub-continental scale, i.e. northern and western Europe north of the Alps. We use redundancy analysis and variation partitioning to quantify the percentage of variation in vegetation composition explained by the climate and land-use variables, and Monte Carlo permutation tests to assess the statistical significance of each variable. We further use a similarity index to combine pollen-based REVEALS estimates with climate-driven dynamic vegetation modelling results. The overall results indicate that climate is the major driver of vegetation when the Holocene is considered as a whole and at the sub-continental scale, although land use is important regionally. Four critical phases of land-use effects on vegetation are identified. The first phase (from 7000 to 6500 BP) corresponds to the early impacts on vegetation of farming and Neolithic forest clearance and to the dominance of climate as a driver of vegetation change. During the second phase (from 4500 to 4000 BP), land use becomes a major control of vegetation. Climate is still the principal driver, although its influence decreases gradually. The third phase (from 2000 to 1500 BP) is characterised by the continued role of climate on vegetation as a consequence of late-Holocene climate shifts and specific climate events that influence vegetation as well as land use. The last phase (from 500 to 350 BP) shows an acceleration of vegetation changes, in particular during the last century, caused by new farming

  3. Where does boreal stream DOC come from? - Quantifying the contribution from different landscape compartments using stable C isotope ratios.

    NASA Astrophysics Data System (ADS)

    Brink Bylund, J.; Bastviken, D.; Morth, C.; Laudon, H.; Giesler, R.; Buffam, I.

    2007-12-01

    Stable carbon isotope (δ13C) ratios are frequently used as a source tracer of, e.g., organic matter (OM) produced in terrestrial versus aquatic environments. To our knowledge, there has been no previous attempt to quantify the relative contribution of dissolved organic carbon (DOC) from various landscape compartments in catchments of different sizes. Here, we test to what extent δ13C values can also be used to quantify the relative contributions of DOC from wetlands/riparian zones along streams and from off-stream forest habitats, respectively. We present data on the spatial and temporal variability of DOC concentrations and δ13C-DOC values during 2005 in the Krycklan catchment, a boreal stream network in northern Sweden. Ten stream sites, ranging from order 1 to 4, were monitored in subcatchments with different wetland coverage. Spatial variation in DOC concentration showed a weak but statistically significant relationship with wetland area, with higher concentrations with increasing percentage of wetland in the drainage area. During base flow, the δ13C-DOC values differed significantly between forest (-27.5‰) and wetland (-28.1‰) sites. This spatial pattern disappears during spring peak flow, when higher discharge flushes DOC from the upper soil layers and the riparian zone in the catchments. A simple mixing model using DOC and δ13C-DOC showed that stream-water DOC could be described as a mixture of DOC from forest (deep) groundwater and wetland/riparian-zone water. The results indicate that during spring peak flow almost all stream DOC (84-100%) is derived from wetlands and riparian zones. The wetland/riparian water dominates the stream DOC flux at all hydrological events, except at two sites, one forest-dominated and one mixed catchment, where the forest groundwater dominated the DOC transport during base flow. Although the total wetland area in the Krycklan catchment represents only 8.3%, it contributed, together with riparian zones, as much as 83
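
    The mixing calculation reduces to a two-end-member balance. A minimal sketch (Python; the end-member δ13C values are the base-flow numbers quoted above, and the stream sample value is hypothetical):

      def wetland_fraction(d13C_stream, d13C_forest=-27.5, d13C_wetland=-28.1):
          """Two-end-member mixing: f = (stream - forest) / (wetland - forest)."""
          return (d13C_stream - d13C_forest) / (d13C_wetland - d13C_forest)

      print(wetland_fraction(-28.0))   # hypothetical stream sample -> ~0.83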

  4. Quantifying evapotranspiration from urban green roofs: a comparison of chamber measurements with commonly used predictive methods.

    PubMed

    Marasco, Daniel E; Hunter, Betsy N; Culligan, Patricia J; Gaffin, Stuart R; McGillis, Wade R

    2014-09-02

    Quantifying green roof evapotranspiration (ET) in urban climates is important for assessing environmental benefits, including stormwater runoff attenuation and urban heat island mitigation. In this study, a dynamic chamber method was developed to quantify ET on two extensive green roofs located in New York City, NY. Hourly chamber measurements taken from July 2009 to December 2009 and April 2012 to October 2013 illustrate both diurnal and seasonal variations in ET. Observed monthly total ET depth ranged from 0.22 cm in winter to 15.36 cm in summer. Chamber results were compared to two predictive methods for estimating ET: the Penman-based ASCE Standardized Reference Evapotranspiration (ASCE RET) equation and an energy balance model, both parametrized using on-site environmental conditions. Dynamic chamber ET results were similar to ASCE RET estimates; however, the ASCE RET equation overestimated minimum ET values during the winter months and underestimated peak ET values during the summer months. The energy balance method was shown to underestimate ET compared to the ASCE RET equation. The work highlights the utility of the chamber method for quantifying green roof evapotranspiration and indicates that green roof ET might be better estimated by Penman-based evapotranspiration equations than by energy balance methods.
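
    For reference, the ASCE standardized reference ET equation has the familiar Penman-Monteith form. A minimal sketch (Python; daily short-reference constants Cn = 900 and Cd = 0.34; the input values are hypothetical):

      def asce_ret(delta, Rn, G, gamma, T, u2, es, ea, Cn=900.0, Cd=0.34):
          """ASCE Standardized Reference ET (mm/day). delta, gamma in kPa/degC;
          Rn, G in MJ/m2/day; T in degC; u2 in m/s; es, ea in kPa."""
          num = 0.408 * delta * (Rn - G) + gamma * (Cn / (T + 273.0)) * u2 * (es - ea)
          return num / (delta + gamma * (1.0 + Cd * u2))

      print(asce_ret(delta=0.145, Rn=15.0, G=1.0, gamma=0.067,
                     T=20.0, u2=2.0, es=2.34, ea=1.40))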

  5. Quantifying learning in biotracer studies.

    PubMed

    Brown, Christopher J; Brett, Michael T; Adame, Maria Fernanda; Stewart-Koster, Ben; Bunn, Stuart E

    2018-04-12

    Mixing models have become requisite tools for analyzing biotracer data, most commonly stable isotope ratios, to infer dietary contributions of multiple sources to a consumer. However, Bayesian mixing models will always return a result that defaults to their priors if the data poorly resolve the source contributions, and thus, their interpretation requires caution. We describe an application of information theory to quantify how much has been learned about a consumer's diet from new biotracer data. We apply the approach to two example data sets. We find that variation in the isotope ratios of sources limits the precision of estimates for the consumer's diet, even with a large number of consumer samples. Thus, the approach which we describe is a type of power analysis that uses a priori simulations to find an optimal sample size. Biotracer data are fundamentally limited in their ability to discriminate consumer diets. We suggest that other types of data, such as gut content analysis, must be used as prior information in model fitting, to improve model learning about the consumer's diet. Information theory may also be used to identify optimal sampling protocols in situations where sampling of consumers is limited due to expense or ethical concerns.
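
    The amount learned can be expressed as the Kullback-Leibler divergence from the prior to the posterior of a diet proportion. A minimal sketch (Python/SciPy; Beta distributions stand in for the prior and the fitted posterior):

      import numpy as np
      from scipy.stats import beta

      p = np.linspace(1e-6, 1 - 1e-6, 10000)
      prior = beta(1, 1).pdf(p)     # uninformative prior on a source's contribution
      post = beta(8, 4).pdf(p)      # hypothetical posterior after fitting the model
      kl = np.sum(post * np.log(post / prior)) * (p[1] - p[0])
      print(f"information gained: {kl:.2f} nats")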

  6. A calibrated Monte Carlo approach to quantify the impacts of misorientation and different driving forces on texture development

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liangzhe Zhang; Anthony D. Rollett; Timothy Bartel

    2012-02-01

    A calibrated Monte Carlo (cMC) approach, which quantifies grain boundary kinetics within a generic setting, is presented. The influence of misorientation is captured by adding a scaling coefficient in the spin flipping probability equation, while the contribution of different driving forces is weighted using a partition function. The calibration process relies on the established parametric links between Monte Carlo (MC) and sharp-interface models. The cMC algorithm quantifies microstructural evolution under complex thermomechanical environments and remedies some of the difficulties associated with conventional MC models. After validation, the cMC approach is applied to quantify the texture development of polycrystalline materials with influences of misorientation and inhomogeneous bulk energy across grain boundaries. The results are in good agreement with theory and experiments.
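
    A minimal sketch (Python; the scaling coefficient and energies are placeholders) of a Potts-style spin flip in which a misorientation-dependent factor scales the usual Metropolis acceptance probability, as described above:

      import numpy as np

      def flip_probability(dE, kT, scale):
          """Acceptance probability: a scaling coefficient (e.g., a reduced boundary
          mobility for low-angle misorientations) times the Metropolis factor."""
          return scale * (1.0 if dE <= 0.0 else np.exp(-dE / kT))

      rng = np.random.default_rng(2)
      p = flip_probability(dE=0.3, kT=1.0, scale=0.7)   # scale ~ mobility(misorientation)
      print(p, rng.random() < p)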

  7. Validation of a new device to quantify groundwater-surface water exchange

    NASA Astrophysics Data System (ADS)

    Cremeans, Mackenzie M.; Devlin, J. F.

    2017-11-01

    Distributions of flow across the groundwater-surface water interface should be expected to be as complex as the geologic deposits associated with stream or lake beds and their underlying aquifers. In these environments, the conventional Darcy-based method of characterizing flow systems (near streams) has significant limitations, including reliance on parameters with high uncertainties (e.g., hydraulic conductivity), the common use of drilled wells in the case of streambank investigations, and potentially lengthy measurement times for aquifer characterization and water level measurements. Less logistically demanding tools for quantifying exchanges across streambeds have been developed and include drive-point mini-piezometers, seepage meters, and temperature profiling tools. This project adds to that toolbox by introducing the Streambed Point Velocity Probe (SBPVP), a reusable tool designed to quantify groundwater-surface water interactions (GWSWI) at the interface with high density sampling, which can effectively, rapidly, and accurately complement conventional methods. The SBPVP is a direct push device that measures in situ water velocities at the GWSWI with a small-scale tracer test on the probe surface. Tracer tests do not rely on hydraulic conductivity or gradient information, nor do they require long equilibration times. Laboratory testing indicated that the SBPVP has an average accuracy of ± 3% and an average precision of ± 2%. Preliminary field testing, conducted in the Grindsted Å in Jutland, Denmark, yielded promising agreement between groundwater fluxes determined by conventional methods and those estimated from the SBPVP tests executed at similar scales. These results suggest the SBPVP is a viable tool to quantify groundwater-surface water interactions in high definition in sandy streambeds.

  8. TU-EF-304-09: Quantifying the Biological Effects of Therapeutic Protons by LET Spectrum Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Guan, F; Bronk, L; Kerr, M

    2015-06-15

    Purpose: To correlate in vitro cell kill with linear energy transfer (LET) spectra using Monte Carlo simulations and knowledge obtained from previous high-throughput in vitro proton relative biological effectiveness (RBE) measurements. Methods: The Monte Carlo simulation toolkit Geant4 was used to design the experimental setups and perform the dose, dose-averaged LET, and LET spectra calculations. The clonogenic assay was performed using the H460 lung cancer cell line in standard 6-well plates. Using two different experimental setups, the same dose and dose-averaged LET (12.6 keV/µm) was delivered to the cell layer; however, each respective energy or LET spectrum was different. We quantified the dose contributions from high-LET (≥10 keV/µm, a threshold determined by previous RBE measurements) events in the LET spectra separately for these two setups as 39% and 53%. Eight dose levels in 1-Gy increments were delivered. The photon reference irradiation was performed using 6 MV x-rays from a LINAC. Results: The survival curves showed that both proton irradiations demonstrated an increased RBE compared to the reference photon irradiation. Within the proton-irradiated cells, the setup with a 53% dose contribution from high-LET events exhibited the higher biological effectiveness. Conclusion: The experimental results indicate that the dose-averaged LET may not be an appropriate indicator to quantify the biological effects of protons when the LET spectrum is broad enough to contain both low- and high-LET events. Incorporating the LET spectrum distribution into robust intensity-modulated proton therapy optimization planning may provide a more accurate biological dose distribution than using the dose-averaged LET. NIH Program Project Grant 2U19CA021239-35.
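
    A minimal sketch (Python/NumPy; the spectrum is a toy four-bin example) of the two quantities contrasted above, the dose-averaged LET and the dose fraction delivered by high-LET (>=10 keV/um) events:

      import numpy as np

      LET = np.array([2.0, 5.0, 12.0, 20.0])     # LET bin centers (keV/um), toy values
      dose = np.array([0.20, 0.27, 0.33, 0.20])  # dose in each bin (Gy), toy values

      let_d = np.sum(dose * LET) / np.sum(dose)           # dose-averaged LET
      high_frac = dose[LET >= 10.0].sum() / dose.sum()    # high-LET dose fraction
      print(f"LET_d = {let_d:.1f} keV/um, high-LET dose fraction = {high_frac:.0%}")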

  9. Computational Approaches Toward Integrating Quantified Self Sensing and Social Media

    PubMed Central

    De Choudhury, Munmun; Kumar, Mrinal; Weber, Ingmar

    2017-01-01

    The growing amount of data collected by quantified self tools and social media holds great potential for applications in personalized medicine. Whereas the former includes health-related physiological signals, the latter provides insights into a user's behavior. However, the two sources of data have largely been studied in isolation. We analyze public data from users who have chosen to connect their MyFitnessPal and Twitter accounts. We show that a user's diet compliance success, measured via their self-logged food diaries, can be predicted using features derived from social media: linguistic, activity, and social capital. We find that users with more positive affect and a larger social network are more successful in their dietary goals. Using a Granger causality methodology, we also show that social media can help predict daily changes in diet compliance success or failure with an accuracy of 77%, which improves over baseline techniques by 17%. We discuss the implications of our work in the design of improved health interventions for behavior change. PMID:28840199

  10. Use of a vision model to quantify the significance of factors affecting target conspicuity

    NASA Astrophysics Data System (ADS)

    Gilmore, M. A.; Jones, C. K.; Haynes, A. W.; Tolhurst, D. J.; To, M.; Troscianko, T.; Lovell, P. G.; Parraga, C. A.; Pickavance, K.

    2006-05-01

    When designing camouflage it is important to understand how the human visual system processes the information to discriminate the target from the background scene. A vision model has been developed to compare two images and detect differences in local contrast in each spatial frequency channel. Observer experiments are being undertaken to validate this vision model so that the model can be used to quantify the relative significance of different factors affecting target conspicuity. Synthetic imagery can be used to design improved camouflage systems. The vision model is being used to compare different synthetic images to understand what features in the image are important to reproduce accurately and to identify the optimum way to render synthetic imagery for camouflage effectiveness assessment. This paper will describe the vision model and summarise the results obtained from the initial validation tests. The paper will also show how the model is being used to compare different synthetic images and discuss future work plans.

  11. Quantifying polypeptide conformational space: sensitivity to conformation and ensemble definition.

    PubMed

    Sullivan, David C; Lim, Carmay

    2006-08-24

    Quantifying the density of conformations over phase space (the conformational distribution) is needed to model important macromolecular processes such as protein folding. In this work, we quantify the conformational distribution for a simple polypeptide (N-mer polyalanine) using the cumulative distribution function (CDF), which gives the probability that two randomly selected conformations are separated by less than a "conformational" distance and whose inverse gives conformation counts as a function of conformational radius. An important finding is that the conformation counts obtained by the CDF inverse depend critically on the assignment of a conformation's distance span and the ensemble (e.g., unfolded state model): varying ensemble and conformation definition (1 → 2 Å) varies the CDF-based conformation counts for Ala50 from 10^11 to 10^69. In particular, relatively short molecular dynamics (MD) relaxation of Ala50's random-walk ensemble reduces the number of conformers from 10^55 to 10^14 (using a 1 Å root-mean-square-deviation radius conformation definition), pointing to potential disconnections in comparing the results from simplified models of unfolded proteins with those from all-atom MD simulations. Explicit waters are found to roughen the landscape considerably. Under some common conformation definitions, the results herein provide (i) an upper limit to the number of accessible conformations that compose unfolded states of proteins, (ii) the optimal clustering radius/conformation radius for counting conformations for a given energy and solvent model, (iii) a means of comparing various studies, and (iv) an assessment of the applicability of random search in protein folding.
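
    A rough sketch (Python/SciPy; random vectors stand in for conformations and Euclidean distance for RMSD) of the CDF-based counting idea: if conformers occupy roughly equal cells of the chosen radius, the probability that two random conformations fall within the cutoff is about one over the number of cells:

      import numpy as np
      from scipy.spatial.distance import pdist

      rng = np.random.default_rng(3)
      ensemble = rng.standard_normal((500, 30))   # 500 toy "conformations"
      dists = pdist(ensemble)                     # all pairwise distances

      cutoff = 6.0                                # "same conformation" distance span
      cdf = np.mean(dists < cutoff)               # P(two random conformers match)
      print("estimated conformation count ~", 1.0 / cdf)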

  12. The Urban Forest Effects (UFORE) model: quantifying urban forest structure and functions

    Treesearch

    David J. Nowak; Daniel E. Crane

    2000-01-01

    The Urban Forest Effects (UFORE) computer model was developed to help managers and researchers quantify urban forest structure and functions. The model quantifies species composition and diversity, diameter distribution, tree density and health, leaf area, leaf biomass, and other structural characteristics; hourly volatile organic compound emissions (emissions that...

  13. Quantifying Antimicrobial Resistance at Veal Calf Farms

    PubMed Central

    Bosman, Angela B.; Wagenaar, Jaap; Stegeman, Arjan; Vernooij, Hans; Mevius, Dik

    2012-01-01

    This study was performed to determine a sampling strategy to quantify the prevalence of antimicrobial resistance on veal calf farms, based on the variation in antimicrobial resistance within and between calves on five farms. Faecal samples from 50 healthy calves (10 calves/farm) were collected. From each individual sample and one pooled faecal sample per farm, 90 selected Escherichia coli isolates were tested for their resistance against 25 mg/L amoxicillin, 25 mg/L tetracycline, 0.5 mg/L cefotaxime, 0.125 mg/L ciprofloxacin and 8/152 mg/L trimethoprim/sulfamethoxazole (tmp/s) by replica plating. From each faecal sample another 10 selected E. coli isolates were tested for their resistance by broth microdilution as a reference. Logistic regression analysis was performed to compare the odds of testing an isolate resistant between both test methods (replica plating vs. broth microdilution) and to evaluate the effect of pooling faecal samples. Bootstrap analysis was used to investigate the precision of the estimated prevalence of resistance to each antimicrobial obtained by several simulated sampling strategies. Replica plating showed similar odds of E. coli isolates tested resistant compared to broth microdilution, except for ciprofloxacin (OR 0.29, p≤0.05). Pooled samples showed in general lower odds of an isolate being resistant compared to individual samples, although these differences were not significant. Bootstrap analysis showed that within each antimicrobial the various compositions of a pooled sample provided consistent estimates for the mean proportion of resistant isolates. Sampling strategies should be based on the variation in resistance among isolates within faecal samples and between faecal samples, which may vary by antimicrobial. In our study, the optimal sampling strategy from the perspective of precision of the estimated levels of resistance and practicality consists of a pooled faecal sample from 20 individual animals, of which 90 isolates are
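
    A minimal sketch (Python/NumPy; resistance calls are simulated) of the bootstrap used to gauge the precision of a prevalence estimate under a given sampling strategy, here resampling animals with replacement:

      import numpy as np

      rng = np.random.default_rng(4)
      # 20 animals x 90 isolates each; 1 = resistant (simulated at 30% prevalence)
      isolates = rng.random((20, 90)) < 0.30

      boot = [isolates[rng.integers(0, 20, size=20)].mean() for _ in range(2000)]
      lo, hi = np.percentile(boot, [2.5, 97.5])
      print(f"prevalence {isolates.mean():.2f} (95% bootstrap CI {lo:.2f}-{hi:.2f})")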

  14. Quantifying Safety Performance of Driveways on State Highways

    DOT National Transportation Integrated Search

    2012-08-01

    This report documents a research effort to quantify the safety performance of driveways in the State of Oregon. In : particular, this research effort focuses on driveways located adjacent to principal arterial state highways with urban or : rural des...

  15. Methods for detecting, quantifying, and adjusting for dissemination bias in meta-analysis are described.

    PubMed

    Mueller, Katharina Felicitas; Meerpohl, Joerg J; Briel, Matthias; Antes, Gerd; von Elm, Erik; Lang, Britta; Motschall, Edith; Schwarzer, Guido; Bassler, Dirk

    2016-12-01

    To systematically review methodological articles which focus on nonpublication of studies and to describe methods of detecting and/or quantifying and/or adjusting for dissemination bias in meta-analyses. To evaluate whether the methods have been applied to an empirical data set for which one can be reasonably confident that all studies conducted have been included. We systematically searched Medline, the Cochrane Library, and Web of Science for methodological articles that describe at least one method of detecting and/or quantifying and/or adjusting for dissemination bias in meta-analyses. The literature search retrieved 2,224 records, of which we finally included 150 full-text articles. A great variety of methods to detect, quantify, or adjust for dissemination bias were described. Methods included graphical methods mainly based on funnel plot approaches, statistical methods, such as regression tests, selection models, sensitivity analyses, and a great number of more recent statistical approaches. Only a few methods have been validated in empirical evaluations using unpublished studies obtained from regulators (Food and Drug Administration, European Medicines Agency). We present an overview of existing methods to detect, quantify, or adjust for dissemination bias. It remains difficult to advise which method should be used, as they are all limited and their validity has rarely been assessed. Therefore, a thorough literature search remains crucial in systematic reviews, and further steps to increase the availability of all research results need to be taken.
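
    As one concrete example from the funnel-plot family of methods, Egger's regression test regresses the standardized effect on precision; an intercept far from zero signals small-study asymmetry. A sketch (Python/NumPy; the study effects are simulated):

      import numpy as np

      rng = np.random.default_rng(5)
      se = rng.uniform(0.05, 0.5, 40)        # standard errors of 40 studies
      effect = 0.2 + rng.normal(0.0, se)     # simulated effects, no true bias

      X = np.column_stack([np.ones_like(se), 1.0 / se])   # intercept + precision
      coef, *_ = np.linalg.lstsq(X, effect / se, rcond=None)
      print("Egger intercept (asymmetry estimate):", round(coef[0], 3))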

  16. Quantifying temporal change in biodiversity: challenges and opportunities

    PubMed Central

    Dornelas, Maria; Magurran, Anne E.; Buckland, Stephen T.; Chao, Anne; Chazdon, Robin L.; Colwell, Robert K.; Curtis, Tom; Gaston, Kevin J.; Gotelli, Nicholas J.; Kosnik, Matthew A.; McGill, Brian; McCune, Jenny L.; Morlon, Hélène; Mumby, Peter J.; Øvreås, Lise; Studeny, Angelika; Vellend, Mark

    2013-01-01

    Growing concern about biodiversity loss underscores the need to quantify and understand temporal change. Here, we review the opportunities presented by biodiversity time series, and address three related issues: (i) recognizing the characteristics of temporal data; (ii) selecting appropriate statistical procedures for analysing temporal data; and (iii) inferring and forecasting biodiversity change. With regard to the first issue, we draw attention to defining characteristics of biodiversity time series—lack of physical boundaries, uni-dimensionality, autocorrelation and directionality—that inform the choice of analytic methods. Second, we explore methods of quantifying change in biodiversity at different timescales, noting that autocorrelation can be viewed as a feature that sheds light on the underlying structure of temporal change. Finally, we address the transition from inferring to forecasting biodiversity change, highlighting potential pitfalls associated with phase-shifts and novel conditions. PMID:23097514

  17. A Bayesian model for quantifying the change in mortality associated with future ozone exposures under climate change.

    PubMed

    Alexeeff, Stacey E; Pfister, Gabriele G; Nychka, Doug

    2016-03-01

    Climate change is expected to have many impacts on the environment, including changes in ozone concentrations at the surface level. A key public health concern is the potential increase in ozone-related summertime mortality if surface ozone concentrations rise in response to climate change. Although ozone formation depends partly on summertime weather, which exhibits considerable inter-annual variability, previous health impact studies have not incorporated the variability of ozone into their prediction models. A major source of uncertainty in the health impacts is the variability of the modeled ozone concentrations. We propose a Bayesian model and Monte Carlo estimation method for quantifying health effects of future ozone. An advantage of this approach is that we include the uncertainty in both the health effect association and the modeled ozone concentrations. Using our proposed approach, we quantify the expected change in ozone-related summertime mortality in the contiguous United States between 2000 and 2050 under a changing climate. The mortality estimates show regional patterns in the expected degree of impact. We also illustrate the results when using a common technique in previous work that averages ozone to reduce the size of the data, and contrast these findings with our own. Our analysis yields more realistic inferences, providing clearer interpretation for decision making regarding the impacts of climate change. © 2015, The International Biometric Society.
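
    A minimal sketch (Python/NumPy; all inputs are hypothetical) of the Monte Carlo propagation of both uncertainty sources, the health-effect coefficient and the modeled ozone change, through a standard log-linear concentration-response function:

      import numpy as np

      rng = np.random.default_rng(6)
      n = 10_000
      beta = rng.normal(5e-4, 1e-4, n)      # per-ppb log-mortality coefficient (toy)
      d_ozone = rng.normal(5.0, 2.0, n)     # modeled change in summer ozone (ppb, toy)

      baseline_deaths = 100_000.0           # hypothetical summertime baseline
      excess = baseline_deaths * (1.0 - np.exp(-beta * d_ozone))
      print(np.percentile(excess, [2.5, 50, 97.5]))   # uncertainty interval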

  18. Estimating the irreversible pressure drop across a stenosis by quantifying turbulence production using 4D Flow MRI

    PubMed Central

    Ha, Hojin; Lantz, Jonas; Ziegler, Magnus; Casas, Belen; Karlsson, Matts; Dyverfeldt, Petter; Ebbers, Tino

    2017-01-01

    The pressure drop across a stenotic vessel is an important parameter in medicine, providing a commonly used and intuitive metric for evaluating the severity of the stenosis. However, non-invasive estimation of the pressure drop under pathological conditions has remained difficult. This study demonstrates a novel method to quantify the irreversible pressure drop across a stenosis using 4D Flow MRI by calculating the total turbulence production of the flow. Simulated MRI acquisitions showed that the energy lost to turbulence production can be accurately quantified with 4D Flow MRI within a range of practical spatial resolutions (1–3 mm; regression slope = 0.91, R2 = 0.96). The quantification of the turbulence production was not substantially influenced by the signal-to-noise ratio (SNR), resulting in less than 2% mean bias at SNR > 10. Pressure drop estimation based on turbulence production robustly predicted the irreversible pressure drop, regardless of the stenosis severity and post-stenosis dilatation (regression slope = 0.956, R2 = 0.96). In vitro validation of the technique in a 75% stenosis channel confirmed that the pressure drop predicted from turbulence production agreed with the measured pressure drop (regression slope = 1.15, R2 = 0.999, Bland-Altman agreement = 0.75 ± 3.93 mmHg). PMID:28425452

  19. A semi-automatic technique to quantify complex tuberculous lung lesions on 18F-fluorodeoxyglucose positron emission tomography/computerised tomography images.

    PubMed

    Malherbe, Stephanus T; Dupont, Patrick; Kant, Ilse; Ahlers, Petri; Kriel, Magdalena; Loxton, André G; Chen, Ray Y; Via, Laura E; Thienemann, Friedrich; Wilkinson, Robert J; Barry, Clifton E; Griffith-Richards, Stephanie; Ellman, Annare; Ronacher, Katharina; Winter, Jill; Walzl, Gerhard; Warwick, James M

    2018-06-25

    There is a growing interest in the use of 18F-FDG PET-CT to monitor tuberculosis (TB) treatment response. However, TB causes complex and widespread pathology, which is challenging to segment and quantify in a reproducible manner. To address this, we developed a technique to standardise uptake (Z-score), segment, and quantify tuberculous lung lesions on PET and CT concurrently, in order to track changes over time. We used open-source tools and created a MATLAB script. The technique was optimised on a training set of five pulmonary tuberculosis (PTB) cases after standard TB therapy and 15 control patients with lesion-free lungs. We compared the proposed method to a fixed threshold (SUV > 1) and to manual segmentation by two readers, and piloted the technique successfully on scans of five control patients and five PTB cases (four cured and one failed treatment case) at diagnosis and after 1 and 6 months of treatment. There was a better correlation between the Z-score-based segmentation and manual segmentation than between SUV > 1 and manual segmentation in terms of overall spatial overlap (measured by the Dice similarity coefficient) and specificity (1 minus the false-positive volume fraction). However, SUV > 1 segmentation appeared more sensitive. Both the Z-score and SUV > 1 showed very low variability when measuring change over time. In addition, total glycolytic activity, calculated using segmentation by Z-score and lesion-to-background ratio, correlated well with traditional total glycolytic activity calculations. The technique quantified various PET and CT parameters, including the total glycolytic activity index, metabolic lesion volume, lesion volumes at different CT densities, and combined PET and CT parameters. The quantified metrics showed a marked decrease in the cured cases, with changes already apparent at month one, but remained largely unchanged in the failed treatment case. Our technique is promising to segment and quantify the lung scans of pulmonary
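
    A minimal sketch (Python/NumPy; the arrays stand in for SUV volumes, and the cutoff is illustrative since the paper's threshold is not restated here) of the Z-score standardisation and segmentation step:

      import numpy as np

      rng = np.random.default_rng(7)
      suv = rng.lognormal(-0.5, 0.4, size=(64, 64, 32))   # patient SUV volume (toy)
      ref_mu, ref_sd = 0.6, 0.15    # healthy-lung reference from controls (toy)

      z = (suv - ref_mu) / ref_sd   # standardised uptake (Z-score)
      lesion_mask = z > 3.0         # illustrative cutoff
      print("lesion voxels:", int(lesion_mask.sum()))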

  20. Quantifying Unnecessary Normal Tissue Complication Risks due to Suboptimal Planning: A Secondary Study of RTOG 0126.

    PubMed

    Moore, Kevin L; Schmidt, Rachel; Moiseenko, Vitali; Olsen, Lindsey A; Tan, Jun; Xiao, Ying; Galvin, James; Pugh, Stephanie; Seider, Michael J; Dicker, Adam P; Bosch, Walter; Michalski, Jeff; Mutic, Sasa

    2015-06-01

    The purpose of this study was to quantify the frequency and clinical severity of quality deficiencies in intensity modulated radiation therapy (IMRT) planning in the Radiation Therapy Oncology Group 0126 protocol. A total of 219 IMRT patients from the high-dose arm (79.2 Gy) of RTOG 0126 were analyzed. To quantify plan quality, we used established knowledge-based methods for patient-specific dose-volume histogram (DVH) prediction of organs at risk and a Lyman-Kutcher-Burman (LKB) model for grade ≥2 rectal complications to convert DVHs into normal tissue complication probabilities (NTCPs). The LKB model was validated by fitting dose-response parameters relative to observed toxicities. The 22 of 219 plans (top 10%) with the lowest excess risk (difference between clinical and model-predicted NTCP) were used to create a model for the presumed best practices in the protocol (pDVH0126,top10%). Applying the resultant model to the entire sample enabled comparisons between the DVHs that patients could have received and the DVHs they actually received. Excess risk quantified the clinical impact of suboptimal planning. The accuracy of pDVH predictions was validated by replanning 30 of 219 patients (13.7%), including equal numbers of presumed high-quality, low-quality, and randomly sampled plans. NTCP-predicted toxicities were compared to adverse events on protocol. Existing models showed that bladder-sparing variations were less prevalent than rectum quality variations and that increased rectal sparing was not correlated with target metrics (dose received by 98% and 2% of the PTV, respectively). Observed toxicities were consistent with current LKB parameters. Converting DVH and pDVH0126,top10% to rectal NTCPs, we observed 94 of 219 patients (42.9%) with ≥5% excess risk, 20 of 219 patients (9.1%) with ≥10% excess risk, and 2 of 219 patients (0.9%) with ≥15% excess risk. Replanning demonstrated the predicted NTCP reductions while maintaining the volume of the PTV
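
    For reference, the LKB model maps a DVH to an NTCP through the generalized equivalent uniform dose. A sketch (Python/SciPy; the parameter values and the DVH are hypothetical, not the RTOG 0126 fits):

      import numpy as np
      from scipy.stats import norm

      def lkb_ntcp(dose, vol_frac, n=0.09, m=0.13, td50=76.9):
          """gEUD = (sum v_i * D_i**(1/n))**n; NTCP = Phi((gEUD - TD50)/(m*TD50))."""
          geud = np.sum(vol_frac * dose ** (1.0 / n)) ** n
          return norm.cdf((geud - td50) / (m * td50))

      dose = np.array([20.0, 50.0, 70.0, 79.2])   # rectal DVH dose bins (Gy, toy)
      vol = np.array([0.4, 0.3, 0.2, 0.1])        # fractional volumes (sum to 1)
      print(f"NTCP = {lkb_ntcp(dose, vol):.1%}")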

  1. Quantifying aquatic invasion patterns through space and time

    EPA Science Inventory

    The objective of my study was to quantify the apparent spatio-temporal relationship between anthropogenic introduction pathway intensity and non-native aquatic species presence throughout the Laurentian Great Lakes. Non-native aquatic species early detection programs are based pr...

  2. Mental workload during n-back task-quantified in the prefrontal cortex using fNIRS.

    PubMed

    Herff, Christian; Heger, Dominic; Fortmann, Ole; Hennrich, Johannes; Putze, Felix; Schultz, Tanja

    2013-01-01

    When interacting with technical systems, users experience mental workload. Particularly in multitasking scenarios (e.g., interacting with the car navigation system while driving), it is desirable not to distract users from their primary task. For such purposes, human-machine interfaces (HCIs) are desirable which continuously monitor the users' workload and dynamically adapt the behavior of the interface to the measured workload. While memory tasks have been shown to elicit hemodynamic responses in the brain when averaging over multiple trials, robust single-trial classification is a crucial prerequisite for dynamically adapting HCIs to the workload of the user. The prefrontal cortex (PFC) plays an important role in the processing of memory and the associated workload. In this study of 10 subjects, we used functional Near-Infrared Spectroscopy (fNIRS), a non-invasive imaging modality, to sample workload activity in the PFC. The results show up to 78% accuracy for single-trial discrimination of three levels of workload from each other. We use an n-back task (n ∈ {1, 2, 3}) to induce different levels of workload, forcing subjects to continuously remember the last one, two, or three of rapidly changing items. Our experimental results show that measuring hemodynamic responses in the PFC with fNIRS can be used to robustly quantify and classify mental workload. Single-trial analysis is still a young field that suffers from a general lack of standards. To increase the comparability of fNIRS methods and results, the data corpus for this study is made available online.

  3. Mental workload during n-back task—quantified in the prefrontal cortex using fNIRS

    PubMed Central

    Herff, Christian; Heger, Dominic; Fortmann, Ole; Hennrich, Johannes; Putze, Felix; Schultz, Tanja

    2014-01-01

    When interacting with technical systems, users experience mental workload. Particularly in multitasking scenarios (e.g., interacting with the car navigation system while driving), it is desirable not to distract users from their primary task. For such purposes, human-machine interfaces (HCIs) are desirable which continuously monitor the users' workload and dynamically adapt the behavior of the interface to the measured workload. While memory tasks have been shown to elicit hemodynamic responses in the brain when averaging over multiple trials, robust single-trial classification is a crucial prerequisite for dynamically adapting HCIs to the workload of the user. The prefrontal cortex (PFC) plays an important role in the processing of memory and the associated workload. In this study of 10 subjects, we used functional Near-Infrared Spectroscopy (fNIRS), a non-invasive imaging modality, to sample workload activity in the PFC. The results show up to 78% accuracy for single-trial discrimination of three levels of workload from each other. We use an n-back task (n ∈ {1, 2, 3}) to induce different levels of workload, forcing subjects to continuously remember the last one, two, or three of rapidly changing items. Our experimental results show that measuring hemodynamic responses in the PFC with fNIRS can be used to robustly quantify and classify mental workload. Single-trial analysis is still a young field that suffers from a general lack of standards. To increase the comparability of fNIRS methods and results, the data corpus for this study is made available online. PMID:24474913

  4. Validated methodology for quantifying infestation levels of dreissenid mussels in environmental DNA (eDNA) samples.

    PubMed

    Peñarrubia, Luis; Alcaraz, Carles; Vaate, Abraham Bij de; Sanz, Nuria; Pla, Carles; Vidal, Oriol; Viñas, Jordi

    2016-12-14

    The zebra mussel (Dreissena polymorpha Pallas, 1771) and the quagga mussel (D. rostriformis Deshayes, 1838) are successful invasive bivalves with substantial ecological and economic impacts in freshwater systems once they become established. Since their eradication is extremely difficult, their detection at an early stage is crucial to prevent spread. In this study, we optimized and validated a qPCR detection method based on the histone H2B gene to quantify combined infestation levels of zebra and quagga mussels in environmental DNA samples. Our results show specific dreissenid DNA present in filtered water samples for which microscopic diagnostic identification for larvae failed. Monitoring a large number of locations for invasive dreissenid species based on a highly specific environmental DNA qPCR assay may prove to be an essential tool for management and control plans focused on prevention of establishment of dreissenid mussels in new locations.

  5. Validated methodology for quantifying infestation levels of dreissenid mussels in environmental DNA (eDNA) samples

    PubMed Central

    Peñarrubia, Luis; Alcaraz, Carles; Vaate, Abraham bij de; Sanz, Nuria; Pla, Carles; Vidal, Oriol; Viñas, Jordi

    2016-01-01

    The zebra mussel (Dreissena polymorpha Pallas, 1771) and the quagga mussel (D. rostriformis Deshayes, 1838) are successful invasive bivalves with substantial ecological and economic impacts in freshwater systems once they become established. Since their eradication is extremely difficult, their detection at an early stage is crucial to prevent spread. In this study, we optimized and validated a qPCR detection method based on the histone H2B gene to quantify combined infestation levels of zebra and quagga mussels in environmental DNA samples. Our results show specific dreissenid DNA present in filtered water samples for which microscopic diagnostic identification for larvae failed. Monitoring a large number of locations for invasive dreissenid species based on a highly specific environmental DNA qPCR assay may prove to be an essential tool for management and control plans focused on prevention of establishment of dreissenid mussels in new locations. PMID:27966602

  6. Quantifying the provenance of aeolian sediments using multiple composite fingerprints

    NASA Astrophysics Data System (ADS)

    Liu, Benli; Niu, Qinghe; Qu, Jianjun; Zu, Ruiping

    2016-09-01

    We introduce a new fingerprinting method that uses multiple composite fingerprints for studies of aeolian sediment provenance. We used this method to quantify the provenance of sediments on both sides of the Qinghai-Tibetan Railway (QTR) in the Cuona Lake section of the Tibetan Plateau (TP), in an environment characterized by aeolian and fluvial interactions. The method involves repeatedly solving a linear mixing model based on mass conservation; the model is not limited to spatial scale or transport types and uses all the tracer groups that passed the range check, Kruskal-Wallis H-test, and a strict analytical solution screening. The proportional estimates that result from using different composite fingerprints are highly variable; however, the average of these fingerprints has a greater accuracy and certainty than any single fingerprint. The results show that sand from the lake beach, hilly surface, and gullies contribute, respectively, 48%, 31% and 21% to the western railway sediments and 43%, 33% and 24% to the eastern railway sediments. The difference between contributions from various sources on either side of the railway, which may increase in the future, was clearly related to variations in local transport characteristics, a conclusion that is supported by grain size analysis. The construction of the QTR changed the local cycling of materials, and the difference in provenance between the sediments that are separated by the railway reflects the changed sedimentary conditions on either side of the railway. The effectiveness of this method suggests that it will be useful in other studies of aeolian sediments.
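
    The core of the method is a constrained linear unmixing solved repeatedly for each composite fingerprint: source proportions must be non-negative and sum to one. A minimal sketch (Python/SciPy; the tracer matrix and sediment values are toy numbers):

      import numpy as np
      from scipy.optimize import minimize

      # Rows: tracers; columns: sources (beach, hillslope, gully) - toy values.
      A = np.array([[1.0, 0.4, 0.2],
                    [0.3, 1.1, 0.6],
                    [0.2, 0.5, 1.3]])
      b = np.array([0.62, 0.55, 0.47])   # tracer values in the railway sediment

      res = minimize(lambda f: np.sum((A @ f - b) ** 2),
                     x0=np.full(3, 1.0 / 3.0), bounds=[(0.0, 1.0)] * 3,
                     constraints={'type': 'eq', 'fun': lambda f: f.sum() - 1.0})
      print("source proportions:", res.x.round(2))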

  7. Maps Showing Seismic Landslide Hazards in Anchorage, Alaska

    USGS Publications Warehouse

    Jibson, Randall W.; Michael, John A.

    2009-01-01

    The devastating landslides that accompanied the great 1964 Alaska earthquake showed that seismically triggered landslides are one of the greatest geologic hazards in Anchorage. Maps quantifying seismic landslide hazards are therefore important for planning, zoning, and emergency-response preparation. The accompanying maps portray seismic landslide hazards for the following conditions: (1) deep, translational landslides, which occur only during great subduction-zone earthquakes that have return periods of ~300-900 yr; (2) shallow landslides for a peak ground acceleration (PGA) of 0.69 g, which has a return period of 2,475 yr, or a 2 percent probability of exceedance in 50 yr; and (3) shallow landslides for a PGA of 0.43 g, which has a return period of 475 yr, or a 10 percent probability of exceedance in 50 yr. Deep, translational landslide hazard zones were delineated based on previous studies of such landslides, with some modifications based on field observations of locations of deep landslides. Shallow-landslide hazards were delineated using a Newmark-type displacement analysis for the two probabilistic ground motions modeled.
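
    The shallow-landslide analysis rests on the Newmark sliding-block idea: a slope with critical (yield) acceleration accumulates permanent displacement whenever ground acceleration exceeds that threshold. A minimal one-directional sliding sketch; the acceleration record and yield value below are hypothetical, and the maps' actual analysis is tied to the probabilistic motions described above:

    ```python
    def newmark_displacement(accel_g, dt, ac_g, g=9.81):
        """Rigid-block Newmark displacement (m) from a ground acceleration record.

        The block accelerates relative to the ground while accel > ac, or while it
        is still sliding (v > 0); sliding stops when relative velocity reaches zero.
        One-directional sliding is assumed, a common simplification.
        """
        v = d = 0.0
        for a in accel_g:
            rel = (a - ac_g) * g if (v > 0 or a > ac_g) else 0.0
            v = max(v + rel * dt, 0.0)   # relative velocity cannot go negative
            d += v * dt                  # accumulate downslope displacement
        return d

    # Hypothetical pulse: 0.5 s of 0.5 g shaking against a 0.2 g yield acceleration
    print(round(newmark_displacement([0.5] * 50, 0.01, 0.2), 2))  # ~0.38 m
    ```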

  8. Maps showing seismic landslide hazards in Anchorage, Alaska

    USGS Publications Warehouse

    Jibson, Randall W.

    2014-01-01

    The devastating landslides that accompanied the great 1964 Alaska earthquake showed that seismically triggered landslides are one of the greatest geologic hazards in Anchorage. Maps quantifying seismic landslide hazards are therefore important for planning, zoning, and emergency-response preparation. The accompanying maps portray seismic landslide hazards for the following conditions: (1) deep, translational landslides, which occur only during great subduction-zone earthquakes that have return periods of ~300-900 yr; (2) shallow landslides for a peak ground acceleration (PGA) of 0.69 g, which has a return period of 2,475 yr, or a 2 percent probability of exceedance in 50 yr; and (3) shallow landslides for a PGA of 0.43 g, which has a return period of 475 yr, or a 10 percent probability of exceedance in 50 yr. Deep, translational landslide hazards were delineated based on previous studies of such landslides, with some modifications based on field observations of locations of deep landslides. Shallow-landslide hazards were delineated using a Newmark-type displacement analysis for the two probabilistic ground motions modeled.

  9. Quantifying the Frictional Forces between Skin and Nonwoven Fabrics

    PubMed Central

    Jayawardana, Kavinda; Ovenden, Nicholas C.; Cottenden, Alan

    2017-01-01

    When a compliant sheet of material is dragged over a curved surface of a body, the frictional forces generated can be many times greater than they would be for a planar interface. This phenomenon is known to contribute to the abrasion damage to skin often suffered by wearers of incontinence pads and bed/chairbound people susceptible to pressure sores. Experiments that attempt to quantify these forces often use a simple capstan-type equation to obtain a characteristic coefficient of friction. In general, the capstan approach assumes the ratio of applied tensions depends only on the arc of contact and the coefficient of friction, and ignores other geometric and physical considerations; this approach makes it straightforward to obtain explicitly a coefficient of friction from the tensions measured. In this paper, two mathematical models are presented that compute the material displacements and surface forces generated by, firstly, a membrane under tension in moving contact with a rigid obstacle and, secondly, a shell-membrane under tension in contact with a deformable substrate. The results show that, while the use of a capstan equation remains fairly robust in some cases, effects such as the curvature and flaccidness of the underlying body, and the mass density of the fabric can lead to significant variations in stresses generated in the contact region. Thus, the coefficient of friction determined by a capstan model may not be an accurate reflection of the true frictional behavior of the contact region. PMID:28321192
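
    For reference, the simple capstan relation these experiments invert is T_high = T_low * exp(mu * theta), so mu = ln(T_high / T_low) / theta. A worked sketch with hypothetical tensions:

    ```python
    import math

    def capstan_mu(t_high, t_low, wrap_angle_rad):
        """Coefficient of friction from the simple capstan equation
        T_high = T_low * exp(mu * theta)."""
        return math.log(t_high / t_low) / wrap_angle_rad

    # Hypothetical drag test: 4.0 N vs 2.5 N measured over a quarter wrap (pi/2 rad)
    print(round(capstan_mu(4.0, 2.5, math.pi / 2), 3))  # ~0.299
    ```

    The paper's point is that this mu silently absorbs curvature, substrate deformability, and fabric-mass effects, so it may misrepresent the true interfacial friction.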

  10. Signal enhancement ratio (SER) quantified from breast DCE-MRI and breast cancer risk

    NASA Astrophysics Data System (ADS)

    Wu, Shandong; Kurland, Brenda F.; Berg, Wendie A.; Zuley, Margarita L.; Jankowitz, Rachel C.; Sumkin, Jules; Gur, David

    2015-03-01

    Breast magnetic resonance imaging (MRI) is recommended as an adjunct to mammography for women who are considered at elevated risk of developing breast cancer. As a key component of breast MRI, dynamic contrast-enhanced MRI (DCE-MRI) uses a contrast agent to provide high intensity contrast between breast tissues, making it sensitive to tissue composition and vascularity. Breast DCE-MRI characterizes certain physiologic properties of breast tissue that are potentially related to breast cancer risk. Studies have shown that increased background parenchymal enhancement (BPE), which is the contrast enhancement occurring in normal cancer-unaffected breast tissues in post-contrast sequences, predicts increased breast cancer risk. Signal enhancement ratio (SER) computed from pre-contrast and post-contrast sequences in DCE-MRI measures change in signal intensity due to contrast uptake over time and is a measure of contrast enhancement kinetics. SER quantified in breast tumors has shown potential as a biomarker for characterizing tumor response to treatments. In this work we investigated the relationship between quantitative measures of SER and breast cancer risk. A pilot retrospective case-control study was performed using a cohort of 102 women, consisting of 51 women who had been diagnosed with unilateral breast cancer and 51 matched controls (by age and MRI date) with a unilateral biopsy-proven benign lesion. SER was quantified using fully-automated computerized algorithms and three SER-derived quantitative volume measures were compared between the cancer cases and controls using logistic regression analysis. Our preliminary results showed that SER is associated with breast cancer risk, after adjustment for the Breast Imaging Reporting and Data System (BI-RADS)-based mammographic breast density measures. This pilot study indicated that SER has potential for use as a risk factor for breast cancer risk assessment in women at elevated risk of developing breast cancer.
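
    The abstract does not spell out the SER formula; the standard DCE-MRI definition, which the study may refine, is SER = (S_early - S_pre) / (S_late - S_pre) computed voxelwise. A minimal sketch under that assumption:

    ```python
    import numpy as np

    def ser(pre, early, late, eps=1e-6):
        """Voxelwise signal enhancement ratio (S1 - S0) / (S2 - S0) from
        pre-contrast, early post-contrast, and late post-contrast intensities."""
        return (early - pre) / (late - pre + eps)

    # Hypothetical voxel intensities at the three time points
    print(ser(np.array([100.0]), np.array([180.0]), np.array([150.0])))
    # ~1.6: SER > 1.1 is conventionally read as washout-type kinetics
    ```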

  11. Quantifying carbon footprint reduction opportunities for U.S. households and communities.

    PubMed

    Jones, Christopher M; Kammen, Daniel M

    2011-05-01

    Carbon management is of increasing interest to individuals, households, and communities. In order to effectively assess and manage their climate impacts, individuals need information on the financial and greenhouse gas benefits of effective mitigation opportunities. We use consumption-based life cycle accounting techniques to quantify the carbon footprints of typical U.S. households in 28 cities for 6 household sizes and 12 income brackets. The model includes emissions embodied in transportation, energy, water, waste, food, goods, and services. We further quantify greenhouse gas and financial savings from 13 potential mitigation actions across all household types. The model suggests that the size and composition of carbon footprints vary dramatically between geographic regions and within regions based on basic demographic characteristics. Despite these differences, large cash-positive carbon footprint reductions are evident across all household types and locations; however, realizing this potential may require tailoring policies and programs to different population segments with very different carbon footprint profiles. The results of this model have been incorporated into an open access online carbon footprint management tool designed to enable behavior change at the household level through personalized feedback.

  12. Quantifying discipline practices using absolute versus relative frequencies: clinical and research implications for child welfare.

    PubMed

    Lindhiem, Oliver; Shaffer, Anne; Kolko, David J

    2014-01-01

    In the parent intervention outcome literatures, discipline practices are generally quantified as absolute frequencies or, less commonly, as relative frequencies. These differences in methodology warrant direct comparison as they have critical implications for study results and conclusions among treatments targeted at reducing parental aggression and harsh discipline. In this study, we directly compared the absolute frequency method and the relative frequency method for quantifying physically aggressive, psychologically aggressive, and nonaggressive discipline practices. Longitudinal data over a 3-year period came from an existing data set of a clinical trial examining the effectiveness of a psychosocial treatment in reducing parental physical and psychological aggression and improving child behavior (N = 139). Discipline practices (aggressive and nonaggressive) were assessed using the Conflict Tactics Scale. The two methods yielded different patterns of results, particularly for nonaggressive discipline strategies. We suggest that each method makes its own unique contribution to a more complete understanding of the association between parental aggression and intervention effects.
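
    The distinction is simple but consequential: absolute frequency counts events per reporting period, while relative frequency expresses each discipline type as a share of all discipline events. A toy sketch (counts are hypothetical) showing how the two methods can move in opposite directions:

    ```python
    def relative(counts):
        """Relative frequency: each discipline type as a share of all events."""
        total = sum(counts.values())
        return {k: round(v / total, 2) for k, v in counts.items()}

    # Hypothetical parent: total discipline drops after treatment, yet physical
    # aggression forms a larger share of what remains.
    pre  = {"physical": 10, "psychological": 20, "nonaggressive": 70}
    post = {"physical": 6,  "psychological": 12, "nonaggressive": 22}
    print(relative(pre))   # physical: 0.10
    print(relative(post))  # physical: 0.15 -> relative rise despite absolute fall
    ```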

  13. Population exposure to smoking and tobacco branding in the UK reality show 'Love Island'.

    PubMed

    Barker, Alexander B; Opazo Breton, Magdalena; Cranwell, Jo; Britton, John; Murray, Rachael L

    2018-02-05

    Reality television shows are popular with children and young adults; inclusion of tobacco imagery in these programmes is likely to cause smoking in these groups. Series 3 of the UK reality show Love Island, broadcast in 2017, attracted widespread media criticism for high levels of smoking depicted. We have quantified this tobacco content and estimated the UK population exposure to generic and branded tobacco imagery generated by the show. We used 1-min interval coding to quantify actual or implied tobacco use, tobacco paraphernalia or branding, in alternate episodes of series 3 of Love Island, and Census data and viewing figures from Kantar Media to estimate gross and per capita tobacco impressions. We coded 21 episodes comprising 1001 min of content. Tobacco imagery occurred in 204 (20%) intervals; the frequency of appearances fell significantly after media criticism. An identifiable cigarette brand, Lucky Strike Double Click, appeared in 16 intervals. The 21 episodes delivered an estimated 559 million gross tobacco impressions to the UK population, predominantly to women, including 47 million to children aged <16 and 44 million gross impressions of Lucky Strike branding, including 4 million to children <16. Despite advertising legislation and broadcasting regulations intended to protect children from smoking imagery in UK television, series 3 of Love Island delivered millions of general and branded tobacco impressions both to children and adults in the UK. More stringent controls on tobacco content in television programmes are urgently needed. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  14. New methods to quantify the cracking performance of cementitious systems made with internal curing

    NASA Astrophysics Data System (ADS)

    Schlitter, John L.

    which is located outside of the sample to provide restraint against expansion. Second, the standard ring test is a passive test that only relies on the autogenous and drying shrinkage of the mixture to induce cracking. The dual ring test can be an active test because it has the ability to vary the temperature of the specimen in order to induce thermal stress and produce cracking. This ability enables the study of the restrained cracking capacity as the mixture ages in order to quantify crack sensitive periods of time. Measurements made with the dual ring quantify the benefits from using larger amounts of internal curing. Mixtures that resupplied internal curing water to match that of chemical shrinkage could sustain three times the magnitude of thermal change before cracking. The second device discussed in this thesis is a large scale slab testing device. This device tests the cracking potential of 15' long by 4" thick by 24" wide slab specimens in an environmentally controlled chamber. The current standard testing devices can be considered small scale and encounter problems when linking their results to the field due to size effects. Therefore, the large scale slab testing device was developed in order to calibrate the results of smaller scale tests to real world field conditions such as a pavement or bridge deck. Measurements made with the large scale testing device showed that the cracking propensity of the internally cured mixtures was reduced and that a significant benefit could be realized.

  15. Quantifying the dynamics of emotional expressions in family therapy of patients with anorexia nervosa.

    PubMed

    Pezard, Laurent; Doba, Karyn; Lesne, Annick; Nandrino, Jean-Louis

    2017-07-01

    Emotional interactions have been considered dynamical processes involved in the affective life of humans and their disturbances may induce mental disorders. Most studies of emotional interactions have focused on dyadic behaviors or self-reports of emotional states but neglected the dynamical processes involved in family therapy. The main objective of this study is to quantify the dynamics of emotional expressions and their changes using the family therapy of patients with anorexia nervosa as an example. Nonlinear methods characterize the variability of the dynamics at the level of the whole therapeutic system and reciprocal influence between the participants during family therapy. Results show that the variability of the dynamics is higher at the end of the therapy than at the beginning. The reciprocal influences between therapist and each member of the family and between mother and patient decrease with the course of family therapy. Our results support the development of new interpersonal strategies of emotion regulation during family therapy. The quantification of emotional dynamics can help understanding the emotional processes underlying psychopathology and evaluating quantitatively the changes achieved by the therapeutic intervention. Copyright © 2017 Elsevier Ireland Ltd. All rights reserved.

  16. Probability distributions of whisker-surface contact: quantifying elements of the rat vibrissotactile natural scene.

    PubMed

    Hobbs, Jennifer A; Towal, R Blythe; Hartmann, Mitra J Z

    2015-08-01

    Analysis of natural scene statistics has been a powerful approach for understanding neural coding in the auditory and visual systems. In the field of somatosensation, it has been more challenging to quantify the natural tactile scene, in part because somatosensory signals are so tightly linked to the animal's movements. The present work takes a step towards quantifying the natural tactile scene for the rat vibrissal system by simulating rat whisking motions to systematically investigate the probabilities of whisker-object contact in naturalistic environments. The simulations permit an exhaustive search through the complete space of possible contact patterns, thereby allowing for the characterization of the patterns that would most likely occur during long sequences of natural exploratory behavior. We specifically quantified the probabilities of 'concomitant contact', that is, given that a particular whisker makes contact with a surface during a whisk, what is the probability that each of the other whiskers will also make contact with the surface during that whisk? Probabilities of concomitant contact were quantified in simulations that assumed increasingly naturalistic conditions: first, the space of all possible head poses; second, the space of behaviorally preferred head poses as measured experimentally; and third, common head poses in environments such as cages and burrows. As environments became more naturalistic, the probability distributions shifted from exhibiting a 'row-wise' structure to a more diagonal structure. Results also reveal that the rat appears to use motor strategies (e.g. head pitches) that generate contact patterns that are particularly well suited to extract information in the presence of uncertainty. © 2015. Published by The Company of Biologists Ltd.
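
    Given a binary whisk-by-whisker contact record, the concomitant-contact quantity is a conditional probability, P(whisker j contacts | whisker i contacts). A minimal sketch with a hypothetical four-whisk record:

    ```python
    import numpy as np

    def concomitant_contact(contacts):
        """Row i gives P(whisker j contacts | whisker i contacts), estimated
        from a binary (n_whisks, n_whiskers) contact matrix."""
        C = np.asarray(contacts, dtype=float)
        joint = C.T @ C                      # co-contact counts per whisker pair
        per_whisker = C.sum(axis=0)          # contact counts per whisker
        return joint / np.maximum(per_whisker[:, None], 1)

    # Hypothetical record: 4 whisks, 3 whiskers (1 = contact during that whisk)
    C = [[1, 1, 0],
         [1, 0, 0],
         [0, 1, 1],
         [1, 1, 0]]
    print(concomitant_contact(C).round(2))   # e.g. P(w1 | w0) = 0.67
    ```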

  17. Quantifying the Adaptive Cycle.

    PubMed

    Angeler, David G; Allen, Craig R; Garmestani, Ahjond S; Gunderson, Lance H; Hjerne, Olle; Winder, Monika

    2015-01-01

    The adaptive cycle was proposed as a conceptual model to portray patterns of change in complex systems. Despite the model having potential for elucidating change across systems, it has been used mainly as a metaphor, describing system dynamics qualitatively. We use a quantitative approach for testing premises (reorganisation, conservatism, adaptation) in the adaptive cycle, using Baltic Sea phytoplankton communities as an example of such complex system dynamics. Phytoplankton organizes in recurring spring and summer blooms, a well-established paradigm in planktology and succession theory, with characteristic temporal trajectories during blooms that may be consistent with adaptive cycle phases. We used long-term (1994-2011) data and multivariate analysis of community structure to assess key components of the adaptive cycle. Specifically, we tested predictions about: reorganisation: spring and summer blooms comprise distinct community states; conservatism: community trajectories during individual adaptive cycles are conservative; and adaptation: phytoplankton species during blooms change in the long term. All predictions were supported by our analyses. Results suggest that traditional ecological paradigms such as phytoplankton successional models have potential for moving the adaptive cycle from a metaphor to a framework that can improve our understanding of how complex systems organize and reorganize following collapse. Quantifying reorganization, conservatism and adaptation provides opportunities to cope with the intricacies and uncertainties associated with fast ecological change, driven by shifting system controls. Ultimately, combining traditional ecological paradigms with heuristics of complex system dynamics using quantitative approaches may help refine ecological theory and improve our understanding of the resilience of ecosystems.

  18. Quantifying innovation in surgery.

    PubMed

    Hughes-Hallett, Archie; Mayer, Erik K; Marcus, Hani J; Cundy, Thomas P; Pratt, Philip J; Parston, Greg; Vale, Justin A; Darzi, Ara W

    2014-08-01

    The objectives of this study were to assess the applicability of patents and publications as metrics of surgical technology and innovation; evaluate the historical relationship between patents and publications; and develop a methodology that can be used to determine the rate of innovation growth in any given health care technology. The study of health care innovation represents an emerging academic field, yet it is limited by a lack of valid scientific methods for quantitative analysis. This article explores and cross-validates 2 innovation metrics using surgical technology as an exemplar. Electronic patenting databases and the MEDLINE database were searched between 1980 and 2010 for "surgeon" OR "surgical" OR "surgery." Resulting patent codes were grouped into technology clusters. Growth curves were plotted for these technology clusters to establish the rate and characteristics of growth. The initial search retrieved 52,046 patents and 1,801,075 publications. The top performing technology cluster of the last 30 years was minimally invasive surgery. Robotic surgery, surgical staplers, and image guidance were the most emergent technology clusters. When examined, the growth curves for these clusters were found to follow an S-shaped pattern of growth, with the emergent technologies lying on the exponential phases of their respective growth curves. In addition, publication and patent counts were closely correlated in areas of technology expansion. This article demonstrates the utility of publicly available patent and publication data to quantify innovations within surgical technology and proposes a novel methodology for assessing and forecasting areas of technological innovation.
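
    The S-shaped growth the authors observe is the classic logistic curve; fitting it to a cluster's cumulative patent counts locates the cluster on its trajectory (emergent clusters sit on the exponential phase, before the midpoint). A sketch with synthetic counts; the parameters below are illustrative, not the paper's fits:

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def logistic(t, K, r, t0):
        """Logistic growth: ceiling K, growth rate r, inflection (midpoint) year t0."""
        return K / (1.0 + np.exp(-r * (t - t0)))

    # Synthetic cumulative patent counts for one technology cluster, 1980-2010
    years = np.arange(1980, 2011, dtype=float)
    rng = np.random.default_rng(0)
    counts = logistic(years, 5000.0, 0.35, 2000.0) + rng.normal(0.0, 50.0, years.size)

    (K, r, t0), _ = curve_fit(logistic, years, counts,
                              p0=(counts.max(), 0.1, years.mean()))
    print(round(K), round(r, 2), round(t0, 1))  # recovers roughly 5000, 0.35, 2000
    # A cluster whose latest data point lies well before t0 is still exponential.
    ```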

  19. Quantifying the adaptive cycle

    USGS Publications Warehouse

    Angeler, David G.; Allen, Craig R.; Garmestani, Ahjond S.; Gunderson, Lance H.; Hjerne, Olle; Winder, Monika

    2015-01-01

    The adaptive cycle was proposed as a conceptual model to portray patterns of change in complex systems. Despite the model having potential for elucidating change across systems, it has been used mainly as a metaphor, describing system dynamics qualitatively. We use a quantitative approach for testing premises (reorganisation, conservatism, adaptation) in the adaptive cycle, using Baltic Sea phytoplankton communities as an example of such complex system dynamics. Phytoplankton organizes in recurring spring and summer blooms, a well-established paradigm in planktology and succession theory, with characteristic temporal trajectories during blooms that may be consistent with adaptive cycle phases. We used long-term (1994–2011) data and multivariate analysis of community structure to assess key components of the adaptive cycle. Specifically, we tested predictions about: reorganisation: spring and summer blooms comprise distinct community states; conservatism: community trajectories during individual adaptive cycles are conservative; and adaptation: phytoplankton species during blooms change in the long term. All predictions were supported by our analyses. Results suggest that traditional ecological paradigms such as phytoplankton successional models have potential for moving the adaptive cycle from a metaphor to a framework that can improve our understanding how complex systems organize and reorganize following collapse. Quantifying reorganization, conservatism and adaptation provides opportunities to cope with the intricacies and uncertainties associated with fast ecological change, driven by shifting system controls. Ultimately, combining traditional ecological paradigms with heuristics of complex system dynamics using quantitative approaches may help refine ecological theory and improve our understanding of the resilience of ecosystems.

  20. Identifying and quantifying the stromal fibrosis in muscularis propria of colorectal carcinoma by multiphoton microscopy

    NASA Astrophysics Data System (ADS)

    Chen, Sijia; Yang, Yinghong; Jiang, Weizhong; Feng, Changyin; Chen, Zhifen; Zhuo, Shuangmu; Zhu, Xiaoqin; Guan, Guoxian; Chen, Jianxin

    2014-10-01

    The examination of stromal fibrosis within colorectal cancer is often overlooked, not only because routine pathological examinations focus more on tumour staging and precise surgical margins, but also because of the lack of efficient diagnostic methods. Multiphoton microscopy (MPM) can be used to study the muscularis stroma of normal and colorectal carcinoma tissue at the molecular level. In this work, we show the feasibility of MPM for discerning the microstructure of the normal human rectal muscle layer and of fibrotic colorectal carcinoma tissue. Three types of muscularis propria stromal fibrosis beneath the colorectal cancer infiltration were first observed through the MPM imaging system, which provides intercellular microstructural details in fresh, unstained tissue samples. Our approach can also quantify the extent of stromal fibrosis from both the amount and the orientation of collagen, which may further characterize the severity of fibrosis. Comparison with the pathology analysis shows that MPM has the potential to become a histological tool for detecting stromal fibrosis and collecting prognostic evidence, which may help guide subsequent therapy for patients toward a better prognosis.

  1. Quantifying camouflage: how to predict detectability from appearance.

    PubMed

    Troscianko, Jolyon; Skelhorn, John; Stevens, Martin

    2017-01-06

    Quantifying the conspicuousness of objects against particular backgrounds is key to understanding the evolution and adaptive value of animal coloration, and in designing effective camouflage. Quantifying detectability can reveal how colour patterns affect survival, how animals' appearances influence habitat preferences, and how receiver visual systems work. Advances in calibrated digital imaging are enabling the capture of objective visual information, but it remains unclear which methods are best for measuring detectability. Numerous descriptions and models of appearance have been used to infer the detectability of animals, but these models are rarely empirically validated or directly compared to one another. We compared the performance of human 'predators' to a bank of contemporary methods for quantifying the appearance of camouflaged prey. Background matching was assessed using several established methods, including sophisticated feature-based pattern analysis, granularity approaches and a range of luminance and contrast difference measures. Disruptive coloration is a further camouflage strategy where high contrast patterns disrupt the prey's tell-tale outline, making it more difficult to detect. Disruptive camouflage has been studied intensely over the past decade, yet defining and measuring it have proven far more problematic. We assessed how well existing disruptive coloration measures predicted capture times. Additionally, we developed a new method for measuring edge disruption based on an understanding of sensory processing and the way in which false edges are thought to interfere with animal outlines. Our novel measure of disruptive coloration was the best predictor of capture times overall, highlighting the importance of false edges in concealment over and above pattern or luminance matching. The efficacy of our new method for measuring disruptive camouflage together with its biological plausibility and computational efficiency represents a substantial

  2. A novel approach to quantify cybersecurity for electric power systems

    NASA Astrophysics Data System (ADS)

    Kaster, Paul R., Jr.

    Electric Power grid cybersecurity is a topic gaining increased attention in academia, industry, and government circles, yet a method of quantifying and evaluating a system's security is not yet commonly accepted. In order to be useful, a quantification scheme must be able to accurately reflect the degree to which a system is secure, simply determine the level of security in a system using real-world values, model a wide variety of attacker capabilities, be useful for planning and evaluation, allow a system owner to publish information without compromising the security of the system, and compare relative levels of security between systems. Published attempts at quantifying cybersecurity fail at one or more of these criteria. This document proposes a new method of quantifying cybersecurity that meets those objectives. This dissertation evaluates the current state of cybersecurity research, discusses the criteria mentioned previously, proposes a new quantification scheme, presents an innovative method of modeling cyber attacks, demonstrates that the proposed quantification methodology meets the evaluation criteria, and proposes a line of research for future efforts.

  3. Iris texture traits show associations with iris color and genomic ancestry.

    PubMed

    Quillen, Ellen E; Guiltinan, Jenna S; Beleza, Sandra; Rocha, Jorge; Pereira, Rinaldo W; Shriver, Mark D

    2011-01-01

    This study seeks to identify associations among genomic biogeographic ancestry (BGA), quantitative iris color, and iris texture traits contributing to population-level variation in these phenotypes. DNA and iris photographs were collected from 300 individuals across three variably admixed populations (Portugal, Brazil, and Cape Verde). Two raters scored the photos for pigmentation spots, Fuchs' crypts, contraction furrows, and Wolfflin nodules. Iris color was quantified from RGB values. Maximum likelihood estimates of individual BGA were calculated from 176 ancestry informative markers. Pigmentation spots, Fuchs' crypts, contraction furrows, and iris color show significant positive correlation with increasing European BGA. Only contraction furrows are correlated with iris color. The relationship between BGA and iris texture illustrates a genetic contribution to this population-level variation. Copyright © 2011 Wiley-Liss, Inc.

  4. Method for quantifying nitromethane in blood as a potential biomarker of halonitromethane exposure.

    PubMed

    Alwis, K Udeni; Blount, Benjamin C; Silva, Lalith K; Smith, Mitchell M; Loose, Karl-Hermann

    2008-04-01

    The cytotoxicity and genotoxicity of nitromethane and its halogenated analogues in mammals raise concerns about potential toxicity to humans. This study shows that halonitromethanes are not stable in human blood and undergo dehalogenation to form nitromethane. We quantified nitromethane in human blood using solid-phase microextraction (SPME) headspace sampling coupled with gas chromatography (GC) and high resolution mass spectrometry (HRMS). The limit of detection was 0.01 microg/L with a linear calibration curve spanning 3 orders of magnitude. This method employs isotope dilution to precisely quantify trace amounts of nitromethane (coefficient of variation <6%). At three spiked concentrations of nitromethane, method accuracy ranged from 88 to 99%. We applied this method to blood samples collected from 632 people with no known occupational exposure to nitromethane or halonitromethanes. Nitromethane was detected in all blood samples tested (range: 0.28-3.79 microg/L, median: 0.66 microg/L). Time-course experiments with trichloronitromethane- and tribromonitromethane-spiked blood showed that nitromethane was the major product formed (1 nmole tribromonitromethane formed 0.59 nmole of nitromethane, whereas 1 nmole trichloronitromethane formed 0.77 nmole nitromethane). Nitromethane may form endogenously from peroxynitrite: nitromethane concentrations increased proportionately in blood samples spiked with peroxynitrite. Blood nitromethane can be a biomarker of exposure to both nitromethane and halonitromethanes. This sensitive, accurate, and precise analytical method can be used to determine baseline blood nitromethane level in the general population. It can also be used to study the health impact from exposure to nitromethane and halonitromethanes in occupational environments and to assess trichloronitromethane (chloropicrin) exposure in chemical terrorism investigations.

  5. Digital image analysis to quantify carbide networks in ultrahigh carbon steels

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hecht, Matthew D.; Webler, Bryan A.; Picard, Yoosuf N., E-mail: ypicard@cmu.edu

    A method has been developed and demonstrated to quantify the degree of carbide network connectivity in ultrahigh carbon steels through digital image processing and analysis of experimental micrographs. It was shown that the network connectivity and carbon content can be correlated to toughness for various ultrahigh carbon steel specimens. The image analysis approach first involved segmenting the carbide network and pearlite matrix into binary contrast representations via a grayscale intensity thresholding operation. Next, the carbide network pixels were skeletonized and parceled into branches and nodes, allowing the determination of a connectivity index for the carbide network. Intermediate image processing steps to remove noise and fill voids in the network are also detailed. The connectivity indexes of scanning electron micrographs were consistent in both secondary and backscattered electron imaging modes, as well as across two different (50 × and 100 ×) magnifications. Results from ultrahigh carbon steels reported here along with other results from the literature generally showed lower connectivity indexes correlated with higher Charpy impact energy (toughness). A deviation from this trend was observed at higher connectivity indexes, consistent with a percolation threshold for crack propagation across the carbide network. - Highlights: • A method for carbide network analysis in steels is proposed and demonstrated. • ImageJ method extracts a network connectivity index from micrographs. • Connectivity index consistent in different imaging conditions and magnifications. • Impact energy may plateau when a critical network connectivity is exceeded.
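
    A minimal sketch of the pipeline described (threshold, skeletonize, classify skeleton pixels by neighbour count). The specific index computed here, junction pixels per skeleton pixel, is a stand-in, since the paper's exact connectivity-index definition is not reproduced in the abstract:

    ```python
    import numpy as np
    from scipy import ndimage
    from skimage.filters import threshold_otsu
    from skimage.morphology import skeletonize

    def connectivity_index(gray):
        """Segment the (bright) carbide network, skeletonize it, and summarize
        connectivity as junction pixels per skeleton pixel."""
        binary = gray > threshold_otsu(gray)           # grayscale thresholding step
        skel = skeletonize(binary)                     # one-pixel-wide network
        kernel = np.array([[1, 1, 1],
                           [1, 0, 1],
                           [1, 1, 1]])
        neighbours = ndimage.convolve(skel.astype(int), kernel, mode="constant")
        junctions = skel & (neighbours > 2)            # node (branch-point) pixels
        return junctions.sum() / max(skel.sum(), 1)
    ```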

  6. Quantifying forest fragmentation using Geographic Information Systems and Forest Inventory and Analysis plot data

    Treesearch

    Dacia M. Meneguzzo; Mark H. Hansen

    2009-01-01

    Fragmentation metrics provide a means of quantifying and describing forest fragmentation. The most common method of calculating these metrics is through the use of Geographic Information System software to analyze raster data, such as a satellite or aerial image of the study area; however, the spatial resolution of the imagery has a significant impact on the results....

  7. Critical Zone Co-dynamics: Quantifying Interactions between Subsurface, Land Surface, and Vegetation Properties Using UAV and Geophysical Approaches

    NASA Astrophysics Data System (ADS)

    Dafflon, B.; Leger, E.; Peterson, J.; Falco, N.; Wainwright, H. M.; Wu, Y.; Tran, A. P.; Brodie, E.; Williams, K. H.; Versteeg, R.; Hubbard, S. S.

    2017-12-01

    Improving understanding and modelling of terrestrial systems requires advances in measuring and quantifying interactions among subsurface, land surface and vegetation processes over relevant spatiotemporal scales. Such advances are important to quantify natural and managed ecosystem behaviors, as well as to predict how watershed systems respond to increasingly frequent hydrological perturbations, such as droughts, floods and early snowmelt. Our study focuses on the joint use of UAV-based multi-spectral aerial imaging, ground-based geophysical tomographic monitoring (incl., electrical and electromagnetic imaging) and point-scale sensing (soil moisture sensors and soil sampling) to quantify interactions between above and below ground compartments of the East River Watershed in the Upper Colorado River Basin. We evaluate linkages between physical properties (incl. soil composition, soil electrical conductivity, soil water content), metrics extracted from digital surface and terrain elevation models (incl., slope, wetness index) and vegetation properties (incl., greenness, plant type) in a 500 x 500 m hillslope-floodplain subsystem of the watershed. Data integration and analysis is supported by numerical approaches that simulate the control of soil and geomorphic characteristic on hydrological processes. Results provide an unprecedented window into critical zone interactions, revealing significant below- and above-ground co-dynamics. Baseline geophysical datasets provide lithological structure along the hillslope, which includes a surface soil horizon, underlain by a saprolite layer and the fractured Mancos shale. Time-lapse geophysical data show very different moisture dynamics in various compartments and locations during the winter and growing season. Integration with aerial imaging reveals a significant linkage between plant growth and the subsurface wetness, soil characteristics and the topographic gradient. The obtained information about the organization and

  8. SOME IS NOT ENOUGH: QUANTIFIER COMPREHENSION IN CORTICOBASAL SYNDROME AND BEHAVIORAL VARIANT FRONTOTEMPORAL DEMENTIA

    PubMed Central

    Morgan, Brianna; Gross, Rachel; Clark, Robin; Dreyfuss, Michael; Boller, Ashley; Camp, Emily; Liang, Tsao-Wei; Avants, Brian; McMillan, Corey; Grossman, Murray

    2011-01-01

    Quantifiers are very common in everyday speech, but we know little about their cognitive basis or neural representation. The present study examined comprehension of three classes of quantifiers that depend on different cognitive components in patients with focal neurodegenerative diseases. Patients evaluated the truth-value of a sentence containing a quantifier relative to a picture illustrating a small number of familiar objects, and performance was related to MRI grey matter atrophy using voxel-based morphometry. We found that patients with corticobasal syndrome (CBS) and posterior cortical atrophy (PCA) are significantly impaired in their comprehension of Cardinal Quantifiers (e.g. “At least three birds are on the branch”), due in part to their deficit in quantity knowledge. MRI analyses related this deficit to temporal-parietal atrophy found in CBS/PCA. We also found that patients with behavioral variant frontotemporal dementia (bvFTD) are significantly impaired in their comprehension of Logical Quantifiers (e.g. “Some of the birds are on the branch”), associated with a simple form of perceptual logic, and this correlated with their deficit on executive measures. This deficit was related to disease in rostral prefrontal cortex in bvFTD. These patients were also impaired in their comprehension of Majority Quantifiers (e.g. “At least half of the birds are on the branch”), and this too was correlated with their deficit on executive measures. This was related to disease in the basal ganglia interrupting a frontal-striatal loop critical for executive functioning. These findings suggest that a large-scale frontal-parietal neural network plays a crucial role in quantifier comprehension, and that comprehension of specific classes of quantifiers may be selectively impaired in patients with focal neurodegenerative conditions in these areas. PMID:21930136
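
    The three quantifier classes map onto distinct decision rules, which is part of why they can dissociate neurologically. A toy sketch of the truth-value judgments the patients performed; the scene counts are hypothetical:

    ```python
    def at_least(n):       # Cardinal quantifier: "at least n"
        return lambda on, total: on >= n

    def some():            # Logical quantifier: "some"
        return lambda on, total: on >= 1

    def at_least_half():   # Majority quantifier: "at least half"
        return lambda on, total: on >= total / 2

    # Scene: five birds, three of them on the branch
    on, total = 3, 5
    for phrase, q in [("At least three birds are on the branch", at_least(3)),
                      ("Some of the birds are on the branch", some()),
                      ("At least half of the birds are on the branch", at_least_half())]:
        print(phrase, "->", q(on, total))   # all three judgments are True here
    ```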

  9. Quantifying the spoilage and shelf-life of yoghurt with fruits.

    PubMed

    Mataragas, M; Dimitriou, V; Skandamis, P N; Drosinos, E H

    2011-05-01

    The aim of the present study was to develop a predictive model to quantify the spoilage of yoghurt with fruits. Product samples were stored at various temperatures (5-20 °C). Samples were subjected to microbiological (total viable counts, lactic acid bacteria-LAB, yeasts and moulds) and physico-chemical analysis (pH, titratable acidity and sugars). LAB was the dominant micro-flora. Yeasts population increased at all temperatures but a delay was observed during the first days of storage. Titratable acidity and pH remained almost constant at low temperatures (5 and 10 °C). However, at higher temperatures (>10 °C), an increase in titratable acidity and reduction in pH was observed. Sugar concentration (fructose, lactose and glucose) decreased during storage. A mathematical model was developed for shelf-life determination of the product. It was successfully validated at a temperature (17 °C) not used during model development. The results showed that shelf-life of this product could not be established based only on microbiological data and use of other parameters such as sensory or/and physico-chemical analysis is required. Shelf-life determination by spoilage tests is time-consuming and the need for new rapid techniques has been raised. The developed model could help dairy industries to establish shelf-life predictions on yoghurt with fruits stored under constant temperature conditions. Copyright © 2010 Elsevier Ltd. All rights reserved.

  10. A novel method for quantifying arm motion similarity.

    PubMed

    Zhi Li; Hauser, Kris; Roldan, Jay Ryan; Milutinovic, Dejan; Rosen, Jacob

    2015-08-01

    This paper proposes a novel task-independent method for quantifying arm motion similarity that can be applied to any kinematic/dynamic variable of interest. Given two arm motions for the same task, not necessarily with the same completion time, it plots the time-normalized curves against one another and generates four real-valued features. To validate these features we apply them to quantify the relationship between healthy and paretic arm motions of chronic stroke patients. Studying both unimanual and bimanual arm motions of eight chronic stroke patients, we find that inter-arm coupling, which tends to synchronize the motions of both arms in bimanual motions, has a stronger effect at task-relevant joints than at task-irrelevant joints. The analysis also revealed that the paretic arm suppresses the shoulder flexion of the non-paretic arm, while the latter encourages the shoulder rotation of the former.
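
    A sketch of the curve-against-curve idea: resample both motions onto normalized time, then summarize how one plots against the other. The four features below (slope, intercept, correlation, and RMS residual of the linear fit) are an illustrative stand-in, since the abstract does not name the paper's exact four:

    ```python
    import numpy as np

    def time_normalize(signal, n=101):
        """Resample a trajectory onto n points of normalized time [0, 1],
        removing differences in completion time."""
        t = np.linspace(0.0, 1.0, len(signal))
        return np.interp(np.linspace(0.0, 1.0, n), t, signal)

    def similarity_features(motion_a, motion_b):
        """Plot time-normalized motion B against motion A and summarize the
        relationship with four real-valued features."""
        a, b = time_normalize(motion_a), time_normalize(motion_b)
        slope, intercept = np.polyfit(a, b, 1)
        r = np.corrcoef(a, b)[0, 1]
        rms = np.sqrt(np.mean((b - (slope * a + intercept)) ** 2))
        return slope, intercept, r, rms   # identical motions -> 1, 0, 1, 0
    ```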

  11. Ibuprofen reverts antifungal resistance on Candida albicans showing overexpression of CDR genes.

    PubMed

    Ricardo, Elisabete; Costa-de-Oliveira, Sofia; Dias, Ana Silva; Guerra, José; Rodrigues, Acácio Gonçalves; Pina-Vaz, Cidália

    2009-06-01

    Several mechanisms may be associated with Candida albicans resistance to azoles. Ibuprofen was described as being able to revert resistance related to efflux activity in Candida. The aim of this study was to uncover the molecular basis of antifungal resistance in C. albicans clinical strains that could be reverted by ibuprofen. Sixty-two clinical isolates and five control strains of C. albicans were studied: the azole susceptibility phenotype was determined according to the Clinical and Laboratory Standards Institute M27-A2 protocol and minimal inhibitory concentration values were recalculated with ibuprofen (100 microg mL(-1)); synergistic studies between fluconazole and FK506, a Cdr1p inhibitor, were performed using an agar disk diffusion assay and were compared with ibuprofen results. Gene expression was quantified by real-time PCR, with and without ibuprofen, regarding CDR1, CDR2, MDR1, encoding for efflux pumps, and ERG11, encoding for the azole target protein. A correlation between susceptibility phenotype and resistance gene expression profiles was determined. Ibuprofen and FK506 showed a clear synergistic effect when combined with fluconazole. Resistant isolates reverting to susceptible after incubation with ibuprofen showed CDR1 and CDR2 overexpression, especially of the latter. Conversely, strains that did not revert displayed a remarkable increase in ERG11 expression along with CDR genes. Ibuprofen did not alter resistance gene expression significantly (P>0.05), probably acting as a Cdrp blocker.
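
    Relative expression from real-time PCR is conventionally computed with the 2^-ddCt (Livak) method, normalizing the target gene to a reference gene and then to a control strain; the abstract does not state this study's exact normalization, so the Ct values below are hypothetical:

    ```python
    def fold_change(ct_target, ct_ref, ct_target_ctrl, ct_ref_ctrl):
        """Relative expression by the 2^-ddCt method: target gene normalized to a
        reference gene, then to a control (susceptible) strain."""
        ddct = (ct_target - ct_ref) - (ct_target_ctrl - ct_ref_ctrl)
        return 2.0 ** -ddct

    # Hypothetical Ct values: CDR2 in a resistant isolate vs a susceptible control
    print(fold_change(21.0, 18.0, 25.0, 18.0))  # 16.0-fold overexpression
    ```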

  12. Quantifying the negative impact of brain drain on the integration of European science

    PubMed Central

    Doria Arrieta, Omar A.; Pammolli, Fabio; Petersen, Alexander M.

    2017-01-01

    The 2004/2007 European Union (EU) enlargement by 12 member states offers a unique opportunity to quantify the impact of EU efforts to expand and integrate the scientific competitiveness of the European Research Area (ERA). We apply two causal estimation schemes to cross-border collaboration data extracted from millions of academic publications from 1996 to 2012, which are disaggregated across 14 subject areas and 32 European countries. Our results illustrate the unintended consequences following the 2004/2007 enlargement, namely, its negative impact on cross-border collaboration in science. First, we use the synthetic control method to show that levels of European cross-border collaboration would have been higher without EU enlargement, despite the 2004/2007 EU entrants gaining access to EU resources incentivizing cross-border integration. Second, we implement a difference-in-difference panel regression, incorporating official intra-European high-skilled mobility statistics, to identify migration imbalance—principally from entrant to incumbent EU member states—as a major factor underlying the divergence in cross-border integration between Western and Eastern Europe. These results challenge central tenets underlying ERA integration policies that unifying labor markets will increase the international competitiveness of the ERA, thereby calling attention to the need for effective home-return incentives and policies. PMID:28439544

  13. Quantifying the negative impact of brain drain on the integration of European science.

    PubMed

    Doria Arrieta, Omar A; Pammolli, Fabio; Petersen, Alexander M

    2017-04-01

    The 2004/2007 European Union (EU) enlargement by 12 member states offers a unique opportunity to quantify the impact of EU efforts to expand and integrate the scientific competitiveness of the European Research Area (ERA). We apply two causal estimation schemes to cross-border collaboration data extracted from millions of academic publications from 1996 to 2012, which are disaggregated across 14 subject areas and 32 European countries. Our results illustrate the unintended consequences following the 2004/2007 enlargement, namely, its negative impact on cross-border collaboration in science. First, we use the synthetic control method to show that levels of European cross-border collaboration would have been higher without EU enlargement, despite the 2004/2007 EU entrants gaining access to EU resources incentivizing cross-border integration. Second, we implement a difference-in-difference panel regression, incorporating official intra-European high-skilled mobility statistics, to identify migration imbalance-principally from entrant to incumbent EU member states-as a major factor underlying the divergence in cross-border integration between Western and Eastern Europe. These results challenge central tenets underlying ERA integration policies that unifying labor markets will increase the international competitiveness of the ERA, thereby calling attention to the need for effective home-return incentives and policies.
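
    The second estimation scheme reduces, in its simplest two-by-two form, to a difference-in-differences: the change in the treated group minus the change in the control group. A toy sketch with hypothetical collaboration shares; the paper's panel regression adds controls and fixed effects on top of this logic:

    ```python
    import numpy as np

    def did(treat_pre, treat_post, ctrl_pre, ctrl_post):
        """Two-by-two difference-in-differences estimate."""
        return (np.mean(treat_post) - np.mean(treat_pre)) \
             - (np.mean(ctrl_post) - np.mean(ctrl_pre))

    # Hypothetical cross-border collaboration shares, before/after enlargement:
    # entrant EU states (treated) vs comparable non-EU European states (control)
    print(did([0.20, 0.22], [0.21, 0.23], [0.18, 0.20], [0.24, 0.26]))  # -0.05
    # Negative estimate: collaboration grew less where enlargement applied.
    ```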

  14. The 2017 Fertilizer Emissions Airborne Study (FEAST): Quantifying N2O emissions from croplands and fertilizer plants in the Mississippi River Valley.

    NASA Astrophysics Data System (ADS)

    Kort, E. A.; Gvakharia, A.; Smith, M. L.; Conley, S.; Frauhammer, K.

    2017-12-01

    Nitrous Oxide (N2O) is a crucial atmospheric trace gas that drives 21st century stratospheric ozone depletion and substantively impacts climate. Anthropogenic emissions drive the global imbalance and annual growth of N2O, and the dominant anthropogenic source is fertilizer production and application, both of which have large uncertainties. In this presentation we will discuss the FEAST campaign, a study designed to demonstrate new approaches to quantify N2O emissions from fertilizer production and usage with aircraft measurements. In the FEAST campaign we deployed new instrumentation along with experienced flight sensors onboard the Scientific Aviation Mooney aircraft to make 40 hours of continuous 1 Hz measurements of N2O, CO2, CO, H2O, CH4, O3, T, and winds. The Mississippi River Valley provided an optimal target as this location includes significant fertilizer production facilities as well as large cropland areas (dominated by corn, soy, rice, and cotton) with substantive fertilizer application. By leveraging our payload and unique airborne capabilities we directly observe and quantify N2O emissions from individual fertilizer production facilities (as well as CO2 and CH4 emissions from these same facilities). We are also able to quantify N2O fluxes from large cropland areas (~100's of km) employing a mass balance approach, a first for N2O, and will show results highlighting differences between crop types and amounts of applied fertilizer. The ability to quantify cropland fluxes at the ~100 km scale enables new understanding of the processes controlling emissions at spatial scales that have eluded prior studies, which either rely on extrapolation of small-scale measurements (flux chambers, towers) or work at 1,000+ km spatial scales (regional-global inversions from atmospheric measurements).
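
    The airborne mass-balance idea: emissions equal the integral of concentration enhancement times the wind component perpendicular to the flight track, taken over the downwind plume cross-section. A simplified sketch with uniform grid cells; the molar density, cell sizes, and readings are illustrative, and a real retrieval converts mixing ratio to mass density using measured temperature and pressure:

    ```python
    import numpy as np

    def mass_balance_flux(conc_ppb, bg_ppb, wind_perp_ms, dy_m, dz_m,
                          molar_mass=44.013, air_mol_m3=41.6):
        """Emission rate (g/s) from a downwind 'curtain' of measurements.

        conc_ppb and wind_perp_ms are flattened grids of curtain cells of size
        dy_m x dz_m; 41.6 mol/m^3 is air at ~20 C and 1 atm; 44.013 g/mol is N2O.
        """
        enhancement = (np.asarray(conc_ppb) - bg_ppb) * 1e-9          # mol/mol
        mass_density = enhancement * air_mol_m3 * molar_mass           # g/m^3
        return float(np.sum(mass_density * np.asarray(wind_perp_ms)) * dy_m * dz_m)

    # Hypothetical 3-cell curtain: 2 ppb enhancement, 5 m/s wind, 100 m x 50 m cells
    print(round(mass_balance_flux([332, 332, 330], 330, [5, 5, 5], 100, 50), 2))
    # ~0.18 g/s
    ```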

  15. Quantifying autonomous vehicles national fuel consumption impacts: A data-rich approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Yuche; Gonder, Jeffrey; Young, Stanley

    Autonomous vehicles are drawing significant attention from governments, manufacturers and consumers. Experts predict them to be the primary means of transportation by the middle of this century. Recent literature shows that vehicle automation has the potential to alter traffic patterns, vehicle ownership, and land use, which may affect fuel consumption from the transportation sector. In this paper, we developed a data-rich analytical framework to quantify system-wide fuel impacts of automation in the United States by integrating (1) a dynamic vehicle sales, stock, and usage model, (2) an historical transportation network-level vehicle miles traveled (VMT)/vehicle activity database, and (3) estimates of automation's impacts on fuel efficiency and travel demand. The vehicle model considers dynamics in vehicle fleet turnover and fuel efficiency improvements of conventional and advanced vehicle fleet. The network activity database contains VMT, free-flow speeds, and historical speeds of road links that can help us accurately identify fuel-savings opportunities of automation. Based on the model setup and assumptions, we found that the impacts of automation on fuel consumption are quite wide-ranging - with the potential to reduce fuel consumption by 45% in our 'Optimistic' case or increase it by 30% in our 'Pessimistic' case. Second, implementing automation on urban roads could potentially result in larger fuel savings compared with highway automation because of the driving features of urban roads. Lastly, through scenario analysis, we showed that the proposed framework can be used for refined assessments as better data on vehicle-level fuel efficiency and travel demand impacts of automation become available.

  16. Quantifying autonomous vehicles national fuel consumption impacts: A data-rich approach

    DOE PAGES

    Chen, Yuche; Gonder, Jeffrey; Young, Stanley; ...

    2017-11-06

    Autonomous vehicles are drawing significant attention from governments, manufacturers and consumers. Experts predict them to be the primary means of transportation by the middle of this century. Recent literature shows that vehicle automation has the potential to alter traffic patterns, vehicle ownership, and land use, which may affect fuel consumption from the transportation sector. In this paper, we developed a data-rich analytical framework to quantify system-wide fuel impacts of automation in the United States by integrating (1) a dynamic vehicle sales, stock, and usage model, (2) an historical transportation network-level vehicle miles traveled (VMT)/vehicle activity database, and (3) estimates of automation's impacts on fuel efficiency and travel demand. The vehicle model considers dynamics in vehicle fleet turnover and fuel efficiency improvements of conventional and advanced vehicle fleet. The network activity database contains VMT, free-flow speeds, and historical speeds of road links that can help us accurately identify fuel-savings opportunities of automation. Based on the model setup and assumptions, we found that the impacts of automation on fuel consumption are quite wide-ranging - with the potential to reduce fuel consumption by 45% in our 'Optimistic' case or increase it by 30% in our 'Pessimistic' case. Second, implementing automation on urban roads could potentially result in larger fuel savings compared with highway automation because of the driving features of urban roads. Lastly, through scenario analysis, we showed that the proposed framework can be used for refined assessments as better data on vehicle-level fuel efficiency and travel demand impacts of automation become available.

  17. Quantifying archaeal community autotrophy in the mesopelagic ocean using natural radiocarbon

    PubMed Central

    Ingalls, Anitra E.; Shah, Sunita R.; Hansman, Roberta L.; Aluwihare, Lihini I.; Santos, Guaciara M.; Druffel, Ellen R. M.; Pearson, Ann

    2006-01-01

    An ammonia-oxidizing, carbon-fixing archaeon, Candidatus “Nitrosopumilus maritimus,” recently was isolated from a salt-water aquarium, definitively confirming that chemoautotrophy exists among the marine archaea. However, in other incubation studies, pelagic archaea also were capable of using organic carbon. It has remained unknown what fraction of the total marine archaeal community is autotrophic in situ. If archaea live primarily as autotrophs in the natural environment, a large ammonia-oxidizing population would play a significant role in marine nitrification. Here we use the natural distribution of radiocarbon in archaeal membrane lipids to quantify the bulk carbon metabolism of archaea at two depths in the subtropical North Pacific gyre. Our compound-specific radiocarbon data show that the archaea in surface waters incorporate modern carbon into their membrane lipids, and archaea at 670 m incorporate carbon that is slightly more isotopically enriched than inorganic carbon at the same depth. An isotopic mass balance model shows that the dominant metabolism at depth indeed is autotrophy (83%), whereas heterotrophic consumption of modern organic carbon accounts for the remainder of archaeal biomass. These results reflect the in situ production of the total community that produces tetraether lipids and are not subject to biases associated with incubation and/or culture experiments. The data suggest either that the marine archaeal community includes both autotrophs and heterotrophs or is a single population with a uniformly mixotrophic metabolism. The metabolic and phylogenetic diversity of the marine archaea warrants further exploration; these organisms may play a major role in the marine cycles of nitrogen and carbon. PMID:16614070
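
    The isotopic mass balance is a two-endmember mixing model: the lipid Δ14C is a weighted average of the inorganic-carbon (autotrophy) and organic-carbon (heterotrophy) endmembers. A worked sketch with hypothetical Δ14C values (per mil); the paper's measured values and endmember choices will differ:

    ```python
    def autotrophic_fraction(d14c_lipid, d14c_dic, d14c_oc):
        """Fraction of lipid carbon fixed from dissolved inorganic carbon, from
        d14c_lipid = f * d14c_dic + (1 - f) * d14c_oc."""
        return (d14c_lipid - d14c_oc) / (d14c_dic - d14c_oc)

    # Hypothetical values at depth: lipids slightly enriched relative to DIC
    print(round(autotrophic_fraction(-92.0, -140.0, 60.0), 2))  # 0.76 autotrophic
    ```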

  18. Quantifying excessive mirror overflow in children with attention-deficit/hyperactivity disorder

    PubMed Central

    MacNeil, L.K.; Xavier, P.; Garvey, M.A.; Gilbert, D.L.; Ranta, M.E.; Denckla, M.B.

    2011-01-01

    Objectives: Qualitative observations have revealed that children with attention-deficit/hyperactivity disorder (ADHD) show increased overflow movements, a motor sign thought to reflect impaired inhibitory control. The goal of this study was to develop and implement methods for quantifying excessive mirror overflow movements in children with ADHD. Methods: Fifty right-handed children aged 8.2–13.3 years, 25 with ADHD (12 girls) and 25 typically developing (TD) control children (10 girls), performed a sequential finger-tapping task, completing both left-handed (LHFS) and right-handed finger sequencing (RHFS). Phasic overflow of the index and ring fingers was assessed in 34 children with video recording, and total overflow in 48 children was measured by calculating the total angular displacement of the index and ring fingers with electrogoniometer recordings. Results: Phasic overflow and total overflow across both hands were greater in children with ADHD than in TD children, particularly during LHFS. Separate gender analyses revealed that boys, but not girls, with ADHD showed significantly more total phasic overflow and total overflow than did their gender-matched control children. Conclusions: The quantitative overflow measures used in this study support past qualitative findings that motor overflow persists to a greater degree in children with ADHD than in age-matched TD peers. The quantitative findings further suggest that persistence of mirror overflow is more prominent during task execution of the nondominant hand and reveal gender-based differences in developmental neural systems critical to motor control. These quantitative measures will assist future physiologic investigation of the brain basis of motor control in ADHD. PMID:21321336

  19. Integrating quantitative PCR and Bayesian statistics in quantifying human adenoviruses in small volumes of source water.

    PubMed

    Wu, Jianyong; Gronewold, Andrew D; Rodriguez, Roberto A; Stewart, Jill R; Sobsey, Mark D

    2014-02-01

    Rapid quantification of viral pathogens in drinking and recreational water can help reduce waterborne disease risks. For this purpose, samples in small volume (e.g. 1L) are favored because of the convenience of collection, transportation and processing. However, the results of viral analysis are often subject to uncertainty. To overcome this limitation, we propose an approach that integrates Bayesian statistics, efficient concentration methods, and quantitative PCR (qPCR) to quantify viral pathogens in water. Using this approach, we quantified human adenoviruses (HAdVs) in eighteen samples of source water collected from six drinking water treatment plants. HAdVs were found in seven samples. In the other eleven samples, HAdVs were not detected by qPCR, but might have existed based on Bayesian inference. Our integrated approach that quantifies uncertainty provides a better understanding than conventional assessments of potential risks to public health, particularly in cases when pathogens may present a threat but cannot be detected by traditional methods. © 2013 Elsevier B.V. All rights reserved.
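
    The Bayesian step matters most for non-detects: a qPCR zero does not prove absence. The sketch below is a minimal stand-in for the paper's model, assuming Poisson sampling of genome copies in the assayed volume and a flat prior on concentration:

    ```python
    import numpy as np

    def posterior_after_nondetect(volume_l=1.0, grid_max=10.0, n=2000):
        """Posterior density over concentration (copies/L) given zero detected
        copies: likelihood exp(-c * V) under Poisson sampling, flat prior."""
        c = np.linspace(0.0, grid_max, n)
        likelihood = np.exp(-c * volume_l)     # P(0 copies | concentration c)
        posterior = likelihood / np.trapz(likelihood, c)
        return c, posterior

    c, post = posterior_after_nondetect(volume_l=1.0)
    mask = c <= 1.0
    print(round(np.trapz(post[mask], c[mask]), 2))  # ~0.63; HAdV may still be present
    ```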

  20. Quantifying aboveground forest carbon pools and fluxes from repeat LiDAR surveys

    Treesearch

    Andrew T. Hudak; Eva K. Strand; Lee A. Vierling; John C. Byrne; Jan U. H. Eitel; Sebastian Martinuzzi; Michael J. Falkowski

    2012-01-01

    Sound forest policy and management decisions to mitigate rising atmospheric CO2 depend upon accurate methodologies to quantify forest carbon pools and fluxes over large tracts of land. LiDAR remote sensing is a rapidly evolving technology for quantifying aboveground biomass and thereby carbon pools; however, little work has evaluated the efficacy of repeat LiDAR...

  1. GRACE, GLDAS and measured groundwater data products show water storage loss in Western Jilin, China.

    PubMed

    Moiwo, Juana Paul; Lu, Wenxi; Tao, Fulu

    2012-01-01

    Water storage depletion is a worsening hydrological problem that limits agricultural production, especially in arid/semi-arid regions across the globe. Quantifying water storage dynamics is critical for developing water resources management strategies that are sustainable and protective of the environment. This study uses GRACE (Gravity Recovery and Climate Experiment), GLDAS (Global Land Data Assimilation System) and measured groundwater data products to quantify water storage in Western Jilin (a proxy for semi-arid wetland ecosystems) for the period from January 2002 to December 2009. Uncertainty/bias analysis shows that the data products have an average error <10% (p < 0.05). Comparisons of the storage variables show favorable agreements at various temporal cycles, with R² = 0.92 and RMSE = 7.43 mm at the average seasonal cycle. There is a narrowing soil moisture storage change, a widening groundwater storage loss, and an overall storage depletion of 0.85 mm/month in the region. Soil-pore collapse and land subsidence due to storage depletion are possible in the study area. Storage depletion in this semi-arid region could have negative implications for agriculture, valuable/fragile wetland ecosystems and people's livelihoods. For sustainable restoration and preservation of wetland ecosystems in the region, it is critical to develop water resources management strategies that limit the groundwater extraction rate to the recharge rate.

  2. Quantifying clustered DNA damage induction and repair by gel electrophoresis, electronic imaging and number average length analysis

    NASA Technical Reports Server (NTRS)

    Sutherland, Betsy M.; Georgakilas, Alexandros G.; Bennett, Paula V.; Laval, Jacques; Sutherland, John C.; Gewirtz, A. M. (Principal Investigator)

    2003-01-01

    Assessing DNA damage induction, repair and consequences of such damages requires measurement of specific DNA lesions by methods that are independent of biological responses to such lesions. Lesions affecting one DNA strand (altered bases, abasic sites, single strand breaks (SSB)) as well as damages affecting both strands (clustered damages, double strand breaks) can be quantified by direct measurement of DNA using gel electrophoresis, gel imaging and number average length analysis. Damage frequencies as low as a few sites per gigabase pair (10⁹ bp) can be quantified by this approach in about 50 ng of non-radioactive DNA, and single molecule methods may allow such measurements in DNA from single cells. This review presents the theoretical basis, biochemical requirements and practical aspects of this approach, and shows examples of its applications in identification and quantitation of complex clustered damages.
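
    The quantitative core of number average length analysis is the standard relation for randomly distributed lesions; it is stated here from general principles rather than quoted from the review:

```latex
% Lesion frequency per base pair, where <L_n> is the number-average
% fragment length (in bp) of lesion-cleaved treated and control DNA:
\phi = \frac{1}{\langle L_n \rangle_{\mathrm{treated}}}
     - \frac{1}{\langle L_n \rangle_{\mathrm{control}}}
```

    Subtracting the control term removes background breaks, so phi reflects only the induced lesions.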

  3. CellTrans: An R Package to Quantify Stochastic Cell State Transitions.

    PubMed

    Buder, Thomas; Deutsch, Andreas; Seifert, Michael; Voss-Böhme, Anja

    2017-01-01

    Many normal and cancerous cell lines exhibit a stable composition of cells in distinct states which can, e.g., be defined on the basis of cell surface markers. There is evidence that such an equilibrium is associated with stochastic transitions between distinct states. Quantifying these transitions has the potential to better understand cell lineage compositions. We introduce CellTrans, an R package to quantify stochastic cell state transitions from cell state proportion data from fluorescence-activated cell sorting and flow cytometry experiments. The R package is based on a mathematical model in which cell state alterations occur due to stochastic transitions between distinct cell states whose rates only depend on the current state of a cell. CellTrans is an automated tool for estimating the underlying transition probabilities from appropriately prepared data. We point out potential analytical challenges in the quantification of these cell transitions and explain how CellTrans handles them. The applicability of CellTrans is demonstrated on publicly available data on the evolution of cell state compositions in cancer cell lines. We show that CellTrans can be used to (1) infer the transition probabilities between different cell states, (2) predict cell line compositions at a certain time, (3) predict equilibrium cell state compositions, and (4) estimate the time needed to reach this equilibrium. We provide an implementation of CellTrans in R, freely available via GitHub (https://github.com/tbuder/CellTrans).
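
    CellTrans itself is an R package; the estimation idea it automates can be sketched in a few lines of Python (toy two-state data, not the package's actual algorithm):

```python
import numpy as np
from scipy.optimize import nnls

# If cell-state proportions evolve as p(t+1) = p(t) @ M for a Markov
# transition matrix M, then M can be estimated from sorted-subpopulation
# time courses by non-negative least squares plus row renormalization.
P_t  = np.array([[1.0, 0.0],     # start from a pure state-A culture
                 [0.0, 1.0]])    # and a pure state-B culture
P_t1 = np.array([[0.9, 0.1],     # observed proportions one step later
                 [0.2, 0.8]])    # (invented FACS-style data)

M = np.vstack([nnls(P_t, P_t1[:, j])[0] for j in range(P_t1.shape[1])]).T
M /= M.sum(axis=1, keepdims=True)      # rows must sum to 1
print("estimated transition matrix:\n", M)

# equilibrium composition = leading left eigenvector of M
w, v = np.linalg.eig(M.T)
pi = np.real(v[:, np.argmax(np.real(w))])
print("equilibrium proportions:", pi / pi.sum())
```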

  4. Pendulum Underwater--An Approach for Quantifying Viscosity

    ERIC Educational Resources Information Center

    Leme, José Costa; Oliveira, Agostinho

    2017-01-01

    The purpose of the experiment presented in this paper is to quantify the viscosity of a liquid. Viscous effects are important in the flow of fluids in pipes, in the bloodstream, in the lubrication of engine parts, and in many other situations. In the present paper, the authors explore the oscillations of a physical pendulum in the form of a long…

  5. Quantifying the Thermal Fatigue of CPV Modules

    NASA Astrophysics Data System (ADS)

    Bosco, Nick; Kurtz, Sarah

    2010-10-01

    A method is presented to quantify thermal fatigue in the CPV die-attach from meteorological data. A comparative study between cities demonstrates a significant difference in the accumulated damage. These differences are most sensitive to the number of larger-ΔT thermal cycles experienced at a location. High-frequency data (<1/min) may be required to most accurately employ this method.
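
    A minimal sketch of how such a city-to-city comparison can be set up, assuming a Coffin-Manson-style power law in ΔT; the exponent and the cycle lists are invented for illustration, not the paper's fitted values:

```python
import numpy as np

def accumulated_damage(delta_T_cycles, m=2.0):
    """Relative die-attach damage summed over thermal cycles.

    Assumes damage per cycle scales as (delta T)**m; in practice the
    per-cycle delta T values would come from cycle counting of a site's
    temperature record.
    """
    dT = np.asarray(delta_T_cycles, dtype=float)
    return float(np.sum(dT ** m))

city_a = [30.0, 28.0, 25.0] * 365   # large daily temperature swings
city_b = [12.0, 10.0, 9.0] * 365    # milder climate
print("damage ratio A/B:", accumulated_damage(city_a) / accumulated_damage(city_b))
```

    Because the law is superlinear in ΔT, a handful of large cycles can dominate the total, which is why the count of larger-ΔT cycles drives the differences between locations.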

  6. Suitability of ANSI standards for quantifying communication satellite system performance

    NASA Technical Reports Server (NTRS)

    Cass, Robert D.

    1988-01-01

    A study on the application of American National Standards X3.102 and X3.141 to various classes of communication satellite systems, from the simple analog bent-pipe to NASA's Advanced Communications Technology Satellite (ACTS), is discussed. These standards are proposed as means for quantifying the end-to-end communication system performance of communication satellite systems. An introductory overview of the two standards is given, followed by a review of the characteristics, applications, and advantages of using X3.102 and X3.141 for quantification, and a description of the application of these standards to ACTS.

  7. Quantifying the Temporal Inequality of Nutrient Loads with a Novel Metric

    NASA Astrophysics Data System (ADS)

    Gall, H. E.; Schultz, D.; Rao, P. S.; Jawitz, J. W.; Royer, M.

    2015-12-01

    Inequality is an emergent property of many complex systems. For a given series of stochastic events, some events generate a disproportionately large contribution to system responses compared to other events. In catchments, such responses cause streamflow and solute loads to exhibit strong temporal inequality, with the vast majority of discharge and solute loads exported during short periods of time during which high-flow events occur. These periods of time are commonly referred to as "hot moments". Although this temporal inequality is widely recognized, there is currently no uniform metric for assessing it. We used a novel application of Lorenz Inequality, a method commonly used in economics to quantify income inequality, to quantify the spatial and temporal inequality of streamflow and nutrient (nitrogen and phosphorus) loads exported to the Chesapeake Bay. Lorenz Inequality and the corresponding Gini Coefficient provide an analytical tool for quantifying inequality that can be applied at any temporal or spatial scale. The Gini coefficient (G) is a formal measure of inequality that varies from 0 to 1, with a value of 0 indicating perfect equality (i.e., fluxes and loads are constant in time) and 1 indicating perfect inequality (i.e., all of the discharge and solute loads are exported during one instant in time). Therefore, G is a simple yet powerful tool for providing insight into the temporal inequality of nutrient transport. We will present the results of our detailed analysis of streamflow and nutrient time series data collected since the early 1980's at 30 USGS gauging stations in the Chesapeake Bay watershed. The analysis is conducted at an annual time scale, enabling trends and patterns to be assessed both temporally (over time at each station) and spatially (for the same period of time across stations). The results of this analysis have the potential to create a transformative new framework for identifying "hot moments", improving our ability to temporally
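
    As a concrete illustration, G can be computed from a load time series with the standard discrete Lorenz-curve estimator (a generic sketch, not the authors' code):

```python
import numpy as np

def gini(loads):
    """Gini coefficient of temporal inequality for per-interval loads.

    0 means perfectly uniform export in time; values near 1 mean nearly
    all load is exported in a few intervals ("hot moments").
    """
    x = np.sort(np.asarray(loads, dtype=float))
    n = x.size
    lorenz = np.cumsum(x) / x.sum()          # cumulative load share
    # G = 1 - 2 * (area under the Lorenz curve), right-endpoint rule
    return 1.0 - 2.0 * np.sum(lorenz) / n + 1.0 / n

uniform = np.ones(365)                 # constant daily load
flashy = np.zeros(365)
flashy[:5] = 1.0                       # all export during 5 event days
print(gini(uniform))                   # ~0.0
print(gini(flashy))                    # ~0.99
```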

  8. Correlation between plasma endothelin-1 levels and severity of septic liver failure quantified by maximal liver function capacity (LiMAx test). A prospective study

    PubMed Central

    Kaffarnik, Magnus F.; Ahmadi, Navid; Lock, Johan F.; Wuensch, Tilo; Pratschke, Johann; Stockmann, Martin; Malinowski, Maciej

    2017-01-01

    Aim To investigate the relationship between the degree of liver dysfunction, quantified by maximal liver function capacity (LiMAx test), and endothelin-1, TNF-α and IL-6 in septic surgical patients. Methods 28 septic patients (8 female, 20 male, age range 35–80y) were prospectively investigated on a surgical intensive care unit. Liver function, defined by LiMAx test, and measurements of plasma levels of endothelin-1, TNF-α and IL-6 were carried out within the first 24 hours after onset of septic symptoms, followed by days 2, 5 and 10. Patients were divided into 2 groups (group A: LiMAx ≥100 μg/kg/h, moderate liver dysfunction; group B: LiMAx <100 μg/kg/h, severe liver dysfunction) for analysis and investigated regarding the correlation between endothelin-1 and the severity of liver failure, quantified by LiMAx test. Results Group B showed significantly higher results for endothelin-1 than patients in group A (P = 0.01, d5; 0.02, d10). For TNF-α, group B revealed higher results than group A, with a significant difference on day 10 (P = 0.005). IL-6 showed a non-significant trend toward higher results in group B. The Spearman's rank correlation coefficient revealed a significant correlation between LiMAx and endothelin-1 (-0.434; P <0.001), TNF-α (-0.515; P <0.001) and IL-6 (-0.590; P <0.001). Conclusions Sepsis-related hepatic dysfunction is associated with elevated plasma levels of endothelin-1, TNF-α and IL-6. Low LiMAx results combined with increased endothelin-1 and TNF-α, and the significant correlation between LiMAx and cytokine values, support a crucial role of endothelin-1 and TNF-α in the development of septic liver failure. PMID:28542386

  9. Nonlinear Least-Squares Based Method for Identifying and Quantifying Single and Mixed Contaminants in Air with an Electronic Nose

    PubMed Central

    Zhou, Hanying; Homer, Margie L.; Shevade, Abhijit V.; Ryan, Margaret A.

    2006-01-01

    The Jet Propulsion Laboratory has recently developed and built an electronic nose (ENose) using a polymer-carbon composite sensing array. This ENose is designed to be used for air quality monitoring in an enclosed space, and is designed to detect, identify and quantify common contaminants at concentrations in the parts-per-million range. Its capabilities were demonstrated in an experiment aboard the National Aeronautics and Space Administration's Space Shuttle Flight STS-95. This paper describes a modified nonlinear least-squares based algorithm developed to analyze data taken by the ENose, and its performance for the identification and quantification of single gases and binary mixtures of twelve target analytes in clean air. Results from laboratory-controlled events demonstrate the effectiveness of the algorithm to identify and quantify a gas event if concentration exceeds the ENose detection threshold. Results from the flight test demonstrate that the algorithm correctly identifies and quantifies all registered events (planned or unplanned, as singles or mixtures) with no false positives and no inconsistencies with the logged events and the independent analysis of air samples.
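
    The JPL algorithm is a nonlinear fit tailored to the polymer-carbon sensor response; a linearized sketch still conveys the core inversion, recovering mixture concentrations from a known fingerprint matrix (all matrices and values below are invented):

```python
import numpy as np
from scipy.optimize import nnls

# Assume each analyte has a fingerprint (response per ppm on each sensor)
# and that array responses add linearly. Non-negative least squares then
# recovers the concentration vector; an "event" is identified when a
# recovered concentration exceeds the detection threshold.
fingerprints = np.array([[0.8, 0.1],    # 4 sensors x 2 analytes
                         [0.3, 0.9],
                         [0.5, 0.4],
                         [0.1, 0.7]])
true_conc = np.array([5.0, 2.0])        # ppm, a binary mixture
rng = np.random.default_rng(0)
reading = fingerprints @ true_conc + 0.01 * rng.standard_normal(4)

conc, residual = nnls(fingerprints, reading)
print("recovered concentrations (ppm):", conc)
```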

  10. Quantifying side-chain conformational variations in protein structure

    PubMed Central

    Miao, Zhichao; Cao, Yang

    2016-01-01

    Protein side-chain conformations are closely related to proteins' biological functions. Side-chain prediction is a key step in protein design, protein docking and structure optimization. However, side-chain polymorphism is widespread in proteins, takes various forms, and has long been overlooked in side-chain prediction. Such conformational variations have not been quantitatively studied, and the correlations between these variations and residue features remain vague. Here, we performed statistical analyses on large-scale data sets and found that side-chain conformational flexibility is closely related to solvent exposure, degrees of freedom and hydrophilicity. These analyses allowed us to quantify different types of side-chain variability in the PDB. The results underscore that protein side-chain conformation prediction is not a single-answer problem, leading us to reconsider the assessment approaches of side-chain prediction programs. PMID:27845406

  11. Quantifying side-chain conformational variations in protein structure

    NASA Astrophysics Data System (ADS)

    Miao, Zhichao; Cao, Yang

    2016-11-01

    Protein side-chain conformations are closely related to proteins' biological functions. Side-chain prediction is a key step in protein design, protein docking and structure optimization. However, side-chain polymorphism is widespread in proteins, takes various forms, and has long been overlooked in side-chain prediction. Such conformational variations have not been quantitatively studied, and the correlations between these variations and residue features remain vague. Here, we performed statistical analyses on large-scale data sets and found that side-chain conformational flexibility is closely related to solvent exposure, degrees of freedom and hydrophilicity. These analyses allowed us to quantify different types of side-chain variability in the PDB. The results underscore that protein side-chain conformation prediction is not a single-answer problem, leading us to reconsider the assessment approaches of side-chain prediction programs.

  12. Quantifying side-chain conformational variations in protein structure.

    PubMed

    Miao, Zhichao; Cao, Yang

    2016-11-15

    Protein side-chain conformations are closely related to proteins' biological functions. Side-chain prediction is a key step in protein design, protein docking and structure optimization. However, side-chain polymorphism is widespread in proteins, takes various forms, and has long been overlooked in side-chain prediction. Such conformational variations have not been quantitatively studied, and the correlations between these variations and residue features remain vague. Here, we performed statistical analyses on large-scale data sets and found that side-chain conformational flexibility is closely related to solvent exposure, degrees of freedom and hydrophilicity. These analyses allowed us to quantify different types of side-chain variability in the PDB. The results underscore that protein side-chain conformation prediction is not a single-answer problem, leading us to reconsider the assessment approaches of side-chain prediction programs.

  13. Quantifying spatial genetic structuring in mesophotic populations of the precious coral Corallium rubrum.

    PubMed

    Costantini, Federica; Carlesi, Lorenzo; Abbiati, Marco

    2013-01-01

    While shallow-water red coral populations have been overharvested in the past, commercial harvesting has nowadays shifted its pressure onto mesophotic organisms. An understanding of red coral population structure, particularly larval dispersal patterns and connectivity among harvested populations, is paramount to the viability of the species. In order to determine patterns of genetic spatial structuring of deep-water Corallium rubrum populations, colonies found at 58–118 m depth within the Tyrrhenian Sea were collected and analyzed for the first time. Ten microsatellite loci and two regions of mitochondrial DNA (mtMSH and mtC) were used to quantify patterns of genetic diversity within populations and to define population structuring at spatial scales from tens of metres to hundreds of kilometres. Microsatellites showed heterozygote deficiencies in all populations. Significant levels of genetic differentiation were observed at all investigated spatial scales, suggesting that populations are likely to be isolated. This differentiation may be the result of biological interactions occurring at small spatial scales and/or abiotic factors acting at larger scales. Mitochondrial markers revealed significant genetic structuring at spatial scales greater than 100 km, showing the occurrence of a barrier to gene flow between northern and southern Tyrrhenian populations. These findings provide support for the establishment of marine protected areas in the deep sea and offshore reefs, in order to effectively maintain the genetic diversity of mesophotic red coral populations.

  14. Quantifying the value of redundant measurements at GCOS Reference Upper-Air Network sites

    DOE PAGES

    Madonna, F.; Rosoldi, M.; Güldner, J.; ...

    2014-11-19

    The potential for measurement redundancy to reduce uncertainty in atmospheric variables has not been investigated comprehensively for climate observations. We evaluated the usefulness of entropy and mutual correlation concepts, as defined in information theory, for quantifying random uncertainty and redundancy in time series of the integrated water vapour (IWV) and water vapour mixing ratio profiles provided by five highly instrumented GRUAN (GCOS, Global Climate Observing System, Reference Upper-Air Network) stations in 2010–2012. Results show that the random uncertainties on the IWV measured with radiosondes, global positioning system, microwave and infrared radiometers, and Raman lidar measurements differed by less than 8%. Comparisons of time series of IWV content from ground-based remote sensing instruments with in situ soundings showed that microwave radiometers have the highest redundancy with the IWV time series measured by radiosondes and therefore the highest potential to reduce the random uncertainty of the radiosondes time series. Moreover, the random uncertainty of a time series from one instrument can be reduced by ~ 60% by constraining the measurements with those from another instrument. The best reduction of random uncertainty is achieved by conditioning Raman lidar measurements with microwave radiometer measurements. In conclusion, specific instruments are recommended for atmospheric water vapour measurements at GRUAN sites. This approach can be applied to the study of redundant measurements for other climate variables.
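
    A histogram-based sketch of the quantities involved; the estimator, bin count, and synthetic series below are illustrative choices, not the paper's:

```python
import numpy as np

def mutual_information(x, y, bins=20):
    """Histogram estimate of mutual information (bits) between two series.

    High MI between, e.g., radiosonde and microwave-radiometer IWV series
    indicates redundancy: one series can constrain (reduce the random
    uncertainty of) the other.
    """
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy /= pxy.sum()
    px, py = pxy.sum(axis=1), pxy.sum(axis=0)
    nz = pxy > 0
    return float(np.sum(pxy[nz] * np.log2(pxy[nz] / np.outer(px, py)[nz])))

rng = np.random.default_rng(0)
iwv_sonde = rng.normal(20.0, 5.0, 5000)              # toy IWV series (mm)
iwv_mwr = iwv_sonde + rng.normal(0.0, 1.0, 5000)     # redundant instrument
print(mutual_information(iwv_sonde, iwv_mwr))        # large: redundant
print(mutual_information(iwv_sonde, rng.normal(20.0, 5.0, 5000)))  # near 0
```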

  15. A Multiple Degree of Freedom Lower Extremity Isometric Device to Simultaneously Quantify Hip, Knee and Ankle Torques

    PubMed Central

    Sánchez, Natalia; Acosta, Ana Maria; Stienen, Arno H.A.

    2015-01-01

    Characterization of the joint torque coupling strategies used in the lower extremity to generate maximal and submaximal levels of torque at either the hip, knee or ankle is lacking. Currently, there are no available isometric devices that quantify all concurrent joint torques in the hip, knee and ankle of a single leg during maximum voluntary torque generation. Thus, joint-torque coupling strategies at the hip and knee, concurrent torques at the ankle, and coupling patterns at the hip and knee driven by the ankle have yet to be quantified. This manuscript describes the design, implementation and validation of a multiple degree of freedom, lower extremity isometric device (the MultiLEIT) that accurately quantifies simultaneous torques at the hip, knee and ankle. The system was mechanically validated and then implemented with two healthy control individuals and two post-stroke individuals to test usability and patient acceptance. Data indicated different joint torque coupling strategies used by the two healthy individuals. In contrast, data showed the same torque coupling patterns in both post-stroke individuals, comparable to those described in the clinic. Successful implementation of the MultiLEIT can contribute to the understanding of the underlying mechanisms responsible for abnormal movement patterns and aid in the design of therapeutic interventions. PMID:25163064

  16. Visualising uncertainty: interpreting quantified geoscientific inversion outputs for a diverse user community.

    NASA Astrophysics Data System (ADS)

    Reading, A. M.; Morse, P. E.; Staal, T.

    2017-12-01

    Geoscientific inversion outputs, such as seismic tomography contour images, are finding increasing use amongst scientific user communities that have limited knowledge of the impact of output parameter uncertainty on subsequent interpretations made from such images. We make use of a newly written computer application which enables seismic tomography images to be displayed in a performant 3D graphics environment. This facilitates the mapping of colour scales to the human visual sensorium for the interactive interpretation of contoured inversion results incorporating parameter uncertainty. Two case examples of seismic tomography inversions or contoured compilations are compared from the southern hemisphere continents of Australia and Antarctica. The Australian example is based on the AuSREM contoured seismic wavespeed model while the Antarctic example is a valuable but less well constrained result. Through adjusting the multiple colour gradients, layer separations, opacity, illumination, shadowing and background effects, we can optimise the insights obtained from the 3D structure in the inversion compilation or result. Importantly, we can also limit the display to show information in a way that is mapped to the uncertainty in the 3D result. Through this practical application, we demonstrate that the uncertainty in the result can be handled through a well-posed mapping of the parameter values to displayed colours in the knowledge of what is perceived visually by a typical human. We found that this approach maximises the chance of a useful tectonic interpretation by a diverse scientific user community. In general, we develop the idea that quantified inversion uncertainty can be used to tailor the way that the output is presented to the analyst for scientific interpretation.

  17. Quantifying Evaporation and Evaluating Runoff Estimation Methods in a Permeable Pavement System - abstract

    EPA Science Inventory

    Studies on quantifying evaporation in permeable pavement systems are limited to few laboratory studies that used a scale to weigh evaporative losses and a field application with a tunnel-evaporation gauge. A primary objective of this research was to quantify evaporation for a la...

  18. "Quenchbodies": quench-based antibody probes that show antigen-dependent fluorescence.

    PubMed

    Abe, Ryoji; Ohashi, Hiroyuki; Iijima, Issei; Ihara, Masaki; Takagi, Hiroaki; Hohsaka, Takahiro; Ueda, Hiroshi

    2011-11-02

    Here, we describe a novel reagentless fluorescent biosensor strategy based on the antigen-dependent removal of a quenching effect on a fluorophore attached to antibody domains. Using a cell-free translation-mediated position-specific protein labeling system, we found that an antibody single chain variable region (scFv) that had been fluorolabeled at the N-terminal region showed a significant antigen-dependent fluorescence enhancement. Investigation of the enhancement mechanism by mutagenesis of the carboxytetramethylrhodamine (TAMRA)-labeled anti-osteocalcin scFv showed that the antigen dependence relied on semiconserved tryptophan residues near the V(H)/V(L) interface. This suggested that the binding of the antigen led to the interruption of a quenching effect caused by the proximity of tryptophan residues to the linker-tagged fluorophore. Using TAMRA-scFv, many targets, including peptides, proteins, and haptens such as morphine-related drugs, could be quantified. Similar or higher sensitivities to those observed in competitive ELISA were obtained, even in human plasma. Because of its versatility, this "quenchbody" is expected to have a range of applications, from in vitro diagnostics, to imaging of various targets in situ.

  19. Quantifying a Negative: How Homeland Security Adds Value

    DTIC Science & Technology

    2015-12-01

    access to future victims. The law enforcement agency could then identify and quantify the value of future crimes. For example, if a serial killer is captured with evidence of the next victim or an established pattern of victimization, network theory could be used to identify the next

  20. Thermophoresis in nanoliter droplets to quantify aptamer binding.

    PubMed

    Seidel, Susanne A I; Markwardt, Niklas A; Lanzmich, Simon A; Braun, Dieter

    2014-07-21

    Biomolecule interactions are central to pharmacology and diagnostics. These interactions can be quantified by thermophoresis, the directed molecule movement along a temperature gradient. It is sensitive to binding induced changes in size, charge, or conformation. Established capillary measurements require at least 0.5 μL per sample. We cut down sample consumption by a factor of 50, using 10 nL droplets produced with acoustic droplet robotics (Labcyte). Droplets were stabilized in an oil-surfactant mix and locally heated with an IR laser. Temperature increase, Marangoni flow, and concentration distribution were analyzed by fluorescence microscopy and numerical simulation. In 10 nL droplets, we quantified AMP-aptamer affinity, cooperativity, and buffer dependence. Miniaturization and the 1536-well plate format make the method high-throughput and automation friendly. This promotes innovative applications for diagnostic assays in human serum or label-free drug discovery screening. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  1. Optimization of ELISA Conditions to Quantify Colorectal Cancer Antigen-Antibody Complex Protein (GA733-FcK) Expressed in Transgenic Plant

    PubMed Central

    Ahn, Junsik; Lee, Kyung Jin

    2014-01-01

    The purpose of this study is to optimize ELISA conditions to quantify the colorectal cancer antigen GA733 linked to the Fc antibody fragment fused to KDEL, an ER retention motif (GA733-FcK), expressed in transgenic plant. Variable conditions of capture antibody, blocking buffer, and detection antibody for ELISA were optimized with application of leaf extracts from transgenic plant expressing GA733-FcK. For the detection antibody, anti-EpCAM/CD362 IgG recognizing GA733 did not detect any GA733-FcK, whereas anti-human Fc IgG recognized the human Fc present in the plant leaf extracts. For blocking buffer conditions, 3% BSA buffer clearly blocked the plate compared to the 5% skim-milk buffer. For the capture antibody, monoclonal antibody (MAb) CO17-1A was applied to coat the plate in different amounts (1, 0.5, and 0.25 μg/well). Of these, 1 and 0.5 μg/well of capture antibody showed similar absorbance, whereas 0.25 μg/well showed significantly less absorbance. Taken together, the optimized conditions to quantify plant-derived GA733-FcK were 0.5 μg/well of MAb CO17-1A as the capture antibody, 3% BSA as the blocking buffer, and HRP-conjugated anti-human Fc IgG as the detection antibody. To confirm the optimized ELISA conditions, correlation analysis was conducted between the amount of GA733-FcK quantified by ELISA and its protein density values for different leaf samples in Western blot. The coefficient of determination R² between the ELISA-quantified value and the protein density was 0.85 (p<0.01), which indicates that the optimized ELISA conditions feasibly provide quantitative information on GA733-FcK expression in transgenic plant. PMID:24555929

  2. Quantifying microstructural dynamics and electrochemical activity of graphite and silicon-graphite lithium ion battery anodes

    NASA Astrophysics Data System (ADS)

    Pietsch, Patrick; Westhoff, Daniel; Feinauer, Julian; Eller, Jens; Marone, Federica; Stampanoni, Marco; Schmidt, Volker; Wood, Vanessa

    2016-09-01

    Despite numerous studies presenting advances in tomographic imaging and analysis of lithium ion batteries, graphite-based anodes have received little attention. Weak X-ray attenuation of graphite and, as a result, poor contrast between graphite and the other carbon-based components in the electrode pore space render data analysis challenging. Here we demonstrate operando tomography of weakly attenuating electrodes during electrochemical (de)lithiation. We use propagation-based phase contrast tomography to facilitate the differentiation between weakly attenuating materials and apply digital volume correlation to capture the dynamics of the electrodes during operation. After validating that we can quantify the local electrochemical activity and microstructural changes throughout graphite electrodes, we apply our technique to graphite-silicon composite electrodes. We show that microstructural changes that occur during (de)lithiation of a pure graphite electrode are of the same order of magnitude as spatial inhomogeneities within it, while strain in composite electrodes is locally pronounced and introduces significant microstructural changes.

  3. Comparison of algorithms to quantify muscle fatigue in upper limb muscles based on sEMG signals.

    PubMed

    Kahl, Lorenz; Hofmann, Ulrich G

    2016-11-01

    This work compared the performance of six different fatigue detection algorithms quantifying muscle fatigue based on electromyographic signals. Surface electromyography (sEMG) was recorded from upper-arm contractions at three different load levels in twelve volunteers. The fatigue detection algorithms mean frequency (MNF), spectral moments ratio (SMR), the wavelet method WIRM1551, sample entropy (SampEn), fuzzy approximate entropy (fApEn) and recurrence quantification analysis (RQA%DET) were calculated. The resulting fatigue signals were compared with respect to the disturbances encountered in fatiguing situations and to their ability to differentiate the load levels. Furthermore, we investigated the influence of the electrode locations on the fatigue detection quality and whether an optimized channel set is reasonable. The results of the MNF, SMR, WIRM1551 and fApEn algorithms fell close together. Due to the small number of subjects in this study, significant differences could not be found. In terms of disturbances, the SMR algorithm showed a slight tendency to outperform the others. Copyright © 2016 IPEM. Published by Elsevier Ltd. All rights reserved.
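
    Of the six algorithms, MNF is the simplest to state: the power-weighted mean of the sEMG spectrum, which drifts downward as a muscle fatigues. A sketch using a Welch spectrum (signal frequencies and noise levels are invented):

```python
import numpy as np
from scipy.signal import welch

def mean_frequency(emg, fs=1000.0):
    """Mean (power-weighted) frequency of an sEMG epoch, in Hz."""
    f, pxx = welch(emg, fs=fs, nperseg=512)
    return float(np.sum(f * pxx) / np.sum(pxx))

rng = np.random.default_rng(1)
fs = 1000.0
t = np.arange(0.0, 1.0, 1.0 / fs)
# fresh muscle: spectral content centered higher than fatigued muscle
fresh = np.sin(2 * np.pi * 120 * t) + 0.3 * rng.standard_normal(t.size)
tired = np.sin(2 * np.pi * 70 * t) + 0.3 * rng.standard_normal(t.size)
print(mean_frequency(fresh, fs), ">", mean_frequency(tired, fs))
```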

  4. A Rapid Method for Quantifying Viable Mycobacterium avium subsp. paratuberculosis in Cellular Infection Assays

    PubMed Central

    Pooley, Hannah B.; de Silva, Kumudika; Purdie, Auriol C.; Begg, Douglas J.; Whittington, Richard J.

    2016-01-01

    ABSTRACT Determining the viability of bacteria is a key outcome of in vitro cellular infection assays. Currently, this is done by culture, which is problematic for fastidious slow-growing bacteria such as Mycobacterium avium subsp. paratuberculosis, where it can take up to 4 months to confirm growth. This study aimed to identify an assay that can rapidly quantify the number of viable M. avium subsp. paratuberculosis cells in a cellular sample. Three commercially available bacterial viability assays along with a modified liquid culture method coupled with high-throughput quantitative PCR growth detection were assessed. Criteria for assessment included the ability of each assay to differentiate live and dead M. avium subsp. paratuberculosis organisms and their accuracy at low bacterial concentrations. Using the culture-based method, M. avium subsp. paratuberculosis growth was reliably detected and quantified within 2 weeks. There was a strong linear association between the 2-week growth rate and the initial inoculum concentration. The number of viable M. avium subsp. paratuberculosis cells in an unknown sample was quantified based on the growth rate, by using growth standards. In contrast, none of the commercially available viability assays were suitable for use with samples from in vitro cellular infection assays. IMPORTANCE Rapid quantification of the viability of Mycobacterium avium subsp. paratuberculosis in samples from in vitro cellular infection assays is important, as it allows these assays to be carried out on a large scale. In vitro cellular infection assays can function as a preliminary screening tool, for vaccine development or antimicrobial screening, and also to extend findings derived from experimental animal trials. Currently, by using culture, it takes up to 4 months to obtain quantifiable results regarding M. avium subsp. paratuberculosis viability after an in vitro infection assay; however, with the quantitative PCR and liquid culture method

  5. Feasibility of Quantifying Arterial Cerebral Blood Volume Using Multiphase Alternate Ascending/Descending Directional Navigation (ALADDIN).

    PubMed

    Kim, Ki Hwan; Choi, Seung Hong; Park, Sung-Hong

    2016-01-01

    Arterial cerebral blood volume (aCBV) is associated with many physiologic and pathologic conditions. Recently, multiphase balanced steady state free precession (bSSFP) readout was introduced to measure labeled blood signals in the arterial compartment, based on the fact that the signal difference between labeled and unlabeled blood decreases with the number of RF pulses, which is affected by blood velocity. In this study, we evaluated the feasibility of a new 2D inter-slice bSSFP-based arterial spin labeling (ASL) technique, termed alternate ascending/descending directional navigation (ALADDIN), to quantify aCBV using multiphase acquisition in six healthy subjects. A new kinetic model considering bSSFP RF perturbations was proposed to describe the multiphase data and thus to quantify aCBV. Since the inter-slice time delay (TD) and gap affected the distribution of labeled blood spins in the arterial and tissue compartments, we performed the experiments with two TDs (0 and 500 ms) and two gaps (300% and 450% of slice thickness) to evaluate their roles in quantifying aCBV. Comparison studies using our technique and an existing method termed arterial volume using arterial spin tagging (AVAST) were also separately performed in five subjects. At 300% gap or 500-ms TD, significant tissue perfusion signals were demonstrated, while tissue perfusion signals were minimized and arterial signals were maximized at 450% gap and 0-ms TD. ALADDIN has an advantage of visualizing bi-directional flow effects (ascending/descending) in a single experiment. Labeling efficiency (α) of inter-slice blood flow effects could be measured in the superior sagittal sinus (SSS) (20.8±3.7%) and was used for aCBV quantification. As a result of fitting to the proposed model, aCBV values in gray matter (1.4-2.3 mL/100 mL) were in good agreement with those from literature. Our technique showed high correlation with AVAST, especially when arterial signals were accentuated (i.e., when TD = 0 ms) (r = 0

  6. Quantifying selective pressures driving bacterial evolution using lineage analysis

    PubMed Central

    Lambert, Guillaume; Kussell, Edo

    2015-01-01

    Organisms use a variety of strategies to adapt to their environments and maximize long-term growth potential, but quantitative characterization of the benefits conferred by the use of such strategies, as well as their impact on the whole population's rate of growth, remains challenging. Here, we use a path-integral framework that describes how selection acts on lineages, i.e., the life histories of individuals and their ancestors, to demonstrate that lineage-based measurements can be used to quantify the selective pressures acting on a population. We apply this analysis to E. coli bacteria exposed to cyclical treatments of carbenicillin, an antibiotic that interferes with cell-wall synthesis and affects cells in an age-dependent manner. While the extensive characterization of the life history of thousands of cells is necessary to accurately extract the age-dependent selective pressures caused by carbenicillin, the same measurement can be recapitulated using lineage-based statistics of a single surviving cell. Population-wide evolutionary pressures can be extracted from the properties of the surviving lineages within a population, providing an alternative and efficient procedure to quantify the evolutionary forces acting on a population. Importantly, this approach is not limited to age-dependent selection, and the framework can be generalized to detect signatures of other trait-specific selection using lineage-based measurements. Our results establish a powerful way to study the evolutionary dynamics of life under selection, and may be broadly useful in elucidating selective pressures driving the emergence of antibiotic resistance and the evolution of survival strategies in biological systems. PMID:26213639

  7. Quantifying Selective Pressures Driving Bacterial Evolution Using Lineage Analysis

    NASA Astrophysics Data System (ADS)

    Lambert, Guillaume; Kussell, Edo

    2015-01-01

    Organisms use a variety of strategies to adapt to their environments and maximize long-term growth potential, but quantitative characterization of the benefits conferred by the use of such strategies, as well as their impact on the whole population's rate of growth, remains challenging. Here, we use a path-integral framework that describes how selection acts on lineages—i.e., the life histories of individuals and their ancestors—to demonstrate that lineage-based measurements can be used to quantify the selective pressures acting on a population. We apply this analysis to Escherichia coli bacteria exposed to cyclical treatments of carbenicillin, an antibiotic that interferes with cell-wall synthesis and affects cells in an age-dependent manner. While the extensive characterization of the life history of thousands of cells is necessary to accurately extract the age-dependent selective pressures caused by carbenicillin, the same measurement can be recapitulated using lineage-based statistics of a single surviving cell. Population-wide evolutionary pressures can be extracted from the properties of the surviving lineages within a population, providing an alternative and efficient procedure to quantify the evolutionary forces acting on a population. Importantly, this approach is not limited to age-dependent selection, and the framework can be generalized to detect signatures of other trait-specific selection using lineage-based measurements. Our results establish a powerful way to study the evolutionary dynamics of life under selection and may be broadly useful in elucidating selective pressures driving the emergence of antibiotic resistance and the evolution of survival strategies in biological systems.

  8. Quantifying post-fire fallen trees using multi-temporal lidar

    NASA Astrophysics Data System (ADS)

    Bohlin, Inka; Olsson, Håkan; Bohlin, Jonas; Granström, Anders

    2017-12-01

    Massive tree-felling due to root damage is a common fire effect on burnt areas in Scandinavia, but has so far not been analyzed in detail. Here we explore whether pre- and post-fire lidar data can be used to estimate the proportion of fallen trees. The study was carried out within a large (14,000 ha) area in central Sweden burnt in August 2014, where we had access to airborne lidar data from both 2011 and 2015. Three data-sets of predictor variables were tested: POST (post-fire lidar metrics), DIF (difference between post- and pre-fire lidar metrics) and the combination of the two (POST_DIF). Fractional logistic regression was used to predict the proportion of fallen trees. Training data consisted of 61 plots, where the number of fallen and standing trees was calculated both in the field and with interpretation of drone images. The accuracy of the best model was tested on 100 randomly selected validation plots with a size of 25 × 25 m. Our results showed that multi-temporal lidar together with field-collected training data can be used for quantifying post-fire tree felling over large areas. Several height, density and intensity metrics correlated with the proportion of fallen trees. The best model combined metrics from both datasets (POST_DIF), resulting in an RMSE of 0.11. Results were slightly poorer in the validation plots, with an RMSE of 0.18 using a pixel size of 12.5 m and an RMSE of 0.15 using a pixel size of 6.25 m. Our model performed least well for stands that had been exposed to high-intensity crown fire, likely due to the small number of echoes from the standing black tree skeletons. Wall-to-wall maps produced with this model can be used for landscape-level analysis of fire effects and to explore the relationship between fallen trees and forest structure, soil type, fire intensity or topography.
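
    Fractional logistic regression amounts to a binomial GLM with a logit link fitted to plot-level proportions in [0, 1]; a sketch with invented stand-in predictors for the lidar metrics:

```python
import numpy as np
import statsmodels.api as sm

# Synthetic plot data: predictors stand in for lidar height/density change
# metrics; names and coefficients are invented for illustration only.
rng = np.random.default_rng(42)
n = 61
dz_p95 = rng.normal(-2.0, 1.5, n)        # change in 95th height percentile
density_loss = rng.uniform(0.0, 0.5, n)  # loss of canopy return density
eta = 0.8 * (-dz_p95) + 3.0 * density_loss - 2.0
frac_fallen = 1.0 / (1.0 + np.exp(-(eta + rng.normal(0.0, 0.3, n))))

# The binomial family with a logit link accepts proportion responses.
X = sm.add_constant(np.column_stack([dz_p95, density_loss]))
res = sm.GLM(frac_fallen, X, family=sm.families.Binomial()).fit()
pred = res.predict(X)
print("RMSE:", float(np.sqrt(np.mean((pred - frac_fallen) ** 2))))
```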

  9. Quantifying Behavior Driven Energy Savings for Hotels

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dong, Bing; Wang, Na; Hooks, Edward

    2016-08-12

    Hotel facilities present abundant opportunities for energy savings. In the United States, there are around 25,000 hotels that spend an average of $2,196 on energy costs per room each year. This amounts to about 6% of the total annual hotel operating cost. However, unlike offices, there are limited studies on establishing appropriate baselines and quantifying hotel energy savings, given the variety of services and amenities, unpredictable customer behaviors, and the around-the-clock operation hours. In this study, we investigate behavior-driven energy savings for three medium-size (around 90,000 sq ft) hotels that offer similar services in different climate zones. We first used the Department of Energy Asset Scoring Tool to establish baseline models. We then conducted energy saving analysis in EnergyPlus based on a behavior model that defines the upper bound and lower bound of customer and hotel staff behavior. Lastly, we presented a probabilistic energy savings outlook for each hotel. The analysis shows behavior-driven energy savings of up to 25%. We believe this is the first study to incorporate behavioral factors into energy analysis for hotels. It also demonstrates a procedure to quickly create tailored baselines and identify improvement opportunities for hotels.

  10. Quantifying App Store Dynamics: Longitudinal Tracking of Mental Health Apps

    PubMed Central

    Nicholas, Jennifer; Christensen, Helen

    2016-01-01

    Background For many mental health conditions, mobile health apps offer the ability to deliver information, support, and intervention outside the clinical setting. However, there are difficulties with the use of a commercial app store to distribute health care resources, including turnover of apps, irrelevance of apps, and discordance with evidence-based practice. Objective The primary aim of this study was to quantify the longevity and rate of turnover of mental health apps within the official Android and iOS app stores. The secondary aim was to quantify the proportion of apps that were clinically relevant and assess whether the longevity of these apps differed from clinically nonrelevant apps. The tertiary aim was to establish the proportion of clinically relevant apps that included claims of clinical effectiveness. We performed additional subgroup analyses using additional data from the app stores, including search result ranking, user ratings, and number of downloads. Methods We searched iTunes (iOS) and the Google Play (Android) app stores each day over a 9-month period for apps related to depression, bipolar disorder, and suicide. We performed additional app-specific searches if an app no longer appeared within the main search results. Results On the Android platform, 50% of the search results changed after 130 days (depression), 195 days (bipolar disorder), and 115 days (suicide). Search results were more stable on the iOS platform, with 50% of the search results remaining at the end of the study period. Approximately 75% of Android and 90% of iOS apps were still available to download at the end of the study. We identified only 35.3% (347/982) of apps as being clinically relevant for depression, of which 9 (2.6%) claimed clinical effectiveness. Only 3 included a full citation to a published study. Conclusions The mental health app environment is volatile, with a clinically relevant app for depression becoming unavailable to download every 2.9 days. This poses

  11. THE SEGUE K GIANT SURVEY. III. QUANTIFYING GALACTIC HALO SUBSTRUCTURE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Janesh, William; Morrison, Heather L.; Ma, Zhibo

    2016-01-10

    We statistically quantify the amount of substructure in the Milky Way stellar halo using a sample of 4568 halo K giant stars at Galactocentric distances ranging over 5–125 kpc. These stars have been selected photometrically and confirmed spectroscopically as K giants from the Sloan Digital Sky Survey's Sloan Extension for Galactic Understanding and Exploration project. Using a position–velocity clustering estimator (the 4distance) and a model of a smooth stellar halo, we quantify the amount of substructure in the halo, divided by distance and metallicity. Overall, we find that the halo as a whole is highly structured. We also confirm earlier work using blue horizontal branch (BHB) stars which showed that there is an increasing amount of substructure with increasing Galactocentric radius, and additionally find that the amount of substructure in the halo increases with increasing metallicity. Comparing to resampled BHB stars, we find that K giants and BHBs have similar amounts of substructure over equivalent ranges of Galactocentric radius. Using a friends-of-friends algorithm to identify members of individual groups, we find that a large fraction (∼33%) of grouped stars are associated with Sgr, and identify stars belonging to other halo star streams: the Orphan Stream, the Cetus Polar Stream, and others, including previously unknown substructures. A large fraction of sample K giants (more than 50%) are not grouped into any substructure. We find also that the Sgr stream strongly dominates groups in the outer halo for all except the most metal-poor stars, and suggest that this is the source of the increase of substructure with Galactocentric radius and metallicity.

  12. Semi-automated analysis of high-resolution aerial images to quantify docks in Upper Midwest glacial lakes

    USGS Publications Warehouse

    Beck, Marcus W.; Vondracek, Bruce C.; Hatch, Lorin K.; Vinje, Jason

    2013-01-01

    Lake resources can be negatively affected by environmental stressors originating from multiple sources and different spatial scales. Shoreline development, in particular, can negatively affect lake resources through decline in habitat quality, physical disturbance, and impacts on fisheries. The development of remote sensing techniques that efficiently characterize shoreline development in a regional context could greatly improve management approaches for protecting and restoring lake resources. The goal of this study was to develop an approach using high-resolution aerial photographs to quantify and assess docks as indicators of shoreline development. First, we describe a dock analysis workflow that can be used to quantify the spatial extent of docks using aerial images. Our approach incorporates pixel-based classifiers with object-based techniques to effectively analyze high-resolution digital imagery. Second, we apply the analysis workflow to quantify docks for 4261 lakes managed by the Minnesota Department of Natural Resources. Overall accuracy of the analysis results was 98.4% (87.7% based on ) after manual post-processing. The analysis workflow was also 74% more efficient than manual digitization of docks. These analyses have immediate relevance for resource planning in Minnesota, whereas the dock analysis workflow could be used to quantify shoreline development in other regions with comparable imagery. These data can also be used to better understand the effects of shoreline development on aquatic resources and to evaluate the effects of shoreline development relative to other stressors.

  13. Quantifying soil respiration at landscape scales. Chapter 11

    Treesearch

    John B. Bradford; Michael G. Ryan

    2008-01-01

    Soil CO2, efflux, or soil respiration, represents a substantial component of carbon cycling in terrestrial ecosystems. Consequently, quantifying soil respiration over large areas and long time periods is an increasingly important goal. However, soil respiration rates vary dramatically in space and time in response to both environmental conditions...

  14. Quantifying Parkinson's disease finger-tapping severity by extracting and synthesizing finger motion properties.

    PubMed

    Sano, Yuko; Kandori, Akihiko; Shima, Keisuke; Yamaguchi, Yuki; Tsuji, Toshio; Noda, Masafumi; Higashikawa, Fumiko; Yokoe, Masaru; Sakoda, Saburo

    2016-06-01

    We propose a novel index of Parkinson's disease (PD) finger-tapping severity, called "PDFTsi," for quantifying the severity of symptoms related to the finger tapping of PD patients with high accuracy. To validate the efficacy of PDFTsi, the finger-tapping movements of normal controls and PD patients were measured using magnetic sensors, and 21 characteristics were extracted from the finger-tapping waveforms. To distinguish motor deterioration due to PD from that due to aging, the aging effect on finger tapping was removed from these characteristics. Principal component analysis (PCA) was applied to the age-normalized characteristics, and principal components that represented the motion properties of finger tapping were calculated. Multiple linear regression (MLR) with stepwise variable selection was applied to the principal components, and PDFTsi was calculated. The calculated PDFTsi showed high estimation accuracy, with a mean square error of 0.45. This accuracy exceeds that of the alternative method, MLR with stepwise variable selection but without PCA, which yielded a mean square error of 1.30. This result suggests that PDFTsi can quantify PD finger-tapping severity accurately. Furthermore, interpretation of the model for calculating PDFTsi indicated that motion wideness and rhythm disorder are important for estimating PD finger-tapping severity.
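
    The pipeline shape (age-normalized characteristics, then PCA, then multiple linear regression onto a severity score) can be sketched as follows; the feature values, component count, and target are synthetic, and the published method additionally uses stepwise variable selection:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(7)
n_subjects, n_features = 60, 21
X = rng.normal(size=(n_subjects, n_features))   # age-normalized tapping characteristics
# invented "severity" depending on a couple of underlying motion properties
severity = X[:, 0] - 0.5 * X[:, 3] + 0.3 * rng.normal(size=n_subjects)

pcs = PCA(n_components=5).fit_transform(X)      # motion-property components
reg = LinearRegression().fit(pcs, severity)     # MLR on the components
pred = reg.predict(pcs)
print("mean square error:", float(np.mean((pred - severity) ** 2)))
```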

  15. Dissociation of quantifiers and object nouns in speech in focal neurodegenerative disease.

    PubMed

    Ash, Sharon; Ternes, Kylie; Bisbing, Teagan; Min, Nam Eun; Moran, Eileen; York, Collin; McMillan, Corey T; Irwin, David J; Grossman, Murray

    2016-08-01

    Quantifiers such as many and some are thought to depend in part on the conceptual representation of number knowledge, while object nouns such as cookie and boy appear to depend in part on visual feature knowledge associated with object concepts. Further, number knowledge is associated with a frontal-parietal network while object knowledge is related in part to anterior and ventral portions of the temporal lobe. We examined the cognitive and anatomic basis for the spontaneous speech production of quantifiers and object nouns in non-aphasic patients with focal neurodegenerative disease associated with corticobasal syndrome (CBS, n=33), behavioral variant frontotemporal degeneration (bvFTD, n=54), and semantic variant primary progressive aphasia (svPPA, n=19). We recorded a semi-structured speech sample elicited from patients and healthy seniors (n=27) during description of the Cookie Theft scene. We observed a dissociation: CBS and bvFTD were significantly impaired in the production of quantifiers but not object nouns, while svPPA were significantly impaired in the production of object nouns but not quantifiers. MRI analysis revealed that quantifier production deficits in CBS and bvFTD were associated with disease in a frontal-parietal network important for number knowledge, while impaired production of object nouns in all patient groups was related to disease in inferior temporal regions important for representations of visual feature knowledge of objects. These findings imply that partially dissociable representations in semantic memory may underlie different segments of the lexicon. Copyright © 2016 Elsevier Ltd. All rights reserved.

  16. Rehabilitation Risk Management: Enabling Data Analytics with Quantified Self and Smart Home Data.

    PubMed

    Hamper, Andreas; Eigner, Isabella; Wickramasinghe, Nilmini; Bodendorf, Freimut

    2017-01-01

    A variety of acute and chronic diseases require rehabilitation at home after treatment. Outpatient rehabilitation is crucial for the quality of the medical outcome but is mainly performed without medical supervision. Non-compliance can lead to severe health risks and readmission to the hospital. While the patient is closely monitored in the hospital, methods and technologies to identify risks at home have to be developed. We analyze state-of-the-art monitoring systems and technologies and show possibilities to transfer these technologies into rehabilitation monitoring. For this purpose, we analyze sensor technology from the fields of Quantified Self and Smart Homes. The available sensor data from this consumer-grade technology are summarized to give an overview of the possibilities for medical data analytics. Subsequently, we show a conceptual roadmap to transfer data analytics methods to sensor-based rehabilitation risk management.

  17. Quantifying Forest Ecosystem Services Tradeoff—Coupled Ecological and Economic Models

    NASA Astrophysics Data System (ADS)

    Haff, P. K.; Ling, P. Y.

    2015-12-01

    Quantification of the effect of carbon-related forestland management activities on ecosystem services is difficult, because knowledge about the dynamics of coupled social-ecological systems is lacking. Different forestland management activities, such as varying amounts, timing, and methods of harvesting, and natural disturbance events, such as wind and fires, create shocks and uncertainties in forest carbon dynamics. A spatially explicit model, Landis-ii, was used to model forest succession for different harvest management scenarios in the Grandfather District, North Carolina. In addition to harvest, the model takes into account the impact of natural disturbances, such as fire and insects, and species competition. The result shows the storage of carbon in standing biomass and in wood product for each species in each scenario. In this study, optimization is used to analyze the maximum profit and the number of tree species that each forest landowner can gain at different prices of carbon, roundwood, and interest rates for different harvest management scenarios. Time series of roundwood production of different types were estimated using remote sensing data. Econometric analysis is done to understand the possible interactions and relations between the production of different types of roundwood and roundwood prices, which can indicate the possible planting scheme that a forest owner may adopt. This study quantifies the tradeoffs between carbon sequestration, roundwood production, and forest species diversity not only from an economic perspective, but also by taking into account the forest succession mechanism in a species-diverse region. The resulting economic impact on the forest landowners is likely to influence their future planting decisions, which, in turn, will influence the species composition and future revenue of the landowners.

  18. A novel real time imaging platform to quantify macrophage phagocytosis.

    PubMed

    Kapellos, Theodore S; Taylor, Lewis; Lee, Heyne; Cowley, Sally A; James, William S; Iqbal, Asif J; Greaves, David R

    2016-09-15

    Phagocytosis of pathogens, apoptotic cells and debris is a key feature of macrophage function in host defense and tissue homeostasis. Quantification of macrophage phagocytosis in vitro has traditionally been technically challenging. Here we report the optimization and validation of the IncuCyte ZOOM® real time imaging platform for macrophage phagocytosis based on pHrodo® pathogen bioparticles, which only fluoresce when localized in the acidic environment of the phagolysosome. Image analysis and fluorescence quantification were performed with the automated IncuCyte™ Basic Software. Titration of the bioparticle number showed that the system is more sensitive than a spectrofluorometer, as it can detect phagocytosis using 20× fewer E. coli bioparticles. We exemplified the power of this real time imaging platform by studying phagocytosis of murine alveolar, bone marrow and peritoneal macrophages. We further demonstrate the ability of this platform to study modulation of the phagocytic process, as pharmacological inhibitors of phagocytosis suppressed bioparticle uptake in a concentration-dependent manner, whereas opsonins augmented phagocytosis. We also investigated the effects of macrophage polarization on E. coli phagocytosis. Bone marrow-derived macrophage (BMDM) priming with M2 stimuli, such as IL-4 and IL-10, resulted in higher engulfment of bioparticles in comparison with M1 polarization. Moreover, we demonstrated that tolerization of BMDMs with lipopolysaccharide (LPS) results in impaired E. coli bioparticle phagocytosis. This novel real time assay will enable researchers to quantify macrophage phagocytosis with a higher degree of accuracy and sensitivity and will allow investigation of limited populations of primary phagocytes in vitro. Copyright © 2016 The Author(s). Published by Elsevier Inc. All rights reserved.

  19. Allergic Asthmatics Show Divergent Lipid Mediator Profiles from Healthy Controls Both at Baseline and following Birch Pollen Provocation

    PubMed Central

    Lundström, Susanna L.; Yang, Jun; Källberg, Henrik J.; Thunberg, Sarah; Gafvelin, Guro; Haeggström, Jesper Z.; Grönneberg, Reidar; Grunewald, Johan; van Hage, Marianne; Hammock, Bruce D.; Eklund, Anders; Wheelock, Åsa M.; Wheelock, Craig E.

    2012-01-01

    Background Asthma is a respiratory tract disorder characterized by airway hyper-reactivity and chronic inflammation. Allergic asthma is associated with the production of allergen-specific IgE and expansion of allergen-specific T-cell populations. Progression of allergic inflammation is driven by T-helper type 2 (Th2) mediators and is associated with alterations in the levels of lipid mediators. Objectives Responses of the respiratory system to birch allergen provocation in allergic asthmatics were investigated. Eicosanoids and other oxylipins were quantified in the bronchoalveolar lumen to provide a measure of shifts in lipid mediators associated with allergen challenge in allergic asthmatics. Methods Eighty-seven lipid mediators representing the cyclooxygenase (COX), lipoxygenase (LOX) and cytochrome P450 (CYP) metabolic pathways were screened via LC-MS/MS following off-line extraction of bronchoalveolar lavage fluid (BALF). Multivariate statistics using OPLS were employed to interrogate acquired oxylipin data in combination with immunological markers. Results Thirty-two oxylipins were quantified, with baseline asthmatics possessing a different oxylipin profile relative to healthy individuals that became more distinct following allergen provocation. The most prominent differences included 15-LOX-derived ω-3 and ω-6 oxylipins. Shared-and-Unique-Structures (SUS)-plot modeling showed a correlation (R2 = 0.7) between OPLS models for baseline asthmatics (R2Y[cum] = 0.87, Q2[cum] = 0.51) and allergen-provoked asthmatics (R2Y[cum] = 0.95, Q2[cum] = 0.73), with the majority of quantified lipid mediators and cytokines contributing equally to both groups. Unique structures for allergen provocation included leukotrienes (LTB4 and 6-trans-LTB4), CYP-derivatives of linoleic acid (epoxides/diols), and IL-10. Conclusions Differences in asthmatic relative to healthy profiles suggest a role for 15-LOX products of both ω-6 and ω-3 origin in allergic

  20. User guide : process for quantifying the benefits of research.

    DOT National Transportation Integrated Search

    2017-07-01

    The Minnesota Department of Transportation Research Services has adopted a process for quantifying the monetary benefits of research projects, such as the dollar value of particular ideas when implemented across the state's transportation system. T...

  1. The Consonant-Weighted Envelope Difference Index (cEDI): A Proposed Technique for Quantifying Envelope Distortion

    ERIC Educational Resources Information Center

    Hoover, Eric C.; Souza, Pamela E.; Gallun, Frederick J.

    2012-01-01

    Purpose: The benefits of amplitude compression in hearing aids may be limited by distortion resulting from rapid gain adjustment. To evaluate this, it is convenient to quantify distortion by using a metric that is sensitive to the changes in the processed signal that decrease consonant recognition, such as the Envelope Difference Index (EDI;…

  2. Quantifying Discrete Fracture Network Connectivity in Hydraulic Fracturing Stimulation

    NASA Astrophysics Data System (ADS)

    Urbancic, T.; Ardakani, E. P.; Baig, A.

    2017-12-01

    Hydraulic fracture stimulations generally result in microseismicity that is associated with the activation or extension of pre-existing microfractures and discontinuities. Microseismic events acquired under 3D downhole sensor coverage provide accurate event locations outlining hydraulic fracture growth. Combined with source characteristics, these events provide high-quality input for seismic moment tensor inversion and, eventually, for constructing a representative discrete fracture network (DFN). In this study, we investigate the strain and stress state, identified fracture orientations, and DFN connectivity and performance for example stages in a multistage plug-and-perf completion in a North American shale play. We use topology, a familiar concept in many areas of structural geology, to further describe the relationships between the activated fractures and their effectiveness in enhancing permeability. We explore how local perturbations of the stress state lead to the activation of different fracture sets and how that affects DFN interaction and complexity. In particular, we observe that a more heterogeneous stress state shows a higher percentage of sub-horizontal fractures or bedding-plane slips. Based on topology, the fractures are evenly distributed from the injection point, with the number of connections decreasing with distance. The dimensionless measures of connections per branch and connections per line are used to quantify DFN connectivity. To connect connectivity back to productive volume and stimulation efficiency, it is compared with the character of deformation in the reservoir as deduced from the collective behavior of microseismicity using robustly determined source parameters.
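
    The connectivity measures mentioned above can be computed from simple node counts. Below is a minimal sketch of the I/Y/X node-counting topology scheme, following the formulation of Sanderson and Nixon (2015); the node counts are hypothetical placeholders, not values from this study.

    ```python
    # Minimal sketch: topology-based DFN connectivity from I/Y/X node counts,
    # following the node-counting scheme of Sanderson & Nixon (2015).
    # The counts below are hypothetical placeholders, not study values.

    def connectivity_metrics(n_i, n_y, n_x):
        """Connections per line (C_L) and per branch (C_B) from node counts."""
        n_lines = (n_i + n_y) / 2.0                 # lines have two end nodes (I or Y)
        n_branches = (n_i + 3.0 * n_y + 4.0 * n_x) / 2.0
        c_l = 2.0 * (n_y + n_x) / n_lines           # connections per line
        c_b = (3.0 * n_y + 4.0 * n_x) / n_branches  # connections per branch
        return c_l, c_b

    c_l, c_b = connectivity_metrics(n_i=120, n_y=60, n_x=25)
    print(f"C_L = {c_l:.2f}, C_B = {c_b:.2f}")
    ```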

  3. Quantifying Discipline Practices Using Absolute vs. Relative Frequencies: Clinical and Research Implications for Child Welfare

    PubMed Central

    Lindhiem, Oliver; Shaffer, Anne; Kolko, David J.

    2014-01-01

    In the parent intervention outcome literature, discipline practices are generally quantified as absolute frequencies or, less commonly, as relative frequencies. These differences in methodology warrant direct comparison, as they have critical implications for study results and conclusions among treatments targeted at reducing parental aggression and harsh discipline. In this study, we directly compared the absolute frequency method and the relative frequency method for quantifying physically aggressive, psychologically aggressive, and nonaggressive discipline practices. Longitudinal data over a 3-year period came from an existing data set of a clinical trial examining the effectiveness of a psychosocial treatment in reducing parental physical and psychological aggression and improving child behavior (N = 139; Kolko et al., 2009). Discipline practices (both aggressive and nonaggressive) were assessed using the Conflict Tactics Scale (CTS; Straus et al., 1998). The two methods yielded different patterns of results, particularly for nonaggressive discipline strategies. We suggest that each method makes its own unique contribution to a more complete understanding of the association between parental aggression and intervention effects. PMID:24106146
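
    As a toy illustration of the two quantification methods compared above, the sketch below contrasts an absolute count with a relative proportion for invented CTS-style tallies; it is not the study's data or code.

    ```python
    # Toy contrast of absolute vs relative frequency quantification of
    # discipline practices. Counts are invented CTS-style tallies.

    events = {"aggressive": 4, "nonaggressive": 36}

    absolute = events["aggressive"]                         # raw count per period
    relative = events["aggressive"] / sum(events.values())  # share of all discipline

    print(f"absolute: {absolute} aggressive events")
    print(f"relative: {relative:.0%} of all discipline events")
    # A parent who disciplines often but rarely aggressively can look worse on
    # the absolute metric yet better on the relative one, which is one way the
    # two methods can yield different patterns of results.
    ```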

  4. Quantifying six decades of fishery selection for size and age at maturity in sockeye salmon

    PubMed Central

    Kendall, Neala W; Hard, Jeffrey J; Quinn, Thomas P

    2009-01-01

    Life history traits of wild animals can be strongly influenced, both phenotypically and evolutionarily, by hunting and fishing. However, few studies have quantified fishery selection over long time periods. We used 57 years of catch and escapement data to document the magnitude of and trends in gillnet selection on age and size at maturity of a commercially and biologically important sockeye salmon stock. Overall, the fishery has caught larger fish than have escaped to spawn, but selection has varied over time, becoming weaker and less consistent recently. Selection patterns were strongly affected by fish age and sex, in addition to extrinsic factors including fish abundance, mesh size regulations, and fish length variability. These results revealed a more complex and changing pattern of selective harvest than the ‘larger is more vulnerable’ model, emphasizing the need for quantified, multi-year studies before conclusions can be drawn about potential evolutionary and ecological effects of fishery selection. Furthermore, the results indicate that biologically robust escapement goals and prevention of harvest of the largest individuals may help prevent negative effects of size-selective harvest. PMID:25567896

  5. Towards quantifying the arc-scale and global magmatic response to deglaciation

    NASA Astrophysics Data System (ADS)

    Watt, S. F.; Pyle, D. M.; Mather, T. A.

    2012-12-01

    There is a growing body of evidence that the retreat of ice sheets after the last glacial maximum resulted in temporarily enhanced levels of volcanism. This has been postulated on the scale of individual edifices, and on regional scales in intraplate and rift settings. It has been proposed that this pattern was of global significance in contributing to rising atmospheric CO2 concentrations, and thereby formed a feedback process for global warming. However, the impact of deglaciation on volcanic arcs has been incompletely explored. Volcanic arcs account for 90% of present-day subaerial volcanic eruptions, and for volcanically-sourced volatiles they are therefore of first-order significance. Without a proper understanding of fluctuations in arc volcanic output, an assessment of global changes in volcanic activity cannot be made. Here, we present the first systematic assessment of the response of glaciated volcanic arcs to deglaciation. By using comprehensive compilations of eruption records from southern Chile, augmented by records from the Cascade and Kamchatka arcs, we show that the post-glacial increase in volcanism was relatively small in comparison to non-arc volcano-tectonic settings. Where ice unloading was at its greatest, eruption frequency approximately doubled for ~5 kyr, but this pattern is at the limit of statistical significance. The same period coincides with a few notably large explosive eruptions. In less heavily glaciated regions, no pattern can be deduced at the resolution of available data. While eruption patterns are commonly episodic, the timing of increases in activity does not always show a clear link to deglaciation. In light of the above, we critically examine available eruption records in an effort to constrain global-scale changes in volcanic output. We show that great caution must be exercised when attempting to quantify variation in volcanism from such data. Due to extremely sparse sampling (i.e. highly incomplete records), both temporal

  6. Correlation between plasma endothelin-1 levels and severity of septic liver failure quantified by maximal liver function capacity (LiMAx test). A prospective study.

    PubMed

    Kaffarnik, Magnus F; Ahmadi, Navid; Lock, Johan F; Wuensch, Tilo; Pratschke, Johann; Stockmann, Martin; Malinowski, Maciej

    2017-01-01

    To investigate the relationship between the degree of liver dysfunction, quantified by maximal liver function capacity (LiMAx test), and endothelin-1, TNF-α and IL-6 in septic surgical patients. 28 septic patients (8 female, 20 male, age range 35-80 y) were prospectively investigated in a surgical intensive care unit. Liver function, defined by the LiMAx test, and plasma levels of endothelin-1, TNF-α and IL-6 were measured within the first 24 hours after onset of septic symptoms and again on days 2, 5 and 10. Patients were divided into 2 groups (group A: LiMAx ≥100 μg/kg/h, moderate liver dysfunction; group B: LiMAx <100 μg/kg/h, severe liver dysfunction) and investigated regarding the correlation between endothelin-1 and the severity of liver failure, quantified by the LiMAx test. Group B showed significantly higher endothelin-1 levels than group A (P = 0.01 on day 5; P = 0.02 on day 10). For TNF-α, group B revealed higher levels than group A, with a significant difference on day 10 (P = 0.005). IL-6 showed a non-significant trend toward higher levels in group B. Spearman's rank correlation revealed a significant correlation between LiMAx and endothelin-1 (-0.434; P <0.001), TNF-α (-0.515; P <0.001) and IL-6 (-0.590; P <0.001). Sepsis-related hepatic dysfunction is associated with elevated plasma levels of endothelin-1, TNF-α and IL-6. Low LiMAx results combined with increased endothelin-1 and TNF-α, and the inverse correlation between LiMAx and cytokine values, support a crucial role for endothelin-1 and TNF-α in the development of septic liver failure.

  7. Quantifying light exposure patterns in young adult students

    NASA Astrophysics Data System (ADS)

    Alvarez, Amanda A.; Wildsoet, Christine F.

    2013-08-01

    Exposure to bright light appears to be protective against myopia in both animals (chicks, monkeys) and children, but quantitative data on human light exposure are limited. In this study, we report on a technique for quantifying light exposure using wearable sensors. Twenty-seven young adult subjects wore a light sensor continuously for two weeks during one of three seasons, and also completed questionnaires about their visual activities. Light data were analyzed with respect to refractive error and season, and the objective sensor data were compared with subjects' estimates of time spent indoors and outdoors. Subjects' estimates of time spent indoors and outdoors were in poor agreement with durations reported by the sensor data. The results of questionnaire-based studies of light exposure should thus be interpreted with caution. The role of light in refractive error development should be investigated using multiple methods such as sensors to complement questionnaires.

  8. Quantifying light exposure patterns in young adult students

    PubMed Central

    Alvarez, Amanda A.; Wildsoet, Christine F.

    2014-01-01

    Exposure to bright light appears to be protective against myopia in both animals (chicks, monkeys) and children, but quantitative data on human light exposure are limited. In this study, we report on a technique for quantifying light exposure using wearable sensors. Twenty-seven young adult subjects wore a light sensor continuously for two weeks during one of three seasons, and also completed questionnaires about their visual activities. Light data were analyzed with respect to refractive error and season, and the objective sensor data were compared with subjects’ estimates of time spent indoors and outdoors. Subjects’ estimates of time spent indoors and outdoors were in poor agreement with durations reported by the sensor data. The results of questionnaire-based studies of light exposure should thus be interpreted with caution. The role of light in refractive error development should be investigated using multiple methods such as sensors to complement questionnaires. PMID:25342873

  9. Constraining estimates of global soil respiration by quantifying sources of variability.

    PubMed

    Jian, Jinshi; Steele, Meredith K; Thomas, R Quinn; Day, Susan D; Hodges, Steven C

    2018-05-10

    Quantifying global soil respiration (RSG) and its response to temperature change are critical for predicting the turnover of terrestrial carbon stocks and their feedbacks to climate change. Currently, estimates of RSG range from 68 to 98 Pg C year⁻¹, causing considerable uncertainty in the global carbon budget. We argue the source of this variability lies in the upscaling assumptions regarding the model format, data timescales, and precipitation component. To quantify the variability and constrain RSG, we developed RSG models using Random Forest and exponential models, and used different timescales (daily, monthly, and annual) of soil respiration (RS) and climate data to predict RSG. From the resulting RSG estimates (range = 66.62-100.72 Pg), we calculated the variability associated with each assumption. Among model formats, using monthly RS data rather than annual data decreased RSG by 7.43-9.46 Pg; however, RSG calculated from daily RS data was only 1.83 Pg lower than the RSG from monthly data. Using mean annual precipitation and temperature data instead of monthly data caused +4.84 and -4.36 Pg C differences, respectively. If the timescale of RS data is constant, RSG estimated by the first-order exponential (93.2 Pg) was greater than the Random Forest (78.76 Pg) or second-order exponential (76.18 Pg) estimates. These results highlight the importance of variation at subannual timescales for upscaling to RSG. The results indicated RSG is lower than in recent papers and the current benchmark for land models (98 Pg C year⁻¹), and thus may change the predicted rates of terrestrial carbon turnover and the carbon-climate feedback as global temperatures rise. © 2018 John Wiley & Sons Ltd.
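
    The following is a minimal sketch, on synthetic data, of the two model formats the study compares: a first-order exponential fit of soil respiration to temperature versus a Random Forest using temperature and precipitation. The functional form and all values of the synthetic data are assumptions for illustration only.

    ```python
    # Two model formats for upscaling soil respiration (Rs): a first-order
    # exponential in temperature vs a Random Forest in temperature and
    # precipitation, fitted to synthetic monthly data.
    import numpy as np
    from scipy.optimize import curve_fit
    from sklearn.ensemble import RandomForestRegressor

    rng = np.random.default_rng(0)
    temp = rng.uniform(-5.0, 30.0, 500)     # monthly mean temperature (deg C)
    precip = rng.uniform(10.0, 200.0, 500)  # monthly precipitation (mm)
    rs = (1.2 * np.exp(0.07 * temp) * precip / (precip + 50.0)
          + rng.normal(0.0, 0.2, 500))      # synthetic Rs (arbitrary units)

    def first_order_exp(t, a, b):
        return a * np.exp(b * t)

    (a, b), _ = curve_fit(first_order_exp, temp, rs, p0=(1.0, 0.05))
    rf = RandomForestRegressor(n_estimators=200, random_state=0)
    rf.fit(np.column_stack([temp, precip]), rs)

    # The upscaling step would predict monthly Rs over a global grid and sum;
    # here we just compare pointwise predictions of the two formats.
    print("exponential at 15 C:", first_order_exp(15.0, a, b))
    print("random forest at 15 C, 100 mm:", rf.predict([[15.0, 100.0]])[0])
    ```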

  10. Quantifying Systemic Risk by Solutions of the Mean-Variance Risk Model.

    PubMed

    Jurczyk, Jan; Eckrot, Alexander; Morgenstern, Ingo

    2016-01-01

    The world is still recovering from the financial crisis that peaked in September 2008. The triggering event was the bankruptcy of Lehman Brothers. To detect such turmoil, one can investigate the time-dependent behaviour of correlations between assets or indices. These cross-correlations have been connected to the systemic risks within markets by several studies in the aftermath of this crisis. We study 37 different US indices, which cover almost all aspects of the US economy, and show that monitoring an average investor's behaviour can be used to quantify times of increased risk. In this paper the overall investing strategy is approximated by the ground states of the mean-variance model along the efficient frontier, subject to real-world constraints. Changes in the behaviour of the average investor are utilized as an early warning sign.
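
    As a hedged sketch of the core computation, the snippet below finds a long-only minimum-variance portfolio (one 'ground state' of the mean-variance model under a real-world no-short constraint) for synthetic returns; it is not the paper's implementation and uses invented data.

    ```python
    # Long-only minimum-variance portfolio ('ground state' of the mean-variance
    # model under a no-short constraint) on synthetic daily returns.
    import numpy as np
    from scipy.optimize import minimize

    rng = np.random.default_rng(1)
    returns = rng.normal(0.0, 0.01, size=(250, 5))   # 5 synthetic indices
    cov = np.cov(returns, rowvar=False)

    n = cov.shape[0]
    res = minimize(
        lambda w: w @ cov @ w,                       # portfolio variance
        x0=np.full(n, 1.0 / n),
        bounds=[(0.0, 1.0)] * n,                     # no short selling
        constraints=[{"type": "eq", "fun": lambda w: w.sum() - 1.0}],
    )
    print("minimum-variance weights:", np.round(res.x, 3))
    # Tracking how such weights shift over a rolling window is one way to proxy
    # the 'average investor' behaviour monitored for early warning signs.
    ```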

  11. Quantifying the Influence of Urbanization on a Coastal Floodplain

    NASA Astrophysics Data System (ADS)

    Sebastian, A.; Juan, A.; Bedient, P. B.

    2016-12-01

    The U.S. Gulf Coast is the fastest growing region in the United States; between 1960 and 2010, the number of housing units along the Gulf of Mexico increased by 246%, vastly outpacing growth in other parts of the country (NOAA 2013). Numerous studies have shown that increases in impervious surface associated with urbanization reduce infiltration and increase surface runoff. While empirical evidence suggests that changes in land use are leading to increased flood damage in overland areas, earlier studies have largely focused on the impacts of urbanization on surface runoff and watershed hydrology rather than quantifying its influence on the spatial extent of flooding. In this study, we conduct a longitudinal assessment of the evolution of flood risk since 1970 in an urbanizing coastal watershed. Utilizing the distributed hydrologic model Vflo® in combination with the hydraulic model HEC-RAS, we quantify the impact of localized land use/land cover (LULC) change on the spatial extent of flooding in the watershed and on the underlying flood hazard structure. The results demonstrate that increases in impervious cover between 1970 and 2010 (34%) and between 2010 and 2040 (18%) increase the size of the floodplain by 26% and 17%, respectively. Furthermore, the results indicate that the depth and frequency of flooding in neighborhoods within the 1% floodplain have increased substantially. Finally, this analysis provides evidence that outdated FEMA floodplain maps could be underestimating the extent of the floodplain by upwards of 25%, depending on the rate of urbanization in the watershed, and that by incorporating physics-based distributed hydrologic models into floodplain studies, floodplain maps can be readily updated to reflect the most recent LULC information available. The methods presented in this study have important implications for the development of mitigation strategies in coastal areas, such as deterring future development in flood-prone areas.

  12. Gradient approach to quantify the gradation smoothness for output media

    NASA Astrophysics Data System (ADS)

    Kim, Youn Jin; Bang, Yousun; Choh, Heui-Keun

    2010-01-01

    We aim to quantify the perception of color gradation smoothness using objectively measurable properties. We propose a model to compute the smoothness of hardcopy color-to-color gradations. It is a gradient-based method in which smoothness is computed as a function of the 95th percentile of the second derivative (the tone-jump estimator) and the fifth percentile of the first derivative (the tone-clipping estimator). The performance of the model and of a previously suggested method was evaluated psychophysically, and their prediction accuracies were compared. Our model showed the stronger Pearson correlation to the corresponding visual data, with a magnitude of up to 0.87. Its statistical significance was verified through analysis of variance. Color variations of representative memory colors (blue sky, green grass and Caucasian skin) were rendered as gradational scales and used as the test stimuli.
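
    A minimal sketch of the two estimators named above, applied to synthetic 1-D lightness ramps; the quantization step, clipping level, and data are illustrative assumptions, not the paper's stimuli.

    ```python
    # The two estimators applied to synthetic 1-D lightness ramps: a smooth
    # gradation, a quantized ("banded") one, and a clipped one.
    import numpy as np

    x = np.linspace(0.0, 100.0, 256)     # ideal smooth lightness ramp
    banded = np.round(x / 4.0) * 4.0     # quantized ramp -> tone jumps
    clipped = np.minimum(x, 90.0)        # saturated ramp -> tone clipping

    def tone_jump(y):
        """95th percentile of |second derivative|: large -> visible banding."""
        return np.percentile(np.abs(np.gradient(np.gradient(y))), 95)

    def tone_clip(y):
        """5th percentile of first derivative: near zero -> clipped tones."""
        return np.percentile(np.gradient(y), 5)

    for name, y in [("smooth", x), ("banded", banded), ("clipped", clipped)]:
        print(f"{name:8s} jump={tone_jump(y):.3f} clip={tone_clip(y):.3f}")
    ```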

  13. Quantifying Energetic Electron Precipitation And Its Effect on Atmospheric Chemistry

    NASA Astrophysics Data System (ADS)

    Huang, C. L.; Spence, H. E.; Smith, S. S.; Duderstadt, K. A.; Boyd, A. J.; Geoffrey, R.; Blake, J. B.; Fennell, J. F.; Claudepierre, S. G.; Turner, D. L.; Crew, A. B.; Klumpar, D. M.; Shumko, M.; Johnson, A.; Sample, J. G.

    2017-12-01

    In this study we quantify the total radiation belt electron loss through precipitation into the atmosphere, and simulate the electrons' contribution to changing the atmospheric composition. We use total radiation belt electron content (TRBEC) calculated from Van Allen Probes ECT/MagEIS data to estimate the precipitation during electron loss events. The new TRBEC index is a high-level quantity for monitoring the entire radiation belt and has the benefit of removing both internal transport and the adiabatic effect. To assess the electron precipitation rate, we select TRBEC loss events that show no outward transport in the phase space density data in order to exclude drift magnetopause loss. Then we use FIREBIRD data to estimate and constrain the precipitation loss when it samples near the loss cone. Finally, we estimate the impact of electron precipitation on the composition of the upper and middle atmosphere using global climate simulations.

  14. Alcohol Content in the 'Hyper-Reality' MTV Show 'Geordie Shore'.

    PubMed

    Lowe, Eden; Britton, John; Cranwell, Jo

    2018-05-01

    To quantify the occurrence of alcohol content, including alcohol branding, in Series 11 of the popular primetime UK reality TV show 'Geordie Shore'. A 1-min interval coding content analysis of alcohol content in the entire DVD release of Series 11 (10 episodes). Occurrences of alcohol use, implied use, other alcohol references/paraphernalia and branding were recorded. All categories of alcohol were present in all episodes. 'Any alcohol' content occurred in 78% of all coding intervals (ACIs), 'actual alcohol use' in 30%, 'inferred alcohol use' in 72%, and other alcohol references in 59%. Brand appearances occurred in 23% of ACIs. The most frequently observed brand was Smirnoff, which accounted for 43% of all brand appearances. Episodes categorized as suitable for viewing by adolescents below the legal drinking age of 18 years accounted for 61% of all brand appearances, and two-thirds of all alcohol branding occurred in episodes age-rated by the British Board of Film Classification (BBFC) as suitable for viewers aged 15 years. Alcohol content, including branding, is thus highly prevalent in this series, and current alcohol regulation is failing to protect young viewers from exposure to such content. The organizations OfCom, the Advertising Standards Authority (ASA) and the Portman Group should implement more effective policies to reduce adolescent exposure to on-screen drinking, and the drinks industry should consider demanding the withdrawal of its brands from the show.

  15. Quantifying differences in land use emission estimates implied by definition discrepancies

    NASA Astrophysics Data System (ADS)

    Stocker, B. D.; Joos, F.

    2015-11-01

    The quantification of CO2 emissions from anthropogenic land use and land use change (eLUC) is essential to understand the drivers of the atmospheric CO2 increase and to inform climate change mitigation policy. Reported values in synthesis reports are commonly derived from different approaches (observation-driven bookkeeping and process modelling), but recent work has emphasized that inconsistencies between methods may imply substantial differences in eLUC estimates. However, a consistent quantification is lacking, and no concise modelling protocol for the separation of primary and secondary components of eLUC has been established. Here, we review differences between eLUC quantification methods and apply an Earth System Model (ESM) of Intermediate Complexity to quantify them. We find that the magnitude of effects due to merely conceptual differences between ESM-based and offline vegetation model-based quantifications is ~20% at present. Under a future business-as-usual scenario, differences tend to increase further due to slowing land conversion rates and an increasing impact of altered environmental conditions on land-atmosphere fluxes. We establish how coupled Earth System Models may be applied to separate the secondary component fluxes of eLUC arising from the replacement of potential C sinks/sources and from the land use feedback, and we show that secondary fluxes derived from offline vegetation models are conceptually and quantitatively identical to neither of these, nor to their sum. Therefore, we argue that synthesis studies should resort to the "least common denominator" of the different methods, following the bookkeeping approach in which only primary land use emissions are quantified under the assumption of constant environmental boundary conditions.

  16. Quantifying thresholds for significant dune erosion along the Sefton Coast, Northwest England

    NASA Astrophysics Data System (ADS)

    Esteves, Luciana S.; Brown, Jennifer M.; Williams, Jon J.; Lymbery, Graham

    2012-03-01

    Field and model hindcast data are used to establish a critical dune erosion threshold for the Sefton Coast (NW England). Events are classified as causing significant erosion if they result in: (a) a mean dune retreat along the entire study area of > 2 m; (b) a dune retreat of ≥ 5 m along a coastal segment ≥ 2 km in length; and (c) an eroded area ≥ 20,000 m². For the period 1996 to 2008, individual storms were characterised using hindcast results from a POLCOMS-WAM model and measured data from the Liverpool Bay Coastal Observatory. Results show that combined extreme surge levels (> 1.5 m) and wave heights (> 4 m), or tidal water levels above 9.0 m Chart Datum (CD), do not always result in significant dune erosion. Evidence suggests that erosion is more likely to occur when wave heights are > 2.6 m, peak water level is > 10.2 m CD at Liverpool, and consecutive tidal cycles provide 10 h or more of water levels above 9.4 m CD. However, lower water levels and wave heights, and shorter periods of sustained water levels, can cause significant erosion in the summer. While the return period for events giving rise to the most severe erosion in the winter is > 50 years, significant erosion in the summer can be caused by events with return periods < 1 year. It is suggested that this may be attributable to a known reduction in the mean dune toe elevation of c. 30 cm. Although the study shows it might be possible to objectively characterise storm events based on oceanographic conditions, the resultant morphological change at the coast is demonstrated to depend on the time and duration of events and on other variables that are not so easy to quantify. Further investigation is needed to understand the influence of alongshore and seasonal variability in beach/dune morphology in determining the response to the hydrodynamic and meteorological conditions causing significant erosion. Improved monitoring pre- and post-storm of changes in beach/dune morphology is required to
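
    The thresholds reported above lend themselves to a simple rule-based event classifier. The sketch below encodes them directly; the function and argument names are hypothetical, and the rule deliberately ignores the summer caveat noted in the abstract.

    ```python
    # Rule-based classifier encoding the reported erosion thresholds: significant
    # dune erosion is more likely when wave height > 2.6 m, peak water level
    # > 10.2 m CD, and >= 10 h of consecutive tidal levels above 9.4 m CD.
    # Function and argument names are hypothetical.

    def likely_significant_erosion(hs_max_m, peak_level_m_cd, hours_above_9p4):
        """Flag storm events likely to cause significant dune erosion.

        Note: the study found lower thresholds can suffice in summer, which
        this simple winter-oriented rule ignores.
        """
        return (hs_max_m > 2.6
                and peak_level_m_cd > 10.2
                and hours_above_9p4 >= 10.0)

    print(likely_significant_erosion(3.1, 10.4, 12.0))  # True
    print(likely_significant_erosion(4.2, 9.8, 6.0))    # False
    ```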

  17. Tobacco-free economy: A SAM-based multiplier model to quantify the impact of changes in tobacco demand in Bangladesh.

    PubMed

    Husain, Muhammad Jami; Khondker, Bazlul Haque

    2016-01-01

    In Bangladesh, where tobacco use is pervasive, reducing tobacco use is economically beneficial. This paper uses the latest Bangladesh social accounting matrix (SAM) multiplier model to quantify the economy-wide impact of demand-driven changes in tobacco cultivation, bidi industries, and cigarette industries. First, we compute various income multiplier values (i.e. backward linkages) for all production activities in the economy to quantify the impact of changes in demand for the corresponding products on gross output for 86 activities, demand for 86 commodities, returns to four factors of production, and income for eight household groups. Next, we rank tobacco production activities by income multiplier values relative to other sectors. Finally, we present three hypothetical 'tobacco-free economy' scenarios by diverting demand from tobacco products into other sectors of the economy and quantifying the economy-wide impact. The simulation exercises with three different tobacco-free scenarios show that, compared to the baseline values, total sectoral output increases by 0.92%, 1.3%, and 0.75%. The corresponding increases in the total factor returns (i.e. GDP) are 1.57%, 1.75%, and 1.75%. Similarly, total household income increases by 1.40%, 1.58%, and 1.55%.
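
    For readers unfamiliar with SAM multiplier analysis, the sketch below shows the standard Leontief-type computation on a toy 3-account matrix; the coefficients are invented and bear no relation to the Bangladesh SAM.

    ```python
    # Toy SAM multiplier analysis: income multipliers are columns of the
    # Leontief-type inverse (I - A)^-1, where A holds average expenditure
    # shares of the endogenous accounts. This 3x3 matrix is invented.
    import numpy as np

    A = np.array([
        [0.20, 0.10, 0.05],   # activities
        [0.30, 0.15, 0.25],   # factors
        [0.10, 0.40, 0.10],   # households
    ])
    M = np.linalg.inv(np.eye(3) - A)   # column j: total income generated per
    print(np.round(M, 3))              # unit of exogenous injection into j

    # A 'tobacco-free' style simulation: shift exogenous demand between
    # accounts and compare the induced totals.
    baseline = M @ np.array([100.0, 0.0, 0.0])
    shifted = M @ np.array([90.0, 0.0, 10.0])
    print("change in account totals:", np.round(shifted - baseline, 2))
    ```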

  18. Tobacco-free economy: A SAM-based multiplier model to quantify the impact of changes in tobacco demand in Bangladesh

    PubMed Central

    Husain, Muhammad Jami; Khondker, Bazlul Haque

    2017-01-01

    In Bangladesh, where tobacco use is pervasive, reducing tobacco use is economically beneficial. This paper uses the latest Bangladesh social accounting matrix (SAM) multiplier model to quantify the economy-wide impact of demand-driven changes in tobacco cultivation, bidi industries, and cigarette industries. First, we compute various income multiplier values (i.e. backward linkages) for all production activities in the economy to quantify the impact of changes in demand for the corresponding products on gross output for 86 activities, demand for 86 commodities, returns to four factors of production, and income for eight household groups. Next, we rank tobacco production activities by income multiplier values relative to other sectors. Finally, we present three hypothetical ‘tobacco-free economy’ scenarios by diverting demand from tobacco products into other sectors of the economy and quantifying the economy-wide impact. The simulation exercises with three different tobacco-free scenarios show that, compared to the baseline values, total sectoral output increases by 0.92%, 1.3%, and 0.75%. The corresponding increases in the total factor returns (i.e. GDP) are 1.57%, 1.75%, and 1.75%. Similarly, total household income increases by 1.40%, 1.58%, and 1.55%. PMID:28845091

  19. Quantifying the Accuracy of Digital Hemispherical Photography for Leaf Area Index Estimates on Broad-Leaved Tree Species.

    PubMed

    Gilardelli, Carlo; Orlando, Francesca; Movedi, Ermes; Confalonieri, Roberto

    2018-03-29

    Digital hemispherical photography (DHP) has been widely used to estimate leaf area index (LAI) in forestry. Despite advances in the processing of hemispherical images with dedicated tools, several steps are still manual and thus easily affected by the user's experience and sensitivity. The purpose of this study was to quantify the impact of user subjectivity on DHP LAI estimates for broad-leaved woody canopies using the software Can-Eye. Following the ISO 5725 protocol, we quantified the repeatability and reproducibility of the method, thus defining its precision for a wide range of broad-leaved canopies markedly differing in structure. To obtain a complete evaluation of the method's accuracy, we also quantified its trueness using artificial canopy images with known canopy cover. Moreover, the effect of the segmentation method was analysed. The best results for precision (restrained limits of repeatability and reproducibility) were obtained for high LAI values (>5), with limits corresponding to a variation of 22% in the estimated LAI values. Poorer results were obtained for medium and low LAI values, with a variation in the estimated LAI values that exceeded 40%. Regardless of the LAI range explored, satisfactory results were achieved for trees in row-structured plantations (limits almost equal to 30% of the estimated LAI). Satisfactory results were achieved for trueness, regardless of the canopy structure. The paired t-test revealed that the effect of the segmentation method on LAI estimates was significant. Despite a non-negligible user effect, the accuracy metrics for DHP are consistent with those determined for other indirect methods for LAI estimation, confirming the overall reliability of DHP in broad-leaved woody canopies.
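
    As a rough sketch of the ISO 5725 precision statistics used here, the snippet below computes repeatability and reproducibility limits (r = 2.8 s_r, R = 2.8 s_R) from hypothetical repeated LAI estimates by several users; the data and layout are assumptions.

    ```python
    # Sketch of ISO 5725-style precision limits from hypothetical repeated LAI
    # estimates: repeatability limit r = 2.8*s_r, reproducibility limit
    # R = 2.8*s_R (2.8 ~ 1.96*sqrt(2)).
    import numpy as np

    # rows: users (operators); columns: repeated estimates of the same canopy
    lai = np.array([
        [4.8, 5.1, 4.9],
        [5.4, 5.3, 5.6],
        [4.6, 4.7, 4.5],
    ])

    n_rep = lai.shape[1]
    var_within = np.mean(np.var(lai, axis=1, ddof=1))       # s_r^2
    var_means = np.var(lai.mean(axis=1), ddof=1)            # variance of user means
    var_between = max(var_means - var_within / n_rep, 0.0)  # s_L^2
    s_r = np.sqrt(var_within)
    s_R = np.sqrt(var_within + var_between)

    print(f"repeatability limit r   = {2.8 * s_r:.2f} LAI units")
    print(f"reproducibility limit R = {2.8 * s_R:.2f} LAI units")
    ```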

  20. Quantifying the Accuracy of Digital Hemispherical Photography for Leaf Area Index Estimates on Broad-Leaved Tree Species

    PubMed Central

    Gilardelli, Carlo; Orlando, Francesca; Movedi, Ermes; Confalonieri, Roberto

    2018-01-01

    Digital hemispherical photography (DHP) has been widely used to estimate leaf area index (LAI) in forestry. Despite advances in the processing of hemispherical images with dedicated tools, several steps are still manual and thus easily affected by the user's experience and sensitivity. The purpose of this study was to quantify the impact of user subjectivity on DHP LAI estimates for broad-leaved woody canopies using the software Can-Eye. Following the ISO 5725 protocol, we quantified the repeatability and reproducibility of the method, thus defining its precision for a wide range of broad-leaved canopies markedly differing in structure. To obtain a complete evaluation of the method's accuracy, we also quantified its trueness using artificial canopy images with known canopy cover. Moreover, the effect of the segmentation method was analysed. The best results for precision (restrained limits of repeatability and reproducibility) were obtained for high LAI values (>5), with limits corresponding to a variation of 22% in the estimated LAI values. Poorer results were obtained for medium and low LAI values, with a variation in the estimated LAI values that exceeded 40%. Regardless of the LAI range explored, satisfactory results were achieved for trees in row-structured plantations (limits almost equal to 30% of the estimated LAI). Satisfactory results were achieved for trueness, regardless of the canopy structure. The paired t-test revealed that the effect of the segmentation method on LAI estimates was significant. Despite a non-negligible user effect, the accuracy metrics for DHP are consistent with those determined for other indirect methods for LAI estimation, confirming the overall reliability of DHP in broad-leaved woody canopies. PMID:29596376

  1. Clinical methods to quantify trunk mobility in an elite male surfing population.

    PubMed

    Furness, James; Climstein, Mike; Sheppard, Jeremy M; Abbott, Allan; Hing, Wayne

    2016-05-01

    Thoracic mobility in the sagittal and horizontal planes is a key requirement in the sport of surfing; however, to date, normal values for these movements have not been quantified in a surfing population. The aims were to develop a reliable method to quantify thoracic mobility in the sagittal plane, to assess the reliability of an existing thoracic rotation method, and to quantify thoracic mobility in an elite male surfing population. Clinical measurement: reliability and comparative study. A total of 30 subjects were used to determine the reliability component; 15 elite surfers were used as part of a comparative analysis with age- and gender-matched controls. Intraclass correlation coefficient values ranged between 0.95-0.99 (95% CI, 0.89-0.99) for both thoracic methods. The elite surfing group had significantly (p ≤ 0.05) greater rotation than the comparative group (mean rotation 63.57° versus 40.80°, respectively). This study has illustrated reliable methods for assessing the thoracic spine in the sagittal plane and thoracic rotation. It has also quantified ROM in a surfing cohort, identifying thoracic rotation as a key movement. This information may provide clinicians, coaches and athletic trainers with important information regarding the value of maintaining adequate thoracic rotation. Copyright © 2015 Elsevier Ltd. All rights reserved.

  2. NREL, Johns Hopkins SAIS Develop Method to Quantify Life Cycle Land Use of

    Science.gov Websites

    News release, October 2, 2017: NREL and Johns Hopkins SAIS develop a method to quantify the life cycle land use of electricity from natural gas. A case study provides, for the first time, quantifiable information on the life cycle land use of generating electricity from natural gas.

  3. Quantifying global dust devil occurrence from meteorological analyses

    PubMed Central

    Jemmett-Smith, Bradley C; Marsham, John H; Knippertz, Peter; Gilkeson, Carl A

    2015-01-01

    Dust devils and nonrotating dusty plumes are effective uplift mechanisms for fine particles, but their contribution to the global dust budget is uncertain. By applying known bulk thermodynamic criteria to European Centre for Medium-Range Weather Forecasts (ECMWF) operational analyses, we provide the first global hourly climatology of potential dust devil and dusty plume (PDDP) occurrence. In agreement with observations, activity is highest from late morning into the afternoon. Combining PDDP frequencies with dust source maps and typical emission values gives a best estimate of the global contribution of 3.4% (uncertainty 0.9-31%), one order of magnitude lower than the only estimate previously published. Total global hours of dust uplift by dry convection are ∼0.002% of those by the dust-lifting winds resolved by ECMWF, consistent with dry convection making a small contribution to global uplift. Reducing uncertainty requires better knowledge of factors controlling PDDP occurrence, source regions, and dust fluxes induced by dry convection. Key Points: (1) global potential dust devil occurrence is quantified from meteorological analyses; (2) the climatology shows a realistic diurnal cycle and geographical distribution; (3) the best estimate of the global contribution, 3.4%, is 10 times smaller than the previous estimate. PMID:26681815

  4. Quantifying chemical reactions by using mixing analysis.

    PubMed

    Jurado, Anna; Vázquez-Suñé, Enric; Carrera, Jesús; Tubau, Isabel; Pujades, Estanislao

    2015-01-01

    This work is motivated by the need for a sound understanding of the chemical processes that affect organic pollutants in an urban aquifer. We propose an approach to quantify such processes using mixing calculations. The methodology consists of the following steps: (1) identification of the recharge sources (end-members) and selection of the species (conservative and non-conservative) to be used, (2) identification of the chemical processes, and (3) evaluation of mixing ratios including the chemical processes. This methodology has been applied in the Besòs River Delta (NE Barcelona, Spain), where the River Besòs is the main aquifer recharge source. A total of 51 groundwater samples were collected from July 2007 to May 2010 during four field campaigns. Three river end-members were necessary to explain the temporal variability of the River Besòs: one from wet periods (W1) and two from dry periods (D1 and D2). The methodology proved useful not only for computing the mixing ratios but also for quantifying processes such as calcite and magnesite dissolution, aerobic respiration and denitrification at each observation point. Copyright © 2014 Elsevier B.V. All rights reserved.
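
    Step (3), the evaluation of mixing ratios, can be posed as a constrained least-squares problem. The sketch below is a minimal illustration with three end-members and invented concentrations; it is not the authors' code and omits the non-conservative (reaction) terms.

    ```python
    # Constrained least-squares estimate of mixing ratios from conservative
    # species. End-member and sample concentrations are invented.
    import numpy as np
    from scipy.optimize import lsq_linear

    # columns: end-members W1, D1, D2; rows: conservative species (e.g. Cl, Br, SO4)
    E = np.array([
        [120.0, 310.0, 250.0],
        [0.40, 1.10, 0.90],
        [55.0, 140.0, 95.0],
    ])
    sample = np.array([230.0, 0.82, 105.0])

    # Append the closure condition sum(f) = 1 as a heavily weighted extra row.
    w = 1e3
    A = np.vstack([E, w * np.ones(3)])
    b = np.append(sample, w)

    res = lsq_linear(A, b, bounds=(0.0, 1.0))   # ratios constrained to [0, 1]
    print("mixing ratios (W1, D1, D2):", np.round(res.x, 3))
    ```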

  5. Quantifying spatial distribution of spurious mixing in ocean models.

    PubMed

    Ilıcak, Mehmet

    2016-12-01

    Numerical mixing is inevitable in ocean models due to tracer advection schemes. Until now, there has been no robust way to identify the regions of spurious mixing in ocean models. We propose a new method to compute the spatial distribution of spurious diapycnal mixing in an ocean model. The method is an extension of the available potential energy density method proposed by Winters and Barkan (2013). We test the new method in lock-exchange and baroclinic eddy test cases and can quantify both the amount and the location of numerical mixing. We find that high-shear areas are the regions most susceptible to numerical truncation errors. We also use the new method to quantify the numerical mixing under different horizontal momentum closures, and conclude that Smagorinsky viscosity produces less numerical mixing than Leith viscosity for the same non-dimensional constant.

  6. Matrix Dissolution Techniques Applied to Extract and Quantify Precipitates from a Microalloyed Steel

    NASA Astrophysics Data System (ADS)

    Lu, Junfang; Wiskel, J. Barry; Omotoso, Oladipo; Henein, Hani; Ivey, Douglas G.

    2011-07-01

    Microalloyed steels possess good strength and toughness, as well as excellent weldability; these attributes are necessary for oil and gas pipelines in northern climates. These properties are attributed in part to the presence of nanosized carbide and carbonitride precipitates. To understand the strengthening mechanisms and to optimize the strengthening effects, it is necessary to quantify the size distribution, volume fraction, and chemical speciation of these precipitates. However, characterization techniques suitable for quantifying fine precipitates are limited because of their fine sizes, wide particle size distributions, and low volume fractions. In this article, two matrix dissolution techniques have been developed to extract precipitates from a Grade 100 (yield strength of 690 MPa) microalloyed steel. Relatively large volumes of material can be analyzed, and statistically significant quantities of precipitates of different sizes are collected. Transmission electron microscopy (TEM) and X-ray diffraction (XRD) are combined to analyze the chemical speciation of these precipitates. Rietveld refinement of XRD patterns is used to fully quantify the relative amounts of the precipitates. The size distribution of the nanosized precipitates is quantified using dark-field imaging in the TEM.

  7. Quantifying Diffuse Contamination: Method and Application to Pb in Soil.

    PubMed

    Fabian, Karl; Reimann, Clemens; de Caritat, Patrice

    2017-06-20

    A new method for detecting and quantifying diffuse contamination at the continental to regional scale is based on the analysis of cumulative distribution functions (CDFs). It uses cumulative probability (CP) plots for spatially representative data sets, preferably containing >1000 determinations. Simulations demonstrate how different types of contamination influence elemental CDFs of different sample media. It is found that diffuse contamination is characterized by a distinctive shift of the low-concentration end of the distribution of the studied element in its CP plot. Diffuse contamination can be detected and quantified via either (1) comparing the distribution of the contaminating element to that of an element with a geochemically comparable behavior but no contamination source (e.g., Pb vs Rb), or (2) comparing the top soil distribution of an element to the distribution of the same element in subsoil samples from the same area, taking soil forming processes into consideration. Both procedures are demonstrated for geochemical soil data sets from Europe, Australia, and the U.S.A. Several different data sets from Europe deliver comparable results at different scales. Diffuse Pb contamination in surface soil is estimated to be <0.5 mg/kg for Australia, 1-3 mg/kg for Europe, and 1-2 mg/kg, or at least <5 mg/kg, for the U.S.A. The analysis presented here also allows recognition of local contamination sources and can be used to efficiently monitor diffuse contamination at the continental to regional scale.
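
    A minimal sketch of procedure (2): compare low quantiles of the same element in topsoil and subsoil and read the diffuse contamination as the shift between them. The synthetic lognormal data below are purely illustrative, not survey values.

    ```python
    # Sketch of procedure (2): estimate diffuse Pb contamination as the shift
    # between low quantiles of topsoil and subsoil CDFs. Synthetic lognormal
    # concentrations (mg/kg), not survey data.
    import numpy as np

    rng = np.random.default_rng(42)
    subsoil = rng.lognormal(mean=2.3, sigma=0.5, size=2000)    # background Pb
    topsoil = subsoil + 2.0 + rng.normal(0.0, 0.3, 2000)       # + diffuse input

    low_q = np.arange(0.02, 0.25, 0.02)   # low-concentration end of the CP plot
    shift = np.quantile(topsoil, low_q) - np.quantile(subsoil, low_q)
    print(f"estimated diffuse Pb contamination: {shift.mean():.1f} mg/kg")
    ```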

  8. A fuzzy Bayesian network approach to quantify the human behaviour during an evacuation

    NASA Astrophysics Data System (ADS)

    Ramli, Nurulhuda; Ghani, Noraida Abdul; Ahmad, Nazihah

    2016-06-01

    Bayesian Network (BN) has been regarded as a successful representation of the inter-relationships among factors affecting human behavior during an emergency. This paper is an extension of earlier work on quantifying the variables involved in a BN model of human behavior during an evacuation using a well-known direct probability elicitation technique. To overcome judgment bias and reduce the expert's burden in providing precise probability values, a new approach to the elicitation technique is required. This study proposes a new fuzzy BN approach for quantifying human behavior during an evacuation. Three major phases of methodology are involved, namely: (1) development of a qualitative model representing human factors during an evacuation, (2) quantification of the BN model using fuzzy probabilities, and (3) inference and interpretation of the BN results. A case study of three inter-dependent human evacuation factors, namely danger assessment ability, information about the threat, and stressful conditions, is used to illustrate the application of the proposed method. This approach will serve as an alternative to the conventional probability elicitation technique in understanding human behavior during an evacuation.
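
    One ingredient of such an approach, sketched below under simplifying assumptions, is replacing precise elicited probabilities with triangular fuzzy numbers that are later defuzzified (here by a simple centroid) before entering the BN's conditional probability tables; the states and numbers are invented, not from the paper.

    ```python
    # Triangular fuzzy probabilities (low, mode, high) elicited from experts,
    # defuzzified by the centroid of the triangular number before entering a
    # conditional probability table. States and values are invented.

    def centroid(tri):
        low, mode, high = tri
        return (low + mode + high) / 3.0

    # P(stress = high | danger assessment ability, information about the threat)
    fuzzy_cpt = {
        ("poor", "none"): (0.70, 0.85, 0.95),
        ("poor", "full"): (0.40, 0.55, 0.70),
        ("good", "none"): (0.30, 0.45, 0.60),
        ("good", "full"): (0.05, 0.15, 0.25),
    }

    crisp_cpt = {state: round(centroid(tri), 3) for state, tri in fuzzy_cpt.items()}
    print(crisp_cpt)
    ```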

  9. A new paradigm of quantifying ecosystem stress through chemical signatures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kravitz, Ben; Guenther, Alex B.; Gu, Lianhong

    Stress-induced emissions of biogenic volatile organic compounds (VOCs) from terrestrial ecosystems may be one of the dominant sources of VOC emissions worldwide. Understanding the ecosystem stress response could reveal how ecosystems will respond and adapt to climate change and, in turn, quantify changes in the atmospheric burden of VOC oxidants and secondary organic aerosols. Here we argue, based on preliminary evidence from several opportunistic measurement sources, that chemical signatures of stress can be identified and quantified at the ecosystem scale. We also outline future endeavors that we see as next steps toward uncovering quantitative signatures of stress, including new advances in both VOC data collection and analysis of "big data."

  10. [Psychological results of mental performance in sleep deprivation].

    PubMed

    Dahms, P; Schaad, G; Gorges, W; von Restorff, W

    1996-01-01

    To quantify the effects of sleep periods of different lengths during continuous operations (CONOPS), two independent groups of subjects performed several cognitive tasks for 3 days. The 72-h trial period contained three 60-min sleep periods for the 10 subjects of the experimental group and three 4-h sleep periods for the 14 subjects of the control group. With the exception of a single subtest, the statistical analyses of the test results of the two groups show no significant differences in cognitive performance. It is suggested that high motivation, maintained chiefly through monetary payment for successful test performance, accounts for the comparable performance of the subjects.

  11. Quantifying the direct use value of Condor seamount

    NASA Astrophysics Data System (ADS)

    Ressurreição, Adriana; Giacomello, Eva

    2013-12-01

    Seamounts often satisfy numerous uses and interests. Multiple uses can generate multiple benefits but also conflicts and impacts, calling, therefore, for integrated and sustainable management. To assist in developing comprehensive management strategies, policymakers recognise the need to include measures of socioeconomic analysis alongside ecological data so that practical compromises can be made. This study assessed the direct output impact (DOI) of the relevant marine activities operating at Condor seamount (Azores, central northeast Atlantic) as proxies of the direct use values provided by the resource system. Results demonstrated that Condor seamount supported a wide range of uses yielding distinct economic outputs. Demersal fisheries, scientific research and shark diving were the top-three activities generating the highest revenues, while tuna fisheries, whale watching and scuba-diving had marginal economic significance. Results also indicated that the economic importance of non-extractive uses of Condor is considerable, highlighting the importance of these uses as alternative income-generating opportunities for local communities. It is hoped that quantifying the direct use values provided by Condor seamount will contribute to the decision making process towards its long-term conservation and sustainable use.

  12. Islands in the oil: Quantifying salt marsh shoreline erosion after the Deepwater Horizon oiling.

    PubMed

    Turner, R Eugene; McClenachan, Giovanna; Tweel, Andrew W

    2016-09-15

    Qualitative inferences and sparse bay-wide measurements suggest that shoreline erosion increased after the 2010 BP Deepwater Horizon (DWH) disaster, but quantifying the impacts at the landscape scale has been elusive. We quantified the shoreline erosion of 46 islands before and after the DWH oil spill to determine how much shoreline was lost, whether the losses were temporary, and whether recovery/restoration occurred. Erosion rates at the oiled islands rose to 275% of pre-spill rates in the first six months after the oiling, were 200% of those of the unoiled islands for the first 2.5 years after the oiling, and were twelve times the average land loss in the deltaic plain of 0.4% y⁻¹ from 1988 to 2011. These results support the hypothesis that oiling compromised the belowground biomass of the emergent vegetation. The islands are, in effect, sentinels of marsh stability, already in decline before the oil spill. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.

  13. A Molecular Assay to Quantify Male and Female Plasmodium falciparum Gametocytes: Results From 2 Randomized Controlled Trials Using Primaquine for Gametocyte Clearance

    PubMed Central

    Stone, Will; Sawa, Patrick; Lanke, Kjerstin; Rijpma, Sanna; Oriango, Robin; Nyaurah, Maureen; Osodo, Paul; Osoti, Victor; Mahamar, Almahamoudou; Diawara, Halimatou; Woestenenk, Rob; Graumans, Wouter; van de Vegte-Bolmer, Marga; Bradley, John; Chen, Ingrid; Brown, Joelle; Siciliano, Giulia; Alano, Pietro; Gosling, Roly; Dicko, Alassane; Drakeley, Chris; Bousema, Teun

    2017-01-01

    Background Single low-dose primaquine (PQ) reduces Plasmodium falciparum infectivity before it impacts gametocyte density. Here, we examined the effect of PQ on gametocyte sex ratio as a possible explanation for this early sterilizing effect. Methods Quantitative reverse-transcription polymerase chain reaction assays were developed to quantify female gametocytes (targeting Pfs25 messenger RNA [mRNA]) and male gametocytes (targeting Pf3D7_1469900 mRNA) in 2 randomized trials in Kenya and Mali, comparing dihydroartemisinin-piperaquine (DP) alone to DP with PQ. Gametocyte sex ratio was examined in relation to time since treatment and infectivity to mosquitoes. Results In Kenya, the median proportion of male gametocytes was 0.33 at baseline. Seven days after treatment, gametocyte density was significantly reduced in the DP-PQ arm relative to the DP arm (females: 0.05% [interquartile range (IQR), 0.0–0.7%] of baseline; males: 3.4% [IQR, 0.4%–32.9%] of baseline; P < .001). Twenty-four hours after treatment, the gametocyte sex ratio became male-biased and was not significantly different between the DP and DP-PQ groups. In Mali, there was no significant difference in sex ratio between the DP and DP-PQ (>0.125 mg/kg) groups 48 hours after treatment, and gametocyte sex ratio was not associated with mosquito infection rates. Conclusions The early sterilizing effects of PQ may not be explained by the preferential clearance of male gametocytes and may be due to an effect on gametocyte fitness. PMID:28931236

  14. Quantifying Anthropogenic Dust Emissions

    NASA Astrophysics Data System (ADS)

    Webb, Nicholas P.; Pierre, Caroline

    2018-02-01

    Anthropogenic land use and land cover change, including local environmental disturbances, moderate rates of wind-driven soil erosion and dust emission. These human-dust cycle interactions impact ecosystems and agricultural production, air quality, human health, biogeochemical cycles, and climate. While the impacts of land use activities and land management on aeolian processes can be profound, the interactions are often complex and assessments of anthropogenic dust loads at all scales remain highly uncertain. Here, we critically review the drivers of anthropogenic dust emission and current evaluation approaches. We then identify and describe opportunities to: (1) develop new conceptual frameworks and interdisciplinary approaches that draw on ecological state-and-transition models to improve the accuracy and relevance of assessments of anthropogenic dust emissions; (2) improve model fidelity and capacity for change detection to quantify anthropogenic impacts on aeolian processes; and (3) enhance field research and monitoring networks to support dust model applications to evaluate the impacts of disturbance processes on local to global-scale wind erosion and dust emissions.

  15. Methodology for quantifying uncertainty in coal assessments with an application to a Texas lignite deposit

    USGS Publications Warehouse

    Olea, R.A.; Luppens, J.A.; Tewalt, S.J.

    2011-01-01

    A common practice for characterizing uncertainty in coal resource assessments has been the itemization of tonnage at the mining unit level and the classification of such units according to distance to drilling holes. Distance criteria, such as those used in U.S. Geological Survey Circular 891, are still widely used for public disclosure. A major deficiency of distance methods is that they do not provide a quantitative measure of uncertainty. Additionally, relying on distance between data points alone does not take into consideration other factors known to have an influence on uncertainty, such as spatial correlation, type of probability distribution followed by the data, geological discontinuities, and boundary of the deposit. Several geostatistical methods have been combined to formulate a quantitative characterization for appraising uncertainty. Drill hole datasets ranging from widespread exploration drilling to detailed development drilling from a lignite deposit in Texas were used to illustrate the modeling. The results show that distance to the nearest drill hole is almost completely unrelated to uncertainty, which confirms the inadequacy of characterizing uncertainty based solely on a simple classification of resources by distance classes. The more complex statistical methods used in this study quantify uncertainty and show good agreement between confidence intervals in the uncertainty predictions and data from additional drilling. © 2010.

  16. The reliability and validity of ultrasound to quantify muscles in older adults: a systematic review

    PubMed Central

    Scafoglieri, Aldo; Jager-Wittenaar, Harriët; Hobbelen, Johannes S.M.; van der Schans, Cees P.

    2017-01-01

    This review evaluates the reliability and validity of ultrasound to quantify muscles in older adults. The databases PubMed, Cochrane, and Cumulative Index to Nursing and Allied Health Literature were systematically searched for studies. In 17 studies, the reliability (n = 13) and validity (n = 8) of ultrasound to quantify muscles in community-dwelling older adults (≥60 years) or a clinical population were evaluated. Four out of 13 reliability studies investigated both intra-rater and inter-rater reliability. Intraclass correlation coefficient (ICC) scores for reliability ranged from −0.26 to 1.00. The highest ICC scores were found for the vastus lateralis, rectus femoris, upper arm anterior, and the trunk (ICC = 0.72 to 1.000). All included validity studies found ICC scores ranging from 0.92 to 0.999. Two studies describing the validity of ultrasound to predict lean body mass showed good validity as compared with dual-energy X-ray absorptiometry (r² = 0.92 to 0.96). This systematic review shows that ultrasound is a reliable and valid tool for the assessment of muscle size in older adults. More high-quality research is required to confirm these findings in both clinical and healthy populations. Furthermore, ultrasound assessment of small muscles needs further evaluation. Ultrasound to predict lean body mass is feasible; however, future research is required to validate prediction equations in older adults with varying function and health. PMID:28703496
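
    As background for the ICC values quoted above, the following Python sketch computes one common form, ICC(2,1) (two-way random effects, absolute agreement, single measurement, after Shrout and Fleiss). The review does not state which ICC form each included study used, and the ratings below are invented.

      import numpy as np

      def icc_2_1(x):
          """ICC(2,1): two-way random effects, absolute agreement, single rater.
          x has shape (n_subjects, k_raters)."""
          n, k = x.shape
          grand = x.mean()
          ssb = k * ((x.mean(axis=1) - grand) ** 2).sum()   # between subjects
          ssc = n * ((x.mean(axis=0) - grand) ** 2).sum()   # between raters
          sse = ((x - grand) ** 2).sum() - ssb - ssc        # residual
          msr = ssb / (n - 1)
          msc = ssc / (k - 1)
          mse = sse / ((n - 1) * (k - 1))
          return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

      # Two raters measuring rectus femoris thickness (cm) in five subjects (invented):
      ratings = np.array([[2.1, 2.2], [1.8, 1.9], [2.5, 2.4], [1.6, 1.7], [2.9, 3.0]])
      print(round(icc_2_1(ratings), 3))   # ~0.98: close to 1.0 for near-identical raters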

  17. Quantifying the effect of experimental design choices for in vitro scratch assays.

    PubMed

    Johnston, Stuart T; Ross, Joshua V; Binder, Benjamin J; Sean McElwain, D L; Haridas, Parvathi; Simpson, Matthew J

    2016-07-07

    Scratch assays are often used to investigate potential drug treatments for chronic wounds and cancer. Interpreting these experiments with a mathematical model allows us to estimate the cell diffusivity, D, and the cell proliferation rate, λ. However, the influence of the experimental design on the estimates of D and λ is unclear. Here we apply an approximate Bayesian computation (ABC) parameter inference method, which produces a posterior distribution of D and λ, to new sets of synthetic data, generated from an idealised mathematical model, and experimental data for a non-adhesive mesenchymal population of fibroblast cells. The posterior distribution allows us to quantify the amount of information obtained about D and λ. We investigate two types of scratch assay, as well as varying the number and timing of the experimental observations captured. Our results show that a scrape assay, involving one cell front, provides more precise estimates of D and λ, and is more computationally efficient to interpret, than a wound assay, with two opposingly directed cell fronts. We find that recording two observations, after making the initial observation, is sufficient to estimate D and λ, and that the final observation time should correspond to the time taken for the cell front to move across the field of view. These results provide guidance for estimating D and λ, while simultaneously minimising the time and cost associated with performing and interpreting the experiment.
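
    The ABC idea can be illustrated with a deliberately simplified stand-in for the authors' model: assume the scratch front advances at the Fisher wave speed 2√(Dλ), draw candidate parameters from uniform priors, and keep those that reproduce the observed front positions. Everything here (model, priors, tolerance, numbers) is invented for illustration.

      import numpy as np

      rng = np.random.default_rng(1)

      def front_positions(D, lam, times, x0=0.0):
          # Idealised front model: position advances at the Fisher speed 2*sqrt(D*lam).
          return x0 + 2.0 * np.sqrt(D * lam) * times

      times = np.array([12.0, 24.0, 36.0])              # observation times (h)
      true_D, true_lam = 1000.0, 0.05                   # "unknown" truth (um^2/h, 1/h)
      observed = front_positions(true_D, true_lam, times) + rng.normal(0, 5, times.size)

      accepted = []
      for _ in range(100_000):
          D = rng.uniform(100, 3000)                    # uniform prior on D
          lam = rng.uniform(0.01, 0.1)                  # uniform prior on lambda
          if np.linalg.norm(front_positions(D, lam, times) - observed) < 10.0:
              accepted.append((D, lam))

      posterior = np.array(accepted)                    # ABC posterior sample
      print(len(posterior), posterior.mean(axis=0))

    In this toy the data constrain only the product Dλ, so the accepted samples form a ridge rather than a tight cluster, a blunt illustration of how the information content of a given experimental design shows up directly in the width of the ABC posterior.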

  18. Quantifying reactive transport processes governing arsenic mobility in a Bengal Delta aquifer

    NASA Astrophysics Data System (ADS)

    Rawson, Joey; Neidhardt, Harald; Siade, Adam; Berg, Michael; Prommer, Henning

    2017-04-01

    Over the last few decades, significant progress has been made in characterizing the extent and severity of groundwater arsenic pollution in S/SE Asia and in understanding the underlying geochemical processes. However, comparatively little effort has been made to merge the findings from this research into quantitative frameworks that allow for a process-based analysis of observed arsenic behavior and predictions of its future fate. Therefore, this study developed and tested field-scale numerical modelling approaches to represent the primary and secondary geochemical processes associated with the reductive dissolution of Fe-oxy(hydr)oxides and the concomitant release of sorbed arsenic. We employed data from an in situ field experiment in the Bengal Delta Plain, which investigated the influence of labile organic matter (sucrose) on the mobility of Fe, Mn, and As. The data collected during the field experiment were used to guide our model development and to constrain the model parameterisation. Our results show that sucrose oxidation coupled to the reductive dissolution of Fe-oxy(hydr)oxides was accompanied by multiple secondary geochemical reactions that are not easily and uniquely identifiable and quantifiable. These secondary reactions can explain the disparity between the observed Fe and As behavior. Our modelling results suggest that a significant fraction of the released As is scavenged through (co-)precipitation with newly formed Fe-minerals, specifically magnetite, rather than through sorption to pre-existing and freshly precipitated iron minerals.
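
    As a conceptual illustration only, not the authors' reactive transport model, the following Python sketch couples first-order reductive dissolution of an Fe-oxyhydroxide pool to the release of sorbed As, with a fixed fraction of the released As immediately re-scavenged by secondary minerals such as magnetite. All rate constants, stoichiometries, and fractions are invented.

      import numpy as np
      from scipy.integrate import solve_ivp

      k_red = 0.05      # 1/day: reductive dissolution rate under excess sucrose (invented)
      as_per_fe = 1e-3  # mol sorbed As released per mol Fe-oxide dissolved (invented)
      f_scav = 0.7      # fraction of released As re-scavenged by new Fe minerals (invented)

      def rhs(t, y):
          fe_oxide, as_dissolved = y            # pools in mol/L
          r = k_red * fe_oxide                  # dissolution rate
          return [-r, (1.0 - f_scav) * as_per_fe * r]

      sol = solve_ivp(rhs, (0.0, 60.0), [1.0, 0.0], t_eval=np.linspace(0.0, 60.0, 7))
      for t, a in zip(sol.t, sol.y[1]):
          print(f"day {t:4.0f}: dissolved As = {a:.2e} mol/L")

    Without the scavenging term (f_scav = 0), dissolved As predicted from the Fe release alone would be several-fold higher, which mirrors the abstract's point that secondary (co-)precipitation reactions are needed to reconcile the observed Fe and As behavior.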

  19. A jackknife approach to quantifying single-trial correlation between covariance-based metrics undefined on a single-trial basis.

    PubMed

    Richter, Craig G; Thompson, William H; Bosman, Conrado A; Fries, Pascal

    2015-07-01

    The quantification of covariance between neuronal activities (functional connectivity) requires the observation of correlated changes and therefore multiple observations. The strength of such neuronal correlations may itself undergo moment-by-moment fluctuations, which might, for example, lead to fluctuations in single-trial metrics such as reaction time (RT), or may co-fluctuate with the correlation between activity in other brain areas. Yet, quantifying the relation between these moment-by-moment fluctuations in neuronal correlations and other quantities is precluded by the fact that neuronal correlations are not defined per single observation. The proposed solution quantifies this relation by first calculating neuronal correlations for all leave-one-out subsamples (i.e., the jackknife replications of all observations) and then correlating these values. Because the correlation is calculated between jackknife replications, we address this approach as jackknife correlation (JC). First, we demonstrate the equivalence of JC to conventional correlation for simulated paired data that are defined per observation and therefore allow the calculation of conventional correlation. While JC recovers the conventional correlation precisely, alternative approaches, such as sorting-and-binning, suffer detrimental effects of the analysis parameters. We then explore the case of relating two spectral correlation metrics, such as coherence, that require multiple observation epochs, where the only viable alternative analysis approaches are based on some form of epoch subdivision, which results in reduced spectral resolution and poor spectral estimators. We show that JC outperforms these approaches, particularly for short epoch lengths, without sacrificing any spectral resolution. Finally, we note that JC can be applied to relate fluctuations in any smooth metric that is not defined on single observations.
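
    A minimal Python sketch of the JC procedure as described above: compute the metric on every leave-one-out subsample of each variable, then correlate the replications. The mean is used as a stand-in metric here because it makes the equivalence to conventional correlation exact (leave-one-out means are affine, sign-flipped functions of the held-out value, and Pearson correlation is invariant to jointly flipping both signs); the data are simulated.

      import numpy as np

      rng = np.random.default_rng(0)

      def jackknife_replications(x, metric=np.mean):
          # Metric evaluated on each leave-one-out subsample of x.
          return np.array([metric(np.delete(x, i)) for i in range(len(x))])

      n = 200
      x = rng.normal(size=n)
      y = 0.6 * x + 0.8 * rng.normal(size=n)    # correlated paired data

      jc = np.corrcoef(jackknife_replications(x), jackknife_replications(y))[0, 1]
      r = np.corrcoef(x, y)[0, 1]
      print(f"JC = {jc:.3f}, conventional r = {r:.3f}")   # the two agree

    In the intended application the metric would be something like coherence, which is only defined across trials; that is exactly the case where JC has no single-trial alternative.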

  20. Quantifying three dimensional reconnection in fragmented current layers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wyper, P. F.; Hesse, M.

    There is growing evidence that when magnetic reconnection occurs in high Lundquist number plasmas, such as in the solar corona or the Earth's magnetosphere, it does so within a fragmented, rather than a smooth, current layer. Within the extent of these fragmented current regions, the associated magnetic flux transfer and energy release occur simultaneously in many different places. This investigation focusses on how best to quantify the rate at which reconnection occurs in such layers. An analytical theory is developed which describes the manner in which new connections form within fragmented current layers in the absence of magnetic nulls. It is shown that the collective rate at which new connections form can be characterized by two measures: a total rate, which measures the true rate at which new connections are formed, and a net rate, which measures the net change of connection associated with the largest value of the integral of the parallel electric field E∥ through all of the non-ideal regions. Two simple analytical models are presented which demonstrate how each should be applied and what they quantify.
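
    Read literally from the abstract, the two measures can be sketched as follows. The notation (non-ideal regions indexed by i, Ξ_i for the maximal parallel voltage of region i) is ours, and the paper's exact bookkeeping (e.g. any factors avoiding double-counting of shared flux) is omitted.

      % Sketch of the two collective reconnection-rate measures (notation ours).
      % For each non-ideal region i, take the largest parallel voltage over the
      % field lines threading it:
      \[
        \Xi_i = \max_{\text{field lines through } i} \left| \int E_{\parallel} \, \mathrm{d}l \right|
      \]
      % Total rate: the true rate at which new connections form, summed over all
      % regions. Net rate: the net change of connection, set by the single
      % largest value.
      \[
        \left(\frac{\mathrm{d}\Phi}{\mathrm{d}t}\right)_{\text{total}} = \sum_i \Xi_i ,
        \qquad
        \left(\frac{\mathrm{d}\Phi}{\mathrm{d}t}\right)_{\text{net}} = \max_i \Xi_i
      \]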